Free Republic

On Plato, the Early Church, and Modern Science: An Eclectic Meditation
November 30, 2004 | Jean F. Drew

Posted on 11/30/2004 6:21:11 PM PST by betty boop

God, purposing to make the universe most nearly like the every way perfect and fairest of intelligible beings, created one visible living being, containing within itself all living beings of the same natural order.

Thus does Plato (d. 347 B.C.) succinctly describe how all that exists is ultimately a single, living organism. At Timaeus 20, he goes on to say:

“There exists: first, the unchanging form, uncreated and indestructible, admitting no modification and entering no combination … second, that which bears the same name as the form and resembles it … and third, space which is eternal and indestructible, which provides a position for everything that comes to be.”

And thus we find a description of the universe in which Being and Existence (Becoming) — the one God and the multiplicity of things — are bound together as a single living reality whose extension is mediated by Space (which for us moderns implies Time).

Our aim in this essay is to define these ideas and their relationships, and trace their historical development from the ancient world to the present. Taking a page from the late Eric Voegelin (1901–1985, philosopher of history specializing in the evolution of symbolization), we will follow a history-of-ideas approach to these issues. Along the way we will find that not only philosophy and cosmology, but also theology and even modern science can illuminate these seminal conceptions of Platonic thought. We must begin at the beginning, that is, with God — who is absolute Being in Plato’s speculation, of whom the cosmos itself is but the image (eikon) or reflection.

When Plato speaks of God (or when Aristotle does for that matter, as in, e.g., the Nicomachean Ethics), he is not referring to the Olympian gods, to Zeus, Hera, Athena, Poseidon, and the rest of the gang of “immortals.” For the Olympians are like man in that they are creatures of a creating God. Not only that, but they are a second generation of gods, the first having reigned in the antediluvian Age of Cronos; which is to say that the Olympians’ rule or law is not everlasting, but contingent. Thus they are not self-subsistent, but dependent (contingent) on a principle outside of themselves. We might say that the central difference between Plato’s God and the Olympians consists in the fact that the latter are “intracosmic” gods, and the former is “extracosmic,” that is, transcending all categories and conditions of space-time reality. In contrast, the intracosmic gods are subject to change, to contingency; and so, though they may truly be said to exist in some fashion, cannot be said to possess true Being. (More on these distinctions in a minute.)

It is clear that for Plato, God is the “Beyond” of the universe, or in other words, utterly transcendent, perfectly self-subsistent Being, the “uncaused cause” of all the multiplicity of existents in the universe. In yet other words we can say that, for Plato, the cosmos is a theophany, a manifestation or “presence” of the divine Idea — in Christian parlance, the Logos if I might draw that association — in the natural world.

As Wolfgang Smith notes, “Christian teaching is based upon the doctrine of the Logos, the Word of God, a term which in itself clearly suggests the idea of theophany. Moreover, what is implicit in the famous Prologue of St. John [“In the beginning was the Word, and the Word was with God, and the Word was God. The same was in the beginning with God. All things were made by him; and without him was not any thing made that was made. In him was life; and the life was the light of men. And the light shineth in darkness; and the darkness comprehended it not.” (John 1:1–5)] is openly affirmed by St. Paul when he declares that “the invisible things of Him from the creation of the world have been clearly seen, being understood by the things that are made, even His power and Godhead” (Rom. 1:20) … The indisputable fact is that at its deepest level Christianity perceives the cosmos as a self-revelation of God.” [Wolfgang Smith, Cosmos and Transcendence, 1984]

Being and Existence (Becoming)
Being is a concept so difficult that it comes close to eluding our grasp altogether. It is utterly beyond space and time; imperishable; entirely self-subsistent, needing nothing from outside itself in order to be complete; essential; immutable; and eternally perduring. Contrast this with the concept of existence, regarding which Plato asks “how can that which is never in the same state be anything?” And this is the clue to the profound difference between being and existence: The existing things of this world are mutable and transient.

We must in my opinion begin by distinguishing between that which always is and never becomes from that which is always becoming but never is. The one is apprehensible by intelligence with the aid of reasoning, being eternally the same, the other is the object of opinion and irrational sensation, coming to be and ceasing to be, but never fully real. In addition, everything that becomes or changes must do so owing to some cause; for nothing can come to be without a cause. [Timaeus, 3:28]

Smith writes of the existing or “becoming” things that

“… they come upon the scene, we know not from whence; they grow, change, and decay; and at last they disappear, to be seen no more. The physical cosmos itself, we are told, is a case in point: it, too, has made its appearance, perhaps some twenty billion years ago, and will eventually cease to exist [i.e., finally succumbing, we are told, to thermodynamic entropy or “heat death”]. What is more, even now, at this very moment, all things are passing away. ‘Dead is the man of yesterday,’ wrote Plutarch, ‘for he dies into the man of today: and the man of today is dying into the man of tomorrow.’ Indeed, ‘to be in time’ is a sure symptom of mortality. It is indicative, not of being, but of becoming, of ceaseless flux.”

All the multiplicity of existents in the universe are in a state of becoming and passing away. But Plato’s great insight is that all things in the state of becoming — that is, all existing things — are whatever they are because they are participations in Being. That is to say, “we perceive the trace of being in all that exists,” writes Smith, “and that is why we say, with reference to any particular thing, that it is.” Existence, in other words, is contingent on Being.

But we wonder: In what way is this possible? And if existents participate in being, what is that Being in which they participate?

In Exodus 3:14 Moses has experienced a theophany: While tending his flock on Mount Horeb, suddenly he hears the voice of God issuing from a burning bush: God is speaking to him! Reverentially, Moses inquires of God what is His name (meaning: what is His nature or character).

And God said unto Moses, I AM WHO AM: and He said, Thus shalt thou say unto the children of Israel, I AM hath sent me unto you.

God has told Moses: that He is Being (“I AM”). And the strong implication is that there is no “other” being: “I alone AM.” For “I” is plainly singular in form.

Smith draws the crucial point, “God alone IS. But how are we to understand this? ‘It seems to me,’ writes St. Gregory of Nyssa, ‘that at the time the great Moses was instructed in the theophany he came to know that none of those things which are apprehended by sense perception and contemplated by the understanding really subsist, but that the transcendent essence and cause of the universe, on which everything depends, alone subsists.’ But why? Does not the world exist? Are there not myriads of stars and galaxies and particles of dust, each existing in its own right? And yet we are told that the transcendent essence alone subsists. ‘For even if the understanding looks upon any other existing things,’ the great theologian goes on to say, ‘reason observes in absolutely none of them the self-sufficiency by which they could exist without participating in true Being. On the other hand, that which is always the same, neither increasing nor diminishing, immutable to all change whether to better or to worse (for it is far removed from the inferior and has no superior), standing in need of nothing else, alone desirable, participated in by all but not lessened by their participation — this is truly real Being.’”

Smith continues: “In the words of St. Gregory, ‘that which is always the same, neither increasing nor diminishing, immutable to all change … is truly real being.’ As concerns ‘existing things,’ on the other hand, the teaching implies that these entities are always changing, always in a state of flux, so that their very existence is in a way a process of becoming, in which however nothing is actually produced. This has been said time and again, beginning with Heraclitus and the Buddhist philosophers. And there can be little doubt that it is true: even modern physics, as we can see, points to the same conclusion. Only there is another side to the coin which is not always recognized. Existent things — the very flux itself — presuppose what Gregory and the Platonists have termed ‘a participation in Being.’ The point is that relative or contingent existences cannot stand alone. They have not an independent existence, a being of their own. ‘In Him we live, and move, and have our being,’ says St. Paul….”

St. Augustine confirms the Platonic insight this way:

I beheld these others beneath Thee, and saw that they neither altogether are, nor altogether are not. An existence they have, because they are from Thee; and yet no existence, because they are not what Thou art. For only that really is, that remains unchangeably.

Space
Space is the third essential term of the Platonic cosmology: It is the matrix in which living things and all other existents participate in Being. Plato’s creation myth — the Myth of the Demiurge in Timaeus — elucidates the Platonic conception of Space.

For Plato, the God of the Beyond is so “beyond” that, when it came time for creating the Cosmos, he didn’t even do it himself. He sent an agent: the Demiurge, a mythical being endowed by God with the divine likeness of His own perfect love, truth, beauty, justice, and goodness. The embodiment of divine perfections, the Demiurge wishes to create creatures just as good and beautiful as himself, according to the standard of the divine Idea — a direct analog, it seems to me, of the Logos theory of the ancient Church. Indeed, Eric Voegelin sees in the Demiurge the symbol of Incarnation [Order and History Vol. 3: Plato and Aristotle, 1957]:

“The Demiurge is the symbol of Incarnation, understood not as the result of the process but as the process itself, as the permanent tension in reality between the taxis of form or idea and the ataxia of formlessness.”

Similarly to the Christian account, the Demiurge in a certain way creates ex nihilo — that is, out of Nothing. At first glance, Plato is seen specifying not a pre-existing “material,” but a universal field of pure possibility called Chora, “Space.” Perhaps we may find in this concept a strong analogy to Isaac Newton’s concept of Absolute Space (see below).

Chora seems to indicate the idea of an eternal, universal field of pure stochastic potentiality that needs to become “activated” in order to bring actual beings into existence. In itself, it is No-thing, i.e., “nothing.” This “activation” the Demiurge may not effect by fiat: he does not, for instance, command “Let there be Light!” The main tool at his disposal is Peitho, “persuasion.”

And if Chora is not so persuaded, it will remain in a state of “nothingness.” It will remain unformed, in the condition of ataxia. Of itself it is “Nothing”; by itself, it can do nothing. It cannot generate anything out of itself, not even matter in primaeval form.

And thus Plato introduces the figure of the Demiurge into his creation myth, symbolizing form or idea — the principle of (formative) taxis that draws (formless) ataxia into existence. We moderns might be tempted to describe the Demiurge as constituting an “information set” together with an “energy source,” who “persuades” the pure stochastic potentiality of formless, absolute, empty space into actualized form, and thus existence. From the cosmic standpoint, he makes unity out of multiplicity, in harmony and geometrical proportion:

“The best bond is the one that effects the closest unity between itself and the terms it is combining; and this is best done by a continued geometrical proportion.” [Timaeus, 4]

Thus the Demiurge is a kind of “divine geometer,” producing the forms (or mathematical ideas) that Chora can be persuaded to conform to, and thus come into existence.

But the Demiurge does more than just get things started: As bearer of the divine Idea — as pure love and beauty and goodness and truth — he continues always persuading Chora to generate creatures as like himself as possible (i.e., reflecting his own divine qualities at whatever generic stage), throughout all eternity. Thus creation is a continuous process in space-time. Moreover, it is the source and driver of evolution as a universal natural process.

Through the ongoing activity of the Demiurge, men and the world are constantly being informed and renewed by the divine Idea; and thus a unified cosmic whole, a “One Cosmos,” a universal order comes into being at the intersection of time and timelessness, of immanent and transcendent reality, in the medium of Space (and Time).

Compare the Platonic creation myth with the philosophy of Dionysius the [Pseudo-]Areopagite, said to be the Greek converted by St. Paul in Acts 17:34. For Dionysius, the “names of God” — the divine qualities — are goodness, being, life, wisdom, power, and justice. Joseph Stiglmayr writes [Cath. Encycl. at the entry for Dionysius the Pseudo-Areopagite] that for Dionysius, God is

“… the One Being (to hen), transcending all quality and predication, all affirmation and negation, and all intellectual conception, [Who] by the very force of His love and goodness gives to beings outside Himself their countless gradations, unites them in the closest bonds (proodos), keeps each by His care and direction in its appointed sphere, and draws them again in an ascending order to Himself (epistrophe) … all created things [proceed] from God by the exuberance of being in the Godhead (to hyperpleres), its outpouring and overflowing … and as a flashing forth from the sun of the Deity. Exactly according to their physical nature created things absorb more or less the radiated light, which, however, grows weaker the farther it descends. As the mighty root sends forth a multitude of plants which it sustains and controls, so created things owe their origin and conservation to the All-Ruling Deity…. Patterned upon the original of Divine love, righteousness, and peace, is the harmony that pervades the universe…. All things tend to God, and in Him are merged and completed, just as the circle returns into itself, as the radii are joined at the centre, or as the numbers are contained in unity.”

The Platonic resonances seem unmistakable in these lines. It appears that both Platonic speculation and the Logos doctrine of the ancient Church as articulated by Dionysius agree that the Creator must be “beyond” Creation in order to resonate with it — which resonance is what makes the universe alive, i.e., a living universe.

C. A. Dubray points out [Cath. Encycl. at the entry “Teleology”] that the theology of St. Thomas Aquinas makes it clear that “Intrinsic finality [we are to think of this as a blend or merger of efficient and final causes in the Aristotelian sense] consists in the fact that every being has within itself a natural tendency whereby its activity is directed towards the perfection of its own nature…. St. Thomas does not hesitate to speak of ‘natural appetite,’ ‘natural inclination,’ and even ‘intention of nature,’ [we moderns might be tempted to add ‘instinct’ to this list] to mean that every being has within itself a directive principle of activity. Accordingly, God does not direct creatures to their ends from outside, but through their own nature…. The Divine plan of creation is carried out by the various beings themselves acting in conformity with their nature.

“When, however, this finality is called immanent, this expression must not be understood in a pantheistic sense, as if the intelligence which the world manifests were to be identified with the world itself, but in the sense that the immediate principle of finality is immanent in every being…. Thus the unconscious finality in the world leads to the conclusion that there must be an intelligent cause of the world.” [Emphasis added.]

Aquinas’ insight, and also Plato’s, evokes a reconsideration of Isaac Newton’s concept of Absolute Space. Possibly this may be understood in the following terms. First, Absolute Space is “empty” space. Second, it is not a property of God, but an effect of His Presence; i.e., we advert to theophany again. The question then arises, in what “where” or “when” does this theophany take place? Perhaps Newton’s answer would be: In the beginning, and continuously thereafter. Third, it has been suggested that Newton intends us to understand Absolute Space as the sensorium Dei: “God constitutes space and time through his eternity and omnipresence” [existendo semper et ubique, durationem et spatium constituit: Philosophiae Naturalis Principia Mathematica, 3d ed., 1726]. Wolfhart Pannenberg writes,

“Now there are a number of good reasons — suggested by both philosophical and scientific thought — to consider time and space as inseparable. Einstein’s field concept comprises space, time, and energy. It takes the form of a geometrical description, and this seems to amount to a spatialization of time. The totality of space, time, and energy or force are all properties of a cosmic field.

“Long before our own age a theological interpretation of this subject matter had been proposed, and it was Isaac Newton who offered this proposal. It too referred everything to space or, more precisely, to the correlation of force as in the case of a force like gravitation acting at a distance. Newton’s well-known conception of space as the sensorium of God (sensorium Dei) did not intend to ascribe to God an organ of sense perception, the like of which God does not need, according to Newton, because of divine omnipresence. Rather, Newton took space as the medium of God’s creative presence at the finite place of his creatures in creating them.” [Wolfhart Pannenberg, Toward a Theology of Nature, 1993]

Thus the infinite takes priority over every finite experience, including intellectual experience — a position decisively argued by Descartes, as Pannenberg avers, “in his thesis that the idea of God is a prior condition in the human mind for the possibility of any other idea, even that of the ego itself.”

* * * * * *

The Influence of Platonic Speculation on the Early History of the Church
D. Edmund Joaquin, an insightful and gracious Christian friend, writes, “We understand that the universe is created and sustained by the Word [the Logos], and not only that, but by the Word sounding. God sustains the universe consciously and actively. He has not gone away and left us. In fact, He reveals Himself to us, and His final revelation is in the person of Christ [the Logos]. Christ is not an abstract aspect of God, like wisdom. He is God. He is God incarnating in the world that He himself has made.”

Joaquin further observes that “[the Gospel of] John is written to the Greeks and put into words that they could understand.” It seems there’s a mystery buried in here somewhere. Consider: Socrates was the teacher of Plato, who was the teacher of Aristotle, who was the teacher of Alexander — and Alexander spread Greek culture throughout Eurasia, the Middle East, and the Indian subcontinent. Add to this the fact that the great evangelist, St. Paul, had some difficulty converting the Jews to the Christian faith; but he converted the Greeks in droves. Not only St. John, but also St. Paul speaks in terms the Greek mind could readily grasp, as when he says God is He “in Whom we live and move and have our being.” These historical connections do not appear to be accidental, coincidental, or incidental to the spread of the early Christian Church.

According to The Catholic Encyclopedia, the Greeks strongly responded to Christianity for its moral beauty as well as its truth. A case in point is St. Justin Martyr. He was a man of Greek culture, born in Palestinian Syria about the year 100 A.D., who converted to the faith around 130 A.D. Justin became one of Christianity’s earliest and most powerful apologists, and ended up condemned by the Roman authority for refusing to sacrifice to the pagan gods, for which offense he was summarily executed by the Imperium, along with several other of his “refusenik” co-religionists. The official record of their martyrdom is extant:

“The Prefect Rusticus says: Approach and sacrifice, all of you, to the gods. Justin says: No one in his right mind gives up piety for impiety. The Prefect Rusticus says: If you do not obey, you will be tortured without mercy. Justin replies: That is our desire, to be tortured for Our Lord Jesus, and so to be saved, for that will give us salvation and firm confidence at the more terrible universal tribunal of Our Lord and Saviour. And all the martyrs said: Do as you wish; for we are Christians, and we do not sacrifice to idols. The Prefect Rusticus read the sentence: Those who do not wish to sacrifice to the gods and to obey the emperor will be scourged and beheaded according to the laws. The holy martyrs glorifying God betook themselves to the customary place, where they were beheaded and consummated their martyrdom confessing their Saviour.”

Jules Lebreton writes (at the entry for St. Justin Martyr in Cath. Encycl.) “Justin tries to trace a real bond between philosophy and Christianity: according to him, both one and the other have a part in the Logos, partially disseminated among men and wholly manifest in Jesus Christ.”

Yet for all their apparent similarities, there is a profound difference between the Platonic insight and the Christian one, and it pertains to the relations between God and man.

Both Plato and Justin proclaim the transcendent God. Yet for Plato, God is so “beyond” as to be almost beyond human grasp. Still, Plato felt the “divine pulls” in his own nature. These pulls, Plato thought, could be accounted for and articulated by an act of pure unaided intellect, that is, by nous, in a state of intense contemplation.

Contrast this position with Justin Martyr’s, who insisted that human wisdom was impossible without the testimony of the Prophets (whom God himself had informed and instructed) and the action of the Holy Spirit. For Plato, man’s relations with God consist of operations of the mind. For Justin, they are operations of the heart, of the Spirit. For Justin, God is not a mental abstraction: He is real Personality with whom one can have direct personal relations, in the Spirit.

A later writer, John Scotus Eriugena (ninth century), elaborates Justin’s position, in the process noting that there is a “downward tendency” of the soul towards the conditions of animal existence, and that this has only one remedy: Divine grace, the free gift of the Holy Spirit. “By means of this heavenly gift,” writes William Turner [at the entry for Scotus in the Catholic Encyclopedia], “man is enabled to rise superior to the needs of the sensuous body, to place the demands of reason above those of bodily appetite, and from reason to ascend through contemplation to ideas, and thence by intuition to God Himself.”

The pull of animal nature is an idea we also find in Plato, and also the countervailing pull from the divine Beyond. Man lives in the metaxy, in the “in-between reality” constituted by the two. Man’s task is to resolve this tension, and establish the proper balance that expresses the highest and best development of his human nature. But man must do this entirely by himself by means of nous or reason. There is no spiritual help “extra” to the human psyche available to facilitate this process.

In contrast, as Lebreton points out, Justin Martyr

“…admits that the soul can naturally comprehend what God is, just as it understands that virtue is beautiful … but he denies that the soul without the assistance of the Holy Ghost [Spirit] can see God or contemplate him directly through ecstasy, as the Platonic philosophers contended. And yet this knowledge of God is necessary for us: ‘We cannot know God as we know music, arithmetic, or astronomy’; it is necessary for us to know God not with an abstract knowledge but as we know any person with whom we have relations. The problem which it seems impossible to solve is settled by revelation; God has spoken directly to the Prophets, who in their turn have made Him known to us…. It is the first time in Christian theology that we find so concise an explanation of the difference that separates Christian revelation from human speculation.” [Emphasis added]

* * * * * *

Natural Law, Contingency, and the Scientific Method
The Platonic model encourages us to recognize that the universe is zoon empsychon ennoun, a living creature endowed with soul and intelligence. The myth of the Demiurge describes the world process as a type of incarnation, a dynamic relation of absolute being and contingent becoming evolving in space and time in a manner expressing a perduring taxis–ataxia relation. The Cosmos itself — the totality of all existing things — like its constituents, for example man and even the stars, is an eikon of being-in-becoming, a reflection or image of the divine Idea. Time itself is but a “moving image of eternity.” The life of the cosmos is wholly dependent, contingent on the Idea from which it manifests.

It is a lawful, orderly universe, yet one in which new occurrences are always arising. These new events are coming from, as it were, a “sea of contingency” analogous to Plato’s conception of Space, that is Chora — the infinite field of unformed, pure potentiality.

The immediately foregoing ideas, of course, are not scientific ones strictly speaking. Still, there are elements here that perhaps science would do well to consider, in order to maintain the integrity of its own method. For one thing, science itself, in its disclosure of the regularities of nature, seems to have an in-built tendency to overlook contingency. We may define an event as contingent if a description of it is neither self-evident nor necessary, “if it could have happened differently,” as Ted Peters puts it in his Preface to Pannenberg’s Toward a Theology of Nature.

C. A. Dubray writes [“Teleology,” Cath. Encycl.], “The fact that the world is governed by laws, far from giving any support to the mechanistic conception, is rather opposed to it. A law is not a cause, but the expression of the constant manner in which causes produce their effects.” In other words, natural laws are expressions of observable regularities that occur in the world of existent phenomena in ordinary space-time reality. Thus, the laws themselves have no force as “causes”: they are descriptions.

Yet the focus on regularity inevitably masks the particularity and contingency of unique events. As Ted Peters notes, it is here that “we run into a problem of focus in the scientific community, because virtually all the theoretical attention is given to the regularity of nature’s laws, while the contingency of natural events slips into the nearly invisible background.” Peters continues:

“What researchers concentrate on are the uniformities that can be expressed in timeless equations. A dictionary of equations describing these uniformities allegedly constitutes scientific knowledge…. A closer examination, however, reveals that the applicability of these equations to concrete cases of natural processes requires certain initial and marginal conditions, conditions that in every case are contingent. Only when contingent conditions permit can we expect a natural law to operate as expected.”
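Peters’ point can be made concrete with a small sketch of my own (nothing here is drawn from the essay): the “timeless” equation of free fall is identical in every application, yet which trajectory actually occurs depends wholly on contingent initial conditions supplied from outside the law.

```python
# A hypothetical illustration of Peters' point: Galileo's law of free fall
# is a "timeless equation," but applying it to any concrete case requires
# contingent initial conditions that the law itself cannot supply.

def position(x0, v0, t, g=9.81):
    """Height at time t under uniform gravity, given initial
    height x0 and initial velocity v0 (the contingent inputs)."""
    return x0 + v0 * t - 0.5 * g * t**2

# One and the same law; two different contingent histories.
trajectory_a = [position(100.0, 0.0, t) for t in (0, 1, 2)]
trajectory_b = [position(100.0, 5.0, t) for t in (0, 1, 2)]

assert trajectory_a != trajectory_b
```

The equation carries no history of its own; the difference between the two trajectories comes entirely from the initial conditions, which, in Peters’ phrase, “could have happened differently.”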

To the extent that the scientific method of inquiry is premised on an “If/Then” logical construction — which seems ever to be the case — the method itself is an exercise in contingency, yet nonetheless one in which “Determinacy gets thematized, whereas contingency gets ignored.” Arguably this is a serious bias with epistemological implications; e.g., “if the laws of classical dynamics are in principle temporally reversible, the actual course of natural events from which those laws have been abstracted is not. The reality of nature is first and foremost a historical reality.”

Pannenberg suggests a corrective for this “bias,” acknowledging: “That modern science so easily lends itself to abuse cannot be prevented in principle. It is one of the risks involved in the abstract study of regularities that either are inherent in nature itself or can be imposed on natural processes [e.g., as in ideological, technical, or engineering solutions]. This risk cannot be met on the level of scientific description itself but must be met first on the level of philosophical reflection on the work of science. It is on this level that the abstract form of scientific description must be considered with special attention to what it is ‘abstracted from’ and what is methodically disregarded in the abstract formulas of science.”

And so contingent conditions — i.e., initial and boundary conditions — must be restored to their proper place in our deliberations, for they “are required for any formula of natural law to be applied. They are contingent at least in that they cannot be derived from the particular formula of law under consideration.… The mathematical formula of a natural law may be valid without regard to time. The physical regularity that is described by such a formula is not independent of time and temporal sequence. But it is only that physical regularity which makes the mathematical formula a law of nature. This suggests that the laws of nature are not eternal or atemporal because the fields of their application, the regularities of natural processes, originate in the course of time. Thus it also becomes understandable that new patterns of regularity emerging in the sequence of time constitute a field of application for a new set of natural laws….”

We may recognize that the total process of natural events presents itself to observation as a mesh of contingency and regularities. It is the task of science to pursue thematically the aspect of regularity. But, asks Pannenberg, can science “ever succeed in bringing into view the entirety of nature as determined in all details by a number of laws that are in any case not infinitely complex? This would mean at the same time that a stage of research is conceivable from which nothing more could be discovered. Many natural scientists have had this nightmare because of the successes of their own research. Fortunately it probably is not a truthful dream.”

For, says Pannenberg, “laws always uncover what is necessary superimposed on what is contingent. Given the undeniable contingency of occurrences in natural events, can we recognize in their special character as occurrences … [that] regularity as their own element in such a way that the presence of regularity can be thought together with the contingency of occurrences, not only under abstraction from the contingency of occurrences?” [Emphasis added]

Which is why Pannenberg advocates an opening up of new viewpoints in scientific research, “not because physical hypotheses or insights can be derived from them but because they open up and enlarge the intellectual space on which the formation of physical hypotheses depends…. In physics also, horizons of questioning have to be opened up first of all in order that hypotheses that arise in them can be examined by experiment and classified theoretically.”

Perhaps we need a greater appreciation of the “fitness” of the scientific method to engage the truly great questions of life, which ever seem to involve the relations of law and contingency. Leibniz propounds two great questions of perennial interest to the human mind: (1) Why are things the way they are and not some other way? (2) Why does anything exist at all?

Such questions, scientists will readily tell you, are beyond the purview of the scientific method. But does that mean such questions have no force or meaning such that they should not be asked at all?

Perhaps the incapability of the scientific method to answer such questions owes to the fact that all the great physical laws are acknowledged to be time-reversible; but we know that existence in space and time is not a time-reversible process. As Pannenberg states, it is a historical process. We might even say it is an evolutionary process.

Which suggests an analogy that might enlighten these questions, sharpen their meanings, and suggest additional questions: an analogy to direct human experience. Pannenberg writes of human beings, who do seem to live in a “time-irreversible,” that is “historical” process:

“Human beings never live only in the now. Rather, they experience their present as heirs of the past and as its active change. They anticipate the future in fear, hope, and planning; and in the light of such anticipation of the future they return to their present and the heritage of their past. The fact that we know of historical continuity is at least also conditioned by this peculiarity of human experience with time. If there is a new event, then it modifies the context of our consciousness of time which is already found present. It throws light back on earlier occurrences which have become a part of our experience already. In the same way, ideas that occur to us throw light on our previous expectations and plans in justifying, fulfilling, modifying, or disappointing and thwarting them. Thus the contingent event always enters already into a context of experience or tradition…. The future, beginning in the present happenings, is thus the origin of the perspective in which the past occurrences are put by every new experience.”

Worldviews and Paradigm Shifts
It is perhaps a truism that we tend to find what we’re looking for, screening out any and all potential elements which do not fit the pattern of our expectation. Arguably, the scientific method is inherently exposed to danger from this side, as suggested in the above remarks. Indeed, Schrödinger’s wavefunction theory seems to predict this. Consider these remarks from Stephen M. Barr [Modern Physics and Ancient Faith, 2003]:

“In quantum theory, as traditionally formulated, there are ‘systems’ and ‘observers.’ Or rather, in any particular case, there is the system and the observer. The observer makes measurements of the system. As long as the system is undisturbed by external influences (that is, as long as it is ‘isolated’), its wavefunction — which is to say its probability amplitudes — will evolve in time by the Schrödinger equation…. However, when a measurement is made of the system the observer must obtain a definite outcome. Suddenly, the probability for the outcome that is actually obtained is no longer what the mathematics said it was just before the measurement, but jumps to 100 percent. And the probabilities for all the alternative outcomes, the ones that did not occur, fall to 0 percent.”
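Barr's account of measurement can be caricatured in a few lines of code. This is a toy sketch, not real quantum mechanics: before "measurement" there is a probability distribution over outcomes; at measurement, one outcome's probability jumps to 1 and the rest fall to 0. The function name and the spin labels are illustrative inventions, not anything from Barr.

```python
import random

def measure(probabilities):
    """Pick one outcome according to its probability, then 'collapse':
    the chosen outcome's probability jumps to 1.0, all others to 0.0."""
    outcomes = list(probabilities)
    weights = [probabilities[o] for o in outcomes]
    chosen = random.choices(outcomes, weights=weights)[0]
    return {o: (1.0 if o == chosen else 0.0) for o in outcomes}

# Before measurement: the (toy) wavefunction assigns probabilities.
before = {"spin up": 0.5, "spin down": 0.5}
after = measure(before)

# Exactly one outcome now has probability 1.0; the rest are 0.0.
assert sum(after.values()) == 1.0
assert sorted(after.values()) == [0.0, 1.0]
```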

Thus we might say that the “reality” we humans experience ever involves “a moving goal-post.” And as the mover of this goal-post, the human agent is most indispensably involved in this process.

Faced with such “indeterminacy” regarding the foundations of experience, it is not surprising that people usually have recourse to mediating worldviews, or organized frames of ideational reality that constitute the conceptual space in which active experience is engaged and accordingly analyzed and interpreted. Certainly Plato has offered such a model. And so has Nobel laureate Jacques Monod [in Chance and Necessity, 1971]:

“Chance alone is the source of every innovation, of all creation in the biosphere. Pure chance, absolutely free but blind, is at the very root of the stupendous edifice of evolution. The central concept of biology … is today the sole conceivable hypothesis, the only one compatible with observed and tested fact. All forms of life are the product of chance….”

Needless to say, these two models are polar opposite conceptualizations. Yet having received each on “good authority,” which do we choose?

Such are not idle considerations; for as James Hannam points out [“The Development of Scientific and Religious Ideas,” 2003], “grand theories … often suffer death by detail where it is found that up close the situation is too complicated for the theory to handle…. [Yet] in the end, after it has changed the course of the river of enquiry, the theory can end up as a mortlake cut off from the general flow….”

Hannam cites historian Thomas Kuhn, who documents an historical process he terms “paradigm shift,” describing a situation in which the findings of authoritative science move “out of science and into practically every other field of human endeavor.” Once a given, albeit partial or even defective theory becomes “dominant,” writes Hannam, “far from being thrown out, a falsified theory is enhanced to deal with new information until such time as it finally collapses under the weight of anomalous results. Then, after a chaotic period, a new theory emerges that can deal with the anomalies and normal service resumes…. A paradigm refers to but one field, say classical mechanics or health policy whereas the ideology/worldview is the general background that underpins all the paradigms.”

The worldview (or ideology, if you prefer), for better or worse, implicitly shapes the background knowledge of thinking agents to which new experiences constantly are being conformed. Hannam says that worldview “is often so deeply embedded in the psyche that it is very rarely considered explicitly except by specialists,” but that nonetheless, “the worldview is seen as [a] self-confirming fact of life and hence it is not strictly rational…. The existence of a dominant worldview does not mean that a particular individual is unable to think outside the box but rather that his ideas are unlikely to fall on fertile ground. Unless new ideas can be stated in a language that makes them comprehensible to his peers, his intention in writing will not be met.”

Which is the not-too-subtle way to put the fact that every man has a worldview, without exception, whether articulate or inarticulate; and that somehow, for the “intention of writing to be met” — that is, for accurate and meaningful (i.e., successful) communication of ideas to take place — some deeper, common ground of shared truth must first be accessed, for the purpose of providing a more capacious intellectual space in which the human pursuit of knowledge and wisdom might unfold or evolve from its present point of attainment.

But where today in our modern world is such a common ground or field to be found? Hannam proposes the examination of the history of ideas as a possibly useful method in the search for common ground. He writes,

“To examine the history of ideas the only fair way to proceed would seem to place before ourselves the evidence and authority that the historical agents had before them and assume they acted rationally on that basis. Otherwise, there is no hope of ever tracing intellectual development because ‘cause and effect’ assumes some sort of logical causality that is impossible with non-rational agents. The best that could be hoped for would be a catalog of mental positions, with no way to say how one led to another except by being pushed by blind exterior forces. This might be precisely what determinists are advocating but they would have to give up any hope of finding causes and restrict themselves to explanations.”

Perhaps we moderns would do well to reconsider the common assumption that people living before our own time were somehow inferior in knowledge, experience, and observational powers as compared with our own status as enlightened individuals. Arguably, the ancient world produced some of the most powerful thinkers in the history of mankind, formulating ideas that were, in the words of Hannam, “the fruits of unfettered metaphysical speculation that inevitably hits on the right answer occasionally.”

Democritus, for example, proposed a theory predicting the atom as the ultimate constituent of matter, more than two thousand years before the technical means existed to isolate atoms experimentally or, as Hannam notes, any “useful applications for them” could be found. Then it was discovered that the atom itself is an ordered constellation of even finer parts. There seems to be an historical progression of ideas here, the new building on a framework originally laid down in the past, modifying it, improving on it in light of new insights and technical capabilities.

Hannam gives another example of more recent vintage: “Copernicus needed Nicole Oresme’s solution as to why we do not feel the movement of the Earth even though in Oresme’s time it was just a curiosity as no one thought the Earth actually was moving … each new idea, once accepted, shifts the boundaries of the worldview and makes it possible for further new ideas to be accepted into the pale.”

We can extend the examples even further. Riemann constructed a geometry, apparently because his mind could grasp the logic and beauty it revealed for its own sake. But at the time, it had no apparent “external referent” in the field of nature. It was a beautiful and glorious abstraction — until Einstein came along, and picked it up “off the shelf” as it were, to become the very language of relativity theory.

Thus it might be said that the evolution or “progress” of science depends on successive enlargements of the conceptual space it requires to do its work. In other words, science inherently is a participation in the historicity of the world.

Whatever our personal worldview, perhaps it would be well to recall that science is an historical process. Perhaps this understanding could open up additional, needed conceptual space that science itself requires in order to advance.


TOPICS: Philosophy
KEYWORDS: aquinas; augustine; christianity; churchhistory; contingency; cosmology; epistemology; justinmartyr; metaphysics; newton; ontology; plato; quantumfieldtheory; relativitytheory; schroedinger; spacetime; theology
To: Doctor Stochastic; PatrickHenry; betty boop; tortoise; marron; cornelis; StJacques
Thank you for your replies!

"Message capacity" is descriptive and more nearly accurately captures the concept. It's a bit bulky. "Messcap" sounds like an article of clothing though.

Another possibility is just to use "Anzeigekapazität" or "Anzkap" (keeping with the German tendency to abbreviate) rather than "information." The French could put it on the front of a camera as l'Anzkap.

LOLOL! But, er, if we combine or concatenate the terms message and capacity (channel capacity) - we would be intermingling two concepts of Shannon-Weaver and omitting encoding/decoding and the communication itself, the decrease of entropy in the receiver.

461 posted on 01/04/2005 8:31:14 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 451 | View Replies]

To: PatrickHenry; Doctor Stochastic; betty boop; StJacques; tortoise; cornelis; marron
Thank you for your reply!

Think about the term "exchange" (which is used in physics) or maybe "transfer" instead of communication. Think about "status" or "condition" instead of information content. I donno. The whole field reeks with sloppy, and thus potentially misleading terminology. Gives me a brain-ache.

LOLOLOL! But, er, Shannon's theory is titled A Mathematical Theory of Communication. So far we have been using terms from his theory very precisely.

IMHO, if we go with common words such as exchange or transfer there is a risk of additional confusion. If we fabricate new terms (like BioComm, which I suggested at post 448, or some German moniker like Doctor Stochastic suggested at post 453, to equate to "information" [Shannon: the reduction of uncertainty in the receiver]) - then we are free to define it. IOW, Lurkers won't read the term thinking they already know what it means.

462 posted on 01/04/2005 8:43:12 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 452 | View Replies]

To: betty boop
Thank you so very much for your reply!

Just a sketch: the universe evolves as "a population of one"; what we typically think of as "Darwinist theory" is helpless before this postulate. Nothing within the Darwinist purview can come remotely close to explaining such a conception.

The new theory you have found concerning the role of entropy in living systems sounds wonderfully engaging. I’m looking forward to hearing more about it. I hope you will bring it to bear right away!

This discussion is yours, betty boop, not mine. I’ve only been trying to stir the abiogenesis sidebar to the issues which are at the root of the difference between life and non-life so that we can have a more thorough analysis “on the record” so to speak.

Entropy is on the table – both information entropy and thermodynamic entropy – and how, in molecular machines, the reduction of entropy in the receiver increases thermodynamic entropy in the local surroundings. As usual, you are evidently in the big picture of the forest while I am still standing at a tree with a magnifying glass. So, please, do tell us more!

I’m so glad you have found the discussion of Kolmogorov complexity and negative entropy to be helpful! Your enthusiasm over new directions is contagious. I’ll ever appreciate the lead you gave to the possibility of additional temporal dimensions!

May God always bless you, too, my dear sister. And I join with you in asking for a blessing for all who are reading this discussion.

463 posted on 01/04/2005 8:58:38 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 458 | View Replies]

To: betty boop

You're welcome!


464 posted on 01/04/2005 9:11:35 PM PST by Matchett-PI (Today's DemocRATS are either religious moral relativists, libertines or anarchists.)
[ Post Reply | Private Reply | To 460 | View Replies]

To: betty boop
And also for earlier pointing out that "negative entropy" is a no-go term. Apparently, entropy is always positive, in that its "natural habit," so to speak, is to increase and spread. To say it can have a "negative" value would seem to take us clear out of the second law of thermodynamics.

The problem with entropy being negative has more to do with it being a length or "magnitude", as in a vector. It no more makes sense to have negative entropy than a length of negative three inches. There are quite a few measures like this.

I actually tried to think about negative entropy from a proper mathematical perspective and my brain core dumped. Please excuse me while I locate all the pieces.

465 posted on 01/04/2005 9:15:00 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
[ Post Reply | Private Reply | To 458 | View Replies]

To: tortoise

However, one can have negative temperatures (statistical mechanics version). They do not lead to negative entropies though.


466 posted on 01/04/2005 9:17:46 PM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)
[ Post Reply | Private Reply | To 465 | View Replies]

To: betty boop
And I think PatrickHenry is right to point out that thermodynamic entropy most closely bears on our problem of trying to figure out the nature and source of biological "information."

Yes! This is the direction it needs to be approached from, a transaction theoretic view. I think most people chasing these threads "get" static Shannon information at this point but the missing piece, and arguably the more important piece, is the notion of transactional information. Doubly so in effectively finite systems. It is not an easy concept, and I have never really seen it successfully communicated to people that did not already figure it out the hard way (my own best efforts included).

As I said in a prior post, once you figure out why some arbitrary system of a certain class must express transactional entropy (like thermodynamics), a huge chunk of theory will fall into place. It just is not very easy to explain, certainly not in a relatively short amount of text, though a background in computational theory helps quite a bit.

467 posted on 01/04/2005 9:25:49 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
[ Post Reply | Private Reply | To 458 | View Replies]

To: Doctor Stochastic
However, one can have negative temperatures (statistical mechanics version).

Yeah, but this is a general feature of statistical models of a lot of things -- you can "borrow" locally into some strange negatives as long as it disappears in the long-term integration. Which is kind of neat in some ways.

Me, I'm stuck in the computational theoretic view. The only statistical models we like are Bayesian (zzzzz....).

468 posted on 01/04/2005 9:38:01 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
[ Post Reply | Private Reply | To 466 | View Replies]

To: tortoise

These negative temperatures don't disappear though. They're real. Just write the partition function: Nj = exp(-Ej/kT), where Nj is the number of elements with energy Ej, T is the temperature, and k is Boltzmann's constant. Then if there are more elements in an excited state than in a lower state, the temperature is negative. This happens in a laser. (Continuous energies work the same way; let Nj be a density.)

Now I guess I'll have to look at what the (informational) entropy of such a distribution looks like. (Probably nothing special; after all, lasers work as expected.)

Temperature is a weird quantity sometimes. I once worked with mono-energetic molecular beams (all the molecules moved at essentially the same velocity, with only a very narrow spread.) Two such beams could be crossed to get reactions. What is the "temperature" of each beam and what happens in the reaction zone?
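The sign flip Doctor Stochastic describes can be checked directly by inverting the Boltzmann factor Nj = exp(-Ej/kT) for the temperature. A minimal sketch, in which the two-level system and its energies are made up purely for illustration:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_temperature(n1, n2, e1, e2):
    """Invert N2/N1 = exp(-(E2 - E1)/(k T)) for T.
    n1, n2: populations of the lower and upper levels; energies e1 < e2 in joules."""
    return -(e2 - e1) / (k * math.log(n2 / n1))

e1, e2 = 0.0, 1.0e-20  # two hypothetical energy levels

# Normal occupation (more elements in the lower level): positive temperature.
assert boltzmann_temperature(1000, 100, e1, e2) > 0

# Population inversion, as in a laser: more in the excited state, so T < 0.
assert boltzmann_temperature(100, 1000, e1, e2) < 0
```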


469 posted on 01/04/2005 10:04:40 PM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)
[ Post Reply | Private Reply | To 468 | View Replies]

To: tortoise; betty boop; Doctor Stochastic; PatrickHenry; cornelis; marron; StJacques
Thank you so much for your reply! I sure wish you had more time to contribute to the discussion since you are a wealth of insight into Kolmogorov complexity, algorithmic information theory and so much more.

Molecular biology and molecular machinery are well outside the parametric space in which the Shannon-Weaver assumptions give good results.

A “drive-by” statement like this cannot be useful to us without your participation. If you seek to erase the entire chalkboard of discussion concerning Shannon-Weaver then you must replace it with something more than a few words, e.g. “algorithmic information theory”.

For instance, the object of this discussion is to analyze abiogenesis. Our first step was to ascertain the difference between life and non-life, e.g. a live skin cell vs. a dead skin cell. We have concluded the difference is information (successful communication, paraphrased from Shannon). The dead skin cell is no longer communicating.

If you wish to erase Shannon-Weaver then the question must be answered from the replacement “generalized theory” – what is the difference between life and non-life?

Cutting a single system into quasi-independent elements is probably the single most common engineering idealization, but one has to understand the limits of that idealization. A very common example of this is the distinction between "program" and "data" that is pervasive in computer science, a widely accepted distinction which has no theoretical basis and only exists for engineering convenience (and which will get you into trouble in some algorithm spaces).

I understand what you mean by the false distinction between “program” and “data” – however it is necessary to make a distinction between operative and inoperative bits. That is very much like the distinction we have here between the reduction of entropy in the receiver and the message.

DNA should not be treated as a "message" in a molecular system without making some assumptions that I do not think apply here, nor is it an element that can be legitimately treated as independent of the rest of the molecular machinery. The "message" that materializes is completely dependent on the context of the rest of the system/machinery, and the lack of functional independence for practical matters in real molecular systems makes the application of Shannon-Weaver doubtful. A simple example of this is that there are single machine code sequences for computers that are valid (and different) programs on wildly different machine architectures. The entire system determines the algorithm, not just some inextricable element that is arbitrarily determined to be "the program" or message.

Shannon-Weaver does not at all concern itself with the meaning or value of the message. It is entirely focused on the mathematics of the communications. The only reason we are concerned with DNA is that we are applying Shannon-Weaver to molecular machines and the message being broadcast is the DNA.

I would be all for going into algorithmic information theory! Years ago I offered a hypothesis on this forum that algorithm at inception (either of the universe or biological life) is proof of intelligent design. A regression of self-organizing complexity leads one to this conclusion.

But unlike you, I do not see any contention – any either/or – between algorithmic information theory and Shannon-Weaver. Shannon-Weaver is simplistic by comparison and only focuses on the communication itself – not even the message (despite your protests to the contrary).

I see algorithmic information theory directly applicable to the message (the DNA) itself, because it does preserve state as you say. It is the template for what I would call Biothought - the rise of complexity (by whatever term we decide) in biological systems.

All these definitions of entropy are identical; they are just distilled rules from an obscure generalization applied at different levels of the system. The transaction theoretic versions (e.g. as would apply to thermodynamics) are pretty esoteric. Algorithmic information theory unifies all the various notions of "entropy" into a single concept that is probably far more confusing than either the static or transactional version in isolation (though very elegant in its own way).

Here we will have to part in disagreement. There are different kinds of entropy.

Clausius invented the term in 1865. Essentially it means "For a closed system, the quantitative measure of the amount of thermal energy not available to do work." This leads to the second law of thermodynamics: “Entropy in a closed system can never decrease.”

But entropy also means disorder: “A measure of disorder or randomness in a closed system.” Boltzmann in 1877 “realised that the entropy of a system may be related to the number of possible ‘microstates’ (microscopic states) consistent with its thermodynamic properties.” [Wikipedia: Entropy] Under Boltzmann, entropy is a function of state.

Feynman described it well: The second law of thermodynamics

Richard Feynman knew there is a difference between the two meanings of entropy. He discussed thermodynamic entropy in the section called "Entropy" of his Lectures on Physics published in 1963 (7), using physical units, joules per degree, and over a dozen equations (vol I section 44-6). He discussed the second meaning of entropy in a different section titled "Order and entropy" (vol I section 46-5) as follows:

So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less.

This is Boltzmann's model again. Notice that Feynman does not use Boltzmann's constant. He assigns no physical units to this kind of entropy, just a number. (A logarithm is a number, without physical units.) And he uses not a single equation in this section of his Lectures.
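Feynman's counting argument is easy to reproduce numerically. Below is a toy sketch under stated assumptions: indistinguishable molecules of one color, one molecule per cell, and cell counts chosen arbitrarily for illustration; "entropy" here is just the logarithm of the number of arrangements, with no physical units, exactly as in the quoted passage.

```python
from math import comb, log

def entropy(ways):
    # Feynman: "the logarithm of that number of ways is the entropy"
    return log(ways)

cells_per_side = 10
black = 6  # number of indistinguishable black molecules (toy model)

# Separated case: the black molecules are confined to one side's 10 cells.
ways_separated = comb(cells_per_side, black)

# Unrestricted case: the black molecules may occupy any of the 20 cells.
ways_free = comb(2 * cells_per_side, black)

assert ways_separated < ways_free
assert entropy(ways_separated) < entropy(ways_free)  # less "disorder"
```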

Notice another thing. The "number of ways" can only be established by first artificially dividing up the space into little volume elements. This is not a small point. In every real physical situation, counting the number of possible arrangements requires an arbitrary parceling. As Peter Coveney and Roger Highfield say (7.5):

There is, however, nothing to tell us how fine the [parceling] should be. Entropies calculated in this way depend on the size-scale decided upon, in direct contradiction with thermodynamics in which entropy changes are fully objective.

Claude Shannon himself seems to be aware of these differences in his famous 1948 paper, "A Mathematical Theory of Communication" (8). With respect to the parceling he writes, "In the continuous case the measurement is relative to the coordinate system. If we change coordinates the entropy will in general change" (Shannon's italics). In the same paper he attaches no physical units to his entropy and never mentions Boltzmann's constant, k. At one point he briefly introduces K, saying tersely, "The constant K merely amounts to a choice of a unit of measure." Shannon never specifies the unit of measure, and except in an appendix, K does not appear again in the 55-page paper.

This sort of entropy is clearly different. Physical units do not pertain to it, and (except in the case of digital information) an arbitrary convention must be imposed before it can be quantified.

Wikipedia: Information Theory

Shannon defined a measure of entropy:

H = -Σ_i p_i log p_i

(where p_i is the probability of symbol i) that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits. If the logarithm in the formula is taken to base 2, then it gives a measure of entropy in bits. Shannon's measure of entropy came to be taken as a measure of the information contained in a message, as opposed to the portion of the message that is strictly determined (hence predictable) by inherent structures, like for instance redundancy in the structure of languages or the statistical properties of a language relating to the frequencies of occurrence of different letter or word pairs, triplets etc.
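Shannon's measure, H = -Σ p_i log2 p_i in bits per symbol, is straightforward to compute. A short sketch (the function name is mine):

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum_i p_i * log2(p_i), in bits per symbol.
    Terms with p = 0 contribute nothing, by convention."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries one full bit of uncertainty per toss...
assert abs(shannon_entropy([0.5, 0.5]) - 1.0) < 1e-12

# ...a certain outcome carries none...
assert shannon_entropy([1.0]) == 0.0

# ...and a biased source carries less than one bit (redundancy).
assert shannon_entropy([0.9, 0.1]) < 1.0
```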

And, returning to the application of Shannon-Weaver to molecular biology:

Pitfalls in Information Theory and Molecular Biology

Using the term "Shannon entropy". Although Shannon himself did this, it was a mistake because it leads to thinking that the thermodynamic entropy is the same as the "Shannon Entropy". There are two extreme classes of error:

"Shannon entropy" is identical to "entropy". This is incorrect because they have different units: bits per symbol and joules per kelvin, respectively.

"Shannon entropy" is entirely unrelated to "entropy". This is incorrect since it is clear that the forms of the equation are similar and differ by a constant.

A better term to use for measuring the state of a set of symbols is "uncertainty". I take the middle road and say that entropy and uncertainty can be related under the condition when the microstates of the system correspond to symbols, as they do for molecular machines. In this case one can write a simple conversion equation. See the paper edmm: Energy Dissipation from Molecular Machines.
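Schneider's "simple conversion equation" is not quoted here, so as a stand-in the standard thermodynamic bound (often credited to Szilard and Landauer) serves to show how bits and joules per kelvin can be related at all: at least kT ln 2 joules must be dissipated to the surroundings per bit of uncertainty removed. A hedged sketch of that bound, not of Schneider's own formula:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def min_joules_per_bit(temperature_kelvin):
    """Minimum energy dissipated to the surroundings per bit of
    uncertainty reduced, at temperature T: E_min = k * T * ln(2)."""
    return k * temperature_kelvin * math.log(2)

# At roughly body temperature (310 K) the bound is tiny but nonzero,
# which is why molecular machines can operate near it.
e = min_joules_per_bit(310.0)
assert 2.9e-21 < e < 3.1e-21  # about 3 zeptojoules per bit
```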

Examples:

Claim of identical: The creationist William Dembski in the book No Free Lunch stated that the two forms are mathematically identical (page 131). Of course just about every sentence of Dembski's work has an error or three ...

Claim of unrelated: In his book The Low-Down on Entropy and Interpretive Thermodynamics (DCW Industries, Inc., 1999, ISBN Number 1-928729-01-0) Stephen J. Kline claimed that the two forms are completely unrelated. Unfortunately he fell into other pitfalls too as he didn't distinguish information and uncertainty.

and…

Entropy is not "disorder"; it is a measure of the dispersal of energy by Dr. Frank L. Lambert. An entropy increase MIGHT lead to disorder (by that I mean the scattering of matter) but then - as in living things - it might not!

How can we relate this idea to molecular information theory? 'Disorder' is the patterns (or mess) left behind after energy dissipates away. The measure Rsequence (the information content of a binding site) is a measure of the residue of energy dissipation left as a pattern in the DNA (by mutation and selection) when a protein binds to DNA. On the other hand, Rfrequency, the information required to find a set of binding sites, corresponds to the decrease of the positional entropy of the protein. To drive this decrease the entropy of the surroundings must increase more by dissipation of energy. After the energy has dissipated out the protein is bound. So the protein bound at the specific genetic control points represents 'ordering'. This concept applies in general to the way life dissipates energy to survive.


470 posted on 01/04/2005 10:10:36 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 467 | View Replies]

To: Alamo-Girl

There is a (weak) connection between informational entropy and thermodynamic entropy, though. One can produce (by freezing) carbon monoxide ice both within a magnetic field and outside such a field. CO is a linear molecule and the molecules tend to line up in such a field. The CO crystals have k ln 2 per molecule more entropy (thermodynamic) if frozen in a field-free environment. This is the same as the informational entropy added by the randomization due to lack of a field (it's late and I can't figure out how to avoid the double negatives.) With a field, no randomness; without a field, random (higher informational entropy) ordering.
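The textbook figure for the residual entropy of CO ice is R ln 2 per mole, since each frozen molecule can point either of two ways (CO or OC); informationally that is exactly one bit per molecule, which is the connection being described. A quick check of those numbers:

```python
import math

k = 1.380649e-23     # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Two orientations per molecule: thermodynamic residual entropy per mole
# is S = N_A * k * ln(2) = R ln 2.
residual_entropy_per_mole = N_A * k * math.log(2)

# The same randomness expressed informationally: one bit per molecule.
bits_per_molecule = math.log2(2)

assert abs(residual_entropy_per_mole - 5.76) < 0.01  # ~5.76 J/(K*mol)
assert bits_per_molecule == 1.0
```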


471 posted on 01/04/2005 10:22:14 PM PST by Doctor Stochastic (Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)
[ Post Reply | Private Reply | To 470 | View Replies]

To: Alamo-Girl
I understand what you mean by the false distinction between “program” and “data” – however it is necessary to make a distinction between operative and inoperative bits.

What on earth is an "inoperative" bit? That sounds like a model simplification/idealization, not something that has theoretical distinction. It is either a part of the system or it isn't. And the nasty truth that we are always trying to drive out of our engineering models is that there is only ONE system, and everything is a part of it.

472 posted on 01/04/2005 10:31:56 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
[ Post Reply | Private Reply | To 470 | View Replies]

To: Doctor Stochastic; tortoise; betty boop; PatrickHenry; marron; cornelis
Great catch, Doctor Stochastic! Thank you!

We're getting into a long sidebar here trying to negotiate some terms, which is a very, very good thing because we are truly communicating with one another.

But for the Lurkers' sakes, I do hope we can come up with some language before everyone starts throwing rocks and sticks at their monitors. LOL!

German is o.k. with me or perhaps someone here speaks a dead language?

473 posted on 01/04/2005 10:35:29 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 471 | View Replies]

To: tortoise
Thank you so much for your reply!

I'm with you in wanting to look at the whole of "information theory and molecular biology" as a single transaction. That is the point of the term "information" as Shannon used it in his theory - it is all about the act of reducing uncertainty in the receiver, not about a gain of information content as such. The value or meaning of the message is outside the scope of Shannon. (Though it becomes relevant in our discussion of abiogenesis.)

With regard to "inoperative bits" - in the Shannon model, that would be noise. To apply "information theory and molecular biology" to evolution, noise would roughly equate to "random mutations". The distinction is between that which was encoded at the source and noise in the channel - both factor into the reduction of uncertainty in the receiver, if not filtered out by decoding, etc.

474 posted on 01/04/2005 10:44:09 PM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 472 | View Replies]

To: Alamo-Girl
With regard to "inoperative bits" - in the Shannon model, that would be noise. To apply "information theory and molecular biology" to evolution, noise would roughly equate to "random mutations".

Ah, that is a distinction between newer, more transaction-oriented models, and the old Shannon model. There is no such thing as "noise" (in a classical information theoretic sense), which is one of the major shifts in the theory over time. "Noise" is a term that reflects our inability to predict the behavior of the system in some context, but in no way reflects on the fundamental predictability of the system. (ObEx: RC4 stream generators are often used to create perfect channel noise in a Shannon sense, but if you included the 258 bytes of internal state that actually generates the "noise" in your system model, it would no longer be "noise".) This kind of sounds like a distinction without a difference as a practical matter, but it is important for some purposes.
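The RC4 point is easy to demonstrate. Below is a minimal Python sketch of the RC4 keystream generator (a toy for illustration, not for cryptographic use): the output passes as channel "noise", yet it is completely determined by the 256-byte permutation plus two index bytes of internal state.

```python
def rc4_keystream(key, n):
    """Generate n bytes of RC4 keystream from a byte-string key.
    The 256-byte permutation S plus the indices i, j ARE the internal
    state; given that state, the 'noise' is fully determined."""
    # Key-scheduling algorithm (KSA): mix the key into the permutation.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): emit the keystream.
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# Same key (i.e. same internal state) -> identical "random" stream.
assert rc4_keystream(b"Key", 8) == rc4_keystream(b"Key", 8)
```

Run it twice with the same key and the "noise" repeats exactly; the randomness was never in the stream, only in the observer's ignorance of the state.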

Where it gets fuzzy is that there is nothing "random" about any mutation in such a system even though our inability to make sense of it leads us to describe it as such. There is no uncertainty in any kind of absolute sense, only uncertainty (i.e. very poor predictive limits) in some subcontext, such as human experience. The mutations are mechanistic, like any other piece of machinery, and it is our decision whether or not to include every part of the machinery in our idealized model.

Uncertainty in communication is a bit of a fiction, in that any apparent "uncertainty" is a property of the observer rather than a property of the system. It is a simple thing to define a system in such a way that there is apparent uncertainty, but a view with no uncertainty exists at the same time whether we can see it or not. Uncertainty is created by how we choose to model the system; if we were smarter, the uncertainty would disappear. In that sense, the uncertainty exists in our own mind but not in the channel. The "noise" we ascribe to the channel does not really exist in the channel, it exists in the models of the "channel" we build in our minds.

475 posted on 01/04/2005 11:28:02 PM PST by tortoise (All these moments lost in time, like tears in the rain.)
[ Post Reply | Private Reply | To 474 | View Replies]

To: tortoise; betty boop; Doctor Stochastic; PatrickHenry; marron; cornelis; StJacques
Thank you oh so very much for your reply and your insight!!!

Actually, Shannon pretty much says the same thing about noise not necessarily being random. Here’s an article from a recent issue of Astrobiology magazine (emphasis mine):

Space Daily: Earthlings' Low Signal-To-Noise?

If ET ever phones home, chances are Earthlings wouldn't recognize the call as anything other than random noise or a star. New research shows that highly efficient electromagnetic transmissions from our neighbors in space would resemble the thermal radiation emitted by stars.

University of Michigan physicist Mark Newman, along with biologist Michael Lachmann and computer scientist Cristopher Moore, has extended the pioneering 1940s research of Claude Shannon to electromagnetic transmissions in a paper published last month in the American Journal of Physics called, "The Physical Limits of Communication, or Why any sufficiently advanced technology is indistinguishable from noise."

Lachmann is at the Max Planck Institute in Leipzig, Germany; Moore is at the University of New Mexico in Albuquerque. Their title echoes the well-known characterization by Sir Arthur C. Clarke that any sufficiently advanced technology is indistinguishable from magic.

Shannon showed that a message transmitted with optimal efficiency is indistinguishable from random noise to a receiver unfamiliar with the language in the message.

For example, an e-mail message whose first few letters are AAAAA contains little information because the reader can easily guess what probably comes next - another A.

The message is totally non-random. On the other hand, a message beginning with a sequence of letters like RPLUOFQX contains a lot of information because you cannot easily guess the next letter.

Paradoxically, however, the same message could just be a random jumble of letters containing no information at all; if you don't know the code used for the message you can't tell the difference between an information-rich message and a random jumble of letters.

Newman and his collaborators have shown that a similar result holds true for radio waves.

When electromagnetic waves are used as the transmission medium, the most information efficient format for a message is indistinguishable from ordinary thermal radiation - the same kind of radio waves that are emitted by hot bodies like stars.

In other words, an efficiently coded radio message coming from outer space would look no different from a normal star in the sky.

Tom Schneider (the cancer research guy I keep quoting on information theory and molecular biology) identifies noise as the root of Darwinian evolution [random mutation + natural selection > species] in his article: Evolution of Biological Information

For me, it is a “gimme” to the evolutionists in information theory and molecular biology. It only works if the noise is random - and, as y'all say, how can one know if it is random? So on that issue, I agree with you and the astrobiologists (and Shannon) - and not Schneider.

The presumption that noise is random skews the conclusion towards evolution and away from intelligent design. But there is no unbiased basis for that presumption.

Uncertainty in communication is a bit of a fiction, in that any apparent "uncertainty" is a property of the observer rather than a property of the system.

Thus we are struggling over a glossary! Shannon called it entropy. Schneider protests and says it ought to be called uncertainty. As far as I’m concerned, we could call it quifnix as long as we all know what we are talking about.

Perhaps this description will help us arrive at a term and definition for the glossary:

bio-net info theory FAQS

I'm Confused: How Could Information Equal Entropy?

If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction, since entropy of a system increases as the system becomes more disordered. So information corresponds to disorder according to this confusion.

If you always take information to be a decrease in uncertainty at the receiver, you will get straightened out:

R = Hbefore - Hafter.

where H is the Shannon uncertainty:

H = - sum (from i = 1 to number of symbols) Pi log2 Pi (bits per symbol)

and Pi is the probability of the ith symbol. If you don't understand this, please refer to "Is There a Quick Introduction to Information Theory Somewhere?".
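[Note to Lurkers: those two formulas are easy to try out directly. A minimal Python sketch; the after-receipt probabilities (0.97 for the sent symbol, 0.01 for each error) are an assumed noise level for illustration, not numbers from Shannon or the FAQ:]

```python
import math

def shannon_H(probs):
    """Shannon uncertainty H = -sum_i P_i log2 P_i, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# BEFORE receiving a DNA symbol: all four bases are equally likely.
H_before = shannon_H([0.25, 0.25, 0.25, 0.25])   # 2.0 bits

# AFTER receiving: noise leaves some residual doubt (assumed error rate).
H_after = shannon_H([0.97, 0.01, 0.01, 0.01])

R = H_before - H_after  # information gained: R = Hbefore - Hafter
print(R)                # a little under 2 bits, because Hafter is not zero
```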

Imagine that we are in communication and that we have agreed on an alphabet. Before I send you a bunch of characters, you are uncertain (Hbefore) as to what I'm about to send. After you receive a character, your uncertainty goes down (to Hafter). Hafter is never zero because of noise in the communication system. Your decrease in uncertainty is the information (R) that you gain.

Since Hbefore and Hafter are state functions, this makes R a function of state. It allows you to lose information (it's called forgetting). You can put information into a computer and then remove it in a cycle.

Many of the statements in the early literature assumed a noiseless channel, so the uncertainty after receipt is zero (Hafter=0). This leads to the SPECIAL CASE where R = Hbefore. But Hbefore is NOT "the uncertainty", it is the uncertainty of the receiver BEFORE RECEIVING THE MESSAGE.

A way to see this is to work out the information in a bunch of DNA binding sites.

Definition of "binding": many proteins stick to certain special spots on DNA to control genes by turning them on or off. The only thing that distinguishes one spot from another spot is the pattern of letters (nucleotide bases) there. How much information is required to define this pattern?

Here is an aligned listing of the binding sites for the cI and cro proteins of the bacteriophage (i.e., virus) named lambda:

[note to Lurkers: I cannot seem to get the listing to align in my preview of this post, so please refer back to the link to read the example]

Each horizontal line represents a DNA sequence, starting with the 5' end on the left, and proceeding to the 3' end on the right. The first sequence begins with: 5' tgctcag ... and ends with ... tttatgt 3'. Each of these twelve sequences is recognized by the lambda repressor protein (called cI) and also by the lambda cro protein.

What makes these sequences special so that these proteins like to stick to them? Clearly there must be a pattern of some kind.

Read the numbers on the top vertically. This is called a "numbar". Notice that position +7 always has a T (marked with the ^). That is, according to this rather limited data set, one or both of the proteins that bind here always require a T at that spot. Since the frequency of T is 1 and the frequencies of other bases there are 0, H(+7) = 0 bits. But that makes no sense whatsoever! This is a position where the protein requires information to be there.

That is, what is really happening is that the protein has two states. In the BEFORE state, it is somewhere on the DNA, and is able to probe all 4 possible bases. Thus the uncertainty before binding is Hbefore = log2(4) = 2 bits. In the AFTER state, the protein has bound and the uncertainty is lower: Hafter(+7) = 0 bits. The information content, or sequence conservation, of the position is Rsequence(+7) = Hbefore - Hafter = 2 bits. That is a sensible answer. Notice that this gives Rsequence close to zero outside the sites.
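[Note to Lurkers: here is a minimal Python sketch of that per-position calculation (it ignores Schneider's small-sample correction, so treat it as an illustration). The first column is the fully conserved T at position +7; the second is a hypothetical position with no base preference:]

```python
import math
from collections import Counter

def r_sequence_position(column):
    """Rsequence at one aligned position: Hbefore - Hafter, where
    Hbefore = log2(4) = 2 bits (any of a, c, g, t is possible) and
    Hafter is the Shannon uncertainty of the bases seen in the column."""
    n = len(column)
    h_after = -sum(c / n * math.log2(c / n) for c in Counter(column).values())
    return 2.0 - h_after

# Position +7 in the lambda example: all twelve sites show T.
print(r_sequence_position("tttttttttttt"))  # 2.0 bits: fully conserved

# A hypothetical position with no preference contributes nothing.
print(r_sequence_position("acgtacgtacgt"))  # 0.0 bits
```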

If you have uncertainty and information and entropy confused, I don't think you would be able to work through this problem. For one thing, one would get high information OUTSIDE the sites. Some people have published graphs like this.

A nice way to display binding site data so you can see them and grasp their meaning rapidly is by the sequence logo method. The sequence logo for the example above is at http://www.lecb.ncifcrf.gov/~toms/gallery/hawaii.fig1.gif. More information on sequence logos is in the section What are Sequence Logos?

[Note to Lurkers: I couldn't do the listing, but here is the sequence logo gif:]

More information about the theory of BEFORE and AFTER states is given in the papers http://www.lecb.ncifcrf.gov/~toms/paper/nano2 , http://www.lecb.ncifcrf.gov/~toms/paper/ccmm and http://www.lecb.ncifcrf.gov/~toms/paper/edmm.


476 posted on 01/05/2005 8:33:43 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 475 | View Replies]

To: Alamo-Girl

ARRRRRRGHHHH!


477 posted on 01/05/2005 8:39:10 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)
[ Post Reply | Private Reply | To 476 | View Replies]

To: PatrickHenry
?????
478 posted on 01/05/2005 8:42:52 AM PST by Alamo-Girl (Please donate monthly to Free Republic!)
[ Post Reply | Private Reply | To 477 | View Replies]

To: Alamo-Girl

I'm in over my head. But keep up the good work.


479 posted on 01/05/2005 11:38:33 AM PST by PatrickHenry (The List-O-Links for evolution threads is at my freeper homepage.)
[ Post Reply | Private Reply | To 478 | View Replies]




