Archive for the ‘Science And Religion’ Category


Quantum Physics: The Multiverse of Parmenides 2 — Heinrich Pas

July 10, 2014
The bizarre properties of quantum physics naturally inspired the fantasies of both journalists and authors. The parallel existence of different realities in quantum physics, for example, became the subject of a Physics World cover in 1998, which depicts a couple on the phone arguing as follows: “Oh Alice . . . you’re the one for me” — “But Bob . . . in a quantum world . . . How can we be sure?”

Bohr summarized the apparent paradox of particles and waves under the concept of complementarity. After a guest lecture he gave at Moscow University, he left the following aphorism on the blackboard where famous visitors were supposed to leave comments: Contraria non contradictoria sed complementa sunt (Opposites do not contradict but rather complement each other).

But back to Heisenberg, Plato, and the ancient Greeks: As the American philosopher of science Thomas S. Kuhn realized, science in times of scientific revolutions is particularly vulnerable to nonscientific influences. When changes to the scientific paradigm cause a shift in the generally accepted problems and solutions and thus also in the general perception and scientific world view, rational reasons like conformity with facts, consistency, scope, simplicity, and usefulness are not sufficient to understand the evolution of a new theory.

During these times, personal factors such as cultural background can also play a decisive role. And Heisenberg’s background was almost as Greek as it was German: As the son of a professor of Greek language, he became accustomed to Greek philosophy and culture and their reception in early twentieth-century Germany long before he himself learned Latin and ancient Greek in school. His biographer Armin Hermann suggests that the encounter with Plato’s philosophy influenced Heisenberg more than anything else. And not long after Heisenberg studied, climbed, and calculated in Helgoland, Paul Dirac in Cambridge and Erwin Schrodinger in Zurich worked out different but mathematically equivalent versions of quantum physics.

Since the standard interpretation of these works was developed basically in the inner circle around Bohr and Heisenberg, Heisenberg’s background seems particularly relevant for its appreciation. Also, Schrodinger made statements such as “Almost our entire intellectual heritage is of Greek origin” and “science can be correctly characterized as reflecting on the Universe in a Greek way.” And Dirac left on the blackboard in Moscow, right next to Bohr’s principle of complementarity, only the laconic remark, “A physical law has to have mathematical beauty,” a statement that reminds us strongly of Goethe’s transfiguration of the classical worldview:

Nature and art, they seem each other to repel
Yet, they fly together before one is aware;
The antagonism has departed me as well,
And now both of these seem to me equally fair.

And sure enough, quantum physics seems to be a Greek theory after all. This becomes evident when reading the thoughts in the book Die Einheit der Natur (The Unity of Na­ture) by Heisenberg’s student and friend, Carl Friedrich von Weizsacker, the brother of the subsequent German president, on the centerpiece of quantum physics — the wave-particle duality — and how it can be traced back to the arguments in Plato’s dialogue Parmenides.

Parmenides of Elea (Fig. 3.3) was a Greek philosopher of the pre-Socratic era, around the fifth century BCE. Of his writing only the fragment of a philosophical poem remains; it deals with the unity of all being. It describes how an unnamed goddess, often understood as Persephone, invites the poet to perceive the truthful being, again a likely reference to the mystical experience in the mystery cults of Eleusis.

The truthful being then is distinguished from mere appearances and described as the all-embracing One — uncreated and indestructible, alone, complete, immovable, and without an end — reminiscent of Aldous Huxley’s stage of egolessness. One is the All is correspondingly the central statement followed up by Weizsacker when he discusses the argument between Socrates and Parmenides chronicled by Plato, which, according to the Italian author Luciano De Crescenzo, was the “most boring and complicated discussion in the entire history of philosophy.”

In this battle of words, which supposedly took place on the occasion of a visit of Parmenides to Athens, Socrates tried to refute the identity of One and All. To this end Socrates argued that One is not Many and thus has no parts. On the other hand, All refers to something that is not missing any of its parts. Consequently the One would consist of parts if it were All, and thus finally One cannot be the All.

At this point Weizsacker comes to Parmenides’s defense by stressing the connection with quantum mechanics. And it is really astounding how the quantum mechanical interpretation of the One suddenly bestows this incomprehensible debate with lucidity and meaning. After all, in quantum mechanics the All is the wave function and, in its fullest manifestation, the all-embracing wave function of the universe. 

Moreover, in quantum mechanics the analysis of the individual parts of an object without destroying the object is impossible, since the measurement, as explained above, affects the object and thus distorts the unity of its parts. And of all possible states an object can assume, only an infinitesimally small fraction are states in which the parts of the object actually correspond to a clearly defined outcome of a measurement. Only in these states can one truthfully assign reality or existence to these parts.

For example, only two among the infinitely many possible states that Schrodinger’s cat can assume (such as 90 percent alive and 10 percent dead, or 27.3 percent alive and 72.7 percent dead) — namely totally dead or totally alive — correspond to possible outcomes in a measurement. But quantum mechanically, a pair of two cats, half of them dead and the other half alive, is realizable not only with one living and one dead but also with two half-dead cats or one being 70 percent alive and one being 30 percent alive.
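Those percentages can be made concrete with a minimal sketch, assuming an illustrative two-state description of the cat (the labels and numbers here are an illustration, not taken from the text): a “90 percent alive” cat is a superposition whose squared amplitudes give the probabilities of the only two outcomes a measurement can produce.

```python
import math

# Illustrative two-state cat: |psi> = a|alive> + b|dead>.
# The "percentages" are squared amplitudes, so the state must satisfy
# |a|^2 + |b|^2 = 1 (normalization).
a = math.sqrt(0.9)  # amplitude for |alive>
b = math.sqrt(0.1)  # amplitude for |dead>
assert abs(a**2 + b**2 - 1.0) < 1e-12

# The Born rule: a measurement yields only "alive" or "dead",
# with probabilities given by the squared amplitudes.
p_alive = a**2  # 0.9
p_dead = b**2   # 0.1
print(p_alive, p_dead)
```

The superposition itself (90/10, 27.3/72.7, or any other split) is a perfectly valid state; only the measurement outcomes are restricted to the two extremes.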

Consequently, in quantum physics the All is really more than its parts, the partial objects actually constituting through their association a new entity, or, just as postulated by Parmenides, a new unity, a new One. 

Now Parmenides, according to Plato, required further that the One possesses no properties: It has no beginning, no center, and no end, no shape and no location; it is neither in itself nor in anything else; it is neither at rest nor is it moving. Weizsacker can argue that a quantum mechanical object fulfills these requirements perfectly.

After all, a determination of any of these properties relies on a measurement, which implies a collapse of the wave function and thus destroys the unity of the collective object. On the other hand, isolation of the object from the surrounding universe is impossible: The object would not exist in the universe if it were not connected to the universe via some kind of interaction. 

Thus, strictly speaking, only the universe as a whole can constitute a real quantum mechanical object. 

Then, however, nobody would remain who could observe it from outside. Next Weizsacker and Parmenides follow the discussion backward: how the One — meaning the all-embracing universe barring all properties — unfurls into the colorful and multifaceted appearances of our everyday life. The argument relies here on the quirky assumption that the One, in the instant where it “is” — in the sense of exists — is already two things. It is the One and it is the Is. This argument can be iterated. Again both the One and the Is are two things: the Is is and is the One, and the One is and is the One. By repetition of this consideration the One acquires an infinite multiplicity: The being One unfolds itself into the universe. And again Weizsacker clarifies the discourse by referring to the quantum mechanical object.

After all, the way an object can exist is via interaction with other objects, which again results in the collapse of the wave function and the loss of quantum mechanical unity: In order to establish that an object exists, the object has to be measured and thus is affected in a way that implies that it is no longer one object according to the meaning of Parmenides’s One. In summary, Weizsacker arrives at an amazing conclusion, that the notion of complementarity has its source in ancient Greece: “We find . . . the foundation of complementarity already foretold in Plato’s Parmenides.” We actually can recover the feel of what the ancient Greeks experienced in their mystery cults in modern twentieth-century physics!

But this is not the end of the story: The atomism of Democritus, the idea that the world is not continuously divisible but made out of indivisible particles, makes sense only in the context of quantum mechanics, where matter consists of compound objects that correspond to standing waves and thus can absorb or emit energy only in indivisible portions, the quanta.

Also, the idea of tracing the laws of nature back to fundamental symmetries, as proposed in Plato’s Timaeus, is an integral part of contemporary particle physics. Finally, consider Einstein’s objection to the fundamental importance of probabilities. Because of that objection, he remained a lifelong opponent of quantum mechanics: “God doesn’t play dice with the world.” This statement appears as a direct response to the 2,500-year-old fragment of Heraclitus: “Eternity is a child moving counters in a game; the kingly power is a child’s.”

How can one really comprehend the lack of causality inherent in quantum physics and in particular the role of the puzzling quantum collapse, which are not described by the mathematical formalism and remain controversial today? The most modern and consistent interpretation of these puzzling phenomena seems to be at the same time the craziest one: Everything that can happen does happen, albeit in different parallel universes.

This idea was formulated for the first time in 1957 by Hugh Everett III while he was working on his doctoral dissertation at Princeton University. With the bizarre concept of parallel universes he asked too much of his contemporary physicists, even though Everett — like Richard Feynman, a founder of quantum electrodynamics, and Kip Thorne, the father of the wormhole time machine — was a student of the eminent John Archibald Wheeler, who was himself a rather unorthodox and creative associate of Einstein and who, among many other achievements, coined the term black hole for the timeless star corpses in the universe.

But even with this first-class mentor, Everett’s colleagues didn’t take him seriously. Everett left the academic world shortly after finishing his dissertation, following a frustrating visit to Copenhagen during which he had tried in vain to interest Niels Bohr in his work. He transformed a standard approach in classical mechanics into a method for optimization that he could apply to commercial and military problems and that helped him to become a multimillionaire — but didn’t make him happy. He became a chain-smoking alcoholic and died of a heart attack when he was only fifty-one years old.

According to his explicit wish, his ashes were disposed of in the garbage. Fourteen years after his death, his daughter Elizabeth, who suffered from schizophrenia, committed suicide. In her suicide note she wrote that she was going into a parallel universe to meet her father. His son Mark Everett became the famous rock star E, lead singer of the Eels. He described his father as distant, depressed, and mentally absent, and his own childhood as strange and lonely. Only his music saved him. But he also expressed sympathy for his father: “These guys, I don’t think they should be held to subscribe to normal rules. I think that about rock stars, too.” Hugh Everett’s ideas about quantum physics were finally popularized in the 1970s by his advisor Wheeler and Bryce DeWitt, who had also worked with Wheeler. It was DeWitt who added the “many-worlds” label, a term that Wheeler never liked.

The interpretation essentially states that every measurement results in a split of the universe. Every possible outcome of a measurement — or more generally of any physical process — is being realized, but in different parallel universes. If a guy chats up a girl in a dance club, there is always a universe where the two of them get happily married and remain in love until they die, but also another one where she tells him to back off, he has too much to drink, and he wakes up the next morning with a serious hangover. This very insight made me particularly nervous when I prepared to jump out of an airplane 4,000 meters above Oahu’s north shore. After all, even if I survived in this universe, there are always countless universes where the parachute did not open. So somewhere one loses, every time. But somewhere there is also a parallel universe where Everett still lives happily together with his daughter.

The major advantage of the many-worlds interpretation, compared with the classical Copenhagen interpretation, is that no collapse of the wave function — which, in any case, is not really part of the theory — has to be assumed. Even after the measurement has been performed, both possible outcomes — like an electron at place A and an electron at place B — coexist, but they decouple, so that an observer who measures the electron at place A does not notice the alternative reality with the electron at place B.

In contrast to the collapse of the wave function, this process of decoupling can be described within the formalism of quantum mechanics. Perhaps this process — so-called decoherence — is the only reason we witness so little quantum weirdness in our everyday lives. The drawback of the many-worlds interpretation, however, is that we have to give up the concept of a unique reality.
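The decoupling can be sketched in miniature. In this toy model (a hypothetical 2×2 density matrix with an assumed exponential damping of the off-diagonal terms, not a derivation from the text), decoherence erases the interference between the two alternatives while leaving their probabilities untouched:

```python
import math

# Toy decoherence: basis index 0 = "electron at A", 1 = "electron at B".
amp = 1 / math.sqrt(2)
psi = [amp, amp]  # equal superposition of the two alternatives

# Density matrix rho[i][j] = psi[i] * psi[j] (amplitudes are real here).
rho = [[psi[i] * psi[j] for j in range(2)] for i in range(2)]

def decohere(matrix, damping):
    """Damp the off-diagonal coherences by exp(-damping); keep diagonals."""
    d = math.exp(-damping)
    return [[matrix[i][j] if i == j else matrix[i][j] * d
             for j in range(2)] for i in range(2)]

after = decohere(rho, damping=20.0)  # strong coupling to the environment
# The probabilities on the diagonal stay at 0.5 each, while the
# interference terms become vanishingly small.
print(after)
```

The alternatives are still both present on the diagonal; what is suppressed, never exactly to zero, is their ability to interfere, which is exactly the "not totally lost" residue described below.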

The interaction of different parallel universes is suppressed after a measurement, but not totally lost. So even in our daily lives we don’t reside in clearly defined conditions such as dead or alive. The parallel universes in which we and our fellow human beings experience totally different fates instead resonate as unobservable tiny admixtures of alternative realities into our universe.

Thus the many-worlds interpretation exhibits the Parmenidic-neo-Platonic nature of quantum mechanics most clearly. According to this point of view, the unity of the different realities is not completely lost. It is actually possible to recognize the multiverse — the collection of all of Everett’s parallel universes — directly as Parmenides’s primeval One: the unity of the world the ancient Greeks felt they had lost in the charted modern world, and for whose reunification with the individualized ego they looked in the ecstasy of their mystery cults, in their Dionysian arts, or in the flush induced by psychedelic drugs.


An even more radical take on the many-worlds interpretation can be found in Douglas Adams’s The Hitchhiker’s Guide to the Galaxy. Whenever the extraterrestrial crackpot Zaphod Beeblebrox, double-headed and addicted to Pan Galactic Gargle Blasters, starts the Infinite Improbability Drive, his stolen spaceship gets located in all places in the universe simultaneously, and tiny probabilities are amplified. In the novel this allows the spaceship to travel faster than light, and also causes various strange incidents, such as when a threatening pair of rockets gets suddenly transformed into a dumbfounded whale and a bowl of petunias.

Finally, and now I am serious again, the many-worlds interpretation could protect time travelers from ludicrous paradoxes, and in this way make time travel a meaningful physics concept. But we’ll get to that later…


Quantum Physics: The Multiverse of Parmenides 1 — Heinrich Pas

July 9, 2014
Heisenberg traveled to Copenhagen, Denmark, in the fall of 1941 to visit his fatherly friend and mentor Niels Bohr. According to Heisenberg, his intention was to inform Bohr that the construction of a nuclear bomb was possible but that the German physicists would not try to build it and to suggest that physicists in the allied nations should follow the same policy. This epic conversation, however, only resulted in a lasting breakdown of their friendship. Bohr, the son of a Jewish mother and the citizen of an occupied country, could not have much sympathy for any agreement with the German physicist. From left to right: Enrico Fermi, godfather of the neutrino; Werner Heisenberg, a creator of quantum mechanics; and Wolfgang Pauli, the father of the neutrino.


A major breakthrough in the story of quantum physics begins with a young man holed up in a rain pipe in order to find a quiet place for reading. It is the year 1919, in Munich, shortly after the end of World War I. The chaotic rioting in the streets that followed the revolution driving the German emperor out of office has finally calmed down, and now eighteen-year-old Werner Heisenberg can find some leisure time again.

He had been working as a local guide, assisting a vigilante group that was trying to reestablish order in the city, but now he could retreat, after the night watch on the command center’s hotline, onto the roof of the old seminary where his cohort was accommodated. There he would lie, in the warm morning sun, in the rain pipe, reading Plato’s dialogues.

And on one of these mornings, while Ludwig Street below him and the university building across the way with the small fountain in front slowly came to life, he came across that part in Timaeus where Plato philosophizes about the smallest constituents of matter, and the idea that the smallest particles can finally be resolved into mathematical structures and shapes, that one would encounter symmetries as the basic pillar of nature — an idea that would fascinate him so deeply that it would capture him for the rest of his life.

Werner Heisenberg was to become one of the most important physicists of his generation. When he had just turned forty, he was the head of the German nuclear research program, which in World War II examined the possibilities for utilizing nuclear power, including the feasibility of nuclear weapons. In this position he was on the assassination list of the US Office of Strategic Services, but a special agent who had permission to kill Heisenberg in a lecture hall decided against it, after he heard Heisenberg’s lecture on abstract S-matrix theory and concluded that the practical usefulness of Heisenberg’s research was marginal.

Even today, historians debate Heisenberg’s role in Nazi Germany. His opponents criticize his remaining in Germany and his commitment to the nuclear research project, the so-called Uranverein, which, according to these critics, failed to build a nuclear weapon for Hitler only because Heisenberg was unable to do it. Extreme admirers, such as Thomas Powers in Heisenberg’s War, argue that Heisenberg used his position to prevent the construction of a German nuclear bomb by exaggerating its difficulties when questioned by officials, bestowing on Heisenberg a moral mantle he had never claimed for himself.

What is well documented is that Heisenberg traveled to Copenhagen, Denmark, in the fall of 1941 to visit his fatherly friend and mentor Niels Bohr. According to Heisenberg, his intention was to inform Bohr that the construction of a nuclear bomb was possible but that the German physicists would not try to build it and to suggest that physicists in the allied nations should follow the same policy. This epic conversation, however, only resulted in a lasting breakdown of their friendship. Bohr, the son of a Jewish mother and the citizen of an occupied country, could not have much sympathy for any agreement with the German physicist.

In 1998, the British author Michael Frayn wove different perceptions of this meeting into a play that essentially deals with the parallel existence of different realities, both in psychology and in quantum mechanics. After all, among all his other activities, Heisenberg was famous for one thing: He was one of the masterminds of a revolutionary new theory.

Just six years after the sunny morning in the rain pipe, Heisenberg, now twenty-three years old and a postdoc at the University of Gottingen, was forced by his hay fever to leave his institute for two weeks, and he spent some sleepless time on Helgoland, a tiny and once holy red rock off Germany’s coast in the North Sea — days that would shatter the most basic grounds of physics. One-third of the day the young man climbed in the famous cliffs; one-third he memorized the works of Goethe, the poet who served as a national idol in Germany and who followed the classical paradigm of the ancient Greeks; and the last third he worked on his calculations.

In these calculations he developed a formalism that would be the bedrock of modern quantum physics and would do nothing less than change the world: “In Helgoland there was one moment when it came to me just as a revelation. . . . It was rather late at night. I had finished this tedious calculation and at the end it came out correct. Then I climbed a rock, saw the sun rise and was happy.”

Nowadays the technical applications of quantum physics account for about one-third of the US gross domestic product. Nevertheless, Richard P. Feynman commented some forty years after Heisenberg’s work that the theory is so crazy that nobody can actually comprehend it, and Einstein had earlier declared bluntly: this is obvious nonsense. What makes quantum physics special is that this theory breaks radically with the concept of causality. In our daily lives we are used to ordered sequences of cause and effect: You and a friend clink your glasses with just a little bit too much verve; one glass breaks; beer runs down to the floor; your significant other/roommate/parents cry out.

One event causes the next one. This is exactly where quantum physics is different, where this strict connection between cause and effect no longer exists. For example, how a particle reacts to an influence can be predicted only in terms of probabilities. But this is not the end of the story: Unless the effect on the particle is actually observed, all possible consequences seem to be realized simultaneously. Particles can reside in two different locations at once! And particles exhibit properties of waves while waves behave in certain situations like particles.

An object thus has both properties of a particle and of a wave, depending on how it is observed. The particle corresponds to an indivisible energy portion of the wave, a so-called quantum. On the other hand, the wave describes the probability for the particle to be located at a certain place. This property of quantum mechanics can be depicted most easily with the famous double-slit experiment (Figure below).

Figure 3.2. Double-slit experiment. As long as no measurement determines which slit the particles are passing through, they behave like interfering waves, which pass simultaneously through both slits (left side). Where two wave crests coincide, the probability of detecting a particle is largest; where a crest coincides with a trough, the probability is very small or zero. The resulting image is called an interference pattern. As soon as an external measurement disturbs the system — for example, if one uses irradiation with light to determine which path the electrons take through the slits — the wave collapses into single particles, which accumulate in narrow bands behind the slits they were flying through (right side).


When a particle beam hits a thin wall with two narrow slits in it, the corresponding wave penetrates both slits and spreads out on the other side as a circular wave. On a screen situated behind the wall, in accordance with the wave nature of the electrons, an interference pattern appears, resulting from the superposition of the waves originating from the two slits in the wall.

Where a crest meets another crest or a trough meets another trough the wave gets amplified. A crest encountering a trough, on the other hand, results in little or no amplitude (left side). This pattern appears, however, only as long as it is unknown through which slit a single electron passed. As soon as this is determined, for example by blocking one of the slits or by irradiating the electrons with light, the two-slit interference pattern gets destroyed and the electrons behave just like classical particles. To be more accurate, a new wave emanates from the slit, and the pattern exhibited on the screen is the one for a wave passing through a single slit, which resembles a smooth probability distribution (right side).
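The pattern described above can be reproduced numerically. The following sketch superposes the waves from two point sources and squares the result; the wavelength, slit separation, and screen distance are arbitrary assumed values, not taken from any real experiment:

```python
import cmath
import math

# Toy two-slit pattern: superpose the waves from two point sources
# and square the magnitude. All lengths are in arbitrary units.
wavelength = 1.0
k = 2 * math.pi / wavelength  # wave number
slit_separation = 5.0
screen_distance = 100.0

def intensity(x, both_slits=True):
    """Detection probability (unnormalized) at screen position x."""
    # Path lengths from each slit to the screen point.
    r1 = math.hypot(screen_distance, x - slit_separation / 2)
    r2 = math.hypot(screen_distance, x + slit_separation / 2)
    psi1 = cmath.exp(1j * k * r1)
    psi2 = cmath.exp(1j * k * r2)
    # With the path unknown, both contributions superpose; once the
    # path is measured, only a single-slit wave remains.
    psi = psi1 + psi2 if both_slits else psi1
    return abs(psi) ** 2

pattern = [intensity(x) for x in range(-20, 21)]
print(min(pattern), max(pattern))
```

With both slits open the intensity swings between roughly 0 (crest meets trough) and 4 (crest meets crest); dropping one of the two terms, as when the path is determined, leaves a flat single-slit profile.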

Heisenberg and Bohr interpreted this as a collapse of the wave function due to the measurement process, in which one gets a result with the probability given by the amplitude squared of the wave. This is the so-called Copenhagen interpretation of quantum physics, which is still taught at universities around the globe. According to this interpretation, a particle is located in many places simultaneously until finally a measurement assigns it a concrete location. And this is true not only for position; it applies to other measurable quantities such as momentum, energy, the instant of a nuclear decay, and other properties as well.

Erwin Schrodinger, both collaborator with and competitor of Heisenberg in the development of quantum physics, carried this idea to an extreme: “One can even set up quite ridiculous cases. A cat is penned up in a steel chamber, along with the following device (which must be secured against direct interference by the cat).”

In Schrodinger’s experiment the death or life of the cat depends on whether a radioactive isotope does or doesn’t decay in a particular time period. As long as we do not check whether the isotope did decay or not, nor how the cat is doing, Schrodinger’s cat is simultaneously dead and alive, or as Schrodinger phrased it: “[The wave function of the system would have] in it the living and dead cat (pardon the expression) mixed or smeared out in equal parts.”

There are two reasons why we don’t observe such bizarre phenomena in our daily lives: One is that the wavelengths of ordinary objects around us are tiny compared with the sizes of the objects themselves. The other is that the objects we deal with every day are always interacting with their environment and thus are being measured all the time. A beer bottle, for example, may very well be situated in two different locations, but only for an extremely short time and for an extremely small separation (too short and too small to measure).
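The first reason can be checked with de Broglie’s relation λ = h/(m·v), which assigns every moving object a wavelength. The masses and speeds below are illustrative assumptions:

```python
# De Broglie wavelength lambda = h / (m * v): every moving object has one.
# The masses and speeds below are illustrative assumptions.
h = 6.626e-34  # Planck's constant in J*s

def de_broglie(mass_kg, speed_m_s):
    """Wavelength in meters of an object with the given mass and speed."""
    return h / (mass_kg * speed_m_s)

bottle = de_broglie(0.5, 1.0)          # a 0.5 kg beer bottle at 1 m/s
electron = de_broglie(9.109e-31, 1e6)  # an electron at 10^6 m/s

print(bottle)    # ~1.3e-33 m, hopelessly small compared to the bottle
print(electron)  # ~7.3e-10 m, comparable to atomic dimensions
```

A wavelength some 33 orders of magnitude smaller than the bottle itself is why the bottle’s wave nature never shows; the electron’s wavelength, by contrast, is of atomic size, which is why interference experiments with electrons work at all.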


In the Beginning There Was an Atom — Amir D. Aczel

May 15, 2014
Georges Lemaître (1894-1966), Belgian cosmologist, Catholic priest, and father of the Big Bang theory


With respect to the big-bang theory, anyway, science and faith are not at odds.

*********************************************************

According to a recent Associated Press poll, a majority of Americans — 51% — do not believe the universe began with the “big bang.”

The skepticism of half the country may seem startling, given how essential the big-bang theory is to modern cosmology, but there is a good reason for it. The big bang is at first hard to swallow. I am a physics writer, and yet I remember how perplexed I was many years ago when I heard MIT cosmologist Alan Guth describe the universe expanding within a fraction of a second from the size of an atom to “as big as a marble.” My initial thought was: How could he possibly know the size of the entire universe when it was less than a second old? Believing in the big bang seemed to require a leap of faith.

And if you feel uncomfortable with big-bang cosmology, you’re in excellent company: The greatest physicist of the 20th century, Albert Einstein, stubbornly refused to believe in it. Ironically, it was a Catholic priest who first came up with the big-bang idea in 1927. The Belgian priest Georges Lemaître, who was also an astronomer and physicist, theoretically deduced the expansion of the universe and proposed that it was launched from a “primeval atom” — the process later known as the big bang.

At the time of Lemaître’s prescient idea, not only Einstein but other physicists and astronomers believed that the universe was static, with no beginning or end. Lemaître did not buy this supposition. He believed in the story of Genesis, which outlines the birth of the universe, and he searched for a way to prove it scientifically. He presented his complicated mathematical results on the beginning of the universe — based on Einstein’s own general theory of relativity — in a 1931 meeting in London of the British Science Association that was dedicated to the relationship between science and spirituality.

This was after Edwin Hubble’s astronomical observations of 1929 had proved that Lemaître was right about the expansion of the universe, and as the news about Hubble’s discovery spread around the world, Einstein and many other scientists eventually came to accept the big-bang theory.

On March 17 of this year, in a dramatic news conference held at the Harvard-Smithsonian Center for Astrophysics, the Background Imaging of Cosmic Extragalactic Polarization (Bicep) research group of astronomers presented their discovery of gravitational waves, which confirmed the existence of this major theoretical phenomenon associated with Einstein’s general relativity, thus providing overwhelming evidence for the big-bang theory. It also strongly supported cosmic inflation, a mechanism by which the early universe expanded from the size of an atom to that of a marble and beyond — just as predicted by Alan Guth three decades ago.

And so the big-bang theory is verified not only by the Bicep evidence but also by decades of data on the microwave background radiation in space (“embers of the big bang”), as well as by high-energy particle collisions at the Large Hadron Collider (a tiny-scale simulation of the big bang). It also fundamentally does not conflict with scripture. So why do so many deny it?

The culprits might be “scientific atheists,” a small but vocal group of thinkers who employ science to claim that there is no God. Some argue that the universe came into existence all on its own. In particular, physicist Lawrence M. Krauss’s 2012 book “A Universe from Nothing” insists that the big bang occurred within a complete emptiness, and thus there is no need for a “God.” But the key assumption of Mr. Krauss’s conjecture is flawed and at odds with modern cosmology. The big bang did not occur in “nothing.” It had to be spawned in some kind of pre-existent medium, known by physicists as “quantum foam,” though we don’t know exactly what it is.

Despite the damage scientific atheists are doing to public opinion, the truth is that — at least with respect to big-bang cosmology — science and faith are not at odds. For it was the story in Genesis that inspired the big bang’s founder to discover how the universe came to be. And it was Genesis that provided the stimulus for the first mathematical calculations that led to the “primeval atom.” The 51% of Americans who deny the big bang — if they do so because they think the theory conflicts with faith — should come to trust our science.

 

h1

Book Review: The Language of God by Francis S. Collins

May 8, 2014
Francis Sellers Collins (born April 14, 1950) is an American physician-geneticist noted for his discoveries of disease genes and his leadership of the Human Genome Project (HGP). He currently serves as Director of the National Institutes of Health (NIH) in Bethesda, Maryland. He is the atheists’ greatest fear: a scientist who believes in God and lives a life of faith AND science.

Stephen M. Barr is a theoretical particle physicist at the Bartol Research Institute of the University of Delaware and author of Modern Physics and Ancient Faith. I am performing a little blog housecleaning here, redoing my pages and using them to introduce the categories and all the posts contained therein; hopefully this will show readers of payingattentiontothesky.com how much “stuff” lurks under their mouse, just a click away…

*********************************

“Today we are learning the language in which God created life.” With these words, President Clinton announced one of the great feats of modern science, the mapping of the human genome. Standing next to him in the East Room of the White House was the leader of the Human Genome Project, Francis S. Collins.

Collins has now written a book, The Language of God, but it is not the sort of book one might have expected him to write, for only a small part is devoted to the genome project. Rather, Collins has written the story of his other great discovery: the discovery not of new truths but of old truths. It is the story of how and why he came to believe in God.

As such, this book is almost unique. There are many conversion stories and many scientific autobiographies, but few books in which prominent scientists tell how they came to faith. If nothing else, Collins’ book gives the lie, in most spectacular fashion, to the claim made by Richard Dawkins in an interview not long ago: “You won’t find any intelligent person who feels the need for the supernatural,” Dawkins declared, “unless [he was] brought up that way.”

Francis Collins was not brought up that way; his family’s view was that religion “just wasn’t very important.” Almost the only contact Collins had with religion as a child was singing in the choir at the local Episcopal church, where his parents had sent him to learn music with the admonition that he shouldn’t take the theology too seriously. After discovering, in high-school science classes, “the intense satisfaction of the ordered nature of the universe,” Collins entered the University of Virginia at the age of sixteen to major in chemistry.

Up to then, he had given little thought to religion, though in his early teens he had had “occasional moments of . . . longing for something outside myself,” most often associated with profound experiences of nature or of music. Exposed to the challenges of “one or two aggressive atheists” in his dorm, however, he quickly concluded that no religion had any “foundational truth.”

The mathematical elegance of physics drew him into physical chemistry, where he was “immersed in quantum mechanics and second-order differential equations” and “gradually became convinced that everything in the universe could be explained on the basis of equations and physical principles.” Discovering that Einstein, one of his heroes, had not believed in the God of the Jewish people, Collins concluded that “no thinking scientist” could take the idea of God seriously, and he “gradually shifted from agnosticism to atheism.”

While working on his doctorate at Yale, Collins happened to take a course in biochemistry and was “astounded” by DNA and proteins “in all of their satisfying digital glory.” It was a “revelation” to him that mathematics and “rigorous intellectual principles” could be applied to biology, a field he had previously disdained. Around this time, however, he began to wonder how he could “make a difference in the lives of real people” and whether he was cut out for a life of research. And so, just before completing his degree in chemistry, he switched to medical school.

It was in medical school that his atheism suffered a blow: “I found the relationships [I] developed with sick and dying patients almost overwhelming.” The strength and solace so many of them derived from faith profoundly impressed him and left him thinking that “if faith was a psychological crutch . . . it must be a very powerful one.” His “most awkward moment” came when an older woman, suffering from a severe and untreatable heart problem, asked him what he believed. “I felt my face flush as I stammered out the words ‘I’m not really sure.’”

Suddenly it was brought home to him that he had dismissed religion without ever really considering, or even knowing, the arguments in its favor. How could someone who prided himself on his scientific rationality do that? He was deeply shaken and felt impelled to carry out an honest and unprejudiced examination of religion. Attempts to read the sacred scriptures of various world religions left him baffled, however, so he sought out a local Methodist minister and asked him point-blank “whether faith made any logical sense.” The minister took a book down from his shelf and handed it to him. It was C.S. Lewis’ Mere Christianity.

Lewis gave Collins a simple, though crucial, insight: God is not a part of the physical universe and therefore cannot be perceived by the methods of science. Yet God speaks to us in our hearts and minds, both in such “longings” for the transcendent as Collins had himself experienced and in the sense of objective right and wrong, “the Moral Law.” A key aspect of this moral sense is “the altruistic impulse, the voice of conscience calling us to help others even if nothing is received in return.” Such altruism, says Collins, “is quite frankly a scandal for reductionist reasoning,” for it goes directly contrary to the selfishness of the “selfish gene.”

Collins reviews some of the attempts to explain altruism in evolutionary terms. One theory is that our primate ancestors rated altruism a positive attribute in potential mates. Another is that altruism provided survival advantages to its practitioners through “indirect reciprocal benefits.” A third is that altruism benefited the whole group in which it was prevalent rather than the individuals who practiced it.

Collins explains why none of these theories works. He then goes on to discuss several common objections to belief in God that troubled him at first but to which he was able to find satisfactory answers with the help of Lewis and other Christian writers. Collins presents these answers in clear, simple, and appealing language. Their power lies not only in strength of argument but also in their personal character, as when he discusses the problem of evil in the context of a tragedy that befell his own daughter.

Collins also examines what science has to say about the origins of the universe, life, and human beings. As he traces the history of the universe, he points to three discoveries that bolster the case for a creator. One is the “existence of mathematical principles and order in creation,” laws whose “mathematical representation invariably turns out to be elegant, surprisingly simple, and even beautiful.”

Another is the Big Bang, the putative beginning of the universe about fourteen billion years ago. And a third is the remarkable concatenation of “anthropic” coincidences and fine-tunings in the laws of physics that made possible the evolution of life.

It is interesting that Collins, a biologist, should take most of his “evidence for belief” from physics. As someone who came to biology through the physical sciences, he is obviously keenly aware of what Pope Benedict has called “the mathematical structure of matter, its intrinsic rationality, . . . the Platonic element in the modern understanding of nature.” One notes, by contrast, that some of the biologists who are most outspoken in their atheism have come from a background in zoology rather than the physical sciences. It may be that the scientists most susceptible to crude materialism are those who know the least about matter.

The physics and cosmology in the book are well done, but Collins’ discussion of the Big Bang is open to several criticisms. It is not quite accurate to say that the Big Bang “forces the conclusion that nature had a defined beginning.” Most physicists and cosmologists think it possible that the Big Bang was only the beginning of one phase of the universe’s history; the conclusion that the universe had a beginning at some point (whether at the Big Bang or earlier) is not yet forced by the physics alone. Collins also too simply equates the creation of the universe with the fact that it had a beginning in time.

Even a universe that had no beginning in time would still require its existence to be explained. And finally, there are points at which Collins seems to speak of the Big Bang as miraculous in the sense that the laws of physics broke down there, which is very doubtful. To be fair, these are issues that may be too subtle for a satisfactory treatment in a book aimed at such a wide audience. And Collins’ main point is certainly valid: Nature could not have created itself, and the Big Bang, by underlining the contingency of the world’s existence, supports the idea of creation.

As Collins moves from discussing the origin and development of the physical universe to the origin and development of life, he must enter on the battle-scarred terrain of evolution, a subject that takes up most of the latter half of the book. Here his message and his primary audience change. Up to this point he has been speaking on behalf of religious belief.

He now turns around and speaks to his fellow Christians, especially his fellow evangelicals, on behalf of evolution. His fundamental purpose, however, remains the same: “to call a truce in the escalating war between science and spirit,” a war that “was never really necessary” but “was initiated and intensified by extremists on both sides.”

Collins is appalled that “Young Earth Creationism is the view held by approximately 45 percent of Americans” and that “many evangelical Christian churches are aligned” with it. The persistence of this view, which is at once so theologically simplistic and scientifically indefensible, is “one of the great puzzles and tragedies of our time.” The danger is not to science but to faith: “Young people brought up in homes and churches that insist on Creationism sooner or later encounter the overwhelming scientific evidence in favor of an ancient universe and the relatedness of all living things through the process of evolution and natural selection. What a terrible and unnecessary choice they then face!”

In his appeal to young-earth creationists, Collins deploys both scientific and theological arguments. Though the evidence for evolution comes from many directions, he naturally focuses on the recent, powerful evidence that comes from studying the genomes of different species, evidence that, he says, “could fill hundreds of books of this length.” One of the examples he gives is the existence of “pseudogenes.” These are genes that have suffered mutations that “turn their script into gibberish” and render them defunct. “The human gene known as caspase-12, for instance, has sustained several knockout blows, though it is found in the identical relative location in the [genome of the] chimp. The chimp caspase-12 works just fine, as does the similar gene in nearly all mammals.” If the body of man did not evolve, but was formed as the young-earth creationists believe, then “why would God have gone to the trouble of inserting such a non-functional gene in this precise location?”

In Collins’ view, the Intelligent Design movement, unlike young-earth creationism, “deserves serious consideration” scientifically. Nonetheless, he sees it as a misguided and doomed effort that is, ironically, “on a path toward doing considerable damage to faith.” It is driven by a fear that Darwinism is incompatible with biblical belief and is an attempt “to find a scientifically respectable alternative.”

Collins argues forcefully that Darwinian evolution is, in fact, perfectly compatible with biblical faith. He avoids the trap into which so many liberal theologians have fallen: thinking that the lesson of evolution is that everything evolves, including God. Collins sees clearly that the key to harmonizing Darwinian evolution with Jewish and Christian faith is through the traditional teaching, so profoundly elaborated by St. Augustine, that God is outside time:

“If God is outside of nature, then He is outside of space and time. In that context, God could in the moment of creation of the universe also know every detail of the future. That could include the formation of the stars, planets, and galaxies, all of the chemistry, physics, geology, and biology that led to the formation of life on earth, and the evolution of humans. . . . In that context, evolution could appear to us to be driven by chance, but from God’s perspective the outcome would be entirely specified. Thus, God could be completely and intimately involved in the creation of all species, while from our perspective, limited as it is by the tyranny of linear time, this would appear a random and undirected process.”

With the aid of St. Augustine and C.S. Lewis, Collins knocks down one theological objection to Darwinian evolution after another.

For reasons that are unclear, Collins chooses to end his book with a lengthy appendix on medical-ethics issues, in which he defends certain positions that are necessitated neither by science nor religion. Not only does this run counter to the aims of the rest of the book, but the level of argument by which he attempts to justify “somatic cell nuclear transfer,” a form of cloning, hardly does him credit.

Still, The Language of God is a book of enormous value. At a time when so many people on both sides are trying to foment a conflict between science and religion, Collins is a sorely needed voice of reason. His book may do more to promote better understanding between the worlds of faith and science than any other so far written. I suspect that Collins himself would regard that as an achievement no less important than the one for which he was honored six years ago in the East Room of the White House.

h1

Max Tegmark’s Our Mathematical Universe — Peter Woit

January 21, 2014
The Multiverse theory describes the continuous formation of universes through the collapse of giant stars and the formation of black holes. With each of these black holes there is a new point of singularity and a new possible universe. As Rees describes it, “Our universe may be just one element – one atom, as it were – in an infinite ensemble: a cosmic archipelago. Each universe starts with its own big bang, acquires a distinctive imprint (and its individual physical laws) as it cools, and traces out its own cosmic cycle. The big bang that triggered our entire universe is, in this grander perspective, an infinitesimal part of an elaborate structure that extends far beyond the range of any telescopes.” (Rees) This puts our place in the Multiverse into a small perspective. While the size of the earth in relation to the sun is minuscule, the size of the sun, the solar system, the galaxy, and even the universe could pale in comparison to this proposed Multiverse. It would be a shift in thinking that may help explain the big bang theory and possibly shed light on the idea of parallel universes.

Mr. Woit is the author of “Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law.” His review of Our Mathematical Universe is a report from the front lines of mathematics and physics.

*****************************************

It’s a truly remarkable fact that our deepest understanding of the material world is embodied in mathematics, often in concepts that originated with a very different motivation. A good example is our best description of how gravity works, Einstein’s 1915 theory of general relativity, in which the gravitational force comes from the curvature of space and time.

The formulation of this theory required Einstein to use mathematics developed 60 years earlier by the great German mathematician Bernhard Riemann, who was studying abstract questions involving geometry. There’s now a long history of intertwined and experimentally tested discoveries about physics and mathematics. This unity between mathematics and physics is a source of wonder for those who study the two subjects, as well as an eternal conundrum for philosophers.

Max Tegmark thus begins his new book with a deep truth when he articulates a “Mathematical Universe Hypothesis,” which states simply that “physical reality is a mathematical structure.” His central claim ends up being that such a hypothesis implies a surprising new vision of how to do physics, but the slipperiness of that word “is” should make the reader wary. Mr. Tegmark raises here the age-old question of whether math just describes physical reality or whether it defines physical reality. This distinction is of relevance to philosophers, but its significance for practicing physicists is unclear.

“Our Mathematical Universe” opens with a memoir of Mr. Tegmark’s own career in physics. He’s now a cosmologist at MIT whose specialty is interpreting data about the structure and evolution of the universe, much of it gathered from new space-based and earth-based instruments.

His book, however, quickly turns to the topic of the “multiverse” — the idea that our universe is part of some larger unobservable structure. Multiverse theories come in a baffling number of different versions. They have been a hot topic for the past dozen years, with Brian Greene’s “The Hidden Reality” (2011) a good example of a recent book covering this material.

Mr. Tegmark categorizes different multiverse proposals in terms of “Levels,” a useful method designed to keep track of the various theories. Many of these include some version of the idea that our universe is one of many unconnected universes obeying the same physical laws. This “Level I” type of multiverse is like Jorge Luis Borges’s “Library of Babel,” which contains all possible books, though most remain inaccessible to his story’s narrator due to their remoteness. As far back as 1584, Giordano Bruno proposed a universe of this sort, provoking mind-bending paradoxes involving infinite copies of oneself acting out completely different lives.

A much different type of multiverse arises in what is sometimes called the “many-worlds interpretation” of quantum theory. This is one way of thinking about the relationship between quantum mechanics and conventional human-scale physics. The idea is that while any quantum system is described by a single mathematical object called a quantum wave-function, this can contain within itself a description of an infinity of different possible worlds.

These correspond to the different possible states we may observe when we probe a quantum system with a macroscopic experimental apparatus. This multiverse is more like the “Garden of Forking Paths” that Borges describes in his story of that title, with each world branching off when we make an observation. Philosophical debate rages over what to think of such possible worlds: Are the ones we don’t end up in “real” or just a convenient calculational fiction? Mr. Tegmark calls the multiverse of such worlds a “Level III” multiverse.

These Level I and III possibilities fit reasonably well within variants of conventional views about our current best understanding of physics. The controversy surrounds what Mr. Tegmark calls “Level II” multiverses. At this level, different parts of a multiverse can have different physics — for instance, different fundamental forces, as well as different fundamental particles with different masses.

The problem: There is no experimental evidence for this and, arguably, no way of ever getting any, since our universe likely interacts in no way with any universes whose physics differs from our own. When someone is trying to sell a Level II multiverse theory, pay close attention to what exactly is being marketed; it comes without the warranty of an experimental test.

Since 1984 many physicists have worked on “string theory,” which posits a new unification of general relativity and quantum theory, achieved in part by abandoning the idea of fundamental particles. Early on, the new fundamental objects were supposed to be relatively well-defined one-dimensional vibrating string-like objects. Over the years this theory has evolved into something often called “M-theory,” which includes a wealth of poorly understood and mathematically complex components.

As far as one can now tell, if M-theory is to make sense, it will have so many possible solutions that one could produce just about any prediction about our observable universe that one might want. Such an unfalsifiable theory normally would be dismissed as unscientific, but proponents hope to salvage the situation by invoking a Level II multiverse containing all solutions to the theory. Our observed laws of physics would just represent a particular solution.

Mr. Tegmark wants to go even further down this rabbit hole. He assumes that what we observe is governed by something like M-theory, with its multiverse of different physical laws. But he wants to find a wider view that explains M-theory in terms of his “math is physics” hypothesis. He argues that his hypothesis implies the conclusion that “all mathematical structures exist.” The idea is that every example mathematicians teach in their classes, whether it’s a polynomial equation, a circle, a cube, or something much more complicated, represents an equally good universe. The collection of all mathematical structures he calls the “Level IV” multiverse, the highest and most general level.

Interpreting the meaning of “exists” in this way — to include all possible worlds — is a philosophical position known as “modal realism.” The innovation here is the claim that this carries a new insight into physics. The problem with such a conception of the ultimate nature of reality is not that it’s wrong but that it’s empty, far more radically untestable than even the already problematic proposals of M-theory. Mr. Tegmark proposes abandoning the historically proven path of pursuing a single exceptionally deep and very special mathematical structure at the core of both math and physics in favor of the hypothesis that, at the deepest level, “anything goes.”

Mr. Tegmark’s proposal takes him deep in the realm of speculation, and few of his fellow scientists are likely to want to follow him. There’s a danger, though, that his argument will convince some that “anything goes” is all there is to ultimate reality, discouraging their search for a better and more elegant version of our current best theories.

To be fair, Mr. Tegmark acknowledges he is going beyond conventional science, even including pithy advice about how to pursue a successful career while indulging in speculative topics that one’s colleagues are likely to see as beyond the bounds of what can be taken seriously. It’s worth remarking that not taking itself too seriously is one of the book’s virtues.

A final chapter argues for the importance of the “scientific lifestyle,” meaning scientific rationality as a basis for our decisions about important questions affecting the future of our species. But the great power of the scientific worldview has always come from its insistence that one should accept ideas based on experimental evidence, not on metaphysical reasoning or the truth-claims of authority figures. “Our Mathematical Universe” is a fascinating and well-executed dramatic argument from a talented expositor, but reading it with the skeptical mind-set of a scientist is advised.

h1

Does Quantum Physics Make it Easier to Believe in God? — Stephen M. Barr

August 23, 2013
If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

A reblog from the site Big Questions Online

**************************************************

Not in any direct way. That is, it doesn’t provide an argument for the existence of God. But it does help indirectly, by providing an argument against the philosophy called materialism (or “physicalism”), which is the main intellectual opponent of belief in God in today’s world.

Materialism is an atheistic philosophy that says that all of reality is reducible to matter and its interactions. It has gained ground because many people think that it’s supported by science. They think that physics has shown the material world to be a closed system of cause and effect, sealed off from the influence of any non-physical realities — if any there be.

Since our minds and thoughts obviously do affect the physical world, it would follow that they are themselves merely physical phenomena. No room for a spiritual soul or free will: for materialists we are just “machines made of meat.”

Quantum mechanics, however, throws a monkey wrench into this simple mechanical view of things.  No less a figure than Eugene Wigner, a Nobel Prize winner in physics, claimed that materialism — at least with regard to the human mind — is not “logically consistent with present quantum mechanics.” And on the basis of quantum mechanics, Sir Rudolf Peierls, another great 20th-century physicist, said, “the premise that you can describe in terms of physics the whole function of a human being … including [his] knowledge, and [his] consciousness, is untenable. There is still something missing.”

How, one might ask, can quantum mechanics have anything to say about the human mind?  Isn’t it about things that can be physically measured, such as particles and forces?  It is; but while minds cannot be measured, it is ultimately minds that do the measuring. And that, as we shall see, is a fact that cannot be ignored in trying to make sense of quantum mechanics.

If one claims that it is possible (in principle) to give a complete physical description of what goes on during a measurement — including the mind of the person who is doing the measuring — one is led into severe difficulties. This was pointed out in the 1930s by the great mathematician John von Neumann.  Though I cannot go into technicalities in an essay such as this, I will try to sketch the argument.

It all begins with the fact that quantum mechanics is inherently probabilistic. Of course, even in “classical physics” (i.e. the physics that preceded quantum mechanics and that still is adequate for many purposes) one sometimes uses probabilities; but one wouldn’t have to if one had enough information.  Quantum mechanics is radically different: it says that even if one had complete information about the state of a physical system, the laws of physics would typically only predict probabilities of future outcomes. These probabilities are encoded in something called the “wavefunction” of the system.

A familiar example of this is the idea of “half-life.”  Radioactive nuclei are liable to “decay” into smaller nuclei and other particles.  If a certain type of nucleus has a half-life of, say, an hour, it means that a nucleus of that type has a 50% chance of decaying within 1 hour, a 75% chance within two hours, and so on. The quantum mechanical equations do not (and cannot) tell you when a particular nucleus will decay, only the probability of it doing so as a function of time. This is not something peculiar to nuclei. The principles of quantum mechanics apply to all physical systems, and those principles are inherently and inescapably probabilistic.
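The half-life arithmetic above can be checked directly: with half-life τ, the probability that a nucleus has decayed by time t is 1 − 0.5^(t/τ), so a one-hour half-life gives a 50% chance within one hour and a 75% chance within two. A minimal sketch (the function name and values are illustrative, not from the essay):

```python
# Decay probability for a nucleus with a one-hour half-life:
# P(decayed by time t) = 1 - 0.5 ** (t / half_life).
HALF_LIFE = 1.0  # hours

def decay_probability(t, half_life=HALF_LIFE):
    """Probability that the nucleus has decayed by time t (in hours)."""
    return 1.0 - 0.5 ** (t / half_life)

print(decay_probability(1.0))  # 0.5   -> 50% chance within one hour
print(decay_probability(2.0))  # 0.75  -> 75% chance within two hours
print(decay_probability(3.0))  # 0.875
```

Note that the quantum rules give only this smoothly rising probability as a function of time; they never say when a particular nucleus actually decays.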

This is where the problem begins. It is a paradoxical (but entirely logical) fact that a probability only makes sense if it is the probability of something definite. For example, to say that Jane has a 70% chance of passing the French exam only means something if at some point she takes the exam and gets a definite grade.  At that point, the probability of her passing no longer remains 70%, but suddenly jumps to 100% (if she passes) or 0% (if she fails). In other words, probabilities of events that lie in between 0 and 100% must at some point jump to 0 or 100% or else they meant nothing in the first place.

This raises a thorny issue for quantum mechanics. The master equation that governs how wavefunctions change with time (the “Schrödinger equation”) does not yield probabilities that suddenly jump to 0 or 100%, but rather ones that vary smoothly and that generally remain greater than 0 and less than 100%.

Radioactive nuclei are a good example. The Schrödinger equation says that the “survival probability” of a nucleus (i.e. the probability of its not having decayed) starts off at 100%, and then falls continuously, reaching 50% after one half-life, 25% after two half-lives, and so on — but never reaching zero. In other words, the Schrödinger equation only gives probabilities of decaying, never an actual decay! (If there were an actual decay, the survival probability should jump to 0 at that point.) 

To recap: (a) Probabilities in quantum mechanics must be the probabilities of definite events. (b) When definite events happen, some probabilities should jump to 0 or 100%. However, (c) the mathematics that describes all physical processes (the Schrödinger equation) does not describe such jumps.  One begins to see how one might reach the conclusion that not everything that happens is a physical process describable by the equations of physics.

So how do minds enter the picture?  The traditional understanding is that the “definite events” whose probabilities one calculates in quantum mechanics are the outcomes of “measurements” or “observations” (the words are used interchangeably).  If someone (traditionally called “the observer”) checks to see if, say, a nucleus has decayed (perhaps using a Geiger counter), he or she must get a definite answer: yes or no.

Obviously, at that point the probability of the nucleus having decayed (or survived) should jump to 0 or 100%, because the observer then knows the result with certainty.  This is just common sense. The probabilities assigned to events refer to someone’s state of knowledge: before I know the outcome of Jane’s exam I can only say that she has a 70% chance of passing; whereas after I know I must say either 0 or 100%.

Thus, the traditional view is that the probabilities in quantum mechanics — and hence the “wavefunction” that encodes them — refer to the state of knowledge of some “observer”.  (In the words of the famous physicist Sir James Jeans, wavefunctions are “knowledge waves.”)

An observer’s knowledge — and hence the wavefunction that encodes it — makes a discontinuous jump when he/she comes to know the outcome of a measurement (the famous “quantum jump,” traditionally called the “collapse of the wave function”). But the Schrödinger equation that describes any physical process does not give such jumps!  So something must be involved when knowledge changes besides physical processes.

An obvious question is why one needs to talk about knowledge and minds at all. Couldn’t an inanimate physical device (say, a Geiger counter) carry out a “measurement”?  That would run into the very problem pointed out by von Neumann: If the “observer” were just a purely physical entity, such as a Geiger counter, one could in principle write down a bigger wavefunction that described not only the thing being measured but also the observer. And, when calculated with the Schrödinger equation, that bigger wave function would not jump! Again: as long as only purely physical entities are involved, they are governed by an equation that says that the probabilities don’t jump.

That’s why, when Peierls was asked whether a machine could be an “observer,” he said no, explaining that “the quantum mechanical description is in terms of knowledge, and knowledge requires somebody who knows.” Not a purely physical thing, but a mind.  

But what if one refuses to accept this conclusion, and maintains that only physical entities exist and that all observers and their minds are entirely describable by the equations of physics? Then the quantum probabilities remain in limbo, not 0 or 100% (in general) but hovering somewhere in between. They never get resolved into unique and definite outcomes, but somehow all possibilities remain always in play. One would thus be forced into what is called the “Many Worlds Interpretation” (MWI) of quantum mechanics.

In MWI, reality is divided into many branches corresponding to all the possible outcomes of all physical situations. If a probability was 70% before a measurement, it doesn’t jump to 0 or 100%; it stays 70% after the measurement, because in 70% of the branches there’s one result and in 30% there’s the other result! For example, in some branches of reality a particular nucleus has decayed — and “you” observe that it has, while in other branches it has not decayed — and “you” observe that it has not. (There are versions of “you” in every branch.)

In the Many Worlds picture, you exist in a virtually infinite number of versions: in some branches of reality you are reading this article, in others you are asleep in bed, in others you have never been born. Even proponents of the Many Worlds idea admit that it sounds crazy and strains credulity.

The upshot is this: If the mathematics of quantum mechanics is right (as most fundamental physicists believe), and if materialism is right, one is forced to accept the Many Worlds Interpretation of quantum mechanics. And that is awfully heavy baggage for materialism to carry.

If, on the other hand, we accept the more traditional understanding of quantum mechanics that goes back to von Neumann, one is led by its logic (as Wigner and Peierls were) to the conclusion that not everything is just matter in motion, and that in particular there is something about the human mind that transcends matter and its laws.  It then becomes possible to take seriously certain questions that materialism had ruled out of court: If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

h1

The Many Splendored Blobs of Neurocentrism — Matthew Hutson

June 17, 2013
Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.

A review of Brainwashed by Sally Satel and Scott O. Lilienfeld, which appeared recently in the WSJ. Yes, love IS a many splendored blob…

****************************************

Humanity is under attack by blobs. Nestled in our brains, they appear to control our emotions. These infiltrators remain invisible without sophisticated technology, but when discovered they often make headlines.

Actually, to say that we discover them isn’t quite right. We create them: They are the bits of color seen in brain scans, or “functional magnetic resonance imaging,” in the parlance of the scientists, doctors and marketers who conduct this research. By measuring, analyzing and making inferences, scientists can learn that one part of your brain lights up when you wrestle with a decision; that another is exercised when you shop online; or that a third part makes you fall in love. (One branding expert used fMRI data to claim that Apple users literally adore their devices.)

Such neuroscientific techniques — fMRI is one of many — provide plenty to be excited about. The authors of “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” while sharing in this enthusiasm, offer a more skeptical take. At issue for psychiatrist Sally Satel and clinical psychologist Scott Lilienfeld is “neurocentrism,” or “the view that human experience and behavior can be best explained from the predominant or even exclusive perspective of the brain.” In their concise and well-researched book, they offer a reasonable and eloquent critique of this fashionable delusion, chiding the premature or unnecessary application of brain science to commerce, psychiatry, the law and ethics.

Brain scanning — at least as the technology stands today — suffers from a number of limitations. For starters, it often relies on a one-to-one mapping of cognitive function to brain area that simply doesn’t exist. Most thoughts are distributed, and “most neural real estate is zoned for mixed-use development,” as Dr. Satel and Mr. Lilienfeld write. So just knowing that disgust lights up your insula — a part of the cerebral cortex involved in attention, emotion and other functions — doesn’t imply that whenever the insula lights up you’re disgusted.

Despite such complexities, several firms have profited from selling, and perhaps overselling, fMRI’s capacity to peer into our souls. “Neuromarketers” try to suss out what drives us to buy one product rather than another. But there’s little public data to indicate that their methods work any better than the old standbys of surveys and focus groups. And they can blunder: In 2006, a neuroscientist declared a racy GoDaddy.com Super Bowl ad a flop after it failed to activate viewers’ pleasure centers. It had increased traffic to the site 16-fold.

If neurocentrism’s worst result were inspiring facile, gee-whiz headlines or bilking corporate advertisers out of cash, we could all go home with a good laugh over our obsession with Lite-Brite phrenology. But the neurocentric worldview has also crept into law enforcement and criminal justice. Predictably, defense attorneys try to use brain scans to prove that their clients lack rationality or impulse control and therefore can’t be held legally responsible. Companies such as No Lie MRI and Brain Fingerprinting Laboratories even claim to offer fMRI methods of lie detection.

One process looks for signs of recognition in a suspect’s brain as he views key evidence. This technique is fairly accurate in controlled conditions but requires evidence that has not been altered or leaked — i.e., details that the perpetrator and only the perpetrator would recognize. Another method looks for signs of neural conflict during questioning, indicating suppression of the truth. But no indicator is consistent across all liars or across all types of lies — spontaneous, rehearsed, remorseful, glib. The authors argue that fMRI lie detection is crummy legal evidence, and several courts have excluded such data because their accuracy outside the lab hasn’t been demonstrated.

Mr. Lilienfeld and Dr. Satel, who has worked in methadone clinics, spend a chapter confronting the popular model of addiction as a chronic brain disease. The trouble, they point out, is that most addicts eventually quit. In short, you can choose to stop using, but you can’t choose to stop having, say, Alzheimer’s. Those who promote the brain-disease model of addiction, including the National Institute on Drug Abuse, mean well when they strive to destigmatize addicts. But the authors say this model has distracted from behavioral therapies. Until a cocaine vaccine is available, they write, “the most effective interventions aim not at the brain but at the person.”

There are still more profound perils associated with the neurocentric vision. If the brain is just a biological machine, we have no free will and thus, strictly speaking, no claim to praise or blame. This violates both social norms and our own moral intuitions, and in the book’s final chapter the authors wade deeply into the philosophical debate about this new neurological determinism.

Moral responsibility, they argue, has practical benefits: “No society . . . can function and cohere unless its citizens exist within a system of personal accountability that stigmatizes some actions and praises others.” The position that Dr. Satel and Mr. Lilienfeld adopt is “compatibilism,” which holds that free will may not exist in an “ultimate” sense but exists in an “ordinary” sense, in that we feel free of constraints on our behavior. In everyday life, they argue, we should act as though the “ghost in the machine” were real.

In a book that uses “mindless” accusingly in the subtitle, you might expect an excitable series of attacks on purveyors of what’s variously called neurohype, neurohubris and neurobollocks. But more often than not Dr. Satel and Mr. Lilienfeld stay fair and levelheaded. Good thing, because this is a topic that requires circumspection on all sides. Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.

h1

God the Creator 2 –Benedict XVI

April 5, 2013
Staring across interstellar space, the Cat’s Eye Nebula lies three thousand light-years from Earth. One of the most famous planetary nebulae, NGC 6543 is over half a light-year across and represents a final, brief yet glorious phase in the life of a sun-like star… “We must not in our own day conceal our faith in creation. We may not conceal it, for only if it is true that the universe comes from freedom, love, and reason, and that these are the real underlying powers, can we trust one another, go forward into the future, and live as human beings. God is the Lord of all things because he is their creator, and only therefore can we pray to him. For this means that freedom and love are not ineffectual ideas but rather that they are sustaining forces of reality.”

The Unity of the Bible as a Criterion for Its Interpretation
[Continued from previous post...] So now we still have to ask: Is the distinction between the image and what is intended to be expressed only an evasion, because we can no longer rely on the text even though we still want to make something of it, or are there criteria from the Bible itself that attest to this distinction? Does it give us access to indications of this sort, and did the faith of the church know of these indications in the past and acknowledge them?

Let us look at Holy Scripture anew with these questions in mind. There we can determine first of all that the creation account in Genesis 1, which we have just heard, is not, from its very beginning, something that is closed in on itself. Indeed, Holy Scripture in its entirety was not written from beginning to end like a novel or a textbook.

It is, rather, the echo of God’s history with his people. It arose out of the struggles and the vagaries of this history, and all through it we can catch a glimpse of the rises and falls, the sufferings and hopes, and the greatness and failures of this history. The Bible is thus the story of God’s struggle with human beings to make himself understandable to them over the course of time; but it is also the story of their struggle to seize hold of God over the course of time.

Hence the theme of creation is not set down once for all in one place; rather, it accompanies Israel throughout its history, and, indeed, the whole Old Testament is a journeying with the Word of God. Only in the process of this journeying was the Bible’s real way of declaring itself formed, step by step.

Consequently we ourselves can only discover where this way is leading if we follow it to the end. In this respect — as a way — the Old and New Testaments belong together. For the Christian the Old Testament represents, in its totality, an advance toward Christ; only when it attains to him does its real meaning, which was gradually hinted at, become clear.

Thus every individual part derives its meaning from the whole, and the whole derives its meaning from its end — from Christ. Hence we only interpret an individual text theologically correctly (as the fathers of the church recognized and as the faith of the church in every age has recognized) when we see it as a way that is leading us ever forward, when we see in the text where this way is tending and what its inner direction is.

What significance, now, does this insight have for the understanding of the creation account? The first thing to be said is this: Israel always believed in the Creator God, and this faith it shared with all the great civilizations of the ancient world. For, even in the moments when monotheism was eclipsed, all the great civilizations always knew of the Creator of heaven and earth.

There is a surprising commonality here even between civilizations that could never have been in touch with one another. In this commonality we can get a good grasp of the profound and never altogether lost contact that human beings had with God’s truth. In Israel itself the creation theme went through several different stages. It was never completely absent, but it was not always equally important.

There were times when Israel was so preoccupied with the sufferings or the hopes of its own history, so fastened upon the here and now, that there was hardly any use in its looking back at creation; indeed, it hardly could. The moment when creation became a dominant theme occurred during the Babylonian Exile. It was then that the account that we have just heard — based, to be sure, on very ancient traditions — assumed its present form. Israel had lost its land and its temple.

According to the mentality of the time this was something incomprehensible, for it meant that the God of Israel was vanquished: a God whose people, whose land, and whose worshipers could be snatched away from him. A God who could not defend his worshipers and his worship was seen to be, at the time, a weak God. Indeed, he was no God at all; he had abandoned his divinity. And so, being driven out of their own land and being erased from the map was for Israel a terrible trial: Has our God been vanquished, and is our faith void?

At this moment the prophets opened a new page and taught Israel that it was only then that the true face of God appeared and that he was not restricted to that particular piece of land. He had never been: He had promised this piece of land to Abraham before he settled there, and he had been able to bring his people out of Egypt. He could do both things because he was not the God of one place but had power over heaven and earth.

Therefore he could drive his faithless people into another land in order to make himself known there. And so it came to be understood that this God of Israel was not a God like the other gods, but that he was the God who held sway over every land and people. He could do this, however, because he himself had created everything in heaven and on earth. It was in exile and in the seeming defeat of Israel that there occurred an opening to the awareness of the God who holds every people and all of history in his hands, who holds everything because he is the creator of everything and the source of all power.

This faith now had to find its own contours, and it had to do so precisely vis-a-vis the seemingly victorious religion of Babylon, which was displayed in splendid liturgies, like that of the New Year, in which the re-creation of the world was celebrated and brought to its fulfillment. It had to find its contours vis-a-vis the great Babylonian creation account of Enuma Elish, which depicted the origin of the world in its own fashion.

There it is said that the world was produced out of a struggle between opposing powers and that it assumed its form when Marduk, the god of light, appeared and split in two the body of the primordial dragon. From this sundered body heaven and earth came to be. Thus the firmament and the earth were produced from the sundered body of the dead dragon, but from its blood Marduk fashioned human beings.

It is a foreboding picture of the world and of humankind that we encounter here: The world is a dragon’s body, and human beings have dragon’s blood in them. At the very origin of the world lurks something sinister, and in the deepest part of humankind there lies something rebellious, demonic, and evil. In this view of things only a dictator, the king of Babylon, who is the representative of Marduk, can repress the demonic and restore the world to order.

Such views were not simply fairy tales. They expressed the discomfiting realities that human beings experienced in the world and among themselves. For often enough it looks as if the world is a dragon’s lair and human blood is dragon’s blood. But despite all oppressive experiences the scriptural account says that it was not so. The whole tale of these sinister powers melts away in a few words: “The earth was without form and void.”

Behind these Hebrew words lie the dragon and the demonic powers that are spoken of elsewhere. Now it is the void that alone remains and that stands as the sole power over against God. And in the face of any fear of these demonic forces we are told that God alone, who is the eternal Reason that is eternal love, created the world, and that it rests in his hands. Only with this in mind can we appreciate the dramatic confrontation implicit in this biblical text, in which all these confused myths were rejected and the world was given its origin in God’s Reason and in his Word.

This could be shown almost word for word in the present text — as, for example, when the sun and the moon are referred to as lamps that God has hung in the sky for the measurement of time. To the people of that age it must have seemed a terrible sacrilege to designate the great gods sun and moon as lamps for measuring time. Here we see the audacity and the temperateness of the faith that, in confronting the pagan myths, made the light of truth appear by showing that the world was not a demonic contest but that it arose from God’s Reason and reposes on God’s Word.

Hence this creation account may be seen as the decisive “enlightenment” of history and as a breakthrough out of the fears that had oppressed humankind. It placed the world in the context of reason and recognized the world’s reasonableness and freedom. But it may also be seen as the true enlightenment from the fact that it put human reason firmly on the primordial basis of God’s creating Reason, in order to establish it in truth and in love, without which an “enlightenment” would be exorbitant and ultimately foolish.

To this something further must be added. I just said how, gradually, in confronting its pagan environment and its own heart, the people of Israel experienced what “creation” was. Implicit here is the fact that the classic creation account is not the only creation text of sacred Scripture. Immediately after it there follows another one, composed earlier and containing other imagery.

In the Psalms there are still others, and there the movement to clarify the faith concerning creation is carried further: In its confrontation with Hellenistic civilization, Wisdom literature reworks the theme without sticking to the old images such as the seven days. Thus we can see how the Bible itself constantly readapts its images to a continually developing way of thinking, how it changes time and again in order to bear witness, time and again, to the one thing that has come to it, in truth, from God’s Word, which is the message of his creating act.

In the Bible itself the images are free and they correct themselves continually. In this way they show, by means of a gradual and interactive process, that they are only images, which reveal something deeper and greater.

Christology as a Criterion
One decisive fact must still be mentioned at this point: The Old Testament is not the end of the road. What is worked out in the so-called Wisdom literature is the final bridge on a long road that leads to the message of Jesus Christ and to the New Testament. Only there do we find the conclusive and normative scriptural creation account, which reads: “In the beginning was the Word, and the Word was with God, and the Word was God…. All things were made through him, and without him was not anything made that was made” (John 1:1, 3).

John quite consciously took up here once again the first words of the Bible and read the creation account anew, with Christ, in order to tell us definitively what the Word is which appears throughout the Bible and with which God desires to shake our hearts. Thus it becomes clear to us that we Christians do not read the Old Testament for its own sake but always with Christ and through Christ. Consequently the law of Moses, the rituals of purification, the regulations concerning food, and all other such things are not to be carried out by us; otherwise the biblical Word would be senseless and meaningless.

We read all of this not as if it were something complete in itself. We read it with him in whom all things have been fulfilled and in whom all of its validity and truth are revealed. Therefore we read the law, like the creation account, with him; and from him (and not from some subsequently discovered trick) we know what God wished over the course of centuries to have gradually penetrate the human heart and soul. Christ frees us from the slavery of the letter, and precisely thus does he give back to us, renewed, the truth of the images.

The ancient church and the church of the Middle Ages also knew this. They knew that the Bible is a whole and that we only understand its truth when we understand it with Christ in mind — with the freedom that he bestowed on us and with the profundity whereby he reveals what is enduring through images.

Only at the beginning of the modern era was this dynamic forgotten — this dynamic that is the living unity of Scripture, which we can only understand with Christ in the freedom that he gives us and in the certitude that comes from that freedom. The new historical thinking wanted to read every text in itself, in its bare literalness. Its interest lay only in the exact explanation of particulars, but meanwhile it forgot the Bible as a whole.

In a word, it no longer read the texts forward but backward — that is, with a view not to Christ but to the probable origins of those texts. People were no longer concerned with understanding what a text said or what a thing was from the aspect of its fulfillment, but from that of its beginning, its source.

As a result of this isolation from the whole and of this literal-mindedness with respect to particulars, which contradicts the entire inner nature of the Bible but which was now considered to be the truly scientific approach, there arose that conflict between the natural sciences and theology which has been, up to our own day, a burden for the faith.

This did not have to be the case, because the faith was, from its very beginnings, greater, broader, and deeper. Even today faith in creation is not unreal; even today it is reasonable; even from the perspective of the data of the natural sciences it is the “better hypothesis,” offering a fuller and better explanation than any of the other theories. Faith is reasonable. The reasonableness of creation derives from God’s Reason, and there is no other really convincing explanation. What the pagan Aristotle said four hundred years before Christ — when he opposed those who asserted that everything has come to exist through chance, even though he said what he did without the knowledge that our faith in creation gives us — is still valid today.

The reasonableness of the universe provides us with access to God’s Reason, and the Bible is and continues to be the true “enlightenment,” which has given the world over to human reason and not to exploitation by human beings, because it opened reason to God’s truth and love. Therefore we must not in our own day conceal our faith in creation. We may not conceal it, for only if it is true that the universe comes from freedom, love, and reason, and that these are the real underlying powers, can we trust one another, go forward into the future, and live as human beings. God is the Lord of all things because he is their creator, and only therefore can we pray to him. For this means that freedom and love are not ineffectual ideas but rather that they are sustaining forces of reality.

And so we wish to cite today, in thankfulness and joy, the church’s creed: “I believe in God, the Father Almighty, Creator of heaven and earth.” Amen.

h1

The Ineffable Mystery of God – Fr. Robert Barron

August 21, 2012

The cloister yard of Santa Sabina where it is reputed St. Thomas walked and pondered.

After many years of exile from the courts of Egypt where he had been raised, a Hebrew man named Moses, while tending the flock of his father-in-law on the slopes of Mount Sinai, saw an extraordinary sight: a bush that was on fire but was not being consumed. He resolved to take a closer look. As he approached, he heard a voice: “Moses! Moses! … Come no nearer! Remove the sandals from your feet, for the place where you stand is holy ground” (Exodus 3:5). Then the speaker identified himself as “the God of your father … the God of Abraham, the God of Isaac, the God of Jacob” (Exodus 3:6), and he gave Moses a mission to liberate his people enslaved in Egypt.

When Moses asked for the name of this mysterious speaker, he received the following answer: “I am who am” (Exodus 3:14). Moses was asking a reasonable enough question. He was wondering which of the many gods — deities of the river, the mountain, the various nations — this was. He was seeking to define and specify the nature of this particular heavenly power.

But the answer he received frustrated him. For the divine speaker was implying that he was not one god among many, not this deity rather than that, not a reality that could, even in principle, be captured or delimited by a name. In a certain sense, God’s response amounted to the undermining of the very type of question Moses posed. His name was simply “to be,” and therefore he could never be mastered. The ancient Israelites honored this essential mysteriousness of God by designating him with the unpronounceable name of YHWH.

Following the prompting of this conversation between Moses and God, the mainstream of the Catholic theological tradition has tended not to refer to God as a being, however supreme, among many. Thomas Aquinas, arguably the greatest theologian in the Catholic tradition, rarely designates God as ens summum (the highest being); rather he prefers the names ipsum esse (to be itself) or qui est (the one who is). In fact, Aquinas goes so far as to say that God cannot be defined or situated within any genus, even the genus of “being.” This means that it is wrong to say that trees, planets, automobiles, computers, and God — despite the obvious differences among them — have at least in common their status as beings. Aquinas expresses the difference that obtains between God and creatures through the technical language of essence and existence.

In everything that is not God there is a real distinction between essence (what the thing is) and existence (that the thing is); but in God no such distinction holds, for God’s act of existence is not received, delimited, or defined by anything extraneous to itself. A human being is the act of existence poured, as it were, into the receptacle of humanity, and a podium is the act of existence poured into the form of podium-ness, but God’s act of existence is not poured into any receiving element. To be God, therefore, is to be to be.

Saint Anselm of Canterbury, one of the greatest of the early medieval theologians, described God as “that than which nothing greater can be thought.” At first blush this seems straightforward enough: God is the highest conceivable thing. But the longer one meditates on Anselm’s description, the stranger it becomes. If God were simply the supreme being — the biggest reality among many — then God plus the world would be greater than God alone. But in that case he would not be that than which nothing greater can be thought. Zeus, for example, was, in ancient mythology, the supreme deity, but clearly Zeus plus the other gods, or Zeus plus the world of nature, would be greater than Zeus alone. Thus the God whom Anselm is describing is not like this at all. Though it is a very high paradox, the God whom Anselm describes added to the world as we know it is not greater than God alone.

This means that the true God exceeds all of our concepts, all of our language, all of our loftiest ideas. God (YHWH) is essentially mysterious, a term, by the way, derived from the Greek muein (to shut one’s mouth). How often the prophets and mystics of the Old Testament rail against idolatry, which is nothing other than reducing the true God to some creaturely object that we can know and hence try to control. The twentieth-century theologian Karl Rahner commented that “God” is the last sound we should make before falling silent, and Saint Augustine, long ago, said, “si comprehendis, non est Deus” (if you understand, that isn’t God). All of this formal theologizing is but commentary on that elusive and confounding voice from the burning bush: “I am who am.”

Arguments For God’s Existence
I have firmly fended off the tendency to turn God into an idol, but have I left us thereby in an intellectual lurch, doomed simply to remain silent about God? If God cannot be in any sense defined, how do we explain the plethora of theological books and arguments? After all, the same Thomas Aquinas who said that God cannot be placed in any genus also wrote millions of words about God. Chapter 33 of Exodus gives us a clue to the resolution of this dilemma. Moses passionately asks God to reveal his glory to him, and Yahweh acquiesces. But the Lord specifies, “I will make all my beauty pass before you … But my face you cannot see, for no man sees me and still lives” (Exodus 33:19-20). God then tells Moses that while the divine glory passes by, God will place his servant in the cleft of a rock and cover Moses’s eyes. “Then I will remove my hand, so that you may see my back; but my face is not to be seen” (Exodus 33:22-23). God can indeed be seen in this life, but only indirectly, through his creatures and effects. We can understand him to a degree, but only obliquely, glimpsing him, as it were, out of the corners of our eyes. We see his “back” as it is disclosed in the beauty, the intelligibility, and the contingency of the world that he has made.

Following this principle of indirection, Thomas Aquinas formulated five arguments for God’s existence, each one of which begins from some feature of the created order. I will develop here the one that I consider the most elemental, the demonstration that commences with the contingency of the world. Though the term is technically philosophical, “contingency” actually names something with which we are all immediately familiar: the fact that things come into being and pass out of being. Consider a majestic summer cloud that billows up and then fades away in the course of a lazy August afternoon, coming into existence and then evanescing.

Now think of all of the plants and flowers that have grown up and subsequently withered away, and then of all the animals that have come into being, roamed the face of the earth, and then faded into dust. And ponder the numberless human beings who have come and gone, confirming the Psalmist’s intuition that “our years end like a sigh” (Psalms 90:9).  Even those things that seem most permanent — mountain ranges, the continents themselves, the oceans — have in fact emerged and will in fact fade. Indeed, if a time-lapse camera could record the entire life span of the Rocky Mountains, from the moment they began to emerge to the moment when they finally wear away, and if we could play that film at high speed, those mountains would look for all the world like that summer cloud.

The contingency of earthly things is the starting point of Aquinas’s proof, for it indicates something of great moment, namely, that such things do not contain within themselves the reason for their own existence. If they did, they would exist, simply and absolutely; they would not come and go so fleetingly. Therefore, in regard to contingent things, we have to look outside of them, to an extrinsic cause, or set of causes, in order to explain their existence. So let’s go back to that summer cloud. Instinctually, we know that it doesn’t exist through its own essence, and we therefore look for explanations. We say that it is caused by the moisture in the atmosphere, by the temperature, by the intensity of the winds, and so on, and as far as it goes, that explanation is adequate.

But as any meteorologist will tell us, those factors are altogether contingent, coming into being and passing out of being. Thus we go a step further and say that these factors in turn are caused by the jet stream, which is grounded in the movement of the planet. But a moment’s reflection reveals that the jet stream comes and goes, ebbs and flows, and that the earth itself is contingent, having emerged into existence four billion years ago and being destined one day to be incinerated by the expanding sun.

And so we go further, appealing to the solar system and events within the galaxy and finally perhaps to the very structures inherent in the universe. But contemporary astrophysics has disclosed to us the fundamental contingency of all of those realities, and indeed of the universe itself, which came into existence at the Big Bang some thirteen billion years ago. In our attempt to explain a contingent reality — that evanescent summer cloud — we have appealed simply to a whole series of similarly contingent realities, each one of which requires a further explanation.

Thomas Aquinas argues that if we are to avoid an infinite regress of contingent causes, which finally explain nothing at all, we must come finally to some “necessary” reality, something that exists simply through the power of its own essence. This, he concludes, is what people mean when they use the word “God.” With Aquinas’s demonstration in mind, reconsider that strange answer God gives to Moses’s question: “I am who am.” The biblical God is not one contingent reality among many; he is that whose very nature it is to exist, that power through which and because of which all other things have being.

Some contemporary theologians have translated Aquinas’s abstract metaphysical language into more experiential language. The Protestant theologian Paul Tillich said that “finitude in awareness is anxiety.” He means that when we know in our bones how contingent we are, we become afraid. We exist in time, and this means that we are moving, ineluctably, toward death; we have been “thrown” into being, and this means that one day we will be thrown out of being; and this state of affairs produces fear and trembling. In the grip of this anxiety, Tillich argues, we tend to thrash about, looking for something to reassure us, searching for some firm ground on which to stand.

We seek to alleviate our fears through the piling up of pleasure, wealth, power, or honor, but we discover, soon enough, that all of these worldly realities are as contingent as we are and hence cannot finally soothe us. It is at this point that the scriptural word “My soul rests in God alone” (Psalms 62:1) is heard in its deepest resonance. Our fear — born of contingency — will be assuaged only by that which is not contingent. Our shaken and fragile existence will be stabilized only when placed in relation to the eternal and necessary existence of God. Tillich is, in many ways, a contemporary disciple of Saint Augustine, who said, “Lord, you have made us for yourself, and our hearts are restless till they rest in Thee.”

In 1968 a young theology professor at the University of Tübingen formulated a neat argument for God’s existence that owed a good deal to Thomas Aquinas but that also drew on more contemporary sources. The theologian’s name was Joseph Ratzinger, now Pope Benedict XVI. Ratzinger commences with the observation that finite being, as we experience it, is marked, through and through, by intelligibility, that is to say, by a formal structure that makes it understandable to an inquiring mind. In point of fact, all of the sciences — physics, chemistry, psychology, astronomy, biology, and so forth — rest on the assumption that at all levels, microscopic and macrocosmic, being can be known. The same principle was acknowledged in ancient times by Pythagoras, who said that all existing things correspond to a numeric value, and in medieval times by the scholastic philosophers who formulated the dictum omne ens est scibile (all being is knowable).

Ratzinger argues that the only finally satisfying explanation for this universal objective intelligibility is a great Intelligence who has thought the universe into being. Our language provides an intriguing clue in this regard, for we speak of our acts of knowledge as moments of “recognition,” literally a re-cognition, a thinking again what has already been thought. Ratzinger cites Einstein in support of this connection: “in the laws of nature, a mind so superior is revealed that in comparison, our minds are as something worthless.”

The prologue to the Gospel of John states, “In the beginning was the Word,” and specifies that all things came to be through this divine Logos, implying thereby that the being of the universe is not dumbly there, but rather intelligently there, imbued by a creative mind with intelligible structure. The argument presented by Joseph Ratzinger is but a specification of that great revelation.

One of the particular strengths of this argument is that it shows the deep compatibility between religion and science, two disciplines that so often today are seen as implacable enemies. Ratzinger shows that the physical sciences rest upon the finally mystical intuition that reality has been thought into existence and hence can be known. I say it is mystical because it cannot itself be the product of empirical or experimental investigation, but is instead the very condition for the possibility of analyzing and experimenting in the first place. This is why many theorists have speculated that the emergence of the modern sciences in the context of a Christian intellectual milieu, in which the doctrine of creation through the power of an intelligent Creator is affirmed, is not the least bit accidental.

h1

The Abolition of Man Part One – C.S. Lewis

May 24, 2012

National Review ranked the 1943 book #7 in its 100 Best Non-Fiction Books of the 20th Century list. The Intercollegiate Studies Institute ranked the book as the second best book of the 20th century. In a lecture on Walker Percy, Professor Peter Kreeft of Boston College listed the book as one of five “books to read to save Western Civilization,” alongside Lost in the Cosmos by Walker Percy, Mere Christianity by C.S. Lewis, The Everlasting Man by G.K. Chesterton, Orthodoxy by G.K. Chesterton, and Brave New World by Aldous Huxley.

After the posts of the past couple weeks on pornography, I recalled a Woody Allen line about being on the losing side of the sexual revolution, which dovetailed with this classic C.S. Lewis piece concerning Man’s somewhat questionable conquest of Nature. If you have never read it, please do. A simple but depressing message: We have been sold for slaves.

 

**********************************************

It came burning hot into my mind, whatever he said and however he flattered, when he got me home to his house, he would sell me for a slave.
John Bunyan

*********************************************

`Man’s conquest of Nature’ is an expression often used to describe the progress of applied science. `Man has Nature whacked,’ said someone to a friend of mine not long ago. In their context the words had a certain tragic beauty, for the speaker was dying of tuberculosis. `No matter’ he said, `I know I’m one of the casualties. Of course there are casualties on the winning as well as on the losing side. But that doesn’t alter the fact that it is winning.’

I have chosen this story as my point of departure in order to make it clear that I do not wish to disparage all that is really beneficial in the process described as `Man’s conquest’, much less all the real devotion and self-sacrifice that has gone to make it possible. But having done so I must proceed to analyse this conception a little more closely. In what sense is Man the possessor of increasing power over Nature?

Let us consider three typical examples: the airplane, the wireless, and the contraceptive. In a civilized community, in peace-time, anyone who can pay for them may use these things. But it cannot strictly be said that when he does so he is exercising his own proper or individual power over Nature. If I pay you to carry me, I am not therefore myself a strong man.

Any or all of the three things I have mentioned can be withheld from some men by other men — by those who sell, or those who allow the sale, or those who own the sources of production, or those who make the goods. What we call Man’s power is, in reality, a power possessed by some men which they may, or may not, allow other men to profit by. Again, as regards the powers manifested in the airplane or the wireless, Man is as much the patient or subject as the possessor, since he is the target both for bombs and for propaganda.

And as regards contraceptives, there is a paradoxical, negative sense in which all possible future generations are the patients or subjects of a power wielded by those already alive. By contraception simply, they are denied existence; by contraception used as a means of selective breeding, they are, without their concurring voice, made to be what one generation, for its own reasons, may choose to prefer. From this point of view, what we call Man’s power over Nature turns out to be a power exercised by some men over other men with Nature as its instrument.

It is, of course, a commonplace to complain that men have hitherto used badly, and against their fellows, the powers that science has given them. But that is not the point I am trying to make. I am not speaking of particular corruptions and abuses which an increase of moral virtue would cure: I am considering what the thing called `Man’s power over Nature’ must always and essentially be. No doubt, the picture could be modified by public ownership of raw materials and factories and public control of scientific research. But unless we have a world state this will still mean the power of one nation over others. And even within the world state or the nation it will mean (in principle) the power of majorities over minorities, and (in the concrete) of a government over the people. And all long-term exercises of power, especially in breeding, must mean the power of earlier generations over later ones.

The latter point is not always sufficiently emphasized, because those who write on social matters have not yet learned to imitate the physicists by always including Time among the dimensions. In order to understand fully what Man’s power over Nature, and therefore the power of some men over other men, really means, we must picture the race extended in time from the date of its emergence to that of its extinction. Each generation exercises power over its successors: and each, in so far as it modifies the environment bequeathed to it and rebels against tradition, resists and limits the power of its predecessors. This modifies the picture which is sometimes painted of a progressive emancipation from tradition and a progressive control of natural processes resulting in a continual increase of human power.

In reality, of course, if any one age really attains, by eugenics and scientific education, the power to make its descendants what it pleases, all men who live after it are the patients of that power. They are weaker, not stronger: for though we may have put wonderful machines in their hands we have pre-ordained how they are to use them. And if, as is almost certain, the age which had thus attained maximum power over posterity were also the age most emancipated from tradition, it would be engaged in reducing the power of its predecessors almost as drastically as that of its successors. And we must also remember that, quite apart from this, the later a generation comes — the nearer it lives to that date at which the species becomes extinct — the less power it will have in the forward direction, because its subjects will be so few.

There is therefore no question of a power vested in the race as a whole steadily growing as long as the race survives. The last men, far from being the heirs of power, will be of all men most subject to the dead hand of the great planners and conditioners and will themselves exercise least power upon the future.

The real picture is that of one dominant age — let us suppose the hundredth century A.D. — which resists all previous ages most successfully and dominates all subsequent ages most irresistibly, and thus is the real master of the human species. But then within this master generation (itself an infinitesimal minority of the species) the power will be exercised by a minority smaller still. Man’s conquest of Nature, if the dreams of some scientific planners are realized, means the rule of a few hundreds of men over billions upon billions of men. There neither is nor can be any simple increase of power on Man’s side. Each new power won by man is a power over man as well. Each advance leaves him weaker as well as stronger. In every victory, besides being the general who triumphs, he is also the prisoner who follows the triumphal car.

I am not yet considering whether the total result of such ambivalent victories is a good thing or a bad. I am only making clear what Man’s conquest of Nature really means and especially that final stage in the conquest, which, perhaps, is not far off. The final stage is come when Man by eugenics, by pre-natal conditioning, and by an education and propaganda based on a perfect applied psychology, has obtained full control over himself. Human nature will be the last part of Nature to surrender to Man. The battle will then be won. We shall have `taken the thread of life out of the hand of Clotho’ [One of the three Fates, the daughter of Zeus and Themis {“divine law”}, who spins the thread of human life.] and be henceforth free to make our species whatever we wish it to be. The battle will indeed be won. But who, precisely, will have won it?

For the power of Man to make himself what he pleases means, as we have seen, the power of some men to make other men what they please. In all ages, no doubt, nurture and instruction have, in some sense, attempted to exercise this power. But the situation to which we must look forward will be novel in two respects. In the first place, the power will be enormously increased. Hitherto the plans of educationalists have achieved very little of what they attempted and indeed, when we read them — how Plato would have every infant “a bastard nursed in a bureau”, and Elyot would have the boy see no men before the age of seven and, after that, no women, and how Locke wants children to have leaky shoes and no turn for poetry — we may well thank the beneficent obstinacy of real mothers, real nurses, and (above all) real children for preserving the human race in such sanity as it still possesses. But the man-moulders of the new age will be armed with the powers of an omnicompetent state and an irresistible scientific technique: we shall get at last a race of conditioners who really can cut out all posterity in what shape they please.

The second difference is even more important. In the older systems both the kind of man the teachers wished to produce and their motives for producing him were prescribed by the Tao — a norm to which the teachers themselves were subject and from which they claimed no liberty to depart. They did not cut men to some pattern they had chosen. They handed on what they had received: they initiated the young neophyte into the mystery of humanity which over-arched him and them alike. It was but old birds teaching young birds to fly. This will be changed.

Values are now mere natural phenomena. Judgements of value are to be produced in the pupil as part of the conditioning. Whatever Tao there is will be the product, not the motive, of education. The conditioners have been emancipated from all that. It is one more part of Nature which they have conquered. The ultimate springs of human action are no longer, for them, something given. They have surrendered — like electricity: it is the function of the Conditioners to control, not to obey them. They know how to produce conscience and decide what kind of conscience they will produce. They themselves are outside, above. For we are assuming the last stage of Man’s struggle with Nature. The final victory has been won. Human nature has been conquered — and, of course, has conquered, in whatever sense those words may now bear.

The Conditioners, then, are to choose what kind of artificial Tao they will, for their own good reasons, produce in the Human race. They are the motivators, the creators of motives. But how are they going to be motivated themselves?

For a time, perhaps, by survivals, within their own minds, of the old `natural’ Tao. Thus at first they may look upon themselves as servants and guardians of humanity and conceive that they have a `duty’ to do it `good’. But it is only by confusion that they can remain in this state. They recognize the concept of duty as the result of certain processes which they can now control. Their victory has consisted precisely in emerging from the state in which they were acted upon by those processes to the state in which they use them as tools. One of the things they now have to decide is whether they will, or will not, so condition the rest of us that we can go on having the old idea of duty and the old reactions to it. How can duty help them to decide that? Duty itself is up for trial: it cannot also be the judge. And `good’ fares no better. They know quite well how to produce a dozen different conceptions of good in us. The question is which, if any, they should produce. No conception of good can help them to decide. It is absurd to fix on one of the things they are comparing and make it the standard of comparison.

To some it will appear that I am inventing a factitious difficulty for my Conditioners. Other, more simple-minded, critics may ask, `Why should you suppose they will be such bad men?’ But I am not supposing them to be bad men. They are, rather, not men (in the old sense) at all. They are, if you like, men who have sacrificed their own share in traditional humanity in order to devote themselves to the task of deciding what `Humanity’ shall henceforth mean.

`Good’ and `bad’, applied to them, are words without content: for it is from them that the content of these words is henceforward to be derived. Nor is their difficulty factitious. We might suppose that it was possible to say `After all, most of us want more or less the same things — food and drink and sexual intercourse, amusement, art, science, and the longest possible life for individuals and for the species.

Let them simply say, This is what we happen to like, and go on to condition men in the way most likely to produce it. Where’s the trouble?’ But this will not answer. In the first place, it is false that we all really like the same things. But even if we did, what motive is to impel the Conditioners to scorn delights and live laborious days in order that we, and posterity, may have what we like? Their duty?

But that is only the Tao, which they may decide to impose on us, but which cannot be valid for them. If they accept it, then they are no longer the makers of conscience but still its subjects, and their final conquest over Nature has not really happened. The preservation of the species? But why should the species be preserved? One of the questions before them is whether this feeling for posterity (they know well how it is produced) shall be continued or not. However far they go back, or down, they can find no ground to stand on. Every motive they try to act on becomes at once petitio. It is not that they are bad men. They are not men at all. Stepping outside the Tao, they have stepped into the void. Nor are their subjects necessarily unhappy men. They are not men at all: they are artifacts. Man’s final conquest has proved to be the abolition of Man.
