Archive for the ‘Catholicism For Atheists’ Category

Quantum Physics: The Multiverse of Parmenides 2 — Heinrich Pas

July 10, 2014

Bohr summarized the apparent paradox of particles and waves under the concept of complementarity. After a guest lecture he gave at Moscow University, he left the following aphorism on the blackboard where famous visitors were supposed to leave comments: Contraria non contradictoria sed complementa sunt (Opposites do not contradict but rather complement each other).

But back to Heisenberg, Plato, and the ancient Greeks: As the American philosopher of science Thomas S. Kuhn realized, science in times of scientific revolutions is particularly vulnerable to nonscientific influences. When changes to the scientific paradigm cause a shift in the generally accepted problems and solutions and thus also in the general perception and scientific world view, rational reasons like conformity with facts, consistency, scope, simplicity, and usefulness are not sufficient to understand the evolution of a new theory.

During these times, personal factors such as cultural background can also play a decisive role. And Heisenberg’s background was almost as Greek as it was German: As the son of a professor of Greek language, he became accustomed to Greek philosophy and culture and their reception in early twentieth-century Germany long before he himself learned Latin and ancient Greek in school. His biographer Armin Hermann suggests that the encounter with Plato’s philosophy influenced Heisenberg more than anything else. And not long after Heisenberg studied, climbed, and calculated in Helgoland, Paul Dirac in Cambridge and Erwin Schrodinger in Vienna worked out different but mathematically equivalent versions of quantum physics.

Since the standard interpretation of these works was developed basically in the inner circle around Bohr and Heisenberg, Heisenberg’s background seems particularly relevant for its appreciation. Also, Schrodinger made statements such as “Almost our entire intellectual heritage is of Greek origin” and “science can be correctly characterized as reflecting on the Universe in a Greek way.” And Dirac left on the blackboard in Moscow, right next to Bohr’s principle of complementarity, only the laconic remark, “A physical law has to have mathematical beauty,” a statement that reminds us strongly of Goethe’s transfiguration of the classical worldview:

Nature and art, they seem each other to repel
Yet, they fly together before one is aware;
The antagonism has departed me as well,
And now both of these seem to me equally fair.

And sure enough, quantum physics seems to be a Greek theory after all. This becomes evident when reading the thoughts in the book Die Einheit der Natur (The Unity of Nature) by Heisenberg’s student and friend, Carl Friedrich von Weizsacker, the brother of the later German president, on the centerpiece of quantum physics — the wave-particle duality — and how it can be traced back to the arguments in Plato’s dialogue Parmenides.

Parmenides of Elea (Fig. 3.3) was a Greek philosopher of the pre-Socratic era, around the fifth century BCE. Of his writing only the fragment of a philosophical poem remains; it deals with the unity of all being. It describes how an unnamed goddess, often understood as Persephone, invites the poet to perceive the truthful being, again a likely reference to the mystical experience in the mystery cults of Eleusis.

The truthful being then is distinguished from mere appearances and described as the all-embracing One — uncreated and indestructible, alone, complete, immovable, and without an end — reminiscent of Aldous Huxley’s stage of egolessness. One is the All is correspondingly the central statement that Weizsacker follows up when he discusses the argument between Socrates and Parmenides chronicled by Plato, which, according to the Italian author Luciano De Crescenzo, was the “most boring and complicated discussion in the entire history of philosophy.”

In this battle of words, which supposedly took place on the occasion of a visit of Parmenides to Athens, Socrates tried to refute the identity of One and All. To this end Socrates argued that One is not Many and thus has no parts. On the other hand, All refers to something that lacks none of its parts. Consequently the One would consist of parts if it were All, and thus, finally, One cannot be the All.

At this point Weizsacker comes to Parmenides’s defense by stressing the connection with quantum mechanics. And it is really astounding how the quantum mechanical interpretation of the One suddenly bestows this incomprehensible debate with lucidity and meaning. After all, in quantum mechanics the All is the wave function and, in its fullest manifestation, the all-embracing wave function of the universe. 

Moreover, in quantum mechanics the analysis of the individual parts of an object without destroying the object is impossible, since the measurement, as explained above, affects the object and thus distorts the unity of its parts. And of all possible states an object can assume, only an infinitesimally small fraction are states in which the parts of the object actually correspond to a clearly defined outcome of a measurement. Only in these states can one truthfully assign reality or existence to these parts.

For example, among the infinitely many possible states that Schrodinger’s cat can assume (such as 90 percent alive and 10 percent dead, or 27.3 percent alive and 72.7 percent dead), only two — namely totally dead or totally alive — correspond to possible outcomes in a measurement. But quantum mechanically, a pair of two cats, half of them dead and the other half alive, is realizable not only with one living and one dead cat but also with two half-dead cats, or with one being 70 percent alive and one being 30 percent alive.
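For readers who like to see the bookkeeping behind such statements, here is a minimal sketch (Python with NumPy; the code and numbers are illustrative, not from the book): amplitudes are square roots of the quoted percentages, and squaring them recovers the probabilities.

```python
import numpy as np

# One "cat" as a two-level system: basis index 0 = alive, 1 = dead.
alive = np.array([1.0, 0.0])
dead = np.array([0.0, 1.0])

# A "90 percent alive, 10 percent dead" superposition:
# amplitudes are square roots of the probabilities.
cat = np.sqrt(0.9) * alive + np.sqrt(0.1) * dead
print(np.abs(cat) ** 2)  # -> [0.9 0.1], the outcome probabilities

# A pair of cats, "half dead and half alive" as a whole: the entangled
# state (|alive,dead> + |dead,alive>)/sqrt(2), built via tensor products.
pair = (np.kron(alive, dead) + np.kron(dead, alive)) / np.sqrt(2)
print(np.abs(pair) ** 2)  # 50/50 over |alive,dead> and |dead,alive>
```

Only the basis states alive and dead correspond to definite measurement outcomes; every other normalized combination is a legitimate state of the cat, which is exactly the point of the passage above.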

Consequently, in quantum physics the All is really more than its parts, the partial objects actually constituting through their association a new entity, or, just as postulated by Parmenides, a new unity, a new One. 

Now Parmenides, according to Plato, required further that the One possesses no properties: It has no beginning, no center, and no end; no shape and no location; it is neither in itself nor in anything else; it is neither at rest nor is it moving. Weizsacker can argue that a quantum mechanical object fulfills these requirements perfectly.

After all, a determination of any of these properties relies on a measurement, which implies a collapse of the wave function and thus destroys the unity of the collective object. On the other hand, isolation of the object from the surrounding universe is impossible: The object would not exist in the universe if it were not connected to the universe via some kind of interaction. 

Thus, strictly speaking, only the universe as a whole can constitute a real quantum mechanical object. 

Then, however, nobody would remain who could observe it from outside. Next Weizsacker and Parmenides follow the discussion backward: how the One — meaning the all-embracing universe devoid of all properties — unfurls into the colorful and multifaceted appearances of our everyday life. The argument relies here on the quirky assumption that the One, in the instant where it “is” — in the sense of exists — is already two things. It is the One and it is the Is. This argument can be iterated. Again both the One and the Is are two things: the Is is and is the One, and the One is and is the One. By repetition of this consideration the One acquires an infinite multiplicity: The being One unfolds itself into the universe. And again Weizsacker clarifies the discourse by referring to the quantum mechanical object.

After all, the way an object can exist is via interaction with other objects, which again results in the collapse of the wave function and the loss of quantum mechanical unity: In order to establish that an object exists, the object has to be measured and thus is affected in a way that implies that it is no longer one object according to the meaning of Parmenides’s One. In summary, Weizsacker arrives at an amazing conclusion: the notion of complementarity has its source in ancient Greece. “We find . . . the foundation of complementarity already foretold in Plato’s Parmenides.” We actually can recover the feel of what the ancient Greeks experienced in their mystery cults in modern twentieth-century physics!

But this is not the end of the story: The atomism of Democritus, the idea that the world is not continuously divisible but made out of indivisible particles, makes sense only in the context of quantum mechanics, where matter consists of compound objects that correspond to standing waves and thus can absorb or emit energy only in indivisible portions, the quanta.

Also, the idea of tracing the laws of nature back to fundamental symmetries, as proposed in Plato’s Timaeus, is an integral part of contemporary particle physics. Finally, consider Einstein’s objection to the fundamental importance of probabilities. Because of that objection, he remained a lifelong opponent of quantum mechanics: God doesn’t play dice with the world. This statement appears as a direct response to the 2,500-year-old fragment of Heraclitus: “Eternity is a child moving counters in a game; the kingly power is a child’s.”

How can one really comprehend the lack of causality inherent in quantum physics, and in particular the role of the puzzling quantum collapse, which are not described by the mathematical formalism and remain controversial today? The most modern and consistent interpretation of these puzzling phenomena seems to be at the same time the craziest one: Everything that can happen does happen, albeit in different parallel universes.

This idea was formulated for the first time in 1957 by Hugh Everett III while he was working on his doctoral dissertation at Princeton University. With the bizarre concept of parallel universes he asked too much of his contemporary physicists, even though Everett — like Richard Feynman, a founder of quantum electrodynamics, and Kip Thorne, the father of the wormhole time machine — was a student of the eminent John Archibald Wheeler, who was himself a rather unorthodox and creative associate of Einstein and who, among many other achievements, coined the term black hole for the timeless star corpses in the universe.

But even with this first-class mentor, Everett’s colleagues didn’t take him seriously. Everett left the academic world shortly after finishing his dissertation. During a frustrating visit to Copenhagen, in which he tried in vain to interest Niels Bohr in his work, Everett transformed a standard approach in classical mechanics into an optimization method that he could apply to commercial and military problems and that helped him become a multimillionaire — but didn’t make him happy. He became a chain-smoking alcoholic and died of a heart attack when he was only fifty-one years old.

According to his explicit wish, his ashes were disposed of in the garbage. Fourteen years after his death, his daughter Elizabeth, who suffered from schizophrenia, committed suicide. In her suicide note she wrote that she was going into a parallel universe to meet her father. His son Mark Everett became the famous rock star E, lead singer of the Eels. He described his father as distant, depressed, and mentally absent, and his own childhood as strange and lonely. Only his music saved him. But he also expressed sympathy for his father: “These guys, I don’t think they should be held to subscribe to normal rules. I think that about rock stars, too.” Hugh Everett’s ideas about quantum physics were finally popularized in the 1970s by his advisor Wheeler and Bryce DeWitt, who had also worked with Wheeler. It was DeWitt who added the “many-worlds” label, a term that Wheeler never liked.

The interpretation essentially states that every measurement results in a split of the universe. Every possible outcome of a measurement — or more generally of any physical process — is realized, but in different parallel universes. If a guy chats up a girl in a dance club, there is always a universe where the two of them get happily married and remain in love until they die, but also another one where she tells him to back off, he has too much to drink, and he wakes up the next morning with a serious hangover. This very insight made me particularly nervous when I prepared to jump out of an airplane 4,000 meters above Oahu’s north shore. After all, even if I survived in this universe, there are always countless universes where the parachute did not open. So somewhere one loses, every time. But somewhere there is also a parallel universe where Everett still lives happily together with his daughter.

The major advantage of the many-worlds interpretation, compared with the classical Copenhagen interpretation, is that no collapse of the wave function — which, in any case, is not really part of the theory — has to be assumed. Even after the measurement has been performed, both possible outcomes, like an electron at place A and an electron at place B, coexist, but they decouple, so that an observer who measures the electron at place A does not notice the alternative reality with the electron at place B.

In contrast to the collapse of the wave function, this process of decoupling can be described within the formalism of quantum mechanics. Perhaps this process, so-called decoherence, is the only reason we witness so little quantum weirdness in our everyday lives. The drawback of the many-worlds interpretation, however, is that we have to give up the concept of a unique reality.

The interaction of different parallel universes is suppressed after a measurement, but not totally lost. So even in our daily lives we don’t reside in clearly defined conditions such as dead or alive. The parallel universes in which we and our fellow human beings experience totally different fates instead resonate as unobservable tiny admixtures of alternative realities into our universe.
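The claim that the alternative branches are suppressed but never exactly zero can be made concrete with a toy calculation. Here is a minimal sketch (Python with NumPy; the exponential suppression factor is an assumption for illustration, not a model from the book): decoherence shrinks the off-diagonal interference terms of a density matrix as the coupling to the environment grows, without ever setting them exactly to zero.

```python
import numpy as np

# A qubit in an equal superposition, psi = (|0> + |1>)/sqrt(2).
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())  # density matrix of the pure state

# Toy decoherence: damp the off-diagonal (interference) terms by a
# factor that decays with the strength of the environment coupling.
for coupling in [0.0, 1.0, 10.0]:
    damp = np.exp(-coupling)  # assumed suppression factor
    rho_dec = rho.copy()
    rho_dec[0, 1] *= damp
    rho_dec[1, 0] *= damp
    # The residual off-diagonal term gets tiny but is never exactly zero:
    print(f"coupling={coupling:5.1f}  interference term={rho_dec[0, 1]:.2e}")
```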

Thus the many-worlds interpretation exhibits the Parmenidic-neo-Platonic nature of quantum mechanics most clearly. According to this point of view, the unity of the different realities is not completely lost. It is actually possible to recognize the multiverse — the collection of all of Everett’s parallel universes — directly as Parmenides’s primeval One: the unity of the world the ancient Greeks felt they had lost in the charted modern world, and for whose reunification with the individualized ego they looked in the ecstasy of their mystery cults, in their Dionysian arts, or in the flush induced by psychedelic drugs.

The bizarre properties of quantum physics naturally inspired the fantasies of both journalists and authors. The parallel existence of different realities in quantum physics, for example, became the subject of a Physics World cover in 1998, which depicts a couple on the phone arguing as follows: “Oh Alice . . . you’re the one for me.” “But Bob . . . in a quantum world . . . how can we be sure?”

An even more radical take on the many-worlds interpretation can be found in Douglas Adams’s The Hitchhiker’s Guide to the Galaxy. Whenever the extraterrestrial crackpot Zaphod Beeblebrox, double-headed and addicted to Pan Galactic Gargle Blasters, starts the Infinite Improbability Drive, his stolen spaceship is located in all places in the universe simultaneously, and tiny probabilities are amplified. In the novel this allows the spaceship to travel faster than light, and also causes various strange incidents, such as when a threatening pair of rockets gets suddenly transformed into a dumbfounded whale and a flowerpot.

Finally, and now I am serious again, the many-worlds interpretation could protect time travelers from ludicrous paradoxes, and in this way make time travel a meaningful physics concept. But we’ll get to that later…

Quantum Physics: The Multiverse of Parmenides 1 — Heinrich Pas

July 9, 2014
From left to right: Enrico Fermi, godfather of the neutrino; Werner Heisenberg, a creator of quantum mechanics; and Wolfgang Pauli, the father of the neutrino.

A major breakthrough in the story of quantum physics begins with a young man holed up in a rain pipe in order to find a quiet place for reading. It is the year 1919, in Munich, shortly after the end of World War I. The chaotic rioting in the streets that followed the revolution driving the German emperor out of office has finally calmed down, and now eighteen-year-old Werner Heisenberg can find some leisure time again.

He had been working as a local guide, assisting a vigilante group that was trying to reestablish order in the city, but now he could retreat, after the night watch on the command center’s hotline, onto the roof of the old seminary where his cohort was accommodated. There he would lie, in the warm morning sun, in the rain pipe, reading Plato’s dialogues.

And on one of these mornings, while Ludwig Street below him and the university building across the way with the small fountain in front slowly came to life, he came across that part in Timaeus where Plato philosophizes about the smallest constituents of matter, and the idea that the smallest particles can finally be resolved into mathematical structures and shapes, that one would encounter symmetries as the basic pillar of nature, an idea that would fascinate him so deeply that it would capture him for the rest of his life.

Werner Heisenberg was to become one of the most important physicists of his generation. Having just turned forty, he was the head of the German nuclear research program, which in World War II examined the possibilities for utilizing nuclear power, including the feasibility of nuclear weapons. In this position he was on the assassination list of the US Office of Strategic Services, but a special agent who had permission to kill Heisenberg in a lecture hall decided against it, after he heard Heisenberg’s lecture on abstract S-matrix theory and concluded that the practical usefulness of Heisenberg’s research was marginal.

Even today, historians debate Heisenberg’s role in Nazi Germany. His opponents criticize his remaining in Germany and his commitment to the nuclear research project, the so-called Uranverein, which, according to these critics, failed to build a nuclear weapon for Hitler only because Heisenberg was unable to do it. Extreme admirers, such as Thomas Powers in Heisenberg’s War, argue that Heisenberg used his position to prevent the construction of a German nuclear bomb by exaggerating its difficulties when questioned by officials, bestowing on Heisenberg a moral mantle he had never claimed for himself.

What is well documented is that Heisenberg traveled to Copenhagen, Denmark, in the fall of 1941 to visit his fatherly friend and mentor Niels Bohr. According to Heisenberg, his intention was to inform Bohr that the construction of a nuclear bomb was possible but that the German physicists would not try to build it and to suggest that physicists in the allied nations should follow the same policy. This epic conversation, however, only resulted in a lasting breakdown of their friendship. Bohr, the son of a Jewish mother and the citizen of an occupied country, could not have much sympathy for any agreement with the German physicist.

In 1998, the British author Michael Frayn wove different perceptions of this meeting into a play that essentially deals with the parallel existence of different realities, both in psychology and in quantum mechanics. After all, among all his other activities, Heisenberg was famous for one thing: he was one of the masterminds of a revolutionary new theory.

Just six years after the sunny morning in the rain pipe, Heisenberg, now twenty-three years old and a postdoc at the University of Gottingen, was forced by his hay fever to leave his institute for two weeks, and he spent some sleepless time on Helgoland, a tiny and once holy red rock off Germany’s coast in the North Sea, days that would shatter the most basic foundations of physics. One-third of the day the young man climbed in the famous cliffs; one-third he memorized the works of Goethe, the poet who served as a national idol in Germany and who followed the classical paradigm of the ancient Greeks; and the last third he worked on his calculations.

In these calculations he developed a formalism that would be the bedrock of modern quantum physics and would do nothing less than change the world: “In Helgoland there was one moment when it came to me just as a revelation . . . . It was rather late at night. I had finished this tedious calculation and at the end it came out correct. Then I climbed a rock, saw the sun rise and was happy.”

Nowadays the technical applications of quantum physics account for about one-third of the US gross domestic product. Nevertheless, Richard P. Feynman commented some forty years after Heisenberg’s work that the theory is so crazy that nobody can actually comprehend it, and Einstein had earlier declared bluntly: this is obvious nonsense. What makes quantum physics special is that this theory breaks radically with the concept of causality. In our daily lives we are used to ordered sequences of cause and effect: You and a friend clink your glasses with just a little bit too much verve; one glass breaks; beer runs down to the floor; your significant other/roommate/parents cry out.

One event causes the next one. This is exactly where quantum physics is different, where this strict connection between cause and effect no longer exists. For example, how a particle reacts to an influence can be predicted only in terms of probabilities. But this is not the end of the story: Unless the effect on the particle is actually observed, all possible consequences seem to be realized simultaneously. Particles can reside in two different locations at once! And particles exhibit properties of waves, while waves behave in certain situations like particles.

An object thus has both properties of a particle and of a wave, depending on how it is observed. The particle corresponds to an indivisible energy portion of the wave, a so-called quantum. On the other hand, the wave describes the probability for the particle to be located at a certain place. This property of quantum mechanics can be depicted most easily with the famous double-slit experiment (Figure below).

Figure 3.2. Double-slit experiment. As long as no measurement determines which slit the particles are passing through, they behave like interfering waves, which pass simultaneously through both slits (left side). Where two wave crests coincide, the probability of detecting a particle is largest; where a crest coincides with a trough, the probability is very small or zero. The resulting image is called an interference pattern. As soon as an external measurement disturbs the system, for example, if one uses irradiation with light to determine which path the electrons take through the slits, the wave collapses into single particles, which accumulate in narrow bands behind the slits they were flying through (right side).

When a particle beam hits a thin wall with two narrow slits in it, the corresponding wave penetrates both slits and spreads out on the other side as a circular wave. On a screen situated behind the wall, in accordance with the wave nature of the electrons, an interference pattern appears, resulting from the superposition of the waves originating from the two slits in the wall.

Where a crest meets another crest or a trough meets another trough the wave gets amplified. A crest encountering a trough, on the other hand, results in little or no amplitude (left side). This pattern appears, however, only as long as it is unknown through which slit a single electron passed. As soon as this is determined, for example by blocking one of the slits or by irradiating the electrons with light, the two-slit interference pattern gets destroyed and the electrons behave just like classical particles. To be more accurate, a new wave emanates from the slit, and the pattern exhibited on the screen is the one for a wave passing through a single slit, which resembles a smooth probability distribution (right side).
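For readers who want to see the arithmetic behind "crest meets crest" versus "crest meets trough," here is a minimal numerical sketch (Python with NumPy; the geometry and wavelength are made-up illustrative values, not from the text): with both slits open one adds the two complex amplitudes and then squares, while a which-path measurement amounts to adding the two intensities instead, which washes the pattern out.

```python
import numpy as np

# Toy double-slit: two coherent point sources; screen at distance L.
wavelength = 1.0
k = 2 * np.pi / wavelength          # wave number
slit_sep, L = 5.0, 100.0            # slit separation and screen distance
x = np.linspace(-30.0, 30.0, 7)     # sample positions on the screen

r1 = np.hypot(L, x - slit_sep / 2)  # path lengths from each slit
r2 = np.hypot(L, x + slit_sep / 2)

# Both slits open: amplitudes add first, then square -> interference.
interference = np.abs(np.exp(1j * k * r1) + np.exp(1j * k * r2)) ** 2

# Which-path known: intensities add -> smooth, classical distribution.
classical = np.abs(np.exp(1j * k * r1)) ** 2 + np.abs(np.exp(1j * k * r2)) ** 2

print(interference)  # oscillates between ~0 (crest meets trough) and ~4
print(classical)     # flat: always 2, no pattern
```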

Heisenberg and Bohr interpreted this as a collapse of the wave function due to the measurement process, in which one gets a result with the probability given by the amplitude squared of the wave. This is the so-called Copenhagen interpretation of quantum physics, which is still taught at universities around the globe. According to this interpretation, a particle is located in many places simultaneously until finally a measurement assigns it a concrete location. And this is true not only for position; it applies to other measurable quantities such as momentum, energy, the instant of a nuclear decay, and other properties as well.

Erwin Schrodinger, both collaborator with and competitor of Heisenberg in the development of quantum physics, carried this idea to an extreme: “One can even set up quite ridiculous cases. A cat is penned up in a steel chamber, along with the following device (which must be secured against direct interference by the cat).”

In Schrodinger’s experiment the death or life of the cat depends on whether a radioactive isotope does or doesn’t decay in a particular time period. As long as we do not check whether the isotope did decay or not, nor how the cat is doing, Schrodinger’s cat is simultaneously dead and alive, or as Schrodinger phrased it: “[The wave function of the system would have] in it the living and dead cat (pardon the expression) mixed or smeared out in equal parts.”

There are two reasons why we don’t observe such bizarre phenomena in our daily lives: One is that the wavelengths of ordinary objects around us are tiny compared with the sizes of the objects themselves. The other is that the objects we deal with every day are always interacting with their environment and thus are being measured all the time. A beer bottle, for example, may very well be situated in two different locations, but only for an extremely short time and for an extremely small separation (too short and too small to measure).
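The "tiny wavelengths" point can be checked with one line of arithmetic via the de Broglie relation, wavelength = h / (m * v). A minimal sketch follows (Python; the masses and speeds are rough illustrative guesses, not values from the text):

```python
# de Broglie wavelength, lambda = h / (m * v).
h = 6.626e-34  # Planck's constant in J*s

for name, mass_kg, speed_m_s in [
    ("electron", 9.109e-31, 1.0e6),   # a fairly fast electron
    ("beer bottle", 0.5, 1.0),        # a half-kilogram bottle at strolling pace
]:
    wavelength = h / (mass_kg * speed_m_s)
    print(f"{name}: {wavelength:.2e} m")

# electron:    ~7e-10 m, comparable to atomic scales -> visible quantum effects
# beer bottle: ~1e-33 m, absurdly small compared to the bottle itself
```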

In the Beginning There Was an Atom — Amir D. Aczel

May 15, 2014
Georges Lemaître (1894-1966), Belgian cosmologist, Catholic priest, and father of the Big Bang theory

With respect to the big-bang theory, anyway, science and faith are not at odds.

*********************************************************

According to a recent Associated Press poll, a majority of Americans — 51% — do not believe the universe began with the “big bang.”

The skepticism of half the country may seem startling, given how essential the big-bang theory is to modern cosmology, but there is a good reason for it. The big bang is at first hard to swallow. I am a physics writer, and yet I remember how perplexed I was many years ago when I heard MIT cosmologist Alan Guth describe the universe expanding within a fraction of a second from the size of an atom to “as big as a marble.” My initial thought was: How could he possibly know the size of the entire universe when it was less than a second old? Believing in the big bang seemed to require a leap of faith.

And if you feel uncomfortable with big-bang cosmology, you’re in excellent company: The greatest physicist of the 20th century, Albert Einstein, stubbornly refused to believe in it. Ironically, it was a Catholic priest who first came up with the big-bang idea in 1927. The Belgian priest Georges Lemaître, who was also an astronomer and physicist, theoretically deduced the expansion of the universe and proposed that it was launched from a “primeval atom” — the process later known as the big bang.

At the time of Lemaître’s prescient idea, not only Einstein but other physicists and astronomers believed that the universe was static, with no beginning or end. Lemaître did not buy this supposition. He believed in the story of Genesis, which outlines the birth of the universe, and he searched for a way to prove it scientifically. He presented his complicated mathematical results on the beginning of the universe — based on Einstein’s own general theory of relativity — in a 1931 meeting in London of the British Science Association that was dedicated to the relationship between science and spirituality.

This was after Edwin Hubble’s astronomical observations of 1929 had proved that Lemaître was right about the expansion of the universe, and as the news about Hubble’s discovery spread around the world, Einstein and many other scientists eventually came to accept the big-bang theory.

On March 17 of this year, in a dramatic news conference held at the Harvard-Smithsonian Center for Astrophysics, the Background Imaging of Cosmic Extragalactic Polarization (Bicep) research group of astronomers presented their discovery of gravitational waves, which confirmed the existence of this major theoretical phenomenon associated with Einstein’s general relativity, thus providing overwhelming evidence for the big-bang theory. It also strongly supported cosmic inflation, a mechanism by which the early universe expanded from the size of an atom to that of a marble and beyond — just as predicted by Alan Guth three decades ago.

And so the big-bang theory is verified not only by the Bicep evidence but also by decades of data on the microwave background radiation in space (“embers of the big bang”) as well as high-energy particle collisions from the Large Hadron Collider (a tiny-scale simulation of the big bang). It also fundamentally does not conflict with scripture. So why do so many deny it?

The culprits might be “scientific atheists,” a small but vocal group of thinkers who employ science to claim that there is no God. Some argue that the universe came into existence all on its own. In particular, physicist Lawrence M. Krauss’s 2012 book “A Universe from Nothing” insists that the big bang occurred within a complete emptiness, and thus there is no need for a “God.” But the key assumption of Mr. Krauss’s conjecture is flawed and at odds with modern cosmology. The big bang did not occur in “nothing.” It had to be spawned in some kind of pre-existent medium, known by physicists as “quantum foam,” though we don’t know exactly what it is.

Despite the damage scientific atheists are doing to public opinion, the truth is that — at least with respect to big-bang cosmology — science and faith are not at odds. For it was the story in Genesis that inspired the big bang’s founder to discover how the universe came to be. And it was Genesis that provided the stimulus for the first mathematical calculations that led to the “primeval atom.” The 51% of Americans who deny the big bang — if they do so because they think the theory conflicts with faith — should come to trust our science.

 

Book Review: The Language of God by Francis S. Collins

May 8, 2014
Francis Sellers Collins (born April 14, 1950) is an American physician-geneticist noted for his discoveries of disease genes and his leadership of the Human Genome Project (HGP). He currently serves as Director of the National Institutes of Health (NIH) in Bethesda, Maryland. He is the atheists’ greatest fear: a scientist who believes in God and lives a life of faith AND science.

Stephen M. Barr is a theoretical particle physicist at the Bartol Research Institute of the University of Delaware and author of Modern Physics and Ancient Faith. I am performing a little blog housecleaning here, redoing my pages and using them to introduce the categories and all the posts contained therein; hopefully this will show readers of payingattentiontothesky.com how much “stuff” lurks under their mouse, just a click away…

*********************************

“Today we are learning the language in which God created life.” With these words, President Clinton announced one of the great feats of modern science, the mapping of the human genome. Standing next to him in the East Room of the White House was the leader of the Human Genome Project, Francis S. Collins.

Collins has now written a book, The Language of God, but it is not the sort of book one might have expected him to write, for only a small part is devoted to the genome project. Rather, Collins has written the story of his other great discovery: the discovery not of new truths but of old truths. It is the story of how and why he came to believe in God.

As such, this book is almost unique. There are many conversion stories and many scientific autobiographies, but few books in which prominent scientists tell how they came to faith. If nothing else, Collins’ book gives the lie, in most spectacular fashion, to the claim made by Richard Dawkins in an interview not long ago: “You won’t find any intelligent person who feels the need for the supernatural,” Dawkins declared, “unless [he was] brought up that way.”

Francis Collins was not brought up that way; his family’s view was that religion “just wasn’t very important.” Almost the only contact Collins had with religion as a child was singing in the choir at the local Episcopal church, where his parents had sent him to learn music with the admonition that he shouldn’t take the theology too seriously. After discovering, in high-school science classes, “the intense satisfaction of the ordered nature of the universe,” Collins entered the University of Virginia at the age of sixteen to major in chemistry.

Up to then, he had given little thought to religion, though in his early teens he had had “occasional moments of . . . longing for something outside myself,” most often associated with profound experiences of nature or of music. Exposed to the challenges of “one or two aggressive atheists” in his dorm, however, he quickly concluded that no religion had any “foundational truth.”

The mathematical elegance of physics drew him into physical chemistry, where he was “immersed in quantum mechanics and second-order differential equations” and “gradually became convinced that everything in the universe could be explained on the basis of equations and physical principles.” Discovering that Einstein, one of his heroes, had not believed in the God of the Jewish people, Collins concluded that “no thinking scientist” could take the idea of God seriously, and he “gradually shifted from agnosticism to atheism.”

While working on his doctorate at Yale, Collins happened to take a course in biochemistry and was “astounded” by DNA and proteins “in all of their satisfying digital glory.” It was a “revelation” to him that mathematics and “rigorous intellectual principles” could be applied to biology, a field he had previously disdained. Around this time, however, he began to wonder how he could “make a difference in the lives of real people” and whether he was cut out for a life of research. And so, just before completing his degree in chemistry, he switched to medical school.

It was in medical school that his atheism suffered a blow: “I found the relationships [I] developed with sick and dying patients almost overwhelming.” The strength and solace so many of them derived from faith profoundly impressed him and left him thinking that “if faith was a psychological crutch . . . it must be a very powerful one.” His “most awkward moment” came when an older woman, suffering from a severe and untreatable heart problem, asked him what he believed. “I felt my face flush as I stammered out the words ‘I’m not really sure.’”

Suddenly it was brought home to him that he had dismissed religion without ever really considering, or even knowing, the arguments in its favor. How could someone who prided himself on his scientific rationality do that? He was deeply shaken and felt impelled to carry out an honest and unprejudiced examination of religion. Attempts to read the sacred scriptures of various world religions left him baffled, however, so he sought out a local Methodist minister and asked him point-blank “whether faith made any logical sense.” The minister took a book down from his shelf and handed it to him. It was C.S. Lewis’ Mere Christianity.

Lewis gave Collins a simple, though crucial, insight: God is not a part of the physical universe and therefore cannot be perceived by the methods of science. Yet God speaks to us in our hearts and minds, both in such “longings” for the transcendent as Collins had himself experienced and in the sense of objective right and wrong, “the Moral Law.” A key aspect of this moral sense is “the altruistic impulse, the voice of conscience calling us to help others even if nothing is received in return.” Such altruism, says Collins, “is quite frankly a scandal for reductionist reasoning,” for it goes directly contrary to the selfishness of the “selfish gene.”

Collins reviews some of the attempts to explain altruism in evolutionary terms. One theory is that our primate ancestors rated altruism a positive attribute in potential mates. Another is that altruism provided survival advantages to its practitioners through “indirect reciprocal benefits.” A third is that altruism benefited the whole group in which it was prevalent rather than the individuals who practiced it.

Collins explains why none of these theories works. He then goes on to discuss several common objections to belief in God that troubled him at first but to which he was able to find satisfactory answers with the help of Lewis and other Christian writers. Collins presents these answers in clear, simple, and appealing language. Their power lies not only in strength of argument but also in their personal character, as when he discusses the problem of evil in the context of a tragedy that befell his own daughter.

Collins also examines what science has to say about the origins of the universe, life, and human beings. As he traces the history of the universe, he points to three discoveries that bolster the case for a creator. One is the “existence of mathematical principles and order in creation,” laws whose “mathematical representation invariably turns out to be elegant, surprisingly simple, and even beautiful.”

Another is the Big Bang, the putative beginning of the universe about fourteen billion years ago. And a third is the remarkable concatenation of “anthropic” coincidences and fine-tunings in the laws of physics that made possible the evolution of life.

It is interesting that Collins, a biologist, should take most of his “evidence for belief” from physics. As someone who came to biology through the physical sciences, he is obviously keenly aware of what Pope Benedict has called “the mathematical structure of matter, its intrinsic rationality, . . . the Platonic element in the modern understanding of nature.” One notes, by contrast, that some of the biologists who are most outspoken in their atheism have come from a background in zoology rather than the physical sciences. It may be that the scientists most susceptible to crude materialism are those who know the least about matter.

The physics and cosmology in the book are well done, but Collins’ discussion of the Big Bang is open to several criticisms. It is not quite accurate to say that the Big Bang “forces the conclusion that nature had a defined beginning.” Most physicists and cosmologists think it possible that the Big Bang was only the beginning of one phase of the universe’s history; the conclusion that the universe had a beginning at some point (whether at the Big Bang or earlier) is not yet forced by the physics alone. Collins also too simply equates the creation of the universe with the fact that it had a beginning in time.

Even a universe that had no beginning in time would still require its existence to be explained. And finally, there are points at which Collins seems to speak of the Big Bang as miraculous in the sense that the laws of physics broke down there, which is very doubtful. To be fair, these are issues that may be too subtle for a satisfactory treatment in a book aimed at such a wide audience. And Collins’ main point is certainly valid: Nature could not have created itself, and the Big Bang, by underlining the contingency of the world’s existence, supports the idea of creation.

As Collins moves from discussing the origin and development of the physical universe to the origin and development of life, he must enter on the battle-scarred terrain of evolution, a subject that takes up most of the latter half of the book. Here his message and his primary audience change. Up to this point he has been speaking on behalf of religious belief.

He now turns around and speaks to his fellow Christians, especially his fellow evangelicals, on behalf of evolution. His fundamental purpose, however, remains the same: “to call a truce in the escalating war between science and spirit,” a war that “was never really necessary” but “was initiated and intensified by extremists on both sides.”

Collins is appalled that “Young Earth Creationism is the view held by approximately 45 percent of Americans” and that “many evangelical Christian churches are aligned” with it. The persistence of this view, which is at once so theologically simplistic and scientifically indefensible, is “one of the great puzzles and tragedies of our time.” The danger is not to science but to faith: “Young people brought up in homes and churches that insist on Creationism sooner or later encounter the overwhelming scientific evidence in favor of an ancient universe and the relatedness of all living things through the process of evolution and natural selection. What a terrible and unnecessary choice they then face!”

In his appeal to young-earth creationists, Collins deploys both scientific and theological arguments. Though the evidence for evolution comes from many directions, he naturally focuses on the recent, powerful evidence that comes from studying the genomes of different species, evidence that, he says, “could fill hundreds of books of this length.” One of the examples he gives is the existence of “pseudogenes.” These are genes that have suffered mutations that “turn their script into gibberish” and render them defunct. “The human gene known as caspase-12, for instance, has sustained several knockout blows, though it is found in the identical relative location in the [genome of the] chimp. The chimp caspase-12 works just fine, as does the similar gene in nearly all mammals.” If the body of man did not evolve, but was formed as the young-earth creationists believe, then “why would God have gone to the trouble of inserting such a non-functional gene in this precise location?”

In Collins’ view, the Intelligent Design movement, unlike young-earth creationism, “deserves serious consideration” scientifically. Nonetheless, he sees it as a misguided and doomed effort that is, ironically, “on a path toward doing considerable damage to faith.” It is driven by a fear that Darwinism is incompatible with biblical belief and is an attempt “to find a scientifically respectable alternative.”

Collins argues forcefully that Darwinian evolution is, in fact, perfectly compatible with biblical faith. He avoids the trap into which so many liberal theologians have fallen: thinking that the lesson of evolution is that everything evolves, including God. Collins sees clearly that the key to harmonizing Darwinian evolution with Jewish and Christian faith is through the traditional teaching, so profoundly elaborated by St. Augustine, that God is outside time:

“If God is outside of nature, then He is outside of space and time. In that context, God could in the moment of creation of the universe also know every detail of the future. That could include the formation of the stars, planets, and galaxies, all of the chemistry, physics, geology, and biology that led to the formation of life on earth, and the evolution of humans. . . . In that context, evolution could appear to us to be driven by chance, but from God’s perspective the outcome would be entirely specified. Thus, God could be completely and intimately involved in the creation of all species, while from our perspective, limited as it is by the tyranny of linear time, this would appear a random and undirected process.”

With the aid of St. Augustine and C.S. Lewis, Collins knocks down one theological objection to Darwinian evolution after another.

For reasons that are unclear, Collins chooses to end his book with a lengthy appendix on medical-ethics issues, in which he defends certain positions that are necessitated neither by science nor religion. Not only does this run counter to the aims of the rest of the book, but the level of argument by which he attempts to justify “somatic cell nuclear transfer,” a form of cloning, hardly does him credit.

Still, The Language of God is a book of enormous value. At a time when so many people on both sides are trying to foment a conflict between science and religion, Collins is a sorely needed voice of reason. His book may do more to promote better understanding between the worlds of faith and science than any other so far written. I suspect that Collins himself would regard that as an achievement no less important than the one for which he was honored six years ago in the East Room of the White House.

Nietzsches Nietzsches Everywhere — Patrick Connelly

March 17, 2014
Nihilism is the complete disregard for all things that cannot be scientifically proven or demonstrated. Nietzsche did not claim that nothing exists that cannot be proven, nor that those things should be disregarded. What Nietzsche did suggest was that many people used religion, especially Judeo-Christian teachings, as a crutch for avoiding decisive actions. Nietzsche’s contribution to existentialism was the idea that men must accept that they are part of a material world, regardless of what else might exist. As part of this world, men must live as if there is nothing else beyond life. A failure to live, to take risks, is a failure to realize human potential.

Patrick Connelly is associate professor of history and director of the Honors Program at Montreat College. This is a reblog from booksandculture.com. He reviews Jennifer Ratner-Rosenhagen’s American Nietzsche: The History of an Icon and His Ideas below.

***********************************

Zarathustra in America.
The tragic and ironic final chapter of Friedrich Nietzsche’s life began with a spectacular collapse into debilitating insanity on the streets of Turin, Italy. It ended with the incapacitated philosopher occupying the second floor of a Weimar villa that housed the archives from which his sister Elizabeth would assume controversial control over his legacy. Prior to his breakdown, Nietzsche balanced his expectation of being a seer and facilitator of a civilizational crisis with the conviction that he was criminally underappreciated in his lifetime.

The European “Nietzsche vogue” of his incomprehensible final years, however, gave credence to the notion that his time had indeed come. Among the witnesses of this phenomenon was Wilbur Urban, an American doctoral student at the University of Leipzig and son of an Episcopal priest who discovered The Genealogy of Morals in a local bookstore. Urban later described the resulting personal encounter with Nietzsche’s ideas, as he read through the night and undertook an intellectual and spiritual reevaluation of everything he held dear.

Urban’s experience of reading Nietzsche is recounted in Jennifer Ratner-Rosenhagen’s richly textured and absorbing American Nietzsche: The History of an Icon and His Ideas. The juxtaposition of Nietzsche near death in Weimar with a young American graduate student transfixed by his writings just miles away captures the importance of biography in Ratner-Rosenhagen’s account. Nietzsche’s “persona” became a focal point for readers and reviewers who “interpreted his philosophy through the lens of his biography.”

Yet American Nietzsche is also about the stories, emotions, and longings of Nietzsche’s readers and the “strong affective dimension” involved in how his ideas were received. The act of reading Nietzsche is narrated through published sources, personal recollections, marginalia, and fan letters written to Nietzsche and his sister. They give evidence of a thinker who struck a nerve with American readers due to his unconventional biography and singular vision of a modern world without foundations.

“Antifoundationalism” is a foundational idea in American Nietzsche, which explores how a motley crew of readers in the United States appropriated Nietzsche’s “denial of universal truth” in a distinctly American context. Academic philosophers, literary radicals, clergy, and political thinkers of various stripes are among the cast of characters concerned with the implications of Nietzsche’s ideas for “the moral and cultural grounds” of modern Americans. Ratner-Rosenhagen draws a portrait of Friedrich Nietzsche helping Americans understand themselves. This transatlantic intellectual and cultural exchange began, as Ratner-Rosenhagen tells the story, with an American thinker providing a transformative reading experience for Nietzsche himself.

Other commentators have noted Ralph Waldo Emerson’s influence on Nietzsche and discussed the affinities between the two thinkers, but no one has made such a forceful case that Nietzsche’s encounter with Emerson was so decisive and transformative. Ratner-Rosenhagen analyzes Nietzsche’s heavily annotated reading copy of Emerson’s Essays and notebook of Emerson quotations.

She credits Emerson with teaching Nietzsche about the “external forces that constrain individual autonomy.” Emerson is presented as providing Nietzsche with the example of the intellectual as provocateur, as one who doesn’t provide direct answers but provokes from a position “without inherited faith, without institutional affiliation, without rock or refuge for his truth claims.” Emerson’s influence, Ratner-Rosenhagen speculates, was particularly crucial in Nietzsche’s loss of faith, with his discovery of Emerson seemingly “the turning point” leading to his decision to abandon Christianity.

Nietzsche biographers may wonder whether Ratner-Rosenhagen overstates Emerson’s role in Nietzsche’s personal and professional development (a recent biography by Julian Young contains only two references to Emerson in 562 pages of text), but American Nietzsche persuasively portrays Emerson as the “exemplar of the aboriginal intellect” abroad who helped Nietzsche to feel at home. It was a favor that Nietzsche would return to American readers in the decades to come.

Ratner-Rosenhagen does not exhaustively record every reference to Nietzsche in American print, though she examines in great detail how Americans experienced Nietzsche’s ideas. Her thematic and somewhat chronological survey begins with “the making of the American Nietzsche” by literary radicals and cultural critics. Literary radicals fretted over the state of American culture while hoping for a “cosmopolitanism” that would look to the example of Europe, which they believed had already been transformed by Nietzsche’s “challenge to all external authority.”

H. L. Mencken was among an eclectic group of cultural critics who focused on “the persona of Nietzsche.” Mencken’s influential monograph on Nietzsche refashioned the philosopher in Mencken’s image while suggesting that Americans desperately needed Nietzsche’s “fearless independence and fierce intelligence.” American Nietzsche later returns to the allure that Nietzsche contained for literary radicals and critics. Once again, Nietzsche’s biography educates these enthusiasts, who gravitated toward his paradigm of “the unaffiliated intellectual” changing the world through “literary expression and the social efficacy of ideas.” Writers and activists such as Emma Goldman, Kahlil Gibran, Randolph Bourne, and Walter Lippmann drew deeply from Nietzsche’s model of “the antifoundational intellect” and expressed hope that he could help them renew an impoverished American culture.

Mencken and other critics believed that religion was significantly to blame for that cultural poverty and were gripped by Nietzsche’s extraordinary attack on Christianity. The repercussions of Nietzsche’s critique were taken up by American clergy and theologians, who used the occasion to take inventory of Christianity’s future prospects and “to reassert their flagging moral authority in modernizing America.” Ratner-Rosenhagen’s account of Nietzsche’s religious readers is heavily weighted toward liberal Protestants and Social Gospelers, though she does consider Catholic apologists who viewed Nietzsche as the natural consequence of Protestantism.

Liberal Protestants and Social Gospelers understood Nietzsche as a “fellow seeker” and a “challenging doubter” who remained a vital instrument “for refitting their faith to the modern world.” Protestants less interested in this refitting, with a few exceptions, remain largely on the sidelines in American Nietzsche. Conservative Protestants are mentioned, but their collective perception of Nietzsche as an insidious force in the culture-shaping institutions of Germany and the United States remains underdeveloped.

Fundamentalists, who frequently lumped together Nietzsche and Darwin, are virtually absent from Ratner-Rosenhagen’s story. The addition of these neglected constituencies would strengthen the case that Protestants of all theological persuasions worried about the prospect of a civilization adrift from Christian foundations — even if they defined the problem and solution differently.

“A world after God” meant the arrival of the Übermensch for Nietzsche and his enthusiasts. The most ambitious section of American Nietzsche unites seemingly disparate individuals, discourses, and events around their fascination with one of Nietzsche’s signature ideas. Harvard philosophers, political radicals, and conservative New Humanists are portrayed as wrestling with the possibilities and limitations of the superman.

Nietzsche’s Übermensch later appeared in wartime debates less as a “constructive ideal” in antifoundational discourse than “a symbol for the German imperial temper.” The Übermensch seeped into the popular imagination through events such as the Leopold-Loeb trial, where the defendants murdered a boy due to their belief that “they were Nietzschean supermen.” Ratner-Rosenhagen makes a good case for the importance of the Übermensch for American readers, though as an interpretive construct it occasionally feels stretched.

Creating a coherent narrative is an arduous task for any reception study, of course, let alone one regarding a remarkably pliant thinker like Nietzsche. Ratner-Rosenhagen’s post-World War II examinations of the American reception illustrate that elasticity by focusing on the creation of numerous Nietzsches.

Walter Kaufmann’s Nietzsche was liberated from the taint of National Socialism, established as a serious philosopher who negotiated the analytic/existentialist divide in professional philosophy, and credited with transforming “American cold war culture” and fueling the discontent of the Sixties. Harold Bloom’s Nietzsche helped the critic move beyond the postmodern literary theories of Europe while enabling America to embrace its “alienated majesty” in a world without authorities beyond the self.

Richard Rorty’s Nietzsche inspired a “pragmatic antifoundationalism” that explored the tensions between self-creation and social solidarity in a world shorn of transcendent grounding. Stanley Cavell’s Nietzsche served as “a midwife of Emersonian philosophy” who helped Americans to rediscover their native antifoundational thinking. Allan Bloom’s Nietzsche was misappropriated to sustain “an unwholesome, lighthearted and softheaded ‘nihilism with a happy ending.’”

Given these and other appropriations, can the real Nietzsche be discerned in an America awash in Nietzsches? It certainly goes against the grain of many of the thinkers discussed in American Nietzsche to suggest that a single understanding of Nietzsche is necessary or even possible. Epistemological and critical humility are needed — we do, after all, see through a glass darkly — but it is difficult to criticize misappropriations or misunderstandings of Nietzsche without having some sense of what he meant. This is especially challenging for a reception study.

Ratner-Rosenhagen contends that her book “is not even a book about Nietzsche” but rather “about his crucial role in the ever-dynamic remaking of modern American thought.” American Nietzsche is certainly about the latter — and engagingly so — but it is about Nietzsche as well. Ratner-Rosenhagen resists a full-fledged exposition of Nietzsche’s ideas, though she does selectively elaborate, taking several opportunities to correct perceived misreadings and resisting fashionable assumptions about the “death of the author.”

The emphasis on the act of reading Nietzsche also leads to questions about how to understand the cultural and intellectual setting those acts transformed. What impact did Nietzsche have on American culture as a whole, as opposed to select individuals? One instance where this issue becomes problematic is Ratner-Rosenhagen’s discussion of celebrity.

She discusses, in fascinating detail, letters from the Nietzsche Archive that were written to Elizabeth Förster-Nietzsche by her brother’s American fans. Ratner-Rosenhagen suggests that the letters reveal “Nietzsche’s emergence as a celebrity in American culture.” Historians of celebrity, she argues, have focused inordinate attention on musicians and actors and their respective industries while neglecting the emergence of “the prophetic thinker” as celebrity. But it is difficult to see how a relatively small sample of letters can be used as evidence of celebrity, which by its very nature is about mass appeal and consumption.

How then does one gauge Nietzsche’s broader impact on American culture? Ratner-Rosenhagen provides much food for thought on this question throughout American Nietzsche, particularly when she discerns a larger popular effect as in the case of Walter Kaufmann’s monograph and translations. I wonder whether a more specific distillation of the notion of cultural authority would be instructive as well.

“Cultural authority” is an amorphous term that sociologists and historians have used more than defined. It involves the authority of individuals, ideas, and institutions to promote certain understandings of meaning and values in the culture at large and to shape core assumptions about God, human personhood, social and political order, science, economics, law, and other spheres of public and private life.

Protestant Christianity had long informed the American cultural milieu but faced substantial challenges to its authority by the time Nietzsche’s ideas first registered in the United States. Sociologist Christian Smith writes that a “secular revolution” was afoot, involving “secularizing activists” seeking “to overthrow a religious establishment’s control over socially legitimate knowledge.” Many of the same American academics, critics, activists, and clergy who appear in American Nietzsche were participants in the seismic shifts of authority in culture-shaping institutions.

Nietzsche’s early American admirers may have questioned whether the European “Nietzsche vogue” would take root in the United States, but they recognized his awareness of and contribution to the larger story of secularism. William Mackintire Salter wrote in 1917 that “a subtle, slow secular revolution in the mental and moral realm was what Nietzsche had in mind.”

Nietzsche himself realized that uprooting Christianity’s cultural authority was a long historical process involving more than simply rejecting traditional beliefs. It would be overreaching, of course, to suggest that Nietzsche’s ideas singlehandedly accomplished this revolutionary aim in the United States, but many of the subjects of Ratner-Rosenhagen’s book were willing to utilize his ideas to accelerate the process.

The result of this secular revolution, along with the rise of competing authorities and understandings of the world, meant further openings were created for Nietzsche’s antifoundationalism to gain a hearing in the decades to come. Jennifer Ratner-Rosenhagen’s wonderfully written and stimulating American Nietzsche compels us to reckon not only with what he said, but with what we have become.

h1

Max Tegmark’s Our Mathematical Universe — Peter Woit

January 21, 2014
The Multiverse theory is a recently proposed idea that describes the continuous formation of universes through the collapse of giant stars and the formation of black holes. With each of these black holes there is a new point of singularity and a new possible universe. As Rees describes it, “Our universe may be just one element – one atom, as it were – in an infinite ensemble: a cosmic archipelago. Each universe starts with its own big bang, acquires a distinctive imprint (and its individual physical laws) as it cools, and traces out its own cosmic cycle. The big bang that triggered our entire universe is, in this grander perspective, an infinitesimal part of an elaborate structure that extends far beyond the range of any telescopes.” (Rees) This puts our place in the Multiverse into a small spectrum. While the size of the earth in relation to the sun is minuscule, the size of the sun, the solar system, the galaxy, and even the universe could pale in comparison to this proposed Multiverse. It would be a shift in thinking that may help explain the big bang and possibly shed light on the idea of parallel universes.

Mr. Woit is the author of “Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law.” His review of Our Mathematical Universe is a report from the front lines of mathematics and physics.

*****************************************

It’s a truly remarkable fact that our deepest understanding of the material world is embodied in mathematics, often in concepts that originated with very different motivations. A good example is our best description of how gravity works, Einstein’s 1915 theory of general relativity, in which the gravitational force comes from the curvature of space and time.

The formulation of this theory required Einstein to use mathematics developed 60 years earlier by the great German mathematician Bernhard Riemann, who was studying abstract questions involving geometry. There’s now a long history of intertwined and experimentally tested discoveries about physics and mathematics. This unity between mathematics and physics is a source of wonder for those who study the two subjects, as well as an eternal conundrum for philosophers.

Max Tegmark thus begins his new book with a deep truth when he articulates a “Mathematical Universe Hypothesis,” which states simply that “physical reality is a mathematical structure.” His central claim ends up being that such a hypothesis implies a surprising new vision of how to do physics, but the slipperiness of that word “is” should make the reader wary. Mr. Tegmark raises here the age-old question of whether math just describes physical reality or whether it defines physical reality. This distinction is of relevance to philosophers, but its significance for practicing physicists is unclear.

“Our Mathematical Universe” opens with a memoir of Mr. Tegmark’s own career in physics. He’s now a cosmologist at MIT whose specialty is interpreting data about the structure and evolution of the universe, much of it gathered by new space- and ground-based instruments.

His book, however, quickly turns to the topic of the “multiverse” — the idea that our universe is part of some larger unobservable structure. Multiverse theories come in a baffling number of different versions. They have been a hot topic for the past dozen years, with Brian Greene’s “The Hidden Reality” (2011) a good example of a recent book covering this material.

Mr. Tegmark categorizes different multiverse proposals in terms of “Levels,” a useful method designed to keep track of the various theories. Many of these include some version of the idea that our universe is one of many unconnected universes obeying the same physical laws. This “Level I” type of multiverse is like Jorge Luis Borges’s “Library of Babel,” which contains all possible books, though most remain inaccessible to his story’s narrator due to their remoteness. As far back as 1584, Giordano Bruno proposed a universe of this sort, provoking mind-bending paradoxes involving infinite copies of oneself acting out completely different lives.

A much different type of multiverse arises in what is sometimes called the “many-worlds interpretation” of quantum theory. This is one way of thinking about the relationship between quantum mechanics and conventional human-scale physics. The idea is that while any quantum system is described by a single mathematical object called a quantum wave-function, this can contain within itself a description of an infinity of different possible worlds.

These correspond to the different possible states we may observe when we probe a quantum system with a macroscopic experimental apparatus. This multiverse is more like the “Garden of Forking Paths” that Borges describes in his story of that title, with each world branching off when we make an observation. Philosophical debate rages over what to think of such possible worlds: Are the ones we don’t end up in “real” or just a convenient calculational fiction? Mr. Tegmark calls the multiverse of such worlds a “Level III” multiverse.

These Level I and III possibilities fit reasonably well within variants of conventional views about our current best understanding of physics. The controversy surrounds what Mr. Tegmark calls “Level II” multiverses. At this level, different parts of a multiverse can have different physics — for instance, different fundamental forces, as well as different fundamental particles with different masses.

The problem: There is no experimental evidence for this and, arguably, no way of ever getting any, since our universe likely interacts in no way with any universes whose physics differs from our own. When someone is trying to sell a Level II multiverse theory, pay close attention to what exactly is being marketed; it comes without the warranty of an experimental test.

Since 1984 many physicists have worked on “string theory,” which posits a new unification of general relativity and quantum theory, achieved in part by abandoning the idea of fundamental particles. Early on, the new fundamental objects were supposed to be relatively well-defined one-dimensional vibrating string-like objects. Over the years this theory has evolved into something often called “M-theory,” which includes a wealth of poorly understood and mathematically complex components.

As far as one can now tell, if M-theory is to make sense, it will have so many possible solutions that one could produce just about any prediction about our observable universe that one might want. Such an unfalsifiable theory normally would be dismissed as unscientific, but proponents hope to salvage the situation by invoking a Level II multiverse containing all solutions to the theory. Our observed laws of physics would just represent a particular solution.

Mr. Tegmark wants to go even further down this rabbit hole. He assumes that what we observe is governed by something like M-theory, with its multiverse of different physical laws. But he wants to find a wider view that explains M-theory in terms of his “math is physics” hypothesis. He argues that his hypothesis implies the conclusion that “all mathematical structures exist.” The idea is that every example mathematicians teach in their classes, whether it’s a polynomial equation, a circle, a cube, or something much more complicated, represents an equally good universe. The collection of all mathematical structures he calls the “Level IV” multiverse, the highest and most general level.

Interpreting the meaning of “exists” in this way — to include all possible worlds — is a philosophical position known as “modal realism.” The innovation here is the claim that this carries a new insight into physics. The problem with such a conception of the ultimate nature of reality is not that it’s wrong but that it’s empty, far more radically untestable than even the already problematic proposals of M-theory. Mr. Tegmark proposes abandoning the historically proven path of pursuing a single exceptionally deep and very special mathematical structure at the core of both math and physics in favor of the hypothesis that, at the deepest level, “anything goes.”

Mr. Tegmark’s proposal takes him deep into the realm of speculation, and few of his fellow scientists are likely to want to follow him. There’s a danger, though, that his argument will convince some that “anything goes” is all there is to ultimate reality, discouraging their search for a better and more elegant version of our current best theories.

To be fair, Mr. Tegmark acknowledges he is going beyond conventional science, even including pithy advice about how to pursue a successful career while indulging in speculative topics that one’s colleagues are likely to see as beyond the bounds of what can be taken seriously. It’s worth remarking that not taking itself too seriously is one of the book’s virtues.

A final chapter argues for the importance of the “scientific lifestyle,” meaning scientific rationality as a basis for our decisions about important questions affecting the future of our species. But the great power of the scientific worldview has always come from its insistence that one should accept ideas based on experimental evidence, not on metaphysical reasoning or the truth-claims of authority figures. “Our Mathematical Universe” is a fascinating and well-executed dramatic argument from a talented expositor, but reading it with the skeptical mind-set of a scientist is advised.

h1

On God and god 2 – David Bentley Hart

October 3, 2013
By giving the name “God” to whatever as yet unknown agent or property or quality might account for this or that particular appearance of design, they have produced a picture of God that it is conceivable the sciences could someday genuinely make obsolete, because it really is a kind of rival explanation to the explanations the sciences seek. This has never been true of the God described in the great traditional metaphysical systems. The true philosophical question of God has always been posed at a far simpler but far more primordial and comprehensive level; it concerns existence as such: the logical possibility of the universe, not its mere physical probability. God, properly conceived, is not a force or cause within nature, and certainly not a kind of supreme natural explanation.

David Bentley Hart is an Eastern Orthodox scholar of religion, philosopher, writer, and cultural commentator. He has taught at the University of Virginia, Duke Divinity School, and Providence College (RI).  A selection from his book The Experience of God from Yale University Press.

***************************************************

At a trivial level, one sees the confusion in some of the more shopworn witticisms of popular atheism: “I believe neither in God nor in the fairies at the bottom of my garden,” for instance, or “All people are atheists in regard to Zeus, Wotan, and most other gods; I simply disbelieve in one god more.” Once, in an age long since vanished in the mists of legend, those might even have been amusing remarks, eliciting sincere rather than merely liturgical laughter; but, even so, all they have ever demonstrated is a deplorable ignorance of elementary conceptual categories.

If one truly imagines these are all comparable kinds of intellectual conviction then one is clearly confused about what is at issue. Beliefs regarding fairies are beliefs about a certain kind of object that may or may not exist within the world, and such beliefs have much the same sort of intentional shape and rational content as beliefs regarding one’s neighbors over the hill or whether there are such things as black swans. Beliefs regarding God concern the source and ground and end of all reality, the unity and existence of every particular thing and of the totality of all things, the ground of the possibility of anything at all.

Fairies and gods, if they exist, occupy something of the same conceptual space as organic cells, photons, and the force of gravity, and so the sciences might perhaps have something to say about them, if a proper medium for investigating them could be found. We can, if nothing else, disabuse ourselves of belief in certain gods by simple empirical methods; we know now, for example, that the sun is not a god named Tonatiuh, at least not one who must be nourished daily on human blood lest he cease to shine, because we have withheld his meals for centuries now without calamity.

God, by contrast, is the infinite actuality that makes it possible for either photons or (possibly) fairies to exist, and so can be “investigated” only, on the one hand, by acts of logical deduction and induction and conjecture or, on the other, by contemplative or sacramental or spiritual experiences. Belief or disbelief in fairies or gods could never be validated by philosophical arguments made from first principles; the existence or nonexistence of Zeus is not a matter that can be intelligibly discussed in the categories of modal logic or metaphysics, any more than the existence of tree frogs could be; if he is there at all, one must go on an expedition to find him, or at least find out his address.

The question of God, by contrast, is one that can and must be pursued in terms of the absolute and the contingent, the necessary and the fortuitous, potency and act, possibility and impossibility, being and nonbeing, transcendence and immanence. Evidence for or against the existence of Thor or King Oberon would consist only in local facts, not universal truths of reason; it would be entirely empirical, episodic, psychological, personal, and hence elusive. Evidence for or against the reality of God, if it is there, saturates every moment of the experience of existence, every employment of reason, every act of consciousness, every encounter with the world around us.

Now, manifestly, one should not judge an intellectual movement by its jokes (even if one suspects that there is little more to it than its jokes). But exactly the same confusion shows itself in the arguments that many contemporary atheists make in earnest: For instance, “If God made the world, then who made God?” Or the famous dilemma drawn, in badly garbled form, from Plato’s Euthyphro, “Does God command a thing because it is good, or is it good because God commands it?”

I address both questions below (in my third and fifth chapters, respectively), so I shall not do so here. I shall, however, note that not only do these questions not pose deep quandaries for believers or insuperable difficulties for a coherent concept of God; they are not even relevant to the issue. And, until one really understands why this is so, one has not yet begun to talk about God at all. One is talking merely about some very distinguished and influential gentleman or lady named “God,” or about some discrete object that can be situated within a class of objects called “gods” (even if it should turn out that there happens to be only one occupant of that class).

As it happens, the god with whom most modern popular atheism usually concerns itself is one we might call a “demiurge” (demiourgos): a Greek term that originally meant a kind of public technician or artisan but that came to mean a particular kind of divine “world-maker” or cosmic craftsman. In Plato’s Timaeus, the demiurge is a benevolent intermediary between the realm of eternal forms and the realm of mutability; he looks to the ideal universe (the eternal paradigm of the cosmos) and then fashions lower reality in as close a conformity to the higher as the intractable resources of material nature allow.

He is, therefore, not the source of the existence of all things but rather only the Intelligent Designer and causal agent of the world of space and time, working upon materials that lie outside and below him, under the guidance of divine principles that lie outside and above him. He is an immensely wise and powerful being, but he is also finite and dependent upon a larger reality of which he is only a part. Later Platonism interpreted the demiurge in a variety of ways, and in various schools of Gnosticism in late antiquity he reappeared as an incompetent or malevolent cosmic despot, either ignorant or jealous of the true God beyond this cosmos; but none of that is important here.

Suffice it to say that the demiurge is a maker, but not a creator in the theological sense: he is an imposer of order, but not the infinite ocean of being that gives existence to all reality ex nihilo. And he is a god who made the universe “back then,” at some specific point in time, as a discrete event within the course of cosmic events, rather than the God whose creative act is an eternal gift of being to the whole of space and time, sustaining all things in existence in every moment. It is certainly the demiurge about whom Stenger and Dawkins write; neither has actually ever written a word about God.

And the same is true of all the other new atheists as far as I can tell. To be fair to all sides, however, I should also point out that the demiurge has had some fairly vociferous champions over the past few centuries, and at the moment seems to be enjoying a small resurgence in popularity. His first great modern revival came in the Deism of the seventeenth and eighteenth centuries, a movement whose adherents were impatient with the metaphysical “obscurities” and doctrinal “absurdities” of traditional religion, and who preferred to think of God as some very powerful spiritual individual who designed and fabricated the universe at the beginning of things, much as a watchmaker might design and fabricate a watch and then set it running.

In David Hume’s Dialogues Concerning Natural Religion this is the view of God advanced by Cleanthes and then elegantly dismantled by Philo (the traditional metaphysical and theological view of God is represented by Demea, though not very well, and against him Philo marshals an altogether different, and much weaker, set of arguments). And, while Deism had more or less died out before Darwin’s day, the philosophical picture it supported has never entirely lost its charm for some.

The recent Intelligent Design movement represents the demiurge’s boldest adventure in some considerable time. I know that it is fashionable to heap abuse upon this movement, and that is not my intention here. After all, if one looks at the extraordinary complexity of nature and then interprets it as a sign of superhuman intelligence, one is doing something perfectly defensible; even some atheists have done as much (the brilliant and eccentric Fred Hoyle being a notable example).

Moreover, if one already believes in God, it makes perfect sense to see, say, the ever more extraordinary discoveries of molecular biology, or the problem of protein folding, or the incredible statistical improbabilities of a whole host of cosmological conditions (and so on) as bearing witness to something miraculous and profoundly rational in the order of nature, and to ascribe these wonders to God.

But, however compelling the evidence may seem, one really ought not to reverse the order of discovery here and attempt to deduce or define God from the supposed evidence of design in nature. As either a scientific or a philosophical project, Intelligent Design theory is a deeply problematic undertaking; and, from a theological or metaphysical perspective, it is a massive distraction.

To begin with, much of the early literature of this movement concerned instances of supposedly “irreducible complexity” in the biological world, and from these developed an argument for some sort of intelligent agency at work in the process of evolution. That would, of course, be a fascinating discovery if it could be shown to be true; but I do not see how in principle one ever could conclusively demonstrate such a thing. It could never be more than an argument from probability, because one cannot prove that any organism, however intricate, could not have been produced by some unguided phylogenic history.

Probability is a powerful thing, of course, but notoriously difficult to measure in the realm of biology’s complex systems of interdependence, or over intervals of time as vast as distinct geological epochs. And it would be quite embarrassing to propose this or that organism or part of an organism as a specimen of an irreducibly complex biological mechanism, only for it to emerge later that many of its components had been found in a more primitive form in some other biological mechanism, serving another purpose.

Even if all this were not so, however, seen in the light of traditional theology the argument from irreducible complexity looks irredeemably defective, because it depends on the existence of causal discontinuities in the order of nature, “gaps” where natural causality proves inadequate. But all the classical theological arguments regarding the order of the world assume just the opposite: that God’s creative power can be seen in the rational coherence of nature as a perfect whole; that the universe was not simply the factitious product of a supreme intellect but the unfolding of an omnipresent divine wisdom or logos.

For Thomas Aquinas, for instance, God creates the order of nature by infusing the things of the universe with the wonderful power of moving of themselves toward determinate ends; he uses the analogy of a shipwright able to endow timbers with the power to develop into a ship without external intervention. According to the classical arguments, universal rational order — not just this or that particular instance of complexity — is what speaks of the divine mind: a cosmic harmony as resplendently evident in the simplicity of a raindrop as in the molecular labyrinths of a living cell.

After all, there may be innumerable finite causes of complexity, but a good argument can be made that only a single infinite cause can account for perfect, universal, intelligible, mathematically describable order. If, however, one could really show that there were interruptions in that order, places where the adventitious intrusions of an organizing hand were needed to correct this or that part of the process, that might well suggest some deficiency in the fabric of creation.

It might suggest that the universe was the work of a very powerful, but also somewhat limited, designer. It certainly would not show that the universe is the creature of an omnipotent wisdom, or an immediate manifestation of the God who is the being of all things. Frankly, the total absence of a single instance of irreducible complexity would be a far more forceful argument in favor of God’s rational action in creation.

As for theistic claims drawn from the astonishing array of improbable cosmological conditions that hold our universe together, including the cosmological constant itself, or from the mathematical razor’s edge upon which all of it is so exquisitely balanced, these rest upon a number of deeply evocative arguments, and those who dismiss them casually are probably guilty of a certain intellectual dishonesty. Certainly all of the cosmos’s exquisitely fine calibrations and consonances and exactitudes should speak powerfully to anyone who believes in a transcendent creator, and they might even have the power to make a reflective unbeliever curious about supernatural explanations.

But, in the end, such arguments also remain only probabilistic, and anyone predisposed to explain them away will find plentiful ways of doing so: perhaps the extravagant hypothesis that there are vastly many universes generated by quantum fluctuations, of the sort Stephen Hawking has recently said does away with any role for God in the origin of the universe, or perhaps the even more extravagant hypothesis that every possible universe must be actual (the former hypothesis reduces the odds considerably, and the latter does away with odds altogether). But in a sense none of this really matters, because ultimately none of these arguments has much to do with God in the first place.

This is obvious if one considers the terms in which they are couched. Hawking’s dismissal of God as an otiose explanatory hypothesis, for instance, is a splendid example of a false conclusion drawn from a confused question. He clearly thinks that talk of God’s creation of the universe concerns some event that occurred at some particular point in the past, prosecuted by some being who appears to occupy the shadowy juncture between a larger quantum landscape and the specific conditions of our current cosmic order; by “God,” that is to say, he means only a demiurge, coming after the law of gravity but before the present universe, whose job was to nail together all the boards and firmly mortar all the bricks of our current cosmic edifice.

So Hawking naturally concludes that such a being would be unnecessary if there were some prior set of laws — just out there, so to speak, happily floating along on the wave-functions of the quantum vacuum — that would permit the spontaneous generation of any and all universes. It never crosses his mind that the question of creation might concern the very possibility of existence as such, not only of this universe but of all the laws and physical conditions that produced it, or that the concept of God might concern a reality not temporally prior to this or that world, but logically and necessarily prior to all worlds, all physical laws, all quantum events, and even all possibilities of laws and events.

From the perspective of classical metaphysics, Hawking misses the whole point of talk of creation: God would be just as necessary even if all that existed were a collection of physical laws and quantum states, from which no ordered universe had ever arisen; for neither those laws nor those states could exist of themselves. But — and here is the crucial issue — those who argue for the existence of God principally from some feature or other of apparent cosmic design are guilty of the same conceptual confusion; they make a claim like Hawking’s seem solvent, or at least relevant, because they themselves have not advanced beyond the demiurgic picture of God.

By giving the name “God” to whatever as yet unknown agent or property or quality might account for this or that particular appearance of design, they have produced a picture of God that it is conceivable the sciences could someday genuinely make obsolete, because it really is a kind of rival explanation to the explanations the sciences seek. This has never been true of the God described in the great traditional metaphysical systems.

The true philosophical question of God has always been posed at a far simpler but far more primordial and comprehensive level; it concerns existence as such: the logical possibility of the universe, not its mere physical probability. God, properly conceived, is not a force or cause within nature, and certainly not a kind of supreme natural explanation.

Anyway, at this point I shall largely leave the new atheists, fundamentalists of every adherence, and Intelligent Design theorists all to their own devices, and perhaps to one another, and wish them all well, and hope that they do not waste too much time chasing after one another in circles. If I mention them below, it will be only to make a point in passing. From here onward, it is God — not gods, not the demiurge — of whom I wish to speak.

h1

On God and god 1 – David Bentley Hart

October 2, 2013
Yet the most pervasive error one encounters in contemporary arguments about belief in God, especially but not exclusively on the atheist side, is the habit of conceiving of God simply as some very large object or agency within the universe, or perhaps alongside the universe, a being among other beings, who differs from all other beings in magnitude, power, and duration, but not ontologically, and who is related to the world more or less as a craftsman is related to an artifact.

David Bentley Hart is an Eastern Orthodox scholar of religion, philosopher, writer, and cultural commentator. He has taught at the University of Virginia, Duke Divinity School, and Providence College (RI). A selection from his book The Experience of God from Yale University Press.

****************************************

There are two senses in which the word “God” or “god” can properly be used. Most modern languages generally distinguish between the two usages as I have done here, by writing only one of them with an uppercase first letter, as though it were a proper name, which it is not.

Most of us understand that “God” (or its equivalent) means the one God who is the source of all things, whereas “god” (or its equivalent) indicates one or another of a plurality of divine beings who inhabit the cosmos and reign over its various regions. This is not, however, merely a distinction in numbering, between monotheism and polytheism, as though the issue were merely that of determining how many “divine entities” one happens to think there are.

It is a distinction, instead, between two entirely different kinds of reality, belonging to two entirely disparate conceptual orders. In fact, the very division between monotheism and polytheism is in many cases a confusion of categories.

Several of the religious cultures that we sometimes inaccurately characterize as “polytheistic” have traditionally insisted upon an absolute differentiation between the one transcendent Godhead from whom all being flows and the various “divine” beings who indwell and govern the heavens and the earth. Only the one God, says Swami Prabhavananda, speaking more or less for the whole of developed Vedantic and Bhaktic Hinduism, is “the uncreated”: “gods, though supernatural, belong … among the creatures. Like the Christian angels, they are much nearer to man than to God.”

Conversely, many creeds we correctly speak of as “monotheistic” embrace the very same distinction. The Adi Granth of the Sikhs, for instance, describes the One God as the creator of Brahma, Vishnu, and Shiva. In truth, Prabhavananda’s comparison of the gods of India to Christianity’s angels is more apt than many modern Christians may realize.

Late Hellenistic pagan thought often tended to draw a clear demarcation between the one transcendent God (or, in Greek, ho theos, God with the definite article) and any particular or local god (any mere “inarticular” theos) who might superintend this or that people or nation or aspect of the natural world; at the same time, late Hellenistic Jews and Christians recognized a multitude of angelic “powers” and “principalities,” some obedient to the one transcendent God and some in rebellion, who governed the elements of nature and the peoples of the earth. To any impartial observer at the time, coming from some altogether different culture, the theological cosmos of a great deal of pagan “polytheism” would have seemed all but indistinguishable from that of a great deal of Jewish or Christian “monotheism.”

To speak of “God” properly, then (to use the word in a sense consonant with the teachings of orthodox Judaism, Christianity, Islam, Sikhism, Hinduism, Baha’i, a great deal of antique paganism, and so forth), is to speak of the one infinite source of all that is: eternal, omniscient, omnipotent, omnipresent, uncreated, uncaused, perfectly transcendent of all things and for that very reason absolutely immanent to all things.

God so understood is not something posed over against the universe, in addition to it, nor is he the universe itself. He is not a “being,” at least not in the way that a tree, a shoemaker, or a god is a being; he is not one more object in the inventory of things that are, or any sort of discrete object at all.

Rather, all things that exist receive their being continuously from him, who is the infinite wellspring of all that is, in whom (to use the language of the Christian scriptures) all things live and move and have their being. In one sense he is “beyond being,” if by “being” one means the totality of discrete, finite things. In another sense he is “being itself,” in that he is the inexhaustible source of all reality, the absolute upon which the contingent is always utterly dependent, the unity and simplicity that underlies and sustains the diversity of finite and composite things.

Infinite being, infinite consciousness, infinite bliss, from whom we are, by whom we know and are known, and in whom we find our only true consummation. All the great theistic traditions agree that God, understood in this proper sense, is essentially beyond finite comprehension; hence, much of the language used of him is negative in form and has been reached only by a logical process of abstraction from those qualities of finite reality that make it insufficient to account for its own existence. All agree as well, however, that he can genuinely be known: that is, reasoned toward, intimately encountered, directly experienced with a fullness surpassing mere conceptual comprehension.

By contrast, when we speak of “gods” we are talking not of transcendent reality at all, but only of a higher or more powerful or more splendid dimension of immanent reality. Any gods who might be out there do not transcend nature but belong to it. Their theogonies can be recounted: how some rose out of the primal night, how some were born of other, more titanic progenitors, how others sprang up from an intermingling of divine and elemental forces, and so on; and according to many mythologies most of them will finally meet their ends.

They exist in space and time, each of them is a distinct being rather than “being itself,” and it is they who are dependent upon the universe for their existence rather than the reverse. Of such gods there may be an endless diversity, while of God there can be only one. Or, better, God is not merely one, in the way that a finite object might be merely singular or unique, but is oneness as such, the one act of being and unity by which any finite thing exists and by which all things exist together. He is one in the sense that being itself is one, the infinite is one, the source of everything is one. Thus a plurality of gods could not constitute an alternative to or contradiction of the unity of God; they still would not belong to the same ontological frame of reference as he.

Obviously, then, it is God in the former (the transcendent) sense in whom it is ultimately meaningful to believe or not to believe. The possibility of gods or spirits or angels or demons, and so on, is a subordinate matter, a question not of metaphysics but only of the taxonomy of nature (terrestrial, celestial, and chthonic).

To be an atheist in the best modern sense, however, and so to be a truly intellectually and emotionally fulfilled naturalist in philosophy, one must genuinely succeed in not believing in God, with all the logical consequences such disbelief entails. It is not enough simply to remain indifferent to the whole question of God, moreover, because thus understood it is a question ineradicably present in the very mystery of existence, or of knowledge, or of truth, goodness, and beauty.

It is also the question that philosophical naturalism is supposed to have answered exhaustively in the negative, without any troubling explanatory lacunae, and therefore the question that any aspiring philosophical naturalist must understand before he or she can be an atheist in any intellectually significant way. And the best way to begin is to get a secure grasp on how radically, both conceptually and logically, belief in God differs from belief in the gods. This ought not to be all that difficult a matter; in Western philosophical tradition, for instance, it is a distinction that goes back at least as far as Xenophanes (c. 570–c. 475 BC).

Yet the most pervasive error one encounters in contemporary arguments about belief in God — especially, but not exclusively, on the atheist side — is the habit of conceiving of God simply as some very large object or agency within the universe, or perhaps alongside the universe, a being among other beings, who differs from all other beings in magnitude, power, and duration, but not ontologically, and who is related to the world more or less as a craftsman is related to an artifact.

h1

Does Quantum Physics Make it Easier to Believe in God? — Stephen M. Barr

August 23, 2013
If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

A reblog from the site Big Questions Online

**************************************************

Not in any direct way. That is, it doesn’t provide an argument for the existence of God. But it does lend support indirectly, by providing an argument against the philosophy called materialism (or “physicalism”), which is the main intellectual opponent of belief in God in today’s world.

Materialism is an atheistic philosophy that says that all of reality is reducible to matter and its interactions. It has gained ground because many people think that it’s supported by science. They think that physics has shown the material world to be a closed system of cause and effect, sealed off from the influence of any non-physical realities — if any there be.

Since our minds and thoughts obviously do affect the physical world, it would follow that they are themselves merely physical phenomena. No room for a spiritual soul or free will: for materialists we are just “machines made of meat.”

Quantum mechanics, however, throws a monkey wrench into this simple mechanical view of things.  No less a figure than Eugene Wigner, a Nobel Prize winner in physics, claimed that materialism — at least with regard to the human mind — is not “logically consistent with present quantum mechanics.” And on the basis of quantum mechanics, Sir Rudolf Peierls, another great 20th-century physicist, said, “the premise that you can describe in terms of physics the whole function of a human being … including [his] knowledge, and [his] consciousness, is untenable. There is still something missing.”

How, one might ask, can quantum mechanics have anything to say about the human mind?  Isn’t it about things that can be physically measured, such as particles and forces?  It is; but while minds cannot be measured, it is ultimately minds that do the measuring. And that, as we shall see, is a fact that cannot be ignored in trying to make sense of quantum mechanics.

If one claims that it is possible (in principle) to give a complete physical description of what goes on during a measurement — including the mind of the person who is doing the measuring — one is led into severe difficulties. This was pointed out in the 1930s by the great mathematician John von Neumann.  Though I cannot go into technicalities in an essay such as this, I will try to sketch the argument.

It all begins with the fact that quantum mechanics is inherently probabilistic. Of course, even in “classical physics” (i.e. the physics that preceded quantum mechanics and that still is adequate for many purposes) one sometimes uses probabilities; but one wouldn’t have to if one had enough information.  Quantum mechanics is radically different: it says that even if one had complete information about the state of a physical system, the laws of physics would typically only predict probabilities of future outcomes. These probabilities are encoded in something called the “wavefunction” of the system.

A familiar example of this is the idea of “half-life.”  Radioactive nuclei are liable to “decay” into smaller nuclei and other particles.  If a certain type of nucleus has a half-life of, say, an hour, it means that a nucleus of that type has a 50% chance of decaying within 1 hour, a 75% chance within two hours, and so on. The quantum mechanical equations do not (and cannot) tell you when a particular nucleus will decay, only the probability of it doing so as a function of time. This is not something peculiar to nuclei. The principles of quantum mechanics apply to all physical systems, and those principles are inherently and inescapably probabilistic.
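As a minimal sketch of that arithmetic (the one-hour half-life is hypothetical, chosen only for illustration), the survival probability after a time t is one half raised to the power t/T, where T is the half-life:

    # Survival probability of a nucleus with half-life T: P(t) = (1/2)**(t/T).
    # Illustrative numbers only; no real nuclide is intended.
    def survival_probability(t_hours, half_life_hours=1.0):
        """Probability that the nucleus has NOT decayed after t_hours."""
        return 0.5 ** (t_hours / half_life_hours)

    for t in (1, 2, 3):
        p = survival_probability(t)
        print(f"after {t} h: survived {p:.1%}, decayed {1 - p:.1%}")
    # after 1 h: survived 50.0%, decayed 50.0%
    # after 2 h: survived 25.0%, decayed 75.0%
    # after 3 h: survived 12.5%, decayed 87.5%

Notice that the calculation yields only these probabilities as functions of time, exactly as described above; nothing in it singles out the moment of any particular decay.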

This is where the problem begins. It is a paradoxical (but entirely logical) fact that a probability only makes sense if it is the probability of something definite. For example, to say that Jane has a 70% chance of passing the French exam only means something if at some point she takes the exam and gets a definite grade.  At that point, the probability of her passing no longer remains 70%, but suddenly jumps to 100% (if she passes) or 0% (if she fails). In other words, probabilities of events that lie in between 0 and 100% must at some point jump to 0 or 100% or else they meant nothing in the first place.

This raises a thorny issue for quantum mechanics. The master equation that governs how wavefunctions change with time (the “Schrödinger equation”) does not yield probabilities that suddenly jump to 0 or 100%, but rather ones that vary smoothly and that generally remain greater than 0 and less than 100%.

Radioactive nuclei are a good example. The Schrödinger equation says that the “survival probability” of a nucleus (i.e. the probability of its not having decayed) starts off at 100%, and then falls continuously, reaching 50% after one half-life, 25% after two half-lives, and so on — but never reaching zero. In other words, the Schrödinger equation only gives probabilities of decaying, never an actual decay! (If there were an actual decay, the survival probability should jump to 0 at that point.) 

To recap: (a) Probabilities in quantum mechanics must be the probabilities of definite events. (b) When definite events happen, some probabilities should jump to 0 or 100%. However, (c) the mathematics that describes all physical processes (the Schrödinger equation) does not describe such jumps.  One begins to see how one might reach the conclusion that not everything that happens is a physical process describable by the equations of physics.
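To make point (c) concrete, here is a minimal sketch of Schrödinger evolution for a hypothetical two-state system (the Hamiltonian is an arbitrary toy choice, not a model of any real nucleus): the outcome probability computed from the wavefunction varies smoothly and never jumps to 0 or 100% of its own accord.

    import numpy as np

    # Toy two-state system ("undecayed" / "decayed"), units with hbar = 1.
    # Schrödinger evolution |psi(t)> = exp(-iHt)|psi(0)> is smooth and unitary.
    H = np.array([[0.0, 1.0],
                  [1.0, 0.0]])              # arbitrary illustrative Hamiltonian
    evals, evecs = np.linalg.eigh(H)

    def evolve(psi0, t):
        """Apply exp(-iHt) using the spectral decomposition of H."""
        U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
        return U @ psi0

    psi0 = np.array([1.0 + 0j, 0.0])        # start certainly "undecayed"
    for t in np.linspace(0.0, 3.0, 7):
        p = abs(evolve(psi0, t)[0]) ** 2    # probability of "undecayed"
        print(f"t = {t:.1f}: P(undecayed) = {p:.3f}")
    # The probabilities change continuously (here as cos^2 t); a sudden jump
    # to 0 or 1 would have to be added by hand as a separate "collapse" rule.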

So how do minds enter the picture?  The traditional understanding is that the “definite events” whose probabilities one calculates in quantum mechanics are the outcomes of “measurements” or “observations” (the words are used interchangeably).  If someone (traditionally called “the observer”) checks to see if, say, a nucleus has decayed (perhaps using a Geiger counter), he or she must get a definite answer: yes or no.

Obviously, at that point the probability of the nucleus having decayed (or survived) should jump to 0 or 100%, because the observer then knows the result with certainty.  This is just common sense. The probabilities assigned to events refer to someone’s state of knowledge: before I know the outcome of Jane’s exam I can only say that she has a 70% chance of passing; whereas after I know I must say either 0 or 100%.

Thus, the traditional view is that the probabilities in quantum mechanics — and hence the “wavefunction” that encodes them — refer to the state of knowledge of some “observer”.  (In the words of the famous physicist Sir James Jeans, wavefunctions are “knowledge waves.”)

An observer’s knowledge — and hence the wavefunction that encodes it — makes a discontinuous jump when he/she comes to know the outcome of a measurement (the famous “quantum jump”, traditionally called the “collapse of the wave function”). But the Schrödinger equations that describe any physical process do not give such jumps!  So something must be involved when knowledge changes besides physical processes.

An obvious question is why one needs to talk about knowledge and minds at all. Couldn’t an inanimate physical device (say, a Geiger counter) carry out a “measurement”?  That would run into the very problem pointed out by von Neumann: If the “observer” were just a purely physical entity, such as a Geiger counter, one could in principle write down a bigger wavefunction that described not only the thing being measured but also the observer. And, when calculated with the Schrödinger equation, that bigger wave function would not jump! Again: as long as only purely physical entities are involved, they are governed by an equation that says that the probabilities don’t jump.
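Von Neumann’s observation can be sketched in the same toy terms (everything below is schematic and purely illustrative, not a model of an actual Geiger counter): if the measuring device is treated as just another physical system, a unitary interaction merely entangles it with the nucleus, and the joint wavefunction still carries both outcomes with their original weights.

    import numpy as np

    # Schematic von Neumann measurement: a nucleus (undecayed/decayed) and a
    # two-state "pointer" (counter reads no/yes). Illustrative amplitudes only.
    alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
    nucleus = np.array([alpha, beta])     # alpha|undecayed> + beta|decayed>
    pointer = np.array([1.0, 0.0])        # counter initially reads "no"

    # CNOT-like unitary: flip the pointer exactly when the nucleus has decayed.
    U = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

    joint = U @ np.kron(nucleus, pointer)
    # joint = alpha|undecayed, no> + beta|decayed, yes>: still a superposition.
    print(joint[0] ** 2, joint[3] ** 2)   # 0.3 and 0.7 -- no outcome selected

No step in this purely physical description ever discards one of the two terms, which is exactly the difficulty described above.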

That’s why, when Peierls was asked whether a machine could be an “observer,” he said no, explaining that “the quantum mechanical description is in terms of knowledge, and knowledge requires somebody who knows.” Not a purely physical thing, but a mind.  

But what if one refuses to accept this conclusion, and maintains that only physical entities exist and that all observers and their minds are entirely describable by the equations of physics? Then the quantum probabilities remain in limbo, not 0 and 100% (in general) but hovering somewhere in between. They never get resolved into unique and definite outcomes, but somehow all possibilities remain always in play. One would thus be forced into what is called the “Many Worlds Interpretation” (MWI) of quantum mechanics.

In MWI, reality is divided into many branches corresponding to all the possible outcomes of all physical situations. If a probability was 70% before a measurement, it doesn’t jump to 0 or 100%; it stays 70% after the measurement, because in 70% of the branches there’s one result and in 30% there’s the other result! For example, in some branches of reality a particular nucleus has decayed — and “you” observe that it has, while in other branches it has not decayed — and “you” observe that it has not. (There are versions of “you” in every branch.)
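As a minimal sketch of that bookkeeping (purely schematic, with an invented 70% decay probability; nothing here models real physics), one can picture MWI as splitting every branch at each measurement while the weights simply factor, never jumping:

    # Schematic Many Worlds bookkeeping: branches split at a "measurement",
    # and their weights multiply; no weight ever jumps to 0 or 1.
    branches = [{"weight": 1.0, "record": []}]

    def measure(branches, p_decay):
        split = []
        for b in branches:
            split.append({"weight": b["weight"] * p_decay,
                          "record": b["record"] + ["decayed"]})
            split.append({"weight": b["weight"] * (1 - p_decay),
                          "record": b["record"] + ["survived"]})
        return split

    branches = measure(branches, 0.7)
    for b in branches:
        print(b)
    # One branch of weight 0.7 ("decayed") and one of weight 0.3 ("survived");
    # the total is still 1.0, and the 70% never became 0% or 100%.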

In the Many Worlds picture, you exist in a virtually infinite number of versions: in some branches of reality you are reading this article, in others you are asleep in bed, in others you have never been born. Even proponents of the Many Worlds idea admit that it sounds crazy and strains credulity.

The upshot is this: If the mathematics of quantum mechanics is right (as most fundamental physicists believe), and if materialism is right, one is forced to accept the Many Worlds Interpretation of quantum mechanics. And that is awfully heavy baggage for materialism to carry.

If, on the other hand, we accept the more traditional understanding of quantum mechanics that goes back to von Neumann, one is led by its logic (as Wigner and Peierls were) to the conclusion that not everything is just matter in motion, and that in particular there is something about the human mind that transcends matter and its laws.  It then becomes possible to take seriously certain questions that materialism had ruled out of court: If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

h1

The Many Splendored Blobs of Neurocentrism — Matthew Hutson

June 17, 2013
Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.

A review of Brainwashed by Sally Satel and Scott O. Lilienfeld, which appeared recently in the WSJ. Yes, love IS a many splendored blob…

****************************************

Humanity is under attack by blobs. Nestled in our brains, they appear to control our emotions. These infiltrators remain invisible without sophisticated technology, but when discovered they often make headlines.

Actually, to say that we discover them isn’t quite right. We create them: They are the bits of color seen in brain scans made with “functional magnetic resonance imaging,” or fMRI, in the parlance of the scientists, doctors and marketers who conduct this research. By measuring, analyzing and making inferences, scientists can learn that one part of your brain lights up when you wrestle with a decision; that another is exercised when you shop online; or that a third part makes you fall in love. (One branding expert used fMRI data to claim that Apple users literally adore their devices.)

Such neuroscientific techniques — fMRI is one of many — provide plenty to be excited about. The authors of “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” while sharing in this enthusiasm, offer a more skeptical take. At issue for psychiatrist Sally Satel and clinical psychologist Scott Lilienfeld is “neurocentrism,” or “the view that human experience and behavior can be best explained from the predominant or even exclusive perspective of the brain.” In their concise and well-researched book, they offer a reasonable and eloquent critique of this fashionable delusion, chiding the premature or unnecessary application of brain science to commerce, psychiatry, the law and ethics.

Brain scanning — at least as the technology stands today — suffers from a number of limitations. For starters, it often relies on a one-to-one mapping of cognitive function to brain area that simply doesn’t exist. Most thoughts are distributed, and “most neural real estate is zoned for mixed-use development,” as Dr. Satel and Mr. Lilienfeld write. So just knowing that disgust lights up your insula — a part of the cerebral cortex involved in attention, emotion and other functions — doesn’t imply that whenever the insula lights up you’re disgusted.

Despite such complexities, several firms have profited from selling, and perhaps overselling, fMRI’s capacity to peer into our souls. “Neuromarketers” try to suss out what drives us to buy one product rather than another. But there’s little public data to indicate that their methods work any better than the old standbys of surveys and focus groups. And they can blunder: In 2006, a neuroscientist declared a racy GoDaddy.com Super Bowl ad a flop after it failed to activate viewers’ pleasure centers. It had increased traffic to the site 16-fold.

If neurocentrism’s worst result were inspiring facile, gee-whiz headlines or bilking corporate advertisers out of cash, we could all go home with a good laugh over our obsession with Lite-Brite phrenology. But the neurocentric worldview has also crept into law enforcement and criminal justice. Predictably, defense attorneys try to use brain scans to prove that their clients lack rationality or impulse control and therefore can’t be held legally responsible. Companies such as No Lie MRI and Brain Fingerprinting Laboratories even claim to offer fMRI methods of lie detection.

One process looks for signs of recognition in a suspect’s brain as he views key evidence. This technique is fairly accurate in controlled conditions but requires evidence that has not been altered or leaked — i.e., details that the perpetrator and only the perpetrator would recognize. Another method looks for signs of neural conflict during questioning, indicating suppression of the truth. But no indicator is consistent across all liars or across all types of lies — spontaneous, rehearsed, remorseful, glib. The authors argue that fMRI lie detection is crummy legal evidence, and several courts have excluded such data because the methods’ accuracy outside the lab hasn’t been demonstrated.

Mr. Lilienfeld and Dr. Satel, who has worked in methadone clinics, spend a chapter confronting the popular model of addiction as a chronic brain disease. The trouble, they point out, is that most addicts eventually quit. In short, you can choose to stop using, but you can’t choose to stop having, say, Alzheimer’s. Those who promote the brain-disease model of addiction, including the National Institute on Drug Abuse, mean well when they strive to destigmatize addicts. But the authors say this model has distracted from behavioral therapies. Until a cocaine vaccine is available, they write, “the most effective interventions aim not at the brain but at the person.”

There are still more profound perils associated with the neurocentric vision. If the brain is just a biological machine, we have no free will and thus, strictly speaking, no claim to praise or blame. This violates both social norms and our own moral intuitions, and in the book’s final chapter the authors wade deeply into the philosophical debate about this new neurological determinism.

Moral responsibility, they argue, has practical benefits: “No society . . . can function and cohere unless its citizens exist within a system of personal accountability that stigmatizes some actions and praises others.” The position that Dr. Satel and Mr. Lilienfeld adopt is “compatibilism,” which holds that free will may not exist in an “ultimate” sense but exists in an “ordinary” sense, in that we feel free of constraints on our behavior. In everyday life, they argue, we should act as though the “ghost in the machine” were real.

In a book that uses “mindless” as an accusation in the subtitle, you might expect an excitable series of attacks on purveyors of what’s variously called neurohype, neurohubris and neurobollocks. But more often than not Dr. Satel and Mr. Lilienfeld stay fair and levelheaded. Good thing, because this is a topic that requires circumspection on all sides. Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.
