Archive for the ‘Science And Religion’ Category

Max Tegmark’s Our Mathematical Universe — Peter Woit

January 21, 2014
The multiverse has recently gained currency as an idea that describes the continuous formation of universes through the collapse of giant stars and the formation of black holes. Each of these black holes harbors a new point of singularity and, possibly, a new universe. As Rees describes it, “Our universe may be just one element – one atom, as it were – in an infinite ensemble: a cosmic archipelago. Each universe starts with its own big bang, acquires a distinctive imprint (and its individual physical laws) as it cools, and traces out its own cosmic cycle. The big bang that triggered our entire universe is, in this grander perspective, an infinitesimal part of an elaborate structure that extends far beyond the range of any telescopes.” (Rees) This puts our place in the multiverse into humbling perspective. While the size of the earth in relation to the sun is minuscule, the size of the sun, the solar system, the galaxy, and even the universe could pale in comparison to this proposed multiverse. Such a shift in thinking might help explain the big bang and possibly lend weight to the idea of parallel universes.

Mr. Woit is the author of “Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law.” His review of “Our Mathematical Universe” is a report from the front lines of mathematics and physics.

*****************************************

It’s a truly remarkable fact that our deepest understanding of the material world is embodied in mathematics, often in concepts that originated with some very different motivation. A good example is our best description of how gravity works, Einstein’s 1915 theory of general relativity, in which the gravitational force comes from the curvature of space and time.

The formulation of this theory required Einstein to use mathematics developed 60 years earlier by the great German mathematician Bernhard Riemann, who was studying abstract questions involving geometry. There’s now a long history of intertwined and experimentally tested discoveries about physics and mathematics. This unity between mathematics and physics is a source of wonder for those who study the two subjects, as well as an eternal conundrum for philosophers.

Max Tegmark thus begins his new book with a deep truth when he articulates a “Mathematical Universe Hypothesis,” which states simply that “physical reality is a mathematical structure.” His central claim ends up being that such a hypothesis implies a surprising new vision of how to do physics, but the slipperiness of that word “is” should make the reader wary. Mr. Tegmark raises here the age-old question of whether math just describes physical reality or whether it defines physical reality. This distinction is of relevance to philosophers, but its significance for practicing physicists is unclear.

“Our Mathematical Universe” opens with a memoir of Mr. Tegmark’s own career in physics. He’s now a cosmologist at MIT whose specialty is interpreting data about the structure and evolution of the universe, much of it gathered from new space- and earth-based instruments.

His book, however, quickly turns to the topic of the “multiverse” — the idea that our universe is part of some larger unobservable structure. Multiverse theories come in a baffling number of different versions. They have been a hot topic for the past dozen years, with Brian Greene’s “The Hidden Reality” (2011) a good example of a recent book covering this material.

Mr. Tegmark categorizes different multiverse proposals in terms of “Levels,” a useful method designed to keep track of the various theories. Many of these include some version of the idea that our universe is one of many unconnected universes obeying the same physical laws. This “Level I” type of multiverse is like Jorge Luis Borges’s “Library of Babel,” which contains all possible books, though most remain inaccessible to his story’s narrator due to their remoteness. As far back as 1584, Giordano Bruno proposed a universe of this sort, provoking mind-bending paradoxes involving infinite copies of oneself acting out completely different lives.

A much different type of multiverse arises in what is sometimes called the “many-worlds interpretation” of quantum theory. This is one way of thinking about the relationship between quantum mechanics and conventional human-scale physics. The idea is that while any quantum system is described by a single mathematical object called a quantum wave-function, this can contain within itself a description of an infinity of different possible worlds.

These correspond to the different possible states we may observe when we probe a quantum system with a macroscopic experimental apparatus. This multiverse is more like the “Garden of Forking Paths” that Borges describes in his story of that title, with each world branching off when we make an observation. Philosophical debate rages over what to think of such possible worlds: Are the ones we don’t end up in “real” or just a convenient calculational fiction? Mr. Tegmark calls the multiverse of such worlds a “Level III” multiverse.

These Level I and III possibilities fit reasonably well within variants of conventional views about our current best understanding of physics. The controversy surrounds what Mr. Tegmark calls “Level II” multiverses. At this level, different parts of a multiverse can have different physics — for instance, different fundamental forces, as well as different fundamental particles with different masses.

The problem: There is no experimental evidence for this and, arguably, no way of ever getting any, since our universe likely interacts in no way with any universes whose physics differs from our own. When someone is trying to sell a Level II multiverse theory, pay close attention to what exactly is being marketed; it comes without the warranty of an experimental test.

Since 1984 many physicists have worked on “string theory,” which posits a new unification of general relativity and quantum theory, achieved in part by abandoning the idea of fundamental particles. Early on, the new fundamental objects were supposed to be relatively well-defined one-dimensional vibrating string-like objects. Over the years this theory has evolved into something often called “M-theory,” which includes a wealth of poorly understood and mathematically complex components.

As far as one can now tell, if M-theory is to make sense, it will have so many possible solutions that one could produce just about any prediction about our observable universe that one might want. Such an unfalsifiable theory normally would be dismissed as unscientific, but proponents hope to salvage the situation by invoking a Level II multiverse containing all solutions to the theory. Our observed laws of physics would just represent a particular solution.

Mr. Tegmark wants to go even further down this rabbit hole. He assumes that what we observe is governed by something like M-theory, with its multiverse of different physical laws. But he wants to find a wider view that explains M-theory in terms of his “math is physics” hypothesis. He argues that his hypothesis implies the conclusion that “all mathematical structures exist.” The idea is that every example mathematicians teach in their classes, whether it’s a polynomial equation, a circle, a cube, or something much more complicated, represents an equally good universe. The collection of all mathematical structures he calls the “Level IV” multiverse, the highest and most general level.

Interpreting the meaning of “exists” in this way — to include all possible worlds — is a philosophical position known as “modal realism.” The innovation here is the claim that this carries a new insight into physics. The problem with such a conception of the ultimate nature of reality is not that it’s wrong but that it’s empty, far more radically untestable than even the already problematic proposals of M-theory. Mr. Tegmark proposes abandoning the historically proven path of pursuing a single exceptionally deep and very special mathematical structure at the core of both math and physics in favor of the hypothesis that, at the deepest level, “anything goes.”

Mr. Tegmark’s proposal takes him deep in the realm of speculation, and few of his fellow scientists are likely to want to follow him. There’s a danger, though, that his argument will convince some that “anything goes” is all there is to ultimate reality, discouraging their search for a better and more elegant version of our current best theories.

To be fair, Mr. Tegmark acknowledges he is going beyond conventional science, even including pithy advice about how to pursue a successful career while indulging in speculative topics that one’s colleagues are likely to see as beyond the bounds of what can be taken seriously. It’s worth remarking that not taking itself too seriously is one of the book’s virtues.

A final chapter argues for the importance of the “scientific lifestyle,” meaning scientific rationality as a basis for our decisions about important questions affecting the future of our species. But the great power of the scientific worldview has always come from its insistence that one should accept ideas based on experimental evidence, not on metaphysical reasoning or the truth-claims of authority figures. “Our Mathematical Universe” is a fascinating and well-executed dramatic argument from a talented expositor, but reading it with the skeptical mind-set of a scientist is advised.

Does Quantum Physics Make it Easier to Believe in God? — Stephen M. Barr

August 23, 2013
If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

A reblog from the site Big Questions Online

**************************************************

Not in any direct way. That is, it doesn’t provide an argument for the existence of God. But it does help indirectly, by providing an argument against the philosophy called materialism (or “physicalism”), which is the main intellectual opponent of belief in God in today’s world.

Materialism is an atheistic philosophy that says that all of reality is reducible to matter and its interactions. It has gained ground because many people think that it’s supported by science. They think that physics has shown the material world to be a closed system of cause and effect, sealed off from the influence of any non-physical realities — if any there be.

Since our minds and thoughts obviously do affect the physical world, it would follow that they are themselves merely physical phenomena. No room for a spiritual soul or free will: for materialists we are just “machines made of meat.”

Quantum mechanics, however, throws a monkey wrench into this simple mechanical view of things.  No less a figure than Eugene Wigner, a Nobel Prize winner in physics, claimed that materialism — at least with regard to the human mind — is not “logically consistent with present quantum mechanics.” And on the basis of quantum mechanics, Sir Rudolf Peierls, another great 20th-century physicist, said, “the premise that you can describe in terms of physics the whole function of a human being … including [his] knowledge, and [his] consciousness, is untenable. There is still something missing.”

How, one might ask, can quantum mechanics have anything to say about the human mind?  Isn’t it about things that can be physically measured, such as particles and forces?  It is; but while minds cannot be measured, it is ultimately minds that do the measuring. And that, as we shall see, is a fact that cannot be ignored in trying to make sense of quantum mechanics.

If one claims that it is possible (in principle) to give a complete physical description of what goes on during a measurement — including the mind of the person who is doing the measuring — one is led into severe difficulties. This was pointed out in the 1930s by the great mathematician John von Neumann.  Though I cannot go into technicalities in an essay such as this, I will try to sketch the argument.

It all begins with the fact that quantum mechanics is inherently probabilistic. Of course, even in “classical physics” (i.e. the physics that preceded quantum mechanics and that still is adequate for many purposes) one sometimes uses probabilities; but one wouldn’t have to if one had enough information.  Quantum mechanics is radically different: it says that even if one had complete information about the state of a physical system, the laws of physics would typically only predict probabilities of future outcomes. These probabilities are encoded in something called the “wavefunction” of the system.

A familiar example of this is the idea of “half-life.”  Radioactive nuclei are liable to “decay” into smaller nuclei and other particles.  If a certain type of nucleus has a half-life of, say, an hour, it means that a nucleus of that type has a 50% chance of decaying within 1 hour, a 75% chance within two hours, and so on. The quantum mechanical equations do not (and cannot) tell you when a particular nucleus will decay, only the probability of it doing so as a function of time. This is not something peculiar to nuclei. The principles of quantum mechanics apply to all physical systems, and those principles are inherently and inescapably probabilistic.
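To spell out the arithmetic behind those numbers (the essay quotes them without writing the formula), the standard exponential-decay law for a nucleus of half-life $T_{1/2}$ gives the survival probability

\[ P_{\text{survive}}(t) = \left(\tfrac{1}{2}\right)^{t/T_{1/2}}, \]

so with $T_{1/2} = 1$ hour we get $P_{\text{survive}}(1\,\text{h}) = 1/2$ and $P_{\text{survive}}(2\,\text{h}) = 1/4$, which is why the chance of having decayed within two hours is $1 - 1/4 = 75\%$.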

This is where the problem begins. It is a paradoxical (but entirely logical) fact that a probability only makes sense if it is the probability of something definite. For example, to say that Jane has a 70% chance of passing the French exam only means something if at some point she takes the exam and gets a definite grade.  At that point, the probability of her passing no longer remains 70%, but suddenly jumps to 100% (if she passes) or 0% (if she fails). In other words, probabilities of events that lie in between 0 and 100% must at some point jump to 0 or 100% or else they meant nothing in the first place.

This raises a thorny issue for quantum mechanics. The master equation that governs how wavefunctions change with time (the “Schrödinger equation”) does not yield probabilities that suddenly jump to 0 or 100%, but rather ones that vary smoothly and that generally remain greater than 0 and less than 100%.

Radioactive nuclei are a good example. The Schrödinger equation says that the “survival probability” of a nucleus (i.e. the probability of its not having decayed) starts off at 100%, and then falls continuously, reaching 50% after one half-life, 25% after two half-lives, and so on — but never reaching zero. In other words, the Schrödinger equation only gives probabilities of decaying, never an actual decay! (If there were an actual decay, the survival probability should jump to 0 at that point.) 
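In the same notation (introduced above as an illustration rather than taken from the essay), the smooth survival curve that the Schrödinger equation yields is

\[ P_{\text{survive}}(t) = 2^{-t/T_{1/2}} = e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{T_{1/2}}, \]

which is strictly positive for every finite $t$. A definite decay would require this probability to drop discontinuously to zero at some moment, and no solution of the smooth evolution behaves that way; that discontinuity is exactly the jump the author says the equation never supplies.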

To recap: (a) Probabilities in quantum mechanics must be the probabilities of definite events. (b) When definite events happen, some probabilities should jump to 0 or 100%. However, (c) the mathematics that describes all physical processes (the Schrödinger equation) does not describe such jumps.  One begins to see how one might reach the conclusion that not everything that happens is a physical process describable by the equations of physics.

So how do minds enter the picture?  The traditional understanding is that the “definite events” whose probabilities one calculates in quantum mechanics are the outcomes of “measurements” or “observations” (the words are used interchangeably).  If someone (traditionally called “the observer”) checks to see if, say, a nucleus has decayed (perhaps using a Geiger counter), he or she must get a definite answer: yes or no.

Obviously, at that point the probability of the nucleus having decayed (or survived) should jump to 0 or 100%, because the observer then knows the result with certainty.  This is just common sense. The probabilities assigned to events refer to someone’s state of knowledge: before I know the outcome of Jane’s exam I can only say that she has a 70% chance of passing; whereas after I know I must say either 0 or 100%.

Thus, the traditional view is that the probabilities in quantum mechanics — and hence the “wavefunction” that encodes them — refer to the state of knowledge of some “observer”.  (In the words of the famous physicist Sir James Jeans, wavefunctions are “knowledge waves.”)

An observer’s knowledge — and hence the wavefunction that encodes it — makes a discontinuous jump when he/she comes to know the outcome of a measurement (the famous “quantum jump”, traditionally called the “collapse of the wave function”). But the Schrödinger equations that describe any physical process do not give such jumps!  So something must be involved when knowledge changes besides physical processes.

An obvious question is why one needs to talk about knowledge and minds at all. Couldn’t an inanimate physical device (say, a Geiger counter) carry out a “measurement”?  That would run into the very problem pointed out by von Neumann: If the “observer” were just a purely physical entity, such as a Geiger counter, one could in principle write down a bigger wavefunction that described not only the thing being measured but also the observer. And, when calculated with the Schrödinger equation, that bigger wave function would not jump! Again: as long as only purely physical entities are involved, they are governed by an equation that says that the probabilities don’t jump.

That’s why, when Peierls was asked whether a machine could be an “observer,” he said no, explaining that “the quantum mechanical description is in terms of knowledge, and knowledge requires somebody who knows.” Not a purely physical thing, but a mind.  

But what if one refuses to accept this conclusion, and maintains that only physical entities exist and that all observers and their minds are entirely describable by the equations of physics? Then the quantum probabilities remain in limbo, not 0 and 100% (in general) but hovering somewhere in between. They never get resolved into unique and definite outcomes, but somehow all possibilities remain always in play. One would thus be forced into what is called the “Many Worlds Interpretation” (MWI) of quantum mechanics.

In MWI, reality is divided into many branches corresponding to all the possible outcomes of all physical situations. If a probability was 70% before a measurement, it doesn’t jump to 0 or 100%; it stays 70% after the measurement, because in 70% of the branches there’s one result and in 30% there’s the other result! For example, in some branches of reality a particular nucleus has decayed — and “you” observe that it has, while in other branches it has not decayed — and “you” observe that it has not. (There are versions of “you” in every branch.)

In the Many Worlds picture, you exist in a virtually infinite number of versions: in some branches of reality you are reading this article, in others you are asleep in bed, in others you have never been born. Even proponents of the Many Worlds idea admit that it sounds crazy and strains credulity.

The upshot is this: If the mathematics of quantum mechanics is right (as most fundamental physicists believe), and if materialism is right, one is forced to accept the Many Worlds Interpretation of quantum mechanics. And that is awfully heavy baggage for materialism to carry.

If, on the other hand, we accept the more traditional understanding of quantum mechanics that goes back to von Neumann, one is led by its logic (as Wigner and Peierls were) to the conclusion that not everything is just matter in motion, and that in particular there is something about the human mind that transcends matter and its laws.  It then becomes possible to take seriously certain questions that materialism had ruled out of court: If the human mind transcends matter to some extent, could there not exist minds that transcend the physical universe altogether? And might there not even exist an ultimate Mind?

The Many Splendored Blobs of Neurocentrism — Matthew Hutson

June 17, 2013
Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.

A review of Brainwashed by Sally Satel and Scott O. Lilienfeld, which appeared recently in the WSJ. Yes, love IS a many splendored blob…

****************************************

Humanity is under attack by blobs. Nestled in our brains, they appear to control our emotions. These infiltrators remain invisible without sophisticated technology, but when discovered they often make headlines.

Actually, to say that we discover them isn’t quite right. We create them: They are the bits of color seen in brain scans, or “functional magnetic resonance imaging” (fMRI), in the parlance of the scientists, doctors and marketers who conduct this research. By measuring, analyzing and making inferences, scientists can learn that one part of your brain lights up when you wrestle with a decision; that another is exercised when you shop online; or that a third part makes you fall in love. (One branding expert used fMRI data to claim that Apple users literally adore their devices.)

Such neuroscientific techniques — fMRI is one of many — provide plenty to be excited about. The authors of “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” while sharing in this enthusiasm, offer a more skeptical take. At issue for psychiatrist Sally Satel and clinical psychologist Scott Lilienfeld is “neurocentrism,” or “the view that human experience and behavior can be best explained from the predominant or even exclusive perspective of the brain.” In their concise and well-researched book, they offer a reasonable and eloquent critique of this fashionable delusion, chiding the premature or unnecessary application of brain science to commerce, psychiatry, the law and ethics.

Brain scanning — at least as the technology stands today — suffers from a number of limitations. For starters, it often relies on a one-to-one mapping of cognitive function to brain area that simply doesn’t exist. Most thoughts are distributed, and “most neural real estate is zoned for mixed-use development,” as Dr. Satel and Mr. Lilienfeld write. So just knowing that disgust lights up your insula — a part of the cerebral cortex involved in attention, emotion and other functions — doesn’t imply that whenever the insula lights up you’re disgusted.

Despite such complexities, several firms have profited from selling, and perhaps overselling, fMRI’s capacity to peer into our souls. “Neuromarketers” try to suss out what drives us to buy one product rather than another. But there’s little public data to indicate that their methods work any better than the old standbys of surveys and focus groups. And they can blunder: In 2006, a neuroscientist declared a racy GoDaddy.com Super Bowl ad a flop after it failed to activate viewers’ pleasure centers. It had increased traffic to the site 16-fold.

If neurocentrism’s worst result were inspiring facile, gee-whiz headlines or bilking corporate advertisers out of cash, we could all go home with a good laugh over our obsession with Lite-Brite phrenology. But the neurocentric worldview has also crept into law enforcement and criminal justice. Predictably, defense attorneys try to use brain scans to prove that their clients lack rationality or impulse control and therefore can’t be held legally responsible. Companies such as No Lie MRI and Brain Fingerprinting Laboratories even claim to offer fMRI methods of lie detection.

One process looks for signs of recognition in a suspect’s brain as he views key evidence. This technique is fairly accurate in controlled conditions but requires evidence that has not been altered or leaked — i.e., details that the perpetrator and only the perpetrator would recognize. Another method looks for signs of neural conflict during questioning, indicating suppression of the truth. But no indicator is consistent across all liars or across all types of lies — spontaneous, rehearsed, remorseful, glib. The authors argue that fMRI lie detection is crummy legal evidence, and several courts have excluded such data because their accuracy outside the lab hasn’t been demonstrated.

Mr. Lilienfeld and Dr. Satel, who has worked in methadone clinics, spend a chapter confronting the popular model of addiction as a chronic brain disease. The trouble, they point out, is that most addicts eventually quit. In short, you can choose to stop using, but you can’t choose to stop having, say, Alzheimer’s. Those who promote the brain-disease model of addiction, including the National Institute on Drug Abuse, mean well when they strive to destigmatize addicts. But the authors say this model has distracted from behavioral therapies. Until a cocaine vaccine is available, they write, “the most effective interventions aim not at the brain but at the person.”

There are still more profound perils associated with the neurocentric vision. If the brain is just a biological machine, we have no free will and thus, strictly speaking, no claim to praise or blame. This violates both social norms and our own moral intuitions, and in the book’s final chapter the authors wade deeply into the philosophical debate about this new neurological determinism.

Moral responsibility, they argue, has practical benefits: “No society . . . can function and cohere unless its citizens exist within a system of personal accountability that stigmatizes some actions and praises others.” The position that Dr. Satel and Mr. Lilienfeld adopt is “compatibilism,” which holds that free will may not exist in an “ultimate” sense but exists in an “ordinary” sense, in that we feel free of constraints on our behavior. In everyday life, they argue, we should act as though the “ghost in the machine” were real.

In a book that uses “mindless” accusatively in the subtitle, you might expect an excitable series of attacks on purveyors of what’s variously called neurohype, neurohubris and neurobollocks. But more often than not Dr. Satel and Mr. Lilienfeld stay fair and levelheaded. Good thing, because this is a topic that requires circumspection on all sides. Neuroimaging isn’t the hard science we like to think it is. Our interpretations of those splotches of color depend upon multiple assumptions about the human mind, and applying fMRI insights outside the lab requires many more. To some degree, the blobs are a cultural construct, a useful fiction. In other words, they’re all in our heads.

God the Creator 2 –Benedict XVI

April 5, 2013
Staring across interstellar space, the Cat’s Eye Nebula lies three thousand light-years from Earth. One of the most famous planetary nebulae, NGC 6543 is over half a light-year across and represents a final, brief yet glorious phase in the life of a sun-like star… “We must not in our own day conceal our faith in creation. We may not conceal it, for only if it is true that the universe comes from freedom, love, and reason, and that these are the real underlying powers, can we trust one another, go forward into the future, and live as human beings. God is the Lord of all things because he is their creator, and only therefore can we pray to him. For this means that freedom and love are not ineffectual ideas but rather that they are sustaining forces of reality.”

The Unity of the Bible as a Criterion for Its Interpretation
[Continued from previous post...] So now we still have to ask: Is the distinction between the image and what is intended to be expressed only an evasion, because we can no longer rely on the text even though we still want to make something of it, or are there criteria from the Bible itself that attest to this distinction? Does it give us access to indications of this sort, and did the faith of the church know of these indications in the past and acknowledge them?

Let us look at Holy Scripture anew with these questions in mind. There we can determine first of all that the creation account in Genesis 1, which we have just heard, is not, from its very beginning, something that is closed in on itself. Indeed, Holy Scripture in its entirety was not written from beginning to end like a novel or a textbook.

It is, rather, the echo of God’s history with his people. It arose out of the struggles and the vagaries of this history, and all through it we can catch a glimpse of the rises and falls, the sufferings and hopes, and the greatness and failures of this history. The Bible is thus the story of God’s struggle with human beings to make himself understandable to them over the course of time; but it is also the story of their struggle to seize hold of God over the course of time.

Hence the theme of creation is not set down once for all in one place; rather, it accompanies Israel throughout its history, and, indeed, the whole Old Testament is a journeying with the Word of God. Only in the process of this journeying was the Bible’s real way of declaring itself formed, step by step.

Consequently we ourselves can only discover where this way is leading if we follow it to the end. In this respect — as a way — the Old and New Testaments belong together. For the Christian the Old Testament represents, in its totality, an advance toward Christ; only when it attains to him does its real meaning, which was gradually hinted at, become clear.

Thus every individual part derives its meaning from the whole, and the whole derives its meaning from its end — from Christ. Hence we only interpret an individual text theologically correctly (as the fathers of the church recognized and as the faith of the church in every age has recognized) when we see it as a way that is leading us ever forward, when we see in the text where this way is tending and what its inner direction is.

What significance, now, does this insight have for the understanding of the creation account? The first thing to be said is this: Israel always believed in the Creator God, and this faith it shared with all the great civilizations of the ancient world. For, even in the moments when monotheism was eclipsed, all the great civilizations always knew of the Creator of heaven and earth.

There is a surprising commonality here even between civilizations that could never have been in touch with one another. In this commonality we can get a good grasp of the profound and never altogether lost contact that human beings had with God’s truth. In Israel itself the creation theme went through several different stages. It was never completely absent, but it was not always equally important.

There were times when Israel was so preoccupied with the sufferings or the hopes of its own history, so fastened upon the here and now, that there was hardly any use in its looking back at creation; indeed, it hardly could. The moment when creation became a dominant theme occurred during the Babylonian Exile. It was then that the account that we have just heard — based, to be sure, on very ancient traditions — assumed its present form. Israel had lost its land and its temple.

According to the mentality of the time this was something incomprehensible, for it meant that the God of Israel was vanquished: a God whose people, whose land, and whose worshipers could be snatched away from him. A God who could not defend his worshipers and his worship was seen to be, at the time, a weak God. Indeed, he was no God at all; he had abandoned his divinity. And so, being driven out of their own land and being erased from the map was for Israel a terrible trial: Has our God been vanquished, and is our faith void?

At this moment the prophets opened a new page and taught Israel that it was only then that the true face of God appeared and that he was not restricted to that particular piece of land. He had never been: He had promised this piece of land to Abraham before he settled there, and he had been able to bring his people out of Egypt. He could do both things because he was not the God of one place but had power over heaven and earth.

Therefore he could drive his faithless people into another land in order to make himself known there. And so it came to be understood that this God of Israel was not a God like the other gods, but that he was the God who held sway over every land and people. He could do this, however, because he himself had created everything in heaven and on earth. It was in exile and in the seeming defeat of Israel that there occurred an opening to the awareness of the God who holds every people and all of history in his hands, who holds everything because he is the creator of everything and the source of all power.

This faith now had to find its own contours, and it had to do so precisely vis-à-vis the seemingly victorious religion of Babylon, which was displayed in splendid liturgies, like that of the New Year, in which the re-creation of the world was celebrated and brought to its fulfillment. It had to find its contours vis-à-vis the great Babylonian creation account of Enuma Elish, which depicted the origin of the world in its own fashion.

There it is said that the world was produced out of a struggle between opposing powers and that it assumed its form when Marduk, the god of light, appeared and split in two the body of the primordial dragon. From this sundered body heaven and earth came to be. Thus the firmament and the earth were produced from the sundered body of the dead dragon, but from its blood Marduk fashioned human beings.

It is a foreboding picture of the world and of humankind that we encounter here: The world is a dragon’s body, and human beings have dragon’s blood in them. At the very origin of the world lurks something sinister, and in the deepest part of humankind there lies something rebellious, demonic, and evil. In this view of things only a dictator, the king of Babylon, who is the representative of Marduk, can repress the demonic and restore the world to order.

Such views were not simply fairy tales. They expressed the discomfiting realities that human beings experienced in the world and among themselves. For often enough it looks as if the world is a dragon’s lair and human blood is dragon’s blood. But despite all oppressive experiences the scriptural account says that it was not so. The whole tale of these sinister powers melts away in a few words: “The earth was without form and void.”

Behind these Hebrew words lie the dragon and the demonic powers that are spoken of elsewhere. Now it is the void that alone remains and that stands as the sole power over against God. And in the face of any fear of these demonic forces we are told that God alone, who is the eternal Reason that is eternal love, created the world, and that it rests in his hands. Only with this in mind can we appreciate the dramatic confrontation implicit in this biblical text, in which all these confused myths were rejected and the world was given its origin in God’s Reason and in his Word.

This could be shown almost word for word in the present text — as, for example, when the sun and the moon are referred to as lamps that God has hung in the sky for the measurement of time. To the people of that age it must have seemed a terrible sacrilege to designate the great gods sun and moon as lamps for measuring time. Here we see the audacity and the temperateness of the faith that, in confronting the pagan myths, made the light of truth appear by showing that the world was not a demonic contest but that it arose from God’s Reason and reposes on God’s Word.

Hence this creation account may be seen as the decisive “enlightenment” of history and as a breakthrough out of the fears that had oppressed humankind. It placed the world in the context of reason and recognized the world’s reasonableness and freedom. But it may also be seen as the true enlightenment from the fact that it put human reason firmly on the primordial basis of God’s creating Reason, in order to establish it in truth and in love, without which an “enlightenment” would be exorbitant and ultimately foolish.

To this something further must be added. I just said how, gradually, in confronting its pagan environment and its own heart, the people of Israel experienced what “creation” was. Implicit here is the fact that the classic creation account is not the only creation text of sacred Scripture. Immediately after it there follows another one, composed earlier and containing other imagery.

In the Psalms there are still others, and there the movement to clarify the faith concerning creation is carried further: In its confrontation with Hellenistic civilization, Wisdom literature reworks the theme without sticking to the old images such as the seven days. Thus we can see how the Bible itself constantly readapts its images to a continually developing way of thinking, how it changes time and again in order to bear witness, time and again, to the one thing that has come to it, in truth, from God’s Word, which is the message of his creating act.

In the Bible itself the images are free and they continually correct themselves. In this way they show, by means of a gradual and interactive process, that they are only images, which reveal something deeper and greater.

Christology as a Criterion
One decisive fact must still be mentioned at this point: The Old Testament is not the end of the road. What is worked out in the so-called Wisdom literature is the final bridge on a long road that leads to the message of Jesus Christ and to the New Testament. Only there do we find the conclusive and normative scriptural creation account, which reads: “In the beginning was the Word, and the Word was with God, and the Word was God…. All things were made through him, and without him was not anything made that was made” (John 1:1, 3).

John quite consciously took up here once again the first words of the Bible and read the creation account anew, with Christ, in order to tell us definitively what the Word is which appears throughout the Bible and with which God desires to shake our hearts. Thus it becomes clear to us that we Christians do not read the Old Testament for its own sake but always with Christ and through Christ. Consequently the law of Moses, the rituals of purification, the regulations concerning food, and all other such things are not to be carried out by us; otherwise the biblical Word would be senseless and meaningless.

We read all of this not as if it were something complete in itself. We read it with him in whom all things have been fulfilled and in whom all of its validity and truth are revealed. Therefore we read the law, like the creation account, with him; and from him (and not from some subsequently discovered trick) we know what God wished over the course of centuries to have gradually penetrate the human heart and soul. Christ frees us from the slavery of the letter, and precisely thus does he give back to us, renewed, the truth of the images.

The ancient church and the church of the Middle Ages also knew this. They knew that the Bible is a whole and that we only understand its truth when we understand it with Christ in mind — with the freedom that he bestowed on us and with the profundity whereby he reveals what is enduring through images.

Only at the beginning of the modern era was this dynamic forgotten — this dynamic that is the living unity of Scripture, which we can only understand with Christ in the freedom that he gives us and in the certitude that comes from that freedom. The new historical thinking wanted to read every text in itself, in its bare literalness. Its interest lay only in the exact explanation of particulars, but meanwhile it forgot the Bible as a whole.

In a word, it no longer read the texts forward but backward — that is, with a view not to Christ but to the probable origins of those texts. People were no longer concerned with understanding what a text said or what a thing was from the aspect of its fulfillment, but from that of its beginning, its source.

As a result of this isolation from the whole and of this literal-mindedness with respect to particulars, which contradicts the entire inner nature of the Bible but which was now considered to be the truly scientific approach, there arose that conflict between the natural sciences and theology which has been, up to our own day, a burden for the faith.

This did not have to be the case, because the faith was, from its very beginnings, greater, broader, and deeper. Even today faith in creation is not unreal; even today it is reasonable; even from the perspective of the data of the natural sciences it is the “better hypothesis,” offering a fuller and better explanation than any of the other theories. Faith is reasonable. The reasonableness of creation derives from God’s Reason, and there is no other really convincing explanation. What the pagan Aristotle said four hundred years before Christ — when he opposed those who asserted that everything has come to exist through chance, even though he said what he did without the knowledge that our faith in creation gives us — is still valid today.

The reasonableness of the universe provides us with access to God’s Reason, and the Bible is and continues to be the true “enlightenment,” which has given the world over to human reason and not to exploitation by human beings, because it opened reason to God’s truth and love. Therefore we must not in our own day conceal our faith in creation. We may not conceal it, for only if it is true that the universe comes from freedom, love, and reason, and that these are the real underlying powers, can we trust one another, go forward into the future, and live as human beings. God is the Lord of all things because he is their creator, and only therefore can we pray to him. For this means that freedom and love are not ineffectual ideas but rather that they are sustaining forces of reality.

And so we wish to cite today, in thankfulness and joy, the church’s creed: “I believe in God, the Father Almighty, Creator of heaven and earth.” Amen.

The Ineffable Mystery of God – Fr. Robert Barron

August 21, 2012

The cloister yard of Santa Sabina where it is reputed St. Thomas walked and pondered.

After many years of exile from the courts of Egypt where he had been raised, a Hebrew man named Moses, while tending the flock of his father-in-law on the slopes of Mount Sinai, saw an extraordinary sight: a bush that was on fire but was not being consumed. He resolved to take a closer look. As he approached, he heard a voice: “Moses! Moses! … Come no nearer! Remove the sandals from your feet, for the place where you stand is holy ground” (Exodus 3:5). Then the speaker identified himself as “the God of your father … the God of Abraham, the God of Isaac, the God of Jacob” (Exodus 3:6), and he gave Moses a mission to liberate his people enslaved in Egypt.

When Moses asked for the name of this mysterious speaker, he received the following answer: “I am who am” (Exodus 3:14). Moses was asking a reasonable enough question. He was wondering which of the many gods — deities of the river, the mountain, the various nations — this was. He was seeking to define and specify the nature of this particular heavenly power.

But the answer he received frustrated him. For the divine speaker was implying that he was not one god among many, not this deity rather than that, not a reality that could, even in principle, be captured or delimited by a name. In a certain sense, God’s response amounted to the undermining of the very type of question Moses posed. His name was simply “to be,” and therefore he could never be mastered. The ancient Israelites honored this essential mysteriousness of God by designating him with the unpronounceable name of YHWH.

Following the prompting of this conversation between Moses and God, the mainstream of the Catholic theological tradition has tended not to refer to God as a being, however supreme, among many. Thomas Aquinas, arguably the greatest theologian in the Catholic tradition, rarely designates God as ens summum (the highest being); rather he prefers the names ipsum esse (to be itself) or qui est (the one who is). In fact, Aquinas goes so far as to say that God cannot be defined or situated within any genus, even the genus of “being.” This means that it is wrong to say that trees, planets, automobiles, computers, and God — despite the obvious differences among them — have at least in common their status as beings. Aquinas expresses the difference that obtains between God and creatures through the technical language of essence and existence.

In everything that is not God there is a real distinction between essence (what the thing is) and existence (that the thing is); but in God no such distinction holds, for God’s act of existence is not received, delimited, or defined by anything extraneous to itself. A human being is the act of existence poured, as it were, into the receptacle of humanity, and a podium is the act of existence poured into the form of podium-ness, but God’s act of existence is not poured into any receiving element. To be God, therefore, is to be to be.

Saint Anselm of Canterbury, one of the greatest of the early medieval theologians, described God as “that than which nothing greater can be thought.” At first blush this seems straightforward enough: God is the highest conceivable thing. But the longer one meditates on Anselm’s description, the stranger it becomes. If God were simply the supreme being — the biggest reality among many — then God plus the world would be greater than God alone. But in that case he would not be that than which nothing greater can be thought. Zeus, for example, was, in ancient mythology, the supreme deity, but clearly Zeus plus the other gods, or Zeus plus the world of nature, would be greater than Zeus alone. Thus the God whom Anselm is describing is not like this at all. Though it is a very high paradox, the God whom Anselm describes added to the world as we know it is not greater than God alone.

This means that the true God exceeds all of our concepts, all of our language, all of our loftiest ideas. God (YHWH) is essentially mysterious, a term, by the way, derived from the Greek muein (to shut one’s mouth). How often the prophets and mystics of the Old Testament rail against idolatry, which is nothing other than reducing the true God to some creaturely object that we can know and hence try to control. The twentieth-century theologian Karl Rahner commented that “God” is the last sound we should make before falling silent, and Saint Augustine, long ago, said, “si comprehendis, non est Deus” (if you understand, that isn’t God). All of this formal theologizing is but commentary on that elusive and confounding voice from the burning bush: “I am who am.”

Arguments For God’s Existence
I have firmly fended off the tendency to turn God into an idol, but have I left us thereby in an intellectual lurch, doomed simply to remain silent about God? If God cannot be in any sense defined, how do we explain the plethora of theological books and arguments? After all, the same Thomas Aquinas who said that God cannot be placed in any genus also wrote millions of words about God. Chapter 33 of Exodus gives us a clue to the resolution of this dilemma. Moses passionately asks God to reveal his glory to him, and Yahweh acquiesces. But the Lord specifies, “I will make all my beauty pass before you … But my face you cannot see, for no man sees me and still lives” (Exodus 33:19-20). God then tells Moses that while the divine glory passes by, God will place his servant in the cleft of a rock and cover Moses’s eyes. “Then I will remove my hand, so that you may see my back; but my face is not to be seen” (Exodus 33:22-23). God can indeed be seen in this life, but only indirectly, through his creatures and effects. We can understand him to a degree, but only obliquely, glimpsing him, as it were, out of the corners of our eyes. We see his “back” as it is disclosed in the beauty, the intelligibility, and the contingency of the world that he has made.

Following this principle of indirection, Thomas Aquinas formulated five arguments for God’s existence, each one of which begins from some feature of the created order. I will develop here the one that I consider the most elemental, the demonstration that commences with the contingency of the world. Though the term is technically philosophical, “contingency” actually names something with which we are all immediately familiar: the fact that things come into being and pass out of being. Consider a majestic summer cloud that billows up and then fades away in the course of a lazy August afternoon, coming into existence and then evanescing.

Now think of all of the plants and flowers that have grown up and subsequently withered away, and then of all the animals that have come into being, roamed the face of the earth, and then faded into dust. And ponder the numberless human beings who have come and gone, confirming the Psalmist’s intuition that “our years end like a sigh” (Psalms 90:9).  Even those things that seem most permanent — mountain ranges, the continents themselves, the oceans — have in fact emerged and will in fact fade. Indeed, if a time-lapse camera could record the entire life span of the Rocky Mountains, from the moment they began to emerge to the moment when they finally wear away, and if we could play that film at high speed, those mountains would look for all the world like that summer cloud.

The contingency of earthly things is the starting point of Aquinas’s proof, for it indicates something of great moment, namely, that such things do not contain within themselves the reason for their own existence. If they did, they would exist, simply and absolutely; they would not come and go so fleetingly. Therefore, in regard to contingent things, we have to look outside of them, to an extrinsic cause, or set of causes, in order to explain their existence. So let’s go back to that summer cloud. Instinctually, we know that it doesn’t exist through its own essence, and we therefore look for explanations. We say that it is caused by the moisture in the atmosphere, by the temperature, by the intensity of the winds, and so on, and as far as it goes, that explanation is adequate.

But as any meteorologist will tell us, those factors are altogether contingent, coming into being and passing out of being. Thus we go a step further and say that these factors in turn are caused by the jet stream, which is grounded in the movement of the planet. But a moment’s reflection reveals that the jet stream comes and goes, ebbs and flows, and that the earth itself is contingent, having emerged into existence four billion years ago and being destined one day to be incinerated by the expanding sun.

And so we go further, appealing to the solar system and events within the galaxy and finally perhaps to the very structures inherent in the universe. But contemporary astrophysics has disclosed to us the fundamental contingency of all of those realities, and indeed of the universe itself, which came into existence at the Big Bang some thirteen billion years ago. In our attempt to explain a contingent reality — that evanescent summer cloud — we have appealed simply to a whole series of similarly contingent realities, each one of which requires a further explanation.

Thomas Aquinas argues that if we are to avoid an infinite regress of contingent causes, which finally explain nothing at all, we must come finally to some “necessary” reality, something that exists simply through the power of its own essence. This, he concludes, is what people mean when they use the word “God.” With Aquinas’s demonstration in mind, reconsider that strange answer God gives to Moses’s question: “I am who am.” The biblical God is not one contingent reality among many; he is that whose very nature it is to exist, that power through which and because of which all other things have being.

Some contemporary theologians have translated Aquinas’s abstract metaphysical language into more experiential language. The Protestant theologian Paul Tillich said that “finitude in awareness is anxiety.” He means that when we know in our bones how contingent we are, we become afraid. We exist in time, and this means that we are moving, ineluctably, toward death; we have been “thrown” into being, and this means that one day we will be thrown out of being; and this state of affairs produces fear and trembling. In the grip of this anxiety, Tillich argues, we tend to thrash about, looking for something to reassure us, searching for some firm ground on which to stand.

We seek to alleviate our fears through the piling up of pleasure, wealth, power, or honor, but we discover, soon enough, that all of these worldly realities are as contingent as we are and hence cannot finally soothe us. It is at this point that the scriptural word “My soul rests in God alone” (Psalms 62:1) is heard in its deepest resonance. Our fear — born of contingency — will be assuaged only by that which is not contingent. Our shaken and fragile existence will be stabilized only when placed in relation to the eternal and necessary existence of God. Tillich is, in many ways, a contemporary disciple of Saint Augustine, who said, “Lord, you have made us for yourself, and our hearts are restless till they rest in Thee.”

In 1968 a young theology professor at the University of Tübingen formulated a neat argument for God’s existence that owed a good deal to Thomas Aquinas but that also drew on more contemporary sources. The theologian’s name was Joseph Ratzinger, now Pope Benedict XVI. Ratzinger commences with the observation that finite being, as we experience it, is marked, through and through, by intelligibility, that is to say, by a formal structure that makes it understandable to an inquiring mind. In point of fact, all of the sciences — physics, chemistry, psychology, astronomy, biology, and so forth — rest on the assumption that at all levels, microscopic and macrocosmic, being can be known. The same principle was acknowledged in ancient times by Pythagoras, who said that all existing things correspond to a numeric value, and in medieval times by the scholastic philosophers who formulated the dictum omne ens est scibile (all being is knowable).

Ratzinger argues that the only finally satisfying explanation for this universal objective intelligibility is a great Intelligence who has thought the universe into being. Our language provides an intriguing clue in this regard, for we speak of our acts of knowledge as moments of “recognition,” literally a re-cognition, a thinking again what has already been thought. Ratzinger cites Einstein in support of this connection: “in the laws of nature, a mind so superior is revealed that in comparison, our minds are as something worthless.”

The prologue to the Gospel of John states, “In the beginning was the Word,” and specifies that all things came to be through this divine Logos, implying thereby that the being of the universe is not dumbly there, but rather intelligently there, imbued by a creative mind with intelligible structure. The argument presented by Joseph Ratzinger is but a specification of that great revelation.

One of the particular strengths of this argument is that it shows the deep compatibility between religion and science, two disciplines that so often today are seen as implacable enemies. Ratzinger shows that the physical sciences rest upon the finally mystical intuition that reality has been thought into existence and hence can be known. I say it is mystical because it cannot itself be the product of empirical or experimental investigation, but is instead the very condition for the possibility of analyzing and experimenting in the first place. This is why many theorists have speculated that the emergence of the modern sciences in the context of a Christian intellectual milieu, in which the doctrine of creation through the power of an intelligent Creator is affirmed, is not the least bit accidental.

h1

The Abolition of Man Part One – C.S. Lewis

May 24, 2012

National Review ranked the 1943 book #7 in its 100 Best Non-Fiction Books of the 20th Century list. The Intercollegiate Studies Institute ranked the book as the second best book of the 20th century. In a lecture on Walker Percy, Professor Peter Kreeft of Boston College listed the book as one of five “books to read to save Western Civilization,” alongside Lost in the Cosmos by Walker Percy, Mere Christianity by C.S. Lewis, The Everlasting Man by G.K. Chesterton, Orthodoxy by G.K. Chesterton, and Brave New World by Aldous Huxley.

After the posts of the past couple of weeks on pornography, I recalled a Woody Allen line about being on the losing side of the sexual revolution, which dovetailed with this classic C.S. Lewis piece concerning Man’s somewhat questionable conquest of Nature. If you have never read it, please do. A simple but depressing message: We have been sold for slaves.

 

**********************************************

It came burning hot into my mind, whatever he said and however he flattered, when he got me home to his house, he would sell me for a slave.
John Bunyan

*********************************************

`Man’s conquest of Nature’ is an expression often used to describe the progress of applied science. `Man has Nature whacked,’ said someone to a friend of mine not long ago. In their context the words had a certain tragic beauty, for the speaker was dying of tuberculosis. `No matter’ he said, `I know I’m one of the casualties. Of course there are casualties on the winning as well as on the losing side. But that doesn’t alter the fact that it is winning.’

I have chosen this story as my point of departure in order to make it clear that I do not wish to disparage all that is really beneficial in the process described as `Man’s conquest’, much less all the real devotion and self-sacrifice that has gone to make it possible. But having done so I must proceed to analyse this conception a little more closely. In what sense is Man the possessor of increasing power over Nature?

Let us consider three typical examples: the airplane, the wireless, and the contraceptive. In a civilized community, in peace-time, anyone who can pay for them may use these things. But it cannot strictly be said that when he does so he is exercising his own proper or individual power over Nature. If I pay you to carry me, I am not therefore myself a strong man.

Any or all of the three things I have mentioned can be withheld from some men by other men — by those who sell, or those who allow the sale, or those who own the sources of production, or those who make the goods. What we call Man’s power is, in reality, a power possessed by some men which they may, or may not, allow other men to profit by. Again, as regards the powers manifested in the airplane or the wireless, Man is as much the patient or subject as the possessor, since he is the target both for bombs and for propaganda.

And as regards contraceptives, there is a paradoxical, negative sense in which all possible future generations are the patients or subjects of a power wielded by those already alive. By contraception simply, they are denied existence; by contraception used as a means of selective breeding, they are, without their concurring voice, made to be what one generation, for its own reasons, may choose to prefer. From this point of view, what we call Man’s power over Nature turns out to be a power exercised by some men over other men with Nature as its instrument.

It is, of course, a commonplace to complain that men have hitherto used badly, and against their fellows, the powers that science has given them. But that is not the point I am trying to make. I am not speaking of particular corruptions and abuses which an increase of moral virtue would cure: I am considering what the thing called `Man’s power over Nature’ must always and essentially be. No doubt, the picture could be modified by public ownership of raw materials and factories and public control of scientific research. But unless we have a world state this will still mean the power of one nation over others. And even within the world state or the nation it will mean (in principle) the power of majorities over minorities, and (in the concrete) of a government over the people. And all long-term exercises of power, especially in breeding, must mean the power of earlier generations over later ones.

The latter point is not always sufficiently emphasized, because those who write on social matters have not yet learned to imitate the physicists by always including Time among the dimensions. In order to understand fully what Man’s power over Nature, and therefore the power of some men over other men, really means, we must picture the race extended in time from the date of its emergence to that of its extinction. Each generation exercises power over its successors: and each, in so far as it modifies the environment bequeathed to it and rebels against tradition, resists and limits the power of its predecessors. This modifies the picture which is sometimes painted of a progressive emancipation from tradition and a progressive control of natural processes resulting in a continual increase of human power.

In reality, of course, if any one age really attains, by eugenics and scientific education, the power to make its descendants what it pleases, all men who live after it are the patients of that power. They are weaker, not stronger: for though we may have put wonderful machines in their hands we have pre-ordained how they are to use them. And if, as is almost certain, the age which had thus attained maximum power over posterity were also the age most emancipated from tradition, it would be engaged in reducing the power of its predecessors almost as drastically as that of its successors. And we must also remember that, quite apart from this, the later a generation comes — the nearer it lives to that date at which the species becomes extinct — the less power it will have in the forward direction, because its subjects will be so few.

There is therefore no question of a power vested in the race as a whole steadily growing as long as the race survives. The last men, far from being the heirs of power, will be of all men most subject to the dead hand of the great planners and conditioners and will themselves exercise least power upon the future.

The real picture is that of one dominant age — let us suppose the hundredth century A.D. — which resists all previous ages most successfully and dominates all subsequent ages most irresistibly, and thus is the real master of the human species. But then within this master generation (itself an infinitesimal minority of the species) the power will be exercised by a minority smaller still. Man’s conquest of Nature, if the dreams of some scientific planners are realized, means the rule of a few hundreds of men over billions upon billions of men. There neither is nor can be any simple increase of power on Man’s side. Each new power won by man is a power over man as well. Each advance leaves him weaker as well as stronger. In every victory, besides being the general who triumphs, he is also the prisoner who follows the triumphal car.

I am not yet considering whether the total result of such ambivalent victories is a good thing or a bad. I am only making clear what Man’s conquest of Nature really means and especially that final stage in the conquest, which, perhaps, is not far off. The final stage is come when Man by eugenics, by pre-natal conditioning, and by an education and propaganda based on a perfect applied psychology, has obtained full control over himself. Human nature will be the last part of Nature to surrender to Man. The battle will then be won. We shall have `taken the thread of life out of the hand of Clotho’ [one of the three Fates, the daughter of Zeus and Themis (“divine law”), who spins the thread of human life] and be henceforth free to make our species whatever we wish it to be. The battle will indeed be won. But who, precisely, will have won it?

For the power of Man to make himself what he pleases means, as we have seen, the power of some men to make other men what they please. In all ages, no doubt, nurture and instruction have, in some sense, attempted to exercise this power. But the situation to which we must look forward will be novel in two respects. In the first place, the power will be enormously increased. Hitherto the plans of educationalists have achieved very little of what they attempted and indeed, when we read them — how Plato would have every infant “a bastard nursed in a bureau”, and Elyot would have the boy see no men before the age of seven and, after that, no women, and how Locke wants children to have leaky shoes and no turn for poetry — we may well thank the beneficent obstinacy of real mothers, real nurses, and (above all) real children for preserving the human race in such sanity as it still possesses. But the man-moulders of the new age will be armed with the powers of an omnicompetent state and an irresistible scientific technique: we shall get at last a race of conditioners who really can cut out all posterity in what shape they please.

The second difference is even more important. In the older systems both the kind of man the teachers wished to produce and their motives for producing him were prescribed by the Tao — a norm to which the teachers themselves were subject and from which they claimed no liberty to depart. They did not cut men to some pattern they had chosen. They handed on what they had received: they initiated the young neophyte into the mystery of humanity which over-arched him and them alike. It was but old birds teaching young birds to fly. This will be changed.

Values are now mere natural phenomena. Judgements of value are to be produced in the pupil as part of the conditioning. Whatever Tao there is will be the product, not the motive, of education. The conditioners have been emancipated from all that. It is one more part of Nature which they have conquered. The ultimate springs of human action are no longer, for them, something given. They have surrendered — like electricity: it is the function of the Conditioners to control, not to obey them. They know how to produce conscience and decide what kind of conscience they will produce. They themselves are outside, above. For we are assuming the last stage of Man’s struggle with Nature. The final victory has been won. Human nature has been conquered — and, of course, has conquered, in whatever sense those words may now bear.

The Conditioners, then, are to choose what kind of artificial Tao they will, for their own good reasons, produce in the Human race. They are the motivators, the creators of motives. But how are they going to be motivated themselves?

For a time, perhaps, by survivals, within their own minds, of the old `natural’ Tao. Thus at first they may look upon themselves as servants and guardians of humanity and conceive that they have a `duty’ to do it `good’. But it is only by confusion that they can remain in this state. They recognize the concept of duty as the result of certain processes which they can now control. Their victory has consisted precisely in emerging from the state in which they were acted upon by those processes to the state in which they use them as tools. One of the things they now have to decide is whether they will, or will not, so condition the rest of us that we can go on having the old idea of duty and the old reactions to it. How can duty help them to decide that? Duty itself is up for trial: it cannot also be the judge. And `good’ fares no better. They know quite well how to produce a dozen different conceptions of good in us. The question is which, if any, they should produce. No conception of good can help them to decide. It is absurd to fix on one of the things they are comparing and make it the standard of comparison.

To some it will appear that I am inventing a factitious difficulty for my Conditioners. Other, more simple-minded, critics may ask, `Why should you suppose they will be such bad men?’ But I am not supposing them to be bad men. They are, rather, not men (in the old sense) at all. They are, if you like, men who have sacrificed their own share in traditional humanity in order to devote themselves to the task of deciding what `Humanity’ shall henceforth mean.

`Good’ and `bad’, applied to them, are words without content: for it is from them that the content of these words is henceforward to be derived. Nor is their difficulty factitious. We might suppose that it was possible to say `After all, most of us want more or less the same things — food and drink and sexual intercourse, amusement, art, science, and the longest possible life for individuals and for the species.

Let them simply say, This is what we happen to like, and go on to condition men in the way most likely to produce it. Where’s the trouble?’ But this will not answer. In the first place, it is false that we all really like the same things. But even if we did, what motive is to impel the Conditioners to scorn delights and live laborious days in order that we, and posterity, may have what we like? Their duty?

But that is only the Tao, which they may decide to impose on us, but which cannot be valid for them. If they accept it, then they are no longer the makers of conscience but still its subjects, and their final conquest over Nature has not really happened. The preservation of the species? But why should the species be preserved? One of the questions before them is whether this feeling for posterity (they know well how it is produced) shall be continued or not. However far they go back, or down, they can find no ground to stand on. Every motive they try to act on becomes at once petitio. It is not that they are bad men. They are not men at all. Stepping outside the Tao, they have stepped into the void. Nor are their subjects necessarily unhappy men. They are not men at all: they are artifacts. Man’s final conquest has proved to be the abolition of Man.

h1

A History of Water — Karl W. Giberson

April 18, 2012

And God said, “Let there be a dome in the midst of the waters, and let it separate the waters from the waters.” So God made the dome and separated the waters that were under the dome from the waters that were above the dome. And it was so. God called the dome Sky. And there was evening and there was morning, the second day.
And God said, “Let the waters under the sky be gathered together into one place, and let the dry land appear.” And it was so. God called the dry land Earth, and the waters that were gathered together he called Seas. And God saw that it was good.
Genesis 1:6-10

The Long And Winding Stream
The winds, the sea, and the moving tides are what they are. If there is wonder and beauty and majesty in them, science will discover these qualities. If they are not there, science cannot create them.
Rachel Carson

Water gets even more interesting when we look at the history of how it got to the earth. The story begins with the big bang, the cosmic fireball we met earlier.

Popular views of the big bang picture all the matter in the universe exploding outward like something blowing up in an action movie. This original matter then combined into the cosmic structures we find in the universe today. This picture is way too simple.

The big bang produced no matter. Only unimaginably high energies emerged from that mysterious and transcendent event. Picture the energy released as an atomic bomb explodes; now multiply this many times over. These energies were so high that matter simply could not exist. Of course, there was no such thing as matter in the universe then, so this statement is a bit odd.

A universe with no matter in it would remain quite uninteresting, but fortunately the universe was born with a set of remarkable physical laws. One of the most basic of those laws was discovered by Albert Einstein in 1905: E = mc².

This law is the best-known equation in all of science. Most people don’t know any equations at all, but if they do know one, it is E = mc². It has graced the covers of magazines, T-shirts and posters. It inspired the atomic bomb, nuclear reactors and dreams of unlimited free energy from seawater. And most important, it was the door through which matter entered our universe.
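To give the famous equation a sense of scale (a back-of-the-envelope figure supplied here for illustration, not Giberson’s own number): converting a single gram of matter entirely into energy yields

\[
E = mc^2 = (10^{-3}\ \mathrm{kg}) \times (3\times 10^{8}\ \mathrm{m/s})^{2} \approx 9\times 10^{13}\ \mathrm{J},
\]

roughly the yield of a twenty-kiloton atomic bomb.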

As the early universe expanded, it cooled, following the same laws of physics running your refrigerator. Cooling is simply the name we give to a decrease in the energy content of a region of space, whether it is your freezer, a Canadian winter or the entire universe. Any quantity of energy will have to decrease in temperature if it spreads out to fill a larger volume. This is why opening your door in the winter cools your house — some of the heat energy flows out the door, futilely trying to warm up the front yard.
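In the standard big bang picture this refrigerator analogy has an exact counterpart (a textbook relation, not spelled out in the original): the temperature of the radiation filling the universe falls in inverse proportion to its size,

\[
T \propto \frac{1}{a(t)},
\]

where \(a(t)\) is the cosmic scale factor. Double the scale of the universe and the radiation bathing it is half as hot.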

As the early universe expanded and cooled it reached critical temperatures where interesting things happened, like when water cools and freezes. If you were swimming under water that was about to freeze — hopefully in a wetsuit to keep you from also freezing — you would see ice crystals suddenly appearing, seemingly out of nowhere. Small bits of water would suddenly be transformed into slivers of ice. Liquid would have become solid. This is what water does as it cools. In the same way, as the early universe cooled, matter popped into existence.

Matter first appeared in two forms — the familiar electrons, with negative electrical charges, and the less familiar quarks with electrical charges of 2/3 and –1/3. Quarks are odd particles conceived in the 1960s to explain the peculiar behavior of other particles. One of their many odd properties is that — like teenagers at the mall — they are never found alone. As soon as they appear, they immediately combine with each other. But they don’t just combine — they form specific particles that have total electrical charges of either 1 or 0.

The most familiar examples of particles with these charges are the proton and neutron, respectively, but there are others. One curious result of this rule of combination is that we never encounter particles with fractional charges, even though we know that both protons and neutrons are composed of particles with fractional charges. In the early days, before this odd rule was understood, heroic efforts were mounted to find a fractionally charged quark hanging out by itself, but none were discovered. Eventually the theory came to include a rule precluding lone-ranger quarks.
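The bookkeeping is easy to check (a standard textbook tally, added here for concreteness): a proton is two up quarks and a down quark, a neutron one up and two downs,

\[
\text{proton } (uud):\ \tfrac{2}{3}+\tfrac{2}{3}-\tfrac{1}{3}=+1, \qquad
\text{neutron } (udd):\ \tfrac{2}{3}-\tfrac{1}{3}-\tfrac{1}{3}=0,
\]

so the fractional charges always add up to the whole-number charges we actually observe.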

After the quarks combine in the early universe, the newly minted matter consists of protons, neutrons and electrons buzzing about in a chaotic but steadily cooling mix. The particles move at great speeds but gradually slow down as the universe expands and cools. Positively charged protons attract negatively charged electrons. As soon as the speeds get low enough — which occurs at a specific temperature — the electron drops into an orbit about a proton, like a child leaping onto a spinning merry-go-round when it slows down enough. The neutrons occasionally bang into protons and stick there, forming the combination still found today in the nucleus of a hydrogen atom. The universe is now full of hydrogen atoms, with a few helium atoms leavening the mixture.

All the particles in the universe are now electrically neutral atoms — their negatively charged electrons balance their positively charged protons. The powerful electrical forces of attraction and repulsion no longer dominate, and the much weaker gravitational force takes over. The brand new hydrogen atoms float freely about but gravity gathers them ever so slowly together. Clouds of hydrogen gradually form, growing ever larger, and as they get larger they pull with more gravitational force on other atoms. Eventually much of the hydrogen is collected into huge, steadily growing clouds that surpass the size of the moon, then the earth, then a large planet like Jupiter. As the clouds get larger they become more compressed, their gravity growing ever stronger.

Nothing limits how strong gravity can become. Eventually another threshold is crossed and the hydrogen atoms become so densely compacted they actually fuse together in a nuclear reaction. This fusion ignites the gigantic balls of hydrogen and, like a slow-motion fireworks display, great spheres of hydrogen turn into stars. Unfortunately, there are no life forms in the universe to witness this extraordinary display, especially since this turns out to be a critical step in preparing the universe for life. But amazingly the images of these fireworks end up traveling for billions of years across the universe and are eventually observed, long after the events have faded into history.

The gravity within these newly born stars crushes the hydrogen nuclei, fusing them into helium nuclei and giving off great quantities of light and heat. The process begins to fill in the blanks on the periodic table of the elements. Two hydrogens make helium. Add one more and we have lithium. Two heliums make beryllium. Add another and we have carbon. Other combinations make nitrogen, oxygen, neon, sodium and on down the periodic table.
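Written as nuclear reactions (standard stellar astrophysics, supplied here for concreteness; Giberson’s sentence tracks only the proton count), the hydrogen-burning and helium-burning steps look like

\[
4\,{}^{1}\mathrm{H} \rightarrow {}^{4}\mathrm{He} + 2e^{+} + 2\nu_{e} + \text{energy}, \qquad
2\,{}^{4}\mathrm{He} \rightarrow {}^{8}\mathrm{Be}, \quad {}^{8}\mathrm{Be} + {}^{4}\mathrm{He} \rightarrow {}^{12}\mathrm{C} + \gamma,
\]

the second pair being the “two helium make beryllium, add another and we have carbon” step in the text.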

At this point the universe is billions of years old and still without an isolated drop of water anywhere. No stars have planets orbiting them, and no solid surfaces exist anywhere on which one could stand. The raw materials out of which planets and people will eventually be constructed are buried deep inside brightly shining stars, and if this were where it ended, there would be nobody to lament our brightly glowing but failed and stillborn universe. But there are more chapters to the story, as you might have anticipated, based on the simple fact that you exist.

Going Out With A Bang
Amazed, and as if astonished and stupefied, I stood still, gazing for a certain length of time with my eyes fixed intently upon it and noticing that same star placed close to the stars which antiquity attributed to Cassiopeia. When I had satisfied myself that no star of that kind had ever shone forth before, I was led into such perplexity by the unbelievability of the thing that I began to doubt the faith of my own eyes.
Tycho Brahe

Large stars near the end of their lives explode as a matter of course. With the force of a billion atomic bombs they strew their contents over unimaginably vast regions of space. It is, of course, a once-in-a-lifetime event for the star — a literal going out with a bang. And even though recorded history is just a few thousand years long — and stars live for billions of years — we have some examples of such explosions that were noted by careful observers.

In A.D. 1054 the star whose remnant is now the Crab Nebula exploded in a flash of light bright enough to be seen in daylight for weeks. Observers in Korea, China, Japan, North America and the Middle East all recorded the supernova, as it is now called, although Europeans did not. It seems that Europeans, convinced that the heavens were perfect and unchanging, managed to delude themselves into not seeing this new star, which must surely have been quite visible.

The great Danish astronomer Tycho Brahe witnessed another supernova in 1572. Like his predecessors, he could not believe that such a dramatic change in the heavens was possible, but, apparently unlike his predecessors, he had enough confidence in his observations to know that he was seeing something remarkable. Brahe’s protege, Johannes Kepler, witnessed another supernova in 1604, and then there were no more visible from earth until 1987, when a star exploded in a nearby galaxy known as the Large Magellanic Cloud.

A supernova explosion fills a massive region of space with the elements created inside the star; the powerful explosion, though, follows known laws of physics as it distributes its contents about the universe. A vast cloud of chemically enriched material, trillions of miles in diameter, results from the event — an event absolutely critical for enabling life.

The grand cloud that results from the supernova resembles the original cloud out of which the star formed in the first place, with one important difference — it contains a substantial roster of different materials, and not just hydrogen and helium. This time around gravity has more to work with, beginning again to gather the material in the huge cloud back into balls. The largest chunk at the center becomes another star — one that starts out with heavier elements, in addition to hydrogen. It is the ultimate recycling project, but unlike recycling on earth, the atoms getting recycled remain in mint condition, no matter how many times they are used.

Some of the smaller balls end up orbiting about the second-generation star. These smaller balls contain many different atoms, and some of them have a curious molecular combination of hydrogen and oxygen. In most parts of the universe these molecules are in the form of a solid. In the others they are a gas. But on balls that are exactly the right distance from the central star, the molecules are liquid, an all-purpose, seemingly magical liquid called water.

Water is found in several places in our solar system. Hydrogen is, of course, the most common element in the universe, and while oxygen is less common it is readily available to combine with hydrogen and form water. Water in the form of ice is a major component in comets and can be found in trace quantities in the atmosphere of Venus, under the surface of Mars and possibly even on some of Jupiter’s moons.

(We have to keep in mind, however, that more than 99 percent of the mass of the solar system is in the sun, so the distribution of elements elsewhere is almost irrelevant from the perspective of the solar system as a whole. The earth has a lot of water, but the earth is a tiny, insignificant speck compared to the sun. And because the water tends to cover so much of the surface, it is easy to overestimate the total amount. Astronomers are not sure exactly where the water on the earth came from. Constructing the early history of our solar system is an enormous challenge.)

From a purely scientific point of view, water is a molecule like any other — and there are lots of molecules. The laws of physics and chemistry describe its behavior, and there are no deep mysteries embedded in its familiar structure. But the laws of physics and chemistry conspire to make water unusual in ways that are critically important for life. Most peculiarly, water expands rather than contracts when it freezes. This makes ice lighter than water, so it floats. Floating ice insulates the water beneath it from the cold temperatures of winter.

Absent this layer of insulation, bodies of water all over the earth would freeze solid. If ice were heavier than water, the layer of ice that formed on the top would sink to the bottom, and another layer would freeze on top and sink, until the entire body of water was a solid piece of ice. This would kill almost every life form in the water.

Water is also an effective solvent. Waste products from our bodies dissolve readily in water and can then easily be expelled. But wait — as they say on television — there is more. Water is also a remarkable coolant capable of absorbing heat and carrying it away from our bodies in the form of sweat. And water stores heat in our bodies, helping keep us warm in cold weather. Magical.

The Gathering Of The Waters
If anyone gives even a cup of cold water to one of these little ones because he is my disciple, I tell you the truth, he will certainly not lose his reward.
MATTHEW 10:42

The creation story in Genesis records that God gathered the waters. In the King James Version that I read as a child it says, “God said, Let the waters under the heaven be gathered together unto one place, and let the dry land appear: and it was so.” In ways that the original readers of Genesis could never have imagined, the gathering of the waters was a cosmic process that took billions of years and involved all the laws of physics and chemistry. The water that we take for granted that covers so much of our planet and makes up so much of our bodies was forged in the nuclear furnace of a star that exploded in the suburbs of the Milky Way galaxy billions of years ago.

That water now cycles endlessly through the life process here on earth — cooling, cleansing and nurturing us. It irrigates our crops, nourishes our livestock, cleans our clothes and gets turned into snow at ski resorts. In those parts of the world where it is plentiful, clean and fresh, we take it for granted and play with it. In Quebec City they construct a hotel out of ice every winter to attract tourists and invite hardy souls to hold their weddings there, wearing parkas and snow boots. We think nothing of using thousands of gallons so our lawns will be green rather than brown in the heat of summer. Water is like air — plentiful and useful.

In parts of the world where fresh water is rare, its value is more apparent. There is a school in Bulawayo, Zimbabwe, where children used to walk a quarter mile during their breaks to get a drink of water. I used to walk to the hallway to get a drink when I was in school. World Vision, one of many organizations helping with water problems around the world, installed a well near the school that the children now use to get water. On school days a group of laughing, happy children can be seen working the oversized pump that takes several of them to manage. The water that emerges from its modest faucet is welcomed in ways that few North Americans can appreciate.

For those schoolchildren the water is simply a welcome part of their diet and lifestyle now. Some of the children that stay in school and go on to university will eventually discover that the precious fluid summoned from beneath the earth by a few children cranking on a lever was created billions of years ago, deep in the heart of a star, via processes of unimaginable subtlety. Those that have learned to worship God will no doubt marvel and give thanks.

Water exists because the universe has a set of laws that guide its steady development from the big bang into the present. If we suppose that water and the life it enables are of no consequence, then we can dismiss these laws as irrelevant. On the other hand, if we believe that God is the Creator of life and that life has a purpose, then these laws take on a new character. If God is the Creator, then these laws exist because God created them. And these laws work because God upholds them from moment to moment. Viewed by these lights, the origin of water and life are creation events, intentionally enabled by the Creator of the universe.

h1

Living On A Goldilocks Planet – Karl W. Giberson

April 17, 2012

A mere 20 light-years away in the constellation Libra, red dwarf star Gliese 581 has received much scrutiny by astronomers in recent years. Earthbound telescopes had detected the signatures of multiple planets orbiting the cool sun, two at least close to the system's habitable zone -- the region where an Earth-like planet can have liquid water on its surface. Now a team headed by Steven Vogt (UCO Lick) and Paul Butler (DTM Carnegie Inst.) has announced the detection of another planet, this one squarely in the system's habitable zone. Based on 11 years of data, their work offers a very compelling case for the first potentially habitable planet found around a very nearby star. Shown in this artist's illustration of the inner part of the exoplanetary system, the planet is designated Gliese 581g, but Vogt's more personal name is Zarmina's World, after his wife. The best fit to the data indicates the planet has a circular 37 day orbit, an orbital radius of only 0.15 AU, and a mass 3.1 times the Earth's. Modeling includes estimates of a planet radius of 1.5, and gravity at the planet's surface of 1.1 to 1.7 in Earth units. Finding a habitable planet so close by suggests there are many others in our Milky Way galaxy.

God is infinite, so His universe must be too. Thus is the excellence of God magnified and the greatness of His kingdom made manifest; He is glorified not in one, but in countless suns; not in a single earth, a single world, but in a thousand thousand, I say in an infinity of worlds.
Giordano Bruno, 1582

Giordano Bruno (1548 – February 17, 1600; Latin: Iordanus Brunus Nolanus) was an Italian Dominican friar, philosopher, mathematician and astronomer. His cosmological theories went beyond the Copernican model in proposing that the Sun was essentially a star, and moreover, that the universe contained an infinite number of inhabited worlds populated by other intelligent beings.

***********************************

About twenty light years — 120 trillion miles — from earth, in the constellation Libra, a planet named Gliese 581g orbits a star resembling our sun. It’s the fourth planet out from the star, which can only be seen from earth with a telescope. Not far from the middle of its solar system, comfortably situated in what astronomers call the “Goldilocks Zone,” the planet is not too hot and not too cold, but “just right,” like the porridge in the fairy tale. Its gravity is also not too strong and not too weak, so it could have a stable atmosphere like earth. And its star is not too bright and not too dim.
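The round number is easy to verify (my own arithmetic, not Giberson’s): a light year is about 5.88 trillion miles, so

\[
20\ \mathrm{light\ years} \times 5.88\times 10^{12}\ \tfrac{\mathrm{miles}}{\mathrm{light\ year}} \approx 1.2\times 10^{14}\ \mathrm{miles},
\]

which is the 120 trillion miles quoted above.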

Gliese 581g orbits about its sun in the middle of what astronomers call the habitable zone. Five hundred other planets have been discovered to date outside our solar system, and this is the first one that might be habitable. It is also nearby — at least by astronomical standards — being located in our galactic neighborhood. A huge spaceship traveling over the lifetime of many generations of astronauts could conceivably get there, although the expense would be so great as to essentially render the project impossible.

Astronomers have waxed eloquent about Gliese 581g. One of the discoverers called it “Zarmina’s World,” after his wife, convinced that such a “beautiful planet” deserves a more interesting name than Gliese 581g. A Penn State astronomer, enthused at the prospects of extraterrestrial life, says Zarmina’s World is the “first one I’m truly excited about.” After decades of finding uninhabitable, sterile orbs, this discovery has finally provided a license to think seriously — or at least scientifically — about the prospects that we are not alone in the universe.

The hypothetical citizens of Zarmina’s World have already been embraced by the Catholic Church as “children of God.” [This curious fact was left unfootnoted by Giberson and a Google search didn’t reveal his source for the statement. Unless of course “official Vatican astronomers” double as the Magisterium… Ah well, makes for good reading by the uneducated secularists in our midst.] An official Vatican astronomer, Jesuit priest Jose Gabriel Funes, finds nothing surprising in the prospects of extraterrestrial life: “Just as there is a multiplicity of creatures on Earth, so there could be other beings created by God.” Another Vatican astronomer-priest assures us that the Zarminians would have souls, and says he would be happy to baptize them, if they asked.

Theologically conservative Protestant Ken Ham, head of the creationist organization Answers in Genesis, disagrees. He claims that the Vatican astronomers’ offers to baptize the Zarminians show that they “can’t truly understand the gospel.” “The Bible,” says Ham, “makes clear that Adam’s sin affected the whole universe. This means that aliens would also be affected by Adam’s sin, but they can’t have salvation…. [T]o suggest that aliens could respond to the gospel is just totally wrong.”

All this fussing and fretting about aliens might lead one to believe that some sort of signal had been received — an unmistakably intelligent message like the one Jodie Foster’s character, astronomer Ellie Arroway, deciphered in the movie Contact. The great distance to the planet rules out the possibility of actual alien Zarminians being among us, but a mere twenty light years is no barrier to radio transmission. If the Zarminians had started broadcasting messages to earth twenty years ago, or even generically in all directions, we would be receiving them by now. Radio waves have, in fact, been emanating from earth in all directions for almost a century and could be detected by any extraterrestrial civilization with the appropriate technology. But we are receiving no radio messages from 581g or any other planet in the universe. So why all the excitement about the Zarminians?

Hope Springs Eternal
Our Moon exists for us on the earth, not for the other globes. Those four little moons exist for Jupiter, not for us…. From this line of reason we deduce with the highest degree of probability that Jupiter is inhabited.
Johannes Kepler

Zarmina’s World, as near as we can tell, is not like the earth. Astronomers have not “beamed down” on a planet with breathable air, familiar gravity and comfortable temperatures, as Captain Kirk and the crew of the Starship Enterprise were constantly doing on Star Trek. We now know that the vast majority of planets are nothing like those convenient Hollywood fantasies.

Zarmina’s World is three times the mass of the earth but only slightly larger, so gravity would be much stronger there, due to the greatly increased density. Upon being beamed onto that surface Captain Kirk would find himself weighing over 500 pounds, posing challenges for his trademark brawls with the local aliens. In fact, he would have trouble even standing upright, seriously compromising the charismatic persona that always seemed so appealing to the local alien females.
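The weight estimate follows from the ordinary surface-gravity scaling (a sketch using only the figures given in this article, not any new data):

\[
\frac{g_{\mathrm{planet}}}{g_{\mathrm{earth}}} = \frac{M/M_{\oplus}}{(R/R_{\oplus})^{2}}.
\]

With the caption’s values of roughly 3.1 earth masses and 1.5 earth radii this gives about 1.4 times earth gravity; with Giberson’s assumption of a radius only slightly larger than earth’s, the ratio climbs toward 2.5, which is what turns a 200-pound Captain Kirk into a 500-pound one.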

Zarmina’s World is much closer to its star than our earth is to the sun: 14 million miles compared to 93 million for the earth. Its “year” is just 37 days long. It rotates so slowly that one side almost always faces the sun, creating temperatures as hot as 160 degrees — beyond even the most dreadful spots on the surface of the earth. The dark side is like the Canadian winters I enjoyed as a boy: -25°F (that’s below zero!). In the literal twilight zone between the unbearable heat and the Canadian cold would be some pleasant temperatures, where creatures like us could certainly make ourselves at home. Zarminians, if they exist, would have to move every so often as the planet slowly turned, to remain in the temperate zone where water could easily be maintained in liquid form.
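Those orbital figures hang together under Kepler’s third law (a consistency check added here, using only the numbers already quoted in this article): with the period in years, the orbital radius in astronomical units, and the star’s mass in solar masses,

\[
M_{\star} \approx \frac{a^{3}}{P^{2}} = \frac{(0.15)^{3}}{(37/365.25)^{2}} \approx 0.33\ M_{\odot},
\]

about a third of the sun’s mass, which is just what one would expect for a dim red dwarf.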

The hopeful, even confident, speculations that there might be life on Zarmina’s World reveal just how eager astronomers — and many other members of our species — are to discover that we are not the only life in this big universe. Vogt, co-discoverer of the planet, and the earthling Zarmina’s husband — Mr. Zarmina — believes that “chances for life on this planet are 100 percent.” Vogt’s speculation, alas, is one part science and ninety-nine parts wishful thinking: Zarmina’s World has some surface area between 32°F and 212°F (0°-100° C). So, in the event that water exists in those regions — which we don’t know — it would be liquid. And water is essential for life. Therefore, there could be life on Zarmina’s World — which is more than you can say for the hundreds of other planets that have been discovered outside our solar system.

We don’t know if Zarmina’s World actually has any water, but the chances are reasonable based on what we know about water in the universe in general. Whether that water has contributed to the formation of life is an entirely different question. What these speculations about life in Zarminian waters remind us is how critically important and unusual our water supply is here on the earth — a cosmic quirk that we take for granted. There is an inextricable link between liquid water and life, both here on the earth and anywhere else in the universe we hope life might exist.

Water, Water, Everywhere — Or Not
And there sat Sam, looking cool and calm, in the heart of the furnace roar; And he wore a smile you could see a mile, and he said: “Please close that door. It’s fine in here, but I greatly fear you’ll let in the cold and storm — Since I left Plumtree, down in Tennessee, it’s the first time I’ve been warm.”
Robert Service, The Cremation of Sam McGee

Almost three-fourths of the surface of the earth is covered with water, and virtually all of the world’s cities are on a body of water. Most people live near rivers, lakes and oceans. And water even makes up 60 percent of the human body, a fact readily apparent when one is sweating in the hot sun or desperately thirsty.

Water, in many parts of the world, seems almost magically available. It pours from our taps on demand, falls from the sky, bubbles up in springs, cascades down the sides of mountains and over cliffs. We swim in it, bathe in it, run it through hoses to water our lawns or entertain our children. We make ice from it to put in our drinks. We skate on it. Even beavers use it freely and recklessly, creating gigantic ponds in which to raise their families. In those many parts of the world blessed with an abundance of water, we take it for granted.

In the larger universe, however, water is rare. In some ways the universe seems so inhospitable to liquid water that one might infer that water is not welcome.

For starters, the temperatures don’t cooperate. All but an insignificant fraction of the volume of space is essentially empty. The volume taken up by stars, planets, moons, comets and other bodies where water might possibly be found is quite insignificant. And all this empty space is cold — really cold.

Growing up in Canada I learned a lot about cold. In the midst of winter, during my teen years, I arose before dawn to deliver my village’s only daily newspaper, the Telegraph Journal. The thermometer outside our kitchen window was a stark and skinny messenger framed against the darkness, feebly illuminated by light from inside the house. The mercury on many mornings all but vanished into the little ball at the bottom of the thermometer, with temperatures reading -40°F (Canada had not yet gone metric). The weather report on the radio would warn that additional chilling from the wind had reduced that temperature even further, sometimes to more than -60°F. Dressed warmly by my thoroughly Canadian mother and with one of her hand-knit woolen scarves about my mouth, I would head out into the pitch-black frigid morning to deliver the news to the good citizens of the little village of Bath, New Brunswick. I would return an hour later, an icicle several inches long hanging from the scarf in front of my mouth, where my breath had condensed and frozen in the cold air.

A decade later I found myself studying at Rice University in Houston, Texas, where thermometers had no need for negative numbers. I arrived in the middle of August and was greeted by temperatures that routinely exceeded 100 degrees, a dreadful situation made even worse by high humidity and requiring the continuous use of air conditioners. In between the extremes of New Brunswick and Texas lie the narrow temperatures that humans enjoy — 85 degrees at the beach, 72 in our offices, 65 on a pleasant evening as we turn in for the night.

The temperature ranges experienced by humans seem extreme but that is simply our limited and parochial view. Those cold temperatures that greeted me as I headed out on frosty Canadian mornings are positively balmy compared to the average temperature of the universe, which is more than 400 degrees cooler. If you took a space voyage to another star system, the temperature outside your window for most of the long journey would be -454°F. A cold Canadian winter would be a welcome relief from such unimaginable cold. On the other hand, the temperature on the stars runs as high as 70,000 degrees, an inferno capable of melting just about anything. You would be incinerated just by getting too close, never mind actually making physical contact.
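For readers who want the conversion (my own arithmetic, not Giberson’s):

\[
T_{\mathrm{K}} = \tfrac{5}{9}\,\bigl(T_{^{\circ}\mathrm{F}} + 459.67\bigr), \qquad -454\,^{\circ}\mathrm{F} \approx 3\ \mathrm{K},
\]

only about three degrees above absolute zero, which is indeed the temperature of the leftover radiation that fills deep space.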

The temperature range where humans feel comfortable is thus extremely narrow compared to the universe as a whole. And even the larger range where humans can exist, the habitable zone, is very narrow.

Water seems even more remarkable when we note that only 5 percent of the total matter in the universe is the ordinary familiar stuff made up of atoms and molecules. The other 95 percent consists of largely unknown stuff called, for lack of better terms, dark matter and dark energy. All the elements on the chemists’ periodic table, all the vast collection of atoms and molecules that comprise the earth, the sun and the other planets, all the stars in the Milky Way galaxy — all this matter is less than 5 percent of the total stuff in the universe. And this small percentage is itself composed almost entirely of hydrogen, with water making up but a small fraction. Water thus comprises much less than 1 percent of the universe. Given that water accounts for two-thirds of the matter in our bodies, we can see that we are most unusual from a purely chemical point of view, not to mention our more remarkable characteristics.

h1

Learning To See The Universe – Karl Giberson

April 11, 2012

Galileo Galilei Showing the Doge of Venice How to Use the Telescope, by Giuseppe Bertini, 1858

We are in the position of a little child entering a huge library filled with books in many different languages. The child knows someone must have written those books. It does not know how. It does not understand the languages in which they are written. The child dimly suspects a mysterious order in the arrangement of the books but doesn’t know what it is. That, it seems to me, is the attitude of even the most intelligent human being toward God. We see a universe marvelously arranged and obeying certain laws, but only dimly understand these laws. Our limited minds cannot grasp the mysterious force that moves the constellations.
Albert Einstein

***************************************

The Dutchman Hans Lippershey invented the telescope in 1608. He owed his “aha” moment, at least according to legend, to children playing with lenses in his shop, where he made spectacles. The children were playing with pieces of the glass that Lippershey so painstakingly and precisely ground into lenses for his visually impaired customers. The children noticed that a weather vane on a nearby church looked larger when viewed through a pair of lenses. Intrigued by the children’s discovery, Lippershey installed lenses in a tube and invented what he called a “looker.” Shortly after, he applied for a patent for his looker.

The patent office turned down his application on the grounds that the device was so simple that its workings could hardly remain secret. They were right. After all, it had been discovered by children. A year later, the great Italian scientist Galileo Galilei heard a vague description of the device and built his own looker. His first feeble attempt magnified objects by a mere factor of three. With some effort he improved the performance until the magnification was around nine times, and got rich in the process.

On August 21, 1609, Galileo showed off his supposedly original invention to Venetian political leaders, including the chief magistrate — called the “Doge” — Leonardo Donato. The demonstration took place in the bell tower of Saint Mark’s cathedral, from which one could look in any direction. Galileo’s impressive performance got him named professor at the University of Padua, with a generous pension of one thousand florins a year. (Wikimedia Commons)

In late August 1609 Galileo, then a professor at the University of Padua in the Venetian Republic, led some senators up a tower in Venice so they could look out to sea with his new spyglass. The senators assumed he had invented the remarkable device and were suitably impressed. Galileo’s “optical tube,” as they called it, enabled them to “discover at a much greater distance than usual the hulls and sails of the enemy, so that for two hours or more we can detect him before he detects us.” As the “inventor” of the amazing instrument, Galileo got a big raise and tenure.

Personal gain, although of interest to Galileo, was not his primary interest in the telescope. He wanted a closer look at the heavens in the hopes of seeing something there that would prove that the earth was going around the sun and not vice versa. Galileo was convinced that evidence must be there, somewhere, to establish the motion of the earth, as the great Polish thinker, Nicolaus Copernicus, had proposed in his book, On the Revolutions of the Heavenly Spheres, published in 1543.

The new sun-centered model of the solar system had captured the imagination of some leading astronomers attracted to its simplicity. For decades, however, fans of the new model had been looking for some observational evidence that the earth was moving. But none had been discovered, which was puzzling. It seemed incomprehensible that the earth could be hurtling through space at seventy thousand miles per hour without some evidence that it was doing so.
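That figure is a quick back-of-the-envelope calculation (added here for the curious; it is not in the original): the earth travels the circumference of its orbit, roughly 93 million miles in radius, once a year, so

\[
v \approx \frac{2\pi \times 9.3\times 10^{7}\ \mathrm{miles}}{8766\ \mathrm{hours}} \approx 6.7\times 10^{4}\ \mathrm{miles\ per\ hour},
\]

close enough to the seventy thousand miles per hour that so troubled Copernicus’s early supporters.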

Galileo, like any first-time user of a telescope, looked first at the moon. He was startled to find it “rough and uneven” with “huge prominences, deep valleys and chasms.” This ran counter to the prevailing view that bodies in the heavens were all perfectly spherical and composed of some perfect ethereal material not present on earth. The moon, Galileo would report, contradicting a two-thousand-year-old tradition going back to Aristotle, was clearly not “robed in a smooth and polished surface.” It looked much like the earth, in fact, undermining the standard view that the heavens were profoundly different than the earth, which was located at the center of the universe.

Galileo’s innocent observation was quite radical in the first decade of the seventeenth century. The prevailing astronomical tradition had long taught that the heavens were perfect and unchanging, in contrast to the earth, which seemed in constant upheaval. This claim derived from the simple observation that change was almost never observed in the heavens. Christian theology, inspired by this pagan Greek idea, had interpreted the consistency of the night sky in terms of sin and the fall.

Adam’s sin had corrupted the earth but not the heavens, so the heavens — including the pattern of stars — were still in the same perfect state that God had originally created. Hell, being at the center of the earth, was also in the center of the universe, the worst spot in God’s creation, of course, and as far away from the heavens as one could get. As one moved outward from the God-forsaken chaos at the center, things improved. The orbit of the moon was the boundary between the earthly and heavenly realm. The moon was in the heavens, beyond the sinister reach of the curse God had inaugurated in response to Adam’s sin. And yet it was not perfectly spherical. Why did it look so much like the earth if it was a heavenly body?

Early in January 1610, four wandering “stars” entered the field of view of Galileo’s telescope, each of them always close to Jupiter. Galileo grew excited as he came to realize these wandering stars were moons orbiting Jupiter, proving that not everything revolved around the earth. This undermined the notion that the earth was somehow the central — although most corrupted — point of the creation. Europe’s other great astronomer, Johannes Kepler, was thrilled with the discovery and went so far as to say, “Our Moon exists for us on the Earth, not for the other globes. Those four little moons exist for Jupiter, not for us…. From this line of reason we deduce with the highest degree of probability that Jupiter is inhabited.”

Galileo also found a way to look at the sun and located odd dark spots on it that came and went. Like the moon, the sun had a distinctly non-heavenly complexion. None of these discoveries provided the whiz-bang proof that the earth was in motion, but they certainly undermined other prevailing views. His more traditional contemporaries, however, were skeptical. Telescopes were rare, so most people couldn’t check for themselves. And those who had telescopes found them so hard to operate that they couldn’t always see what Galileo claimed he saw. Gradually, though, skepticism gave way to grudging respect as Galileo’s discoveries — which he lorded over his peers as evidence of his superior intellect — were confirmed by others.

Galileo’s star rose steadily in the firmament of Italian science as his discoveries became widely known. In less than two decades, however, his rising star would sputter and plummet back to earth in his celebrated confrontation with the Roman Catholic Church.

In the last days of 1612 Galileo recorded in his notebook a “fixed star” that he observed near Jupiter. (The star was much farther away, of course, but it was next to Jupiter visually, like the moons.) Five days later he noted another new star near Jupiter. On January 28, 1613, he again noted two stars near Jupiter. The first one was just another star, now noted in star catalogs with the exciting name SAO 119234. The second one would prove more interesting, although it would take 250 years to realize just how interesting when it turned out to be the planet now named Neptune.

Wobbles
I do not feel obligated to believe that the same God who has endowed us with sense, reason, and intellect has intended us to forgo their use.
GALILEO

In 1633 an elderly Galileo found himself kneeling before the Inquisition and recanting his long-held belief that the earth moves about the sun. Despite his best intentions — and many promises to his colleagues and critics — his telescope had failed to turn up compelling evidence for the radical idea that the earth moves.

Contrary to the widespread perception that the church was closed-minded and resistant to scientific ideas, the truth is that Galileo simply did not have any solid observational evidence. And based on both common sense and the best scientific understanding of the day, a moving earth should produce some noticeable effects. Many of Galileo’s contemporaries — themselves astronomers and mathematicians — considered Copernicus’s idea of a moving earth to be ridiculous for reasons that had nothing to do with the Bible or theology. And many of them agreed with Galileo about the motion of the earth, but believed the idea needed further development before it could be presented with any hope of being accepted. Galileo was far from the only Copernican of his generation, but he was the only one campaigning to change everybody’s mind.

In 1597 Galileo received a copy of Johannes Kepler’s book The Cosmic Mystery, which argued in favor of the sun-centered universe. Kepler was in many ways Galileo’s Protestant counterpart and his only real peer in the pantheon of European astronomers. Galileo responded cordially to this first overture from Kepler, expressing his appreciation for a new “associate in the study of Truth who is a friend of Truth.”

He went on to explain how he had been arguing quietly in favor of Copernicus for years and had “written many arguments in support of him and in refutation of the opposite view.” But he feared ridicule and had not “dared to bring into the public light, frightened by the fate of Copernicus himself, our teacher who, though he acquired immortal fame with some, is yet to remain to an infinite number of others (for such is the number of fools) an object of ridicule and derision.” The word fools, unfortunately, was often on Galileo’s lips as he enthusiastically ridiculed those who disagreed with him.

Galileo’s advocacy for Copernicanism grew with each passing year, despite his consistent failure to find the evidence he promised. He became bolder and more aggressive. His fame spread across the continent and he grew steadily richer, with increasingly lucrative academic postings and endless sales of telescopes. Gifted at debate and self-promotion, he steadily climbed the Italian social ladder, to the envy of his colleagues. He made enemies and backed many of them into corners from which they could do nothing but seethe and look for an opportunity to get even.

Some more cool-headed Jesuit astronomers were quietly teaching Copernican astronomy in Catholic universities, and, had Galileo not turned the motion of the earth into a political controversy, their diplomatic approach would probably have carried the day and avoided what became a great humiliation to the church. As it was, they were quite frustrated that Galileo’s bombastic personal style got Copernicanism declared heretical and his book listed on the Index of Prohibited Books that good Catholics were not supposed to read.

The motion of the earth that we accept without a second thought today was troubling and without much support — scientific or otherwise — in the seventeenth century. It flew solidly in the face of a two-thousand-year tradition; there wasn’t a single piece of observational data establishing it as true; and it removed the earth from the center, where Christian theology thought it belonged, albeit in abject humiliation. Nevertheless, despite all these challenges, many Christians were slowly coming around to the new astronomy, and had Galileo been more diplomatic there would have been no need for the great and celebrated confrontation between science and religion.

After recanting his heresies in 1633, Galileo spent the rest of his life in a comfortable apartment in Florence, under house arrest and forbidden to explore any longer the idea that the earth goes around the sun. He died in 1642.

Galileo’s celebrated trial before the Inquisition has acquired a mythical status in our secular culture. Paintings such as Galileo Facing the Roman Inquisition by Cristiano Banti (1857), plays like Bertolt Brecht’s Galileo (1940), and even public television’s documentary Galileo’s Battle for the Heavens (2002) have portrayed Galileo as a great hero standing up to a backward and superstitious church. Urban legends report that Galileo was imprisoned and tortured, neither of which is true. Scholars who have examined the Galileo case argue that these portrayals are oversimplified. He was not tortured; the closest he came to imprisonment was house arrest in a luxury apartment; and there is ample evidence that the Italian political scene, over against the church, played a major role in his condemnation.

The physicist Stephen Barr continues the discussion in a post dating from 2010. The secular scientists have long since rewritten this history, and it’s fun to see them get a little comeuppance. To think that PBS was still circulating this crap in 2002 is laughable…

h1

Reclaiming a Sense of the Sacred II By Marilynne Robinson

March 13, 2012

Marilynne Robinson was raised as a Presbyterian and later became a Congregationalist, worshipping and sometimes preaching at the Congregational United Church of Christ in Iowa City. Her Congregationalism, and her interest in the ideas of John Calvin, have been important in her works, including Gilead, which centers on the life and theological concerns of a fictional Congregationalist minister.

We are much afflicted now by tedious, fruitless controversy. Very often, perhaps typically, the most important aspect of a controversy is not the area of disagreement but the hardening of agreement, the tacit granting on all sides of assumptions that ought not to be granted on any side. The treatment of the physical as a distinct category antithetical to the spiritual is one example. There is a deeply rooted notion that the material exists in opposition to the spiritual, precludes or repels or trumps the sacred as an idea. This dichotomy goes back at least to the dualism of the Manichees, who believed the physical world was the creation of an evil god in perpetual conflict with a good god, and to related teachings within Christianity that encouraged mortification of the flesh, renunciation of the world, and so on.

For almost as long as there has been science in the West, there has been a significant strain in scientific thought which assumed that the physical and material preclude the spiritual. The assumption persists among us still, vigorous as ever, that if a thing can be “explained,” associated with a physical process, it has been excluded from the category of the spiritual. But the “physical” in this sense is only a disappearingly thin slice of being, selected, for our purposes, out of the totality of being by the fact that we perceive it as solid, substantial.

We all know that if we were the size of atoms, chairs and tables would appear to us as loose clouds of energy. It seems to me very amazing that the arbitrarily selected “physical” world we inhabit is coherent and lawful. An older vocabulary would offer the word “miraculous.” Knowing what we know now, an earlier generation might see divine providence in the fact of a world coherent enough to be experienced by us as complete in itself, and as a basis upon which all claims to reality can be tested. A truly theological age would see in this divine providence intent on making a human habitation within the wild roar of the cosmos.

But almost everyone, for generations now, has insisted on a sharp distinction between the physical and the spiritual. So we have had theologies that really proposed a “God of the gaps,” as if God were not manifest in the creation, as the Bible is so inclined to insist, but instead survives in those dark places, those black boxes, where the light of science has not yet shone. And we have atheisms and agnosticisms that make precisely the same argument, only assuming that at some time the light of science will indeed dispel the last shadow in which the holy might have been thought to linger.

Religious experience is said to be associated with activity in a particular part of the brain. For some reason this is supposed to imply that it is delusional. But all thought and experience can be located in some part of the brain, that brain more replete than the starry heaven God showed to Abraham, and we are not in the habit of assuming that it is all delusional on these grounds. Nothing could justify this reasoning, which many religious people take as seriously as any atheist could do, except the idea that the physical and the spiritual cannot abide together, that they cannot be one dispensation.

We live in a time when many religious people feel fiercely threatened by science. O ye of little faith. Let them subscribe to Scientific American for a year and then tell me if their sense of the grandeur of God is not greatly enlarged by what they have learned from it. Of course many of the articles reflect the assumption at the root of many problems, that an account, however tentative, of some structure of the cosmos or some transaction of the nervous system successfully claims that part of reality for secularism. Those who encourage a fear of science are actually saying the same thing. If the old, untenable dualism is put aside, we are instructed in the endless brilliance of creation. Surely to do this is a privilege of modern life for which we should all be grateful.

For years I have been interested in ancient literature and religion. If they are not one and the same, certainly neither is imaginable without the other. Indeed, literature and religion seem to have come into being together, if by literature I can be understood to include pre-literature, narrative whose purpose is to put human life, causality, and meaning in relation, to make each of them in some degree intelligible in terms of the other two.

I was taught, more or less, that we moderns had discovered other religions with narratives resembling our own, and that this discovery had brought all religion down to the level of anthropology. Sky gods and earth gods presiding over survival and procreation. Humankind pushing a lever in the hope of a periodic reward in the form of rain or victory in the next tribal skirmish. From a very simple understanding of what religion has been, we can extrapolate to what religion is now and is intrinsically, so the theory goes. This pattern, of proceeding from presumed simplicity to a degree of elaboration that never loses the primary character of simplicity, is strongly recurrent in modern thought.

I think much religious thought has also been intimidated by this supposed discovery, which is odd, since it certainly was not news to Paul, or Augustine, or Thomas Aquinas, or Calvin. All of them quote the pagans with admiration. Perhaps only in Europe was one form of religion ever so dominant that the fact of other forms could constitute any sort of problem. There has been an influential modern tendency to make a sort of slurry of religious narratives, asserting the discovery of universals that don’t actually exist among them. Mircea Eliade is a prominent example. And there is Joseph Campbell. My primary criticism of this kind of scholarship is that it does not bear scrutiny.

A secondary criticism I would offer is that it erases all evidence that religion has, anywhere and in any form, expressed or stimulated thought. In any case, the anthropological bias among these writers, which may make it seem free of all parochialism, is in fact absolutely Western, since it regards all religion as human beings acting out their nature and no more than that, though I admit there is a gauziness about this worldview to which I will not attempt to do justice here.

This is the anthropologists’ answer to the question, why are people almost always, almost everywhere, religious. Another answer, favored by those who claim to be defenders of science, is that religion formed around the desire to explain what pre-scientific humankind could not account for. Again, this notion does not bear scrutiny. The literatures of antiquity are clearly about other business.

Some of these narratives are so ancient that they clearly existed before writing, though no doubt in the forms we have them they were modified in being written down. Their importance in the development of human culture cannot be overstated. In antiquity people lived in complex city-states, carried out the work and planning required by primitive agriculture, built ships and navigated at great distances, traded, made law, waged war, and kept the records of their dynasties. But the one thing that seems to have predominated, to have laid out their cities and filled them with temples and monuments, to have established their identities and their cultural boundaries, to have governed their calendars and enthroned their kings, was the vivid, atemporal stories they told themselves about the gods, the gods in relation to humankind, to their city, to themselves.

I suppose it was in the 18th century of our era that the notion became solidly fixed in the Western mind that all this narrative was an attempt at explaining what science would one day explain truly and finally. Phoebus drives his chariot across the sky, and so the sun rises and sets. Marduk slays the sea monster Tiamat, who weeps, whence the Tigris and the Euphrates. It is true that in some cases physical reality is accounted for, or at least described, in the terms of these myths.

But the beauty of the myths is not accounted for by this theory, nor is the fact that, in literary forms, they had a hold on the imaginations of the populations that embraced them which expressed itself again as beauty. Over time these narratives had at least as profound an effect on architecture and the visual arts as they did on literature. Anecdotes from them were painted and sculpted everywhere, even on household goods, vases, and drinking cups.

This kind of imaginative engagement bears no resemblance whatever to an assimilation of explanatory models by these civilizations. Perhaps the tendency to think of classical religion as an effort at explaining a world otherwise incomprehensible to them encourages us to forget how sophisticated ancient people really were. They were inevitably as immersed in the realm of the practical as we are. It is strangely easy to forget that they were capable of complex engineering, though so many of their monuments still stand. The Babylonians used quadratic equations.

Yet in many instances ancient people seem to have obscured highly available real-world accounts of things. A sculptor would take an oath that the gods had made an idol, after he himself had made it. The gods were credited with walls and ziggurats, when cities themselves built them. Structures of enormous shaped stones went up in broad daylight in ancient cities, the walls built around the Temple by Herod in Roman-occupied Jerusalem being one example. The ancients knew, though we don’t know, how this was done, obviously. But they left no account of it. This very remarkable evasion of the law of gravity was seemingly not of great interest to them. It was the gods themselves who walled in Troy.

In Virgil’s Aeneid, in which the poet in effect interprets the ancient Greek epic tradition by attempting to renew it in the Latin language and for Roman purposes, there is one especially famous moment. The hero, Aeneas, a Trojan who has escaped the destruction of his city, sees a painting in Carthage of the war at Troy and is deeply moved by it and by what it evokes, the lacrimae rerum, the tears in things. This moment certainly refers to the place in classical civilization of art that pondered and interpreted the Homeric narratives, which were the basis of Greek and Roman religion. My point here is simply that pagan myth, which the Bible in various ways acknowledges as analogous to biblical narrative despite grave defects, is not a naïve attempt at science.

It is true that almost a millennium separated Homer and Virgil. It is also true that through those centuries the classical civilizations had explored and interpreted their myths continuously. Aeschylus, Sophocles, and Euripides would surely have agreed with Virgil’s Aeneas that the epics and the stories that surround them and flow from them are indeed about lacrimae rerum, about a great sadness that pervades human life. The Babylonian Epic of Gilgamesh is about the inevitability of death and loss. This is not the kind of language, nor is it the kind of preoccupation, one would find in a tradition of narrative that had any significant interest in explaining how the leopard got his spots.

The notion that religion is intrinsically a crude explanatory strategy that should be dispelled and supplanted by science is based on a highly selective or tendentious reading of the literatures of religion. In some cases it is certainly fair to conclude that it is based on no reading of them at all. Be that as it may, the effect of this idea, which is very broadly assumed to be true, is again to reinforce the notion that science and religion are struggling for possession of a single piece of turf, and science holds the high ground and gets to choose the weapons.

In fact there is no moment in which, no perspective from which, science as science can regard human life and say that there is a beautiful, terrible mystery in it all, a great pathos. Art, music, and religion tell us that. And what they tell us is true, not after the fashion of a magisterium that is legitimate only so long as it does not overlap the autonomous republic of science. It is true because it takes account of the universal variable, human nature, which shapes everything it touches, science as surely and profoundly as anything else. And it is true in the tentative, suggestive, ambivalent, self-contradictory style of the testimony of a hundred thousand witnesses, who might, taken all together, agree on no more than the shared sense that something of great moment has happened, is happening, will happen, here and among us.

I hasten to add that science is a great contributor to what is beautiful and also terrible in human existence. For example, I am deeply grateful to have lived in the era of cosmic exploration. I am thrilled by those photographs of deep space, as many of us are. Still, if it is true, as they are saying now, that bacteria return from space a great deal more virulent than they were when they entered it, it is not difficult to imagine that some regrettable consequence might follow our sending people to tinker around up there. One article noted that a human being is full of bacteria, and there is nothing to be done about it.

Science might note with great care and precision how a new pathology emerged through this wholly unforeseen impact of space on our biosphere, but it could not, scientifically, absorb the fact of it and the origin of it into any larger frame of meaning. Scientists might mention the law of unintended consequences — mention it softly, because that would sound a little flippant in the circumstances. But religion would recognize in it what religion has always known, that there is a mystery in human nature and in human assertions of brilliance and intention, a recoil the Greeks would have called irony and attributed to some angry whim of the gods, to be interpreted as a rebuke of human pride if it could be interpreted at all.

Christian theology has spoken of human limitation, fallen-ness, an individually and collectively disastrous bias toward error. I think we all know that the earth might be reaching the end of its tolerance for our presumptions. We all know we might at any time feel the force of unintended consequences, many times compounded. Science has no language to account for the fact that it may well overwhelm itself, and more and more stand helpless before its own effects.

Of course science must not be judged by the claims certain of its proponents have made for it. It is not in fact a standard of reasonableness or truth or objectivity. It is human, and has always been one strategy among others in the more general project of human self-awareness and self-assertion. Our problem with ourselves, which is much larger and vastly older than science, has by no means gone into abeyance since we learned to make penicillin or to split the atom.

If antibiotics have been used without sufficient care and have pushed the evolution of bacteria beyond the reach of their own effectiveness, if nuclear fission has become a threat to us all in the insidious form of a disgruntled stranger with a suitcase, a rebuke to every illusion of safety we entertained under fine names like Strategic Defense Initiative, old Homer might say, “the will of Zeus was moving toward its end.” Shakespeare might say, “There is a destiny that shapes our ends, rough-hew them how we will.”

The tendency of the schools of thought that have claimed to be most impressed by science has been to deny the legitimacy of the kind of statement it cannot make, the kind of exploration it cannot make. And yet science itself has been profoundly shaped by that larger bias toward irony, toward error, which has been the subject of religious thought since the emergence of the stories in Genesis that tell us we were given a lavishly beautiful world and are somehow, by our nature, complicit in its decline, its ruin. Science cannot think analogically, though this kind of thinking is very useful for making sense and meaning out of the tumult of human affairs.

We have given ourselves many lessons in the perils of being half right, yet I doubt we have learned a thing. Sophocles could tell us about this, or the book of Job. We all know about hubris. We know that pride goeth before a fall. The problem is that we don’t recognize pride or hubris in ourselves, any more than Oedipus did, any more than Job’s so-called comforters. It can be so innocuous-seeming a thing as confidence that one is right, is competent, is clear-sighted, or confidence that one is pious or pure in one’s motives.

As the disciples said, “Who then can be saved?” Jesus replied, “With men this is impossible, but with God all things are possible,” in this case speaking of the salvation of the pious rich. It is his consistent teaching that the comfortable, the confident, the pious stand in special need of the intervention of grace. Perhaps this is true because they are most vulnerable to error — like the young rich man who makes the astonishing decision to turn his back on Jesus’ invitation to follow him, therefore on the salvation he sought — although there is another turn in the story, and we learn that Jesus will not condemn him. I suspect Jesus should be thought of as smiling at the irony of the young man’s self-defeat — from which, since he is Jesus, he is also ready to rescue him ultimately.

The Christian narrative tells us that we individually and we as a world turn our backs on what is true, essential, wholly to be desired. And it tells us that we can both know this about ourselves and forgive it in ourselves and one another, within the limits of our mortal capacities. To recognize our bias toward error should teach us modesty and reflection, and to forgive it should help us avoid the inhumanity of thinking we ourselves are not as fallible as those who, in any instance, seem most at fault. Science can give us knowledge, but it cannot give us wisdom. Nor can religion, until it puts aside nonsense and distraction and becomes itself again.
