
30 October 2008

IS THE UNIVERSE FINE-TUNED FOR US?

by
Victor J. Stenger

The ancient argument from design for the existence of God is based on the common intuition that the universe and life are too complex to have arisen by natural means alone. However, as philosopher David Hume pointed out in the eighteenth century, the fact that we cannot explain some phenomenon naturally does not allow us to conclude that it had to be a miracle.

In recent years, novel versions of the argument from design that call upon modern science as their authority have appeared on the scene. Proponents of so-called Intelligent Design claim to confidently rule out natural processes as the sole origin for certain biological systems (Behe 1996, Dembski 1998, 1999, 2002). Here we shall focus on another variation of the argument from design, the argument from fine-tuning, in which evidence for a purposeful creation is seen in the laws and constants of physics.

This claim of evidence for a divine cosmic plan is based on the observation that earthly life is so sensitive to the values of the fundamental physical constants and properties of its environment that even the tiniest changes to any of these would mean that life, as we see it around us, would not exist. The universe is then said to be exquisitely fine-tuned--delicately balanced for the production of life. As the argument goes, the chance that any initially random set of constants would correspond to the set of values that we find in our universe is very small, so the universe is exceedingly unlikely to be the result of mindless chance. Rather, an intelligent, purposeful, and indeed caring personal Creator must have made things the way they are.

Some who make the fine-tuning argument are content to suggest merely that intelligent, purposeful, supernatural design has become an equally viable alternative to a random, purposeless, natural evolution of the universe and humankind suggested by conventional science.

This mirrors recent arguments for intelligent design as an alternative to evolution.

However, a few design advocates have gone further to claim that God is now required by scientific data. Moreover, this God must be the God of the Christian Bible. They insist that the universe is provably not the product of purely natural, impersonal processes. Typifying this view is physicist and astronomer Hugh Ross, who cannot imagine fine-tuning happening any other way than by a "personal Entity . . . at least a hundred trillion times more 'capable' than are we human beings with all our resources." He concludes that "the Entity who brought the universe into existence must be a Personal Being, for only a person can design with anywhere near this degree of precision" (Ross 1995).

The delicate connections among certain physical constants, and between those constants and life, I will collectively call the anthropic coincidences. Before examining the merits of the interpretation of these coincidences as evidence for intelligent design, I will review how the notion first came about. Barrow and Tipler (1986) provide a detailed history and a wide-ranging discussion of all the issues and a complete list of references. But be forewarned that this exhaustive tome has many errors, especially in equations, some of which remain uncorrected in later editions.

The Large Number Coincidences
Early in the twentieth century, Weyl (1919) expressed his puzzlement that the ratio of the electromagnetic force to the gravitational force between two electrons is such a huge number, N1 = 10^39. This means that the strength of the electromagnetic force is greater than the strength of the gravitational force by 39 orders of magnitude. Weyl puzzled over this, expressing his intuition that "pure" numbers like N1 that occur in the description of physical properties should most naturally occur within a few orders of magnitude of 1. You might expect the numbers 1 or 0 to occur "naturally." But why 10^39? Why not 10^57 or 10^-123? Some principle must select out 10^39, according to Weyl's way of thinking.
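As a check on the arithmetic, the force ratio can be computed directly from standard values of the constants. Note that the often-quoted figure N1 of about 2 x 10^39 strictly comes from comparing the electric and gravitational forces between an electron and a proton; the power of ten shifts by a few if other particle pairs are used:

```python
# Ratio of the electric to the gravitational force between an electron and a
# proton. Both forces fall off as 1/r^2, so the separation cancels and the
# ratio is a pure, dimensionless number.
k = 8.9875517923e9       # Coulomb constant, N m^2 / C^2
e = 1.602176634e-19      # elementary charge, C
G = 6.67430e-11          # Newton's gravitational constant, N m^2 / kg^2
m_e = 9.1093837015e-31   # electron mass, kg
m_p = 1.67262192369e-27  # proton mass, kg

N1 = (k * e**2) / (G * m_e * m_p)
print(f"N1 = {N1:.2e}")  # about 2.3e39 -- 39 orders of magnitude
```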

Eddington (1923) observed further: "It is difficult to account for the occurrence of a pure number (of order greatly different from unity) in the scheme of things; but this difficulty would be removed if we could connect it to the number of particles in the world--a number presumably decided by accident." He estimated that number, now called the "Eddington number," to be N = 10^79. Well, N is not too far from the square of N1.

Look around at enough numbers and you are bound to find some that appear connected.

Most physicists, then and now, did not take the large-numbers puzzle seriously; it smacks of numerology. However, the great physicist Paul Dirac (1937) noticed that N1 is the same order of magnitude as another pure number N2 that gives the ratio of a typical stellar lifetime to the time for light to traverse the radius of a proton. That is, he found two seemingly unconnected large numbers to be of the same order of magnitude. If one number being large is unlikely, how much more unlikely is another to come along with about the same value?
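A back-of-the-envelope version of Dirac's second number takes only a few lines. The round numbers here (a ten-billion-year stellar lifetime and a proton radius of about 1.2 fm) are my assumptions, not figures from the article:

```python
# Dirac's second large number N2: the ratio of a typical stellar lifetime to
# the time light takes to cross a proton. Assumed round numbers: a
# 10-billion-year lifetime and a 1.2 fm proton radius.
c = 2.998e8              # speed of light, m/s
t_star = 1e10 * 3.156e7  # ten billion years, in seconds
r_p = 1.2e-15            # proton radius, m
t_cross = r_p / c        # light-crossing time, about 4e-24 s

N2 = t_star / t_cross
print(f"N2 = {N2:.1e}")  # about 1e41, the same rough size as N1 ~ 1e39
```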

Dicke (1961) pointed out that N2 is necessarily large in order that the lifetime of typical stars be sufficient to generate heavy chemical elements such as carbon. Furthermore, he showed that N1 must be of the same order as N2 in any universe with heavy elements. Carr and Rees (1979) picked up the argument, claiming to show that the orders of magnitude of masses and lengths at every level of structure in the universe are fixed by the values of just three constants: the dimensionless strengths of the electromagnetic and gravitational forces and the electron-proton mass ratio.

Making Carbon
The heavy elements did not get fabricated straightforwardly. According to the big-bang theory, only hydrogen, deuterium (the isotope of hydrogen consisting of one proton and one neutron), helium, and lithium were formed in the early universe. Carbon, nitrogen, oxygen, iron, and the other elements of the chemical periodic table were not produced until billions of years later.

These billions of years were needed for stars to form and, near the end of their lives, assemble the heavier elements out of neutrons and protons. When the more massive stars expended their hydrogen fuel, they exploded as supernovae, spraying the manufactured elements into space.

Once in space, these elements cooled and gravity formed them into planets.
Billions of additional years were needed for our home star, the Sun, to provide a stable output of energy so at least one of its planets could develop life. But if the gravitational attraction between protons in stars had not been many orders of magnitude weaker than the electric repulsion, as represented by the very large value of N1, stars would have collapsed and burned out long before nuclear processes could build up the periodic table from the original hydrogen and deuterium. The formation of chemical complexity is likely only in a universe of great age.

Great age is not all. The element-synthesizing processes in stars depend sensitively on the properties and abundances of deuterium and helium produced in the early universe. Deuterium would not exist if the difference between the masses of a neutron and a proton were just slightly displaced from its actual value. The relative abundances of hydrogen and helium also depend strongly on this parameter. They, too, require a delicate balance of the relative strengths of gravity and the weak force, the force responsible for nuclear beta decay. A slightly stronger weak force, and the universe would be 100 percent hydrogen; all the neutrons in the early universe would have decayed, leaving none around to be saved in deuterium nuclei for later use in synthesizing elements in stars. A slightly weaker weak force, and few neutrons would have decayed, leaving about the same numbers of protons and neutrons; then, all the protons and neutrons would have been bound up in helium nuclei, with two protons and two neutrons in each. This would have led to a universe that was 100 percent helium, with no hydrogen to fuel the fusion processes in stars. Neither of these extremes would have allowed for the existence of stars and life as we know it based on carbon chemistry.
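The two extremes can be made quantitative with the standard bookkeeping of big-bang nucleosynthesis, in which essentially every surviving neutron ends up locked into a helium-4 nucleus. The function below is that textbook relation, not something from the article:

```python
# Primordial helium mass fraction Y as a function of the neutron-to-proton
# number ratio at the time of nucleosynthesis, assuming essentially all
# surviving neutrons end up bound in helium-4 (two neutrons and two protons
# per nucleus, with m_n ~ m_p).
def helium_mass_fraction(n_over_p):
    """Mass fraction of helium-4 given the neutron/proton number ratio."""
    return 2 * n_over_p / (1 + n_over_p)

print(helium_mass_fraction(1 / 7))  # 0.25: roughly our universe
print(helium_mass_fraction(0.0))    # 0.0: all neutrons decayed, pure hydrogen
print(helium_mass_fraction(1.0))    # 1.0: equal numbers, pure helium
```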

The electron also enters into the tightrope act needed to produce the heavier elements. Because the mass of the electron is less than the neutron-proton mass difference, a free neutron can decay into a proton, electron, and anti-neutrino. If the mass of the electron were just a bit larger, the neutron would be stable and most of the protons and electrons in the early universe would have combined to form neutrons, leaving little hydrogen to act as the main component and fuel of stars. The neutron must also be heavier than the proton, but not so much heavier that
neutrons cannot be bound in nuclei.
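The mass comparison in this paragraph amounts to one line of arithmetic. Using standard particle masses (my numbers, not the article's):

```python
# Energy bookkeeping for free-neutron beta decay, n -> p + e- + antineutrino,
# with the standard particle masses in MeV/c^2.
m_n = 939.565  # neutron mass
m_p = 938.272  # proton mass
m_e = 0.511    # electron mass

Q = m_n - m_p - m_e  # energy released, neglecting the tiny neutrino mass
print(f"Q = {Q:.3f} MeV")  # about 0.782 MeV: positive, so the decay proceeds
# If the electron mass exceeded the 1.293 MeV neutron-proton mass difference,
# Q would be negative and the free neutron would be stable.
```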

In 1952, astronomer Fred Hoyle used anthropic arguments to predict that the carbon nucleus has an excited energy level at around 7.7 MeV. The success of this prediction gave credibility to anthropic reasoning, so let me discuss this example in detail, since it is the only successful prediction of this line of inference so far.

I have already noted that a delicate balance of physical constants was necessary for
carbon and other chemical elements beyond lithium in the periodic table to be cooked in stars. Hoyle looked closely at the nuclear mechanisms involved and found that they appeared to be inadequate.

The basic mechanism for the manufacture of carbon is the fusion of three helium nuclei into a single carbon nucleus:

3 He-4 ---> C-12

(The number following each chemical symbol gives the number of nucleons, that is, protons and neutrons, in that nucleus; the total number of nucleons is conserved, that is, remains constant, in a nuclear reaction.) However, the probability of three bodies coming together simultaneously is very low, and some catalytic process in which only two bodies interact at a time must be assisting. An intermediate process, in which two helium nuclei first fuse into a beryllium nucleus that then interacts with the third helium nucleus, gives the desired carbon nucleus:

2 He-4 ---> Be-8
He-4 + Be-8 ---> C-12
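The energetics behind Hoyle's reasoning can be checked with a few lines of arithmetic. The atomic masses below are standard tabulated values (not given in the article); the sketch verifies that beryllium-8 is unbound while the overall reaction releases energy just below the predicted excited state:

```python
# Mass-energy bookkeeping for the triple-alpha process, using standard atomic
# masses in unified mass units u (1 u = 931.494 MeV/c^2).
u_to_MeV = 931.494
m_He4 = 4.002602
m_Be8 = 8.005305
m_C12 = 12.0  # exact, by definition of the unit

# Step 1: 2 He-4 -> Be-8 is slightly endothermic, so beryllium-8 is unbound
# and falls back apart into two alphas almost immediately.
dE_Be = (m_Be8 - 2 * m_He4) * u_to_MeV
print(f"Be-8 sits {dE_Be * 1000:.0f} keV above two alphas")  # ~90 keV

# Step 2: overall, 3 He-4 -> C-12 releases about 7.27 MeV, just below the
# 7.65 MeV excited state of carbon that Hoyle predicted -- close enough for
# resonant capture at stellar temperatures.
Q = (3 * m_He4 - m_C12) * u_to_MeV
print(f"Q(3 He-4 -> C-12) = {Q:.2f} MeV")
```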

Hoyle (1954) showed that this still was not sufficient unless the carbon nucleus had a resonant excited state at 7.7 MeV to provide for a high reaction probability. A laboratory experiment was undertaken, and sure enough a previously unknown excited state of carbon was found at 7.66 MeV (Hoyle 1953).

Nothing can gain you more respect in science than the successful prediction of an
unexpected new phenomenon. Here, Hoyle used standard nuclear theory. But his reasoning contained another element whose significance is still hotly debated. Without the 7.7 MeV nuclear state of carbon, our form of life based on carbon would not have existed.

The Anthropic Principles
Like the large number coincidences, the 7.7 MeV nuclear state seems unlikely to be the result of chance. The existence of these apparent numerical coincidences led Carter (1974) to introduce the notion of an anthropic principle, which hypothesizes that the coincidences are not accidental but somehow built into the structure of the universe. Barrow and Tipler (1986, 21) have identified three different forms of the anthropic principle, defined as follows, which I quote exactly:

"Weak Anthropic Principle (WAP): The observed values of all physical and cosmological
quantities are not equally probable but take on values restricted by the requirement that there exist sites where carbon-based life can evolve and by the requirement that the Universe be old enough for it to have already done so."

The WAP merely states the obvious: if the universe were not the way it is, we would not be the way we are. But it is sufficient for predictions such as Hoyle's.

"Strong Anthropic Principle (SAP): The Universe must have those properties which allow life to develop within it at some stage in its history."

This is essentially the form originally proposed by Carter, which suggests that the
coincidences are not accidental but the result of a law of nature. It is a strange law indeed, unlike any other in physics. It suggests that life exists as some Aristotelian final cause, as has been suggested by the proponents of Intelligent Design.

Barrow and Tipler (1986, 22) argue that the SAP can have three interpretations:
"(A) There exists one possible Universe 'designed' with the goal of generating and
sustaining 'observers.'" This is the interpretation adopted by most design advocates.
"(B) Observers are necessary to bring the Universe into being."
This is a form of solipsism that can be found in today's New Age quantum mysticism.
"(C) An ensemble of other different universes is necessary for the existence of our
Universe."

This speculation is part of contemporary cosmological thinking, as I will discuss below. It represents the idea that the coincidences are accidental. We just happen to live in the particular universe that was suited for us.

The current dialogue focuses on the choice between (A) and (C), with (B) not taken seriously in the scientific and theological communities (Stenger 1995). However, before discussing the relative merits of the three choices, let me complete the story on the various forms of the anthropic principle discussed by Barrow and Tipler. In addition to the two anthropic principles above, they identify another version:

"Final Anthropic Principle (FAP): Intelligent information-processing must come into existence in the Universe, and, once it comes into existence, it will never die out."
Martin Gardner (1986) referred to this as the "Completely Ridiculous Anthropic Principle (CRAP)."


Interpreting the Coincidences
Many religious thinkers see the anthropic coincidences as evidence for purposeful design of the universe. They ask: How can the universe possibly have obtained the unique set of physical constants it has, so exquisitely fine-tuned for life as they are, except by purposeful design--design with life and perhaps humanity in mind (Swinburne 1998, Ellis 1993, Ross 1995)?

Let us examine the implicit assumptions here. First and foremost, and fatal to the design argument all by itself, is the wholly unwarranted assumption that only one type of life is possible--the particular form of carbon-based life we have here on Earth.

Carbon seems to be the chemical element best suited to act as the building block for the type of complex molecular systems that develop lifelike qualities. Even today, new materials assembled from carbon atoms exhibit remarkable, unexpected properties, from superconductivity to ferromagnetism. However, to assume that only carbon-based life is possible is a kind of "carbocentrism," rooted in the fact that you and I are structured on carbon.

Given the known laws of physics and chemistry, we can easily imagine life based on silicon (computers, the Internet?) or other elements chemically similar to carbon. These still require cooking in stars and thus a universe old enough for stellar evolution. The N1 = N2 coincidence would still hold in this case, although the anthropic principle would have to be renamed the "cyberthropic" principle, or some such, with computers rather than humans, bacteria, and cockroaches the purpose of existence.

Only hydrogen, helium, and lithium were synthesized in the early big bang. They are
probably chemically too simple to be assembled into diverse structures. So, it seems that any life based on chemistry would require an old universe, with long-lived stars producing the needed materials.

Still, we cannot rule out other forms of matter than molecules in the universe as building blocks of complex systems. While atomic nuclei, for example, do not exhibit the diversity and complexity seen in the way atoms assemble into molecular structures, perhaps they might be able to do so in a universe with different properties and laws.

Sufficient complexity and long life may be the only ingredients needed for a universe to have some form of life. Those who argue that life is highly improbable need to open their minds to the possibility that life might be likely with many different configurations of laws and constants of physics. Furthermore, nothing in anthropic reasoning indicates any special preference for human life, or indeed intelligent or sentient life of any sort--just an inordinate fondness for carbon.
Ikeda and Jefferys (2001) have demonstrated these logical flaws and others in the fine-tuning argument with a formal probability analysis. They have also noted an amusing inconsistency that shows how promoters of design often use mutually contradictory logic: On the one hand, the creationists and God-of-the-gaps evolutionists argue that nature is too uncongenial for life to have developed totally naturally, and so therefore supernatural input must have occurred. On the other hand, the fine-tuners (often the same people) argue that the constants and laws of nature are exquisitely congenial to life, and so therefore they must have been supernaturally created. They can't have it both ways.


How Fine-Tuned Anyway?
Someday we may have the opportunity to study different forms of life that evolved on other planets. Given the vastness of the universe and the common observation of supernovae in other galaxies, we have no reason to assume life exists only on Earth. Although it seems hardly likely that the evolution of DNA and other details were exactly replicated elsewhere, carbon and the other elements of our form of life are well distributed throughout the universe, as evidenced by the composition of cosmic rays, meteors, and the spectral analysis of interstellar gas.

We also cannot assume that life would have been impossible in our universe had the
physical laws been different. Certainly we cannot speak of such things in the normal scientific mode in which direct observations are described by theory. But, at the same time, it is not illegitimate, not unscientific, to examine the logical consequences of existing theories that are well confirmed by data from our own universe.

The extrapolation of theories beyond their normal domains can turn out to be wildly
wrong. But it can also turn out to be spectacularly correct. The fundamental physics learned in earthbound laboratories has proved to be valid at great distances from Earth and at times long before the Earth and solar system had been formed. Those who argue that science cannot talk about the early universe or life on the early Earth because no humans were there to witness these events greatly underestimate the power of scientific theory.

I have made a modest attempt to obtain some feeling for what a universe with different constants would be like. Press and Lightman (1983) have shown that the physical properties of matter, from the dimensions of atoms to the order of magnitude of the lengths of the day and year, can be estimated from the values of just four fundamental constants (this analysis is slightly different from Carr and Rees [1979]). Two of these constants are the strengths of the electromagnetic and strong nuclear interactions. The other two are the masses of the electron and proton. Although the neutron mass does not enter into these calculations, it would still have a limited range for there to be neutrons in stars, as discussed earlier.

I find that long-lived stars that could make life more likely will occur over a wide range of these parameters. For example, if we take the electron and proton masses to be equal to their values in our universe, an electromagnetic force strength having any value greater than its value in our universe will give a stellar lifetime of more than 680 million years. The strong interaction strength does not enter into this calculation. If we had an electron mass 100,000 times lower, the proton mass could be as much as 1,000 times lower to achieve the same minimum stellar lifetime.

This is hardly fine-tuning.

Many more constants are needed to fill in the details of our universe. And our universe, as we have seen, might have had different physical laws. We have little idea what those laws might be; all we know are the laws we have. Still, varying the constants that go into our familiar equations will give many universes that do not look a bit like ours. The gross properties of our universe are determined by these four constants, and we can vary them to see what a universe might grossly look like with different values of these constants.

I have analyzed 100 universes in which the values of the four parameters were generated randomly from a range five orders of magnitude above to five orders of magnitude below their values in our universe, that is, over a total range of ten orders of magnitude (Stenger 1995, 2000).
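A toy reconstruction of this numerical experiment is easy to set up. The sampling scheme and the use of N1 as the figure of merit below are my own illustration of the procedure, not Stenger's actual code; the point is only that a force ratio built from log-uniformly scattered constants stays astronomically large:

```python
import random

# Toy version of the 100-universe experiment: scatter the electromagnetic
# coupling and the electron and proton masses log-uniformly over five orders
# of magnitude either side of their values in our universe, and watch the
# force ratio N1 = k e^2 / (G m_e m_p).
random.seed(1)
k, e, G = 8.988e9, 1.602e-19, 6.674e-11
m_e0, m_p0 = 9.109e-31, 1.673e-27

def log_uniform(center, decades=5):
    """Random value within +/- `decades` powers of ten of `center`."""
    return center * 10 ** random.uniform(-decades, decades)

ratios = []
for _ in range(100):
    em_scale = log_uniform(1.0)  # rescales the electromagnetic strength e^2
    m_e_r, m_p_r = log_uniform(m_e0), log_uniform(m_p0)
    ratios.append(k * e**2 * em_scale / (G * m_e_r * m_p_r))

print(f"smallest N1 sampled: {min(ratios):.1e}")  # still enormous (> 1e24)
```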

Over this range of parameter variation, N1 is at least 10^33 and N2 at least 10^20 in all cases. That is, both are still very large numbers. Although many pairs do not have N1 = N2, an approximate coincidence between these two quantities is not very rare.

I have also examined the distribution of stellar lifetimes for these same 100 universes (Stenger 1995, 2000). While a few are low, most are probably high enough to allow time for stellar evolution and heavy element nucleosynthesis. Over half the universes have stars that live at least a billion years. Long stellar lifetime is not the only requirement for life, but it certainly is not an unusual property of universes.

I do not dispute that life as we know it would not exist if any one of several of the
constants of physics were just slightly different. Additionally, I cannot prove that some other form of life is feasible with a different set of constants. But anyone who insists that our form of life is the only one conceivable is making a claim based on no evidence and no theory.

Fine-Tuning the Cosmological Constant
Next, let me discuss an example of supposed fine-tuning that arises out of cosmology: the apparent fine-tuning of Einstein's cosmological constant to within 120 orders of magnitude, without which, it is claimed, life would be impossible. This will require some preliminary explanation. When Einstein first wrote down his equations of general relativity in 1915, he saw that they allowed for the possibility of gravitational energy stored in the curvature of empty spacetime.

This vacuum curvature is expressed in terms of what is called the cosmological constant. The familiar gravitational force between material objects is always attractive. A positive cosmological constant produces a repulsive gravitational force. At the time, Einstein and most others assumed that the stars formed a fixed, stable "firmament," as it says in the Bible. A stable firmament is not possible with attractive forces alone so Einstein thought that the repulsion provided by the cosmological constant might balance things out. When, soon after, Hubble discovered that the universe was not a stable firmament but expanding, the need for a cosmological constant was eliminated, and Einstein called it his "biggest blunder." Until recently, all the data gathered by astronomers have fit very well to models that set the cosmological constant equal to 0.

Einstein's "blunder" resurfaced in 1980 with the inflationary model of the early big bang, which proposed that the universe underwent a huge exponential expansion during its first 10^-35 second or so (Kazanas 1980, Guth 1981, Linde 1982). One way to achieve exponential expansion is with the curvature produced by a cosmological constant in otherwise empty space.

This was not all. In 1998, two independent research groups studying distant supernovae were astonished to discover, against all expectations, that the current expansion of the universe is accelerating (Riess 1998, Perlmutter 1999). The universe is falling up! Once again, gravitational repulsion is indicated, possibly provided by a cosmological constant.

Whatever is producing this repulsion, it represents 70 percent of the total mass-energy of the universe--the single largest component. This component has been dubbed dark energy to distinguish it from the gravitationally attractive dark matter that constitutes another 26 percent of the mass-energy. Neither one of these ingredients is visible, nor can they be composed of ordinary atomic and subatomic matter like quarks and electrons. Familiar luminous matter, as seen in stars and galaxies, comprises only 0.5 percent of the total mass-energy of the universe,
with the remaining 3.5 percent in ordinary but nonluminous matter like planets.

If dark energy is in fact the vacuum energy implied by a cosmological constant, then we have a serious puzzle called the cosmological constant problem (Weinberg 1989). As the universe expands, regions of space expand along with it. A cosmological constant implies a constant energy density, and the total energy inside a given region of space will increase as the volume of that region expands. Since the end of inflation, volumes have expanded by 120 orders of magnitude. This implies that the cosmological constant was "fine-tuned" to be 120 orders of magnitude below what it is now, a tiny amount of energy. If the vacuum energy had been just a hair greater at the end of inflation, it would be so enormous today that space would be highly
curved and the stars and planets could not exist.
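The size of the mismatch can be illustrated with the more common statement of the same puzzle: comparing the "natural" vacuum energy density built from c, h-bar, and G (the Planck density) with the dark-energy density actually observed. The observed value below is a round number of my choosing:

```python
import math

# The standard arithmetic behind the "120 orders of magnitude": the vacuum
# energy density one would naively build from the fundamental constants
# (the Planck density) versus the dark-energy density actually observed.
c, hbar, G = 2.998e8, 1.055e-34, 6.674e-11
rho_planck = c**7 / (hbar * G**2)  # roughly 5e113 J/m^3
rho_lambda = 6e-10                 # J/m^3, ~70% of the critical density

orders = math.log10(rho_planck / rho_lambda)
print(f"mismatch: about {orders:.0f} orders of magnitude")
```

This crude estimate gives roughly 123 orders of magnitude; "120" is the conventionally quoted round figure.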

Design advocates have not overlooked the cosmological constant problem (Ross 1998).
Once again they claim to see the hand of God in fine-tuning the cosmological constant to ensure that human life, as we know it, can exist. However, recent theoretical work has offered a plausible non-divine solution to the cosmological constant problem.

Theoretical physicists have proposed models in which the dark energy is not identified with the energy of curved space-time but rather a dynamical, material energy field called quintessence. In these models, the cosmological constant is exactly 0, as suggested by a symmetry principle called supersymmetry. Since 0 multiplied by 10^120 is still 0, we have no cosmological constant problem in this case. The energy density of quintessence is not constant but evolves along with the other matter/energy fields of the universe. Unlike the cosmological constant, quintessence energy density need not be fine-tuned.

While quintessence may not turn out to provide the correct explanation for the cosmological constant problem, it demonstrates, if nothing else, that science is always hard at work trying to solve its puzzles within a materialistic framework. The assertion that God can be seen by virtue of his acts of cosmological fine-tuning, like intelligent design and earlier versions of the argument from design, is nothing more than another variation on the disreputable God-of-the-gaps argument. These rely on the faint hope that scientists will never be able to find a natural explanation for one or more of the puzzles that currently have them scratching their heads and therefore will have to insert God as the explanation. As long as science can provide plausible scenarios for a fully material universe, even if those scenarios cannot currently be tested, they are sufficient to refute the God of the gaps.


An Infinity of Universes
We have shown that conditions that might support some form of life in a random universe are not improbable. Indeed, we can empirically estimate the probability that a universe will have life. We know of one universe, and that universe has life, so the "measured" probability is 100 percent, albeit with a large statistical uncertainty. This rebuts a myth that has appeared frequently in the design literature and is indicated by Barrow and Tipler's option (C): that only a multiple-universe scenario can explain the coincidences without a supernatural creator (Swinburne 1990). Multiple universes are certainly a possible explanation, but a multitude of other, different universes is not the sole naturalistic explanation available for the particular structure of our universe.

However, if many universes beside our own exist, then the anthropic coincidences are a no-brainer. Within the framework of established knowledge of physics and cosmology, our universe could be one of many in a super-universe or multiverse. Linde (1990, 1994) has proposed that a background space-time "foam" empty of matter and radiation will experience local quantum fluctuations in curvature, forming many bubbles of false vacuum that individually inflate into mini-universes with random characteristics. Each universe within the multiverse can have a different set of constants and physical laws. Some might have life of a form different from
ours; others might have no life at all or something even more complex or so different that we cannot even imagine it. Obviously we are in one of those universes with life. Other multiverse scenarios have been discussed by Smith (1990), Smolin (1992, 1997), and Tegmark (2003).

Several commentators have argued that a multiverse cosmology violates Occam's razor (Ellis 1993). This is debatable. Occam's razor is usually expressed as "Entities should not be multiplied beyond necessity." The "entities" that Occam's law of parsimony forbids us from "multiplying beyond necessity" are independent theoretical hypotheses, not universes. For example, the atomic theory of matter multiplied the number of bodies we must consider in solving a thermodynamic problem by 10^24 or so per gram. But it did not violate Occam's razor.

Instead, it provided for a simpler, more powerful, more economical exposition of the rules obeyed by thermodynamic systems. The multiverse scenario is more parsimonious than that of a single universe. No known principle rules out the existence of other universes, which, furthermore, are suggested by modern cosmological models.


Conclusion
The media have reported a new harmonic convergence of science and religion (Begley 1998). This is more a convergence between theologians and devout scientists than a consensus of the scientific community. Those who deeply need to find evidence of design and purpose in the universe now think they have done so. Many say that they see strong hints of purpose in the way the physical constants of nature seem to be exquisitely fine-tuned for the evolution and maintenance of life. Although not so specific that they select out human life, various forms of anthropic principles have been suggested as the underlying rationale.

Design advocates argue that the universe seems to have been specifically designed so that intelligent life would form. These claims are essentially a modern, cosmological version of the ancient argument from design for the existence of God. However, the new version is as deeply flawed as its predecessors, making many unjustified assumptions and being inconsistent with existing knowledge. One gross and fatal assumption is that only one kind of life, ours, is conceivable in any conceivable configuration of universes.

However, a wide variation of constants of physics leads to universes that are long-lived enough for life to evolve, although human life need not exist in such universes.

Although not required to negate the fine-tuning argument, which falls of its own weight, other universes besides our own are not ruled out by fundamental physics and cosmology. The theory of a multiverse composed of many universes with different laws and physical properties is actually more parsimonious, more consistent with Occam's razor, than a single universe.

Specifically, we would need to hypothesize a new principle to rule out all but a single universe. If, indeed, multiple universes exist, then we are simply in that particular universe of all the logically consistent possibilities that had the properties needed to produce us.

The fine-tuning argument and other recent intelligent design arguments are modern versions of God-of-the-gaps reasoning, where a God is deemed necessary whenever science has not fully explained some phenomenon. When humans lived in caves they imagined spirits behind earthquakes, storms, and illness. Today we have scientific explanations for those events and much more. So those who desire explicit signs of God in science now look deeper, to highly sophisticated puzzles like the cosmological constant problem. But, once again, science continues to progress, and we now have a plausible explanation that does not require fine-tuning. Similarly, science may someday have a theory from which the values of existing physical constants can be derived or otherwise explained.

The fine-tuning argument would tell us that the Sun radiates light so that we can see where we are going. In fact, the human eye evolved to be sensitive to light from the Sun. The universe is not fine-tuned for humanity. Humanity is fine-tuned to the universe.


References
Barrow, John D. and Frank J. Tipler 1986. The Anthropic Cosmological Principle. Oxford:
Oxford University Press.
Begley, Sharon 1998. "Science Finds God." Newsweek (July 20), pp. 46-52.
Behe, Michael J. 1996. Darwin's Black Box: The Biochemical Challenge to Evolution. New York:
The Free Press.
Carr, B. and Rees, M. 1979. "The Anthropic Principle and the Structure of the Physical World."
Nature 278, 605-12.
Carter, Brandon 1974. "Large Number Coincidences and the Anthropic Principle in Cosmology,"
in M. S. Longair, ed., Confrontation of Cosmological Theory with Astronomical Data.
Dordrecht: Reidel, 291-8. Reprinted in Leslie, John 1990. Physical Cosmology and
Philosophy. New York: Macmillan, 61-8.
Dembski, William A. 1998. The Design Inference. Cambridge: Cambridge University Press.
Dembski, William A. 1999. Intelligent Design: The Bridge between Science and Theology.
Downers Grove, Ill.: InterVarsity Press.
Dembski, William A. 2002. No Free Lunch: Why Specified Complexity Cannot Be Purchased
without Intelligence. Rowman & Littlefield.
Dicke, R. H. 1961. "Dirac's Cosmology and Mach's Principle." Nature 192, 440-1.
Dirac, P. A. M. 1937. "The Cosmological Constants." Nature 139, 323-4.
Eddington, A. S. 1923. The Mathematical Theory of Relativity. Cambridge: Cambridge University Press, 167.
Ellis, George 1993. Before the Beginning: Cosmology Explained. London, New York:
Boyars/Bowerdean.
Gardner, Martin 1986. "WAP, SAP, PAP, and FAP." The New York Review of Books 23 (8):
22-25.
Guth, A. 1981. "Inflationary Universe: A Possible Solution to the Horizon and Flatness
Problems." Physical Review D 23: 347-56.
Hoyle, F. 1954. "On Nuclear Reactions Occurring in Very Hot Stars: I. The Synthesis of Elements
from Carbon to Nickel." Astrophysical Journal, Supplement 1, 121-46.
Hoyle, F., Dunbar, D.N.F., Wenzel, W.A., and Whaling, W. 1953. "A State of C12 Predicted from
Astronomical Evidence." Physical Review 92, 1095.
Kazanas D. 1980. "Dynamics of the Universe and Spontaneous Symmetry Breaking."
Astrophysical Journal 241, L59-63.
Linde, Andre 1982. "A New Inflationary Universe Scenario: A Possible Solution of the
Horizon, Flatness, Homogeneity, Isotropy, and Primordial Monopole Problems,"
Physics Letters 108B: 389-92.
Linde, Andre 1990. Particle Physics and Inflationary Cosmology. New York: Academic Press.
Linde, Andre 1994. "The Self-Reproducing Inflationary Universe." Scientific American 271,
November: 48-55.
Ikeda, Michael and Bill Jefferys 2001. "The Anthropic Principle Does Not Support
Supernaturalism." [online], (April 30, 2001).
Livio, M., Hollowell, D., Weiss, A., and Truran, J. 1989. "The Anthropic Significance of an
Excited State of 12C." Nature 340, 281-4.
Perlmutter, S., et al., 1999. "Measurements of Omega and Lambda from 42 High-Redshift
Supernovae," Astrophysical Journal 517, 565-86.
Press, W.H. and A.P. Lightman 1983. "Dependence of Macrophysical Phenomena on the Values
of the Fundamental Constants." Philosophical Transactions of the Royal Society of
London, Series A, 310, 323-36.
Riess, A., et al. 1998. "Observational Evidence from Supernovae for an Accelerating Universe and
a Cosmological Constant." Astronomical Journal 116, 1009-38.
Ross, Hugh 1995. The Creator and the Cosmos: How the Greatest Scientific Discoveries of the
Century Reveal God. Colorado Springs: NavPress, 118.
Ross, Hugh 1998. "Fine-Tuning the Case for Fine-Tuning: a Cosmic Breakthrough." [online].
Smith, Quentin 1990. "A Natural Explanation of the Existence and Laws of Our Universe."
Australasian Journal of Philosophy 68, 22-43.
Smolin, Lee 1992. "Did the universe evolve?" Classical and Quantum Gravity 9, 173-191.
Smolin, Lee 1997. The Life of the Cosmos. Oxford and New York: Oxford University Press.
Stenger, Victor J. 1995. The Unconscious Quantum: Metaphysics in Modern Physics and
Cosmology. Amherst, N. Y.: Prometheus Books.
Stenger, Victor J. 2000. "Natural Explanations for the Anthropic Coincidences." Philo 3: 50-67.
Swinburne, Richard. 1998. "Argument from the Fine-Tuning of the Universe" in Modern
Cosmology & Philosophy, ed. by John Leslie, Amherst, N.Y.: Prometheus Books, pp.
160-179.
Tegmark, Max 2003. "Parallel Universes." Scientific American 288 (5): 40-51.
Weinberg, Steven 1989. "The Cosmological Constant Problem." Reviews of Modern Physics 61,
1-23.
Weyl, H. 1919. "A New Extension of the Theory of Relativity." Ann. Physik 59, 101.

5 comments:

Unknown said...

In other words: We're only special because we think we are!

Jorgon Gorgon said...

Absolutely correct. The degree of stupidity inherent in some of the design arguments is astounding. While there is some validity to at least the weak version of the anthropic principle, some go much further! I am often asked why Earth's environment is so well-suited for our kind of life. Needless to say, this question most often comes from creationists and reveals, again, the depths of their ignorance.

James Redford said...

Hi, Larian LeQuella. Pertinent to your above post, God has been proven to exist based upon the most reserved view of the known laws of physics. For much more on that, see Prof. Frank J. Tipler's below paper, which among other things demonstrates that the known laws of physics (i.e., the Second Law of Thermodynamics, general relativity, quantum mechanics, and the Standard Model of particle physics) require that the universe end in the Omega Point (the final cosmological singularity and state of infinite informational capacity identified as being God):

F. J. Tipler, "The structure of the world from pure numbers," Reports on Progress in Physics, Vol. 68, No. 4 (April 2005), pp. 897-964. http://math.tulane.edu/~tipler/theoryofeverything.pdf Also released as "Feynman-Weinberg Quantum Gravity and the Extended Standard Model as a Theory of Everything," arXiv:0704.3276, April 24, 2007. http://arxiv.org/abs/0704.3276

Out of 50 articles, Prof. Tipler's above paper was selected as one of 12 for the "Highlights of 2005" accolade as "the very best articles published in Reports on Progress in Physics in 2005 [Vol. 68]. Articles were selected by the Editorial Board for their outstanding reviews of the field. They all received the highest praise from our international referees and a high number of downloads from the journal Website." (See Richard Palmer, Publisher, "Highlights of 2005," Reports on Progress in Physics. http://www.iop.org/EJ/journal/-page=extra.highlights/0034-4885 ) Reports on Progress in Physics is the leading journal of the Institute of Physics, Britain's main professional body for physicists.

Further, Reports on Progress in Physics has a higher impact factor (according to Journal Citation Reports) than Physical Review Letters, which is the most prestigious American physics journal (one, incidentally, which Prof. Tipler has been published in more than once). A journal's impact factor reflects the importance the science community places in that journal in the sense of actually citing its papers in their own papers. (And just to point out, Tipler's 2005 Reports on Progress in Physics paper could not have been published in Physical Review Letters since said paper is nearly book-length, and hence not a "letter" as defined by the latter journal.)

See also the below resources for further information on the Omega Point Theory:

Theophysics http://geocities.com/theophysics/

"Omega Point (Tipler)," Wikipedia, April 16, 2008 http://en.wikipedia.org/w/index.php?title=Omega_Point_%28Tipler%29&oldid=206077125

"Frank J. Tipler," Wikipedia, April 16, 2008 http://en.wikipedia.org/w/index.php?title=Frank_J._Tipler&oldid=205920802

Tipler is Professor of Mathematics and Physics (joint appointment) at Tulane University. His Ph.D. is in the field of global general relativity (the same rarefied field that Profs. Roger Penrose and Stephen Hawking developed), and he is also an expert in particle physics and computer science. His Omega Point Theory has been published in a number of prestigious peer-reviewed physics and science journals in addition to Reports on Progress in Physics, such as Monthly Notices of the Royal Astronomical Society (one of the world's leading astrophysics journals), Physics Letters B, the International Journal of Theoretical Physics, etc.

Prof. John A. Wheeler (the father of most relativity research in the U.S.) wrote that "Frank Tipler is widely known for important concepts and theorems in general relativity and gravitation physics" on pg. viii in the "Foreword" to The Anthropic Cosmological Principle (1986) by cosmologist Prof. John D. Barrow and Tipler, which was the first book wherein Tipler's Omega Point Theory was described. On pg. ix of said book, Prof. Wheeler wrote that Chapter 10 of the book, which concerns the Omega Point Theory, "rivals in thought-provoking power any of the [other chapters]."

The leading quantum physicist in the world, Prof. David Deutsch (inventor of the quantum computer, being the first person to mathematically describe the workings of such a device, and winner of the Institute of Physics' 1998 Paul Dirac Medal and Prize for his work), endorses the physics of the Omega Point Theory in his book The Fabric of Reality (1997). For that, see:

David Deutsch, extracts from Chapter 14: "The Ends of the Universe" of The Fabric of Reality: The Science of Parallel Universes--and Its Implications (London: Allen Lane The Penguin Press, 1997), ISBN: 0713990619; with additional comments by Frank J. Tipler. http://geocities.com/theophysics/deutsch-ends-of-the-universe.html

The only way to avoid the Omega Point cosmology is to invent tenuous physical theories which have no experimental support and which violate the known laws of physics, such as with Prof. Stephen Hawking's paper on the black hole information issue which is dependent on the conjectured string theory-based anti-de Sitter space/conformal field theory correspondence (AdS/CFT correspondence). See S. W. Hawking, "Information loss in black holes," Physical Review D, Vol. 72, No. 8, 084013 (October 2005); also at arXiv:hep-th/0507171, July 18, 2005. http://arxiv.org/abs/hep-th/0507171

That is, Prof. Hawking's paper is based upon proposed, unconfirmed physics. It's an impressive testament to the Omega Point Theory's correctness, as Hawking implicitly confirms that the known laws of physics require the universe to collapse in finite time. Hawking realizes that the black hole information issue must be resolved without violating unitarity, yet he's forced to abandon the known laws of physics in order to avoid unitarity violation without the universe collapsing.

Some have suggested that the universe's current acceleration of its expansion obviates the universe collapsing (and therefore obviates the Omega Point). But as Profs. Lawrence M. Krauss and Michael S. Turner point out in "Geometry and Destiny" (General Relativity and Gravitation, Vol. 31, No. 10 [October 1999], pp. 1453-1459; also at arXiv:astro-ph/9904020, April 1, 1999 http://arxiv.org/abs/astro-ph/9904020 ), there is no set of cosmological observations which can tell us whether the universe will expand forever or eventually collapse.

There's a very good reason for that, because that is dependent on the actions of intelligent life. The known laws of physics provide the mechanism for the universe's collapse. As required by the Standard Model, the net baryon number was created in the early universe by baryogenesis via electroweak quantum tunneling. This necessarily forces the Higgs field to be in a vacuum state that is not its absolute vacuum, which is the cause of the positive cosmological constant. But if the baryons in the universe were to be annihilated by the inverse of baryogenesis, again via electroweak quantum tunneling (which is allowed in the Standard Model, as B - L is conserved), then this would force the Higgs field toward its absolute vacuum, cancelling the positive cosmological constant and thereby forcing the universe to collapse. Moreover, this process would provide the ideal form of energy resource and rocket propulsion during the colonization phase of the universe.

Prof. Tipler's above 2005 Reports on Progress in Physics paper also demonstrates that the correct quantum gravity theory has existed since 1962, first discovered by Richard Feynman in that year, and independently discovered by Steven Weinberg and Bryce DeWitt, among others. But because these physicists were looking for equations with a finite number of terms (i.e., derivatives no higher than second order), they abandoned this qualitatively unique quantum gravity theory since in order for it to be consistent it requires an arbitrarily higher number of terms. Further, they didn't realize that this proper theory of quantum gravity is consistent only with a certain set of boundary conditions imposed (which includes the initial Big Bang, and the final Omega Point, cosmological singularities). The equations for this theory of quantum gravity are term-by-term finite, but the same mechanism that forces each term in the series to be finite also forces the entire series to be infinite (i.e., infinities that would otherwise occur in spacetime, consequently destabilizing it, are transferred to the cosmological singularities, thereby preventing the universe from immediately collapsing into nonexistence). As Tipler notes in his 2007 book The Physics of Christianity (pp. 49 and 279), "It is a fundamental mathematical fact that this [infinite series] is the best that we can do. ... This is somewhat analogous to Liouville's theorem in complex analysis, which says that all analytic functions other than constants have singularities either a finite distance from the origin of coordinates or at infinity."

When combined with the Standard Model, the result is the Theory of Everything (TOE) correctly describing and unifying all the forces in physics.

Unknown said...

James,

The most recent new theories regarding the second law of thermodynamics must be pretty disheartening! ;)

Looking BRIEFLY at the sources you cite, they all have in common the reverse logic that the whole fine tuned argument is based on. It's not perfect because we are here. We are here because it just so happens that the conditions for us exist. But those numbers have a wide range that they can fall into. So in the end, there is nothing special about this universe.

Anonymous said...

The problem with Stenger's anti-creationism approach is that it denies evidence that calls for equal time for scientific investigation into strong anthropic constraints on the forces, which necessarily produce a theory of everything, since it would explain why the constants are set in the highly unexpected and pointed manner that they are.

He's turned science into his own personal culture war that has everything to do with politics, and nothing to do with science.

He has also proven that Brandon Carter was right about scientists, and willful ignorance is no excuse:

The Anthropic Principle