God's Providence and Randomness in Nature: Scientific and Theological Perspectives

Paperback

$39.95

Overview

In October 2014, a group of mathematicians, physicists, ecologists, philosophers, and theologians gathered at a special conference in Berkeley, California, to present the results of a two-year research program dubbed “Project SATURN”. This program explored many of the rich avenues of thought found at the intersection of modern science and Christian theology. Chief among them is the possibility that certain processes in nature might be so complex that they do not have sufficient physical causes. Known as “ontological indeterminism”, this idea has profound implications for theology. Specifically, it allows God to be thought of as acting providentially within nature without violating the laws and processes of nature.

Such a momentous insight could influence how we understand free will, natural evil, suffering in nature, and the relation between divine providence and human evolution. The essays collected here discuss each of these topics and were originally presented at the 2014 conference. Part I establishes the scientific basis for conceptualizing certain processes in the universe as inherently random and possibly indeterministic. Part II discusses the philosophical and theological issues that spring from this understanding. Together they represent the cutting edge of thought in the increasingly productive dialogue between science and theology.

Project SATURN, short for “Scientific and Theological Understandings of Randomness in Nature”, was created by the Center for Theology and the Natural Sciences, a program of the Graduate Theological Union, Berkeley. It was funded with a grant administered by Calvin College and provided by the John Templeton Foundation.

Product Details

ISBN-13: 9781599475677
Publisher: Templeton Press
Publication date: 02/11/2019
Edition description: 1
Pages: 388
Product dimensions: 6.00(w) x 9.00(h) x 0.90(d)

About the Author

Robert John Russell is the founder and director of CTNS and the Ian G. Barbour Professor of Theology and Science at the Graduate Theological Union in Berkeley, California. He is an ordained minister in the United Church of Christ and has been a leader in the enterprise of promoting creative dialogue between scientists and theologians for more than three decades, including spearheading a twenty-year collaboration between the Vatican Observatory and CTNS. He is a member of the John Templeton Foundation, the Templeton Religion Trust, and the Templeton World Charities Foundation.

Joshua M. Moritz teaches philosophy at the University of San Francisco, theology at the Jesuit Graduate School of Theology at Santa Clara University, and theology and science at the Graduate Theological Union and at the School of Applied Theology, Berkeley. Joshua is managing editor of the journal Theology and Science and has authored numerous books and articles, including Science and Religion: Beyond Warfare and Toward Understanding (Anselm Academic, 2016) and The Role of Theology in the History and Philosophy of Science (Brill, 2017).

Read an Excerpt

CHAPTER 1

NECESSITY, PURPOSE, AND CHANCE

The Role of Randomness and Indeterminism in Nature from Complex Macroscopic Systems to Relativistic Cosmology

George F. R. Ellis

Causation and Randomness in Nature

Various kinds of causation occur in nature. Famously, Jacques Monod (1972) characterized them only as chance and necessity. He missed a key further kind of causation that certainly occurs in the real universe: purpose or goal seeking (Ellis 2005), and specifically libertarian free will, when we include sentience, and in particular, human self-consciousness. With the omission of this key causal category — which inter alia explained why he wrote his book — his analysis was prevented from relating adequately to deep philosophical questions, even though he claimed to answer them. The same comment applies to more recent books by Susskind, Hawking and Mlodinow, Krauss, and others.

This chapter will consider a broader context: The interplay between necessity, purpose, and chance. The first has an inexorable impersonal quality. It is the heart of physics and chemistry. It can be successfully described by mathematical equations. The second is the core of human beings, indeed of all life. All living entities embody some kind of purpose or function in their structure and actions (Campbell and Reece 2005); this is a particular form of top-down causation. The third embodies the idea of randomness, implying a lack of purpose or meaning. Things just happen that way, not because it's inevitable, but because it's possible, and maybe probable. It is prevalent in the real universe because of the large number of unrelated causes that influence events, and in particular, because of the vast numbers of micro-events that underlie all macroscopic outcomes. All three kinds of causation occur in an intricate interplay in the real universe.

Bob Russell points out that the classification of chance that surfaced repeatedly in the Vatican Observatory / Center for Theology and the Natural Sciences (CTNS) volumes edited by Russell, Murphy, and Stoeger was as follows:

1. Chance events along a system's trajectory as epistemic ignorance of underlying deterministic processes (roulette wheel): the vast numbers of micro-events that underlie all macroscopic outcomes

2. Chance as the unpredictable juxtaposition of fully deterministic but apparently uncorrelated causal trajectories (car crash): the large number of unrelated causes that influence events

3. Chance as ontological indeterminism (Copenhagen interpretation of quantum mechanics or QM)

In the viewpoint I take, all three will occur. Randomness is, in effective terms, basic to quantum physics, and I will adopt the view that this is an irreducible ontological indeterminism: Even if this is not true, the practical effects are unaltered (otherwise, there would be an experiment showing this to be wrong). Bose–Einstein and Fermi–Dirac statistics, while they reduce to Boltzmann statistics at the classical level, account for many practical properties of the classical world (like the hardness of a table or the coherence of laser light) in ways that Boltzmann statistics do not; proposing that quantum physics is not based in ontological indeterminism makes no difference to this assertion. All three types of chance relate to causality in complex systems and the existence of life. There is, in effect, an intermediate between chance and necessity: probabilistic causation, such as "Smoking causes lung cancer." This is when chance at the lower level leads to highly probable but not uniquely predictable events at the macro-level. In any specific case, though, this reduces to either type 1 or type 2 of the aforementioned kinds of chance, and for the purposes of this chapter I do not need to distinguish these variants.
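
The difference between these variants can be made concrete in a short simulation. The following Python sketch is an illustration added here, not part of the original argument: a chaotic logistic map stands in for type 1 chance, where fully deterministic dynamics merely look random because the initial state is not exactly known, while a library pseudo-random generator mimics a type 3 draw, where the model itself contains no hidden state to consult.

```python
import random

def logistic_map(x0, steps):
    """Deterministic chaotic dynamics: the same x0 always yields the
    same trajectory, so any apparent randomness is purely epistemic."""
    x = x0
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)  # each value fully determined by the last
    return x

# Type 1 (roulette wheel): tiny ignorance about x0 produces wildly
# different outcomes, yet rerunning with identical x0 repeats exactly.
print(logistic_map(0.3000001, 50), logistic_map(0.3000002, 50))

# Type 3 (modeled ontological chance): each call is a fresh draw with no
# accessible underlying state (mimicked here by a pseudo-random generator,
# since a classical computer can only imitate this case).
print(random.random())
```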

Essentially, quantum phenomena such as self-interference, superposition, and nonlocality were, until recently, thought to have no effect on life because decoherence would destroy superpositions and entanglement on very rapid timescales. However, the rapidly developing subject of quantum biology now shows this is not always the case: It seems that such quantum effects play a key role in some processes needed for the existence of life (Al-Khalili and McFadden 2014).

When one has social or engineering systems, randomness is often a problem to be handled, and as far as possible, to be limited by careful design, so that the desired outcome will be attained despite random events intervening in the dynamics. This is not always successful. In particular, digital computers are notoriously susceptible to the smallest error: A single wrong full stop can bring an immensely complex program to a crashing halt. However, social systems, such as the economic and legal systems, and technological artifacts, such as modern aircraft, are generally more robust: They are designed to handle reasonable classes of random events without disaster occurring. Feedback control in cybernetic or homeostatic systems is specifically designed to tame randomness, but it often remains an enemy to be handled with care. It has the potential to derail everything and prevent attainment of the desired goal. However, in some contexts, such as randomized medical trials, it can be useful, and with the right design, randomness in mechanical systems can increase stability by inhibiting resonance. So it can sometimes be used to advantage.
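
To make the role of feedback concrete, here is a minimal Python sketch, with a toy first-order system and gains invented for illustration: a proportional controller repeatedly corrects toward a setpoint, so the desired outcome is attained despite random disturbances intervening at every step, whereas the uncontrolled system drifts as a random walk.

```python
import random

def run(controlled: bool, steps: int = 200, setpoint: float = 1.0) -> float:
    """Evolve a toy scalar system under random kicks, with or without
    proportional feedback toward the setpoint."""
    x, gain = 0.0, 0.5
    for _ in range(steps):
        disturbance = random.gauss(0.0, 0.05)            # random event each step
        correction = gain * (setpoint - x) if controlled else 0.0
        x = x + correction + disturbance                 # simple additive dynamics
    return x

random.seed(0)
print("with feedback:   ", run(True))    # ends near the setpoint of 1.0
print("without feedback:", run(False))   # wanders as an undirected random walk
```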

At the micro-level, biological systems do not live in a carefully controlled environment: They face rampant randomness all the time. It turns out that they take advantage of the storm of randomness encountered at the molecular level — there is much evidence that molecular machinery in biology is designed to use that randomness to attain desired results (Hoffmann 2012). This is true also in terms of macro-levels of behavior, and in particular, as regards how the brain functions (Deco, Rolls, and Romo 2009; Glimcher 2005; Rolls and Deco 2010). Randomness is harnessed through the process of adaptive selection, which allows higher levels of order and meaning to emerge. It is then a virtue, not a vice. It allows purpose to be an active agent by selecting the desired outcomes from a range of possibilities.
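
A minimal sketch of this harnessing, written in the spirit of Dawkins's well-known "weasel" demonstration rather than any model from Hoffmann (2012), with the target string and mutation scheme invented for illustration: random variation proposes possibilities, and selection retains only those that serve the higher-level goal, so order emerges from noise.

```python
import random

TARGET = "ORDER OUT OF CHANCE"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate: str) -> int:
    """Count positions matching the target: the higher-level 'purpose'."""
    return sum(a == b for a, b in zip(candidate, TARGET))

random.seed(1)
current = "".join(random.choice(ALPHABET) for _ in TARGET)  # pure noise
while fitness(current) < len(TARGET):
    i = random.randrange(len(TARGET))                       # random variation...
    mutant = current[:i] + random.choice(ALPHABET) + current[i + 1:]
    if fitness(mutant) >= fitness(current):                 # ...filtered by selection
        current = mutant
print(current)  # the random storm, selectively filtered, yields order
```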

Effectively, irreducible randomness occurs in physics at the quantum level, thereby influencing cosmology and the existence of life on earth. Whether it is due to ontological indeterminism or not, what is experienced in the real world at the quantum level is unpredictable.

If it were not for this effective randomness, we would be stuck in the vice of determinism and outcomes would be limited and uninteresting. But it is there, as part of the design of the universe. It is a key feature in enabling autonomous higher levels of order to come into being. It is remarkable that many writings on quantum physics (e.g., regarding the black hole information paradox) claim its dynamics are deterministic: Only unitary transformations can occur. These texts, which explicitly or implicitly deny that state vector reduction takes place, give no way whereby superpositions reduce to an eigenstate, so in effect they deny that specific classical outcomes occur, and hence also give no way whereby the Born rule — the key relation that predicts probabilities of different classical outcomes to experiments — can be realized. They simply do not reflect the nature of the paradigmatic two-slit experiment where the quantum mechanical interference pattern is built up photon by photon as individual photons arrive at a detector in an indeterministic way.

This chapter, by contrast, takes that experiment seriously as genuine evidence regarding the nature of the universe. Our experience is that irreversible, unpredictable quantum effects do indeed take place in the real universe. This indeterminism, whether ontological or not, is a key aspect of the nature of physical reality. Its existence is based in deeper levels of being: the possibility spaces that determine what is and what is not possible in the physical universe. It is part of the design of things, one of the features one has to explain if one claims to explain, on any basis whatever, the nature of the physical universe.

Why do both chance and necessity occur? Who or what ordered them both? And how does purpose fit in?

An element of randomness at the bottom does not mean all that happens is just pure chance — rather it is one of the foundations that, together with necessity, opens up the possibilities of mental and spiritual life, realized through physical existence, because it enables adaptive selection to take place in accord with higher-level purposes. It does not have to have the connotation of meaningless so often ascribed to it. It is the gateway to variety and possibility. That is what will be explored in this chapter.

Randomness in Quantum Physics

The standard view is that quantum physics, which introduced the quantum principle that only discrete energy states occur, particle–wave duality, the uncertainty principle, quantum tunneling, superposition of states, and entanglement, involves a foundationally indeterminate dynamics: Probabilities can be predicted, but not specific outcomes.

The Standard View

The basic postulate of quantum mechanics (Greenstein and Zajonc 2006; Isham 1995; Morrison 1990; Rae 1994) is that before a measurement is made, the vector |ψ> describing the quantum state of the system can be written as a linear combination of unit orthogonal basis vectors representing different possible measurement outcomes:

$$|\psi\rangle \;=\; \sum_n c_n\,|u_n(x)\rangle \qquad (1)$$

where $|u_n(x)\rangle$ is an eigenstate of some observable A. The evolution of the system can be completely described by a unitary operator $\hat{u}(t_2, t_1)$, and so the state vector evolves as

$$|\psi(t_2)\rangle \;=\; \hat{u}(t_2, t_1)\,|\psi(t_1)\rangle \qquad (2)$$

Here $\hat{u}(t_2, t_1)$ is determined by the evolution equation

$$i\hbar\,\frac{d}{dt}\,|\psi(t)\rangle \;=\; \hat{H}\,|\psi(t)\rangle \qquad (3)$$

where H is the Hamiltonian (substitute (2) into (3) to see how H determines û). This is a unitary evolution: No surprises occur; the initial state uniquely determines the final state at all later times. Physics is determinate — necessity rules.
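
As a concrete illustration of equations (2) and (3), here is a Python sketch using a hypothetical two-level system with hbar set to 1, neither of which comes from the text: the unitary operator exp(-iH(t2 - t1)) is built from the eigendecomposition of a toy Hamiltonian, and the evolution it generates is determinate and norm preserving.

```python
import numpy as np

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                  # toy Hamiltonian (Pauli-x), hbar = 1

def u(t2: float, t1: float) -> np.ndarray:
    """Unitary evolution operator exp(-i H (t2 - t1)), valid here because
    H is time independent; built from H's eigendecomposition."""
    w, v = np.linalg.eigh(H)
    return v @ np.diag(np.exp(-1j * w * (t2 - t1))) @ v.conj().T

psi1 = np.array([1.0, 0.0], dtype=complex)  # initial state |psi(t1)>
psi2 = u(1.0, 0.0) @ psi1                   # equation (2): no surprises
print(psi2)                                 # uniquely determined final state
print(np.linalg.norm(psi2))                 # norm stays exactly 1: unitarity
```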

Immediately after a measurement is made at a time t = t*, however, the relevant part of the wave function is found to be in one of the eigenstates representing a specific measurement outcome:

$$|\psi(t_*)\rangle \;=\; |u_N(x)\rangle \qquad (4)$$

for some specific index N. This is where the quantization of entities and energy comes from (the discreteness principle): Only discrete eigenstates can result from a measurement. The eigenvalue $c_N$ — the outcome of the measurement — is unrelated to the initial wave function (1). It cannot be a unitary evolution (2). The data for t < t* do not determine either N or $c_N$; they merely determine a probability for each possible outcome (4), labeled by N, through the fundamental equation

$$P(c_N) \;=\; |\langle u_N(x)\,|\,\psi_1\rangle|^2 \;=\; |c_N|^2 \qquad (5)$$

where $|\psi_1\rangle$ is the wave function before the measurement. This is the Born rule for measurement probabilities, which gives the wave function its physical meaning. One can think of this process as the probabilistic time-irreversible reduction of the wave function where the initial indeterminate state changes to a final determinate classical state:

$$|\psi_1\rangle \;=\; \sum_n c_n\,|u_n(x)\rangle \;\longrightarrow\; |\psi_2\rangle \;=\; |u_N(x)\rangle \qquad (6)$$

This is the event where the uncertainties of quantum theory become manifest (up to this time the evolution is determinate and time reversible). It will not be a unitary transformation unless the initial state was already an eigenstate of A, in which case we have the identity projection

$$|\psi_1\rangle \;=\; |u_N(x)\rangle \;\longrightarrow\; |\psi_2\rangle \;=\; |u_N(x)\rangle \qquad (7)$$

This discussion presents the simplest idealized case of a measurement (Penrose 2004, 542–49). More generally, one has projection into a subspace of eigenvectors (Isham 1995, 136; Wiseman and Milburn 2010, 10–12) or a transformation of density matrices (Isham 1995, 137) or any other of a large set of possibilities (Wiseman and Milburn 2010, 8–42), but the essential feature of nonunitary evolution remains the core of the process.

Thus, there is a deterministic prescription for evolution of the quantum state determining probabilities of outcomes of measurements, but indeterminacy of the specific outcomes of those measurements, even if the quantum state is fully known.
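
This split between determinate probabilities and indeterminate outcomes can be seen in a few lines of Python; the amplitudes below are invented for illustration. The Born-rule probabilities $|c_N|^2$ follow uniquely from the state, but which eigenstate is realized in a single run is sampled, not derived.

```python
import numpy as np

rng = np.random.default_rng(42)
c = np.array([0.6, 0.8j, 0.0])          # amplitudes c_n (already normalized)

probs = np.abs(c) ** 2                  # Born rule (5): P(N) = |c_N|^2
print(probs)                            # [0.36, 0.64, 0.0]: fully determinate

N = rng.choice(len(c), p=probs)         # a single outcome: indeterminate
psi_after = np.zeros_like(c)
psi_after[N] = 1.0                      # reduction (6): collapse onto |u_N>
print(N, psi_after)
```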

Examples include radioactive decay — we can't predict when a nucleus will decay or what the velocities of the resultant particles will be — and the foundational two-slit experiments — we can't predict precisely where a photon, electron, neutron, or atom will end up on the screen (Feynman 1963; Greenstein and Zajonc 2006). Each individual photon arrives at an unpredictable place, but the predicted overall interference pattern gradually builds up over time.
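
The following Python sketch mimics that buildup with an idealized fringe profile in arbitrary units, standing in for the true two-slit intensity: each simulated photon lands at an unpredictable position drawn from the predicted distribution, and only the accumulated counts display the interference fringes.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 601)                     # position on the screen
intensity = np.cos(4.0 * x) ** 2 * np.exp(-x ** 2)  # fringes under an envelope
p = intensity / intensity.sum()                     # predicted arrival law

arrivals = rng.choice(x, size=10_000, p=p)          # one random spot per photon
counts, _ = np.histogram(arrivals, bins=60, range=(-3.0, 3.0))
print(counts)  # fringe structure appears only in the accumulated statistics
```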

The fact that such unpredictable measurement events happen at the quantum level does not prevent them from having macro-level effects. Many systems can act to amplify them to macro-levels, including photomultipliers (whose output can be used in computers or electronic control systems). Quantum fluctuations can change the genetic inheritance of animals (Percival 1991) and so influence the course of evolutionary history on earth (Russell 1998, 2002), and they have changed the course of structure formation in the universe (section "Cosmological Unpredictability").

Alternative Possibilities

The aforementioned is the standard view, supported by physicists such as Heisenberg and Dirac, who concluded that irreducible randomness occurs in quantum theory: Determinism does not hold in the real world at the micro-level. This was very worrying to many people, in particular to Albert Einstein, and all the possible alternatives, as follows, have been carefully explored.

Hidden variables? Many investigations have tried to see if physicists have somehow missed some hidden variables. This line of inquiry runs through the Bohr–Einstein debate, the famous paper by Einstein, Podolsky, and Rosen (1935) (see Fine 2013 for a discussion), and the inequalities derived by Bell (1964, 1987). These investigations have shown that the idea of hidden variables is incompatible with the usual concepts of realism, locality, and causation.

Pilot wave theory (Bohm 1952a) is a deterministic theory that gives exactly the same results as the standard theory. Unlike Einstein, Bohm uses a nonlocal hidden variable interpretation of QM; but again, it makes no difference to the outcome. If one could observe the underlying hidden variables, one would be able to signal faster than light, which according to special relativity leads to temporal paradoxes (Kofler and Zeilinger 2010). The complex problems of this theory are discussed by Redhead (2002).

Many worlds? Everything that is possible occurs, as possibilities multiply and the wave function splits into innumerable branches (Everett 1957); see Isham (1995) for a summary of the various proposals as to how this happens. This has many technical difficulties; for example, it can be argued that almost all of the physicists in the Everett multiverse see experimental violation of the Born rule (Hsu 2011). And if we try to fix this by the a priori assumption of self-locating uncertainty in the universal wave function, the result is open to major questioning (Kent 2014). Additionally, if there really are many other worlds, each actualizing a different quantum output, this inter alia raises difficult questions for personal identity and free will. But in any case, this theory has no cash value: It does not change the aforementioned experimental situation, which is what we experience in the real world. Any number of hypothetical other worlds that are supposed to be realized somewhere as a hidden machinery behind the scenes make no difference to this outcome.

Decoherence: Physicists and philosophers of physics have suggested that the measurement problem is solved by environmental decoherence (Zurek 2004). However, while this diagonalizes the density matrix, it leaves a superposition of states and so does not lead to a specific classical outcome. It does not predict where the individual spots will occur, and neither does any other result from quantum physics. It does not solve the problem of indeterminism in measurement, and so the experienced situation remains as discussed earlier in this chapter.
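
The point can be made explicit with a toy qubit in Python, using an invented residual-coherence factor: damping the off-diagonal terms of the density matrix turns the superposition into a statistical mixture, but the surviving diagonal still assigns probability to both outcomes, so nothing in the decohered state selects which classical result occurs.

```python
import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2.0)    # equal superposition of two states
rho = np.outer(psi, psi.conj())              # pure-state density matrix
print(rho)                                   # off-diagonal terms: coherence

damping = np.array([[1.0, 0.01],
                    [0.01, 1.0]])            # environment suppresses coherences
print(rho * damping)  # nearly diagonal: probabilities 0.5 and 0.5 remain,
                      # but no specific outcome has been picked out
```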

All these options are discussed by Isham (1995), and many of the original papers are presented with commentaries in Wheeler and Zurek (1983).

The Outcome

There are various alternatives to the standard view, but in the end they amount to proposing some kind of machinery hidden behind the scenes that makes no easily detectable difference to the practical outcomes described by equations (4)–(7). You have no ensemble to which you can apply statistics unless you have the individual events that make up the ensemble, and those are what quantum physics is unable to predict (Ellis 2014).

The position I choose to take is the conventional view: that in all experienced quantum phenomena we have to deal with irreducible uncertainty of specific events, as evidenced in the two-slit experiment. In this view there is indeed genuine unpredictability in the real world, even though we can predict the statistics of micro-events with precision. My position will further be that if any of the other options were eventually shown to be true, it would make no difference to the practical outcomes of quantum uncertainty in relation to emergent physics and biology, unless we located hitherto undetected nonlocal hidden variables that were open to manipulation. That seems very unlikely, given the failure to find any so far. Indeed, given the claim argued in this chapter that strong emergence is true, and occurs to a considerable degree because of underlying randomness, perhaps this can be taken to imply that ontological indeterminism is the best interpretation of quantum theory.

(Continues…)


Excerpted from "God's Providence and Randomness in Nature"
Copyright © 2018 Graduate Theological Union (CTNS program).
Excerpted by permission of Templeton Press.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

Table of Contents

Introduction Robert John Russell 3

Part I Scientific Warrants for Indeterminism throughout Nature

1 Necessity, Purpose, and Chance George F. R. Ellis 21

2 The Universal Laws of Physics Robert E. Ulanowicz 69

3 Multiverse Gerald B. Cleaver 85

Part II Philosophical and Theological Perspectives on Indeterminism in Nature

4 Are Randomness and Divine Providence Inconsistent? James Bradley 117

5 What We've Learned from Quantum Mechanics about Noninterventionist Objective Divine Action in Nature, and Its Remaining Challenges Robert John Russell 133

6 Context-Sensitive Constraints, Types, Emergent Properties, and Top-Down Causality Alicia Juarrero 173

7 Is Classical Science in Conflict with Belief in Miracles? Some Bridge-Building between Philosophical and Theological Positions Erkki Vesa Rope Kojonen 205

8 Necessity, Chance, and Indeterminism Veli-Matti Kärkkäinen 235

9 Contingency and Freedom in Brains and Selves Ted Peters 261

10 Contingency, Convergence, Constraints, and the Challenge from Theodicy in Creation's Evolution Joshua M. Moritz 289

About the Contributors 329

Index 333
