Is the World Provably Indeterminate?


Simplicio: Determinism is well defined. A deterministic system is one in which the future state at any time can be calculated by a complete knowledge of the state at some initial time and a recursive mechanism for calculating future states.


Salviati: In order for your proposed definition to be meaningful, it's necessary to place some restrictions on the range of allowable "recursive mechanisms". For example, the nth state vector of a system might be the (kn+1)th through k(n+1)th digits of π. This would be a perfectly deterministic and "causal" system, but the causal connection between events would be extremely obscure. Even worse, every element of a system might "remember" and be governed by its entire unique history, so all events would be, at once, both ad hoc and deterministic.
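To make the example concrete, here is a minimal Python sketch (the 50-digit excerpt of π and the block size k = 5 are illustrative assumptions, not part of the argument): a "system" whose nth state is simply the nth block of k digits of π. Its evolution is perfectly deterministic, yet nothing in one state visibly "causes" the next.

```python
# Digits of pi after the decimal point (first 50; an illustrative excerpt).
PI_DIGITS = "14159265358979323846264338327950288419716939937510"

def state(n, k=5):
    """Return the nth state vector of the hypothetical system:
    digits kn+1 through k(n+1) of pi (counting after the decimal point)."""
    return PI_DIGITS[k * n : k * (n + 1)]

# The system is deterministic -- each state is fixed for all time --
# but no state is computed *from* the previous one in any evident way.
print([state(n) for n in range(4)])  # → ['14159', '26535', '89793', '23846']
```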


This shows that the notions of determinism and causality are significant only with the further stipulation of some very extensive "equivalence classes" of events and objects, so that a single pattern applies to multiple occurrences. Of course, even in a perfectly deterministic world, it would always be possible for us to have drawn our equivalence classes too broadly or too narrowly, resulting in discrepancies relative to our expectations. As Einstein said


Whether objective facts are subject to causality is a question whose answer necessarily depends on the theory from which we start. Therefore, it will never be possible to decide whether the world is causal or not.


Simplicio: Your π example is not a difficult model to crack. Standard cryptographic techniques would catch that easily.


Salviati: Not so. Any finite string of decimal digits occurs (infinitely often) in the decimal expansion of both π and e (assuming, as is widely believed but unproven, that π and e are normal numbers). Furthermore, the string occurs with the same limiting frequency in both expansions. Therefore, it would never be possible, based on any finite number of digits, to decide whether the operative algorithm was π or e, nor whether we had correctly identified the relevant occurrence in the expansion.
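A small Python illustration of this ambiguity (the short digit excerpts are illustrative assumptions, and since normality is only conjectured, short prefixes are merely suggestive): the same finite observed string can be located in the expansions of both π and e, so the observation cannot single out which expansion "generated" it.

```python
# Illustrative excerpts of the decimal expansions (digits after the point).
PI_DIGITS = "14159265358979323846264338327950288419716939937510"
E_DIGITS  = "71828182845904523536028747135266249775724709369995"

def offsets(needle, haystack):
    """All positions at which `needle` occurs in `haystack`."""
    return [i for i in range(len(haystack) - len(needle) + 1)
            if haystack[i:i + len(needle)] == needle]

observed = "28"
print("in pi at:", offsets(observed, PI_DIGITS))  # occurs in pi's expansion
print("in e  at:", offsets(observed, E_DIGITS))   # ...and in e's as well
# The observed string occurs in both expansions, so a finite observation
# cannot decide which "algorithm" produced it, nor which occurrence is
# the relevant one.
```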


Simplicio: That is not the issue, Salviati. The issue is whether you could come to understand a physical law of causation based on limited experimental data. The sequences in the decimal expansion are generated by simple recursive functions that could be found by exhaustive search techniques. You do not need to have an exact law with absolute certainty to be able to do science. All you need is laws that have predictive power.


Salviati: But your assertions are already falsified by the example we are discussing, Simplicio. Exhaustive search techniques won't work on the digits of π, because the search space is inexhaustible. Therefore, even assuming there is a match between our System and some sequence in the digits of π, there is no finite amount of "exhaustive searching" that would assure us a probability greater than ε of finding the match, for any ε > 0. Thus a System can be perfectly deterministic and yet utterly unpredictable based on any inferences that could be drawn from any finite set of observations of the System.
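The difficulty can be sketched in Python (using an illustrative 100-digit excerpt of π as a stand-in for the inexhaustible full expansion): even when an observed window does occur in the digits, it typically occurs at several offsets whose continuations disagree, so no finite search can tell us which occurrence is "the" match.

```python
# First 100 digits of pi after the decimal point (illustrative excerpt).
PI_100 = ("1415926535897932384626433832795028841971"
          "6939937510582097494459230781640628620899"
          "86280348253421170679")

def continuations(window, digits):
    """For each occurrence of `window` in `digits`, the digit following it."""
    return [digits[i + len(window)]
            for i in range(len(digits) - len(window))
            if digits[i:i + len(window)] == window]

# The observed window "26" occurs at more than one offset, and the digit
# that follows differs between occurrences -- so the observation alone
# does not determine the prediction.
print(continuations("26", PI_100))  # → ['5', '4']
```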


Sagredo: Let me make a suggestion. Suppose we make an infinite sequence of measurements. The outcome will be an infinite binary string, say s = s_1 s_2 s_3 ...
One would hope that this string would be random in the sense of algorithmic information theory. At the very least, one would hope to prove that this string isn't Turing-computable! I think Everett made some progress in this direction.


Salviati: Chaitin proved (about 8 years after Everett wrote his thesis) the existence of an integer k such that it's impossible to prove that the complexity of any particular string of binary bits exceeds k (where "complexity" is defined as the length of the smallest Turing program that generates the string). This is true in spite of the fact that "almost all" strings have complexity greater than k. Therefore, without some restriction on the allowable length of the Turing machine (to something less than k), we can never prove, in the algorithmic information sense, that any particular process necessarily yields a non-Turing-computable sequence of bits. Certainly such a proof could never be constructed on the basis of any finite set of measurements.
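The asymmetry Salviati describes can be made vivid in a toy Python sketch (with "length of a literal print program" standing in for true Turing-machine complexity, an assumption of the illustration): upper bounds on a string's complexity are trivial to exhibit, since any string s is generated by the program that prints it verbatim; what Chaitin's theorem rules out is proving non-trivial lower bounds.

```python
def complexity_upper_bound(s):
    """A trivial upper bound on the algorithmic complexity of s:
    the length of the literal program that prints s verbatim.
    Exhibiting such a program proves complexity(s) <= this bound."""
    program = "print(" + repr(s) + ")"
    return len(program)

s = "283645147026143468"
print(complexity_upper_bound(s))  # always about len(s) plus a constant
# No analogous routine can certify a LOWER bound above Chaitin's k:
# by his theorem, no formal proof establishes complexity(s) > k for any
# particular s, even though almost all strings have complexity above k.
```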


Sagredo: Obviously we can't prove anything about an infinite sequence of measurements without some theoretical model. Everett does provide a theoretical framework for talking about an infinite number of measurements, so it is at least conceivable that one could prove the result I (rather vaguely) described. This would be a proof from formal hypotheses, of the same nature as any proof in mathematical physics.


Salviati: But Sagredo, I was addressing the question of whether our observations of the world are, or can ever be, provably inconsistent with a deterministic model. Of course, if we accept that the sum of our observations is, and will always be, finite, then the answer is trivially "no", because a finite Turing machine can always be written to generate a given finite string. I was making a slightly less trivial point, which is that there's a specific number k such that any finite string, no matter how long, cannot be proven to require a Turing machine longer than k. Thus, even if we (sensibly) restrict our meaningful class of Turing machines to those of complexity less than a fixed number k (rather than allowing the complexity of our model to increase in proportion to the number of observations), it's still impossible for any finite set of observations (even if we continue gathering data forever) to be provably inconsistent with a Turing machine of complexity less than k.


It isn't clear, Sagredo, whether your comment on Everett's thesis was intended to address this same issue, i.e., the provability of indeterminacy. From what I can gather, your point is that Everett [might have?] defined a model within which it's possible to prove the "occurrence" (suitably defined) of a completed infinity of results that is [provably?] not Turing-computable...or something like that. You add that "this would be a proof from formal hypotheses, of the same nature as any proof in mathematical physics". I'm not sure what to make of this, but it's hard to see how any formal hypotheses would enable us to circumvent the fundamental epistemological limit cited above (except perhaps in the same sense that standard cryptographic techniques would enable us to exhaustively search the digits of π).


Simplicio: "Exhaustive" is not the best term. You can search through a rather large range of simple recursive functions to see if there is some match between the sequence and sequences generated by those functions.


Salviati: Please, Simplicio. Searching through a "rather large" range of an infinite space isn't effective. You yourself have acknowledged that "there is no general way to map a finite set of observations into a particular recursive function", and you even comment that this is not only true, but trivially true. In view of this, it's hard to see why you continue to suggest (when not saying it's trivially true) that it's false.


Simplicio: You said earlier, Salviati, that "a System can be perfectly deterministic and yet utterly unpredictable based on any inferences that could be drawn from any finite set of observations of the System". What do you mean by "utterly unpredictable"? If there is a recursive function that defines the sequence, then it is completely predictable. Any sequence generated by a simple recursive function can be predicted in a practical sense using standard cryptographic techniques.


Salviati: Here's a string of digits from an infinite sequence generated by a recursive function: ...283645147026143468... My claim is that both you and the NSA are completely incapable of predicting (correctly) the next several digits of this sequence. Furthermore, I could go on supplying more digits indefinitely, and at no point will you be able to predict the next digits. This illustrates that a System can be perfectly deterministic and yet utterly unpredictable based on any inferences that could be drawn from any finite set of observations of the System.


Simplicio: Let me try another tack. You said earlier that "there's a specific number k such that any finite string, no matter how long, cannot be proven to require a Turing machine longer than k", but I don't believe that's true. A sequence of numbers that recursively encodes the Halting Problem for Turing machines cannot be encoded in any finite Turing machine. Thus for every k there is some initial finite segment of such a sequence that requires a Turing machine of complexity greater than k.


Salviati: It sounds like you're confusing the question of whether "there exist" sequences of complexity greater than k with the question of whether we can prove that any particular sequence has complexity greater than k. The latter question is what's at issue here. Let me just quote from Martin Davis's "What is Computation?":


To fully understand the devastating import of this result [Chaitin's version of Gödel’s theorem] it is important to realize that [it applies to] all methods of proof available in ordinary mathematics (for example, ZF set theory). We are forced to conclude that there is some definite number k such that it is in principle impossible, by ordinary mathematical methods, to prove that any string of bits has complexity greater than k. This is a remarkable limitation on the power of mathematics as we know it.


Simplicio: Well, perhaps so, but we can restrict ourselves to simple recursive functions. The number of simple recursive functions is not infinite.


Salviati: Really? How many are there? But seriously, Simplicio, this statement of yours would only be relevant if your definition of "simple" coincided with the limits of what nature is capable of invoking. Since we know of no such limit, your statement is either irrelevant or false (not excluding the possibility that it's both). This reminds me of a comment by Feynman, who said


I know there are some scientists who go about preaching that Nature always takes on the simplest solutions. Yet the simplest solution by far would be nothing, that there should be nothing at all in the universe. Nature is far more inventive than that, so I refuse to go along thinking it always has to be simple.


Also, notice that you are now reverting to the very point with which I began, when I said "In order for your definition to be meaningful, it's necessary to place some restrictions on the range of allowable 'recursive mechanisms'". You strongly objected, but I then pointed out that the digits of π (for example) immediately falsify your beliefs.


Simplicio: If the digits of π were effective as a random number generator, they would be used that way, rather than the far more difficult algorithms that are actually in use.


Salviati: Well, Simplicio, you might want to take a look at Knuth's "The Art of Computer Programming" (Vol. II) to get some idea of how actual pseudo-random number generators work. One thing you'll discover is that such algorithms are not particularly "difficult" to execute, certainly far less difficult than computing digits of π. Furthermore, the quality of π's "randomness" is unsurpassed by any conventional random number generator. In fact, the Borweins have pointed out that if, as is widely believed, π is a normal number, its digits would make an inexhaustible source of high-quality "random" number sequences (higher quality than anything we can get out of conventional "random" number generators).
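As a small check in this direction, here is a Python sketch (using an illustrative 100-digit excerpt; a real statistical test would use vastly more digits and proper hypothesis tests): the digit frequencies in an initial segment of π are already close to the uniform 10% each that normality predicts.

```python
from collections import Counter

# First 100 digits of pi after the decimal point (illustrative excerpt).
PI_100 = ("1415926535897932384626433832795028841971"
          "6939937510582097494459230781640628620899"
          "86280348253421170679")

counts = Counter(PI_100)
for d in "0123456789":
    print(d, counts[d])
# For a normal number each digit's frequency tends to 1/10 in the limit;
# in a 100-digit sample every digit already appears, with counts near 10.
```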


Simplicio: If you try to use any simple recursive function to do encryption as if it were a true random number generator, the NSA and the average competent hacker will quickly break your code.


Salviati: Your fundamental mistake is the belief that people and/or nature are limited to just some specific finite set of recursive functions (those you refer to as "simple") combined with a finite set of parameters and initial values. On this basis, you're prepared to say that if there is any underlying order to a sequence of data, then the "key" to that order can effectively be determined by simply checking each of the finite possibilities. Unfortunately, Simplicio, your basic premise is flawed. Neither God nor your fellow men have agreed to confine themselves to whatever finite set of recursive functions you consider to be "simple".
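To illustrate why checking each member of a restricted finite class doesn't help unless nature cooperates, here is a Python sketch (the candidate class of "laws" -- affine digit maps x_{n+1} = (a*x_n + b) mod 10 -- is an assumption chosen purely for the demonstration): an exhaustive search over the entire class finds no law reproducing the digits of π, since the same digit is followed by different digits at different places in the expansion.

```python
# Digits of pi after the decimal point (illustrative excerpt).
PI_DIGITS = [int(c) for c in "14159265358979323846"]

def find_affine_law(seq):
    """Exhaustively search the finite class of candidate 'laws'
    x_{n+1} = (a*x_n + b) % 10 for one reproducing seq, if any."""
    for a in range(10):
        for b in range(10):
            if all((a * x + b) % 10 == y for x, y in zip(seq, seq[1:])):
                return (a, b)
    return None  # the whole class is exhausted without a match

# In pi's digits the digit 1 is followed by 4 in one place and by 5 in
# another, so no function of the previous digit alone can ever fit.
print(find_affine_law(PI_DIGITS))  # → None
```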


Simplicio: I agree now that, as you said, algorithmic information theory gives, for any formal system, a k such that it's impossible to prove that any given sequence has complexity greater than k. However, the implications you draw from it are misleading. This is really a proof about the limitations of a particular formal system (ZF) and not a statement about what is provable.


Salviati: I suggest you consult any standard text on mathematical logic or complexity theory (e.g., "Mathematical Logic" by Ebbinghaus, Flum, and Thomas). One thing you'll find is that results such as Gödel’s Theorem and Chaitin's Theorem are not limited to just ZF. They apply to any formal system (of sufficient complexity to encode the natural numbers), which is the only context in which the notion of formal proof applies. You'll also find that Chaitin's theorem is precisely a statement about what is provable. Thus, regardless of what system we choose, we can never formally prove that a given sequence of observations has been produced by a non-deterministic process.

