18 Feb 05

A New Kind of Science

"Three centuries ago science was transformed by the dramatic new idea that rules based on mathematical equations could be used to describe the natural world. My purpose in this book is to initiate another such transformation, and to introduce a new kind of science that is based on the much more general types of rules that can be embodied in simple computer programs."

Thus begins Stephen Wolfram's book A New Kind of Science, which you can read online. I just got back from the lecture and I have to say: wow. I have never been wowed by a lecture up until now. The stuff that he covered is truly mind-blowing in the complete stoner sense of the word.

This is just a brief run-down of a 1.5-hour lecture, so I'm going to skip some parts. But I'll cover the essentials. Let's start off by defining cellular automata. According to MathWorld, a cellular automaton is "a collection of 'colored' cells on a grid of specified shape that evolves through a number of discrete time steps according to a set of rules based on the states of neighboring cells. The rules are then applied iteratively for as many time steps as desired." The easiest way to explain what that means is by demonstration.

Puffer Train

The above is an example of the "puffer train," a certain pattern found in Conway's "Game of Life." I included this example first because many of you may be familiar with it from the Microsoft Entertainment Pack that shipped with many Packard Bells back in the day. Essentially, the rules are as follows. For each cell, count the number of black, or "on," cells among its eight neighbors, then apply the following rules:

1. Death: if the count is less than 2 or greater than 3, the current cell is switched off.
2. Survival: if (a) the count is exactly 2, or (b) the count is exactly 3 and the current cell is on, the current cell is left unchanged.
3. Birth: if the current cell is off and the count is exactly 3, the current cell is switched on.
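The three rules above can be condensed into a few lines of Python. This is just a minimal sketch (the function name and the set-of-coordinates representation are my own choices, not anything from the lecture or book):

```python
from collections import Counter

def life_step(live):
    """One Game of Life step. `live` is a set of (x, y) coordinates of on cells."""
    # Count how many live neighbors each cell has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth: an off cell with exactly 3 neighbors turns on.
    # Survival: an on cell with 2 or 3 neighbors stays on.
    # Death: everything else (count < 2 or > 3) is off next step.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates with period 2:
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(blinker))             # the row flips to a vertical column
print(life_step(life_step(blinker)))  # and back again
```

Iterating `life_step` on a suitable starting pattern produces the puffer train and everything else in Life.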

As you can see, such simple rules generate a relatively complex pattern.

The simpler varieties of cellular automata are pictured below.

Various Rules

The simplest form of cellular automaton is the "elementary cellular automaton": one-dimensional, with each cell's next state depending only on itself and its two nearest neighbors. Based on the example rule set below, each of the 2^3 = 8 possible neighborhoods maps to one of two outputs, so the number of possible cellular automata in this system is 2^8 = 256.
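The standard numbering scheme reads a rule number's eight bits as the outputs for the eight neighborhoods. Here's a small Python helper to decode it (the function name is my own):

```python
def rule_table(n):
    """Decode an elementary CA rule number (0-255) into its lookup table.

    Bit i of n gives the new cell value for the neighborhood whose three
    cells (left, center, right), read as a binary number, equal i.
    """
    return {(i >> 2 & 1, i >> 1 & 1, i & 1): (n >> i) & 1 for i in range(8)}

# Two outputs for each of the 2**3 = 8 neighborhoods gives 2**8 = 256 rules.
table = rule_table(30)           # 30 = 0b00011110
print(table[(1, 1, 1)])          # bit 7 of 30 is 0
print(table[(1, 0, 0)])          # bit 4 of 30 is 1
```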

Rule 30

The above example, rule 30, is one of the more interesting cases. As you can see, after just 16 iterations it already shows a decent amount of complexity.

Rule 30

When we continue the iterations, the complexity only becomes more apparent. The question, then, is how does such a simple rule set generate such complex phenomena? Under our current paradigms, we conceive of emergent chaos as resulting either from a perturbation (initial or otherwise external randomness), as in the flip of a coin or the flap of a butterfly's wings (the butterfly effect), or from a relatively complex equation describing a dynamical system. But here we have chaos coming out of very simple rules - something doesn't fit. Wolfram investigated further and discovered that he could model snowflake patterns, fluid mechanics, and the patterns on the shells of mollusks, all from relatively simple rule sets. Thus, what would seem to the outside observer to be complexity and order was really the progressive processing of simple rules. This is where things take a turn for the interesting, and the complicated (hah, when I first wrote this I didn't realize the pun, forgive me).
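You can reproduce the rule 30 pictures in a few lines of Python. This is my own quick sketch; it uses the fact that rule 30's update works out to left XOR (center OR right):

```python
def rule30_rows(steps):
    """Run rule 30 from a single on cell; return each row as a string."""
    width = 2 * steps + 1          # wide enough that the pattern never wraps
    cells = [0] * width
    cells[steps] = 1               # single black cell in the middle
    rows = []
    for _ in range(steps + 1):
        rows.append("".join("#" if c else "." for c in cells))
        # rule 30: new cell = left XOR (center OR right)
        cells = [cells[i - 1] ^ (cells[i] | cells[(i + 1) % width])
                 for i in range(width)]
    return rows

for row in rule30_rows(8):
    print(row)
```

Run it for a few hundred steps instead of 8 and the chaotic triangle structure of the figures above emerges.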

Space, Wolfram postulates, comprises both space and matter. That is, matter is a property of space, the result of a "connectedness" of discrete positions in space. Think of the big white ball at Disneyland, only as a grid connecting points. Take a space whose points are separated at varying distances and you get general relativity. The universe, then, in Wolfram's framework, is one big computer, running simple rules on small primitives in a huge space network.

Wolfram discovered several principles in the course of his study. Two of the more relevant ones for this overview are the principle of computational equivalence and the principle of computational irreducibility. From MathWorld:

Almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication (Wolfram 2002, pp. 5 and 716-717).

More specifically, the principle of computational equivalence says that systems found in the natural world can perform computations up to a maximal ("universal") level of computational power, and that most systems do in fact attain this maximal level of computational power. Consequently, most systems are computationally equivalent. For example, the workings of the human brain or the evolution of weather systems can, in principle, compute the same things as a computer. Computation is therefore simply a question of translating inputs and outputs from one system to another.

What this means is that the rules that govern our thinking are no more complicated than the rules that govern a rock, for instance. It also means that whatever can be done in our brains can theoretically be done by any computational system, provided ideal resources and an algorithm. Also from MathWorld:

While many computations admit shortcuts that allow them to be performed more rapidly, others cannot be sped up. Computations that cannot be sped up by means of any shortcut are called computationally irreducible. The principle of computational irreducibility says that the only way to determine the answer to a computationally irreducible question is to perform, or simulate, the computation.
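To make the contrast concrete, here's a small sketch of my own. Summing 1 through n is reducible - there's a closed-form shortcut - while the center column of rule 30 is believed to be irreducible: as far as anyone knows, the only way to get step t is to simulate all t steps.

```python
# Reducible: the sum 1 + 2 + ... + n admits a shortcut (closed form).
def sum_by_running(n):
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

def sum_by_shortcut(n):
    return n * (n + 1) // 2       # same answer, far less work

assert sum_by_running(1000) == sum_by_shortcut(1000)

# Believed irreducible: the center column of rule 30. No shortcut is
# known; we just have to run the automaton for t steps.
def rule30_center(t):
    cells = {0}                   # positions of on cells, one seed cell
    for _ in range(t):
        cells = {i for i in range(min(cells) - 1, max(cells) + 2)
                 if ((i - 1) in cells) ^ ((i in cells) or ((i + 1) in cells))}
    return int(0 in cells)

print([rule30_center(t) for t in range(8)])  # [1, 1, 0, 1, 1, 1, 0, 0]
```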

This has important consequences in philosophy and mathematics. On the one hand it relates to free will. It says that, even given absolute knowledge of the starting conditions, it is impossible to predict where the running of a sufficiently complex rule set will lead, short of sitting back and watching it run.

Regarding mathematics, one can consider a mathematical proof as the application of a series of axioms to show the equivalence of two statements. Based on the principle of computational irreducibility, this means that (1) by the undecidability results about the halting of Turing machine programs, the initial conditions and the axiom system are not sufficient to determine how long a proof will be (look at how complicated the recent proof of Fermat's Last Theorem is, for instance), and (2) there are statements that can be made in an axiomatic system which cannot be proven (think of a cellular automaton that never terminates). Here (2) parallels Gödel's Incompleteness Theorem, which says essentially the same thing.

Anyway that's all I have for now, I have to get started reading this 1200+ page book. : )

If anyone wants any further clarification on a point, post in the comments and I'll see if I can't explain it better.

Update 10:32 18 Feb 2005: I just want to clarify pre-emptively that I have read the reviews of the book, and I am aware of its flaws. Regardless, I still think that the basic ideas behind the book are novel, and more than that, they're interesting. For a defense of the book, read Comments on a review of NKS. It's actually a really good survey of Wolfram's book, and offers some intriguing insights. I am in no way embracing the ideas in the book - for just one example, it clashes with some ideas I have about time. That doesn't mean that I reject it; I'm always open to new ideas, but I need to think some things through more thoroughly.


3 comments

I read Wolfram's book a year ago and it blew my mind. And I'm reading DeLanda's Intensive Science and Virtual Philosophy now - it's pretty good too.

In many ways I find that Wolfram's structures complement the symmetry-breaking phase transitions that Deleuze talks about.

So do you feel we're approaching a singularity or intertwingularity?

Fadereu, on February 27, 2005 11:45 AM

Thanks for the comment... yeah, both A New Kind of Science and Intensive Science and Virtual Philosophy have 2 and 1 holds on them, respectively - so I won't be able to check them out for a while. Regarding your question on the singularity/intertwingularity... I honestly don't know. I could go all Leibniz and make a comment on the question of "the approach" but I'll spare you.

We should exchange book recommendations some time. If you're interested email me at frankieist_at_gmail.com

Frankie, on March 2, 2005 1:01 PM

After reading through the first half of the book, I have to say my comment appears ill-informed even to me. Still, I don't understand in what way an event can be singular. Or rather, Deleuze, if I understand him correctly, sees the event as multiple, whereas Badiou sees it as singular (correct me if I'm wrong).

Frankie, on March 15, 2005 1:29 AM
