
There can be no doubt that the universe in which we find ourselves is intricate and complex far beyond our current capacity of understanding, but what is behind such detailed patterns of organizational design? Is our universe chaotic, or does it follow a recognizable pattern of logic and order? In the last 50,000 years of what we know as modern human civilization, we have developed such an impressive level of self-realization that we can indeed perceive, interpret, and attempt to theorize about the foundations of the known universe. Though such big questions seem beyond our understanding, few can dispute the marvel of human achievement and what it entails for the future. Perhaps someday, maybe soon, we will be able to find reasons for what now seems impossible to know. Perhaps someday we may even create our own universe.

In 1957, Edwin Thompson Jaynes, a professor of physics at Washington University in St. Louis, proposed a connection between digital computation and the basic laws of our known universe - mainly the relationship between the mathematics of information theory, thermodynamics, and quantum mechanics. Jaynes's position relied heavily on probability in physics and the relevance such theory had to the still burgeoning field of computation. Drawing from these parallels, German computer pioneer Konrad Zuse, creator of the first high-level programming language, theorized in his 1969 book Rechnender Raum, or Calculating Space, that our universe is itself a type of digital computer. While his ideas seemed ludicrous, there has been little or no evidence against his theories since they were first published. Since the advent of this concept, various quantum theorists have adapted and reconciled the idea of a digital universe with the fundamental properties of quantum mechanics.

At the heart of such theories lies a potential answer to the universal complexity that has been observed for the last 50,000 years of humanity. According to such digital computational theories of the universe, there exists a high-level program that defines the parameters for universal evolution. Essentially, the most basic elementary physical particles we currently know to exist store and process information about everything we see around us. These particles could be utilized by such a program in much the way bits in a computer store and process information, carrying out basic commands on a hyper-computational scale and informing the universe to evolve in an ordered and logical manner, based on initial complexities in programming.

In 1998, John Archibald Wheeler, an American theoretical physicist known for his work with Niels Bohr in the field of nuclear fission, as well as noted for inventing the terms "black hole" and "wormhole," expanded on this theory:

"...it is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer. It from bit... Otherwise put, every 'it' - every particle, every field of force, even the space-time continuum itself - derives its function, its meaning, its very existence entirely - even if in some contexts indirectly - from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. 'It from bit' symbolizes the idea that every item of the physical world has at bottom - a very deep bottom, in most instances - an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe."

This it-from-bit thinking maintains that every particle in our universe is essentially just a physical manifestation of a deep-seated informational language which determines the nature of everything we know. Perhaps, beyond the physical world we struggle to understand, there is a deeper informational coding that defines the evolutionary as well as the immediate behavior of every single elementary particle ever introduced to our known universe. This could potentially explain why the seemingly chaotic universe does in fact maintain an extreme degree of organization at its most fundamental known levels, and why the evolution of everything we know is essentially predictable - from the potential end of the universe to the future of humankind.

In 1987, mathematician Rudy Rucker summarized a computational model of the universe in five basic points, effectively theorizing possible resolutions of, or reasoning behind, the observably complex universe. His first point stated, as mentioned above, that the universe and its particles can be interpreted as digital bits, which may be broken up into smaller bits of information. The second point of his theoretical summary asserted that the informational programming at the deepest level of each bit communicates the directions that form the observed fractal patterns within our universe at various frames of reference. He then went on to explain the evolution of universal physical elements by use of a cellular automaton. Rucker's third point held that all particle evolution follows a complex totalistic cellular automaton, a particle growth pattern predictable through mathematics and computation, yet still immeasurably complex and heavily reliant on probability. Next, from the evolution of this cellular automaton, an impossibly intricate and large structural system of physical particles is created, which defines our universe and its dimensional properties. Rucker's fifth point and conclusion stated that, though the universe was initially highly simplistic, it follows from this evolutionary pattern a computational system that is undeniably and unalterably complex, creating the organizational systems we recognize within the various perceivable scales of our universe in the present. In its essence, such a computational model maintains that from what could have been a single particle or informational thread, an entire universe developed - like looking in on a familiar concept such as the Big Bang through a different window of reference. Stephen Wolfram, a British scientist and mathematical software engineer, outlined a similar point of view in his 2002 book A New Kind of Science, which advocated a computational model for recognizing naturally occurring complexity rather than a mathematical one.

Obviously these theories are not without flaws, however, as many in the scientific community have more than graciously pointed out. Indeed, the continuous nature of time itself may be the strongest argument against a computational model of our universe. How does a pre-programmed output that creates the evolution of our universe synthesize the perception of time passing? How does the fourth dimension even factor into the idea of a universal informational model?
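Before turning to those objections, the cellular automaton idea itself can be made concrete. The short Python sketch below is a minimal illustration, not a reconstruction of Rucker's or Wolfram's own models: it implements a one-dimensional totalistic automaton in which each cell's next state depends only on the sum of its neighborhood, with a rule table, grid width, and step count chosen purely for demonstration.

```python
# A minimal sketch (not taken from Rucker or Wolfram directly) of a
# one-dimensional totalistic cellular automaton: each cell holds one of
# three values, and its next value depends only on the SUM of its
# three-cell neighborhood. Rule table, width, and steps are illustrative.

def totalistic_step(cells, rule_table):
    """Advance one generation of a 3-color, radius-1 totalistic automaton."""
    n = len(cells)
    return [
        rule_table[cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n]]
        for i in range(n)
    ]

def run(width=61, steps=30):
    # Neighborhood sums range from 0 to 6, so the entire "law of physics"
    # here is a lookup table with seven entries. This particular table is
    # an arbitrary example, not a rule singled out by either author.
    rule_table = [0, 1, 2, 0, 2, 1, 0]
    cells = [0] * width
    cells[width // 2] = 1                      # a single nonzero "seed" cell
    history = [cells]
    for _ in range(steps):
        cells = totalistic_step(cells, rule_table)
        history.append(cells)
    return history

if __name__ == "__main__":
    for row in run():
        # Render each generation as text: ' ' for 0, '.' for 1, '#' for 2.
        print("".join(" .#"[v] for v in row))
```

Even rules this small can behave very differently depending on the lookup table: some die out or settle into repetition, while others build persistently intricate patterns from a single seed cell. It is this second kind of behavior that the computational view of the universe takes as its template for physical complexity.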
With computational theories such as Wolfram's, an assumption that particles behave like a cellular automaton generally means that the universe is a discrete collection of particles, rather than a continuous fabric, such as the way we perceive time. Nobel laureate particle physicist Steven Weinberg criticized such a perspective in his review of Wolfram's A New Kind of Science:

"...I suppose he can't resist trying to apply his experience with digital computer programs to the laws of nature. This has led him to the view (also considered in a 1981 paper by Richard Feynman) that nature is discrete rather than continuous. He suggests that space consists of a set of isolated points, like cells in a cellular automaton, and that even time flows in discrete steps. Following an idea of Edward Fredkin, he concludes that the universe itself would then be an automaton, like a giant computer. It's possible, but I can't see any motivation for these speculations, except that this is the sort of system that Wolfram and others have become used to in their work on computers. So might a carpenter, looking at the moon, suppose that it is made of wood."

It is worth noting, however, that within our current understanding of our universe, much is still only vaguely comprehensible. It is quite possible that the collective universe is a vast set of discrete particles, so numerous that they are perceivable as a continuous flow of space and time. It is equally possible that our universe is continuous in nature, constantly flowing like a web or fabric of spacetime. There are many theoretical propositions supporting both models, and the truth is that, despite everything, we still do not know. Likewise, our knowledge of quantum and hyper-computing is staggeringly limited or nonexistent. It is not beyond reason that there may soon exist a computational model far beyond humankind's current understanding that can reconcile a universal evolutionary theory stemming from cellular automata. Despite the potential limit on computational power - some believe that so-called hyper-computation is likely impossible - humankind's current grasp on information technology is far from its peak and continues to grow exponentially.

There is, nevertheless, a nagging question that arises when considering an informational or computational model of universal complexity. Where did the information come from? How did this information technology find its way into existence and become the singular set of directions programming an entire plane of reality so intricate that humankind itself can pose such questions about the genesis of everything? To answer this question, one can look at what is arguably the single most important event ever to happen to the field of information technology and the human race as a whole.

When one considers the Technological Singularity, a future world of augmented humans and so-called future tech is often brought to mind - a science fiction fan's dream come true. For many, the post-singularity world first described by computer scientist, mathematician, and author Vernor Vinge in the January 1983 issue of OMNI magazine is a futuristic utopia of robots and strong artificial intelligence. This perception, however, is extremely limited in scope given the implications of an information technology revolution such as the singularity.

This event is termed the singularity because the explosion of information and intelligence reaches a singular, pivotal point, after which information, technology, and intelligence manifest themselves in ways entirely imperceptible to the limited comprehension we as an intelligent life form can currently achieve. Its significance reaches far beyond its theoretical effects on the physical universe.

According to futurist Ray Kurzweil, a recognizable model of exponential growth can be observed in the history of technological progress and can be utilized to project future developments in the field of information technology. This model can be traced back to the advent of the integrated circuit or, for some proponents of the singularity theory, even earlier. Technology itself
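As a purely illustrative sketch of the kind of extrapolation Kurzweil's exponential model implies, the snippet below projects a quantity forward under an assumed constant doubling time; the starting value and the two-year doubling period are hypothetical placeholders rather than figures taken from The Singularity Is Near.

```python
# Illustrative sketch of exponential extrapolation in the spirit of
# Kurzweil's argument. The starting value and doubling time below are
# hypothetical placeholders, not figures from The Singularity Is Near.

def project(value_now, doubling_time_years, years_ahead):
    """Project a quantity forward assuming it doubles every
    doubling_time_years years: value_now * 2 ** (t / T)."""
    return value_now * 2 ** (years_ahead / doubling_time_years)

if __name__ == "__main__":
    # A capability index of 1.0 today, assumed to double every 2 years.
    for years in (10, 20, 40):
        print(f"{years} years out: {project(1.0, 2.0, years):,.0f}x")
    # Forty years of doubling every two years gives 2**20, roughly a
    # millionfold increase; curves of this shape are what the singularity
    # argument ultimately rests on.
```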

Bibliography

Eldred, Michael. The Digital Cast of Being: Metaphysics, Mathematics, Cartesianism, Cybernetics, Capitalism, Communication. Frankfurt: ontos, 2009.
"Human Evolution by The Smithsonian Institution's Human Origins Program." Human Origins Initiative. Smithsonian Institution.
Jaynes, E. T. "Information Theory and Statistical Mechanics." Physical Review 106 (1957).
Jaynes, E. T. "Information Theory and Statistical Mechanics II." Physical Review 108 (1957).
Kurzweil, Ray. The Singularity Is Near: When Humans Transcend Biology. New York: Viking, 2005.
Rucker, Rudy. Mind Tools: The Five Levels of Mathematical Reality. Boston: Houghton Mifflin, 1987.
Vinge, Vernor. "First Word." OMNI, January 1983.
Weinberg, Steven. "Is the Universe a Computer?" The New York Review of Books, 24 October 2002.
Wheeler, John A. "Information, Physics, Quantum: The Search for Links." In Complexity, Entropy, and the Physics of Information, edited by W. Zurek. Redwood City, California: Addison-Wesley, 1990.
Wolfram, Stephen. A New Kind of Science. Champaign, IL: Wolfram Media, 2002.
