Understanding STEM (formerly MEST), STEM+IC, and STEM Compression in Universal Change

More on STEM Compression as the apparent driver of universal accelerating change (geochemical, biological, cultural, and technological) can be found in the following paper:
Evo Devo Universe? A Framework for Speculations on Cosmic Culture (PDF), 2008-10.
Feedback, edits, and critiques always appreciated.

STEM Compression: A Brief Introduction

The developmental history of human civilization, life on Earth, and the universe itself may be elegantly summarized as doing more (computation, or matter-energy transformation), better (more intelligence, innovation, interdependence, immunity, and informational inertia (meaning) in leading complex systems), with less (universal resources per standard computation or transformation). Here on Earth, this process has grown so advanced that it seems just a few centuries hence humanity's descendants will be capable of doing "almost everything" with "virtually nothing" in terms of physical resources. All that we care about will be done at nano and femto scales in physical space, and in incredibly complex and sublime simulations (consciousness is one such simulation) in virtual space.

I call this process STEM (Space, Time, Energy, and Matter/Mass) compression, a term that represents the combination of both increasing STEM efficiency (of standardized computation or physical transformation) and increasing STEM density of the most complex adaptive systems over the history of universal development. STEM compression appears to be an unrealized attractor for the leading edge of complexity development, of emergent hierarchical intelligence, in the universe. A recent book that independently describes and provides good examples of this process, though without exploring its longer-run or cosmic implications, is energy expert Robert Bryce's Smaller Faster Lighter Denser Cheaper, 2014.

The earliest scholarly writing I've been able to find on this concept comes from architect and futurist Buckminster Fuller, who in 1938 (Nine Chains to the Moon) described the process of "ephemeralization," a move of nature away from physicality and toward informational abstraction, and specifically, the use of less energy, volume, time, and mass "per each given level of functional performance." Thus Fuller saw STEM efficiency (per standard computation or physical transformation, however we define it), yet he missed the concept of STEM density. With respect to STEM density, the engineer Adrian Bejan (Shape and Structure, 2000; Constructal Theory of Social Dynamics, 2007) has documented the relentless optimization of thermodynamic efficiencies (spatiotemporal, energy, and matter flow densities; entropy minimization) in natural and social systems. Astrophysicist Eric Chaisson (Cosmic Evolution, 2001) has discovered that late-emerging complex dissipative structures have exponentially greater energy densities than earlier-emerging structures. Quantum physicist Seth Lloyd (Ultimate physical limits to computation, 1999) has even extrapolated this density trend to its physical limit in our universe, a black hole.

STEM compression is the term I suggest we may use to combine the observations of STEM efficiency and density in leading-edge complex adaptive systems in the universe. Such systems are always undergoing exponential or greater growth in their efficiency or density of space, time, energy, and matter utilization, for reasons that are as yet quite poorly understood. As a consequence of STEM compression, I believe we can say the following.

Inner space, not outer space, is the apparent constrained developmental destiny of increasingly complex systems in the universe.

Here we mean inner space both in terms of 1) computational complexity (e.g., the human and computer "minds" are where complexity and processes of change increasingly "go"), and 2) increasingly more localized zones of space and time being the ideal ecological niches for Earth's future intelligence. A black hole-equivalent transcension, not lightspeed expansion, seems likely to be the developmental destiny for the future of intelligence on all Earth-like planets. For more on this quite speculative concept, see the developmental singularity hypothesis.

To gather incremental evidence for this mechanism, let us look at STEM compression from each of the four partially separable STEM perspectives: space, time, energy, and matter.

1. Space Compression: Locality

Perhaps the most obvious universal developmental trend of these four is space compression or locality, the increasingly local (smaller, restricted) spatial zones within which the leading edge of computational change has historically emerged in the hierarchical development of universal complexity.

Consider how the leading edge of structural complexity in our universe has transitioned from universally distributed early matter, to galaxies, to replicating stars within galaxies, to solar systems in the galactic habitable zone, to life on the surface of special planets in that zone, to higher life within the surface biomass, to cities, and soon, to intelligent technology. Each transition to date has involved a sharply increasing spatial locality of the system environment (Smart 2000).

Consider biogenesis, the emergence of life on Earth. It once looked like life emerged in a warm pond and expanded outside its original computational environment into a larger spatial envelope. But more recent evidence (see Paul Davies, The Fifth Miracle, 2000, for an accessible account) strongly suggests that the cooling Earth, in toto, is best thought of as a catalyst for the emergence of archaebacteria, presumably in geothermal vents. Sulfide-using life sprang forth as the Earth's crust itself was cooling, implying the entire planetary system was a geological catalyst primed for this emergence. Exactly where did life emerge in this complex adaptive geophysical system? In a local subset of Earthspace, specifically on the "sliver of surface" between magma and vacuum that we call home.

Consider next the emergence of plant life. In another popular misconception, plants (and then tetrapods) "pioneered" the Earth's crust. But in reality, aerobic and anaerobic bacteria, and archaebacteria, were there long before them, running perhaps miles deep all across the planet, as well as miles up into the atmosphere. So where did these computationally accelerated new forms arise? Within a further restricted subset of the original developmental space.

Now consider the emergence of human civilization. At the planetary-cultural level, scholars have noted space compression due to digital networks, sensors, effectors, memory, and computation (Broderick 1997; Kurzweil 1999), as the ‘end of geography’ (O’Brien 1992) or the ‘death of distance’ (Cairncross 1998). This is a real developmental trend, and it impacts future choices for human cultural evolution in ways we are just beginning to extrapolate.

In perhaps the most obvious misconception, we sometimes think of humans as spatial "pioneers" in comparison to the biota that spawned us. But intelligent humans have not, and if the STEM compression trend continues, will never venture beyond the biosphere in an autonomous fashion. In each case, we see the next emergent substrate occupying a tiny spatial subset of the previous one. So it will soon be with tomorrow's artificially intelligent technology, which will model the birth and death of the universe using highly miniaturized, energy efficient, and local technology.

2. Time Compression: Sagan's Cosmic Calendar

Carl Sagan observed in his groundbreaking Cosmic Calendar metaphor (Dragons of Eden, 1977) that when we look back over our own evolutionary development in informational terms, we are struck by the clearly accelerating succession of information processing emergences (e.g., galactic, stellar, planetary-molecular/chemetic, cellular/genetic, neurologic, cultural/memetic, and technologic/technetic "intelligence" eras) in universal time. Experts may disagree on boundary definitions, or specifically, on which physical-computational structures represent the next important emergence at any point in the chain. More recently, technology scholar and systems theorist Ray Kurzweil (2005) has compiled more than fifteen (at least partially) independent accounts of emergence frequency for 'key events' in Earth and human history, in an attempt to demonstrate that though the event selection process in each case must be subjective, the acceleration pattern seen by independent observers is apparently not. These are all examples of what we might call the "time compression" trajectory of universal development.

Explaining this accelerating succession may be the most important challenge of our era. We live on the threshold of a coming singularity in these successions, as observed from our unmodified biological perspective. As evidence of this, technological change has already become near-instantaneous at the circuit-electron level in a variety of our silicon systems, and in coming years is sure to become effectively (never actually, of course) instantaneous at progressively higher levels of machine intelligence.

Plants, Modern Human Society, and Tomorrow’s AIs Appear to Have Roughly Equivalent Scalar ‘Distance’ Between their Intrinsic Learning Rates

How time compressed is the emergent substrate of postbiological intelligence likely to be, relative to human culture? Consider the ten millionfold difference between the speed of biological thought (roughly '150 km/hr' chemical diffusion in and between neurons) and electronic thought (near-speed-of-light electron flow). The scalar distance between the Phi-measured learning rates (a topic we will explain shortly) of modern technological society (perhaps 10^7 ergs/s/g) and tomorrow's autonomous computers (perhaps 10^12 ergs/s/g) is roughly the same as the difference between modern society and plants.
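As a rough check on that ten millionfold speed figure (my arithmetic, not a figure from the sources above): the speed of light is about 3 x 10^8 m/s, or roughly 1.1 x 10^9 km/hr, and

(1.1 x 10^9 km/hr) / (150 km/hr) ≈ 7 x 10^6,

which is indeed within rounding distance of ten millionfold.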

In other words, to self-aware postbiological systems, the dynamics of human thought and culture may be so slow and static by comparison that they will appear as immobilized in space and time as the plant world appears to the human psyche. All of our learning, yearning, thinking, feeling, all our biological desires to merge with our electronic extensions, or to pull their plugs, must move forever at plantlike pace relative to postbiological intelligences. Furthermore, such intelligences are far less computationally restricted, with their near-perfect memories, ability to create variants of themselves, reintegrate at will, and think, learn, share and experiment in virtual space at the universal speed limit, the speed of light. To be sure, as evo devo systems they must also be bound by developmental cycling and death, but for such systems death comes as archiving or erasure of poorly adapted intelligence architectures and redundant or harmful information, or the death-by-transformation seen in any continually growing system. On first analysis, such processes seem far less informationally destructive and subjectively violent than the death we face.

We may be dismayed by such comparisons, yet such prodigious leaps in the critical rates of change for new computational substrates are apparently built into the special physics of our universe. More than anything else, these leaps define the one-way, accelerating, and developmental nature of the universe’s leading evolutionary computational processes over the long term. Discovering such preexistent paths for computational acceleration and efficiency seems the developmental destiny of universal intelligence, though the creative evolutionary paths taken to such destiny are never predictable, and each path adds its own unique value.

3. Energy Compression: Chaisson's Phi

Eric Chaisson, in Cosmic Evolution (2001), has described universal development in terms of hierarchical levels of emergent complexity, each of which employs orders of magnitude greater free energy rate density (Phi) than the previous system from which it emerged. Chaisson's work provides a very helpful quantitative measure of energy flow density acceleration over time in dissipative structures. This "energy compression" trajectory appears statistically directional, or developmental.

Chaisson has shown that energy-dissipative complex adaptive systems (CAS) can be placed on a universal emergence hierarchy, from galaxies to human societies and beyond, with the most accelerated new systems, our electronic computers, having roughly seven orders of magnitude (ten millionfold) greater energy rate density than human culture. 'Free energy' is energy available to build structural complexity (von Bertalanffy 1932; Schrödinger 1944). This measure can be related to both marginal entropy production (Kleidon and Lorenz 2005) and dynamic complexity (Chaisson 2003), or, the marginal learning capacity of the dissipative structure.
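To get an intuitive feel for Chaisson's unit, here is a back-of-envelope calculation of my own, using round physiological figures: a 70 kg (7 x 10^4 g) human dissipating 2,000 kcal/day runs at roughly 97 watts, or 9.7 x 10^8 ergs/sec, so

Phi (human body) ≈ (9.7 x 10^8 ergs/sec) / (7 x 10^4 g) ≈ 1.4 x 10^4 ergs/sec/g,

which places the human body where the hierarchy below puts animals: above stars, planets, and ecosystems, and below brains and modern culture (~10^5).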

Note that Chaisson’s list is a mix of both autonomous and nonautonomous CAS (planets are dependent on stars for replication, computers are (presently) dependent on human society for replication). Note also that replication (life cycle) always seems necessary for learning-by-dissipation, assuming galaxies replicate as dependents on their parent universes, in the multiverse.

Below is Chaisson's hierarchy of semi-discrete complex adaptive systems in our universe, ranked by Phi (free energy rate density, in ergs/sec/g). Note that Phi growth over time is not an exponential but a superexponential function, implying some universal limit will be reached relatively soon in astronomical time. This limit, for energy density trends, is of course a black hole. For further consideration of the implications of intelligent civilizations engaging in this apparently universal developmental process, see the Developmental Singularity Hypothesis, a proposal that considers the future of intelligent civilization under STEM compression constraints.

Free energy rate density (Phi) hierarchy in emergent CAS, adapted from Chaisson 2001 (units: ergs/sec/g). When the accelerating curve of dissipation rate begins is not yet clear; we draw it beginning at matter condensation (10^5 yrs) and running to the present.
Complex adaptive systems, in ascending order of Phi: Galaxies (Milky Way); Stars (Sun); Planets (Cooling Earth); Ecosystems (Biosphere); Animals (Human body); Brains (Human cranium); Society (Modern culture); Modern engines; Intel 8080 (1970s); Pentium II (1990s).

Extrapolating now to the nearer future, we can expect fully autonomous computers to have Phi values of at least 10^12, seven orders of magnitude greater than human society (10^5). Even today, our global set of electronic computing systems is learning information about the universe, encoding knowledge from its human-aided, quasi-evolutionary searches, as much as ten millionfold faster than human society, albeit still in narrow ways and only for intermittent periods.

However, if tomorrow's best commercial computers increasingly improve themselves (self-provision, self-repair, self-evolve), as many designers expect they must, they will be able to exploit their greatly superior learning rate on a general and continuous basis, escaping the present need for human manufacturers and consumers in each upgrade cycle. This assumes that quasi-organic, self-improving computers can be selected for stability, productivity, and deep symbiosis with humanity, just as our domestic animals have been over the last 10,000 years (5,000 breeding cycles), organisms whose brain structures are also a complete mystery to us. This assumption will certainly be carefully and empirically tested in coming generations. If, in turn, evolutionary experimentation by computers in ultrafast digitally simulated environments is an increasingly useful proxy for experimentation in slow physical space (a topic we consider in the longer version of this paper), we can begin to understand how ten-millionfold-accelerated computers might recapitulate our 500 million years of metazoan evolutionary developmental learning in as short a period as 50 years.
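The arithmetic behind that last claim, using the essay's own figures, is worth making explicit:

learning rate ratio = Phi (autonomous computers) / Phi (human society) = 10^12 / 10^5 = 10^7

recapitulation time = (5 x 10^8 years of metazoan evolution) / 10^7 ≈ 50 years

The conclusion is, of course, only as strong as the twin assumptions that Phi tracks learning rate and that simulated evolution can substitute for physical evolution.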

Turning briefly to computational structure, a universal energy efficiency trend can be observed in the progressively decreasing ‘binding energy’ levels employed at the leading edge of evo devo computation. As some examples show (adapted from Laszlo 1987), each newly emergent substrate in the quintet hierarchy has greatly decreased the binding energies it uses to store and process information in its physical structure, allowing far greater energy (and space, time, and matter) efficiency of computation:

Hierarchy | Computing Substrate | Binding System / Computation 'Mechanics'
Physics | Matter | Nuclear exchange (strong forces)
Chem | Molecules | Ionic and covalent bonds (electromagnetic forces)
Bio | Cells | Cell adhesion molecules, weak peptide bonds
Socio | Brains | Synaptic weighting, neural arborization
Tech | Computers | Gated electron flow, single electron transistors
Post-Tech | Black holes | Gravitons? (Note: Gravity is the weakest of the known forces. Dark energy is weaker, but it is repulsive, not binding.)

Finally, energy (and space, time, and matter) density and efficiency may be considered through the framework of Adrian Bejan (2000) and his constructal law, which proposes that for any finite-size system to persist in time (to live), “it must evolve [and develop] in such a way that it provides ever-easier access to the imposed currents that flow through it." Constructal theory, a type of operations research, seeks to describe developmental limits on evolutionary action in nature, describing ‘imperfectly optimal’ conditions for animate and inanimate flow systems, and championing both the emergence of and boundaries to all fractal (self-similar) hierarchies in physical systems.

4. Matter Compression: Life's DNA and Drexler's Nanotechnology

In one sense, we can understand the human organism, and the DNA-guided protein synthesis and other molecular machinery on which we are based, as the most effective product yet of billennia of encoding of evolutionary intelligence in highly miniaturized molecular systems. Early life and pre-life forms must have been far less genomically and cellularly efficient and dense, and the DNA folding and unfolding regimes in every eukaryotic (vs. prokaryotic) cell are a marvel of material compression (efficiency and density of genetic computation) which we are only now beginning to unravel.

Consider also the density and efficiency of social computation (increasing 'human biological and material flow density') in a modern city, vs. early nomadic and pretechnologic humans. Note the matter compression (increasing efficiency and, to a lesser degree, growing physical density) in the technological substrate itself: in Moore's law and a large family of related 'laws' in electronic computing; in emerging nanotechnology, optical, quantum, and now single electron transistor devices; and in the most plentiful and powerful universal energy source known, nuclear fusion.

As Eric Drexler first explored in Engines of Creation (1986), technological systems (his example was the "rod logic computer") have tremendously greater capacity to compute and to perform physical processes (sensing, storage, fabrication, disassembly) than biological systems. As with artificial intelligence, which has greatly exceeded human capacity in only very narrow ways today, our technologies have also greatly exceeded human physical capacities in many narrow ways. But the most powerful and intelligent of all technological capacities consistently come from a program of miniaturizing and matter compressing them as effectively as possible.

Finally, consider the extreme matter compression (efficiency and density) in the black hole forming processes that led to our initial cosmic singularity, if Lee Smolin's Cosmological Natural Selection hypothesis is correct, and that may lie in our local future, as the speculative but interesting Developmental Singularity hypothesis suggests.

5. STEM Density and Efficiency - Anti-Kardashev Measures of Complexity Development

Integrating space, time, matter, and energy processes, let us briefly consider a brain, a social organization, and a planet, to see if we can identify STEM density and STEM efficiency growth in each as they progress through their life cycles. Human brains, as they learn any algorithm, must increase synaptic connectivity, causing greater material, spatial, and temporal density at the circuit and protein complex level, and this allows them much greater energy efficiency per learned algorithm. As social organizations, we use languages and artifacts to communicate, compete, and cooperate. Our languages grow increasingly information dense at the social level (social vocabulary grows in complexity, total corpus, and technical subsets; level of abstraction rises; and speed of communication increases), and our artifacts and social networks grow greatly in complexity and density (we move from villages with simple tools to modern, highly STEM-dense cities with advanced automation).

STEM efficiency also accelerates: energy use per instruction in electronic computers declines exponentially with time (see Richards and Shaw, 2004), technical productivity per worker grows exponentially, at 2-9%/year in most countries today, and cities are much more STEM efficient than villages at providing almost any type of social good. Considering the long-term, postbiological future of our planet, we can envision megacities of "living" computational machinery, carpeting Earth like a technological neocortex, with robotic sensors and effectors ranging throughout the solar system. This would be a global brain of vastly greater STEM density and efficiency of computation than anything that presently exists, and a community of entities that fully absorbs and exceeds our biological humanity. As I discuss in the developmental singularity hypothesis, such an entity, as its density grows, may seem increasingly like a black hole to external observers.
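To see how quickly an exponential decline in energy per instruction compounds, here is a minimal Python sketch. It is a toy illustration of my own, not a model from Richards and Shaw, and the 1.6-year halving time is an assumed parameter chosen only for demonstration.

def energy_per_instruction(e0_joules, years, halving_time_years=1.6):
    # Exponential decline: energy per instruction halves every halving_time_years.
    return e0_joules * 0.5 ** (years / halving_time_years)

e0 = 1e-6  # assumed starting energy per instruction, in joules
for years in (10, 20, 40):
    e = energy_per_instruction(e0, years)
    print(f"after {years} years: {e:.2e} J/instruction ({e0 / e:,.0f}x more efficient)")

Under these assumed numbers, four decades of such a decline compound to a roughly thirty-millionfold efficiency gain, the same order of magnitude as the Phi leaps between substrates discussed above.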

Futurists, engineers, and physicists frequently champion the Kardashev scale, which proposes that growth in the amount and spatial scale of energy use (planet, sun, then galaxy) is an appropriate metric for future levels of civilization development. But if STEM compression exists, this "expansion hypothesis" is 180 degrees out of phase with the vector of universal complexity development, which is transcension, not expansion. Cosmologist John Barrow in Impossibility, 1998, has usefully proposed an anti-Kardashev scale, where the appropriate metric for civilization complexity is not total energy use, but the miniaturization of a civilization’s engineering. The developmental singularity hypothesis is a variant of Barrow's perspective which proposes that STEM density and STEM efficiency of our physical and computational engineering are the best metrics for an anti-Kardashev scale. Miniaturization is a good proxy for this, as the closer we approach engineering on the Planck scale, the greater the densities and efficiencies of our engineered objects. It is our increasing approach to black hole level densities and the black hole's unique computational efficiencies (Seth Lloyd, 1999) that truly measures civilization development.
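To make "black hole level densities" concrete, here is a standard back-of-envelope calculation (my illustration, not taken from Barrow or Lloyd). A mass M becomes a black hole when compressed within its Schwarzschild radius,

r_s = 2GM/c^2.

For Earth's mass (about 6 x 10^24 kg), r_s ≈ 2 x (6.7 x 10^-11) x (6 x 10^24) / (3 x 10^8)^2 ≈ 9 x 10^-3 m, about nine millimeters. The distance between the densities of today's engineered matter and that endpoint is one natural ruler for an anti-Kardashev scale of civilization development.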

Our historical human era of planetary exploration may appear, on untutored examination, like a journey "outward," but actually, no new zones of space have ever been colonized, in an autopoietic fashion, by the efforts of later, more complex organisms arriving on the scene. In other words, the trajectory of hierarchically developing universal complexity has never actually involved a true journey out, in the cosmological sense. Even the cyclic birth and death of suns in supernovas is best seen as an initially galactic-scale event that rapidly creates locally interesting, high-metallicity solar systems within which further development occurs. And once biological intelligence emerges, all the really interesting computation occurs on one special planet per habitable solar system, on a sliver of surface between magma and vacuum that we call home.

All of Earth's human explorers have been part of a largely unconscious effort to wire up an already verdant Earth into one global technological intelligence, making our world smaller, not larger. Today's intelligent bipeds colonize only a small fraction of the space inhabited by our bacterial ancestors, who dwell at least six miles deep in our crust and two miles up in the clouds, and who left Earth entirely, transported to neighboring planets as spores on impacting meteorites billennia ago.

The superexponential 'developmental' trajectory is always, on average, relentlessly inward, even as 'evolutionary' individuals regularly do exactly the reverse, using their own lives as experiments. This fundamental constraint, this overwhelming developmental vector toward inner space, has been overlooked for many years. It is my hope that this will change in coming decades.

STEM and STEM+IC Concepts: Contrasting the Physical (STEM) and Informational-Computational (IC) Universe

Space, Time, Energy, and Matter/Mass ("STEM"), together with their important but still-poorly-understood cousins, Information and Computation ("STEM+IC"), seem likely to be among the foundational elements of any systems theory of how universal complexity emerges, evolves, and develops.

With respect to STEM, twentieth century science, beginning with Albert Einstein's special and general relativity work, introduced in 1905-1916, and continuing with quantum theory in the 1930s, uncovered powerful relationships between the first four of these elements. University physics students learn about the "space-time" continuum and "matter-energy" (or more precisely, "mass-energy") equivalence, as described by the famous equation E = mc^2.

Information and computation (IC) are today far less well understood. Some scientists, particularly those of the more reductionist variety, expect these concepts to be fully accountable as simply special arrangements of STEM structure. Others, particularly those theorists pursuing digital physics/digital philosophy, envision our universe as a computer of sorts, in which information and its emergents (intelligence, meaning, consciousness, etc.) all exist in neo-Platonic terms, as something both ideal and real that in some sense undergirds and guides all physical STEM, rather than being simply special arrangements or manifestations of STEM physics.

An appropriate name for the way that information/computation/intelligence/mind increasingly influences physical STEM as a function of its complexity may be infodynamics, a term preferred by developmental systems theorist Stan Salthe. This is the idea that accumulating and increasingly meaningful universal information has some predictable developmental influence on physical STEM. Infodynamics may one day turn out to be another perspective on thermodynamics, perhaps the most fundamental theory of physical STEM dynamics. Or it may be something more, something that undergirds and increasingly guides the future of our thermodynamic universe.

Resolution of this ancient question, the present and future relation of Cartesian dualism, the influence of 'mind' over 'matter' and of 'matter' over 'mind', seems presently quite beyond us. Nevertheless, what we can do at the present time is to label these obviously important observables as STEM+IC, where the plus symbol designates information and computation's special relation to physical STEM: either a parallel perspective (some kind of "emergent image" or "mirror image") for understanding universal STEM, or something as or more fundamental than the physical STEM to which it is related.

Information and computation are certainly special in the way they accelerate in local complexity over universal time (as seen, for example, in Carl Sagan's inspiring Cosmic Calendar metaphor). This acceleration is not only exponential but superexponential, implying a limiting state will be reached relatively soon in astronomical time. How special information and computation are in relation to universal dynamics remains to be seen, and requires a theory of information and computation's fundamental relation to the physical universe in which they arise. STEM+IC concepts certainly don't describe all the important elements of universal change, but they seem to be among the fundamentals, a useful place to begin our journey of understanding.
It may not surprise you, then, that they are also among our most ancient insights into the basic features of the universe. The earliest reference I've found so far for STEM+IC as a coherent system describing universal change comes from Indian Jain cosmology, perhaps originating circa 800 BCE. Surprisingly modern, Jain philosophy does not posit an independent God as a creator, sustainer, or destroyer of the universe. Instead, it asserts that the universe is all, and encompasses countless cycles of origination and destruction, but at the same time has always existed and will always exist in strict adherence to the laws of the cosmos. It is a philosophy of both permanence and change.

Jainism, a reaction to Brahmanic philosophy and an analog of Buddhist religious philosophy, proposed six immortal, cycling, and continuously changing "dravyas," or universal substances:

1. Soul/Consciousness/Life/Intelligence/Information and Computation - Jiva
2. Space - Akasa
3. Time - Kaal
4. Medium of motion (Kinetic Energy) - Dharma
5. Medium of rest (Potential Energy) - Adharma
6. Matter - Pudgala

The five "nonliving" substances can be collapsed to the familiar STEM of the Newtonian universe. We can also recognize their "living" substance as Information and Computation. Jains consider the universe a womb for the creation of life and intelligence, and thus STEM+IC is fully represented. Not bad for almost three millennia ago!

In the modern scientific paradigm, such features of the Standard Model of particle physics as elementary particles, fundamental parameters, gauge fields, and the forces of the universe are all observable and describable by measuring changes in STEM systems. We can use the term STEM as a "rough shorthand" for the observable physical world, as long as we are ignoring or minimizing the influence of Information/Computation/Intelligence. The present state of affairs, our current physical science, can be only a crude approximation, a primitive, early model of our universe, as we know that Information and Computation become increasingly important in the higher zones of complexity. Humans presumably have high concentrations of meaningful information/intelligence/consciousness/will encoded in our STEM arrangements, after countless evolutionary and developmental cyclings from the first living cell, and this purposiveness, this increasing influence of mind over matter, is simply ignored by traditional STEM physics. Some future form of STEM+IC physics must inevitably emerge, a physics which ties mind/information deeply to physical STEM.

Greek and medieval science, thermodynamics, chemistry, Newtonian physics, quantum mechanics, and relativity have all greatly improved our understanding of STEM structure and dynamics over human history. Our concept of Space has been reformulated from Euclid's and Newton's concept of absolute space to our present non-Euclidean, relativistic models (e.g., Friedmann, Riemann, Einstein). Time, perhaps the least well understood of these four, like gravity among the fundamental forces, has progressed from Greek concepts of transience and eternity to Einstein's and Minkowski's space-time continuum and the physics of black hole singularities (where time loses its meaning, at least from our universe's reference frame). Concepts of matter have evolved from Aristotle, Democritus, and the alchemists to modern chemistry, relativity, quantum mechanics, and astrophysics.
Our understanding of energy has likewise advanced through thermodynamics and free energy to relativity and quantum mechanics. In Albert Einstein's general relativity, we see a further compression of these concepts into space-time (an apparently fundamental universal continuum) and matter-energy (as a precipitation from apparently more elemental gauge fields). Today, string theory, M theory, and other approaches are attempting, with little success at present, to represent all the features of the universe in a common mathematical landscape. One of the more popular such hypotheses uses ten fundamental dimensions of space and one of time, for example.

For good introductory surveys of each of these STEM+IC properties of physical systems, you might investigate the following generalist works. Space: Concepts of Space, Max Jammer, 1954/93. Time: About Time, Paul Davies, 1995. Energy: The Refrigerator and the Universe, Martin Goldstein, 1993. Matter: The Magic Furnace, Marcus Chown, 2001. For some of the more promising conjecture on STEM's relationship to Information and Computation, Erwin Schrödinger's What is Life?, 1944/92, Paul Churchland's Matter and Consciousness, 1988, Wolfgang Hofkirchner's The Quest for a Unified Theory of Information, 1999, and Seth Lloyd's Programming the Universe, 2007, are all good places to start.

Unfortunately, we are still missing a good understanding of how STEM changes can be characterized as informational and computational changes, and of the way that informational and computational evolution and development constrains STEM transformation as information and computational systems accelerate in building their local complexity. Clearly humans have a great effect on the local STEM of our environment, so much so that we now threaten the planet's life support systems and have reached the limits to our own population growth, at least for the foreseeable future. None of this is yet predicted or even anticipated in the physics books we read at the university, even in the most advanced graduate courses we can take. The obvious conclusion is that modern human culture is still missing its "Einstein of information theory," one who will tie Information and Computation to STEM physics and account for and explain our universe's curious history of increasingly rapid and increasingly local informational and computational change. Nevertheless, we can begin to make several claims as to the general shape that this theory must take. One of these, the concept of STEM compression, is briefly outlined above.

Information and Computation as 'Special Perspectives' on STEM Change

In our modern materialist worldview, we don't generally consider information as a substance or entity separate from the physical STEM that encodes it, or computation as a process separate from the physical STEM that generates it. Scientists most commonly propose that information theory, or the closely related theory of computation, is essentially another, more holistic way to view the evolutionary developmental changes that occur within our physical universe over time. This view of information and information-processing as simply another, perhaps more holistic perspective on the evolutionary development of the universe might be diagrammed as "STEM = IC," a yin-yang relationship in which two different filters view the same underlying process.

An increasing number of systems theorists (see, for example, Wolfgang Hofkirchner, The Quest for a Unified Theory of Information, 1999) consider the flow of information, and the increasing value or meaning of emergent information, as the most fundamentally useful way to understand reality. Colloquially we might call this an "infomorphic" worldview. In other words, the most fundamental bias in this worldview is not anthropo-morphism (the specialness of the human form) but info-morphism (the special function of information and computation in controlling and describing the universe). Both are clearly biases, and should be very suspiciously evaluated, but there appears to be much more evidence for the latter than the former in universal structure and process.

From the infomorphic perspective, humans constitute a brief and transitional phase at the leading edge of the local development of cosmic intelligence--no strong anthropomorphism there. At the same time, there appears to be good reason to hold a mild anthropomorphic bias (e.g., humans are special in the sense that they are currently the most complex local form of information processing, and anthropic parameters in universal structure appear to be tuned to cause the developmental emergence of humanoid forms). Nevertheless, it is very easy to take this anthropomorphism too far, as is done by those humans who think that the universe was designed for humanity as an end product. A strong and unjustified anthropomorphism also surfaces among those who feel that humanity's destiny is to somehow stay 'in control of,' and superior to, exponentiating technological development, a wish, as we will discuss later, that seems entirely unsupportable given both the past history of substrate emergence and the human history of accelerating technological change.

A lot more remains to be understood about the interrelationships between space, time, energy, matter, and information and computation. It is most common today, given the fantastic success of reductionism, to consider parameters, forces, physical laws, and bodies of scientific theory as the "root elements" of universal change. That clearly remains the most effective investigatory approach within any scientific discipline. But to gain a broad qualitative and intuitive understanding of the impact of physics on universal change, to understand the way that human will influences physics and the way that cosmic intelligence is itself constrained by physics, to understand the global features, properties, trajectory, and even the teleology, or functional purpose, of the entire system as a whole, we need to move beyond STEM into STEM+IC conceptualizations, tentative as they are at this early stage of our science.

Contemplating which changes are induced in the physical universe by information and computation may be the most useful and concise conceptual approach to understanding the future that is accessible to human thinkers.

Like the Jains, we know that information and computation (life, higher life, intelligence, consciousness, will) are "something special." They apparently arise out of, and constrain, the further evolutionary development of STEM structures over time. As Daniel Dennett observes, while it is at least grossly true that we may accurately describe a human being as a "complicated washing machine" using our most intricate STEM physics, at the same time we know that such a reductionist description, however detailed, misses the subjective perception of one's own consciousness, an emergent informational-computational property. This perception may still be entirely constrained by and representable within STEM physics (or not), but either way it demands to be considered as a special perspective on STEM reality. Therefore, trying to understand change from a STEM+IC perspective engages us in a dance that employs both the Cartesian duality of mind and matter and the nondualist approaches that refuse to separate the two.

Does information, or constraining pattern, have its own unique existence? It certainly appears so. For example, modern reductionist neuroscience attempts to explain humans in terms of localized action potentials and synaptic activity, i.e., in the language of STEM. Yet we know that this is not enough. Humans are also motivated by goals, deductive and inductive thinking, emotions, and intuitions. Much of this is in the realm of information. Our reductionist sensibilities tell us that all our motivating information must also be encoded in specific physical structures.

Yet we always find it inordinately powerful to say, for example, that I did a good deed for her because I love her, or because she said so-and-so to me, each a compact bit of communicated information and a vastly efficient shorthand for all the brain states that affect my thoughts and behaviors. The specific STEM neural states in such a situation are so complex that it may take a human-surpassing artificial intelligence to eventually model them (for its own purposes, not ours). Thus language has emerged as a permanent new informational shorthand, and a new way in which computations of human motivation affect the neural and cultural substrates.

It always takes physical STEM to communicate information, so there is never any need to invoke a "vital substance." We never lose our materialist connection. But consider how little information is needed, in many cases, to evoke long, complex sets of action in the physical world. If I hear that someone whom I know and may be wary of has treated a friend poorly, for example, a very compact bit of information, I may engage in shock, anger, planned response, and other complex behaviors. Generalizing, we can say that STEM informational and computational encoding gets significantly more efficient, dense, and powerful with time, in a process apparently directed by information flow, as allowed by the physics of the universe we inhabit. So something curious is afoot.

Matter and energy constrain space and time (e.g., warping, shrinking, or dilating them in high-matter zones). But the informational constraints on STEM have not yet come as an elegant set of equations, the way special and general relativity emerged. As William James observed in Principles of Psychology, 1890, the maddening dualism between mind (information and computation) and body (STEM) will likely be with us for some time to come.

The astronomer Timothy Ferris has proposed that a fundamental advance in our understanding of information theory (and we might add, theory of computation) is perhaps both the most needed and the most expected new breakthrough on the horizon for twenty-first century science. As a scientific concept, information (and its computational correlate--meaning, or value) is even more nebulous than time and gravity, and is truly one of the outstanding riddles and challenges for materialist description.

As the likelihood and trajectory of the coming technological singularity (generally human-surpassing machine intelligence) become clearer to scientific observation in coming decades, we should do our best to help the emergence of a future "Einstein of information theory" to explain this coming transition to us in simple and predictable terms. Let us hope it happens sooner rather than later, to aid and enlighten our evolutionary choices on the path to greater personal, cultural and technological development.