[ I always think of von Neumann machines as self-replicating devices. Von Neumann himself called them "Universal Constructors". Some people use the phrase to refer to computing machines that use a single storage structure to hold both the set of instructions on how to perform the computation and the data required or generated by the computation. I call this the von Neumann architecture. ]
[ This is an essay on von Neumann machines as such rather than as metaphors for human societies and economies or as conceptual tools for thinking about evolution. ]
In the late 1940s, John von Neumann first proposed a modern, robotic self-replicator. Moreover, he calculated how much information a self-reproducing entity would require. This meant figuring out what parts a machine needs if it is to reproduce. In short, he estimated the minimal size of a self-replicator's `blueprints' or `genome' (25 to 150 kilobytes).
By extending von Neumann's notion metaphorically, we can think more readily about societies, economies, ecologies, and the origins of life.
Economies, for example, reproduce themselves; in that sense, they are von Neumann machines. But people work in economies; economies do not reproduce without human help.
As yet, no one has built a von Neumann machine. At present, you can build a device that assembles different kinds of large, identical components or `modules' into duplicates of itself. Perhaps some factories that use robots to make robots already do this. Unfortunately, the modules themselves must be manufactured in some other way. I do not know whether they can yet be manufactured purely by self-directed robots. In any event, current investors, whether government or private, would have to spend a huge sum to build the first instance of such a manufacturing system.
We humans are entities that consume `modules' that are not identical: some foods taste different from others. Reproduction from large, non-identical, breakable `components' is difficult. Yet that is what a von Neumann machine that works with `regular sized' components will have to do.
Very small, `nano-sized' von Neumann machines are as yet impossible for humans to build. If built, these as-yet imaginary `nanotech self-assemblers' would put together small, identical, unbreakable components that are naturally present, namely atoms and molecules. Yet we know that `nano von Neumann machines' can exist: bacteria are examples.
The first human-made `nanotech self-assembler' may be `soft' and work in water, like a bacterium, or may be `hard' and involve diamond, as in Eric Drexler's 1986 book Engines of Creation.
`Closure' is a basic concept. A closed system reproduces all its parts. An open system fails to fabricate some of them. For an open system to continue, some parts must be imported from outside. A natural ecology can only be closed. A farm or factory, on the other hand, need not be fully self-reliant; it can be partly open.
That is because efficiency becomes a concern when you farm or manufacture. The cost of building the first fully closed von Neumann machine may be more than you can afford; indeed, it may not be possible at all.
Since humans must build the first von Neumann machine, efficiency and cost are issues. It is no good building a von Neumann machine that makes worse use of your land than existing farms and factories. And you cannot build one you cannot afford.
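The notion of closure can be sketched as a simple reachability check: a part is fabricable inside the system if everything in its recipe is either a raw input the machine gathers itself or is itself fabricable. The part names, recipes, and raw inputs below are invented purely for illustration.

```python
# A minimal sketch of `closure': a system is closed if it can fabricate
# every one of its own parts; otherwise it is open and must import the rest.
# All part names and recipes here are hypothetical.

def fabricable(recipes, stock):
    """Fixed-point pass: a part is fabricable if every ingredient in its
    recipe is already in stock or is itself fabricable."""
    made = set(stock)
    changed = True
    while changed:
        changed = False
        for part, needs in recipes.items():
            if part not in made and all(n in made for n in needs):
                made.add(part)
                changed = True
    return made

recipes = {
    'motor':   {'wire', 'magnet'},
    'wire':    {'copper'},
    'magnet':  {'iron'},
    'chassis': {'steel'},        # steel is never produced internally
}
raw = {'copper', 'iron'}         # inputs the machine can gather itself

made = fabricable(recipes, raw)
imported = set(recipes) - made
print(sorted(imported))          # parts the open system must import
```

Under these assumptions the system is open: everything closes except the chassis, which depends on an input the machine never makes.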
Like any living entity, a von Neumann machine must `eat'; that is, it must gather energy and other inputs.
In order to eat and live, a von Neumann machine must be able to distinguish useful inputs from poisons; it must be able to `see' (or smell, taste, feel, or hear) potential food.
This means the machine not only needs appropriate sensors, but the ability to understand and act upon the information. It needs `eyes', a `brain', and `hands'.
In a small, `nano' von Neumann machine, thermal motion brings atoms and molecules to a site. Most often, only the appropriate atom or molecule settles in the site. Others do not fit. (The others that do fit create variations.)
Unless you think of the process of `fitting' as a combination of sensing, analysis, and action, you will not consider these entities as having `eyes', `brain', or `hands' at all. Yet the process is the same, only more condensed: an input that fits is both identified (perhaps wrongly) and accepted in a single action.
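This condensed sensing-analysis-action step can be sketched as a single matching test at a binding site. The template strings and tolerance below are invented; the point is only that an accepted near-match is a built-in source of variation.

```python
# A toy model of `fitting': a binding site accepts an arriving candidate
# only if its shape matches the site's template closely enough.
# Shapes and the tolerance are hypothetical.

def fits(template, candidate, tolerance=0):
    """Accept if the candidate differs from the template in at most
    `tolerance` positions; tolerance > 0 admits occasional variants."""
    if len(candidate) != len(template):
        return False
    mismatches = sum(a != b for a, b in zip(template, candidate))
    return mismatches <= tolerance

site = 'ACGT'
arrivals = ['ACGT', 'AGGT', 'TTTT', 'ACGT']   # brought by thermal motion
bound = [m for m in arrivals if fits(site, m, tolerance=1)]
print(bound)   # exact matches plus one near-match: that near-match is a `variation'
```

Identification and acceptance happen in the same test: nothing separately `sees' the candidate before binding it.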
The inputs, whether energy or material, must be transformed to enable the original von Neumann machine to continue and to enable that machine to reproduce.
In order to continue, the machine must not only be able to provide itself with enough `food' (enough energy and materials); it must also be able to ward off illness (to defend itself) and to heal itself (to repair itself).
Moreover, the machine must be able to dump materials and energy it no longer uses. It must be able to `excrete'. Some of this excreta will be useless to us. It will be `pollution'. We will want other excreta, manufactured `goods'. This will be what we humans say the machine `produces'.
All in all, a von Neumann machine has a minimum of nine different aspects: it must ingest, sense, analyze, act, transform, defend itself, repair itself, excrete, and reproduce.
A von Neumann machine can reproduce exactly or with errors. Even though errors are common, it is possible to reduce their final number through appropriate `error correction' techniques.
Natural selection depends on some, but not many, `errors' appearing in descendants or, in the case of sex, on `variations' occurring. When reproduction is accompanied by error or variation, the set of re-duplicated descendants includes a mix of entities: a portion of that mix reproduces the design of the original manufacturer more tightly, and the rest reproduce it more loosely.
Those descendants that do better in the circumstances in which they find themselves (circumstances which may differ from the original ones) will be more likely to reproduce themselves into another generation and thus, probabilistically speaking, more likely to pass on their design data to their descendants.
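The selection process described above can be sketched as a small simulation: replicators copy their design with a small per-bit error rate, and the descendants that better fit the current environment are likelier to reproduce. The genome length, error rate, population size, and selection rule are all invented for illustration.

```python
import random

random.seed(0)

# A minimal sketch of selection on imperfect replicators.
# All parameters here are hypothetical.

ENV = [1] * 20            # the circumstances the machines must fit
ERROR_RATE = 0.02         # chance each bit is miscopied

def fitness(genome):
    return sum(g == e for g, e in zip(genome, ENV))

def copy_with_errors(genome):
    return [1 - g if random.random() < ERROR_RATE else g for g in genome]

population = [[0] * 20 for _ in range(50)]     # start poorly adapted
for generation in range(200):
    # the fitter half reproduces; each parent leaves two offspring
    population.sort(key=fitness, reverse=True)
    parents = population[:25]
    population = [copy_with_errors(p) for p in parents for _ in (0, 1)]

best = max(fitness(g) for g in population)
print(best)   # selection has pushed the population toward the environment
```

No individual machine changes; the population's designs drift toward the environment only because copying errors appear and better-fitting copies reproduce more.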
On the one hand, the `error' or `variation' aspect of reproduction is important, since it means that different circumstances are met by von Neumann machines with different capabilities. For natural selection to succeed, new instances with different capabilities must appear.
On the other hand, the amount of `error' or `variation' cannot be too great, since circumstances seldom change dramatically. Hence internal error correction mechanisms must operate.
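One internal error correction mechanism can be sketched as simple redundancy: keep several copies of the design data and take a majority vote at each position when copying. The genome, copy count, and error rate below are invented for illustration.

```python
import random

random.seed(1)

# A sketch of error correction by triple redundancy with majority voting.
# The genome, error rate, and trial count are hypothetical.

ERROR_RATE = 0.05

def noisy_copy(genome):
    return [1 - g if random.random() < ERROR_RATE else g for g in genome]

def corrected_copy(genome):
    """Copy three times independently, then majority-vote each bit."""
    copies = [noisy_copy(genome) for _ in range(3)]
    return [1 if sum(bits) >= 2 else 0 for bits in zip(*copies)]

genome = [0, 1] * 50
trials = 200
plain_errors = sum(a != b
                   for _ in range(trials)
                   for a, b in zip(genome, noisy_copy(genome)))
voted_errors = sum(a != b
                   for _ in range(trials)
                   for a, b in zip(genome, corrected_copy(genome)))
print(plain_errors, voted_errors)   # voting leaves far fewer errors overall
```

Voting does not eliminate errors, which is the point of the paragraph above: a designer tunes how much variation survives copying, from nearly none to quite a lot.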
Humans may not want machines with new capabilities, so the machines they design may have strong internal error correction mechanisms, no auto-variation mechanisms (no sex), and newly produced machines may be checked to see whether they pass quality assurance tests.
But without humans around, you may end up with an ecology like that described in James P. Hogan's 1983 science fiction novel Code of the Lifemaker.