Von Neumann: The Brain: The Problem of Memory within the Nervous System (The Computer and the Brain, 1958)

Most likely, the nervous system contains one or more memory organs. We don’t know what or where they are any more than the Greeks did, who believed memory resided in the diaphragm. We just know that if it exists, it must have a great capacity.

In a computing machine, memory size can be quantified. It has a maximum capacity, which can be expressed in bits. A memory that can hold a thousand letters has a capacity of 6,450 bits, for example.
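A quick sketch of where a figure like that comes from: each letter drawn from an alphabet of N equally likely symbols carries log2(N) bits. The 88-symbol alphabet below is my back-solved guess, not the book's:

```python
import math

# Capacity of an n-letter memory, assuming each letter is drawn from
# an alphabet of `symbols` equally likely characters (my assumption).
def capacity_bits(letters: int, symbols: int) -> float:
    return letters * math.log2(symbols)

print(capacity_bits(1000, 88))  # ~6459 bits, close to the 6,450 quoted above
```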

How much? Assuming a 60-year human lifespan, some 10^10 neurons, each able to receive 14 distinct digital impressions per second, and that we never truly forget things—we just focus away from them—lands us at around 35 million terabytes of data stored in the brain (aka 2.9 billion iPhones).
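Here's a rough reconstruction of that arithmetic (the neuron count and impression rate are von Neumann's; the conversion to terabytes is just my sanity check):

```python
neurons = 10**10                        # von Neumann's estimate
impressions_per_sec = 14
seconds = 60 * 365 * 24 * 3600          # ~60-year lifespan, ~1.9e9 seconds

bits = neurons * impressions_per_sec * seconds
terabytes = bits / 8 / 1e12
print(f"{terabytes:.2e} TB")            # ~3.3e7 TB, roughly the 35 million quoted
```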

What is the physical embodiment of memory? One proposal is that it’s the variability of stimulation criteria—that is, the threshold of stimulation changes depending on frequency of the cell’s use. Another proposal is based on distribution of axons connecting cells—in disuse, an axon becomes ineffective over time, while in frequent use, a stimulation is facilitated by a lower threshold over a given path. Another proposal is genetic memory—chromosomes and their genes have memory elements, so perhaps this is the case in an expanded sense. There are many other suggestions also.

“Systems of nerve cells, which stimulate each other in various possible cyclical ways, also constitute memories”—this would go hand-in-hand with the “strange loops” of Gödel, Escher, Bach. Likewise, vacuum-tube machines can do the same via “flip-flops”.
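A minimal sketch of the flip-flop idea, in Python rather than hardware (the function names are mine): two cross-coupled NOR gates hold a bit purely through their mutual stimulation loop.

```python
# An SR latch built from two cross-coupled NOR gates: a cycle of
# "organs" stimulating each other that stores one bit.
def nor(a: int, b: int) -> int:
    return int(not (a or b))

def sr_latch(s: int, r: int, q: int, q_bar: int, steps: int = 4):
    for _ in range(steps):                  # iterate until the loop settles
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, q_bar = sr_latch(s=1, r=0, q=0, q_bar=1)      # set: q -> 1
q, q_bar = sr_latch(s=0, r=0, q=q, q_bar=q_bar)  # hold: q stays 1
print(q, q_bar)                                  # 1 0
```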

But we have good reason to believe that the active organs do not also function as the memory organs. That’s how early computers (the ENIAC) began, with small memory components; over time, memory components have become larger and “technologically entirely different” from active organs.

Von Neumann: The Brain: Stimulation Criteria (The Computer and the Brain, 1958)

This is just a picture of some cows. Has nothing to do with this post.

Neurons function as basic logical organs, and basically digital organs: if a neuron requires only one incoming pulse (stimulator) to produce a response, then it is an OR organ; if it requires two incoming pulses, then it is an AND organ. These two, along with negation (“no”), can be combined in various ways into any complex logical operation.
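To make that concrete, here's a toy threshold neuron (my own sketch, not the book's notation): the same organ acts as OR or AND depending only on its threshold.

```python
# A neuron as a threshold organ over incoming pulses.
def neuron(inputs, threshold):
    return int(sum(inputs) >= threshold)

or_gate  = lambda a, b: neuron([a, b], threshold=1)  # one pulse suffices
and_gate = lambda a, b: neuron([a, b], threshold=2)  # both pulses required
not_gate = lambda a: 1 - a   # "no" requires inhibition; this is a stand-in

print(or_gate(0, 1), and_gate(0, 1), not_gate(1))    # 1 0 0
```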

It’s not as simple as this, though: a neuron may have hundreds of synapses connecting it to other cells, perhaps even to one other cell, receiving an enormous number of pulse combinations; to further complicate things, pulses may be characterized not only by frequency, but also by their spatial relations to one another.
Therefore, while there is a stimulation requirement, that is, a threshold, it may be simple, or it may be very complicated. And in the case of receptors, being neurons that respond to stimuli, there may be more than a simple threshold. Further: “if the nerve cell is activated by the stimulation of certain combinations of synapses on its body and not by others, then the significant count of basic active organs must presumably be a count of synapses rather than of nerve cells.”

Von Neumann: The Brain: The Nature of Nerve Impulses (The Computer and the Brain, 1958)

Stimulation of a nerve cell is similar to two digital markers: 0 in the absence, 1 in the presence, of an electrical impulse. This is a high-level description of the more conspicuous aspects of nerve impulses—with nuance, the digital qualities are less clear.
“Natural componentry favors automata with more, but slower, organs, while the artificial one favors the reverse arrangement of fewer, but faster organs.” Thus “the human nervous system will pick up many logical or informational items, and process them simultaneously,” while a computer “will be more likely to do things successively. . . or at any rate not so many things at a time.” The nervous system is parallel, while computers are serial. But the two cannot always be substituted for one another—some calculations must be done serially, each step following the one before it, while other calculations that are natural in parallel carry immense memory requirements if done serially.

Von Neumann: The Computer: Characteristics of Modern Digital Machines (The Computer and the Brain, 1958)

Comprises “active” and “memory” organs (he’s including “input” and “output” as part of “memory”).

Active organs perform basic logical actions (sensing coincidences and anticoincidences, combining stimuli) and regenerate pulses, maintaining pulse shape and timing via amplification of the signals.

These functions were performed by (in historical succession): relays, tubes, crystal diodes, ferromagnetic cores, transistors, or by combinations of those.

A modern machine will contain 3,000-30,000 active organs, of which 300-2,000 are dedicated to arithmetic, and 200-2,000 to memory. Memory organs require further organs to service and administer them—the memory parts of the machine being around 50% of the whole machine.

Memory organs are classed by their “access time”—the time to store a number, removing the number previously stored, and the time to ‘repeat’ the number upon ‘questioning’ (that is, write/read times, respectively). To classify the speed, you could either take the larger of those two times, or the average of them. If the access time doesn’t depend on the memory address, it is called “random access” (RAM).

Memory registers can be built of active organs—which, while fastest, are also most expensive (e.g., built out of vacuum tubes). Thus, for large-memory operations, this is cost-prohibitive. Previously, relays were used as the active organs, and relay registers were used as the main form of memory.

It is possible, however, to reduce the memory required to solve a problem by considering not the total numbers needed in memory, but the minimum needed in memory at any given time. And if that can be determined, numbers can be distributed between faster memory and slower memory, based on when they are needed—that is, perhaps all the numbers can be stored in the slower memory, while the numbers needed at the moment are held in the faster memory. I assume this is how computers now function—everything is stored on the hard drive, while the things absolutely necessary to the current operations are held in RAM.
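Here's a small sketch of that two-tier idea (the class, the LRU policy, and the sizes are all my own illustration, not anything from the book): everything lives in the slow store, while a small fast tier holds whatever was needed most recently.

```python
from collections import OrderedDict

class TwoTierMemory:
    def __init__(self, fast_capacity: int):
        self.slow = {}                # large, slow store ("hard drive")
        self.fast = OrderedDict()     # small, fast store ("RAM")
        self.fast_capacity = fast_capacity

    def write(self, addr, value):
        self.slow[addr] = value

    def read(self, addr):
        if addr in self.fast:         # fast hit
            self.fast.move_to_end(addr)
            return self.fast[addr]
        value = self.slow[addr]       # slow fetch, then promote
        self.fast[addr] = value
        if len(self.fast) > self.fast_capacity:
            self.fast.popitem(last=False)   # evict least recently used
        return value

mem = TwoTierMemory(fast_capacity=2)
mem.write("a", 1); mem.write("b", 2); mem.write("c", 3)
mem.read("a"); mem.read("b"); mem.read("c")   # "a" falls out of the fast tier
```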

Magnetic drums and tapes are currently (1950s) in use, while magnetic discs are being explored (and now, 2015, becoming obsolete in favor of SSDs).

Inputs are punched cards or paper tapes, outputs are printed or punched paper—that is, means for the machine to communicate with the outside world.

Words are saved directly to specific numerical addresses within the memory of the machine—an address is never ambiguous.

Von Neumann: The Computer: Mixed Numerical Procedures (The Computer and the Brain, 1958)

Some machines are combinations of digital and analog parts that may have a common, or may have separate, controls. In the case of separate controls, the controls must communicate with each other, and this requires D/A and A/D organs. D/A means building up a continuous quantity from the digital expression (e.g., from the digital value 10, build up 10 V of analog signal), while A/D means measuring a continuous quantity for digital expression.
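A toy version of those two conversion organs (the one-volt-per-unit scale is my assumption, echoing the analog example later in these notes):

```python
VOLTS_PER_UNIT = 1.0  # assumed scale: one volt per digital unit

def d_to_a(n: int) -> float:
    """Build up a continuous quantity from a digital value."""
    return n * VOLTS_PER_UNIT

def a_to_d(volts: float) -> int:
    """Measure a continuous quantity back into a digital value."""
    return round(volts / VOLTS_PER_UNIT)

print(d_to_a(10))    # 10.0 volts
print(a_to_d(9.7))   # 10
```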

Possible description of floating-point resolution using a “pulse density” system, where a number is more or less precise depending on how large it is?

Von Neumann: The Computer: Logical Control (The Computer and the Brain, 1958)

Logical Control is essentially how one determines the fixed order in which to perform logical functions to produce the desired result. This is programming the machine.

So you have the numerical data that is input, you have the organs that perform basic mathematical functions, you have instructions as to the order in which those functions are performed, and you then have the output data.

Beyond being able to perform basic functions, the machine must also have the capacity to perform a sequence/logical pattern to solve mathematical problems, which is actually the goal in the first place. So in analog machines there must be enough components that can perform basic functions as needed for the calculation. They must be connected to each other in a fixed setting for the duration of solving the problem.

It is possible to have only one component for each basic function, but this then requires that there be another organ in the machine to remember numbers—it must be able to remember a number, replace the number in its memory, and also repeat it upon being questioned. Such an organ is called a ‘memory register,’ the totality of such organs is the ‘memory,’ and this arrangement underlies the main ‘mode of control’ for digital machines.

There are a number of possibilities for controlling this sequence through an iterative approach—I’m assuming through a series of IF/THEN sorts of controls.

In “memory-stored control,” rather than having physical controls, the controls are stored in the machine’s memory—essentially, the “program” is no longer a physical thing (e.g., a punch card), but can be stored in the same way that numbers are stored—within the machine itself.

What makes this even more remarkable is that the numbers and programs can then begin manipulating themselves from within the machine.
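Here's a toy stored-program machine that tries to capture both points: the conditional (IF/THEN-ish) control guessed at above, and the fact that instructions live in the same memory as data and could, in principle, rewrite themselves. The instruction set (SET/ADD/JMPNZ/HALT) is entirely my invention, not von Neumann's notation:

```python
def run(memory):
    pc = 0  # program counter
    while True:
        op, a, b = memory[pc]
        if op == "SET":
            memory[a] = ("DATA", b, 0)
        elif op == "ADD":
            memory[a] = ("DATA", memory[a][1] + b, 0)
        elif op == "JMPNZ":                  # conditional transfer of control
            pc = b if memory[a][1] != 0 else pc + 1
            continue
        elif op == "HALT":
            return memory
        pc += 1

# Count down from 3; instructions and data share one address space,
# so an ADD aimed at addresses 0-3 would modify the program itself.
mem = {
    0: ("SET", 4, 3),      # mem[4] <- 3
    1: ("ADD", 4, -1),     # mem[4] <- mem[4] - 1
    2: ("JMPNZ", 4, 1),    # loop while mem[4] != 0
    3: ("HALT", 0, 0),
    4: ("DATA", 0, 0),
}
print(run(mem)[4])         # ('DATA', 0, 0)
```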

These types of controls can be combined, so that a memory-stored control could consist of orders that reset plugged (physical) controls, change plugged control setups, and hand control of the machine over to the plugged “regime.” By the same token, the plugged “scheme” should be able to hand control of the machine back to the memory-stored control scheme at some point in its sequence.

Von Neumann: “Analog Procedure” and “Digital Procedure” (The Computer and the Brain, 1958)

Analog Procedure

Two different types of computers: analog and digital. In analog, numbers are represented by physical quantities of something that, when measured, correspond to the number in question. For instance, voltage: you push 1 volt through, and it’s measured as equaling the number 1. From there, components compute basic arithmetic (+, −, ×, ÷).

Digital Procedure

On digital machines, numbers are represented as decimal digits, the same as when we write them. Each decimal digit is represented by a system of ‘markers.’ A marker could be an electric signal in a wire, or its absence, for example. So, this marker can have two states (present, not present).

1 marker = 0, 1 = 2 combinations (not enough)

2 markers = 00, 01, 10, 11 = 4 combinations (not enough)

3 markers = 000, 001, 010, 011, 100, 101, 110, 111 = 8 combinations (not enough)

4 markers = 16 combinations (more than enough!)

So, for each decimal digit, a minimum of four markers must be used. Markers could be a group of four wires, where an electric pulse is present or not present; or electric pulses with positive or negative polarity; or other systems. These pulses would be controlled by various gates. Another option is a ten-valued marker represented, say, by ten wires, each being either on or off! Overkill, but it works.
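The four-marker minimum is just a logarithm; a one-liner confirms it (the code is mine, obviously not 1958-vintage):

```python
import math

markers = math.ceil(math.log2(10))  # two-state markers per decimal digit
print(markers, 2 ** markers)        # 4 16 -> 16 combinations for 10 digit values
```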

Vacuum-tube and transistor machines are types of digital machines.

When a number is passed around the machine, its digits can be transmitted either simultaneously (in parallel) or one after another (in serial).

The author expresses these in binary, and I do not particularly understand what he’s talking about. From what I gather, the point of his discussion is that the basic operations (+, −, ×, ÷) are performed through lengthy, repetitive logical processes. We’re so used to doing these calculations in our heads and on paper that the complex nature of the processes is obscured.
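As one illustration of what those lengthy, repetitive logical processes might look like (my sketch, not von Neumann's binary treatment): even addition reduces to a long chain of single-bit logical steps, as in this ripple-carry adder.

```python
# Add two binary numbers, least significant bit first, using only
# AND/OR/XOR on single bits plus a carried bit.
def add_binary(a: list[int], b: list[int]) -> list[int]:
    out, carry = [], 0
    for x, y in zip(a, b):
        out.append(x ^ y ^ carry)               # sum bit
        carry = (x & y) | (carry & (x ^ y))     # carry bit
    return out + [carry]

# 6 + 3 = 9: [0, 1, 1] is 6 LSB-first, [1, 1, 0] is 3.
print(add_binary([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1] -> 9
```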