From neurobiology to silicon

Chris Diorio

My goal is to build electronic systems that employ the computational and organizational principles used in the nervous systems of living organisms. Nervous systems solve, in real time, ill-posed problems in image and speech processing, motor control, and learning; they do so in ways that we, as scientists and engineers, do not understand. There are fundamental principles that we can learn from neurobiology about a different and—on poorly conditioned data—vastly more efficient form of computation.

I believe that there is nothing that is done in the nervous system that we cannot emulate with electronics, once we understand the principles of neural information processing. Although nervous tissue solves problems that we do not know how to solve, it does so using an underlying device physics that we know and understand. A similar device physics underlies the semiconductor electronics that we employ to build our digital computers.

In both integrated circuits and nervous tissue, information is manipulated principally on the basis of charge conservation. In semiconductor electronics, electrons are in thermal equilibrium with their surroundings; their energies are Boltzmann distributed. In nerve tissue, ions are in thermal equilibrium with their surroundings; their energies also are Boltzmann distributed. In semiconductor electronics, we erect energy barriers to contain the electronic charge, by using the work-function difference between silicon and silicon dioxide, or the energy barrier in a pn junction. The nervous system erects similar energy barriers to contain its ionic charge, by using lipid membranes in an aqueous solution. In both systems, when the height of the energy barrier is modulated, the resulting current flow is an exponential function of the applied voltage. Both systems use this principle to produce devices that exhibit signal gain. Transistors use populations of electrons to change their channel conductance, in much the same way that neurons use populations of ionic channels to change their membrane conductance.
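
To make this shared exponential law concrete, here is a minimal Python sketch. The subthreshold MOS expression and the two-state Boltzmann channel gate are standard textbook forms consistent with the argument above; the parameter values (i0, kappa, v_half, gating_charge) are illustrative assumptions, not measured device data.

    # A minimal numerical sketch of the shared exponential barrier physics.
    # The subthreshold MOS law and the Boltzmann channel gate are standard
    # forms; i0, kappa, v_half, and gating_charge are illustrative values.
    import math

    UT = 0.0258   # thermal voltage kT/q at ~300 K, in volts

    def subthreshold_mos_current(v_gs, i0=1e-15, kappa=0.7):
        """Gate voltage lowers the channel's energy barrier, so drain
        current grows exponentially: I = i0 * exp(kappa * v_gs / UT)."""
        return i0 * math.exp(kappa * v_gs / UT)

    def channel_open_fraction(v_m, v_half=-0.040, gating_charge=4):
        """Boltzmann open probability of a voltage-gated ionic channel:
        the same exponential statistics, expressed as a two-state gate."""
        return 1.0 / (1.0 + math.exp(-gating_charge * (v_m - v_half) / UT))

    # In both devices a small voltage swing produces a large conductance
    # change -- the origin of signal gain in each medium.
    for v in (0.30, 0.36, 0.42):
        print("Vgs = %.2f V -> Id = %.3e A" % (v, subthreshold_mos_current(v)))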

I believe that the disparity between the computations that can be done by a digital computer and those that can be done by the nervous system is a consequence not of a different underlying physics, but rather of a different computational model based upon a similar underlying physics. The state variables in both electronic and nervous systems are analog. They are represented in electronic systems by electric charge, and in nervous systems by electric charge or by chemical concentrations. The mechanisms by which each system manipulates its state variables to do computation, however, are vastly different. In a digital computer, we ignore most of the available states in favor of the two binary-valued endpoints: We achieve noise immunity at the expense of dynamic range. The nervous system retains the analog dynamic range, achieving noise immunity by adjusting the signal-detection threshold adaptively. Digital machines quantize their analog inputs, and use restoring logic at every computational step. Nervous systems perform primarily analog computations, and quantize the computed result.
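
A toy numerical contrast of the two models (my own illustration; the stage count and noise level are arbitrary): a digital chain restores its signal to a rail after every stage, whereas an analog chain lets noise accumulate and quantizes only the final result.

    # A toy contrast between restoring (digital) and non-restoring (analog)
    # signal chains. Every stage adds noise; the digital chain snaps its
    # signal back to a rail after each stage, while the analog chain
    # accumulates noise and quantizes only at the end. Stage count and
    # noise level are arbitrary illustrative choices.
    import random

    random.seed(1)
    STAGES, NOISE = 20, 0.05

    def restore(x):
        # Restoring logic: collapse the analog value to a binary rail.
        return 1.0 if x > 0.5 else 0.0

    digital = analog = 1.0
    for _ in range(STAGES):
        digital = restore(digital + random.gauss(0, NOISE))
        analog = analog + random.gauss(0, NOISE)

    print("digital, restored every stage:", digital)
    print("analog, noise accumulated:   ", round(analog, 3))
    print("analog, quantized at the end:", restore(analog))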

Unfortunately, we do not know what computational primitives neural systems use, how they represent information, or what their organizing principles are. However, because semiconductor electronics allows us to apply, at a high level of integration, a device physics similar to that used by neural tissue, I conclude that we should be able to build electronic circuits that mimic the computational primitives of nervous systems, and that we should be able to use these circuits to explore the organizational principles employed by neurobiology. I call the approach silicon neuroscience: the development of neurobiologically inspired silicon learning systems.

My predecessors began these investigations by modeling two of the sensory organs available to neural systems: the retina and the cochlea [1, 2, 3, 4]. The silicon retina and cochlea are now well developed and mimic some of the sensory preprocessing performed by living organisms. Other researchers have begun modeling the motor-control systems employed by living organisms [5, 6]. My colleagues and I are now beginning to model what is perhaps the most remarkable aspect of living organisms: their ability to adapt and to learn.

The nervous system has mechanisms for long-term memory and for learning, including synaptic plasticity and neuronal growth [7]. Semiconductor electronics also has mechanisms for long-term memory—in particular, nonvolatile EEPROM devices. We have adapted the floating-gate technology used in digital EEPROM devices to allow nonvolatile analog storage and to perform a local learning function, and we have done so using a standard CMOS process. We have developed a family of single-transistor devices that we call synapse transistors [8–11]; these devices, like neural synapses, implement long-term nonvolatile analog memory, allow bidirectional memory updates, and can learn from an input signal without interrupting the ongoing computation. Our synapse transistors also compute the product of their stored analog memory and the applied input. Although we do not believe that a single device can model the complex behavior of a neural synapse completely, our synapse transistors do implement a local learning function.
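
The sketch below is a behavioral abstraction of these three properties: nonvolatile analog storage, a weight-input product, and bidirectional updates that run concurrently with the computation. It is not the device physics from [8–11]; the class structure, learning rate, and multiplicative update rule are illustrative stand-ins for the actual tunneling and injection mechanisms.

    # A behavioral sketch of a synapse transistor (an abstraction, not the
    # device equations from [8-11]): a nonvolatile analog weight, an output
    # proportional to weight times input, and bidirectional weight updates
    # that proceed without interrupting the computation. The learning rate
    # and multiplicative update rule are illustrative choices.
    class SynapseTransistor:
        def __init__(self, weight=1.0, rate=0.01):
            self.weight = weight   # models stored floating-gate charge
            self.rate = rate       # magnitude of one learning event

        def compute(self, x):
            # Output ~ stored analog memory times the applied input.
            return self.weight * x

        def update(self, direction):
            # Bidirectional nonvolatile update: direction = +1 or -1
            # stands in for tunneling versus hot-electron injection.
            self.weight *= (1.0 + self.rate * direction)

    syn = SynapseTransistor(weight=0.5)
    out = syn.compute(2.0)   # computation...
    syn.update(+1)           # ...and adaptation, without interruption
    print(out, syn.weight)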

Neurons are the nervous system’s primary computing elements. A typical neuron is markedly unlike a typical logic gate; it possesses on average 10,000 synaptic inputs, and a similar number of outputs. Its stored memory is contained in the pattern and strength of the analog synapses that connect it to other neurons. Nervous systems use vast numbers of synapses to effect their computations: In neocortical tissue, the synapse density is roughly 3×10^8 synapses per cubic millimeter [12].
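
Taken together, these two figures imply a neuron density on the order of tens of thousands per cubic millimeter; a one-line check, assuming both quoted values:

    # A rough consistency check, assuming both figures quoted above:
    synapses_per_mm3 = 3e8     # neocortical synapse density [12]
    inputs_per_neuron = 1e4    # average synaptic inputs per neuron
    print(synapses_per_mm3 / inputs_per_neuron)   # ~3e4 neurons per mm^3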

We will use vast numbers of silicon synapses to model nervous tissue: Synapse-transistor arrays, fabricated in a standard CMOS process, afford a high device density, mimic the low power consumption of neural synapses, and perform both parallel computation and local adaptation. Like neural tissue, arrays of silicon synapse transistors simultaneously and in parallel perform an analog computation and update their nonvolatile analog weight values.
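
A schematic sketch of what such an array computes (the array size, inputs, and simple correlational update rule are illustrative choices of mine, not our actual circuits): every device forms the product of its stored weight and its column input at once, row sums give the parallel analog computation, and each device adapts its weight from locally available signals while computing.

    # A schematic sketch of a synapse-transistor array; sizes, inputs, and
    # the correlational update rule are illustrative assumptions.
    rows, cols = 2, 3
    W = [[0.2, 0.5, 0.1],
         [0.4, 0.3, 0.6]]      # stored nonvolatile analog weights
    x = [1.0, 0.0, 2.0]        # applied inputs
    rate = 0.01

    # Parallel computation: every synapse forms weight * input at once;
    # summing along a row models current summation on a shared wire.
    outputs = [sum(W[r][c] * x[c] for c in range(cols)) for r in range(rows)]

    # Local adaptation: each synapse updates from signals available at its
    # own terminals (here a simple input-output correlation), in parallel
    # with the ongoing computation.
    for r in range(rows):
        for c in range(cols):
            W[r][c] += rate * x[c] * outputs[r]

    print(outputs, W)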

I believe that if we can understand the principles on which biological information-processing systems operate, then we can build circuits and systems that deal naturally with real-world data. My goal, therefore, is to consider the computational principles on which neural systems operate, and to model and understand those principles in the silicon medium.

1    M. Mahowald and C. Mead, "The silicon retina," Scientific American, vol. 264, no. 5, pp. 76–82, 1991.
2    K. Boahen, "A retinomorphic vision system," IEEE Micro, vol. 16, no. 5, pp. 30–39, 1996.
3    R. F. Lyon and C. A. Mead, "The cochlea," in Analog VLSI and Neural Systems, pp. 279–302, Addison-Wesley, Reading, MA, 1989.
4    R. Sarpeshkar, R. F. Lyon, and C. A. Mead, "An analog VLSI cochlea with new transconductance amplifiers and nonlinear gain control," Proceedings of the 1996 IEEE International Symposium on Circuits and Systems, Atlanta, GA, vol. 3, pp. 292–295, 1996.
5    S. DeWeerth, L. Nielsen, C. Mead, and K. Astrom, "A simple neuron servo," IEEE Trans. Neural Networks, vol. 2, no. 2, pp. 248–251, 1991.
6    T. Horiuchi, T. Morris, C. Koch, and S. DeWeerth, "Analog VLSI circuits for attention-based visual tracking," in Advances in Neural Information Processing Systems 9, pp. 706–712, MIT Press, 1997.
7    C. Koch, "Computation and the single neuron," Nature, vol. 385, no. 6613, pp. 207–210, 1997.
8    C. Diorio, P. Hasler, B. A. Minch, and C. Mead, "A complementary pair of four-terminal silicon synapses," Analog Integrated Circuits and Signal Processing, vol. 13, no. 1/2, pp. 153–166, 1997.
9    C. Diorio, P. Hasler, B. A. Minch, and C. Mead, "A single-transistor silicon synapse," IEEE Trans. Electron Devices, vol. 43, no. 11, pp. 1972–1980, 1996.
10  P. Hasler, C. Diorio, B. A. Minch, and C. Mead, "Single transistor learning synapses," in Advances in Neural Information Processing Systems 7, pp. 817–824, MIT Press, 1995.
11  P. Hasler, C. Diorio, B. A. Minch, and C. Mead, "Single transistor learning synapses with long term storage," IEEE Intl. Symp. on Circuits and Systems, vol. 3, pp. 1660–1663, 1995.
12  R. Douglas, "Rules of thumb for neuronal circuits in the neocortex," Notes for the Neuromorphic aVLSI Workshop, Telluride, CO, 1994.

