
12 Jun 2009

The Wheel is Come Full Pixel: Analog against Digital; Bostrom and Sandberg's Brain Emulation, Examined and Critiqued. Section 4


by Corry Shores

[Central Entry Directory]
[Posthumanism, Entry Directory]
[Other entries in this paper series.]

[The following is tentative material for my presentation at the Society for Philosophy & Technology Conference this summer.]





Corry Shores


Do Posthumanists Dream of Pixilated Sheep?

Bostrom and Sandberg's Brain Emulation,

Examined and Critiqued


Section 4:


The Wheel is Come Full Pixel:

Analog against Digital



Digital technologies have largely replaced analog ones. These two sorts of quantity-representation reside in contrary worlds: the continuous versus the discrete. An abacus is digital. It computes one discrete value or another, but not the ones in between. When we count to two on our fingers, meaningless empty space spans between our digits, so spreading our fingers further apart does not change their value. Slide-rules, however, are analog. One ruler slides against another continuously, so it may calculate any possible real-number value along the continuum. It could potentially compute and display irrational numbers like pi or the golden ratio. However, a digital computer attempting this would never finish carrying over into further digit places. So in a sense, analog computers have more computing potential for certain applications. In fact, Hava Siegelmann argues that analog is capable of a hyper-computation that no digital computer could possibly accomplish. (Siegelmann 109)
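
To make this concrete, consider a small Python sketch of my own (merely illustrative; it is not drawn from Siegelmann or from Bostrom & Sandberg). However high we set the working precision, a digital representation of the golden ratio must always stop at some finite number of digit places.

from decimal import Decimal, getcontext

# The golden ratio, (1 + sqrt(5)) / 2, never resolves into finitely many digits.
# Raising the working precision only pushes the cutoff further out; the digital
# representation must always terminate somewhere.
for digits in (10, 30, 60):
    getcontext().prec = digits
    print((1 + Decimal(5).sqrt()) / 2)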

Nelson Goodman’s oft-cited terminology clarifies the difference. He distinguishes density from differentiation, and continuity from discreteness. Analogical values are placed along a continuous scale. Between any two values there is a third: this is the scale's density. It implies that all readings are approximations. For, there can be no pinpoint determination, only ever more precise possible readings. Digital, however, uses discrete units, and thus each value is perfectly differentiated from the others (Goodman 160-161).
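
We might render Goodman's density in a quick sketch of my own: between any two readings on a dense scale there is always another reading, so no determination is ever final.

from fractions import Fraction

# Density in Goodman's sense: between any two values there is always a third,
# so every reading admits of a still more precise one.
lo, hi = Fraction(1, 3), Fraction(1, 2)
for _ in range(5):
    mid = (lo + hi) / 2      # a new value strictly between the two
    print(mid)
    hi = mid                 # narrowing the interval never exhausts it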

According to James Moor,

in a digital computer information is represented by discrete elements and the computer progresses through a series of discrete states. In an analogue computer information is represented by continuous quantities and the computer processes information continuously. (Moor 217)

Our emulated brain will receive simulated sense-signals. Does it matter if they are digital rather than analog? Many audiophiles swear by the unsurpassable superiority of analog. It might be less precise, but it is always like natural sound waves. Digital, even as it becomes more accurate, still sounds to them artificial or cartoon-like. In other words, there might be a qualitative difference to how we experience analog and digital stimuli, even though it might take a person with extra sensitivities to bring this difference to our explicit awareness.

And if the continuous and discrete are such polar realities, then perhaps a brain computing in analog would experience a qualitatively different feel of consciousness than if the brain were instead computing in digital.

A relevant property of an audiophile’s brain is its ability to discern analog from digital, and prefer one to the other. But a digital emulation of the audiophile’s brain might not be able to share its appreciation for analog. And perhaps digital emulations might even produce a mental awareness quite foreign to what humans normally experience.

Bostrom and Sandberg's brain emulation exclusively uses digital computation. But they acknowledge that some argue that analog and digital are qualitatively different. And the authors even admit that implementing analog in brain emulation could present profound difficulties. (39) But there is no need to worry, they think. And they give their reasons why the qualitative difference is irrelevant.

They first argue that brains are made of discrete atoms. These must obey quantum mechanical rules that force the atoms into discrete energy states. Moreover, these states could be limited by a discrete space-time. (38) The debate over whether the world is continuously divisible goes back at least to Zeno of Elea. Perhaps quantum physics has finally settled this issue, at least as far as it concerns us here. I confess I am unable to judge whether and how this might be so. Let's presume it is. Still, I am uncertain that digital technologies will be able to compute such tiny quantum-scale variations. This is where analog already has the edge.

Yet their next argument calls even that notion into question. They make what is called "the argument from noise." Analog devices always take some physical form. It is unavoidable that interferences and irregularities, called noise, will make the analog device imprecise. So analog might be capable of taking on an infinite range of variations, yet it will never be absolutely accurate, because noise always causes it to veer off slightly from where it should be. Digital has its own inaccuracies: it always misses the values lying between its discrete steps. Nonetheless, digital is improving. Little by little it comes to handle finer gradations; it is filling in its gaps. Digital will never be completely dense like analog. But soon the span between its discrete values will be no wider than the range across which analog veers off from its proper values. Digital's blind spots and analog's drunken swerve will miss the same range of variation. So we need only wait for digital technology to improve enough that it can compute the same values with equivalent precision. Both will be equally inaccurate, but each for its own reason.
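
We might give this argument a rough numerical gloss (the step size and noise level below are my own illustrative assumptions, not figures from Bostrom & Sandberg): once the digital step is about as small as the analog device's noise, both readings miss the true value by a comparable amount, each for its own reason.

import random

def digital_reading(value, step):
    # Snap to the nearest representable discrete value (quantization error).
    return round(value / step) * step

def analog_reading(value, noise):
    # Continuous reading perturbed by physical noise (drift, interference).
    return value + random.gauss(0.0, noise)

random.seed(0)
step = noise = 0.001   # comparable digital resolution and analog noise level (assumed)
true_values = [random.uniform(0.0, 1.0) for _ in range(10000)]

digital_error = sum(abs(digital_reading(v, step) - v) for v in true_values) / len(true_values)
analog_error = sum(abs(analog_reading(v, noise) - v) for v in true_values) / len(true_values)

print(digital_error)   # roughly step / 4
print(analog_error)    # roughly noise * 0.8; the same order of magnitude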

But perhaps the argument from noise reduces the analog/digital distinction to a quantitative difference rather than a qualitative one. And analog is so prevalent in neural functioning that we should not so quickly brush it off.

First note that our nervous system’s electrical signals are discrete pulses, like Morse code. In that sense they are digital. However, the frequency of the pulses can vary continuously (Jackendoff 33). For, the interval between two impulses can take any value. (Müller et al. 5b)
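
A small sketch of my own may help here (the numbers are arbitrary): each pulse is all-or-none, yet the rate, and so the interval between pulses, can take any real value.

def spike_times(stimulus_intensity, duration):
    # A regular spike train whose rate tracks the stimulus: each spike is
    # all-or-none, but the inter-spike interval is a continuous quantity.
    interval = 1.0 / stimulus_intensity
    times, t = [], 0.0
    while t < duration:
        times.append(round(t, 6))
        t += interval
    return times

print(spike_times(10.0, 0.5))      # 10 impulses per second
print(spike_times(10.7316, 0.5))   # a slightly different, equally legitimate rate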

This applies as well to our sense signals. As the stimulus varies continuously, the signal's frequency and voltage change proportionally. (Marieb & Hoehn 401) Recent research suggests that the signal's amplitude is also graded and hence analog. (McCormick et al., abstract) Also consider that our brains learn by adjusting the weight, or computational significance, of certain signal channels. A neuron's signal inputs are summed, and when their sum reaches a specific threshold, the neuron fires its own signal, which then travels to other neurons, where the process is repeated. Another way neurons adapt is by altering this input threshold. Both these adjustments may take on a continuous range of values. Hence analog computation is fundamental to learning. (Mead 353-354)
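
The mechanism just described can be sketched minimally (the numbers are mine, chosen only for illustration): inputs are weighted and summed, and the neuron fires once the sum crosses a threshold. Both the weights and the threshold are continuous quantities, so adjusting them is, in this sense, an analog process.

def fires(inputs, weights, threshold):
    # Weighted sum of incoming signals; the neuron fires when it crosses the threshold.
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return weighted_sum >= threshold

inputs = [0.9, 0.2, 0.5]
weights = [0.40, 0.10, 0.30]          # weights may take any real values
print(fires(inputs, weights, 0.50))   # True: 0.36 + 0.02 + 0.15 = 0.53

# Learning can nudge a weight, or the threshold itself, by an arbitrarily small amount.
weights[0] -= 0.0001
print(fires(inputs, weights, 0.50))   # still True; the adjustment was continuously graded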

Fred Dretske gives reason to believe that our memories store information in analog. We may watch the setting sun and observe intently as it finally passes below the horizon. Yet we do not know that the sun has set until we convert the fluid continuum of sense impressions into concepts. These are discrete units of information, and are thus digital. (Dretske 142) We might later find ourselves in a situation where it is relevant to determine what we were doing just before the sun completely set. To make this assessment, we would need to recall our experience of the event and re-adjust our sensitivities for a new determination, or as Dretske writes, “as the needs, purposes, and circumstances of an organism change, it becomes necessary to alter the characteristics of the digital converter so as to exploit more, or different, pieces of information embedded in the sensory structures.” (Dretske 143) In other words, because we can always go back into our memories to make more and more precise determinations, we must somehow be recording sense data in analog, Dretske argues.
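
Dretske's picture can be loosely simulated (the record and the "converters" below are entirely my own invention): so long as the stored record is fine-grained enough, we can return to it with a different digital converter and extract discrete facts we did not draw out at the time.

# A finely sampled record of the sun's height above the horizon (in degrees).
sunset_record = [(t / 100.0, 2.0 - 0.05 * t) for t in range(60)]

def convert(record, predicate):
    # "Digitize" the stored record into discrete facts with a chosen converter.
    return [(time, predicate(height)) for time, height in record]

# The fact we registered at the time: has the sun set yet?
has_set = convert(sunset_record, lambda h: h <= 0.0)
print(next(time for time, flag in has_set if flag))    # moment it dipped below the horizon

# Later, a new purpose demands a finer fact from the very same record.
nearly_set = convert(sunset_record, lambda h: 0.0 < h <= 0.25)
print([time for time, flag in nearly_set if flag])     # the moments just before it set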

Bostrom & Sandberg make another computational assumption. They argue that no matter what the brain computes, a digital (Turing) computer could theoretically accomplish the same operation. (Bostrom & Sandberg 7) But note that we are emulating the brain’s dynamics. And according to Terence Horgan, such dynamic systems use “continuous mathematics rather than discrete.” (Horgan 19)

It is for this reason that Whit Schonbein claims analog neural networks would have more computational power than digital computers. (Schonbein 61) Continuous systems make use of "infinitely precise values" that can "differ by an arbitrarily small degree." (Schonbein 60d) And yet, like Bostrom & Sandberg, Schonbein critiques analog using the argument from noise. He says that analog computers are more powerful only in theory. As soon as we build them, noise from the physical environment diminishes their accuracy. (65-66) Curiously, he concludes that we should not for that reason dismiss analog. He says that analog neural networks, “while not offering greater computational power, may nonetheless offer something else.” But he leaves it for another effort to say exactly what the unique value of analog computation would be. (68cd)

A.F. Murray's research on neural-network learning supplies an answer. Analog noise interference is significantly more effective than digital noise at aiding adaptation (Murray 1547). Being "wrong" allows neurons to explore new possibilities for weights and connections. This enables us to learn and adapt to a chaotically changing environment. Digitally simulated neural noise might therefore be inadequate. Analog is better. For, it affords our neurons an infinite array of alternate configurations (1547-1548). Hence, in response to Bostrom and Sandberg's argument from noise, I propose the argument for noise.
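
To give the argument for noise a toy rendering (this is my own construction, only loosely inspired by the kind of noise-injected training Murray describes, not his actual experiments): perturbing a continuous weight with noise lets a simple learner settle arbitrarily close to the best value, whereas noise confined to a coarse digital grid can only ever propose the grid's own values.

import random

def loss(w):
    return (w - 3.1) ** 2                 # the best possible weight is exactly 3.1

def noisy_search(quantize, steps=5000):
    random.seed(1)
    w = 0.0
    for _ in range(steps):
        candidate = w + random.gauss(0.0, 0.5)    # analog: any real perturbation
        if quantize:
            candidate = round(candidate * 4) / 4  # digital: snapped to steps of 0.25
        if loss(candidate) < loss(w):             # keep the "wrong" guess when it helps
            w = candidate
    return w

print(noisy_search(quantize=False))   # settles very close to 3.1
print(noisy_search(quantize=True))    # can do no better than the nearest grid value, 3.0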



[Next entry in this series.]


Dretske, Fred. Knowledge and the Flow of Information. Cambridge: MIT Press, 1981. More information at: http://books.google.be/books?id=IlAtGwAACAAJ&hl=en


Goodman, Nelson. Languages of Art: An Approach to a Theory of Symbols. New York: The Bobbs-Merrill Company, 1968. More information and preview available at: http://books.google.be/books?id=e4a5-ItuU1oC&printsec=frontcover&dq=Languages+of+Art,+An+Approach+to+a+Theory+of+Symbols&ei=UIcySoXfIJbyygT81JWqBg&hl=en


Horgan, Terence. “Connectionism and the Philosophical Foundations of Cognitive Science.” in Metaphilosophy. Vol. 28, Nos. 1 & 2, January/April 1997, pp.1-30. Available online at: http://www3.interscience.wiley.com/journal/119168156/abstract?CRETRY=1&SRETRY=0


Jackendoff, Ray. Consciousness and the Computational Mind. London: MIT Press, 1987. More information at: http://books.google.be/books?id=ICEpAAAACAAJ&hl=en


Marieb, Elaine N., & Katja Hoehn. Human Anatomy & Physiology. London: Pearson, 2007. More information at: http://books.google.be/books?id=yOm1LXhEX40C&hl=en


McCormick, David A., Yousheng Shu, Andrea Hasenstaub, Alvaro Duque, & Yuguo Yu. "Modulation of intracortical synaptic potentials by presynaptic somatic membrane potential." Nature 441, 761-765 (8 June 2006), Published online 12 April 2006. Text available online at: http://www.nature.com/nature/journal/v441/n7094/full/nature04720.html


Mead, Carver. Analog VLSI and Neural Systems. Amsterdam: Addison-Wesley Publishing Company, 1989. More information available at: http://books.google.be/books?id=nr4HAAAACAAJ&hl=en


Moor, James H. “Three Myths of Computer Science.” The British Journal for the Philosophy of Science, Vol. 29, No. 3, September 1978. Text available online at: http://www.jstor.org/sici?sici=0007-0882(197809)29:32.0.CO;2-6


Müller, Berndt, Joachim Reinhardt, & Michael Thomas Strickland. Neural Networks: An Introduction. Berlin: Springer, 1995. More information and preview available at: http://books.google.be/books?id=EFUzMYjOXk8C&hl=en


Sandberg, Anders, & Nick Bostrom. Whole Brain Emulation: A Roadmap. Technical Report #2008-3, Future of Humanity Institute, Oxford University, 2008. Available online at: http://www.fhi.ox.ac.uk/Reports/2008-3.pdf


Siegelmann, Hava T. “Neural and Super-Turing Computing.” Minds and Machines. Vol. 13, Issue 1, February 2003. Available online at: http://www.springerlink.com/content/j7l1675237505m16/


Schonbein, Whit. "Cognition and the Power of Continuous Dynamical Systems." Minds and Machines, Vol. 15, 2005, pp. 57-71. More information at: http://www.springerlink.com/content/xtx321861761117r/?p=91ba33f01d0a4099b74dcad02b1a5860π=2


