Analog & Entropy
in Neural Computation
Previously we noted that our brain's neurons communicate with digital pulse signals. Although these pulses are discrete, they were found to involve analog variation in three ways (illustrated in a toy simulation after the following list):
1) the frequency of the pulses varies continuously,
2) according to recent research, their amplitudes vary continuously, and
3) analog noise is essential for learning (that is, for the modification and creation of nerve connections).
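To picture these three analog aspects together, here is a minimal toy simulation (our own illustration, not from Mead; the rates, amplitudes, and noise level are arbitrary assumed parameters) of a discrete pulse train whose firing rate and pulse amplitudes vary continuously and which carries analog noise:

```python
import numpy as np

rng = np.random.default_rng(42)

# A toy pulse train: discrete spikes, but analog in three ways.
t = np.linspace(0.0, 1.0, 1000)             # one second in 1 ms steps
rate = 20 + 15 * np.sin(2 * np.pi * t)      # (1) firing rate varies continuously (Hz)
spikes = rng.random(t.size) < rate / 1000   # Poisson-like spiking at that rate
amplitude = 1.0 + 0.2 * np.cos(5 * t)       # (2) pulse amplitude varies continuously
noise = 0.05 * rng.standard_normal(t.size)  # (3) analog noise rides on the signal
signal = spikes * amplitude + noise         # discrete pulses, analog everywhere else
print(f"{spikes.sum()} spikes in 1 s; mean pulse height {signal[spikes].mean():.2f}")
```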
We will now add two more ways that our brain's computation is analog. We will also discuss its entropy.
The neuron sums the signals coming from its dendritic synapses. Certain input channels have greater significance than others; they are said to have more weight. During learning, these weights change, thereby modifying the way we compute information adaptively. Carver Mead, in Analog VLSI and Neural Systems, writes that the input weights can take on any value from a continuous range. [And these values may change continuously through learning, partly by means of analog noise or entropy.] Also, each neuron has a certain threshold for its input: when the dendritic signals reach a certain level, the neuron fires a new distinct electrical impulse down its axon channel. Mead writes that the threshold values may likewise take on continuously variable values. Moreover, "some neurons accept analog values for their inputs and generate analog outputs." (354) [At the end we quote directly from the technical material.] Hence we add these additional ways our neurocomputations are analog (sketched in code after the following list):
4) input weights vary continuously, and
5) threshold values vary continuously.
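A minimal sketch of such a neuron (our own illustration; the particular weight and threshold values are arbitrary assumptions) shows how a discrete +1/-1 output can depend on continuous analog parameters:

```python
import numpy as np

def neuron_output(inputs, weights, threshold):
    """Threshold-rule neuron: fire (+1) if the weighted sum of the
    inputs exceeds the internal threshold; otherwise output -1.
    The inputs and output are binary, but the weights and the
    threshold are analog, continuously variable parameters."""
    return 1 if np.dot(weights, inputs) > threshold else -1

inputs = np.array([1, -1, 1])             # discrete signals from other neurons
weights = np.array([0.47, -1.18, 0.30])   # continuously variable synaptic weights
threshold = 0.93                          # continuously variable internal threshold
print(neuron_output(inputs, weights, threshold))  # weighted sum 1.95 > 0.93, so +1
```

During learning, arbitrarily small analog adjustments to `weights` and `threshold` reshape what this discrete decision computes.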
Mead explains that this continuous analog variation is bound up with entropy, a quantitative measure of disorder, which plays an essential role in learning.
Entropy factor:

The typical input for the neural system comes from a natural environment that has a certain degree of disorder, or entropy. Entropy is a quantitative measure of the disorder or randomness of an environment. An important part of the function of the neural system is to be able to learn from "training" samples drawn from the environment. Under what conditions is learning possible? If we assume that the learning mechanism is local, as in the case of Hebbian learning, where the strength of a synapse is incremented or decremented according to the states of the neurons it connects, we can show that a relation holds between the entropy of the environment and the number of neuron inputs. The relation forces the number of neuron inputs to be at least equal to the entropy of the environment. (353-354, emphasis mine)

The ability of neural systems to learn spontaneously a desired function from training samples is these systems' most important feature. (356b, emphasis mine)

In the learning process, a huge number of sample patterns are generated at random from the environment and are sent to the system, 1 bit per neuron. The system uses this information to set its internal parameters and gradually to tune itself to this particular environment. Because of the system architecture, each neuron knows only its own bit and (at best) the bits of the neurons to which it is directly connected by a synapse. Hence, the learning rules are local: A neuron does not have the benefit of the entire global pattern that is being learned. (356c, emphasis mine)

After the learning process has taken place, each neuron is ready to perform a function defined by what it has learned. The collective interaction of the functions of the neurons is what defines the overall function of the network. (356d)
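As a rough sketch of the two ideas in these passages (a hypothetical toy of our own, not Mead's construction; the pattern source, learning rate, and neuron size are assumptions), the following code estimates the entropy of a sampled environment and applies a purely local Hebbian update, in which each synapse changes only according to the states of the two neurons it connects:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_entropy(patterns):
    """Shannon entropy (in bits) of the sample-pattern distribution:
    a quantitative measure of the environment's disorder."""
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def hebbian_update(w, pre, post, eta=0.01):
    """Local Hebbian rule: each synapse is strengthened or weakened
    according only to the states (+1/-1) of the two neurons it
    connects; no neuron sees the global pattern being learned."""
    return w + eta * post * pre

k = 8                                           # inputs per neuron
patterns = rng.choice([-1, 1], size=(1000, k))  # random samples, 1 bit per neuron
H = empirical_entropy(patterns)
print(f"environment entropy ~ {H:.2f} bits; inputs per neuron k = {k}")
# Mead's relation: k must be at least the entropy of the environment.

w = np.zeros(k)                    # one neuron's synaptic weights
for u in patterns:
    post = 1 if w @ u > 0 else -1  # the neuron's own binary state
    w = hebbian_update(w, u, post) # tune internal parameters locally
```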
Technical material for the analog factor:
A neuron, like any other logic device, makes a decision based on the values of its inputs. However, the decision-making mechanism in the case of neurons is analog; that is, it involves the processing of continuous-valued signals rather than of discrete-valued signals. For example, the function of certain neurons can be modeled as a threshold rule: The neuron will fire (will have output +1) if the weighted sum of its inputs exceeds an internal threshold; otherwise, it will not fire (will have output -1). Thus,

\[ u_i = \operatorname{sgn}\!\left( \sum_{j=1}^{k} w_{ij}\, u_j - t_i \right), \]

where \(u_i\) is the output of neuron i, \(u_1, u_2, \ldots, u_k\) are the inputs to this neuron (and also are the outputs of other neurons), \(w_{i1}, \ldots, w_{ik}\) are the weights of the synaptic connections, and \(t_i\) is the internal threshold. Although, in this equation, the inputs \(u_1, u_2, \ldots, u_k\) and the output \(u_i\) are all discrete (binary), the output depends on the input through the analog parameters \(w_{ij}\) and \(t_i\). The function of most neurons is more sophisticated than is this simple threshold rule; some neurons accept analog values for their inputs and generate analog outputs. (354b-c)
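To make the threshold rule concrete with toy numbers of our own: if a neuron has weights \(w = (0.5, -1.2, 0.3)\), threshold \(t = 0.9\), and inputs \(u = (+1, -1, +1)\), the weighted sum is \(0.5 + 1.2 + 0.3 = 2.0\), which exceeds the threshold, so the neuron fires with output +1. An arbitrarily small analog change to the weights or the threshold could tip this discrete decision the other way.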
From:
Mead, Carver. Analog VLSI and Neural Systems. Reading, Mass.: Addison-Wesley, 1989.