Entropy factor: The typical input for the neural system comes from a natural environment that has a certain degree of disorder, or entropy. Entropy is a quantitative measure of the disorder or randomness of an environment. An important part of the function of the neural system is to be able to learn from "training" samples drawn from the environment. Under what conditions is learning possible? If we assume that the learning mechanism is local, as in the case of Hebbian learning, where the strength of a synapse is incremented or decremented according to the states of the neurons it connects, we can show that a relation holds between the entropy of the environment and the number of neuron inputs. The relation forces the number of neuron inputs to be at least equal to the entropy of the environment. (353-354, emphasis mine)

The ability of neural systems to learn spontaneously a desired function from training samples is these systems' most important feature. (356b, emphasis mine)

In the learning process, a huge number of sample patterns are generated at random from the environment and are sent to the system, 1 bit per neuron. The system uses this information to set its internal parameters and gradually to tune itself to this particular environment. Because of the system architecture, each neuron knows only its own bit and (at best) the bits of the neurons to which it is directly connected by a synapse. Hence, the learning rules are local: A neuron does not have the benefit of the entire global pattern that is being learned. (356c, emphasis mine)

After the learning process has taken place, each neuron is ready to perform a function defined by what it has learned. The collective interaction of the functions of the neurons is what defines the overall function of the network. (356d)
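The locality described in the passage can be sketched in a few lines of Python. This is a minimal illustration of a Hebbian rule, not the paper's own procedure: each synaptic weight w[i, j] is updated using only the states of the two neurons it connects, with random ±1 patterns standing in for samples drawn from the environment (all variable names and sizes are my assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8                                            # number of neurons; each sees 1 bit of a pattern
patterns = rng.choice([-1, +1], size=(100, n))   # random +/-1 "environment" samples

# Hebbian learning is local: the update to w[i, j] depends only on the
# states of neurons i and j, never on the whole pattern.
w = np.zeros((n, n))
for p in patterns:
    w += np.outer(p, p)    # increment when the two states agree, decrement when they differ
np.fill_diagonal(w, 0)     # no self-connections
w /= len(patterns)         # average over the training samples
```

After training, each weight reflects the correlation of its two neurons across the samples, which is all a local rule can extract.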
A neuron, like any other logic device, makes a decision based on the values of its inputs. However, the decision-making mechanism in the case of neurons is analog; that is, it involves the processing of continuous-valued signals rather than of discrete-valued signals. For example, the function of certain neurons can be modeled as a threshold rule: The neuron will fire (will have output +1) if the weighted sum of its inputs exceeds an internal threshold; otherwise, it will not fire (will have output -1). Thus,

$$u_i = \operatorname{sgn}\left(\sum_{j=1}^{k} w_{ij}\, u_j - t_i\right),$$

where $u_i$ is the output of neuron $i$, $u_1, u_2, \ldots, u_k$ are the inputs to this neuron (and also are the outputs of other neurons), $w_{i1}, \ldots, w_{ik}$ are the weights of the synaptic connections, and $t_i$ is the internal threshold. Although, in this equation, the inputs $u_1, u_2, \ldots, u_k$ and the output $u_i$ are all discrete (binary), the output depends on the input through the analog parameters $w_{ij}$ and $t_i$. The function of most neurons is more sophisticated than is this simple threshold rule; some neurons accept analog values for their inputs and generate analog outputs. (354b-c)
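The threshold rule just quoted is simple enough to state directly in code. The sketch below is my own illustration of that rule (the function name and example numbers are assumptions, not the paper's): binary ±1 inputs are combined through analog weights and compared against an analog threshold.

```python
import numpy as np

def threshold_neuron(u, w, t):
    """Fire (+1) if the weighted sum of the inputs u, under weights w,
    exceeds the internal threshold t; otherwise do not fire (-1).
    The inputs are binary (+/-1) but w and t are continuous-valued."""
    return 1 if np.dot(w, u) > t else -1

# Two inputs, both firing, with analog weights 0.6 and 0.5:
# weighted sum 1.1 exceeds the threshold 1.0, so the neuron fires.
threshold_neuron(np.array([+1, +1]), np.array([0.6, 0.5]), 1.0)  # -> 1
```

Note that a tiny change in the analog parameters (say, lowering a weight from 0.6 to 0.4) flips the discrete output, which is the sense in which the decision mechanism is analog even though the signals are binary.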