perceptron

n. a connected network of input and output nodes that acts as a useful model of associative neural networks. A simple (single-layer) perceptron might represent two connected neurons, whereas more complicated perceptrons have additional hidden layers between input and output. The connections between inputs and outputs carry adjustable weights, which are tuned so that the network produces the desired output. The goal is to develop a theoretical understanding of the way neural connections process signals and form associations (memories). Back-propagation (backprop) algorithms are the most common means of adjusting the weightings between input and output: the output is compared to a desired endpoint, and the required changes in connection strengths are transmitted back through the perceptron.
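
The following is a minimal illustrative sketch, not drawn from the source: a single-layer perceptron trained on the logical AND function using the classic perceptron learning rule (a simpler relative of back-propagation, which would apply to networks with hidden layers). All names, data, and the learning rate are hypothetical.

```python
# Minimal single-layer perceptron trained on the logical AND function.
# Illustrative example only; names and parameters are hypothetical.

def step(x):
    """Threshold activation: output 1 if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def predict(weights, bias, inputs):
    """Weighted sum of inputs plus bias, passed through the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(total)

# Training data: input pairs and the desired output for AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

# Learning rule: compare the output to the desired endpoint and nudge
# each connection weight in proportion to the error.
for epoch in range(20):
    for inputs, target in data:
        error = target - predict(weights, bias, inputs)
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error

for inputs, target in data:
    print(inputs, "->", predict(weights, bias, inputs), "desired:", target)
```

After a few passes over the data, the weights settle so that the network reproduces the AND truth table, illustrating how weighted connections can be adjusted toward a desired output.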