Perceptrons
Perceptrons are a type of artificial neural network investigated by Frank Rosenblatt, beginning in 1957, at the Cornell Aeronautical Laboratory at Cornell University in Ithaca, New York. Rosenblatt made major contributions to the emerging field of artificial intelligence (AI), both through experimental investigations of the properties of neural networks (using computer simulations) and through detailed mathematical analysis. Rosenblatt was a charismatic communicator, and soon many research groups in the United States were studying perceptrons. Rosenblatt and his followers called their approach connectionism to emphasize the importance of creating and modifying connections between neurons during learning. Modern researchers have adopted this term.
One of Rosenblatt’s contributions was to generalize the training procedure that Belmont Farley and Wesley Clark of the Massachusetts Institute of Technology had applied only to two-layer networks so that it could also be applied to multilayer networks. Rosenblatt used the phrase “back-propagating error correction” to describe his method. The method, substantially improved and extended by numerous scientists, and the term back-propagation itself are now in everyday use in connectionism.
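The learning idea behind this connectionist approach can be illustrated with the classic perceptron update rule, in which connection weights are strengthened or weakened in proportion to the error of the output. The sketch below is a modern, minimal rendering of that rule for a single two-input unit trained on the logical AND function; it is an illustration of the general technique, not Rosenblatt’s exact procedure, and the function name and parameters are chosen for this example.

```python
# Minimal sketch of the classic perceptron learning rule
# (a modern simplification, not Rosenblatt's exact procedure).

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn two weights and a bias from (inputs, target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: the unit "fires" if the weighted sum exceeds zero.
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            # Adjust each connection in proportion to the output error.
            error = target - output
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Train on the logical AND function, which a single unit can learn.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
               for (x1, x2), _ in and_data]
# predictions matches the AND targets: [0, 0, 0, 1]
```

Because AND is linearly separable, the rule converges after a handful of passes; multilayer networks of the kind Rosenblatt studied require the more elaborate back-propagation of error described above.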