Abstract
Hopfield’s model [1] of the neural network describes how a set of very simple, identical computing elements (neurons) can collectively perform complex and powerful operations. The "hardware" with which the model is implemented consists of thresholding elements (the neurons) and analog weights interconnecting them. According to the model, information is stored in the interconnections. Computation is performed by setting the state (on or off) of some of the neurons according to an external input; given this initial stimulus, the system converges to a stable state, which we can consider the result of the computation. Hopfield has shown that when information is stored in the interconnections in the manner he prescribed, the system generally converges to the stored state that is closest to the external input. In other words, the computation performed is a nearest-neighbor search, a fundamental operation in pattern recognition.
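The storage and recall process the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's optical implementation: it assumes bipolar (±1) patterns, Hebbian outer-product storage of the weights, and a synchronous sign-threshold update; the function and variable names are ours.

```python
import numpy as np

def train(patterns):
    """Hebbian storage: the information lives in the weight matrix,
    not in the neurons themselves."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, state, max_steps=20):
    """Iterate the thresholding update until the network reaches
    a stable state (the result of the computation)."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1           # break ties toward +1
        if np.array_equal(new, s):  # converged to a fixed point
            break
        s = new
    return s

# Two stored bipolar patterns of 8 neurons each (mutually orthogonal).
patterns = np.array([
    [1,  1, 1,  1, -1, -1, -1, -1],
    [1, -1, 1, -1,  1, -1,  1, -1],
])
W = train(patterns)

# Corrupt one bit of the first pattern; the network settles back
# to the nearest stored state.
probe = patterns[0].copy()
probe[0] = -1
result = recall(W, probe)
```

Starting from the corrupted probe, the update converges to the first stored pattern, illustrating the nearest-neighbor search behavior described above.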
© 1985 Optical Society of America