Abstract
We present an adaptive neural-network model in which each neuron unit has two inputs, a single potential, and two outputs. The network can exhibit general functionality even though all signals and interconnection weights are restricted to nonnegative values. In this dual-function network (DFN), each output is a distinct nonlinear function of the neuron's potential, and each neuron-unit input receives a linear combination of both outputs from any set of neurons in the preceding layer. We present a learning algorithm for multilayer feedforward DFNs, based on gradient descent of an error function and similar to backward-error-propagation learning.
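The abstract describes the DFN only qualitatively, so the following is a rough, non-authoritative sketch of one feedforward DFN layer. It assumes the single potential is the difference of the two inputs (one way nonnegative signals can carry effectively signed information) and assumes particular sigmoid-like forms for the two output functions; none of these forms are specified in the abstract.

```python
import numpy as np

def dfn_layer(y1_prev, y2_prev, A1, B1, A2, B2):
    """One feedforward DFN layer with nonnegative signals and weights.

    Each neuron has two inputs; each input is a nonnegative linear
    combination of BOTH outputs of the preceding layer. A single
    potential combines the two inputs, and two distinct nonlinear
    functions of that potential give the neuron's two outputs.

    The difference-of-inputs potential and the specific output
    nonlinearities below are illustrative assumptions only.
    """
    # Two inputs per neuron: nonnegative mixes of both previous outputs.
    in1 = A1 @ y1_prev + B1 @ y2_prev
    in2 = A2 @ y1_prev + B2 @ y2_prev
    # Single potential per neuron (assumed: difference of the two inputs).
    u = in1 - in2
    # Two distinct nonlinear output functions of the same potential
    # (assumed forms); both outputs remain nonnegative.
    y1 = 1.0 / (1.0 + np.exp(-u))  # f(u): logistic sigmoid
    y2 = 1.0 - y1                  # g(u): complementary sigmoid
    return y1, y2

# Tiny usage example: 3-neuron previous layer feeding a 2-neuron layer,
# with nonnegative random weights.
rng = np.random.default_rng(0)
y1, y2 = dfn_layer(rng.random(3), rng.random(3),
                   rng.random((2, 3)), rng.random((2, 3)),
                   rng.random((2, 3)), rng.random((2, 3)))
```

A gradient-descent learning rule in the style the abstract mentions would backpropagate the error through both output functions of each neuron while keeping the weight updates clipped to nonnegative values.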
© 1990 Optical Society of America