Abstract
We prove that the relative error in the output of a single-layer neural network is bounded by the product of the relative error in the analog representation of the interconnect matrix and the condition number of that matrix. We then show that the condition number can be minimized readily, in a way that does not substantially affect the system design. Among the infinitely many transformations that are equivalent with respect to the condition number, some also reduce the required dynamic range.
© 1988 Optical Society of America
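The stated bound can be illustrated numerically. The sketch below is not from the paper; it assumes the standard first-order perturbation result for y = Wx: a perturbation dW of the interconnect matrix produces a relative output error of at most cond(W) times the relative error in W (in the spectral norm).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-layer interconnect: output y = W x.
W = rng.normal(size=(4, 4))
x = rng.normal(size=4)

# Small analog representation error in the matrix.
dW = 1e-6 * rng.normal(size=(4, 4))

y = W @ x
dy = dW @ x  # first-order perturbation of the output

rel_out = np.linalg.norm(dy) / np.linalg.norm(y)          # relative output error
rel_mat = np.linalg.norm(dW, 2) / np.linalg.norm(W, 2)    # relative matrix error
bound = np.linalg.cond(W) * rel_mat                        # cond(W) * rel. error

# The bound ||dW x|| / ||W x|| <= ||dW|| * ||W^-1|| = cond(W) * ||dW|| / ||W||
# holds exactly in the spectral norm.
assert rel_out <= bound
print(f"relative output error {rel_out:.3e} <= bound {bound:.3e}")
```

This also motivates the paper's second point: rescaling transformations that lower cond(W) tighten the bound without changing the computed mapping.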