
Infernet: a neural network inference machine

Open Access

Abstract

The Infernet is an architecture for an inference machine that is composed of three sections: (1) an input processor that performs specific preprocessing on the inputs; (2) an output processor for generating outputs; and (3) a hierarchical inference processor that functions as the inference engine. Each section comprises several neural network layers. Time-domain pulse neurons are being considered as the basic cells within the layers. Temporal waves of neural activity, supported by the interconnection topology, are being considered for encoding and accessing high-level knowledge. An optical implementation for a neural layer has been designed using two optically addressed spatial light modulators (SLMs), an electrically addressed SLM, and an electronic controller.
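
The abstract does not specify the neuron dynamics or the layer equations. The following is a minimal Python sketch, assuming a simple leaky integrate-and-fire model for the time-domain pulse neurons and a straight pipeline through the three sections (input processor, hierarchical inference processor, output processor); the class names, weights, and parameters are illustrative assumptions, not the implementation described in the paper.

    # Minimal sketch of the three-section Infernet pipeline with time-domain
    # pulse (spiking) neurons. All dynamics and parameters are assumptions;
    # the abstract does not specify them.
    import random

    class PulseNeuron:
        """Leaky integrate-and-fire cell: emits a pulse (1) when its
        membrane potential crosses threshold, then resets."""
        def __init__(self, threshold=1.0, leak=0.9):
            self.threshold = threshold
            self.leak = leak
            self.potential = 0.0

        def step(self, drive):
            self.potential = self.leak * self.potential + drive
            if self.potential >= self.threshold:
                self.potential = 0.0          # reset after firing
                return 1                      # emit a pulse
            return 0

    class Layer:
        """A layer of pulse neurons with fixed random input weights."""
        def __init__(self, n_in, n_out, seed=0):
            rng = random.Random(seed)
            self.weights = [[rng.uniform(-0.5, 1.0) for _ in range(n_in)]
                            for _ in range(n_out)]
            self.cells = [PulseNeuron() for _ in range(n_out)]

        def step(self, pulses_in):
            drives = [sum(w * p for w, p in zip(row, pulses_in))
                      for row in self.weights]
            return [cell.step(d) for cell, d in zip(self.cells, drives)]

    class Infernet:
        """Three sections, each a stack of layers: input processor ->
        hierarchical inference processor -> output processor."""
        def __init__(self):
            self.input_proc = [Layer(8, 8, seed=1)]
            self.inference = [Layer(8, 8, seed=2), Layer(8, 8, seed=3)]
            self.output_proc = [Layer(8, 4, seed=4)]

        def step(self, stimulus):
            x = stimulus
            for layer in self.input_proc + self.inference + self.output_proc:
                x = layer.step(x)
            return x

    # Drive the network with a fixed pulse train and watch the output evolve.
    net = Infernet()
    stimulus = [1, 0, 1, 0, 1, 0, 1, 0]
    for t in range(5):
        print(t, net.step(stimulus))

Running this toy shows pulse patterns propagating through the sections over successive time steps, loosely analogous to the temporal waves of activity mentioned in the abstract; it is a sketch of the general idea, not the paper's optical or electronic design.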

© 1988 Optical Society of America

More Like This
Information storage and retrieval in a multilayer neural network model

Henri H. Arsenault and Bohdan Macukow
THJ6 OSA Annual Meeting (FIO) 1988

Silicon Photonics Neural Networks for Training and Inference

Bhavin J. Shastri, Matthew J. Filipovich, Zhimu Guo, Paul R. Prucnal, Sudip Shekhar, and Volker J. Sorger
NeW2D.2 Photonic Networks and Devices (Networks) 2022

Combined optical-microelectronic realization of neural network models

A. Agranat, C. F. Neugebauer, and Amnon Yariv
THJ5 OSA Annual Meeting (FIO) 1988
