Models of parallel computation and optical computing

Abstract

Theoretical models of computation (automata, Turing machines, random access machines, etc.) define not only what is computable but also how much time and/or space a computation requires. Parallel extensions of the sequential models usually offer significant speedup in algorithm execution, although not all algorithms can be parallelized. Real parallel computer architectures based on these parallel computation models should exhibit the same algorithmic speedup. A complexity hierarchy can be defined for parallel computation models based on the space and/or time speedup of certain classes of algorithms. In particular, the parallel shared memory models [1] are among the more powerful parallel computation models. Their close resemblance to actual computer architectures provides guidance for real architectural design, and their massive interconnection requirements are well suited to implementation in optical computing architectures. In addition, the shared memory models have recently been shown to simulate deterministic Boltzmann machines and threshold-logic Turing machines. Thus optical machines based on these parallel models can potentially exhibit the computational power of the shared memory models. These parallel computation models and their implications for optical computing are discussed.
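
The kind of speedup the abstract attributes to shared memory models can be made concrete with a small sketch. The following Python simulation is illustrative only and not from the paper; the function name pram_sum and the round-counting bookkeeping are assumptions. It mimics a CREW PRAM-style parallel reduction, in which n values are summed in O(log n) synchronous parallel steps rather than the O(n) steps a sequential machine needs.

```python
# Illustrative sketch only (not from the paper): a CREW PRAM-style
# parallel reduction on a shared array. With one processor per cell,
# each while-iteration below corresponds to a single synchronous
# parallel time step, so summing n values takes O(log n) steps
# instead of the O(n) steps of a sequential machine.

def pram_sum(values):
    shared = list(values)   # the shared memory cells
    n = len(shared)
    stride, rounds = 1, 0
    while stride < n:
        # On a real PRAM all of these read/add/write operations occur
        # concurrently; iterating here merely simulates one round.
        for i in range(0, n - stride, 2 * stride):
            shared[i] += shared[i + stride]
        stride *= 2
        rounds += 1
    return shared[0], rounds

total, steps = pram_sum(range(1, 17))  # 16 values
print(total, steps)                    # prints: 136 4  (log2(16) = 4 rounds)
```

Each round halves the number of active partial sums, which is the source of the logarithmic parallel time; the dense processor-to-memory interconnection this assumes is exactly the requirement the abstract argues is well suited to optical implementation.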

© 1986 Optical Society of America

More Like This

Optical architecture implications of parallel computation models
B. Keith Jenkins and C. Lee Giles, ML4, OSA Annual Meeting (FIO) 1986

A Massively Parallel Optical Computer
Ahmed Louri, MH2, Optical Computing (IP) 1989

Systems view of optical digital computers
Alan Huang, WE1, OSA Annual Meeting (FIO) 1986
