Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

Abstract

To circumvent the non-parallelizability of recurrent neural network (RNN)-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feed-forward structure. The latter achieves a 38% latency reduction while degrading the Q-factor by only 0.5 dB.
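The distillation idea in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: a frozen recurrent "teacher" equalizes a window of received symbols sequentially, and a fully parallel feed-forward "student" (here a single linear layer, an assumption) is fitted to the teacher's outputs with an MSE distillation loss. All sizes, weights, and architectures are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H = 8, 4  # window length, teacher hidden size (illustrative values)

# Frozen teacher: Elman-style recurrence with tanh and a linear readout.
W_in = rng.normal(size=(H, 1)) * 0.5
W_h = rng.normal(size=(H, H)) * 0.3
w_out = rng.normal(size=(H,)) * 0.5

def teacher(x):
    """Equalize one window x of shape (T,) -- sequential, not parallelizable."""
    h = np.zeros(H)
    for t in range(T):
        h = np.tanh(W_in[:, 0] * x[t] + W_h @ h)
    return w_out @ h  # scalar equalized symbol

# Student: one linear layer over the whole window -- trivially parallelizable.
w_s = np.zeros(T)

# Knowledge distillation: fit the student to the teacher's soft targets.
X = rng.normal(size=(2000, T))
y_teacher = np.array([teacher(x) for x in X])  # precomputed soft targets

loss0 = float(np.mean((X @ w_s - y_teacher) ** 2))  # loss before training

lr = 0.05
for step in range(200):
    idx = rng.integers(0, len(X), size=32)
    batch, targets = X[idx], y_teacher[idx]
    err = batch @ w_s - targets
    w_s -= lr * (batch.T @ err) / len(batch)  # SGD step on the MSE loss

loss1 = float(np.mean((X @ w_s - y_teacher) ** 2))  # loss after training
print(loss0, loss1)
```

The nonlinear teacher cannot be matched exactly by a linear student, so the distillation loss plateaus at a nonzero residual; this mirrors the abstract's trade-off, where the feed-forward replacement buys latency at a small Q-factor cost.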

© 2023 The Author(s)
