Abstract
We compare by simulation the performance of a maximum likelihood sequence
estimation (MLSE) receiver with that of a standard optimized threshold receiver
in a dispersion-managed multispan WDM transmission system at 10.7 Gbit/s in
the presence of fiber nonlinearities. We find that an MLSE receiver significantly
relaxes the tolerances on the dispersion map design and can improve overall
system performance.
© 2008 IEEE
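To illustrate the comparison made in the abstract, the sketch below contrasts a symbol-by-symbol threshold receiver with an MLSE receiver implemented via the Viterbi algorithm. This is a generic toy model, not the paper's setup: a hypothetical 2-tap real ISI channel (taps `h = [1.0, 0.6]` and the noise level are assumptions) stands in for residual dispersion, rather than a simulated 10.7 Gbit/s dispersion-managed WDM link.

```python
import numpy as np

# Toy stand-in for a dispersive channel: 2 taps = one symbol of memory.
rng = np.random.default_rng(0)
h = np.array([1.0, 0.6])                  # hypothetical channel taps (assumption)
n_bits = 20000
bits = rng.integers(0, 2, n_bits)
x = 2.0 * bits - 1.0                      # antipodal symbol levels
y = np.convolve(x, h)[:n_bits]            # ISI channel output
y += 0.4 * rng.standard_normal(n_bits)    # additive Gaussian noise (assumption)

# Threshold receiver: decides each sample independently, ignoring ISI.
bits_thr = (y > 0).astype(int)

def mlse_viterbi(y, h):
    """MLSE via the Viterbi algorithm; trellis state = previous symbol."""
    states = np.array([-1.0, 1.0])
    n = len(y)
    metric = np.zeros(2)                  # path metrics (initial symbol unknown)
    back = np.zeros((n, 2), dtype=np.int64)
    for k in range(n):
        new_metric = np.empty(2)
        for j, s_new in enumerate(states):
            # Branch metric: squared error against each predecessor state.
            cands = metric + (y[k] - h[0] * s_new - h[1] * states) ** 2
            i = int(np.argmin(cands))
            back[k, j] = i
            new_metric[j] = cands[i]
        metric = new_metric
    # Trace back the surviving path; state index 0/1 maps to bit 0/1.
    out = np.empty(n, dtype=np.int64)
    j = int(np.argmin(metric))
    for k in range(n - 1, -1, -1):
        out[k] = j
        j = back[k, j]
    return out

bits_mlse = mlse_viterbi(y, h)
ber_thr = np.mean(bits_thr != bits)
ber_mlse = np.mean(bits_mlse != bits)
print(f"threshold BER = {ber_thr:.4f}, MLSE BER = {ber_mlse:.4f}")
```

Because the Viterbi search exploits the channel memory that the threshold detector ignores, the MLSE bit-error rate comes out well below the threshold receiver's on this toy channel, mirroring the qualitative advantage the abstract reports.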