Abstract
Digital signal processing (DSP)-based systems are a common architecture for high-speed communication links. One of the most important components in DSP-based architectures is the analog-to-digital converter (ADC). Most high-speed ADCs are implemented as time-interleaved ADCs (TI-ADCs). One of the major challenges in a TI-ADC is the timing mismatch between the parallel sub-ADCs, and calibrating this mismatch increases system cost and complexity. In this article we propose a new background calibration method based on existing clock and data recovery (CDR) circuits. The timing mismatch can be calibrated using a simple, dedicated CDR per sub-ADC. By reusing the phase error detector and pulse gain estimator of the regular CDR, the calibration circuit becomes very low-cost, requiring only a first-order loop and a slow numerically controlled oscillator (NCO). VCSEL-based lab experiments demonstrate the proposed method's efficiency, performance, and robustness.
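The first-order loop mentioned in the abstract can be illustrated with a minimal simulation sketch. This is not the authors' implementation: the phase-error-detector output is simply modeled as the residual sampling skew plus noise, and the loop gain `mu`, noise level, and function name are assumptions for illustration only.

```python
import random

def first_order_timing_loop(true_skew, mu=0.05, n_iter=2000, noise=0.01, seed=0):
    """Illustrative first-order background calibration loop (a sketch,
    not the paper's circuit).

    The CDR phase error detector is modeled as producing the residual
    timing skew of one sub-ADC plus Gaussian noise; the NCO accumulates
    the scaled error, steering the sampling phase toward the true skew.
    """
    rng = random.Random(seed)
    nco = 0.0  # NCO-controlled sampling-phase correction (arbitrary units)
    for _ in range(n_iter):
        # Modeled phase-error-detector output: residual skew + noise
        err = (true_skew - nco) + rng.gauss(0.0, noise)
        nco += mu * err  # first-order loop update
    return nco

# The loop output converges to the sub-ADC's timing skew
corrected = first_order_timing_loop(true_skew=0.12)
```

Because the loop is first order, the NCO state converges geometrically to the skew estimate, with residual jitter set by the loop gain and detector noise; this is why such a loop is cheap enough to replicate per sub-ADC.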