Abstract
Several methods exist to measure the group delay of a fiber Bragg grating. Here, we compare two such methods, namely the Hilbert transform (HT) of the device transmission spectrum and standard Fourier spectral interferometry. Numerical simulations demonstrate that both methods work not only for ideal, lossless devices but also for ones with realistic absorption. Experimental measurements show that the HT is more straightforward to implement and is significantly less susceptible to phase noise, which reduces the standard deviation between measurements.
© 2014 Optical Society of America
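The HT method referenced above relies on the minimum-phase relation, in which the spectral phase is recovered as the Hilbert transform of the log-magnitude of the transmission spectrum, and the group delay follows as the (negative) frequency derivative of that phase. A minimal numerical sketch, assuming a hypothetical Gaussian-notch transmission spectrum and the common sign convention (actual conventions vary between references):

```python
import numpy as np
from scipy.signal import hilbert

# Illustrative frequency grid and log-magnitude spectrum (assumed, not
# from the paper): a Gaussian notch mimicking a grating transmission dip.
omega = np.linspace(-50e9, 50e9, 4096)       # angular frequency (rad/s)
ln_T = -2.0 * np.exp(-(omega / 10e9) ** 2)   # ln|T(omega)|

# Minimum-phase relation: phase is (up to sign convention) the Hilbert
# transform of ln|T|. scipy's hilbert() returns the analytic signal,
# whose imaginary part is the Hilbert transform of the input.
phase = -np.imag(hilbert(ln_T))

# Group delay: negative derivative of phase w.r.t. angular frequency.
group_delay = -np.gradient(phase, omega)
```

This sketch only illustrates the signal-processing chain; a real measurement would substitute the measured transmission spectrum and validate the minimum-phase assumption for the device under test.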