
Computational coherent imaging by rotating a cylindrical lens

Open Access

Abstract

To overcome longitudinal sampling rate fluctuation in axial multi-image computational imaging, a simple and highly efficient optical scanning imaging system based on the rotation of a single cylindrical lens (RSCL) is proposed for reconstructing the amplitude and phase information of a sample. Here the cylindrical lens acts as a rotationally non-symmetric phase modulator that generates diffracted intensity patterns. To determine the exact rotation angle, the core parameter for perfect reconstruction of the light field, a two-step Radon transform (TsRT) is designed to accurately estimate the rotation state of the cylindrical lens from the diffraction patterns recorded at the focal plane. All rotations occur in a single lateral plane, so every recorded pattern shares the same spatial scale in this multi-parameter computational imaging system. As a kind of scanning imaging approach, the RSCL experiment is greatly simplified.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

As a powerful framework to overcome the limitations of physical optics or realize superior capabilities, computational imaging has emerged to relax the constraints on imaging hardware so as to increase the field of view [1,2], improve the imaging resolution [3–5] or enhance noise robustness [6,7], as in coherent diffractive imaging [7–12], coded-aperture imaging [4,13–15] and super-resolution imaging [16–20]. Generally, these techniques are built on innovations in the illumination mode. For example, the introduction of green fluorescent protein makes it possible to exploit the emission states of different spatial locations separately, inspiring super-resolution microscopes such as PALM [21] and STORM [22]; multi-angle illumination or high-frequency carrier-wave illumination can be used to realize synthetic aperture imaging, as in SIM [23–26] or FPM [27]. At the same time, algorithms compatible with these computational imaging systems have to be designed to reconstruct a sharp sample image [8,23,28–30].

As a rotationally non-symmetric optical element, the cylindrical lens plays an important role in imaging and measurement [31–40]. For instance, in STORM [22], a cylindrical lens is inserted into the optical path to create two slightly different focal planes. In 3D imaging [41], a cylindrical lens can generate a thin laminar beam as the illumination source. In a modified confocal microscope based on cylindrical-lens line scanning [42], imaging is significantly accelerated. In a phase imaging system combined with a multi-stage phase retrieval algorithm [28], six cylindrical lenses are mounted to produce phase-diversity patterns in gyrator domains. Considering the complicated control of six cylindrical lenses, a single cylindrical lens is adopted here to realize phase retrieval.

In this work, an optical line-scanning imaging technique is proposed in which a single cylindrical lens is rotated in a plane perpendicular to the optical axis. A set of diffraction patterns is recorded and fed, together with the calculated rotation angles, into a parallel phase retrieval algorithm to reconstruct the complex amplitude at the object plane. The rotation angle is precisely computed from the measured patterns with a two-step Radon transform (TsRT), specially designed for high estimation accuracy. Compared with other axial-scanning multi-distance phase imaging systems [7,9,12], the advantage of our scheme is that the diffraction distance is fixed. Thus, the data acquisition can be faster, and the spatial sampling rate remains constant, which simplifies the algorithm design for light field reconstruction.

2. Computational imaging method

Figure 1 illustrates the optical imaging system based on the rotation of a single cylindrical lens (RSCL), represented by the blue translucent element, whose rotation is controlled by a precision rotation mount. The symbol α denotes the clockwise rotation angle of the cylindrical lens, the primary parameter of this model; d is the diffraction distance between the sample and the front surface of the cylindrical lens, and f is the focal length of the cylindrical lens. Finally, a scientific CMOS (sCMOS) camera digitizes and records the transformed patterns.

Fig. 1 RSCL and its flowchart of data processing. (a) Layout of the diffraction imaging system with RSCL. (b) Data processing flowchart of the computational imaging based on the rotation of a single cylindrical lens.

We apply angular spectrum theory and the Fourier transform to deduce the complex amplitude on the sensor of the sCMOS camera. As shown in Fig. 1(a), after free-space propagation over a distance d, the wave field on the front surface of the cylindrical lens can be described as:

$$U_d(x,y)=\mathcal{D}\{U_{\mathrm{obj}}(x,y)\}=\mathcal{F}^{-1}\left\{\mathcal{F}[U_{\mathrm{obj}}(x,y)]\exp\left[j\frac{2\pi d}{\lambda}\sqrt{1-(\lambda u)^2-(\lambda v)^2}\right]\right\},\tag{1}$$
where $U_{\mathrm{obj}}(x,y)$ denotes the object function and $(u,v)$ are the spatial frequency coordinates. The symbol $\mathcal{D}$ represents the angular spectrum transform. The operators $\mathcal{F}$ and $\mathcal{F}^{-1}$ are the Fourier transform and its inverse, respectively. $\lambda$ is the illumination wavelength.
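For concreteness, a minimal NumPy sketch of the angular spectrum transform $\mathcal{D}$ of Eq. (1) follows; the pixel-pitch argument and the handling of evanescent components are our assumptions, not details given in the paper.

```python
import numpy as np

def angular_spectrum_propagate(u, wavelength, z, dx):
    """Angular spectrum transform D of Eq. (1): propagate the complex
    field u over a distance z (z < 0 back-propagates)."""
    ny, nx = u.shape
    # Spatial-frequency grids (u, v) of the sampled field, pixel pitch dx
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # Transfer function exp[j*2*pi*z/lambda*sqrt(1 - (lambda*u)^2 - (lambda*v)^2)]
    H = np.exp(1j * 2.0 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0  # drop evanescent components (our choice)
    return np.fft.ifft2(np.fft.fft2(u) * H)
```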

Downstream of the diffraction given in Eq. (1), the wave field is modulated by the cylindrical lens and diffracted again over a distance f. Therefore, the intensity pattern recorded on the sCMOS sensor is:

$$I_\alpha(x_1,y_1)=\left|\mathcal{D}\{U_d(x,y)\,R_\alpha(x,y)\}\right|^2,\tag{2}$$
$$R_\alpha(x,y)=\exp\left[j\frac{\pi}{\lambda f}\,(x\cos\alpha+y\sin\alpha)^2\right],\tag{3}$$
where $R_\alpha$ is the phase modulation function of the cylindrical lens rotated clockwise by the angle α in the plane perpendicular to the optical axis. The rotation is controlled by a precision rotation mount in this system, and manual adjustment can also alter the rotation state of the lens. The rotation error of the mechanical turntable stems mainly from gear backlash. The patterns on the sCMOS are thus obtained from the sample information through two diffraction steps and the phase modulation of the lens. Moreover, the images are generated at the same scale, since the varying phase modulations all lie in a common lateral plane.
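Continuing the sketch above, the forward model of Eqs. (2) and (3) might look as follows; the centered coordinate grid and the sign convention (taken verbatim from Eq. (3)) are assumptions.

```python
def cylindrical_lens_phase(shape, dx, wavelength, f, alpha):
    """Phase modulation R_alpha of Eq. (3): a cylindrical lens of focal
    length f rotated clockwise by alpha (radians) about the optical axis."""
    ny, nx = shape
    x = (np.arange(nx) - nx / 2) * dx
    y = (np.arange(ny) - ny / 2) * dx
    X, Y = np.meshgrid(x, y)
    return np.exp(1j * np.pi / (wavelength * f)
                  * (X * np.cos(alpha) + Y * np.sin(alpha)) ** 2)

def forward_model(u_obj, alpha, wavelength, d, f, dx):
    """Intensity I_alpha of Eq. (2): propagate by d, modulate, propagate by f."""
    u_d = angular_spectrum_propagate(u_obj, wavelength, d, dx)
    u_lens = u_d * cylindrical_lens_phase(u_d.shape, dx, wavelength, f, alpha)
    return np.abs(angular_spectrum_propagate(u_lens, wavelength, f, dx)) ** 2
```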

The rotation angle α is the key parameter for a successful complex amplitude reconstruction. However, it is hard to measure the rotation angle directly. Thus, we put forward a precise angle estimation method based on the Radon transform [43], named the two-step Radon transform (TsRT). Mathematically, the Radon transform is defined as the line integral of a two-dimensional (2D) function on the plane, and its result is a function defined in the space of lines on the plane. For example, when the Radon transform of a line in 2D Cartesian coordinates is taken, the result reaches its maximum when the integration is along the line direction; the maximum of a line's Radon transform therefore indicates the direction of the line. Based on this principle, the Radon transform, as a powerful and highly efficient mathematical tool, is applied to detect line patterns in many fields. In addition, owing to the phase modulation of the cylindrical lens, the light field distribution on the back focal plane of the cylindrical lens is the one-dimensional (1D) Fourier transform of that on its front focal plane along the direction perpendicular to the dashed line denoted by α. That means the cylindrical lens leaves the image unaltered in the direction parallel to the dashed line but compresses it in the perpendicular direction. Therefore, α can be calculated from the line pattern captured by the sCMOS, whose inclination corresponds to the rotation angle of the cylindrical lens. To introduce TsRT clearly, a sample diffraction picture (see the dashed square frame in the inset of Fig. 2(a)) is used as an example in Fig. 2(a). The specific steps of TsRT are as follows. (I) Rough scanning. A first Radon transform scans the measured pattern with a large step of 1° over the range [0°, 180°), as illustrated by the green semicircle in Fig. 2(a). The sinogram is shown in Fig. 2(b), where the angle corresponding to the maximum value is marked as α0; it is a rough estimate of α with an error within (−0.5°, +0.5°). (II) Fine scanning. With a finer step of 0.001° over the range (α0 − 0.5°, α0 + 0.5°), a much more accurate estimate α1 is obtained. By applying TsRT to all measured diffraction patterns, the set of rotation angles of the cylindrical lens is calculated and used in the subsequent numerical reconstruction.
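A sketch of TsRT built on scikit-image's radon() is given below; the function and parameter names are ours, and the full 0.001° fine scan is kept for fidelity to the text even though it is computationally heavy.

```python
from skimage.transform import radon

def estimate_angle_tsrt(pattern, coarse_step=1.0, fine_step=0.001):
    """TsRT angle estimate in degrees. Step I: rough scan over [0, 180)
    with a 1-degree step; step II: fine scan over (alpha0 - 0.5, alpha0 + 0.5)
    with a 0.001-degree step."""
    # Step I: rough scanning; the sinogram peak marks the line direction
    coarse = np.arange(0.0, 180.0, coarse_step)
    sino = radon(pattern, theta=coarse, circle=False)
    alpha0 = coarse[np.argmax(sino.max(axis=0))]
    # Step II: fine scanning around the rough estimate alpha0
    fine = np.arange(alpha0 - 0.5, alpha0 + 0.5, fine_step)
    sino = radon(pattern, theta=fine, circle=False)
    return fine[np.argmax(sino.max(axis=0))]
```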

Fig. 2 Schematic diagram of TsRT and the result of the Radon transform. (a) Schematic diagram of rough scanning (green sector) and fine scanning (red sector). The background sample is one diffraction pattern recorded by the sCMOS camera. (b) Sinogram of the rough scanning step computed by the Radon transform.

The data processing flowchart is shown in Fig. 1(b), consisting of rotation angle estimation and multi-image phase retrieval. In the reconstruction process, a parallel iterative multi-image phase retrieval algorithm [8] is adopted to reconstruct the complex amplitude of the sample, owing to its fast convergence and high accuracy.

Here it has to be emphasized that the measured patterns are used both to estimate the rotation angles and to constrain the phase retrieval iteration. Thus, there is no need to acquire an extra data set or rely on additional instruments to gauge this parameter. The multi-image phase retrieval algorithm used in our scheme can be seen as the parallel synthesis of multiple conventional single-shot phase retrieval units; it takes the arithmetic average of the complex amplitude estimates from these units in the spatial domain. The iteration halts, and the wave field is finally reconstructed, once a preset threshold is satisfied.
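A minimal sketch of this parallel scheme, reusing the propagator and lens phase defined earlier, is given below; the flat initial guess and the exact constraint ordering are assumptions, since the paper defers these details to [8].

```python
def parallel_phase_retrieval(patterns, alphas, wavelength, d, f, dx, n_iter=100):
    """Parallel multi-image phase retrieval in the spirit of [8]: one
    single-shot retrieval unit per measurement, with the object-plane
    estimates arithmetically averaged at every iteration."""
    shape = patterns[0].shape
    u_est = np.ones(shape, dtype=complex)  # flat initial guess (assumption)
    for _ in range(n_iter):
        estimates = []
        for I_meas, alpha in zip(patterns, alphas):
            R = cylindrical_lens_phase(shape, dx, wavelength, f, alpha)
            # Forward pass: object plane -> lens -> sensor plane
            u_s = angular_spectrum_propagate(
                angular_spectrum_propagate(u_est, wavelength, d, dx) * R,
                wavelength, f, dx)
            # Amplitude constraint: impose the measured modulus, keep the phase
            u_s = np.sqrt(I_meas) * np.exp(1j * np.angle(u_s))
            # Backward pass: sensor plane -> lens -> object plane
            u_d = angular_spectrum_propagate(u_s, wavelength, -f, dx) * np.conj(R)
            estimates.append(angular_spectrum_propagate(u_d, wavelength, -d, dx))
        u_est = np.mean(estimates, axis=0)  # arithmetic average across units
    return u_est
```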

3. Results of simulation and experiment

As a kind of computational imaging, RSCL is demonstrated by simulation and experiment. A 256×256-pixel binary resolution chart (USAF 1951) and a campus landscape picture (see Figs. 3(a) and 3(b)) serve as the amplitude and phase of the object function $U_{\mathrm{obj}}$, respectively. In both simulation and experiment, d = f = 100 mm is chosen and the wavelength is 532 nm.
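A sketch of the simulation setup under these parameters follows; the file names, the grayscale loading, the [0, π] phase range and the set of test angles are placeholders, and the 6.5 μm pixel pitch is borrowed from the experimental camera described in Section 3.3.

```python
from imageio.v3 import imread

# Complex object: resolution chart as amplitude, landscape as phase.
# 256x256 grayscale PNGs are assumed; names and phase range are placeholders.
amp = imread("usaf1951_256.png").astype(float)
pha = imread("campus_256.png").astype(float)
u_obj = (amp / amp.max()) * np.exp(1j * np.pi * pha / pha.max())

wavelength = 532e-9  # 532 nm illumination
d = f = 100e-3       # d = f = 100 mm
dx = 6.5e-6          # pixel pitch, borrowed from the 6.5 um sCMOS camera

# One measurement per rotation angle (8 angles here, for illustration)
alphas = np.deg2rad(np.linspace(0.0, 157.5, 8))
patterns = [forward_model(u_obj, a, wavelength, d, f, dx) for a in alphas]
```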

Fig. 3 Simulation sample images and LMSE curves: (a) binary resolution chart and (b) a campus landscape picture, which serve as the amplitude and phase of the input complex amplitude, respectively. (c) LMSE curves.

3.1 Known rotation degrees

Firstly, we consider the ideal case in which the rotation degrees are exactly known and applied to the reconstruction process. In RSCL, each pattern corresponding to a different rotation angle of the cylindrical lens is generated by the model shown in Fig. 1. Four sets of simulated patterns with different numbers of rotation degrees are tested; their corresponding angles are listed in Table 1.

Table 1. Assigned values of rotation degree in RSCL

Additionally, in order to quantitatively evaluate the quality of the reconstructed amplitude B(x,y) against the reference sample image A(x,y), the normalized correlation coefficient (NCC) is computed as:

$$\mathrm{NCC}=\frac{C_{A,B}}{\sqrt{C_{A,A}\,C_{B,B}}},\tag{4}$$
where $C_{A,B}$ is the covariance of $A$ and $B$, defined as:
$$C_{A,B}=\frac{1}{MN}\sum_{x=1}^{M}\sum_{y=1}^{N}\left[A(x,y)-\bar{A}\right]\left[B(x,y)-\bar{B}\right],\tag{5}$$
where $\bar{A}$ and $\bar{B}$ denote the mean values of $A(x,y)$ and $B(x,y)$. The value of NCC ranges over [0, 1]: the larger the NCC, the better the reconstructed image.

Furthermore, in order to quantitatively evaluate the convergence speed of the parallel phase retrieval algorithm, the logarithm of the mean square error (LMSE) between A(x,y) and the recovered image B(x,y) is calculated as:

$$\mathrm{LMSE}=\log_{10}\left\{\frac{1}{MN}\sum_{m=1}^{M}\sum_{n=1}^{N}\left[A(m,n)-B(m,n)\right]^2\right\}.\tag{6}$$
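Both quality metrics reduce to a few NumPy lines; a minimal sketch (the 1/MN factors cancel in the NCC ratio):

```python
def ncc(a, b):
    """Normalized correlation coefficient, Eqs. (4)-(5)."""
    ca, cb = a - a.mean(), b - b.mean()
    return np.sum(ca * cb) / np.sqrt(np.sum(ca ** 2) * np.sum(cb ** 2))

def lmse(a, b):
    """Logarithm of the mean square error, Eq. (6)."""
    return np.log10(np.mean((a - b) ** 2))
```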

The LMSE curves of the four sets of simulations are shown in Fig. 3(c); convergence accelerates markedly as the number of images (number of rotations) increases. Figure 4 shows reconstructed images obtained with different numbers of measured images and iterations. For clarity, the NCC values are labeled in red above the reconstructed images. Notably, an almost perfect binary resolution chart (NCC = 0.9995) is reconstructed when six rotation measurements are used with 60 iterations (third row, fourth column of Fig. 4). The result can be improved further by increasing the number of rotations or iterations.

Fig. 4 Reconstructed binary resolution chart images for four sets of simulations with different numbers of rotation degrees and iterations.

3.2 Inaccurate rotation degrees

In our scheme, the accuracy of the rotation degree plays a vital role in reconstruction. Lacking ultra-precision rotation control equipment, angle errors are inevitable in our experiment. To assess the effect of degree errors on the reconstructed image quality, five sets of degree offsets are manually added to eight accurate rotation degrees in the reconstruction. To maximize the error extent, all eight rotation degrees receive the same offset. Reconstructions after 200 iterations are shown in Fig. 5. The quality of the retrieved amplitude and phase gradually declines as the degree offset increases from 0.05° to 0.25°. From Fig. 5, the reconstructed result is unacceptable, judged by the naked eye, once the degree offset exceeds 0.1°. Therefore, the rotation degree error should be kept within (−0.10°, +0.10°), which serves as a qualitative criterion for the rotation angle estimation.

Fig. 5 Reconstructed amplitude and phase with different rotation degree offsets.

Admittedly, owing to the hysteresis error of the gear, the rotation degree used for reconstruction inevitably contains error, sometimes more than 0.10°. In that case, the quality of the reconstructed image is severely hampered, as analyzed in Fig. 5. To solve this problem, TsRT is proposed to calculate the rotation degree with high precision. Simulation and experimental results prove that TsRT is precise enough to satisfy the requirements of reconstruction. To verify the effectiveness of the TsRT method, a Monte Carlo simulation with rotation degrees ranging over (0°, 180°) is run 1,800 times. As seen from Fig. 6(b), the calculated degree errors are mostly below 0.1°, except in a quite narrow range near 90°; these abnormal angle ranges can easily be avoided when recording data. The average estimation error relative to the truth does not exceed 0.0085°, which is larger than the fine step size; the difference is conjectured to stem from quantization error and the limited transverse sampling rate. Within this simulation test, nine random rotation degrees are chosen within (0°, 90°); their amplitude distributions are shown in Fig. 6(a), with the absolute values of the calculated rotation degree errors labeled in white at the top left. The reconstructed results further confirm the availability of TsRT, which contributes to a very high quality of reconstructed patterns.
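A sketch of this Monte Carlo check, reusing the earlier functions, might look as follows; the random seed is arbitrary, angle wrap-around at 0°/180° is ignored for brevity, and the full run is computationally heavy.

```python
# Monte Carlo check of TsRT: 1,800 random true angles in (0, 180) degrees,
# simulated with the earlier forward model and re-estimated by TsRT.
rng = np.random.default_rng(0)
errors = []
for _ in range(1800):
    alpha_true = rng.uniform(0.0, 180.0)
    pattern = forward_model(u_obj, np.deg2rad(alpha_true), wavelength, d, f, dx)
    errors.append(abs(estimate_angle_tsrt(pattern) - alpha_true))
print(f"mean |error| = {np.mean(errors):.4f} deg")
```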

Fig. 6 Diffraction patterns and degree errors calculated by the TsRT method. (a) Amplitude distributions (background patterns) of nine random rotation degrees in RSCL and the calculated absolute degree errors (white numbers) by the TsRT method. (b) Distribution of rotation-degree estimation errors by TsRT.

To further demonstrate the practicability of TsRT, its estimated angles are fed into the phase retrieval algorithm to reconstruct the wave field. Here, the four sets of measured patterns are kept the same as in the simulation of Section 3.1. The convergence and reconstructions are shown by the LMSE curves and NCC values in Fig. 7. For clarity, each NCC is labeled above the reconstructed amplitude in the same color as its LMSE curve. After about 50 iterations with 8 rotation degrees, the sample is perfectly reconstructed (NCC = 1.0000), as shown by the magenta curve in Fig. 7. This clearly indicates that the rotation degrees calculated by TsRT are precise enough that the iterative algorithm robustly converges to the exact solution within a limited computational load.

Fig. 7 LMSE curves of four sets of random rotation degrees and the corresponding reconstructed results.

3.3 Experimental results

The layout of the experimental configuration is shown in Fig. 1. A fiber laser illuminates the sample. A plano-convex round cylindrical lens (f = 100 mm, N-BK7, Thorlabs) is rotated by a motorized precision rotation stage (PRM1Z8, Thorlabs), and the corresponding diffraction patterns are recorded by an sCMOS camera (pco.edge 4.2 LT, 6.5 μm pixel pitch, PCO). Three different samples, a square-shaped transparent sample (cut from a USAF 1951 negative resolution chart) and two cross-shaped arrows of different sizes, are used, and 18 diffraction patterns with different rotation degrees are recorded for each wave field reconstruction.

After 100 iterations, the reconstructed amplitudes and phases of the three samples are shown in Fig. 8. The square sample is shown in Fig. 8(a), and diffraction patterns at different rotation degrees are shown in Fig. 8(b). The reconstructed amplitude and phase are shown in Figs. 8(c) and 8(d), in which the amplitude patterns have been normalized. The edges of the reconstructed square are quite clear, and the phase is completely consistent with the amplitude, which confirms that RSCL performs excellently for computational imaging. Besides, it is worth noting that for our amplitude-only experimental samples, the reconstructed phase exhibits obvious, regularly shaped fluctuations, shown in Figs. 8(d), 8(g) and 8(i). Here, the retrieved phase map is the phase of the product of the illumination function and the object function. Since the object carries no phase of its own, this phase distribution is rooted in the illumination function. In our optical setup, a micro-aperture and a collimating lens are adopted to generate the plane-wave illumination. Due to possible misalignment of the apparatus, the actual illumination wavefront is spherical, resulting in phase wrapping in the transparent part of the amplitude object. To verify this point, phase unwrapping is executed along the white solid line in Fig. 8(i), and the result is shown in its inset box. The parabolic profile demonstrates the spherical wavefront of the illumination function. Because the opaque part of the amplitude object corresponds to zero points with arbitrary phase, perfect phase unwrapping of the two-dimensional data is a huge challenge; there, any random phase value is mathematically acceptable. According to the analysis above, the large fluctuations of the reconstructed phase maps come from the non-plane-wave illumination and phase wrapping.

Fig. 8 Reconstructed experimental results. (a) Square-shaped sample. (b) Recorded diffraction patterns from the sCMOS camera. (c, d) Reconstructed amplitude and phase of the square-shaped sample. (e, h) Two cross-shaped arrow samples of different sizes. (f, g) and (i, j) Reconstructed amplitude and phase of the two arrow samples, in which the white bars correspond to 650 μm.

On the other hand, to further attest that RSCL has a stronger computational imaging capability, its results are compared with those of the previous multi-distance method [7,9]. The two cross-shaped arrows of different sizes are the same samples used in [7,9]. The edges of the reconstructions in our scheme are much clearer than the results in Fig. 9 of [7] and Fig. 8 of [9]. RSCL thus shows a clear advantage over multi-distance imaging.

Fig. 9 Filtered amplitude patterns of the bigger cross-shaped arrow processed by four different filters: (a) 3D figure of the red square frame in Fig. 8(i). (b) Gaussian filter, (d) mean filter, (f) median filter and (h) BM3D. (c, e, g, i) The corresponding local 3D figures of the red square areas in (b, d, f, h). The white bars correspond to 650 μm.

In the experiment, owing to the focusing of the cylindrical lens, overexposure happens easily. To avoid it, the exposure time is set low and the intensity of the laser source is simultaneously reduced as much as possible, which lowers the signal-to-noise ratio. This explains why the reconstructed results in Fig. 8 are contaminated by noise. To demonstrate the effect of the noise clearly, a square area marked with the red solid frame in Fig. 8(i) is chosen, and its local three-dimensional (3D) shape is plotted in Fig. 9(a). Two small solid magenta rectangles in the object and in the background are chosen to calculate variances: the variances of the pixel values in regions A0 and B0 of Fig. 8(i) are 486.6898 and 118.1704, respectively. Both the 3D figure and the calculated variances show that the reconstructed pattern suffers severe noise interference.

To find a suitable filtering method to alleviate the effect of the noise, four popular schemes (Gaussian, mean, median filter and BM3D [44]) are considered and tested. Taking the amplitude pattern in Fig. 8(i) as an example, the filtered amplitudes are shown in Figs. 9(b), 9(d), 9(f) and 9(h). For further comparison, local 3D views of the filtered amplitudes, analogous to Fig. 9(a), are shown. Eight corresponding magenta rectangle areas, marked A1~A4 and B1~B4 in Figs. 9(b), 9(d), 9(f) and 9(h), are selected to calculate variances; the results are listed in Table 2. The Gaussian and mean filters show little difference in variance and bring a certain degree of improvement in image quality. The median filter yields a much smaller variance but lower imaging contrast. The variance from BM3D is small enough and its contrast is quite high. In RSCL, BM3D is therefore adopted to address the noise problem.
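A sketch of this filter comparison follows; the kernel sizes, the BM3D noise level and the region coordinates are placeholders, and the bm3d PyPI package is only one possible implementation of [44].

```python
from scipy.ndimage import gaussian_filter, median_filter, uniform_filter
import bm3d  # pip install bm3d; one possible implementation of [44]

amp_rec = np.abs(u_est)
amp_rec /= amp_rec.max()  # normalize to [0, 1] for BM3D
filtered = {
    "Gaussian": gaussian_filter(amp_rec, sigma=1.0),  # kernel sizes are placeholders
    "mean":     uniform_filter(amp_rec, size=3),
    "median":   median_filter(amp_rec, size=3),
    "BM3D":     bm3d.bm3d(amp_rec, sigma_psd=0.1),    # noise level is a placeholder
}
# Variance inside object (A) and background (B) regions; the slice bounds
# stand in for the magenta rectangles of Fig. 9.
A = (slice(100, 130), slice(100, 130))
B = (slice(10, 40), slice(10, 40))
for name, img in filtered.items():
    print(f"{name}: var(A) = {np.var(img[A]):.4f}, var(B) = {np.var(img[B]):.4f}")
```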

Table 2. Variance of ten different magenta rectangle areas

One problem worth discussing is the source of the noise. In the experiment, it is precisely to avoid overexposure that the intensity of the laser source is reduced as much as possible. Compared with lensless diffraction imaging [9,45], the focusing of the cylindrical lens nonlinearly amplifies a feeble optical field at the input plane into a strong one at the sCMOS plane, which makes overexposure much easier. In this situation, noise plays a bigger role, as in similar single-lens coherent diffraction imaging, for which the noise models have been discussed in detail [7].

In order to confirm its superiority, two more simulation experiments are conducted. Firstly, under the same optical system, a 256×256 binary resolution chart (USAF 1951) and eight diffraction patterns are used in both methods. For the multi-distance method, the inclination of the cylindrical lens is fixed at 60° and the defocus depth of the recording plane is limited to within 2.0 cm. Other parameters are kept the same as in the previous simulations. The LMSE curves and six reconstructed patterns at 20, 60 and 100 iterations for the two methods are shown in Fig. 10. Obviously, after the same number of iterations, both the convergence speed and the quality of the reconstructed patterns of the multi-distance method are heavily degraded, owing to the spatial sampling rate variation brought about by the distance change.

Fig. 10 LMSE curves of RSCL and multi-distance methods.

4. Conclusion

We propose an optical scanning imaging system based on the rotation of a single cylindrical lens to reconstruct the complex amplitude of a sample with a parallel-computing multi-image phase retrieval algorithm. Two different situations (known and estimated rotation degrees) are investigated in both simulation and experiment. When the rotation degrees are exactly known, the wave field is reconstructed with high fidelity and fast convergence. For blind rotation degrees, TsRT is designed to calculate the rotation angles from the measurements, and its results fully meet the requirements of reconstruction. Compared with multi-distance computational imaging, the lateral sampling rate is fixed, which simplifies the algorithm design. RSCL needs only a single rotating lens to modulate the diffraction pattern and is more convenient than the gyrator transform. Besides, RSCL is also a novel approach to acquiring redundant optical information, offering a possible route to recovering multi-dimensional light field data such as intensity, phase and polarization.

Funding

National Natural Science Foundation of China (NSFC) (61575055, 61575053, 11474077, 11404083).

Acknowledgments

The authors thank the anonymous reviewers for their helpful suggestions and comments.

References and links

1. X. Pan, C. Liu, Q. Lin, and J. Zhu, "Ptycholographic iterative engine with self-positioned scanning illumination," Opt. Express 21(5), 6162–6168 (2013).

2. H. M. L. Faulkner and J. M. Rodenburg, "Movable aperture lensless transmission microscopy: a novel phase retrieval algorithm," Phys. Rev. Lett. 93(2), 023903 (2004).

3. J. Sun, Q. Chen, Y. Zhang, and C. Zuo, "Efficient positional misalignment correction method for Fourier ptychographic microscopy," Biomed. Opt. Express 7(4), 1336–1350 (2016).

4. J. Sun, Y. Zhang, C. Zuo, Q. Chen, S. Feng, Y. Hu, and J. Zhang, "Coded multi-angular illumination for Fourier ptychography based on Hadamard codes," Proc. SPIE 9524, 95242C (2015).

5. G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013).

6. X. Shao, X. Dai, and X. He, "Noise robustness and parallel computation of the inverse compositional Gauss-Newton algorithm in digital image correlation," Opt. Lasers Eng. 71, 9–19 (2015).

7. C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, "Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint," Opt. Express 25(14), 16235–16249 (2017).

8. Z. Liu, C. Guo, J. Tan, Q. Wu, L. Pan, and S. Liu, "Iterative phase-amplitude retrieval with multiple intensity images at output plane of gyrator transforms," J. Opt. 17(2), 025701 (2015).

9. C. Guo, C. Shen, J. Tan, X. Bao, S. Liu, and Z. Liu, "A robust multi-image phase retrieval," Opt. Lasers Eng. 101, 16–22 (2018).

10. C. Guo, C. Wei, J. Tan, K. Chen, S. Liu, Q. Wu, and Z. Liu, "A review of iterative phase retrieval for measurement and encryption," Opt. Lasers Eng. 89, 2–12 (2017).

11. C. Shen, J. Tan, C. Wei, and Z. Liu, "Coherent diffraction imaging by moving a lens," Opt. Express 24(15), 16520–16529 (2016).

12. C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, "Axial multi-image phase retrieval under tilt illumination," Sci. Rep. 7(1), 7562 (2017).

13. A. M. Farber and J. G. Williams, "Coded-aperture Compton camera for Gamma-ray imaging," EPJ Web Conf. 106, 05003 (2016).

14. M. N. Lakshmanan, R. E. Morris, J. A. Greenberg, E. Samei, and A. J. Kapadia, "Coded aperture coherent scatter imaging for breast cancer detection: a Monte Carlo evaluation," Proc. SPIE 9783, 978321 (2016).

15. V. Katkovnik, I. Shevkunov, N. V. Petrov, and K. Egiazarian, "Computational super-resolution phase retrieval from multiple phase-coded diffraction patterns: simulation study and experiments," Optica 4(7), 786–794 (2017).

16. M. G. L. Gustafsson, "Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy," J. Microsc. 198(2), 82–87 (2000).

17. S. W. Hell and J. Wichmann, "Breaking the diffraction resolution limit by stimulated emission: stimulated-emission-depletion fluorescence microscopy," Opt. Lett. 19(11), 780–782 (1994).

18. M. G. L. Gustafsson, "Nonlinear structured-illumination microscopy: wide-field fluorescence imaging with theoretically unlimited resolution," Proc. Natl. Acad. Sci. U.S.A. 102(37), 13081–13086 (2005).

19. T. A. Klar, S. Jakobs, M. Dyba, A. Egner, and S. W. Hell, "Fluorescence microscopy with diffraction resolution barrier broken by stimulated emission," Proc. Natl. Acad. Sci. U.S.A. 97(15), 8206–8210 (2000).

20. J. Zhang, E. Moradi, M. G. Somekh, and M. L. Mather, "Label-free, high resolution, multi-modal light microscopy for discrimination of live stem cell differentiation status," Sci. Rep. 8(1), 697 (2018).

21. E. Betzig, G. H. Patterson, R. Sougrat, O. W. Lindwasser, S. Olenych, J. S. Bonifacino, M. W. Davidson, J. Lippincott-Schwartz, and H. F. Hess, "Imaging intracellular fluorescent proteins at nanometer resolution," Science 313(5793), 1642–1645 (2006).

22. M. J. Rust, M. Bates, and X. Zhuang, "Sub-diffraction-limit imaging by stochastic optical reconstruction microscopy (STORM)," Nat. Methods 3(10), 793–796 (2006).

23. J. M. Rodenburg and H. M. L. Faulkner, "A phase retrieval algorithm for shifting illumination," Appl. Phys. Lett. 85(20), 4795–4797 (2004).

24. R. Heintzmann and C. Cremer, "Laterally modulated excitation microscopy: improvement of resolution by using a diffraction grating," Proc. SPIE 3569, 185–196 (1999).

25. D. Dan, M. Lei, B. Yao, W. Wang, M. Winterhalder, A. Zumbusch, Y. Qi, L. Xia, S. Yan, Y. Yang, P. Gao, T. Ye, and W. Zhao, "DMD-based LED-illumination super-resolution and optical sectioning microscopy," Sci. Rep. 3(1), 1116 (2013).

26. J. Qian, M. Lei, D. Dan, B. Yao, X. Zhou, Y. Yang, S. Yan, J. Min, and X. Yu, "Full-color structured illumination optical sectioning microscopy," Sci. Rep. 5(1), 14513 (2015).

27. S. Dong, P. Nanda, K. Guo, J. Liao, and G. Zheng, "Incoherent Fourier ptychographic photography using structured light," Photon. Res. 3(1), 19–23 (2015).

28. J. A. Rodrigo, H. Duadi, T. Alieva, and Z. Zalevsky, "Multi-stage phase retrieval algorithm based upon the gyrator transform," Opt. Express 18(2), 1510–1520 (2010).

29. G. Z. Yang, B. Z. Dong, B. Y. Gu, J. Y. Zhuang, and O. K. Ersoy, "Gerchberg-Saxton and Yang-Gu algorithms for phase retrieval in a nonunitary transform system: a comparison," Appl. Opt. 33(2), 209–218 (1994).

30. G. Pedrini, W. Osten, and Y. Zhang, "Wave-front reconstruction from a sequence of interferograms recorded at different planes," Opt. Lett. 30(8), 833–835 (2005).

31. A. E. Attard, "Matrix optical analysis of skew rays in mixed systems of spherical and orthogonal cylindrical lenses," Appl. Opt. 23(16), 2706–2709 (1984).

32. X. Fu, F. Duan, J. Jiang, T. Huang, L. Ma, and C. Lv, "Astigmatism-corrected echelle spectrometer using an off-the-shelf cylindrical lens," Appl. Opt. 56(28), 7861–7868 (2017).

33. L. Li, C. Kuang, D. Luo, and X. Liu, "Axial nanodisplacement measurement based on astigmatism effect of crossed cylindrical lenses," Appl. Opt. 51(13), 2379–2387 (2012).

34. C. J. R. Sheppard, "Cylindrical lenses – focusing and imaging: a review [Invited]," Appl. Opt. 52(4), 538–545 (2013).

35. D. Di Battista, D. Ancora, H. S. Zhang, K. Lemonaki, E. Marakis, E. Liapis, S. Tzortzakis, and G. Zacharakis, "Tailored light sheets through opaque cylindrical lenses," Optica 3(11), 1237–1240 (2016).

36. J. Courtial and M. J. Padgett, "Performance of a cylindrical lens mode converter for producing Laguerre-Gaussian laser modes," Opt. Commun. 159(1–3), 13–18 (1999).

37. H. Liu, L. Yang, Y. Guo, R. Guan, and J. Zhu, "Precise calibration of linear camera equipped with cylindrical lenses using a radial basis function-based mapping technique," Opt. Express 23(3), 3412–3426 (2015).

38. L. Pang, U. Levy, K. Campbell, A. Groisman, and Y. Fainman, "Set of two orthogonal adaptive cylindrical lenses in a monolith elastomer device," Opt. Express 13(22), 9003–9013 (2005).

39. X. Fang, Z. Kuang, P. Chen, H. Yang, Q. Li, W. Hu, Y. Lu, Y. Zhang, and M. Xiao, "Examining second-harmonic generation of high-order Laguerre-Gaussian modes through a single cylindrical lens," Opt. Lett. 42(21), 4387–4390 (2017).

40. L. Sun and X. Pu, "A novel visualization technique for measuring liquid diffusion coefficient based on asymmetric liquid-core cylindrical lens," Sci. Rep. 6(1), 28264 (2016).

41. O. E. Olarte, J. Andilla, E. J. Gualda, and P. Loza-Alvarez, "Light-sheet microscopy: a tutorial," Adv. Opt. Photonics 10(1), 111–179 (2018).

42. K. B. Im, S. Han, H. Park, D. Kim, and B. M. Kim, "Simple high-speed confocal line-scanning microscope," Opt. Express 13(13), 5151–5156 (2005).

43. J. Radon, "On the determination of functions from their integral values along certain manifolds," IEEE Trans. Med. Imaging 5(4), 170–176 (1986).

44. A. Danielyan, V. Katkovnik, and K. Egiazarian, "BM3D frames and variational image deblurring," IEEE Trans. Image Process. 21(4), 1715–1728 (2012).

45. C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, "Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference," Opt. Lasers Eng. 105, 54–59 (2018).
