
Mueller matrix polarimetry with 3D integral imaging


Abstract

In this paper, we introduce Mueller matrix imaging concepts for 3D integral imaging polarimetry. The Mueller matrix of a complex scene is measured and estimated with 3D integral imaging. This information can be used to analyze the complex polarimetric behavior of any 3D scene. In particular, we show that the degree of polarization can be estimated at any selected plane for any arbitrary synthetic illumination source, including sources that may be difficult to produce in practice. This tool might open new perspectives for polarimetric analysis in the 3D domain. We also illustrate that 2D polarimetric images are noisier than their 3D reconstructed integral imaging counterparts. To the best of our knowledge, this is the first report on Mueller matrix polarimetry in 3D integral imaging.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Polarization imaging techniques are receiving increasing interest because they provide a way to overcome the intrinsic limitations of standard light-intensity imaging [1,2]. Whereas conventional cameras record the intensity of the electric field, polarimetric measurements provide information related to the direction and phase difference of the electric field and the ratio between fully polarized and unpolarized light at each point of the scene. This is possible because of the recent availability of polarization image sensors (see, for instance, [3–5]). Based on the well-known Mueller matrix theory, it is a relatively straightforward task to implement polarimetric Mueller imaging techniques using standard polarization equipment plus a CCD camera [6–10]. Polarization-based imaging can be much more sensitive than conventional imaging. These techniques could have high impact in fields such as machine vision, target detection in turbid media, or underwater imaging, provided they are adequately implemented according to the requirements of each particular problem [11–13]. For example, in machine vision applications, one of the required capabilities is depth discrimination of planes within the scene. For this particular case, besides using polarization imaging, it may be appropriate to take advantage of other imaging techniques such as 3D integral imaging (InIm) [14–25]. In summary, to enhance the overall capabilities of the entire imaging process, it can sometimes be useful to have a 3D image acquisition step followed by polarization-based imaging and processing stages. In this way, the specific advantages of the two techniques are jointly exploited.

In the present work, we propose Mueller matrix imaging using 3D integral imaging. The Mueller matrix of the 3D scene is generated and used in several scenarios. For instance, we show that reconstructed 3D information displays less noise than the corresponding 2D images. Moreover, it is possible to select a particular plane where the in-focus polarimetric information is calculated. Also, once the 3D Mueller matrix InIm is known, we show that it is possible to generate polarimetric landscapes corresponding to any arbitrary synthetic polarimetric illumination, including illuminations that may be difficult to produce in practice.

The paper is organized as follows. In Section 2, we review basic concepts of Stokes image acquisition and Mueller matrix calculation. In Section 3, we describe the optical setup and introduce the scene used in the experiments. The proposed imaging model is tested in Section 4, where we compare the experimental Stokes images with those obtained with the proposed model. In Section 5, we use synthetic light to artificially produce polarimetric images that cannot be easily generated in practice. Finally, we present our conclusions in Section 6.

2. Review on Mueller matrix imaging

The Mueller matrix formalism [6] can be applied to 3D scenes illuminated by quasi-monochromatic light. This assumption rests on two facts that hold in our experiments: i) the illuminating light has an almost parallel beam configuration, and ii) the acquisition camera detects the light scattered by the scene in a certain well-defined direction. Thus, the directional character of both the illumination and the light detection suggests that the light we collect at each pixel of the camera is the result of an overall interaction of the incident beam with the scene that can be modelled on a Mueller matrix basis. Therefore, we need to determine the Mueller matrix that fully represents each single pixel of the detected images. Ultimately, the suitability of our central assumption can only be tested and confirmed by experiment. This is the main objective of the present work.

Let us briefly summarize the Mueller matrix formalism, adapted to our situation. The Mueller matrix of a sample illuminated by partially polarized quasi-monochromatic light describes the changes in the state of polarization of the light when it is reflected from the sample, assuming the validity of a linear system scheme. The input and output data are the four Stokes parameters of the light. Thus, the corresponding 4×6 measurement set determines the 4×4 Mueller matrix $M(i,j)$ at pixel $(i,j)$, which satisfies

$$S^{out}(i,j) = M(i,j)\,S^{inp}, \tag{1}$$
where $S^{inp} = (S_0^{inp}, S_1^{inp}, S_2^{inp}, S_3^{inp})^T$ and $S^{out}(i,j) = (S_0^{out}(i,j), S_1^{out}(i,j), S_2^{out}(i,j), S_3^{out}(i,j))^T$ are the Stokes vectors of the illumination source and the recorded light, respectively. In particular, note that we assume the scene is illuminated by a source with a constant polarization state, i.e., one that does not depend on the pixel. To determine the 16 parameters of matrix $M$, a suitable number of measurements is required to solve the system of linear equations in Eq. (1). The Mueller matrix could be calculated using just four well-selected states of polarization; however, this approach requires wave plates with tunable phase difference [26]. To obtain accurate results in spite of the inherent experimental inaccuracies, over-determination of the system of linear equations may be necessary. Accordingly, we illuminate the scene with six polarization states (L1 to L6) that correspond to the following set of Stokes parameters (Table 1):

Table 1. Polarization States of the Input Illumination

Input states of polarization L1, L2, L4, and L5 describe linearly fully polarized light in the 0°, 45°, 90°, and 135° directions, respectively. States L3 and L6 correspond to right and left circularly polarized sources. These six input polarization states are selected because they are simple to produce in the laboratory using a polarizer and a quarter-wave plate. Any polarization state on the Poincaré sphere can be expressed as a combination of states L1 to L6.
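For reference, the six input states can be written directly as Stokes vectors. The following Python/NumPy snippet (ours, not part of the original work; the variable names are illustrative) collects the standard, intensity-normalized Stokes vectors of the states described above:

```python
import numpy as np

# Standard Stokes vectors (S0, S1, S2, S3) of the six illumination
# states in Table 1, with intensities normalized so that S0 = 1.
INPUT_STATES = {
    "L1": np.array([1.0,  1.0,  0.0,  0.0]),  # linear, 0 deg
    "L2": np.array([1.0,  0.0,  1.0,  0.0]),  # linear, 45 deg
    "L3": np.array([1.0,  0.0,  0.0,  1.0]),  # right circular
    "L4": np.array([1.0, -1.0,  0.0,  0.0]),  # linear, 90 deg
    "L5": np.array([1.0,  0.0, -1.0,  0.0]),  # linear, 135 deg
    "L6": np.array([1.0,  0.0,  0.0, -1.0]),  # left circular
}
```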

For each illumination L1 to L6, we take six snapshots (image captures) that enable the measurement of the four output Stokes parameters as follows. The first four snapshots are $I_0(L_k;i,j)$, $I_{45}(L_k;i,j)$, $I_{90}(L_k;i,j)$, and $I_{135}(L_k;i,j)$, where $I_0(L_k;i,j)$ is the measured light intensity when an ideal linear polarizer (at an angle of 0° with the X axis) is placed before the detector, and $L_k$ stands for the k-th state of polarization (k = 1,…,6, as in Table 1). $I_{45}(L_k;i,j)$, $I_{90}(L_k;i,j)$, and $I_{135}(L_k;i,j)$ are the measurements for a linear polarizer at an angle of 45°, 90°, and 135°, respectively. The output Stokes parameters are determined in the usual way:

$$S_1^{out}(L_k;i,j) = I_0(L_k;i,j) - I_{90}(L_k;i,j), \qquad S_2^{out}(L_k;i,j) = I_{45}(L_k;i,j) - I_{135}(L_k;i,j). \tag{2}$$
Similarly, we take two more snapshots, $I_{RC}(L_k;i,j)$ and $I_{LC}(L_k;i,j)$, whose intensities are transmitted through a quarter-wave plate plus a linear polarizer set at angles that pass right (or left) circularly polarized light. Then, we define
$$S_3^{out}(L_k;i,j) = I_{RC}(L_k;i,j) - I_{LC}(L_k;i,j). \tag{3}$$
$S_0^{out}(L_k;i,j)$ can be equivalently computed in three ways, as
$$S_0^{out}(L_k;i,j) = I_0(L_k;i,j) + I_{90}(L_k;i,j), \quad S_0^{out}(L_k;i,j) = I_{45}(L_k;i,j) + I_{135}(L_k;i,j), \quad S_0^{out}(L_k;i,j) = I_{RC}(L_k;i,j) + I_{LC}(L_k;i,j). \tag{4}$$
Due to the experimental character of our work, we have found it more convenient to define
$$S_0^{out}(L_k;i,j) = \frac{1}{3}\left( I_0(L_k;i,j) + I_{90}(L_k;i,j) + I_{45}(L_k;i,j) + I_{135}(L_k;i,j) + I_{RC}(L_k;i,j) + I_{LC}(L_k;i,j) \right). \tag{5}$$
The degree of polarization (DoP) for illumination $L_k$ at each pixel $(i,j)$ of the output is
$$\mathrm{DoP}(L_k;i,j) = \frac{\sqrt{S_1^{out}(L_k;i,j)^2 + S_2^{out}(L_k;i,j)^2 + S_3^{out}(L_k;i,j)^2}}{S_0^{out}(L_k;i,j)}. \tag{6}$$
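As a concrete illustration of Eqs. (2)–(6), the following sketch (our own, not from the original paper; the function and array names are illustrative) computes the output Stokes images and the DoP map from the six analyzer snapshots recorded under one illumination $L_k$:

```python
import numpy as np

def stokes_and_dop(I0, I45, I90, I135, IRC, ILC):
    """Output Stokes images and DoP for one illumination, per Eqs. (2)-(6).

    Each argument is a 2D intensity image recorded behind the corresponding
    analyzer: a linear polarizer at 0/45/90/135 degrees, or a quarter-wave
    plate plus polarizer passing right/left circularly polarized light.
    """
    S1 = I0 - I90                    # Eq. (2)
    S2 = I45 - I135                  # Eq. (2)
    S3 = IRC - ILC                   # Eq. (3)
    # Averaged total intensity of Eq. (5): uses all six snapshots instead
    # of any single pair from Eq. (4), which is more robust to noise.
    S0 = (I0 + I90 + I45 + I135 + IRC + ILC) / 3.0
    dop = np.sqrt(S1**2 + S2**2 + S3**2) / S0    # Eq. (6)
    return np.stack([S0, S1, S2, S3], axis=-1), dop
```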
For each pixel $(i,j)$, we define two 4×6 matrices, whose columns correspond to the illumination states L1 to L6, as follows:
$$V^{inp} = \begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 1 \\ 1 & 0 & 0 & -1 & 0 & 0 \\ 0 & 1 & 0 & 0 & -1 & 0 \\ 0 & 0 & 1 & 0 & 0 & -1 \end{pmatrix} \tag{7}$$
and

$$V^{out}(i,j) = \begin{pmatrix} S_0(L_1;i,j) & S_0(L_2;i,j) & \cdots & S_0(L_6;i,j) \\ S_1(L_1;i,j) & S_1(L_2;i,j) & \cdots & S_1(L_6;i,j) \\ S_2(L_1;i,j) & S_2(L_2;i,j) & \cdots & S_2(L_6;i,j) \\ S_3(L_1;i,j) & S_3(L_2;i,j) & \cdots & S_3(L_6;i,j) \end{pmatrix}. \tag{8}$$

Each entry of $V^{out}(i,j)$ is a Stokes-parameter image obtained when the scene is illuminated by a light source with one of the input polarization states shown in Table 1. The 36 intensity images then give rise to an overdetermined system of linear equations that can be written as

$$V^{out}(i,j) = M(i,j)\,V^{inp}. \tag{9}$$
It is important to note that Eq. (9) is similar to Eq. (1), but there are several significant differences. First, the input and output quantities are arranged in 4×6 matrices. Second, it is worth pointing out that Eq. (9) leads to a best-fit (least-squares) estimate of the Mueller matrix, $M^{est}(i,j) = V^{out}(i,j)\,(V^{inp})^{+}$, where + denotes the Moore–Penrose pseudoinverse; this estimate depends on the number of measurements carried out, i.e., the number of columns of $V^{out}(i,j)$ and $V^{inp}$. Incidentally, from the numerical point of view, modern computing platforms allow an optimal treatment of the mathematical equations involved in Eq. (1), so that finding the matrix $M^{est}(i,j)$ is relatively straightforward.
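Numerically, the least-squares solution of Eq. (9) is a one-liner via the pseudoinverse. The sketch below (ours; the array layout is an assumption about how the Stokes images are organized) estimates the Mueller matrix at every pixel at once:

```python
import numpy as np

# Columns of V_INP are the Stokes vectors of illuminations L1..L6, Eq. (7).
V_INP = np.array([[1.0,  1.0,  1.0,  1.0,  1.0,  1.0],
                  [1.0,  0.0,  0.0, -1.0,  0.0,  0.0],
                  [0.0,  1.0,  0.0,  0.0, -1.0,  0.0],
                  [0.0,  0.0,  1.0,  0.0,  0.0, -1.0]])

def estimate_mueller(V_out):
    """Pixelwise least-squares solution of Eq. (9).

    V_out: array of shape (H, W, 4, 6); V_out[i, j, :, k] is the measured
    output Stokes vector at pixel (i, j) under illumination L_{k+1}.
    Returns M_est of shape (H, W, 4, 4).
    """
    # The overdetermined system V_out = M V_inp is solved in the
    # least-squares sense by M = V_out V_inp^+ (Moore-Penrose pseudoinverse).
    return V_out @ np.linalg.pinv(V_INP)
```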

3. Experimental setup

In order to test the validity of our approach, we arranged an experimental setup consisting of a scene with several elements, illuminated by a light source that provides a fairly parallel, monochromatic beam. The image capture can be done with a photographic camera: integral imaging can be implemented with an array of conventional 2D cameras, a single 2D camera with a lenslet array, a single moving 2D camera, or a commercially available plenoptic camera. Figure 1(a) shows a sketch of the experimental arrangement, and Fig. 1(b) the scene used in the experiments. The scene combines several elements made of metal, plastic, or glass on a very reflective stage; moreover, some objects are submerged in liquid. In what follows, we refer to plane F (objects located about 90 cm from the camera) and plane N (objects located about 70 cm from the camera) for the 3D experiments.

Fig. 1 (a) Sketch of the capture system (QWP: quarter-wave plate, LP: linear polarizer, LED: light-emitting diode); (b) the scene used in the experiments. The Lytro Illum camera captures the perspective images for 3D reconstruction.

Several details of the experiment are worth discussing. First, the directional character of both the illumination and the detection processes is preserved; this is a requirement for the validity of the mathematical model we use. Note that the Mueller matrix we compute does not represent the scene in any general sense, since it is only representative of a particular illumination and detection condition. Second, since we need to add or subtract experimental snapshots to compute the Stokes parameters, it is important that all the images are taken under the same exposure conditions; in particular, we avoided overexposed areas. Finally, the images were carefully recorded so that no motion between frames occurred.

Regarding the detection of the light, besides the quarter-wave plate and linear polarizer, a photographic camera is required. In our particular case, we are interested in obtaining sharply focused images at different object planes. In the experiments, we use a Lytro Illum light-field camera for convenience; the specifications of this device can be found in [27]. It is worth pointing out that this particular device provides 15×15 elementary views with 434×625 color pixels. Since we use a green LED, the green channel of the light-field files is used in the calculations.

Subsequently, the required 3D image processing is carried out using the open-source Light Field Toolbox, which enables us to use the 3D imaging capabilities with moderate computational effort [28,29]. For example, to focus on a specific object plane at a given depth, the LFFiltShiftSum filter is sufficient for our purposes. This basic filter processes the images registered by the 3D camera so as to refocus them on a single plane, reducing the intrinsic depth of field of the photographic process. Thus, by selecting the object distance, we obtain the quantities $I_0(L_k;i,j)$, $I_{45}(L_k;i,j)$, $I_{90}(L_k;i,j)$, $I_{135}(L_k;i,j)$, $I_{RC}(L_k;i,j)$, and $I_{LC}(L_k;i,j)$ corresponding to that object distance and illumination $L_k$. Finally, the DoP of the image can be determined using Eqs. (2)–(6). We should point out that, from the images given by the lenslets of the 3D camera, one can select the frame given by a single detector of each lenslet; this leads to a conventional 2D photographic image.
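For readers without access to the toolbox, the shift-and-sum refocusing principle behind LFFiltShiftSum can be sketched as follows. This is a conceptual NumPy analogue under simplifying assumptions (integer pixel shifts, wrap-around boundaries ignored), not the toolbox code:

```python
import numpy as np

def shift_and_sum(views, slope):
    """Conceptual shift-and-sum refocusing of a light field.

    views: array (U, V, H, W) of elemental (perspective) images.
    slope: per-view pixel shift selecting the refocused object plane.
    Returns the 2D image refocused on that plane.
    """
    U, V, H, W = views.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each perspective in proportion to its offset from the
            # central view; in-plane features align, out-of-plane ones blur.
            du = int(round(slope * (u - U // 2)))
            dv = int(round(slope * (v - V // 2)))
            out += np.roll(views[u, v], shift=(du, dv), axis=(0, 1))
    # Averaging the aligned views also reduces noise in the refocused plane.
    return out / (U * V)
```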

4. Testing the proposed model

The proposed polarimetric 3D model is tested in this section. Assuming that the pixel-by-pixel Mueller matrix of the scene is known, we can compare the experimentally recorded images used to compute the Stokes values of the scene with those synthetically obtained using the estimated Mueller matrix. The agreement between experimental measurements and computed quantities is the basic argument for the acceptance or rejection of our approach. Explicitly, we need to compare the estimated polarimetric images $V^{est}(i,j)$,

$$V^{est}(i,j) = M^{est}(i,j)\,V^{inp}, \tag{10}$$
with the experimental ones, $V^{out}(i,j)$. The evaluation of these differences provides an estimate of the validity of the proposed procedure. For comparison, we use the experimentally measured 3D $\mathrm{DoP}^{out}(i,j)$ and the numerically estimated 3D $\mathrm{DoP}^{est}(i,j)$. Figures 2 and 3 display $\mathrm{DoP}^{out}(i,j)$ focused on planes F and N, respectively. Several areas of the scene display high DoP values.

Fig. 2 Experimental evaluation of the 3D $\mathrm{DoP}^{out}(i,j)$ at plane F. The six images of $\mathrm{DoP}^{out}(i,j)$ have been obtained when the scene is illuminated with input polarized light according to Table 1 or Eq. (7).

Fig. 3 Experimental evaluation of the $\mathrm{DoP}^{out}(i,j)$ at plane N.

Note that small differences in the polarimetric landscape can be detected depending on the type of illumination used, for instance on the clock on the left side facing the scene or in the reflections on the stage. In addition, objects away from the focused planes do not provide any polarimetric information. For example, the characters of the label wrapped in plastic (upper left part of the image) clearly appear in Fig. 2 but are not visible in Fig. 3.

Since the estimated Mueller matrix $M^{est}(i,j)$ has been determined, Figs. 2 and 3 can be numerically replicated using Eq. (10). Accordingly, Figs. 4 and 5 show the numerically estimated $\mathrm{DoP}^{est}(i,j)$. In order to determine how close $\mathrm{DoP}^{out}(i,j)$ and $\mathrm{DoP}^{est}(i,j)$ are for the six illuminations considered, we used the Structural Similarity Index Measure (SSIM) [30] and the normalized correlation coefficient. Both metrics range from 0 (completely dissimilar images) to 1 (identical images). The results are presented in Table 2. Interestingly, the similarity between the experimental polarimetric images and the computational ones is very high: in both planes, SSIM and correlation values are around 0.8 and 0.9, respectively. In our experiments, the Mueller matrix $M^{est}(i,j)$ provides a reasonably accurate account of the polarimetric behavior of the 3D scene, under the specific recording conditions sketched in Fig. 1.
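Such a comparison can be reproduced with off-the-shelf tools. A minimal sketch follows (ours; the authors' exact SSIM settings are not specified, so library defaults are assumed, and the normalized correlation is implemented here as a zero-mean correlation coefficient):

```python
import numpy as np
from skimage.metrics import structural_similarity

def compare_dop(dop_out, dop_est):
    """SSIM and normalized correlation between measured and estimated DoP maps."""
    ssim = structural_similarity(
        dop_out, dop_est, data_range=dop_out.max() - dop_out.min())
    # Zero-mean normalized cross-correlation of the two maps.
    a = dop_out - dop_out.mean()
    b = dop_est - dop_est.mean()
    ncc = float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
    return ssim, ncc
```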

Fig. 4 Numerical estimation of the $\mathrm{DoP}^{est}(i,j)$ at plane F.

Fig. 5 Numerical estimation of the $\mathrm{DoP}^{est}(i,j)$ at plane N.

Table 2. SSIM and Normalized Correlation between $\mathrm{DoP}^{out}(i,j)$ and $\mathrm{DoP}^{est}(i,j)$

As discussed earlier, it is possible to select the 2D image frame provided by each lenslet. Using the single 2D image provided by the central lens of the microlens array, we calculated the DoP. This provides a comparison between the 2D and 3D imaging techniques. The results are presented in Fig. 6 for the experimental $\mathrm{DoP}^{out}(i,j)$ and in Fig. 7 for the estimated $\mathrm{DoP}^{est}(i,j)$. As expected, $\mathrm{DoP}^{out}(i,j)$ and $\mathrm{DoP}^{est}(i,j)$ are very similar for 2D imaging. However, the amount of noise present in the DoP of the 2D image is very high, and the DoP values seem to be overestimated compared with the results shown in Figs. 2–5. The 3D InIm reconstruction is obtained by weighted averaging of the different perspective view images, and thus noise is minimized; it can be shown that InIm reconstruction is optimal in the maximum-likelihood sense under certain conditions [31].

Fig. 6 Experimental evaluation of $\mathrm{DoP}^{out}(i,j)$ using a single frame of the 3D camera.

Fig. 7 Numerical estimation of $\mathrm{DoP}^{est}(i,j)$ using a single frame of the 3D camera.

Note that the 2D DoP estimation can be improved if a good-quality conventional camera is used. Figure 8 shows the experimental $\mathrm{DoP}^{out}(i,j)$ obtained with a Canon EOS 400D. The results are less noisy; however, some areas of the background where no objects are present display DoP values close to 1 for certain illumination states (L1 and L4).

Fig. 8 Experimental evaluation of $\mathrm{DoP}^{out}(i,j)$ using a conventional 2D camera.

In order to provide a quantitative comparison among the DoP images, we calculated the Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE) for the images considered [32]. BRISQUE is a widely used no-reference estimator of image quality: it evaluates the statistics of the image in order to find possible distortions, without any comparison against a reference image assumed to be perfect. The resulting values range from 0 (best quality) to 100 (worst quality). By contrast, other common quality metrics, such as the peak signal-to-noise ratio, require a reference image for calculating the mean squared error.

Table 3 shows the BRISQUE values obtained. It is apparent that (i) the 3D InIm DoP distributions display better image quality than the corresponding single-frame ones (columns #2 and #3 versus #4, and columns #6 and #7 versus #8); (ii) the image quality of the distributions generated using the conventional camera (columns #5 and #9) is better than that of the corresponding ones generated from a single frame (columns #4 and #8), although a direct comparison between these two sets of values is not possible because of the different specifications of the two devices; and (iii) the BRISQUE figures for the numerically estimated DoPs (columns #6 to #9) are always lower than those for the experimentally measured ones (columns #2 to #5).

Table 3. BRISQUE Values for the DoP Images

5. Estimation of the DoP using synthetic light

The Mueller matrix $M^{est}(i,j)$ can be used to compute the expected output Stokes parameters $S^{out}(i,j)$ for any arbitrary input illumination $S^{inp}$ by simply using Eq. (1). In this way, the Mueller matrix formalism can be used to produce synthetic light with arbitrary states of polarization, including any state located on the surface of the Poincaré sphere (elliptical polarization) or inside the sphere (partially polarized light). Some of these polarization states are difficult to generate in a real-life experiment because special equipment is required; for instance, producing partially polarized light requires combining unpolarized and totally polarized light, which makes the experimental illumination system more complicated. However, once the Mueller matrix of the scene is determined, it is possible to calculate the full polarimetric information corresponding to an illumination source with any arbitrary state of polarization. Note that this is a general result and may even include structured illumination, i.e., sources with a Stokes input vector $S^{inp}(i,j)$ that changes at every point of the wave-front.
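As a sketch of this idea (ours; it assumes $M^{est}$ is stored as an (H, W, 4, 4) array, as in the estimation sketch above), the DoP under any synthetic illumination follows directly from Eqs. (1) and (6):

```python
import numpy as np

def synthetic_dop(M_est, S_inp):
    """DoP map under an arbitrary synthetic illumination, via Eq. (1).

    M_est: (H, W, 4, 4) pixelwise Mueller matrices.
    S_inp: input Stokes 4-vector, e.g. (1, 0.5, 0, 0) for partially
    polarized light or (1, 0, 0, 0) for natural (unpolarized) light.
    """
    S_out = M_est @ np.asarray(S_inp, dtype=float)        # (H, W, 4)
    return np.sqrt((S_out[..., 1:] ** 2).sum(-1)) / S_out[..., 0]
```

For instance, `synthetic_dop(M_est, (1, 0, 0, 0))` would yield the natural-light DoP map discussed below, without any physical unpolarized source.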

In the examples presented in Fig. 9, we calculated $\mathrm{DoP}^{est}(i,j)$ at plane F for a set of partially polarized illuminations with $S^{inp}$ = (1, 0.5, 0, 0), (1, −0.5, 0, 0), (1, 0, 0.5, 0), (1, 0, −0.5, 0), (1, 0, 0, 0.5), and (1, 0, 0, −0.5). Natural light, $S^{inp}$ = (1, 0, 0, 0) (which was not considered during the Mueller matrix calculation stage), is also included. From the results shown in Figs. 2–5, it is apparent that when illuminating with fully polarized light, large portions of the scene maintain a high degree of polarization. Interestingly, when using partially polarized light, few pixels of the scene appear totally polarized, and thus it is possible to better discriminate those portions of the scene that polarize the reflected light. For example, in Fig. 9 note the screw in the right central part of the scene, or the small metallic paper clips in the upper right side of the scene.

Fig. 9 Numerical estimation of $\mathrm{DoP}^{est}(i,j)$ at plane F using synthetic polarized light.

The last image of the series displayed in Fig. 9 shows a $\mathrm{DoP}^{est}(i,j)$ image obtained by fusing the results for the six partially polarized sources $S^{inp}$: each pixel of this distribution is the maximum value over the $\mathrm{DoP}^{est}(i,j)$ set. This means that all the pixels of the scene with a high capability of polarizing light are shown in this image, irrespective of the illumination state $S^{inp}$. Image fusion algorithms might be an interesting approach for displaying polarimetric information in the presence of object occlusion.
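This fusion rule is a one-line operation; a minimal sketch (ours), assuming the DoP maps of Fig. 9 are available as 2D arrays:

```python
import numpy as np

def fuse_dop(dop_maps):
    """Pixelwise-maximum fusion of several DoP maps (one per synthetic
    illumination), so that every region that strongly polarizes light
    under at least one illumination state is retained."""
    return np.maximum.reduce([np.asarray(d) for d in dop_maps])
```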

6. Conclusion

In this paper, we extended Mueller matrix imaging concepts to the 3D integral imaging field. We have shown that the use of polarimetric light-field information offers several advantages when compared with conventional 2D polarimetric imaging.

We have shown that it is possible to estimate polarimetric information for any single selected plane. We have also illustrated that 2D polarimetric images are noisier than 3D reconstructed integral images; this is because integral imaging computations involve weighted averaging, which may be statistically optimal in the maximum-likelihood sense, so noise is substantially reduced. In addition, the 3D Mueller matrix is used to produce computational polarimetric information using simulated synthetic light sources, which is particularly useful when the required state of polarization is difficult to generate. We have shown illustrative examples with natural, partially polarized, and synthetically produced light. Interestingly, the results obtained in these conditions can be combined with image fusion techniques in order to produce polarimetric maps that cannot be obtained using a single source. The proposed approach can have broad applications in micro- and macro-scale 3D integral imaging [33–37].

Funding

Agencia Estatal de Investigación (AEI) and Fondo Europeo de Desarrollo Regional (FEDER) (FIS2016-77319-C2-2-R); Office of Naval Research (ONR) (N000141712405); Air Force Office of Scientific Research (AFOSR) (FA9550-18-1-0338); Ministerio de Economía y Competitividad (MINECO) (FIS2016-75147-C3-1-P).

References

1. L. B. Wolff, "Polarization vision: a new sensory approach to image understanding," Image Vis. Comput. 15(2), 81–93 (1997).

2. F. A. Sadjadi, "Passive three-dimensional imaging using polarimetric diversity," Opt. Lett. 32(3), 229–231 (2007).

3. X.-F. He, "Polarization-based imaging: basics and benefits," Photon. Spectra (2016).

4. Fraunhofer Institute for Integrated Circuits, "Polka polarization camera," https://www.iis.fraunhofer.de/en/ff/sse/ims/tech/polarisationskamera.html

5. Sony Semiconductor Solutions Corp., "Polarization image sensor," https://www.sony-semicon.co.jp/products_en/new_pro/september_2018/imx250mzr_myr_e.html

6. R. M. A. Azzam, "Stokes-vector and Mueller matrix polarimetry [Invited]," J. Opt. Soc. Am. A 33(7), 1396–1408 (2016).

7. J. L. Pezzaniti and R. A. Chipman, "Mueller matrix imaging polarimetry," Opt. Eng. 34(6), 1558–1568 (1995).

8. J. Zallat, C. Collet, and Y. Takakura, "Clustering of polarization-encoded images," Appl. Opt. 43(2), 283–292 (2004).

9. W. Freda, J. Piskozub, and H. Toczek, "Polarization imaging over sea surface: a method for measurements of Stokes components angular distribution," J. Eur. Opt. Soc.-Rapid Publ. 10, 15060 (2015).

10. W. Sparks, T. A. Germer, J. W. MacKenty, and F. Snik, "Compact and robust method for full Stokes spectropolarimetry," Appl. Opt. 51(22), 5495–5511 (2012).

11. G. D. Lewis, D. L. Jordan, and P. J. Roberts, "Backscattering target detection in a turbid medium by polarization discrimination," Appl. Opt. 38(18), 3937–3944 (1999).

12. G. W. Kattawar and M. J. Raković, "Virtues of Mueller matrix imaging for underwater target detection," Appl. Opt. 38(30), 6431–6438 (1999).

13. C. R. Givens and A. B. Kostinski, "A simple necessary and sufficient condition on physically realizable Mueller matrices," J. Mod. Opt. 40(3), 471–481 (1993).

14. G. Lippmann, "Epreuves reversibles donnant la sensation du relief," J. Phys. Theor. Appl. 7(1), 821–825 (1908).

15. H. E. Ives, "Optical properties of a Lippman lenticulated sheet," J. Opt. Soc. Am. 21(3), 171–176 (1931).

16. C. B. Burckhardt, "Optimum parameters and resolution limitation of integral photography," J. Opt. Soc. Am. 58(1), 71–74 (1968).

17. E. Y. Lam, "Computational photography with plenoptic camera and light field capture: tutorial," J. Opt. Soc. Am. A 32(11), 2021–2032 (2015).

18. H. Arimoto and B. Javidi, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett. 26(3), 157–159 (2001).

19. S. H. Hong, J. S. Jang, and B. Javidi, "Three-dimensional volumetric object reconstruction using computational integral imaging," Opt. Express 12(3), 483–491 (2004).

20. A. Stern and B. Javidi, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE 94(3), 591–607 (2006).

21. R. Martinez-Cuenca, G. Saavedra, M. Martinez-Corral, and B. Javidi, "Progress in 3-D multiperspective display by integral imaging," Proc. IEEE 97(6), 1067–1077 (2009).

22. O. Matoba and B. Javidi, "Three-dimensional polarimetric integral imaging," Opt. Lett. 29(20), 2375–2377 (2004).

23. X. Xiao, B. Javidi, G. Saavedra, M. Eismann, and M. Martinez-Corral, "Three-dimensional polarimetric computational integral imaging," Opt. Express 20(14), 15481–15488 (2012).

24. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, "Advances in three-dimensional integral imaging: sensing, display, and applications [Invited]," Appl. Opt. 52(4), 546–560 (2013).

25. M. Martínez-Corral and B. Javidi, "Fundamentals of 3D imaging and displays: a tutorial on integral imaging, light-field, and plenoptic systems," Adv. Opt. Photonics 10(3), 512–566 (2018).

26. D. S. Sabatke, M. R. Descour, E. L. Dereniak, W. C. Sweatt, S. A. Kemme, and G. S. Phipps, "Optimization of retardance for a complete Stokes polarimeter," Opt. Lett. 25(11), 802–804 (2000).

27. Lytro, Inc., "Lytro camera specs," https://web.archive.org/web/20140801135132/https://www.lytro.com/camera/specs/

28. D. G. Dansereau, O. Pizarro, and S. B. Williams, "Decoding, calibration and rectification for lenselet-based plenoptic cameras," in IEEE Conference on Computer Vision and Pattern Recognition (2013), pp. 1027–1034.

29. D. G. Dansereau, "Light field toolbox," https://www.mathworks.com/matlabcentral/fileexchange/49683-light-field-toolbox-v0-4

30. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Process. 13(4), 600–612 (2004).

31. B. Tavakoli, B. Javidi, and E. Watson, "Three dimensional visualization by photon counting computational integral imaging," Opt. Express 16(7), 4426–4436 (2008).

32. A. Mittal, A. K. Moorthy, and A. C. Bovik, "No-reference image quality assessment in the spatial domain," IEEE Trans. Image Process. 21(12), 4695–4708 (2012).

33. J. S. Jang and B. Javidi, "Three-dimensional integral imaging of micro-objects," Opt. Lett. 29(11), 1230–1232 (2004).

34. B. Javidi, I. Moon, and S. Yeom, "Three-dimensional identification of biological microorganism using integral imaging," Opt. Express 14(25), 12096–12108 (2006).

35. A. Stern and B. Javidi, "3D image sensing and reconstruction with time-division multiplexed computational integral imaging," Appl. Opt. 42(35), 7036–7042 (2003).

36. H. Navarro, M. Martínez-Corral, G. Saavedra, A. Pons, and B. Javidi, "Photoelastic analysis of partially occluded objects with an integral-imaging polariscope," J. Disp. Technol. 10(4), 255–262 (2014).

37. B. Javidi, X. Shen, A. S. Markman, P. Latorre-Carmona, A. Martínez-Uso, J. Martínez Sotoca, F. Pla, M. Martínez-Corral, G. Saavedra, Y.-P. Huang, and A. Stern, "Multidimensional optical sensing and imaging systems (MOSIS): from macro to micro scales," Proc. IEEE 105(5), 850–875 (2017).
