Optica Publishing Group

Division of amplitude RGB full-Stokes camera using micro-polarizer arrays

Open Access

Abstract

Existing polarization cameras are generally single or narrow band and sensitive only to linear polarization states. In this work, we demonstrate a novel full-Stokes camera operating in the red, green, and blue (RGB) spectrum. The camera consists of two 1600-by-1200 pixel focal plane arrays with linear micro-polarizers and Bayer filters, in a division-of-amplitude configuration. The design, assembly, calibration, and testing of the camera are presented. Multiple indoor and outdoor scenes are acquired to provide full Stokes polarization images in the three spectral bands. Our camera design has the unique advantage of measuring the intensity, color, and polarization states of an optical field in a single shot.

© 2017 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

Corrections

Xingzhou Tu, Oliver J. Spires, Xiaobo Tian, Neal Brock, Rongguang Liang, and Stanley Pau, "Division of amplitude RGB full-Stokes camera using micro-polarizer arrays: erratum," Opt. Express 26, 4192-4193 (2018)
https://opg.optica.org/oe/abstract.cfm?uri=oe-26-4-4192

1. Introduction

Intensity, wavelength, and polarization are three important properties of light. The wavelength of light is perceived by the human eye as color, which is often described as responses in red, green, and blue (RGB) color channels. The polarization of light is the orientation of its electric field oscillation. It is commonly characterized by the Jones vector in the coherent case or by the Stokes vector in the incoherent case. Traditional cameras use color filters to capture the color information of an optical field. Linear-Stokes cameras use linear polarizers to record the linear polarization information of an optical field, and have a variety of applications in remote sensing [1–3], biomedical imaging [1,4,5], and interferometry [6,7]. Full-Stokes cameras add the capability of capturing circular polarization information, and can record all four elements of the Stokes vector [8–13].

In this paper, an RGB full-Stokes camera is constructed and tested. This novel camera design involves an arrangement combining both division-of-amplitude and division-of-focal-plane configurations, and can capture the intensity, color, and polarization of an optical field simultaneously.

2. Camera design

Our RGB full-Stokes camera design is based on two custom color polarization cameras constructed by 4D Technology. The custom camera is a division-of-focal-plane RGB linear-Stokes camera, realized by applying a wire-grid micro-polarizer array onto the traditional RGB Bayer sensor array, as shown in Fig. 1. Thus, the camera can capture both linear polarization information and color information of any target scene with appropriate demosaicing [14–17].

Fig. 1 The sensor array and Bayer color filter array of the color polarization camera have a 4.5µm pitch. The linear micro-polarizer array has a 9µm pitch.

The color polarization camera uses a CMOS imaging sensor with 1200 by 1600 total pixels, of which 1082 by 1312 are usable. The pitches of the Bayer array and the micro-polarizer array are 4.5µm and 9µm, respectively. The parallel transmission of the micro-polarizer array for linearly polarized light is >70% over the visible band, and the extinction ratio is >25. The camera supports digital output at up to 12-bit depth, reaches a frame rate of 54 fps, and is connected to and controlled by a computer via a Gigabit Ethernet cable.

In this work, we put an achromatic quarter wave plate (QWP), manufactured by Bolder Vision, in front of the first color polarization camera in order to capture circular polarization information. This color polarization camera is integrated with a second color polarization camera to form a division-of-amplitude RGB full-Stokes camera. The system configuration is shown in Fig. 2. A non-polarizing cube beam splitter divides the incoming light into two components. A glass compensating plate, with the same thickness as the achromatic QWP, is inserted in front of the second color polarization camera to produce the same aberrations and images on the two cameras [18]. In addition, because the cube beam splitter separates light by the principle of frustrated total internal reflection (FTIR) [19], there exists a retardance between the output s- and p-polarized light. Figure 3 shows the retardance curve of the achromatic QWP and the transmittance of the beam splitter. The lens interface is a Pentacon lens mount; an Arsat 30mm f/3.5 lens is used in the measurements. The lens has a long flange focal distance of 74.1mm, necessary for the beam splitter installation, and a large field of view for outdoor measurements.

Fig. 2 The schematics of the RGB full-Stokes camera. An external color filter is added to narrow the transmission band of each of the RGB colors.

Fig. 3 The measured retardance curve of the achromatic QWP and the transmittance of the cube beamsplitter.

3. Camera alignment and calibration

Accurate measurement of color and of the polarization state requires precise alignment of the two color polarization cameras and calibration of all the pixels in each sensor. The alignment is performed using precision translation and rotation stages to adjust the camera position, so that the image of one camera’s sensor through the cube beam splitter matches the other camera’s sensor at sub-pixel accuracy. The details of the alignment process are described in Appendix A. The result of the alignment is shown in Fig. 4, which shows images of a USAF 1951 test target captured by the two aligned cameras. The intensity profile is plotted along different resolution lines in the two images, and the profiles match closely at sub-pixel resolution.

Fig. 4 Alignment result using a USAF resolution test chart taken by the two cameras is shown on the left. Plots of the intensity profile along different lines (Line 1 to Line 4) of the captured images are shown on the right.

The alignment offset value in pixels between the two cameras can be estimated by comparing the images from the two cameras. The image taken by Camera 1 in Fig. 4 is rotated by different Δθ values about the center of the sensor and translated by different Δx and Δy values. A reconstructed image can be calculated using bilinear interpolation after translation and rotation of the original image. The Mean Squared Error (MSE) between the reconstructed image and the image taken by Camera 2 can be calculated for different values of Δθ, Δx, and Δy. The smallest MSE is found for Δx = 0.326 pixels, Δy = 0.296 pixels, and Δθ = 0.0294°, which provide an estimate of the alignment offsets between the two cameras. The MSE is defined by

$$\mathrm{MSE}=\frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl[I_1(i,j)-I_2(i,j)\bigr]^2,$$
where $I_1$ and $I_2$ represent the intensity values of the two images being compared, and $M$ and $N$ are the image dimensions in pixels.
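As an illustration of this search, the sketch below estimates a translation offset by brute-force MSE minimization over integer pixel shifts on a synthetic image pair; the actual procedure additionally searches over rotation Δθ and uses bilinear interpolation to reach sub-pixel shifts. The images and search range here are made up for the example.

```python
import numpy as np

def mse(i1, i2):
    # Mean squared error between two equally sized images (MSE above)
    return np.mean((i1.astype(float) - i2.astype(float)) ** 2)

def estimate_offset(img1, img2, search=3):
    # Brute-force search over integer (dx, dy) shifts of img1;
    # the camera alignment refines this with rotation and
    # bilinear interpolation for sub-pixel offsets.
    best = (0, 0, np.inf)
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            shifted = np.roll(np.roll(img1, dx, axis=1), dy, axis=0)
            e = mse(shifted, img2)
            if e < best[2]:
                best = (dx, dy, e)
    return best

# Synthetic pair: img2 is img1 shifted by 2 pixels in x and 1 in y
rng = np.random.default_rng(0)
img1 = rng.random((64, 64))
img2 = np.roll(np.roll(img1, 2, axis=1), 1, axis=0)
dx, dy, err = estimate_offset(img1, img2)
```

The search recovers the known shift exactly because the MSE vanishes at the true offset; on real camera data the MSE has a smooth minimum that is interpolated to sub-pixel precision.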

The aligned camera is subsequently calibrated to measure the analyzer vector of each pixel. The analyzer vector describes how each pixel responds to different colors and polarizations of the input light [20]. The analyzer vector is used in the reconstruction of the color and polarization image from the camera raw data and takes into account the variation of each pixel and the misalignment. Both the calibration and reconstruction process, including interpolation of adjacent pixels, are described in Appendices B and C.

Figure 5 shows the analyzer vectors of all red, green, and blue pixels of the two cameras and the histograms of their diattenuation. The histograms show that the average diattenuations, indicated by the dashed lines, of the blue, green, and red pixels are 1.01, 0.96, and 0.89, respectively. However, the diattenuation of a wire-grid polarizer generally increases with longer wavelengths. One explanation for our observation is an alignment offset between the micro-polarizer array and the Bayer array, shown exaggerated in Fig. 6, such that the red pixel is covered by four kinds of linear polarizers, while the green pixel is covered by two and the blue pixel by one. The analyzer vector of the red pixel is then a mixture of four linear polarizers at different orientations, leading to a smaller overall diattenuation. The spread of the histograms provides a sense of both the spatial variation of the alignment between the micro-polarizer array and the Bayer array and the calibration error resulting from the analyzer vector calculation process.

In the Poincaré sphere plot, the analyzer vector for the ideal case is also shown as black dots. In the ideal case, the beam splitter has no polarizance, the QWP is a perfect achromat, and each pixel has an identical response. The orientation of the QWP’s fast axis is estimated to be at approximately 165° with respect to the horizontal. In the figure, the analyzer vectors form 8 clusters, corresponding to the 8 types of red, green, and blue pixels inside a 4 by 4 macro-pixel of the camera. Note that the green channel has twice the number of pixels due to the Bayer filter’s arrangement. For convenience, we treat the measurements as 16 independent measurements and divide the analyzer vectors into 16 clusters. The analyzer vectors inside each cluster are averaged, and the averages are combined to form the measurement matrix.

The measurement matrix can then be used to estimate the noise of the Stokes vector reconstruction, i.e., the measurement of the Stokes vector. Here we use the CN (condition number), EWV (equally weighted variance), and RAD (reciprocal absolute determinant), defined by

$$\mathrm{RAD}=\prod_{j=0}^{R-1}\frac{1}{\mu_j},\qquad \mathrm{CN}=\frac{\mu_{\max}}{\mu_{\min}},\qquad \mathrm{EWV}=\sum_{j=0}^{R-1}\frac{1}{\mu_j^{2}},$$
as the figures of merit [21]. These three parameters are related to the singular values µ of the measurement matrix of the Stokes vector. They do not represent the intrinsic noise of the intensity measurement performed by the pixel; instead, they provide a sense of the noise amplification in the Stokes vector reconstruction process. In general, smaller values denote more accurate measurements. The singular values increase as the number of intensity measurements increases. Thus the RAD and EWV usually decrease with more measurements, while the CN, which is a ratio of singular values, does not change. Table 1 summarizes the comparison of the three parameters for the actual camera and for the ideal case. The actual camera has slightly larger values of CN, EWV, and RAD than the ideal case, but the difference is not large. Note that the green channel has a smaller EWV and RAD due to twice the number of green pixels in the Bayer filter arrangement. For comparison, the CNs of the two single-wavelength full-Stokes polarimeters designed by Myhre et al. and Hsu et al. are 3.2255 [8] and 1.7321 [9], respectively.
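The figures of merit follow directly from the singular values of the measurement matrix. The sketch below computes them in NumPy for a textbook example, a single-band full-Stokes polarimeter whose four analyzer vectors form a regular tetrahedron on the Poincaré sphere (not this camera's 32-row matrix); the tetrahedron reproduces the CN of 1.7321 quoted above for the design of Hsu et al. [9].

```python
import numpy as np

def figures_of_merit(W):
    # Figures of merit from the singular values mu of the
    # measurement matrix W (one analyzer vector per row)
    mu = np.linalg.svd(W, compute_uv=False)
    cn = mu.max() / mu.min()       # condition number
    ewv = np.sum(1.0 / mu**2)      # equally weighted variance
    rad = np.prod(1.0 / mu)        # reciprocal absolute determinant
    return cn, ewv, rad

# Four analyzer vectors forming a regular tetrahedron on the
# Poincare sphere (an ideal single-band full-Stokes design)
a = 1 / np.sqrt(3)
W = 0.5 * np.array([[1.0,  a,  a,  a],
                    [1.0,  a, -a, -a],
                    [1.0, -a,  a, -a],
                    [1.0, -a, -a,  a]])
cn, ewv, rad = figures_of_merit(W)   # cn = sqrt(3) = 1.7321
```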

Fig. 5 Top: the histograms of the diattenuation of red (left), green (middle) and blue (right) pixels are shown. The dashed lines represent the average value. Bottom: the calibrated analyzer vectors of all red (left), green (middle), and blue (right) pixels of the two cameras are plotted on the Poincaré sphere. The black dots represent the analyzer vector of the ideal case.

Fig. 6 The misalignment between the micro-polarizer array and the color filter array causes the diattenuation to differ among the three color channels.

Table 1. Noise comparison between the actual camera and the perfect case

An estimation of the camera’s performance is made by illuminating the camera with specific polarization states. The setup is identical to the calibration setup illustrated in Fig. 13 of Appendix B. Polarization states with different degrees of circular polarization (DoCP) and degrees of linear polarization (DoLP) in the three color channels are generated by filtering white light with bandpass filters at 460nm, 540nm, and 610nm and sending the light through a fixed linear polarizer and a rotating achromatic quarter wave retarder. The FWHMs of the bandpass filters are approximately 10nm. The generated polarization states are characterized by both an Axometrics Mueller matrix polarimeter and our camera. The DoLP and DoCP measured by the Axometrics polarimeter are considered the measurement standard. The estimated accuracy of the Axometrics polarimeter is 3% and 1% for DoLP and DoCP measurements, respectively, based on measurements of air. The DoLP and DoCP measured by every pixel of our camera are averaged and compared with the Axometrics polarimeter measurements as an estimate of the camera's accuracy. The standard deviation is regarded as the precision of the camera. Figure 7(a) shows the average measurement of DoLP and DoCP performed by the camera with standard deviation error bars. The Axometrics polarimeter measurements are represented by the solid lines. Figure 7(b) shows the difference between the camera measurements and the Axometrics polarimeter measurements. The error of DoLP and DoCP is less than 6.5% and 7.6%, respectively. Figure 7(c) shows the standard deviation of the measurements of all pixels in our camera. It is less than 14% for DoLP and 10% for DoCP. Sources of error include inaccuracy from the interpolation of the Stokes vector, sensor noise, and mechanical precision in the control of the retarder angle, which is estimated to be ±0.25°.
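For reference, the DoLP and DoCP used throughout this comparison follow directly from the Stokes parameters; a minimal sketch with two made-up example states:

```python
import numpy as np

def dolp(s):
    # Degree of linear polarization: sqrt(S1^2 + S2^2) / S0
    return np.hypot(s[1], s[2]) / s[0]

def docp(s):
    # Degree of circular polarization: |S3| / S0
    return abs(s[3]) / s[0]

# Example states (unit intensity): vertically polarized light and
# fully circularly polarized light
s_linear = np.array([1.0, -1.0, 0.0, 0.0])
s_circular = np.array([1.0, 0.0, 0.0, 1.0])
```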

Fig. 7 The DoCP and DoLP are measured at different retarder orientations. (a) The solid lines represent the DoCP and DoLP characterized by an Axometrics polarimeter. The solid dots and error bars represent the average DoCP and DoLP measurements and the standard deviation of the camera, respectively. (b) The difference between the Axometrics polarimeter measurements and the camera average measurements is plotted as a function of retarder angle. (c) The standard deviation of the camera measurements is plotted as a function of retarder angle.

4. Results and conclusions

Four indoor scenes, four outdoor scenes, and a video (see Visualization 1) are taken by the RGB full-Stokes camera to demonstrate its performance. Figure 8 shows the Stokes images of a color target displayed by an iPhone and an Android phone. The iPhone display emits circularly polarized light while the Android phone emits linearly polarized light. Figure 9 shows the Stokes image of an outdoor building. The window of the building has large linear polarization due to Fresnel reflection. These polarization features can be seen in the Stokes images taken by our camera. The other three indoor scenes, three outdoor scenes, and the video are included in Appendix D. The three Stokes parameters S1, S2, and S3 can have negative values and cannot be displayed like the S0 image, in which black represents zero and higher brightness means larger intensity. Instead, we use grey to represent zero, black to represent negative, and white to represent positive. In the DoLP and angle of linear polarization (AoLP) images, we use a color chart to display the linear polarization information, in which the hue represents the AoLP and the intensity represents the DoLP.
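The display conventions described above can be sketched as follows; the normalization and the exact AoLP-to-hue mapping here are our illustrative choices, not necessarily the ones used to render the figures.

```python
import colorsys
import numpy as np

def stokes_to_gray(s, s0_max):
    # Map a signed Stokes component to [0, 1]: grey = zero,
    # black = most negative, white = most positive
    return 0.5 + 0.5 * s / s0_max

def lin_pol_color(s0, s1, s2):
    # Hue encodes AoLP (0-180 deg wrapped onto the hue circle),
    # HSV value encodes DoLP, per the color-chart convention above
    aolp = 0.5 * np.arctan2(s2, s1)   # radians
    hue = (aolp % np.pi) / np.pi      # wrap to [0, 1)
    dolp = np.hypot(s1, s2) / s0
    return colorsys.hsv_to_rgb(hue, 1.0, dolp)
```

For example, fully horizontally polarized light (S1 = S0, S2 = 0) maps to a saturated hue at full intensity, while unpolarized light maps to black in the DoLP image.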

Fig. 8 The RGB full-Stokes image of phone displays, with an Android phone displaying a color target on the left, an iPhone displaying a color target in the middle, and a paper color target on the right.

Fig. 9 The RGB full-Stokes image of a building. The window is highlighted in the DoLP images.

In conclusion, we have constructed, calibrated, and tested a 2 mega-pixel RGB full-Stokes camera that can capture intensity, color, and polarization of an optical field in one shot. The camera is based on applying the division-of-amplitude technique to two division-of-focal-plane RGB linear Stokes cameras. The two RGB linear Stokes cameras are aligned by a translation and rotation stage to sub-pixel accuracy. The calibration of the camera involves the characterization of both color and polarization responses of each pixel, allowing the camera to reconstruct the color and polarization of the optical field at the same time. Bicubic-spline interpolation is used to calculate the RGB Stokes vector for each pixel from the raw measurements. Several indoor and outdoor scenes are captured to test the camera performance. Our camera can rapidly detect polarization signals of various natural scenes and can be used in many applications such as biomedical imaging and remote sensing.

Appendix A Alignment procedure of the cameras

The alignment of the RGB full-Stokes camera begins with fixing one color polarization camera with screws and epoxy and clamping the other color polarization camera to a precision translation and rotation stage. The stage controls all six degrees of freedom of the camera position: rotation about the x-, y-, and z-axes, and translation along the x-, y-, and z-axes. Here we define the optical axis as the z-axis and the x-axis as normal to the camera substrate. The setup is shown in Fig. 10.

Fig. 10 A photo shows the top view of the alignment experiment setup.

The first step is adjusting the rotation about the x- and y-axes, so that the sensor of the clamped camera is parallel to the image of the fixed camera’s sensor viewed through the beam splitter. This is achieved by illuminating the camera system with a collimated beam. The collimated beam is reflected by the two sensors, producing two collimated output beams. The output beams pass through a positive lens and are focused on a focal plane, as shown in Fig. 11. If the two sensors are parallel to each other, the two output beams are parallel and their focal points fall on the same position. In the experiment, the screws that control Camera 2’s rotation about the x- and y-axes are adjusted so that the two focal points overlap.

Fig. 11 A schematic shows the first step of the alignment process.

The next step is adjusting the remaining four degrees of freedom: rotation about the z-axis and translation along the x-, y-, and z-axes. In this step, the camera system is equipped with an imaging lens, and a target with four small points is captured by the camera. The two cameras produce two images of the target. The remaining four degrees of freedom are adjusted to bring the coordinates of the four points in the two images as close together as possible.

After all six degrees of freedom are optimized, the clamped camera is fixed with screws and epoxy. Once the epoxy is cured, the clamp and adjustment stage are removed. The alignment performance is tested by measuring a 1951 USAF resolution test chart and comparing the intensity profiles along different lines of the test chart image.

Appendix B Calibration procedure

As shown in Fig. 12, the Bayer filters have broad transmission spectra that overlap one another. Each color pixel is sensitive to red, green, and blue light, resulting in crosstalk between the color channels. In other words, a measurement taken in one color channel includes some response to light in the other two color channels. A calibration of the camera color response is required to obtain a good polarization measurement in each individual color channel. This calibration is accomplished by modeling the pixel's response such that every pixel of the camera has a polarization response for each of the red, green, and blue channels. The polarization response of a pixel can be described as an analyzer vector. The equation below describes the relationship between the intensity value I, the analyzer vector A, and the input Stokes vector S of a pixel.

Fig. 12 The spectral response of the camera color sensor and the transmission spectrum of the external color filter are shown.

$$I = A\,S,\qquad A=\begin{bmatrix}A_R & A_G & A_B\end{bmatrix}=\begin{bmatrix}a_{R0}&a_{R1}&a_{R2}&a_{R3}&a_{G0}&a_{G1}&a_{G2}&a_{G3}&a_{B0}&a_{B1}&a_{B2}&a_{B3}\end{bmatrix},$$
$$S=\begin{bmatrix}S_R & S_G & S_B\end{bmatrix}^{T}=\begin{bmatrix}S_{R0}&S_{R1}&S_{R2}&S_{R3}&S_{G0}&S_{G1}&S_{G2}&S_{G3}&S_{B0}&S_{B1}&S_{B2}&S_{B3}\end{bmatrix}^{T}.$$

Here $S_{Ri}$, $S_{Gi}$, and $S_{Bi}$ are the Stokes parameters of the input light in the red, green, and blue channels, where $i = 0,1,2,3$ denotes the four Stokes parameters, and $a_{Ri}$, $a_{Gi}$, and $a_{Bi}$ are the elements of the pixel's analyzer vector in the red, green, and blue channels.

The calibration of the RGB full-Stokes camera is done at the wavelengths 460nm (B), 540nm (G), and 610nm (R), which are the peaks of the spectral response of the color sensor. To reduce the calibration error from the dispersion of the achromatic QWP and cube beam splitter, we put an external color filter in front of the beam splitter to narrow the response region of the camera color sensor. The spectral response of the camera color sensor and the transmission of the external color filter are shown in Fig. 12. Figure 13 shows the calibration setup. White light from a tungsten lamp is filtered by a bandpass filter and collimated by a positive lens. The collimated light avoids oblique incidence on the camera sensor and increases the intensity of the light received by the sensor. The collimated light then passes through a fixed linear polarizer and a rotatable achromatic QWP and is received by the camera sensor. For each wavelength, the camera takes 18 intensity measurements, with the transmission axis of the linear polarizer vertical and the fast axis of the QWP at 0, 10, 20, …, 170 degrees. In total, there are 54 intensity measurements for the 54 input light states. The relationship between the intensities and the analyzer vector is given by

Fig. 13 The calibration setup of the RGB full-Stokes camera is shown.

$$I=\begin{bmatrix}I^{1}&\cdots&I^{54}\end{bmatrix},\qquad S=\begin{bmatrix}S_{R0}^{1}&\cdots&S_{R0}^{54}\\ \vdots&\ddots&\vdots\\ S_{B3}^{1}&\cdots&S_{B3}^{54}\end{bmatrix},\qquad I=A\,S.$$

Here $I^{n}$ and the $n$-th column of the matrix $S$ represent the intensity value and the input Stokes vector of the $n$-th measurement, respectively. The analyzer vector of one pixel can then be calculated as

$$A = I\,S^{+},$$
where $S^{+}$ represents the pseudoinverse of the matrix $S$. The analyzer vector of each pixel can be used in the Stokes vector reconstruction from the raw image taken by the camera. Figure 5 in Section 3 shows plots of the red, green, and blue components of the total analyzer vector.
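A small numerical sketch of this calibration step, using a made-up analyzer vector and random calibration states in place of the 54 measured polarizer/QWP states; it only illustrates that the pseudoinverse recovers A exactly when S has full row rank:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ground-truth analyzer vector of one pixel: 1 x 12
# (four Stokes components for each of the R, G, B channels)
A_true = rng.normal(size=(1, 12))

# 54 calibration states: each column is a 12-element input Stokes
# vector (random stand-ins for the measured states)
S = rng.normal(size=(12, 54))

# Intensities the pixel records for each state, I = A S
I = A_true @ S

# Recover the analyzer vector via the pseudoinverse, A = I S+
A_est = I @ np.linalg.pinv(S)
```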

Appendix C Demosaicing algorithm, color and Stokes vector reconstruction

The color and Stokes vector reconstruction can be divided into two steps. In the first step, missing intensity measurements are recovered at each pixel by bicubic-spline interpolation. The RGB full-Stokes camera comprises two color polarization cameras, each with a micro-polarizer array applied on top of the Bayer filter array. The micro-polarizer array contains polarizers at 0, 45, 90, and 135 degrees, and the Bayer filter array contains one red filter, one blue filter, and two green filters. Thus, there is a total of 2 × 4 × 4 = 32 intensity measurements performed by the two cameras for each set of 16 pixels. However, only one intensity measurement is taken at each pixel, which is not enough to solve for the Stokes vector in the three color channels. Therefore, for each set of pixels that performs one kind of measurement, we recover the measurements for the remaining pixels using bicubic-spline interpolation. The analyzer vector of each measurement is also interpolated element by element. Figure 14 shows the interpolation methods for red, green, and blue pixels. For red and blue pixels, the bicubic-spline interpolation is done along the horizontal and vertical directions. For green pixels, the two green pixels inside a Bayer unit cell are regarded as two independent measurements and are interpolated separately. After the interpolation, we have 32 analyzer vectors and 32 intensity measurements at each pixel. Better interpolation techniques can be applied to reduce the sampling error [22–24].
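The first step can be sketched as a separable fill of the sparse lattice occupied by one measurement type. For brevity this sketch uses linear interpolation (np.interp) in place of the paper's bicubic spline, and a hypothetical 9-by-9 patch with samples every fourth pixel:

```python
import numpy as np

def fill_grid(raw, mask):
    # Two-pass separable fill: interpolate along rows that contain
    # samples, then along columns to fill the remaining rows.
    # Linear interpolation stands in for the bicubic spline.
    out = np.full(raw.shape, np.nan)
    rows = np.arange(raw.shape[0])
    cols = np.arange(raw.shape[1])
    for r in rows[mask.any(axis=1)]:
        out[r] = np.interp(cols, cols[mask[r]], raw[r, mask[r]])
    for c in cols:
        known = ~np.isnan(out[:, c])
        out[:, c] = np.interp(rows, rows[known], out[known, c])
    return out

# Synthetic check: sample a smooth ramp on the sparse lattice that
# one (hypothetical) measurement type occupies, then fill the gaps
r, c = np.mgrid[0:9, 0:9]
truth = (r + c).astype(float)
mask = np.zeros((9, 9), dtype=bool)
mask[::4, ::4] = True
raw = np.where(mask, truth, 0.0)
filled = fill_grid(raw, mask)
```

Because the test pattern is linear, the filled grid reproduces it exactly; on real scenes the interpolation error grows with image detail, which is why higher-order or gradient-based schemes [22–24] perform better.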

Fig. 14 A schematic of the interpolation of the red, green, and blue pixels is shown. The dashed lines represent the direction of interpolation. Note that the green pixels are divided into two independent sets, which are interpolated separately.

In the second step, the Stokes vector is calculated at each pixel. The 32 analyzer vectors are combined to form a measurement matrix W. The relationship between W and intensity measurements can be described as

$$I=\begin{bmatrix}I^{1}\\ \vdots\\ I^{32}\end{bmatrix},\qquad W=\begin{bmatrix}a_{R0}^{1}&\cdots&a_{B3}^{1}\\ \vdots&\ddots&\vdots\\ a_{R0}^{32}&\cdots&a_{B3}^{32}\end{bmatrix},\qquad S=\begin{bmatrix}S_{R0}\\ \vdots\\ S_{B3}\end{bmatrix},\qquad I=W\,S.$$
Here $S$ is the input Stokes vector and can be solved for as
$$S=W^{+}I,$$
where $W^{+}$ is the pseudoinverse of the measurement matrix.
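The second step is a linear least-squares solve per pixel. The sketch below uses a random stand-in for the 32-by-12 measurement matrix W rather than the calibrated analyzer vectors:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 32 x 12 measurement matrix: one interpolated analyzer
# vector per row, standing in for the camera's 32 measurements per pixel
W = rng.normal(size=(32, 12))

# Ground-truth 12-element RGB Stokes vector (S_R, S_G, S_B stacked)
S_true = rng.normal(size=12)

# The 32 intensities the pixel would record, I = W S
I = W @ S_true

# Least-squares reconstruction, S = W+ I
S_est = np.linalg.pinv(W) @ I
```

With 32 measurements for 12 unknowns the system is overdetermined, so the pseudoinverse also averages down measurement noise, which is the behavior the EWV and RAD figures of merit quantify.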

Appendix D RGB full-Stokes images

Four indoor and four outdoor scenes are taken by the RGB full-Stokes camera. Besides the phone display scene and the window scene in the main article, the other three indoor and three outdoor scenes are a polarization wheel, beetles, toys, window stress birefringence, sky, and cars. The RGB full-Stokes images of these scenes are listed below.

Figure 15 shows the RGB full-Stokes image of a polarization wheel. The polarization wheel is covered with linear polarizers in the outer circle and circular polarizers in the inner circle. In the image, we see that the outer circle has large linear polarization while the inner circle has large circular polarization.

Fig. 15 The RGB full-Stokes image of a chopper wheel with circular polarizer placed at the inner wheel and linear polarizer placed at the outer wheel.

Figure 16 shows the RGB full-Stokes image of three beetles. The exoskeleton of the beetle is known to have circular polarization due to the chirality of the microstructure. In the image, we see both significant circular polarization and linear polarization. The linear polarization is due to the Fresnel reflection of the illuminating light.

Fig. 16 The RGB full-Stokes image of three beetles shows the exoskeletons acting like a reflective polarizer.

Figure 17 shows the RGB full-Stokes image of a set of toys. In the image we see some linear polarization due to the Fresnel reflection of the illuminating light.

Fig. 17 The RGB full-Stokes image of several toys is shown.

Figure 18 shows the RGB full-Stokes image of stress birefringence. The image is taken by capturing the sky through a window. The sky is a linearly polarized source, while the window serves as a retarder due to its stress birefringence. When linearly polarized light passes through a retarder, it is converted to elliptical polarization, which has a circular polarization component. In the image, we see significant variation of the DoCP, which indicates the stress distribution inside the window.

Fig. 18 The RGB full-Stokes image of sky taken through a window shows the stress birefringence of the glass in the window.

Figure 19 shows the RGB full-Stokes image of the sky, clouds, and a mountain. In the image, we see that the sky has linear polarization while the clouds and mountain do not.

Fig. 19 The RGB full-Stokes image of the sky, clouds, and mountain is shown.

Figure 20 shows the RGB full-Stokes image of cars in assorted colors. The cars have large linear polarization due to Fresnel reflection from their smooth surfaces. The darker the car, the less scattered light and the larger the fraction of specular reflection. In the image, we see that the black car has the largest DoLP while the white car has the smallest.

Fig. 20 The RGB full-Stokes image of cars in assorted colors is shown.

In order to demonstrate the continuous measurement capability of the camera, a video of a rotating polarization wheel is taken and included in the supplementary material (see Visualization 1). The acquisition of the video requires the synchronization of the two cameras; the Stokes vectors at each color are reconstructed frame by frame. The two cameras are synchronized using the MATLAB package “Image Acquisition Toolbox Support Package for GigE Vision Hardware” to control the cameras. The two cameras are connected to one computer via Gigabit Ethernet cables and a network switch. Multiple camera parameters, such as trigger delay and packet delay, are adjusted to achieve continuous and synchronous frames.

Funding

National Science Foundation (NSF) (1455630 and 1607385).

References and links

1. F. Snik, J. Craven-Jones, M. Escuti, S. Fineschi, D. Harrington, A. D. Martino, D. Mawet, J. Riedi, and J. S. Tyo, “An overview of polarimetric sensing techniques and technology with applications to different research fields,” Proc. SPIE 9099, 90990B (2014).

2. J. S. Tyo, D. L. Goldstein, D. B. Chenault, and J. A. Shaw, “Review of passive imaging polarimetry for remote sensing applications,” Appl. Opt. 45(22), 5453–5469 (2006).

3. W. L. Hsu, S. Johnson, and S. Pau, “Multiplex localization imaging and sub-diffraction limited measurement,” J. Mod. Opt. 60(5), 414–421 (2013).

4. K. M. Twietmeyer, R. A. Chipman, A. E. Elsner, Y. Zhao, and D. VanNasdale, “Mueller matrix retinal imager with optimized polarization conditions,” Opt. Express 16(26), 21339–21354 (2008).

5. M. Zhang, X. Wu, N. Cui, N. Engheta, and J. Van der Spiegel, “Bioinspired Focal-Plane Polarization Image Sensor Design: From Application to Implementation,” Proc. IEEE 102(10), 1435–1449 (2014).

6. K. Creath and G. Goldstein, “Dynamic quantitative phase imaging for biological objects using a pixelated phase mask,” Biomed. Opt. Express 3(11), 2866–2880 (2012).

7. M. Novak, J. Millerd, N. Brock, M. North-Morris, J. Hayes, and J. Wyant, “Analysis of a micropolarizer array-based simultaneous phase-shifting interferometer,” Appl. Opt. 44(32), 6861–6868 (2005).

8. G. Myhre, W. L. Hsu, A. Peinado, C. LaCasse, N. Brock, R. A. Chipman, and S. Pau, “Liquid crystal polymer full-Stokes division of focal plane polarimeter,” Opt. Express 20(25), 27393–27409 (2012).

9. W. L. Hsu, G. Myhre, K. Balakrishnan, N. Brock, M. Ibn-Elhaj, and S. Pau, “Full-Stokes imaging polarimeter using an array of elliptical polarizer,” Opt. Express 22(3), 3063–3074 (2014).

10. W. L. Hsu, J. Davis, K. Balakrishnan, M. Ibn-Elhaj, S. Kroto, N. Brock, and S. Pau, “Polarization microscope using a near infrared full-Stokes imaging polarimeter,” Opt. Express 23(4), 4357–4368 (2015).

11. X. Zhao, A. Bermak, F. Boussaid, and V. G. Chigrinov, “Liquid-crystal micropolarimeter array for full Stokes polarization imaging in visible spectrum,” Opt. Express 18(17), 17776–17787 (2010).

12. M. Garcia, S. Gao, C. Edmiston, T. York, and V. Gruev, “A 1300 × 800, 700 mW, 30 fps spectral polarization imager,” in 2015 IEEE International Symposium on Circuits and Systems (ISCAS) (IEEE, 2015), pp. 1106–1109.

13. M. Garcia, C. Edmiston, R. Marinov, A. Vail, and V. Gruev, “Bio-inspired color-polarization imager for real-time in situ imaging,” Optica 4(10), 1263–1271 (2017).

14. X. Li, B. Gunturk, and L. Zhang, “Image demosaicing: a systematic survey,” Proc. SPIE 6822, 68221J (2008).

15. S. Gao and V. Gruev, “Bilinear and bicubic interpolation methods for division of focal plane polarimeters,” Opt. Express 19(27), 26161–26173 (2011).

16. S. Gao and V. Gruev, “Gradient-based interpolation method for division-of-focal-plane polarimeters,” Opt. Express 21(1), 1137–1151 (2013).

17. X. Tu and S. Pau, “Optimized design of N optical filters for color and polarization imaging,” Opt. Express 24(3), 3011–3024 (2016).

18. W. T. Welford, Aberrations of Optical Systems (Adam Hilger, 1986).

19. S. Zhu, A. W. Yu, D. Hawley, and R. Roy, “Frustrated total internal reflection: A demonstration and review,” Am. J. Phys. 54(7), 601–607 (1986).

20. R. Chipman, “Polarimetry,” in OSA Handbook of Optics (McGraw-Hill, 2010).

21. D. S. Sabatke, A. M. Locke, M. R. Descour, W. C. Sweatt, J. P. Garcia, E. L. Dereniak, S. A. Kemme, and G. S. Phipps, “Figures of merit for complete Stokes polarimeter optimization,” Proc. SPIE 4133, 75–81 (2000).

22. B. M. Ratliff, C. F. LaCasse, and J. S. Tyo, “Interpolation strategies for reducing IFOV artifacts in microgrid polarimeter imagery,” Opt. Express 17(11), 9112–9125 (2009).

23. I. J. Vaughn, A. S. Alenin, and J. S. Tyo, “Focal plane filter array engineering I: rectangular lattices,” Opt. Express 25(10), 11954–11968 (2017).

24. J. S. Tyo, C. F. LaCasse, and B. M. Ratliff, “Total elimination of sampling errors in polarization imagery obtained with integrated microgrid polarimeters,” Opt. Lett. 34(20), 3187–3189 (2009).

Supplementary Material (1)

Name: Visualization 1
Description: RGB full-Stokes video of a rotating polarization wheel taken by the RGB full-Stokes camera. Videos of the four Stokes parameters and of the DoLP and DoCP at each color channel are included. The polarization wheel is covered with linear polarizers in the outer c…

Figures (20)

Fig. 1: The sensor array and Bayer color filter array of the color polarization camera have a 4.5 µm pitch. The linear micro-polarizer array has a 9 µm pitch.
Fig. 2: Schematic of the RGB full-Stokes camera. An external color filter is added to narrow the transmission band of each of the RGB colors.
Fig. 3: The measured retardance curve of the achromatic QWP and the transmittance of the cube beamsplitter.
Fig. 4: Alignment result using a USAF resolution test chart taken by the two cameras is shown on the left. Plots of the intensity profile along different lines (Line 1 to Line 4) of the captured images are shown on the right.
Fig. 5: Top: histograms of the diattenuation of red (left), green (middle), and blue (right) pixels. The dashed lines represent the average value. Bottom: the calibrated analyzer vectors of all red (left), green (middle), and blue (right) pixels of the two cameras, plotted on the Poincaré sphere. The black dots represent the analyzer vectors of the ideal case.
Fig. 6: The misalignment between the micro-polarizer array and the color filter array causes the diattenuation to differ among the three color channels.
Fig. 7: The DoCP and DoLP measured at different retarder orientations. (a) The solid lines represent the DoCP and DoLP characterized by an Axometrics polarimeter. The solid dots and error bars represent the camera's average DoCP and DoLP measurements and their standard deviation, respectively. (b) The difference between the Axometrics polarimeter measurements and the camera's average measurements, plotted as a function of retarder angle. (c) The standard deviation of the camera measurements, plotted as a function of retarder angle.
Fig. 8: The RGB full-Stokes image of phone displays, with an Android phone displaying a color target on the left, an iPhone displaying a color target in the middle, and a paper color target on the right.
Fig. 9: The RGB full-Stokes image of a building with windows. The windows are highlighted in the DoLP images.
Fig. 10: A photo of the top view of the alignment experiment setup.
Fig. 11: A schematic of the first step of the alignment process.
Fig. 12: The spectral response of the camera's color sensor and the transmission spectrum of the external color filter.
Fig. 13: The calibration setup of the RGB full-Stokes camera.
Fig. 14: A schematic of the interpolation of the red, green, and blue pixels. The dashed lines represent the direction of interpolation. Note that the green pixels are divided into two independent sets that are interpolated separately.
Fig. 15: The RGB full-Stokes image of a chopper wheel with a circular polarizer placed at the inner wheel and a linear polarizer placed at the outer wheel.
Fig. 16: The RGB full-Stokes image of three beetles shows the exoskeletons acting like reflective polarizers.
Fig. 17: The RGB full-Stokes image of several toys.
Fig. 18: The RGB full-Stokes image of the sky taken through a window shows the stress birefringence of the glass in the window.
Fig. 19: The RGB full-Stokes image of the sky, clouds, and a mountain.
Fig. 20: The RGB full-Stokes image of cars in assorted colors.

Tables (1)

Table 1: Noise comparison between the actual camera and the perfect case

Equations (7)


\[ \mathrm{MSE} = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \bigl( I_1(i,j) - I_2(i,j) \bigr)^2. \tag{1} \]
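The MSE above is the standard pixel-wise mean squared error between two M-by-N images. A minimal NumPy sketch (illustrative only; the function name and test arrays are our own):

```python
import numpy as np

def mse(img1, img2):
    """Mean squared error between two equal-sized images."""
    i1 = np.asarray(img1, dtype=float)
    i2 = np.asarray(img2, dtype=float)
    return np.mean((i1 - i2) ** 2)

# Two 2x2 test images differing by 1 at every pixel -> MSE of 1
a = np.zeros((2, 2))
b = np.ones((2, 2))
print(mse(a, b))  # 1.0
```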
\[ \mathrm{RAD} = \sum_{j=0}^{R-1} \frac{1}{\mu_j}, \qquad \mathrm{CN} = \frac{\mu_{\max}}{\mu_{\min}}, \qquad \mathrm{EWV} = \sum_{j=0}^{R-1} \frac{1}{\mu_j^2}, \tag{2} \]
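These figures of merit (RAD, CN, and EWV [21]) are computed from the singular values μ_j of the polarimeter's measurement matrix. A hedged NumPy sketch, assuming the singular values are obtained from an SVD of a caller-supplied measurement matrix W (the function name and the identity-matrix example are our own):

```python
import numpy as np

def polarimeter_merit(W):
    """RAD, CN, and EWV figures of merit from the singular values of W."""
    mu = np.linalg.svd(W, compute_uv=False)  # singular values, descending
    rad = np.sum(1.0 / mu)                   # reciprocal absolute determinant sum
    cn = mu.max() / mu.min()                 # condition number
    ewv = np.sum(1.0 / mu**2)                # equally weighted variance
    return rad, cn, ewv

# All singular values of the 4x4 identity are 1, so RAD = EWV = 4 and CN = 1
rad, cn, ewv = polarimeter_merit(np.eye(4))
```

A well-conditioned polarimeter design minimizes EWV (equivalently, avoids small singular values), since 1/μ_j² terms amplify measurement noise in the reconstructed Stokes vector.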
\[ I = AS, \qquad A = \begin{bmatrix} A_R & A_G & A_B \end{bmatrix} = \begin{bmatrix} a_{R0} & a_{R1} & a_{R2} & a_{R3} & a_{G0} & a_{G1} & a_{G2} & a_{G3} & a_{B0} & a_{B1} & a_{B2} & a_{B3} \end{bmatrix}, \]
\[ S = \begin{bmatrix} S_R & S_G & S_B \end{bmatrix}^T = \begin{bmatrix} S_{R0} & S_{R1} & S_{R2} & S_{R3} & S_{G0} & S_{G1} & S_{G2} & S_{G3} & S_{B0} & S_{B1} & S_{B2} & S_{B3} \end{bmatrix}^T \tag{3} \]
\[ \mathbf{I} = \begin{bmatrix} I^1 & \cdots & I^{54} \end{bmatrix}, \quad \mathbf{S} = \begin{bmatrix} S_{R0}^1 & \cdots & S_{R0}^{54} \\ \vdots & & \vdots \\ S_{B3}^1 & \cdots & S_{B3}^{54} \end{bmatrix}, \quad \mathbf{I} = A\mathbf{S} \tag{4} \]
\[ A = \mathbf{I}\,\mathbf{S}^{+} \tag{5} \]
\[ \mathbf{I} = \begin{bmatrix} I^1 \\ \vdots \\ I^{54} \end{bmatrix}, \quad W = \begin{bmatrix} a_{R0}^1 & \cdots & a_{B3}^1 \\ \vdots & & \vdots \\ a_{R0}^{54} & \cdots & a_{B3}^{54} \end{bmatrix}, \quad \mathbf{S} = \begin{bmatrix} S_{R0} \\ \vdots \\ S_{B3} \end{bmatrix}, \quad \mathbf{I} = W\mathbf{S} \tag{6} \]
\[ \mathbf{S} = W^{+}\mathbf{I} \tag{7} \]
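Both steps above are linear least-squares problems solved with the Moore–Penrose pseudoinverse: calibration recovers each pixel's analyzer vector A from known input Stokes states, and reconstruction recovers the Stokes vector S from calibrated analyzer vectors. A minimal NumPy sketch with synthetic data; the 12-element analyzer/Stokes dimension and the 54 measurements follow the equations above, but all numerical values are random stand-ins for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Calibration: A = I S+ ---
A_true = rng.standard_normal(12)        # unknown analyzer vector of one pixel
S_cal = rng.standard_normal((12, 54))   # known Stokes vectors of 54 input states
I_cal = A_true @ S_cal                  # measured intensities, I = A S

A_est = I_cal @ np.linalg.pinv(S_cal)   # pseudoinverse solves the least-squares fit
assert np.allclose(A_est, A_true)

# --- Reconstruction: S = W+ I ---
W = rng.standard_normal((54, 12))       # calibrated analyzer vectors, one row per pixel
S_true = rng.standard_normal(12)        # Stokes vector to recover
I_meas = W @ S_true                     # measured intensities, I = W S

S_est = np.linalg.pinv(W) @ I_meas
assert np.allclose(S_est, S_true)
```

The recovery is exact here because the synthetic data are noise-free and the matrices have full rank 12; with real, noisy measurements the pseudoinverse returns the least-squares estimate instead.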