Optica Publishing Group

Lensless sensor system using a reference structure

Open Access

Abstract

We describe a reference structure based sensor system for tracking the motion of an object. The reference structure is designed to implement a Hadamard transformation over a range of angular perspectives. We implemented a reference structure with an angular resolution of 5° and a field of view of 40°.

©2003 Optical Society of America

1. Introduction

With the emergence of ubiquitous embedded digital processing and inexpensive electronic focal planes, imaging and sensor systems that integrate optical and electronic processing are increasingly attractive. Examples include wave-front coding [1], spectral tomography [2], coded aperture [3, 4] and interferometric systems [5]. Potential advantages of computational systems include increased depth of field, multidimensional or multispectral imaging, improved data or computational efficiency and improved target recognition or tracking capabilities. These systems are regarded as “integrated computational imaging systems” because they use non-conventional optical elements to preprocess the field for digital analysis [6]. Preprocessing is typically implemented using 2D coded apertures or diffractive elements, although Barbastathis has used volume holographic elements [7].

This paper describes a sensor system with a new form of volume optical element, a tomographic reference structure. Rather than modulating the wavefront of the sensed field, the reference structure modulates the visibility of the source space. In designing a reference structure, one seeks both to make the source state computationally apparent and to make the computational load tractable. Although considerable analysis may be applied to reference structure design for many applications, our goal here is simply to demonstrate the feasibility of this approach and to explore differences between the classes of transformations implementable on 3D reference structures as opposed to 2D coded apertures or wavefronts.

We use a reference structure in particular to address simple motion tracking problems. Simple integrated computational sensor systems have long been used for application-specific purposes. For example, X-ray systems have relied on coded aperture and pinhole imaging because lenses are not available [8, 9, 10], and infrared systems have relied on moving reticles because of the high cost of focal planes [11, 12].

This paper describes a simple reference structure designed to create a compact low bandwidth device for tracking objects over a wide angular range and high depth of field. This application is particularly attractive for reference structure based imaging, as a wide angular field of view, high depth of field and source discrimination can simultaneously be achieved. One might expect such sensors to be useful in simple positioning, tracking and process control systems. These sensors may achieve spatial resolution and data loads that bridge the gap between video and motion detection systems.

As discussed in the patent literature, existing motion detection systems such as infrared motion sensors utilize Fresnel lens arrays [13, 14] and conical mirrors [15] to shape the sensor response for object discrimination. These systems achieve a wide angular field of view, but have poor resolution. The advantage of our system is that any desired angular field of view and angular resolution can be achieved by implementing a suitable reference structure. Reference structures also allow complex spatial response patterns to be coded in motion sensors, which is particularly useful in sensor array applications.

We describe the design and implementation of this sensor on a linear array in the second section of the paper and describe the performance of a prototype system in the third section. While the performance demonstrates reasonable angular resolution over an impressive depth of field, the most significant aspect of this paper may be its contribution to discussions regarding the utility of volume optical elements in computational sensor systems.

Fig. 1. Schematic of the reference structure based imaging system

2. Reference structure design

A passive imaging system is a linear mapping from a source space onto a measurement space. In most planar modulation imaging systems, such as focal, pinhole, coded aperture and reticle systems, this mapping is shift invariant, described in the two-dimensional case by the equation

M(x, y) = T(x, y) ⋆ S(x, y)    (1)

where M(x, y) is the spatial distribution of the measurements, S(x, y) is the source distribution and the symbol ⋆ denotes the correlation operation. T(x, y) is the impulse response function of either the pinhole or the coded aperture mask, which is usually the aperture transmission function.

The correlation expressed in Eq. (1) may also be expressed in terms of a system of linear equations as

m = Ts    (2)

where the vectors m and s are obtained by lexicographic reordering of the matrices M and S, obtained by discretely sampling the corresponding distributions. For systems implementing a shift-invariant mapping, T has the form of a Toeplitz matrix [16].
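A minimal numpy sketch of this shift-invariant case, using a hypothetical 3-element aperture code (not the paper's system): each row of T is a shifted copy of the code, giving a Toeplitz system matrix.

```python
import numpy as np

# Hypothetical 1-D aperture transmission code (illustrative only).
h = [1.0, 0.0, 1.0]
n = 8                 # number of source samples

# Build T with T[i, j] = h[j - i]: each row is a shifted copy of the code,
# so T is Toeplitz, and m = T s implements the correlation of Eq. (1).
T = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if 0 <= j - i < len(h):
            T[i, j] = h[j - i]

s = np.zeros(n)
s[3] = 1.0            # a point source in cell 3
m = T @ s             # forward model m = T s of Eq. (2)
print(m.tolist())     # [0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0]
```

A point source simply reproduces the (reversed, shifted) code in the measurements, as expected for a correlation.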

In contrast, Fourier [17] and Radon [5, 18] transform-based systems involve shift-variant transformations. It is not possible to implement a shift-variant transformation using thin or transverse optical elements; Fourier and Radon systems have instead used interferometers and camera arrays. A primary goal of this paper is to explore the implementation of shift-variant transformations using non-interferometric longitudinal modulations.

For simplicity, we limit our consideration to one-dimensional imaging systems. The source state is described by an angular radiance distribution s(θ). We use a reference structure to implement a mapping between the source space and the measurement space as illustrated in Fig. 1. Let the mapping implemented by the structure from the source space to the measurement space be T(r, θ). T(r, θ) is a binary-valued function which takes the value 1 or 0 depending on whether the source distribution along the θ direction is visible at the measurement point r. Thus, the measured value at r is given by

m(r) = ∫ T(r, θ) s(θ) dθ    (3)

If the source radiance is measured at discrete angles (determined by the angular resolution) with a linear detector array, then the continuous integral in Eq. (3) takes the discrete form

m_i = Σ_j T_i(θ_j) s(θ_j)    (4)

where m_i is the measurement at the i-th detector.

Thus, in matrix form, the reference structure based imaging can also be described using Eq. (2), where T represents the transformation between source space and the measurement space as implemented by the reference structure. In principle, a reference structure may implement any transformation between the 1D source radiance and the 1D sensor. Transformations between 2D or 3D spaces are constrained, however, by dimensional considerations. In selecting the transformation between the source space and the measurement vector, one seeks to make T well-conditioned for inversion and to achieve targeted angular resolution, field of view and light efficiency.

Consider a reference structure designed to achieve an angular field of view of θ with an angular resolution of Δθ. The number of source elements that can be resolved using these parameters is given by

N = θ/Δθ    (5)

If the source plane is at a distance D from the detector plane, then these source elements can be resolved with a source resolution of DΔθ. We assume that Δθ is small, so that tan Δθ ≈ Δθ in radians.
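As a worked check of Eq. (5) and the small-angle resolution, using the prototype parameters (40° field of view, 5° resolution) and an assumed source distance of D = 3 m:

```python
import math

theta_fov = 40.0   # degrees, angular field of view
dtheta = 5.0       # degrees, angular resolution
N = theta_fov / dtheta            # Eq. (5): number of resolvable elements

D = 3.0                           # metres, assumed source distance
cell = D * math.radians(dtheta)   # source resolution D*dtheta (tan dθ ≈ dθ)
print(int(N), round(cell, 2))     # 8 cells of about 0.26 m each
```

This matches the 8 source elements of roughly 26 cm quoted for the 3 m experiments in Section 3.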

The reference structure maps the N source elements onto N detector elements: each detector element is connected to a source element by a light pipe. Each pipe has a rectangular cross-section whose dimensions match the detector element in order to ensure maximum signal throughput, and restricts the field of view of its detector to a narrow angle (given by the angular selectivity of the pipe) along a certain direction.

A detector element sees a source element through the pipe pointing towards that source element. For simplicity of design, we assume that the source plane is sufficiently far from the detector that different detector elements see the same source element through pipes oriented at the same angle. This requires that the angle subtended by the detector array at the center of the source plane be less than the angular selectivity of each pipe, i.e.,

d/D < 2l/L    (6)

where d is the length of the sensor array, D is the distance between the source and sensor plane, l is the width of the pipe and L is the length of the pipe.
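Plugging in the prototype dimensions quoted later in the paper (array length d = 25.2 mm, pipe width l = 1.4 mm, pipe length L = 36 mm), Eq. (6) gives the minimum source distance for which the far-field assumption holds:

```python
d = 25.2    # mm, sensor array length
l = 1.4     # mm, pipe width
L = 36.0    # mm, pipe length

# Rearranging Eq. (6), d/D < 2l/L, for the minimum source distance D.
D_min = d * L / (2 * l)
print(D_min, "mm")   # 324.0 mm
```

So any source beyond roughly a third of a metre satisfies the assumption; the 3 m experiments of Section 3 easily qualify.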

The reference structure is a collection of light pipes connecting the source and detector elements according to some chosen transformation matrix T. If element T_ij of the matrix T is 1, then a pipe is present which connects the j-th source element to the i-th detector element. T is chosen to be a well-conditioned, invertible matrix. Another consideration in choosing T is that the corresponding structure should be light efficient.

We designed a reference structure for a sensor system with an angular field of view of 40° and an angular resolution of 5°. The number of source and detector elements N was 8. We chose the transformation matrix T, which determines the connectivity pattern, to be an 8×8 Hadamard matrix with the ’−1’s replaced by ’0’s. A Hadamard-based transformation was chosen because such matrices are known to be well conditioned [19]. Since at least half the elements of each row are 1’s, each detector monitors at least half of the source elements, making efficient use of the available source radiation. The transformation matrix is given by

Fig. 2. Connectivity pattern of the reference structure: the lower array of dots represents the 8 detector elements; each line represents a pipe enabling a detector to see along a particular source angle; the upper array of dots represents the exit faces of the pipes looking towards the source space. Note that even though there are 11 exit points, the number of source angles monitored is only 8.

T = ⎡ 0 0 0 0 1 1 1 1 ⎤
    ⎢ 0 0 1 1 1 1 0 0 ⎥
    ⎢ 0 1 0 1 1 0 1 0 ⎥
    ⎢ 0 1 1 0 1 0 0 1 ⎥    (7)
    ⎢ 1 0 0 1 1 0 0 1 ⎥
    ⎢ 1 1 1 1 1 1 1 1 ⎥
    ⎢ 1 0 1 0 1 0 1 0 ⎥
    ⎣ 1 1 0 0 1 1 0 0 ⎦

Note that some of the rows were interchanged purely for design and fabrication reasons.
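The construction described above can be sketched as follows: build the 8×8 Hadamard matrix by the Sylvester recursion and map −1 → 0 to obtain a binary connectivity matrix. (The printed matrix of Eq. (7) additionally reorders rows for fabrication reasons, so it is not entry-for-entry identical to this canonical version.)

```python
import numpy as np

# Sylvester construction: H_{2n} = [[H_n, H_n], [H_n, -H_n]].
H = np.array([[1]])
for _ in range(3):                 # 1x1 -> 2x2 -> 4x4 -> 8x8
    H = np.block([[H, H], [H, -H]])

T_bin = (H + 1) // 2               # replace the -1 entries with 0

# Each row has at least half of its entries equal to 1, so each detector
# monitors at least half of the source elements.
row_sums = T_bin.sum(axis=1)
print(row_sums.tolist())                  # [8, 4, 4, 4, 4, 4, 4, 4]
print(int(np.linalg.matrix_rank(T_bin)))  # 8: the binary matrix is invertible
```

Invertibility of the binarized matrix is what makes the digital reconstruction of Section 3 a simple linear solve.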

The connectivity pattern between the 8 sensor points and the 8 source angles is shown in Fig. 2. The 8 angles of the pipes were −20°, −15°, −10°, −5°, 5°, 10°, 15°, 20°. The connectivity pattern is obtained from the transformation matrix of Eq. (7). For example, the angles seen by detector 1 are determined by row 1 of the transformation matrix; the columns of the matrix represent each of the 8 source angles. Thus, detector 1 sees angles 5°, 10°, 15° and 20°, whereas detector 5 sees −20°, −5°, 5° and 20°.
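Reading the connectivity off Eq. (7) programmatically, with columns ordered by source angle:

```python
import numpy as np

# Transformation matrix of Eq. (7).
T = np.array([
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0, 0],
])
angles = [-20, -15, -10, -5, 5, 10, 15, 20]  # degrees, one per column

def seen_by(detector):
    """Source angles visible to a detector (1-indexed, as in the text)."""
    return [a for a, t in zip(angles, T[detector - 1]) if t]

print(seen_by(1))  # [5, 10, 15, 20]
print(seen_by(5))  # [-20, -5, 5, 20]
```

The two printed lists reproduce the detector 1 and detector 5 examples given in the text.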

Using the connectivity pattern described in Fig. 2, we fabricated a reference structure using a stereolithography process [20]. The length L and width l of the pipes were 36 mm and 1.4 mm, respectively. The fabricated structure, with the sensors on one end, is shown in Fig. 3.

The sensor array is a 16-element photovoltaic silicon photodiode array [21], of which only 8 elements are used. The photodiode array is bonded to one end of the reference structure. The dimensions of each pipe are designed to match the pixel dimensions of the sensor array to ensure maximum light efficiency. The dimensions of each detector element were 1.22 mm × 1.84 mm, and the length of the detector array was 25.20 mm.

Fig. 3. Fabricated reference structure with a linear photodiode array attached to one of its ends

We used this reference structure to track the motion of an object in its field of view. The optical field propagating from the source space is modulated by the reference structure before being detected by the detector array. The analog output of the detector array is a voltage proportional to the intensity falling on it. Source motion causes a change in intensity at the detector, which is sensed by amplifying the AC component of the detector signal. The amplified signal is then digitized by an analog-to-digital converter, and the measured data is digitally transformed to reconstruct the source state. Thus each detector effectively senses the change in intensity falling on it, and the reconstruction corresponds to the change in intensity in the source space. If the motion of an object causes the change in intensity in the source space (with no significant change in background), the reconstruction of the source state indicates the object motion.
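A toy simulation of this sensing chain, under the idealized assumption of noiseless detection: a change ds in the source state produces detector changes dm = T ds, and the digital reconstruction inverts T.

```python
import numpy as np

# Connectivity matrix of Eq. (7).
T = np.array([
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0, 0],
], dtype=float)

ds = np.zeros(8)
ds[2] = 1.0                      # object enters the third space cell
dm = T @ ds                      # AC-coupled detector outputs
ds_rec = np.linalg.solve(T, dm)  # digital reconstruction of the source state
print(np.round(ds_rec, 6))
```

Because T is invertible, the reconstruction recovers the change in each source cell exactly in this noiseless sketch.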

3. Experimental Results

In this section, we show experimental results obtained when the reference structure described above was used for motion tracking. Figures 4–6 describe the results obtained when a source (a fiber illuminator) moves at a distance of about 3 m in front of the structure. At 3 m, the source space has 8 source elements of length 26 cm each, since the structure has an angular resolution of 5°. Figure 4 shows the analog output, sampled and digitized by a 12-bit ADC, of sensors 1 and 6 as the source moves through each of the 8 space cells. The 1st sensor sees the last four space cells and the 6th sensor sees all 8 space cells, as shown in the connectivity pattern of the structure. When the source moves into a space cell connected to a sensor, the intensity falling on the sensor increases, resulting in a positive sensor output; when the source moves out of that space cell, the intensity decreases, resulting in a negative sensor output. Thus the 4 positive and negative peaks in the 1st sensor output represent the motion of the source across the last four space cells. Since the 6th sensor sees all the space cells, there are 8 peak combinations, as shown in Fig. 4.

Fig. 4. Output of sensors 1 and 6 when a light source (fiber lamp) moves in front of the structure at a distance of 3 m

Figure 5 shows the combined output of all 8 sensors represented as a matrix of dimensions 8×250. Each row consists of 250 data samples collected from one sensor at a sampling rate of 25 Hz. Note that the intensity level of each detector is different; this is due to the different gain of each amplifier. We correct for this by scaling each row of the transformation matrix accordingly. It can also be observed that the output of sensor 7 at 3 seconds is a spurious signal, probably due to a change in the background intensity. The source space is then reconstructed by multiplying this output with the inverse of the scaled transformation matrix. The reconstructed source space is shown in Fig. 6, which is also a matrix of dimensions 8×250. Here, each row represents the reconstructed source space cells along one of the 8 angles. It can be observed from the figure that the motion in each of the 8 monitored space cells occurs at successive time intervals, which gives information about the velocity of the source. Our source angles are all equally spaced at 5° except between −5° and 5°, where the difference is 10°; thus a longer duration is observed in the reconstruction between 5 and 6 seconds.
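The row-scaling correction can be sketched as follows, with hypothetical per-channel gains (the actual amplifier gains were not published): if each amplifier has gain g_i, the recorded data is m = diag(g) T s, and scaling the rows of T by the same gains lets the inversion absorb the mismatch.

```python
import numpy as np

# Connectivity matrix of Eq. (7).
T = np.array([
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0, 0],
], dtype=float)

rng = np.random.default_rng(0)
g = rng.uniform(0.5, 2.0, size=8)   # hypothetical per-channel amplifier gains
s = rng.uniform(0.0, 1.0, size=8)   # true source state
m = np.diag(g) @ T @ s              # what the ADC actually records

T_scaled = np.diag(g) @ T           # row-scaled transformation matrix
s_rec = np.linalg.solve(T_scaled, m)
print(np.allclose(s_rec, s))        # True
```

In practice the gains would be estimated from a calibration source rather than known in advance.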

The reference structure was then set up to monitor cars passing in front of it at a distance of about 15 m. The cars were moving at an average speed of 10 mph. Figure 7 shows the reconstructed source space obtained by following the above described procedure.

In order to demonstrate high depth of field imaging with reference structures, two fiber lamp sources were moved at distances of 2 m and 3 m from the structure, respectively, with a lateral separation of about 0.5 m between them. The reconstructed image is shown in Fig. 8. Each reconstructed source cell records the motion of both sources, demonstrating the high depth of field of the reference structure.

4. Conclusion

In conclusion, we have discussed a novel sensing technique using a reference structure. We designed a sensing system with an angular resolution of 5° and field of view of 40°. The reference structure was fabricated using a stereolithographic printer.

We used this sensor system to track the motion of an object and presented results from this experiment. The reference structure was designed to implement a Hadamard transformation matrix. Here we demonstrated a high depth of field 1D sensing system with a reference structure having pipes in one plane only. In general, the mapping implemented by a reference structure could be from a space of higher dimension to one of lower dimension; for example, a 3D volume could be mapped onto a set of 2D sensor planes. Research in this direction is in progress.

Fig. 5. Multiplexed output of the sensors when a fiber light source moves in front of the structure. Note that sensor 1 sees the last four angles, viz. 5°, 10°, 15° and 20°, as given by the transformation matrix of Eq. (7)

Fig. 6. Reconstruction of the motion of the fiber source

Fig. 7. Reconstructed source space of a car moving at 10 mph

Fig. 8. Reconstruction of the motion of two fiber lamps

5. Acknowledgments

The authors are grateful to Nikos Pitsianis, Mike Sullivan, Xiaobai Sun, Steve Feller and Evan Cull for useful discussions. This work was supported by the Defense Advanced Research Projects Agency through the grant DAAD 19-01-1-0641.

References and links

1. W. T. Cathey and E. R. Dowski, “New paradigm for imaging systems,” Appl. Opt. 41, 6080–6092 (2002).

2. B. Ford, M. R. Descour, and R. M. Lynch, “Large-image-format computed tomography imaging spectrometer for fluorescence microscopy,” Opt. Express 9, 444–453 (2001), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-9-9-444.

3. T. M. Cannon and E. E. Fenimore, “Coded aperture imaging: many holes make light work,” Opt. Eng. 19, 283–289 (1980).

4. G. K. Skinner, “Imaging with coded-aperture masks,” Nucl. Instrum. Methods Phys. Res. A 221, 33–40 (1984).

5. D. L. Marks, R. A. Stack, D. J. Brady, D. C. Munson, and R. B. Brady, “Visible cone-beam tomography with a lensless interferometric camera,” Science 284, 2164–2166 (1999).

6. D. J. Brady and Z. U. Rahman, “Integrated analysis and design of analog and digital processing in imaging systems: introduction to the feature issue,” Appl. Opt. 41, 6049 (2002).

7. G. Barbastathis and D. J. Brady, “Multidimensional tomographic imaging using volume holography,” Proc. IEEE 87, 2098–2120 (1999).

8. E. Caroli, J. B. Stephen, G. Di Cocco, L. Natalucci, and A. Spizzichino, “Coded aperture imaging in X- and gamma-ray astronomy,” Space Sci. Rev. 45, 349–403 (1987).

9. G. K. Skinner and T. J. Ponman, “Inverse problems in X-ray and gamma-ray astronomical imaging,” Inverse Probl. 11, 655–676 (1995).

10. R. H. Dicke, “Scatter-hole cameras for X-rays and gamma rays,” Astrophys. J. 153, L101–L106 (1968).

11. R. G. Driggers, C. E. Halford, and G. D. Boreman, “Parameters of spinning AM reticles,” Appl. Opt. 30, 2675–2684 (1991).

12. R. G. Driggers, C. E. Halford, G. D. Boreman, D. Lattman, and K. F. Williams, “Parameters of spinning FM reticles,” Appl. Opt. 30, 887–895 (1991).

13. J. R. Baldwin, “Composite Fresnel lens for use in passive infrared detection system,” Hubbell Inc., patent US5442178-A (1995).

14. J. R. Baldwin, “Composite Fresnel lens for passive infrared detection system,” Hubbell Inc., patent CA2222663-A (1998).

15. H. L. Berman, “Infrared intrusion detector system,” Hoermann Corp., patent US3703718-A (1982).

16. H. C. Andrews and B. R. Hunt, Digital Image Restoration (Prentice Hall, 1977).

17. K. Itoh and Y. Ohtsuka, “Fourier-transform spectral imaging: retrieval of source information from three-dimensional spatial coherence,” J. Opt. Soc. Am. A 3, 94–100 (1986).

18. D. L. Marks, R. Stack, A. J. Johnson, D. J. Brady, and D. C. Munson, “Cone-beam tomography with a digital camera,” Appl. Opt. 40, 1795–1805 (2001).

19. M. Harwit and N. J. A. Sloane, Hadamard Transform Optics (Academic Press, 1979).

20. P. F. Jacobs, Rapid Prototyping & Manufacturing: Fundamentals of Stereolithography (Society of Manufacturing Engineers, 1993).

21. Photonic Detectors Inc., http://www.photonicdetectors.com.
