Bio-inspired variable imaging system simplified to the essentials: modelling accommodation and gaze movement

Open Access

Abstract

A combination of an aspherical hybrid diffractive-refractive lens with a flexible fluidic membrane lens allows the implementation of a light-sensitive and wide-aperture optical system with variable focus. This approach is comparable to the vertebrate eye in air, in which the cornea offers a strong optical power and the flexible crystalline lens is used for accommodation. Also following the natural model of the human eye, the decay of image quality with increasing field position is compensated in the optical system presented here by successively addressing different tilting angles, which mimics saccadic eye movements. The optical design and the instrumental implementation are presented and discussed, and the working principle is demonstrated.

© 2015 Optical Society of America

1. Introduction

Biological systems often provide surprising and remarkable solutions for problems which can be solved in the technical world only with considerable effort and high complexity. In addition, nature’s solutions are typically characterized by a very well balanced adjustment between the offered vital functionality and a maximum of simplicity. In general, this means that natural systems are equipped only with the absolute necessities; additional dispensable characteristics or mere 'nice to have' features are not manifested, unless they are an “appendix” of the evolutionary process. An impressive example of this is the vertebrate eye, especially the human eye, which achieves a very well adapted visual perception with a very limited number of optical components and interfaces. In this case, the optimized balance between functionality and simplicity implies that the optical imaging properties of the eye are far from the perfection of a commercial photo camera system: high resolution is offered only at the fovea, a very small angular range of approximately 1° [1], and only for one specific object distance. From the center of the fovea towards the edge of the visual field, the resolution decays strongly. This is confirmed by the cone-receptor density, which averages ~2×10⁵ cones/mm² in the center of the fovea and declines by an order of magnitude to about ~2×10⁴ cones/mm² within a radius of 1 mm from the foveal center. At the edge of the visual field, the cone density is further reduced to ~5×10³ cones/mm² [1]. Correspondingly, the diameter of the cones grows with increasing distance from the foveal center. The field-dependent resolution correlates with a drastically reduced form perception for large visual angles; in this retinal region, motion perception dominates. Despite the objective fact of a strong resolution decay from foveal to peripheral vision, the subjective visual perception of an observer is not disrupted, and the appearance of the world does not seem blurred. This characteristic effect of an unblurred perceptive impression can be attributed to the processing of a variety of single scenes, each of which shows a different highly resolved area. The sequence of the processed scenes originates from saccades, rapid jerky eye movements that redirect the center of gaze from point to point [2].

Additionally, the eyes of mammals, birds, and reptiles can be adjusted to objects at different distances by deforming the curvature of the elastic crystalline lens [3, 4]. The tailored combination of a simple optical system, offering only a limited imaging performance, with variable or deformable and moveable optical components offers the possibility to capture a three-dimensional scene in highly resolved quality.

In this contribution we present and discuss a simple optical system which copies the above-mentioned basic characteristics of the natural model of the human eye. In particular, our model system compensates the decay of the field resolution by capturing a sequence of images. In each of these images, the highly resolved area is found at a different position of the observed scene. The different images are provided by a pivotal movement of a gimbal-mounted optical group, so that the saccadic movement of the human eye is copied. To adjust to objects at different distances, a focus-tunable optical lens was incorporated. Typically, these tunable lenses consist of a cavity, covered by flexible membrane windows, which is filled with an elastic and optically transparent material, e.g., an optical fluid [5, 6] or a cross-linked gel-like polymer [7]. The change of the lens’ radii of curvature, and hence an extension of the covering membrane, is induced by applying pressure to the filling material. Although significant progress has been made on the optical lens parameters, such as improving the wave-front accuracy [8], the currently available tunable lenses are still limited in diameter and tunable focus range. To mention a commercial example, the company “varioptic” offers liquid lenses with a maximum clear aperture of 3.9 mm (liquid lens “Arctic 39N0” [9]).

To address an adequate field size and a sufficient numerical aperture of the optical system, and also to provide enough brightness on the image-recording detector, the tunable lens was combined with a single additional fixed lens. In contrast, an imaging concept based on a single deformable lens alone would be limited in performance and, in particular, would offer an insufficiently low numerical aperture.

This compound system is comparable to the human eye, in which the overall optical power comprises the contributions of the fixed cornea and those of the flexible crystalline lens. Here, the main optical power is provided by the cornea at the transition from ambient air to the water-like biomaterial of the cornea and the anterior chamber of the eye, where the refractive index experiences its largest step. The tunable optical power of the crystalline lens is reinforced by the cornea and is used for depth adaptation or accommodation. The presented optical system offers advantages whenever high resolution is needed for a specific part of a scene and, simultaneously, an observing function has to be provided for its peripheral zone. Potential application areas may be visual in-line inspection systems for industrial production or optical security systems.

In the first part of this contribution we discuss general aspects of the development of optical imaging systems based on currently available tunable lenses and, in particular, we indicate the limitations of an optical system that results from the use of a single tunable lens as the only imaging element. Subsequently, we derive a minimal optical system which combines a tunable lens and a fixed lens and which achieves the target characteristics. Here, an aspherical hybrid diffractive-refractive lens was selected as the fixed lens. Using optical design methods, we discuss the optical characteristics of the system, such as the field dependency with respect to the orientation of the pivot axes, the focusing properties of the system, and its chromatic behavior. Finally, we present the experimental implementation of the optical system and demonstrate its functionality by capturing and analyzing a laterally extended scene as well as a depth-extended scene.

2. General aspects for imaging systems based on tunable lenses

To develop an appropriate optical system, a number of sometimes mutually contradictory characteristics have to be considered simultaneously. In particular, demanding specifications address the light sensitivity of the system, the field of view (FOV), and the optical resolution, and also include the number of necessary optical elements and the physical size of the system.

A decisive factor of a system’s concept concerns the selection of the image sensor. Here we have to select either a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge Coupled Device) sensor. As a rough approximation, a CCD can be assumed to have superior imaging quality and lower noise than a CMOS sensor, but it requires more power and has larger pixel sizes. Typically available image sensors currently range from quarter VGA (320 x 240 pixels) with 1.7 x 1.7 µm² pixel size and 3.2 x 2.4 mm² sensor dimensions, up to high-end sensors with 3264 x 2448 pixels, 14 x 14 µm² pixel size, and 12.8 x 9.3 mm² sensor area. Although more compact sensors and smaller pixel sizes would be advantageous, larger pixels are preferred for their increased light sensitivity; in general, the larger the pixel size, the more sensitive the camera. As a compromise, and due to availability, we used a CCD sensor with 1600 x 1200 pixels and a pixel size of 4.4 x 4.4 µm² (1/1.8” Sony ICX274AL).

The choice of the sensor size directly influences the required aperture size of the lens system. The aperture is the diameter of the opening which limits the light ray bundle entering and transmitting through the optical system. The larger the aperture, the more light can enter over a fixed period of time. For the detector, the irradiance is proportional to the area of the aperture and inversely proportional to the illuminated sensor area. From this follows the rule of thumb that the aperture diameter has to be in the range of the sensor size. In addition, this implies that the focal length of the optical system should also be in the range of the image size. Here a characteristic measure is given by the f-number of the system, which is the ratio between the lens' focal length and the diameter of the entrance pupil. Typically acceptable minimum f-numbers, e.g., for camera phones, are in the range of 2.25 to 2.8 [10–12]. In conclusion, this means that the three quantities, sensor size, aperture diameter, and focal length, have to be in the same range to guarantee sufficient light on the detector. These requirements also apply to optical systems possessing variable focus achieved by tunable liquid lenses. For example, Kuiper et al. [13] presented a lens system which has f/2.5, an entrance pupil diameter of 1.43 mm, and a variable focal length adjustable between 2.85 mm for objects at 2 cm and 3.55 mm for objects at infinity.
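As an illustration of these relations, the following minimal sketch (not part of the original work) computes the f-number from focal length and entrance pupil diameter, along with a simple aperture-area-to-sensor-area figure of merit for the rule of thumb; the function names and the use of the quoted values from [13] are purely for demonstration.

```python
import math

# Minimal sketch (illustrative only): f-number and a relative irradiance measure.
def f_number(focal_length_mm: float, entrance_pupil_mm: float) -> float:
    """f/# = focal length / entrance pupil diameter."""
    return focal_length_mm / entrance_pupil_mm

def relative_irradiance(aperture_d_mm: float, sensor_w_mm: float, sensor_h_mm: float) -> float:
    """Rule-of-thumb measure: aperture area divided by illuminated sensor area."""
    return (math.pi * (aperture_d_mm / 2.0) ** 2) / (sensor_w_mm * sensor_h_mm)

# Example with the values quoted for the lens of Kuiper et al. [13]:
# f = 3.55 mm (object at infinity) and an entrance pupil of 1.43 mm.
print(f_number(3.55, 1.43))  # ~2.5, matching the stated f/2.5
```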

In order to switch to larger sensor areas, e.g., to increase the light sensitivity of the system, the diameter of the entrance pupil of the lens system also has to be increased. The simplest possible implementation of a variable-focus system would be based on a single tunable lens. In this case, a tunable lens with a large diameter and an appropriate wave-front characteristic is necessary. Although manifold approaches exist to realize tunable lenses, it is hard to provide lenses with both properties. For example, fluid-based electrowetting tunable lenses, which use the meniscus formed between two immiscible liquids, may suffer from gravity effects when the lens diameter is increased [14].

One type of variable lens which offers appropriately large aperture diameters and also well-suited wave-front properties (wave-front error in the range of 0.1 λ - 0.2 λ at 633 nm) is the fluidic membrane lens [5, 6, 8]. The achievable lens diameters range up to 7 mm [15]. These variable lenses (see Fig. 1) are plano-convex lenses and consist of a transparent liquid-filled cavity with a hard, plane, inorganic glass substrate forming the rear, and a flexible, optically transparent silicone membrane on the front side. The radius of curvature of the silicone membrane can be adjusted by a variation of the liquid pressure inside the cavity.


Fig. 1 Schematics of a variable fluidic lens: The plano-convex lens consists of a transparent liquid-filled cavity with a plane inorganic glass substrate at the rear and a flexible, optically transparent silicone membrane at the front. The radius of curvature of the silicone membrane is adjusted by a variation of the liquid pressure inside the cavity, effected by piezo-actuators.


Our further considerations on optical design concepts, and also the following experimental implementation build upon this type of fluidic membrane lens.

Two forms of fluidic membrane lenses with slightly different geometric dimensions are common. The first lens, L1, with a clear aperture of 7 mm, allows a variation of the radius of curvature of the silicone membrane from infinity to 60 mm. The second lens, L2, offers a clear aperture of 4 mm and a variation in the radius of curvature between infinity and 30 mm. The basic structure of both lens types is identical and is depicted in Fig. 1. The only difference between the two lenses concerns the ring-shaped piezo-actuator, which has the same outer diameter in both cases but offers a larger inner diameter for L1 and a reduced inner diameter for L2. The difference in the width of the ring actuator correlates with its maximum possible deformation. In particular, a larger ring width, and therefore a smaller clear aperture, allows a stronger deformation of the piezo-element than a smaller ring width. Finally, the deformation range of the piezo-element is directly related to the achievable radius of curvature of the membrane lens, which means that there is a general trade-off between aperture size and tunable radius of curvature.

From these characteristics, and assuming an average refractive index of the filling liquid of n_li = 1.296 (Sigma-Aldrich, Fomblin Y), the minimum focal lengths f of both lenses can be calculated by the lens equation f = r/(n-1), where r is the radius of curvature of the convex surface. From the focal length of the lens and the aperture diameter D, its f-number f/# can be calculated as f/# = f/D. Furthermore, the f-number is related to the numerical aperture N.A., which, for large f-numbers, is expressed as N.A. = 1/(2 f/#). Finally, assuming a reference wavelength of λ = 550 nm, the diameter of the diffraction pattern (Airy disc) can also be calculated as d_Airy = 1.22 λ/N.A. Table 1 shows the calculated quantities for both lens forms L1 and L2:


Table 1. Exemplary geometrical and optical parameters common for fluidic membrane lenses

The calculations show that the minimum f-numbers of both lenses are rather large, which is associated with a very low irradiance on the sensor and thus dark images or comparatively long exposure times. Furthermore, the large size of the Airy discs exceeds the typical dimensions of a detector pixel by far. For completeness, it has to be mentioned that in the case of maximum radius of curvature (r = infinity) both lenses act as a plane-parallel plate with no optical power.
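The single-lens estimates behind Table 1 can be reproduced directly from the relations quoted above. The following sketch assumes the minimum radii of curvature stated in the text (60 mm for L1, 30 mm for L2), the clear apertures of 7 mm and 4 mm, n = 1.296, and λ = 550 nm; the variable names are illustrative.

```python
# Sketch (assumed inputs taken from the text): single-lens estimates for L1 and L2.
WAVELENGTH_MM = 550e-6  # 550 nm expressed in mm

def single_lens_metrics(r_mm: float, aperture_mm: float, n: float = 1.296):
    f = r_mm / (n - 1.0)                         # plano-convex lens: f = r / (n - 1)
    f_num = f / aperture_mm                      # f/# = f / D
    na = 1.0 / (2.0 * f_num)                     # approximation valid for large f-numbers
    d_airy_um = 1.22 * WAVELENGTH_MM / na * 1e3  # Airy-disc diameter in micrometers
    return f, f_num, na, d_airy_um

for name, r_mm, d_mm in [("L1", 60.0, 7.0), ("L2", 30.0, 4.0)]:
    f, f_num, na, d_airy = single_lens_metrics(r_mm, d_mm)
    print(f"{name}: f = {f:.1f} mm, f/# = {f_num:.1f}, N.A. = {na:.3f}, Airy disc = {d_airy:.0f} um")

# L1 gives f ~ 203 mm, f/# ~ 29, and an Airy disc of roughly 39 um,
# i.e., far larger than the 4.4 um sensor pixels discussed above.
```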

From this consideration it follows that, on the one hand, the fluidic membrane lenses offer beneficially large aperture sizes and appropriate wave-front characteristics but, on the other hand, a pure single-lens approach based on these lenses offers only an insufficient image quality, in particular a large diameter of the Airy disc (see Table 1). This means that an imaging system of reasonable optical quality based on the fluidic membrane lenses has to be built as a compound system which combines at least two optical elements.

3. Imitating accommodation: Optical system combining a fluidic membrane lens with an aspherical hybrid diffractive-refractive lens

Figure 2 shows the lens design of the combined system employing a variable fluidic membrane lens and a commercially available aspherical hybrid diffractive-refractive lens (Edmund Optics, hybrid asphere product no. 65-996).


Fig. 2 Lens design of a system combining a fixed, aspheric, hybrid diffractive-refractive lens with a flexible fluidic membrane lens (DOE = diffractive optical element). The compact approach offers a light sensitive and wide-aperture optical system with variable focus. Each of the three ray colours represents a different field angle of the observed scene.


The hybrid lens is characterized by an aspherical front surface and a spherical back side with radius of curvature of −48.3 mm. The sag z of the aspherical surface is given by the following expression (1):

z = \frac{c r^2}{1 + \sqrt{1 - (1+\kappa) c^2 r^2}} + A_4 r^4 + A_6 r^6 + A_8 r^8 + A_{10} r^{10}    (1)
with a conic constant κ = −0.6, aspheric coefficients A4 = −4.431×10⁻⁶, A6 = −1.499×10⁻⁶, A8 = −1.438×10⁻⁸, A10 = −1.342×10⁻⁹, a best-fit radius R = 5.379 mm, and c = 1/R. The aspherical front surface is superposed by a diffractive structure with the phase function φ (2):
\varphi = 29.345 \rho^2 - 0.2434 \rho^4    (2)
Here, ρ denotes the normalized radial aperture coordinate. The thickness of the lens is d = 5.6 mm, the lens diameter is 12 mm, and the lens material is a cyclo-olefin polymer with refractive index nd = 1.53116 and Abbe number νd = 56.04 (Zeonex E48R). The material and geometrical properties of the lens result in a focal length of fasph = 9 mm for the hybrid asphere.
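A quick numerical check of the surface description is straightforward; the following sketch simply evaluates Eq. (1) and Eq. (2) with the coefficients quoted above. The function names and the sample evaluation points are illustrative assumptions, not part of the original work.

```python
import math

# Sketch: evaluating the aspheric sag of Eq. (1) with the coefficients from the text.
KAPPA = -0.6
R_FIT = 5.379                 # best-fit radius in mm; curvature c = 1/R
C = 1.0 / R_FIT
ASPHERIC = {4: -4.431e-6, 6: -1.499e-6, 8: -1.438e-8, 10: -1.342e-9}

def sag(r_mm: float) -> float:
    """Sag z(r) of the aspherical front surface, Eq. (1), in mm."""
    conic = C * r_mm**2 / (1.0 + math.sqrt(1.0 - (1.0 + KAPPA) * C**2 * r_mm**2))
    return conic + sum(a * r_mm**k for k, a in ASPHERIC.items())

def doe_phase(rho: float) -> float:
    """Phase function of the diffractive structure, Eq. (2); rho is normalized (0..1)."""
    return 29.345 * rho**2 - 0.2434 * rho**4

print(sag(3.0))        # sag 3 mm off axis (illustrative sample point)
print(doe_phase(0.5))  # phase at half the normalized aperture
```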

The variable fluidic membrane lens is located at a distance of 1 mm behind the asphere and is oriented such that the silicone membrane faces the asphere. In the non-actuated state the fluidic membrane lens can be described as a stack of three plane-parallel layers. The basic substrate layer (thickness 550 µm) is made of an inorganic glass with nd = 1.5230 and an Abbe number of νd = 58.76 (Schott B270). The top covering membrane has a thickness of 200 µm and is made of a transparent polydimethylsiloxane (PDMS) (Momentive RTV 615) with a refractive index nd = 1.406 and an Abbe number νd = 60. A residual fluid layer (ne = 1.2967, νe = 98.3) (Sigma-Aldrich Fomblin Y) with a thickness of 650 µm is sandwiched between membrane and substrate. During actuation, the fluid forms a spherical cap confined by the covering membrane. For the optical design of the actuated state, it is assumed that the membrane does not change its thickness and that the residual fluid layer and the substrate remain unchanged.

In general, for a combination of two lenses with focal lengths f1 and f2, separated by the distance d, the focal length of the combination fcom follows from 1/fcom = 1/f1 + 1/f2 − d/(f1 f2). With respect to the preceding discussion of a minimum focal length of f2 = 202.7 mm for the variable lens, the focal length of the combined system varies according to 8.6 mm < fcom < 9 mm. With a fixed separation between the lens combination and the sensor surface, the variable focal length allows the optical system to be adjusted to object distances between 160 mm and infinity.
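The quoted tuning range can be checked with the two-lens formula above. The sketch below assumes f1 = 9 mm for the hybrid asphere, derives f2 from the membrane radius via f2 = r/(n−1) with n = 1.296, and uses the 1 mm air spacing as the lens separation, which is a simplification of the real principal-plane distance.

```python
# Sketch of the two-lens combination 1/f_com = 1/f1 + 1/f2 - d/(f1*f2).
# Assumptions: f1 = 9 mm (hybrid asphere), n = 1.296 for the lens liquid,
# and d = 1 mm (the quoted air spacing used here as the lens separation).
N_LIQUID = 1.296
F1_MM = 9.0
D_MM = 1.0

def combined_focal_length(f1_mm: float, f2_mm: float, d_mm: float) -> float:
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - d_mm / (f1_mm * f2_mm))

for r_mm in (60.0, 120.0, float("inf")):       # membrane radius of curvature
    f2_mm = r_mm / (N_LIQUID - 1.0)            # relaxed state: r = inf gives f2 = inf
    print(f"r = {r_mm} mm -> f_com = {combined_focal_length(F1_MM, f2_mm, D_MM):.2f} mm")

# r = 60 mm gives f_com ~ 8.66 mm; the relaxed lens gives f_com = 9 mm,
# consistent with the stated range 8.6 mm < f_com < 9 mm.
```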

In comparison to the imaging properties of the single variable lens, the f-number of the lens combination f/#comb is in the range of unity, which means that the system achieves high brightness on the detector. Additionally, the combined system offers a large numerical aperture of N.A.comb = 0.5, which at the reference wavelength of λ = 550 nm corresponds to an Airy-disc diameter of dAiry_comb = 1.34 µm, well below the pixel size of the sensor. In this context, it also has to be noted that for small f-numbers the depth of focus is very limited. This means that, in our setup, the variable lens becomes indispensable: without the opportunity to adjust the focal distance, even a minor amount of defocus would be associated with considerable blurring and would drastically reduce the image quality.
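To illustrate how limited the depth of focus becomes at small f-numbers, the following sketch uses the common diffraction-based estimate δz ≈ ±2 λ (f/#)², equivalent to ±λ/(2 N.A.²). This estimate is not taken from the paper, and the numerical f-numbers used below are only representative.

```python
# Hedged illustration (standard diffraction-based estimate, not from the paper):
# depth of focus  delta_z ~ +/- 2 * lambda * (f/#)^2.
WAVELENGTH_UM = 0.55  # reference wavelength in micrometers

def depth_of_focus_um(f_number: float, wavelength_um: float = WAVELENGTH_UM) -> float:
    return 2.0 * wavelength_um * f_number**2

print(depth_of_focus_um(1.3))   # ~1.9 um for a roughly f/1.3 combined system
print(depth_of_focus_um(29.0))  # ~925 um for the slow single membrane lens L1
```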

The fixed hybrid asphere combines both refractive and diffractive optical power. Due to the opposite dispersion behavior of refraction and diffraction, which is characterized by a negative Abbe number of the diffractive surface, the hybrid combination offers the advantage of achromatization in a single element. In comparison with the fixed asphere, the variable fluidic membrane lens, with its relatively large radius of curvature, contributes only a weak optical power to the combined system. Consequently, the influence of the variable lens on the wavelength dependency of the combined system is only of minor importance. In essence, the achromatic behavior of the system is mainly determined by the hybrid asphere.

The wavelength dependency of the focal position for the combined system is displayed in Fig. 3. The two curves in the diagram are related to an object distance of 160 mm (red curve) and an infinitely distant object (blue curve).


Fig. 3 Secondary spectrum of the system combining an aspherical hybrid diffractive-refractive lens with a flexible fluidic membrane lens. The wavelength dependency of the focal position shows the concave behavior of an achromatic system.


The x-coordinate displays the difference of the focal position from the paraxial reference for both object distances. The axial zero crossing is attributed to the reference wavelength of 546.1 nm. Both dependencies show a very similar trend, while the curve for the infinitely distant object is slightly shifted towards the blue wavelength range. In particular, the advantageous concave shape of these secondary-spectrum curves is in principle not achievable with a single dioptric lens. In the latter case, a steady, strong shift of the focus with increasing wavelength would occur, and achromaticity would not be possible. On the other hand, for a classical dioptric achromat, in which a positive and a negative lens of different Abbe numbers are combined, the secondary spectrum would show the opposite, convex curvature. However, since in a classical dioptric approach positive and negative optical powers have to be combined, they partly compensate each other, so that the overall length of the system becomes significantly longer.

The presented setup of a fixed lens with a large optical power combined with a position-fixed but curvature-variable lens of relatively low optical power is very similar to the basic working principle of the human eye. In the eye, the dominant optical power is contributed by the transition at the cornea, with a large change in refractive index from the outside air to water as the main medium of the eye. The variable crystalline lens of the human eye has only minor optical power but is needed for accommodation, which means that the crystalline lens is used to adjust for different object distances.

4. Artificial saccades: Gimbal tilt of the optical system and image reconstruction

A further characteristic of the human visual system is, on the one hand, the strong decay of resolution from the central part of the retina to its peripheral regions. Truly high resolution of about 1 arc min for a typical human eye [16] is restricted to the foveal region of the retina, which covers only a very small angular range of approximately 1°. On the other hand, our peripheral vision allows only the perception of very coarse spatial information but is sensitized for motion detection. Despite the huge differences between foveal and peripheral resolution, this discrepancy does not usually disrupt our seamless visual perception. A reason for the perceived impression of an apparently uniform and well-resolved scene is strongly linked to the saccadic movement of the eyes. Hereby the complete visual scene is addressed by the successive fixation of different object details, which are temporally separated by rapid eye movements.

With respect to field-dependent resolution, our compound optical system shows a behavior comparable to the natural model of the human eye. Due to the limited number of only two optical elements which form the system, the image quality also decays rapidly with increasing image height or field position. To compensate for these aberrations, which are dominated by coma, we follow the example of the saccadic eye movement of the natural model, i.e., we address a complete, laterally extended scene in a piecewise fashion. The optical system successively targets different field positions and takes an image from each orientation. In each of the recorded single images, a different part of the entire scene is well resolved, whereas the rest of the scene appears blurred. A complete and well-resolved image of the scene can then be processed by software that combines the relevant, highly resolved parts of the single images. In our lens system we introduced a simple tilting mechanism which allows the selection of different field positions of the observed scene. The implementation of the tilting mechanism is described in the following section.

Figure 4 depicts the calculated spot diagrams in the sensor plane for different tilting angles, and for different height positions, in a matrix representation. For the modelling we used a commercial optical design software (Zemax). In each spot diagram the underlying mesh structure represents the pixels of the sensor. The initial situation, in which the axis of the lens combination is pointing to the center of the image plane (0° tilting angle), is shown in the upper row of Fig. 4.


Fig. 4 Matrix representation of calculated spot diagrams in the sensor plane for different tilting angles and for different height positions. Upper row: Axis of the lens combination is pointing to the center of the image plane (0° tilting angle). The following rows (2-4) are associated with increasing field positions. The image position offering the maximum resolution follows the field position. Last row: Spot diagrams for a maximum tilting angle of 2° in a refocused state.


The upper left spot diagram depicts the high quality spot concentration for the on-axis imaging case. The spot diagrams from left to right show the changes of the imaging process for the same axis orientation but for different field positions. It is clearly observed that the optical aberrations strongly increase with increasing field position, and, in particular, coma is introduced. The comparison between the extension of the spot distribution and the size of the theoretical Airy-disc, which is represented as the small black circle in each diagram, can be used as a measure for the aberrations. For increasing field positions, we find that the spot distribution exceeds the Airy-disc by far.

The subsequent rows of the matrix correspond to increased tilting angles of the axis of the compound optical system with respect to the center of the sensor. Hence, the second row displays the spot diagrams for a tilting angle of 0.7° and for various field positions (increasing image height from left to right). In this case the best resolution, with a minimum size of the spot diagram, is found for a field position of 0.11 mm. Here it has to be mentioned that, in general, the chief ray of the system has to be oriented parallel to the symmetry axis of the optical group for each tilt position. Both the center position and the large field position are clearly affected by aberrations, with an extended, coma-like spot diagram. The fourth row shows the situation for a maximum tilting angle of 2°. Here the image becomes blurred in the center, but the spot diagram reveals a minimum extension for large image heights. Although in this situation the spot diagram is larger than the diffraction-limited Airy disc, the resolution still seems to be reasonable. Because the images for the different tilting angles are formed on a curved surface rather than on a plane, their best focus distance to the fixed detector plane increases with increasing field position. A simple refocusing of the compound system by the tunable lens allows a compensation of this effect. The last row of Fig. 4 shows again the spot diagrams for a maximum tilting angle of 2°, but now for a refocused system. It follows that a focus shift of −13 µm leads to a spot distribution for large image heights which is smaller than the Airy disc.

5. Experimental implementation and imaging results

To experimentally demonstrate the capability of the suggested tunable and rotating optical approach, we implemented a demonstrator of the imaging system.

In a first step, the tunable lens and the aspherical hybrid diffractive-refractive element have to be combined into one optical group. For this purpose it is essential to align the optical axes of both elements precisely with respect to each other, and to adjust the defined air spacing between the two optical elements within the required tolerances. This alignment procedure was carried out on a commercial optical centering instrument (Trioptics OptiCentric). The variable lens is fixed by a releasable adhesive on a plane-parallel auxiliary plate, with the plane glass substrate of the variable lens adjacent to the auxiliary plate. The auxiliary plate supporting the variable lens is placed on the optical centering instrument, roughly positioned, and set into rotation. With an autocollimator, the optical axis of the variable lens is aligned with respect to the reference axis of rotation. As a last step, both axes are aligned in such a way that they have the same position and point in the same direction.

The hybrid asphere is precisely fixed in the aperture of a mechanical adapter. The adapter is attached onto the variable lens, which is still fixed to the auxiliary plate and positioned in the optical centering instrument. The mechanical tolerances of the adapter guarantee a defined air-spacing distance between the hybrid asphere and the variable lens. In this state the hybrid-lens-carrying adapter is still laterally moveable with respect to both the variable lens and the auxiliary plate. Next, the combined lenses are set into rotation in the optical centering instrument, and the optical axis of the hybrid asphere is adjusted by laterally shifting and aligning the adapter. During this step, the auxiliary plate and the variable lens remain fixed. After this procedure the optical axes of the aspherical hybrid lens and the variable lens coincide, and both lenses are at the required distance with respect to each other. Finally, a UV-curing adhesive is applied for mechanical fixation of the variable lens and the adapter. A symmetric distribution of the adhesive compensates potential shrinking effects.

The compound optical group is then released from the auxiliary plate and integrated into a simple gimbal-mounting system. This optical unit and the image detector are separately mounted on an optical bench, where the axis of the optical unit and the center of the image detector have the same height. The distance between camera and optical unit along the z-axis of the bench is roughly adjusted. Under these conditions, a non-optimized image is already formed on the detector.

It has to be mentioned that the housing of the camera was modified so that spatial constraints could be overcome. For fine-tuning of the z-position of the detector with respect to the optical unit, the variable lens is relaxed so that the optical system is adjusted for long-distance imaging. For a remote object (distance approximately 4-5 m from the optical group) the detector is shifted along the z-axis until an optimized image of the object is observable on the detector (Imaging Source DFK 23U274).

To reduce system complexity, we used a mechanical gimbal-mounted system for the adjustment of the pivot axes with respect to the camera. This approach can easily be modified, for example to accommodate an electronically controlled, piezo-driven actuator. A schematic view of the compound optical group integrated into the gimbal mounting is depicted in Fig. 5.


Fig. 5 Schematic view of the compound optical group integrated into the gimbal-mounting system. The optical group consists of the fixed hybrid asphere and the tunable membrane lens. During the capturing of the images for the different target positions, the detector remains fixed.


Figure 6 shows a sequence of 5 images taken from a laterally extended scene. Five toy figures (“smurfs”) served as test objects, each with a height of about 4 cm. The object distance to the camera was approximately 1.5 m and the diagonal of the scene measured 0.35 m, so that a visual angle of 6.65° was addressed. In each image of the sequence, a different part of the scene is highly resolved, while the rest appears blurred. The highly resolved part covers ~1°, which is comparable to the foveal region of a human eye.


Fig. 6 A sequence of 5 images taken from a laterally extended scene. For each image, the pivot axis of the optical group points in a different direction, so that for each individual image a different part of the scene is highly resolved (highlighted by a white circle). The object distance to the camera was approximately 1.5 m and the diagonal of the scene was measured to be 0.35 m (visual angle ~6.65°).


To emphasize this characteristic, the highly resolved region of each individual image is highlighted by a white circle. In practice, for each sub-image, the pivot axis of the optical group is redirected to a new target position, whereas the position of the detector remains unchanged.

The decay of image sharpness with image position essentially follows the theoretical predictions for the sub-images (see Fig. 4). In particular, for each image the nearest neighbor of the targeted toy figure appears only slightly aberrated, but is still easily recognizable. With increasing distance, the image details are drastically reduced; for example, toy figures further away are so strongly blurred that only rough contours are visible.

Following the capture of the individual images of the scene, the final overall image is processed. With the aid of image-processing software (Heliconsoft, Helicon Focus), the highly resolved regions from each sub-image are selected and subsequently combined to form the resulting final overall image. In our current project status, the selection of the highly resolved details of each image is performed manually, a process which could be automated as well. Figure 7 shows the combined result for the images displayed in Fig. 6. All 5 “smurfs” at the different lateral positions appear highly resolved. Additionally, the built-in achromatic design of the optical system keeps unwanted color fringes to a minimum.


Fig. 7 Resulting composite image in which the highly resolved regions from each sub-image (Fig. 6) are combined to a final overall image.

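As a hedged sketch of how the manual selection step could be automated (the work presented here used Helicon Focus with manual selection), the per-pixel choice of the locally sharpest, pre-registered sub-image could be based on a Laplacian focus measure. The file names, parameters, and overall pipeline below are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

# Hedged sketch: automatic selection of the locally sharpest sub-image per pixel.
# Assumes the sub-images are already registered to a common frame.
def sharpness_map(gray: np.ndarray, blur_sigma: float = 5.0) -> np.ndarray:
    """Smoothed Laplacian magnitude as a simple local focus measure."""
    lap = cv2.Laplacian(gray.astype(np.float64), cv2.CV_64F)
    return cv2.GaussianBlur(np.abs(lap), (0, 0), blur_sigma)

def compose_sharpest(images: list) -> np.ndarray:
    """Per pixel, copy the value from the sub-image with the highest focus measure."""
    grays = [cv2.cvtColor(im, cv2.COLOR_BGR2GRAY) for im in images]
    focus = np.stack([sharpness_map(g) for g in grays])   # shape (N, H, W)
    best = np.argmax(focus, axis=0)                       # index of sharpest sub-image
    out = np.zeros_like(images[0])
    for i, im in enumerate(images):
        out[best == i] = im[best == i]
    return out

# Usage (file names are placeholders for the five registered sub-images):
# subimages = [cv2.imread(f"subimage_{k}.png") for k in range(5)]
# composite = compose_sharpest(subimages)
# cv2.imwrite("composite.png", composite)
```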

Additionally, in a second experiment, the variable-focus properties of the optical system are demonstrated. A series of 4 images is shown in Fig. 8, which subsequently targets far, intermediate, and near distances. In this example a very large, extended depth of field is addressed, ranging from “infinity” down to an object distance of only 20 cm (the distance from the lens to the ‘smurf’ in the foreground).


Fig. 8 Series of 4 images taken from a depth-extended scene. The optical setup targets objects that are separated at three distances from the optical system: far, intermediate, and near. For each of the images, both the focus and the pivot axis are readjusted.


The upper left image shows the ‘infinitely’ distant building in the center of the scene in focus and highly resolved. In the upper right image, the building on the left is highly resolved; both the focus and the pivot axis are adjusted to the relevant part of the scene. The lower left image shows the centrally located flower at an intermediate range in high resolution. Finally, in the lower right image, the ‘smurf’ in the foreground is highly resolved. For this image, too, the focus and the pivot axis have to be adjusted to the center of interest.

All images are taken from inside a room through a window, and therefore the lighting conditions vary strongly over the scene: the outside background is very bright, whereas the foreground within the room (flower and ‘smurf’) is less illuminated. To maintain a balanced contrast over the entire scene, the ‘smurf’ in the foreground therefore appears darker.

Again, an overall image is created from the individual images (see Fig. 9). It shows both a high lateral resolution and a remarkably extended, highly resolved depth of field, ranging from a few tens of centimeters up to infinity. The achieved overall depth of field far exceeds what is available from a comparable single-shot approach.


Fig. 9 Resulting composite image processed by recombining the highly resolved areas from each of the individual images of Fig. 8.


6. Conclusion

In the 19th century, the physiologist and physician Hermann von Helmholtz (1821–1894), who contributed significantly to the scientific understanding of the working principle of visual perception, once made fun of the imaging quality of the human eye. The English translation of his comment on all the aberrations associated with the human eye reads as follows: “Now, it is not too much to say that, if an optician wanted to sell me an instrument which had all these defects, I should think myself quite justified in blaming his carelessness in the strongest terms, and giving him back his instrument.” [17]. However, this famous quotation goes even further: ”… Of course, I shall not do this with my eyes, and shall be only too glad to keep them as long as I can.” Especially the second part of Helmholtz’s statement expresses the outstanding quality of the perceived overall performance of the human visual system. The essential aspect here is that the human visual perception system is not only a simple fixed-lens-based setup but also implies clever “data processing” and, in particular, involves tunable and moveable optical functions. Only the combination of these three aspects, namely optics, data processing, and especially the use of tunable and moveable optical components, allows the perception of sharp and brilliant impressions.

In this contribution we presented an approach which transfers this combined principle into a technical setup. An f-number-optimized imaging system with variable focus was implemented through the combination of an aspherical hybrid refractive-diffractive lens with a flexible fluidic membrane lens. Although the image quality is reduced for large field positions, a tilting mechanism modelling the saccadic movement of human eyes allows this quality loss to be compensated. The system therefore allows the generation of overall images that are adequately depth- and field-resolved, and it is also characterized by its simplicity and compactness. In principle, it is also possible to develop an optical system which offers high image quality over a largely extended field, but in that case the simplicity and compactness are completely lost and the number of optical elements increases drastically.

In the implementation presented, the pivot movement is operated mechanically, and the selection of the individual target field positions and focal distances is externally controlled. However, there is no argument in principle against an extension of the system to include an electrically driven pivot rotation and automatically controlled image-recognition software to select the essential field and focus positions. Moreover, the implementation of online image-processing software would allow a highly resolved “live image” of the observed scene.

Acknowledgment

Very special thanks are given to Jonas Alteneder who supported the processing of the individual sub-images and also to Julie Montoisy for the possibility to use “smurfs” as test figurines. This work was funded by the German Research Foundation DFG within the Priority Program “Active Micro-optics” under the project EAGLE-II.

References and links

1. C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292(4), 497–523 (1990). [CrossRef]   [PubMed]  

2. A. L. Yarbus, Eye Movements and Vision (Plenum, 1967).

3. M. F. Land and D.-E. Nilsson, Animal Eyes (Oxford University, 2002).

4. M. Ott, “Visual accommodation in vertebrates: mechanisms, physiological response and stimuli,” J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 192(2), 97–111 (2006). [CrossRef]   [PubMed]  

5. J. Draheim, F. Schneider, R. Kamberger, C. Mueller, and U. Wallrabe, “Fabrication of a fluidic membrane lens system,” J. Micromech. Microeng. 19(9), 095013 (2009). [CrossRef]  

6. J. Draheim, T. Burger, F. Schneider, and U. Wallrabe, “Fluidic zoom lens system using two single chamber adaptive lenses with integrated actuation,” in Proceedings of IEEE Conference on MEMS2011 (IEEE 2011), pp. 692–695. [CrossRef]  

7. G. Beadie, M. L. Sandrock, M. J. Wiggins, R. S. Lepkowicz, J. S. Shirk, M. Ponting, Y. Yang, T. Kazmierczak, A. Hiltner, and E. Baer, “Tunable polymer lens,” Opt. Express 16(16), 11847–11857 (2008). [CrossRef]   [PubMed]  

8. F. Schneider, J. Draheim, R. Kamberger, P. Waibel, and U. Wallrabe, “Optical characterization of adaptive fluidic silicone-membrane lenses,” Opt. Express 17(14), 11813–11821 (2009). [CrossRef]   [PubMed]  

9. “Arctic 39N0 Family,” http://www.varioptic.com/products/variable-focus/arctic-39n0/.

10. “Optical image capturing lens system,” patent application US 2013/0021680 A1.

11. X. Peng, “Design of High Pixel Mobile Phone Camera Lens,” Res. J. Appl. Sci. Eng. Technol. 6, 1160–1165 (2013).

12. T. Steinich and V. Blahnik, “Optical design of camera optics for mobile phones,” Adv. Opt. Technol. 1, 51–58 (2012).

13. S. Kuiper and B. H. W. Hendriks, “Variable-focus liquid lens for miniature cameras,” Appl. Phys. Lett. 85(7), 1128–1130 (2004). [CrossRef]  

14. K. Newman and K. Stephens, “Analysis of gravitational effects on liquid lenses,” Proc. SPIE 8450, 84500G (2012). [CrossRef]  

15. J. Draheim, F. Schneider, T. Burger, R. Kamberger, and U. Wallrabe, “Single chamber adaptive membrane lens with integrated actuation,” in 2010 International Conference on Optical MEMS and Nanophotonics, (2010), pp. 15–16. [CrossRef]  

16. F. Träger, ed., Springer Handbook of Lasers and Optics (Springer, 2007).

17. H. von Helmholtz, Popular Scientific Lectures (Dover, 1962).
