
Single-frequency 3D synthetic aperture imaging with dynamic metasurface antennas


Abstract

Through aperture synthesis, an electrically small antenna can be used to form a high-resolution imaging system capable of reconstructing three-dimensional (3D) scenes. However, the large spectral bandwidth typically required in synthetic aperture radar systems to resolve objects in range often requires costly and complex RF components. We present here an alternative approach based on a hybrid imaging system that combines a dynamically reconfigurable aperture with synthetic aperture techniques, demonstrating the capability to resolve objects in three dimensions (3D), with measurements taken at a single frequency. At the core of our imaging system are two metasurface apertures, both of which consist of a linear array of metamaterial irises that couple to a common waveguide feed. Each metamaterial iris has integrated within it a diode that can be biased so as to switch the element on (radiating) or off (non-radiating), such that the metasurface antenna can produce distinct radiation profiles corresponding to different on/off patterns of the metamaterial element array. The electrically large size of the metasurface apertures enables resolution in range and one cross-range dimension, while aperture synthesis provides resolution in the other cross-range dimension. The demonstrated imaging capabilities of this system represent a step forward in the development of low-cost, high-performance 3D microwave imaging systems.

© 2018 Optical Society of America

1. INTRODUCTION

Radio-frequency (RF) imaging is well suited to many sensing applications because microwaves can penetrate many optically opaque materials, enabling the reconstruction of three-dimensional (3D) images of otherwise hidden objects [1–11]. Driven by the need for high-resolution RF images, systems based on synthetic aperture radar (SAR) and massive multiple-input, multiple-output platforms have been developed for applications ranging from security screening to biomedical imaging [12–21]. Such imaging systems can successfully image complex scenes, but stringent hardware requirements have remained a factor inhibiting their widespread deployment. In recent works, alternative imaging systems with simplified hardware requirements have been developed based on the combination of metasurface apertures with computational imaging techniques [9,22–27]. Still, reliance on a large bandwidth to resolve objects in range imposes significant minimum hardware requirements on the RF backend, necessitating specialized sources, filters, and feed networks [28]. A wide bandwidth system also complicates calibration procedures and increases the chance of interference from other devices, a particularly troublesome concern as the RF spectrum becomes increasingly congested [29–31]. As a means of addressing these challenges, there has been a growing interest in imaging systems that can operate with reduced spectral bandwidth [32,33].

Although the use of frequency bandwidth to obtain range information has been typical in RF imaging systems, bandwidth is only strictly required when the object lies in the far field of the aperture [12,19]. When instead the object lies within the radiating near field (or Fresnel zone) of the aperture, both range and cross-range information can be obtained solely from the aperture extent and the associated spatial frequencies. Fromenteze et al., for example, recently demonstrated single-frequency microwave imaging using mechanically translated horn antennas [34]. In this system, a multistatic uniform linear array (ULA) was synthesized which could reconstruct two-dimensional (2D) images (range and one cross-range dimension).

Among the available hardware platforms capable of supporting single-frequency microwave imaging, waveguide-fed metasurface antennas are a particularly attractive option due to their low profile and lightweight form factor, low production cost, and single feed [24,25,35]. Waveguide-fed metasurface antennas consist of planar waveguide structures in which subwavelength metamaterial resonators are patterned into one of the confining metallic surfaces. The metamaterial elements couple energy from the guided wave to free-space radiation [22]. By incorporating individually addressable, tunable components within each metamaterial resonator, a dynamic metasurface is formed that can generate a multitude of radiation patterns as a function of the tuning state, irrespective of frequency [36]. Dynamically tuned metasurfaces offer reconfigurable radiation patterns without requiring phase shifters or other integrated components that can drive up system costs. These patterns can be tailored to exhibit a variety of features, such as spatial diversity [24], steered beams [37,38], or multiple lobes [37,39], motivating their development for use in microwave imaging and communications [22,24,40,41]. In [42], dynamic metasurfaces were used to perform single-frequency microwave imaging in 2D, demonstrating that electrically large transmit and receive apertures could be used in place of the time-consuming, mechanically translated synthetic aperture measurement process used in [34].

Building on the 2D stationary, single-frequency metasurface imaging system [42], we extend the single-frequency approach to form a synthetic aperture for full 3D imaging. To form the effective aperture, a pair of metasurface antennas is translated over an area as shown in Fig. 1. This system can extract range (x) and cross-range (y) information from the electrically large aperture extent, while movement of the system (along the height direction) provides height (z) information. To probe a scene, the metasurface antennas apply collections of pseudorandom tuning states to generate radiation patterns that exhibit low spatial correlation [22,24,25]. This illumination strategy, coupled with the movement associated with aperture synthesis, leads to an imaging system that can be conceptualized as producing sets of plane waves from various angles of incidence. The superposition of plane waves thereby interrogates multiple spatial frequency (k space) components, in x, y, and z, simultaneously. This operation enables 3D imaging without relying on a wide bandwidth for range information.

Fig. 1. Conceptual diagram of a single-frequency imaging system using metasurface antennas. One metasurface antenna serves as the transmitter while another receives the backscattered signal. At each synthetic aperture position, measurements are taken by cycling through combinations of spatially diverse radiation patterns on the transmitter and receiver.

The large data sets resulting from the single-frequency synthetic aperture measurement just described lead us to the use of the range migration algorithm (RMA) for image reconstruction [43,44]. The RMA is a popular image-reconstruction method for SAR measurements due to its efficient handling of large data sets, facilitated by the use of fast Fourier transforms [45–47]. In addition to SAR imaging systems, Fourier processing with range-migration techniques has led to demonstrations of other multistatic imaging systems in the microwave regime [8,18,48]. To process measurements with the RMA, the spatial frequency components probed by the imaging system must be known [5,18,49]. In contrast to conventional SAR systems, we utilize a virtually multistatic imaging system in the near field, which probes distinct sets of spatial frequency components with each spatial position of the antenna as well as with each tuning state of the antenna. As a result of the operational differences between the single-frequency imager and conventional SAR systems, we mathematically analyze the signal acquired by our proposed system in Fourier space, reveal its spatial frequency coverage, and formulate the corresponding dispersion relation. In addition to deriving the system’s dispersion relation, a pre-processing step must be applied due to the nontraditional illumination strategy [44,50]. Incorporating the known dispersion relation along with the pre-processing step results in a specifically adapted version of the RMA that can efficiently reconstruct single-frequency synthetic aperture measurements taken with dynamic metasurfaces.

Single-frequency synthetic aperture operation, as described in this work, combines the hardware benefits of metasurface apertures with the imaging capabilities granted by aperture synthesis. By avoiding large spectral bandwidth, the single-frequency imaging system can make use of less expensive, low-bandwidth components as compared with traditional SAR systems. As demonstrated in this work, metasurface antennas, when combined with aperture synthesis and efficient reconstruction methods, ultimately lead to an imaging system that can measure a scene and reconstruct 3D images. This type of imaging system stands to improve the hardware and capabilities of microwave imaging systems used for biomedical imaging, security screening, and through-wall imaging. Even in the case of targets moving relative to the aperture, where Doppler shifts impose a minimal bandwidth, the single-frequency approach described here may still be adapted to form systems with lower overall bandwidth requirements.

In this paper, the operation of a 3D single-frequency synthetic aperture imaging system is detailed. A mathematical derivation of the signal measured by the system is provided. The details of this derivation are used to facilitate image reconstruction using an extension of the range-migration algorithm. The specific dynamic metasurface antenna design used for imaging is described, as well as the overall experimental setup. Finally, experimentally obtained 3D images are presented for various objects, providing a clear demonstration of the system’s ability to resolve objects in all three dimensions.

2. SYNTHETIC APERTURE IMAGING WITH A SINGLE FREQUENCY

Active microwave imaging systems illuminate a scene and reconstruct images from measurements of the backscattered signal [1,15,51–53]. Each illuminating waveform interrogates the scene with a particular spatial frequency component. Probing a span of spatial frequencies in a given direction allows one to resolve spatial information along that direction. Consequently, the performance of an imaging system can be assessed by analyzing the spatial frequency coverage (or k coverage) probed by the transmitter and receiver. Covering a range of k components in a certain direction, $k_u$ for example, will enable spatial resolution $\delta_u$ derived from the span of the spatial frequencies in that direction, $\Delta k_u = k_{u,\mathrm{max}} - k_{u,\mathrm{min}}$ [20,54]. Traditional microwave imaging devices can probe a span of cross-range k components either by using a large physical aperture or using mechanical motion, which allows plane waves with different k components to illuminate the scene to extract cross-range information. To probe a large span of range k components, a large spectral bandwidth is often used. The combination of cross-range k coverage, availed by a large aperture size, with the range k coverage, availed by frequency bandwidth, allows the transmitted/received waves to exhibit structure in both range and cross range [12,55,56]. An alternative approach to 2D microwave imaging has recently been proposed, in which a multistatic ULA can extract both range and cross-range information from measurements taken at a single frequency point as a function of system geometry. To describe this operation, a sample monostatic SAR system is compared with a sample multistatic ULA, in terms of k space coverage, to assess the imaging capabilities of these systems.
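To make the resolution relation concrete, the short sketch below (our own illustration, not drawn from the experiments in this work) estimates range and cross-range resolution from the span of probed spatial frequencies as $\delta_u \approx 2\pi/\Delta k_u$, assuming an illustrative 1 m aperture viewed from a 0.5 m standoff at the 19.03 GHz operating frequency used later.

```python
# Minimal sketch (illustrative geometry assumed): resolution estimated from the
# span of probed spatial frequencies, delta_u ~ 2*pi / delta_k_u.
import numpy as np

c = 3e8                               # speed of light (m/s)
k = 2 * np.pi * 19.03e9 / c           # free-space wavenumber at the operating frequency

# Assumed geometry: 1 m aperture centered on a target 0.5 m away,
# so the largest one-way incidence angle is atan(0.5 / 0.5).
theta_max = np.arctan2(0.5, 0.5)

# Spans of the total (transmit + receive) k components probed by the aperture.
dky = 2 * (2 * k * np.sin(theta_max))      # cross-range span
dkx = 2 * (k - k * np.cos(theta_max))      # range span, even at a single frequency

print(f"cross-range resolution ~ {2 * np.pi / dky * 100:.1f} cm")
print(f"range resolution       ~ {2 * np.pi / dkx * 100:.1f} cm")
```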

A monostatic SAR system takes a frequency sweep from a transceiver at a series of SAR positions to cover an area in k space as a function of position ($r_s$) and frequency ($f$) [12,57]. As an example, we consider a monostatic system with 12 SAR locations, evenly spaced along the y direction, spanning 2 m, and with six frequency points covering a 20% fractional bandwidth. At each SAR location, the angle to the target, located at (1 m, 0 m), is calculated and the plane wave probing the target is projected onto the x and y directions to determine the illuminating wavenumber, $(k_{x_t}, k_{y_t})$. Here, we can approximate the illuminating waves as plane waves by assuming that the target is an infinitesimal point in the far field of the transmitter [34]. The signal is collected by the receive antenna with a $(k_{x_r}, k_{y_r})$ pair associated with the same incidence angle as the transmitter. These k values are added together to determine the overall $(k_x, k_y)$ probed by the imaging system as $k_x = k_{x_t} + k_{x_r}$ and $k_y = k_{y_t} + k_{y_r}$. This process is repeated for each SAR location and frequency to determine the full k space coverage of the imaging system, as shown in Fig. 2(a). It is important to note that due to the monostatic SAR operation, the transmitting and receiving incidence angles will always be the same. Additionally, although this analysis is performed with respect to an infinitesimal point target, a similar analysis could be applied to a range of targets with similar results.
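A minimal sketch of this monostatic k coverage calculation is given below. It is our own illustration; the center frequency is an assumed value, since the example in the text specifies only the fractional bandwidth.

```python
# Minimal sketch (assumed center frequency): k-space coverage of a monostatic
# SAR with 12 positions over 2 m in y and six points of 20% fractional bandwidth.
import numpy as np

c = 3e8
f0 = 19.03e9                                    # assumed center frequency
freqs = np.linspace(0.9 * f0, 1.1 * f0, 6)      # 20% fractional bandwidth
ys = np.linspace(-1.0, 1.0, 12)                 # SAR positions along y (m)
target_x, target_y = 1.0, 0.0                   # point target at (1 m, 0 m)

kx, ky = [], []
for y_s in ys:                                   # each SAR location
    angle = np.arctan2(target_y - y_s, target_x) # incidence angle to the target
    for f in freqs:                              # each frequency point
        k = 2 * np.pi * f / c
        # Monostatic: transmit and receive share the same incidence angle,
        # so each probed component is twice the one-way projection.
        kx.append(2 * k * np.cos(angle))
        ky.append(2 * k * np.sin(angle))

print(f"kx span {max(kx) - min(kx):.0f} rad/m, ky span {max(ky) - min(ky):.0f} rad/m")
```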

Fig. 2. Schematic description of two styles of microwave imaging systems. (a) Monostatic SAR system covering an area in k space as a function of SAR position and frequency (B is the spectral bandwidth). (b) Multistatic array covering an area in k space as a function of the transmit and receive aperture size, while only using a single frequency point.

In contrast to a monostatic SAR system, a multistatic array involves distributing antennas throughout a large aperture extent and using different combinations of antennas (as transmitters and receivers) to form a measurement set. To describe how a multistatic linear array can image with measurements taken at a single frequency, a k coverage map was calculated (in the same way as the monostatic SAR system coverage map described in the previous paragraph). Here, the SAR locations from the monostatic SAR case are used as the antenna locations for a multistatic uniform linear array in order to ensure a fair comparison. In the case of a multistatic linear array, one antenna is selected as the transmitter and another as the receiver. The incidence angle from the transmitter to the target determines the unique plane wave emanating from the transmitter with a given $(k_{x_t}, k_{y_t})$. Since there is a distinct receiver antenna, due to the multistatic operation, the $(k_{x_r}, k_{y_r})$ is different from that of the transmitter. Once again, adding the k space pairs associated with the plane waves from the transmitter and receiver determines the overall $(k_x, k_y)$ probed by one measurement mode. This process is repeated for the exhaustive combination of each transmitter with each receiver. In this way, an area in k space is covered not by using a large spectral bandwidth, but by the measurement paths created by the different combinations of antennas as transmitters and receivers. The resulting k coverage map for a multistatic uniform linear array is shown in Fig. 2(b), where it is clear that a sufficient 2D area in k space can be covered by measurements taken at a single frequency point. In Section 3, the approximations made here are abandoned and the signal measured by a single-frequency SAR system is mathematically derived.
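The multistatic counterpart of the previous sketch is shown below; again this is our own illustration, reusing the same 12 antenna locations but with a single frequency point.

```python
# Minimal sketch (illustrative): single-frequency k-space coverage of a
# multistatic ULA using the same 12 antenna locations as the monostatic case.
import numpy as np

c = 3e8
k = 2 * np.pi * 19.03e9 / c                      # single frequency point
ys = np.linspace(-1.0, 1.0, 12)                  # antenna positions along y (m)
target_x, target_y = 1.0, 0.0

kx, ky = [], []
for y_t in ys:                                    # every antenna as transmitter
    for y_r in ys:                                # every antenna as receiver
        angle_t = np.arctan2(target_y - y_t, target_x)
        angle_r = np.arctan2(target_y - y_r, target_x)
        # Distinct transmit/receive angles: sum the two plane-wave components.
        kx.append(k * np.cos(angle_t) + k * np.cos(angle_r))
        ky.append(k * np.sin(angle_t) + k * np.sin(angle_r))

print(f"{len(kx)} Tx/Rx pairs cover kx span {max(kx) - min(kx):.0f} rad/m "
      f"and ky span {max(ky) - min(ky):.0f} rad/m at one frequency")
```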

Practically demonstrating single-frequency imaging, as described in Fig. 2(b), can result in prohibitively time-consuming mechanical raster scanning as in [34], or a costly, complex array of antennas. In [42], these practical shortcomings were overcome by utilizing electrically large, singly fed metasurface apertures. The specific metasurfaces utilized in this work are waveguide-fed structures which selectively leak energy from a waveguide mode into free space as radiation [22,24]. Electronic stimuli are used to address specific metamaterial elements [58] in order for the metasurface to radiate from an arbitrary collection of elements. By radiating different spatial patterns, different combinations of $(k_x, k_y)$ pairs can be simultaneously measured because these apertures exhibit spatial variation along the range and cross-range directions [42]. This illumination strategy is depicted in Fig. 3. Metasurface antennas can generate spatially distinct radiation patterns at a rapid rate, each of which probes a superposition of $(k_x, k_y)$ pairs. While these apertures multiplex the information from many incidence angles, post-processing techniques can isolate the contributions of each transmit and receive element and virtually achieve the operation described in Fig. 2(b).

Fig. 3. Diagram of the illumination strategy for a single-frequency, metasurface antenna imaging system. The metasurfaces measure the scene by cycling through combinations of transmit and receive radiation patterns (each aperture is set with a tuning state, $T_n$). This process is repeated at each synthetic aperture position as the metasurfaces are moved together, with the synthetic aperture path oriented along the z direction. Note that the transmitter and receiver do not necessarily apply the same tuning state during a given measurement.

Single-frequency microwave imaging with metasurfaces can be supplemented by incorporating mechanical motion along the z direction (as in the coordinate system shown in Fig. 1) to provide height information. In this manner, the system performs stripmap SAR, but with an unconventional illumination strategy in the direction perpendicular to the SAR path [12]. Note that the virtual multistatic operation is maintained with respect to the range and cross-range directions, but the imaging system is operating in a monostatic manner with respect to the height direction. In this mode of operation, a virtual 2D multistatic aperture is formed, with each of the transmit and receive apertures covering independent areas. Similar imaging configurations, including different metasurface antenna placement or different synthetic aperture geometries, may be considered with a similar analysis. Different synthetic aperture paths could realize denser k space coverage, which could help to reduce aliasing effects. Since the geometry shown in Fig. 1 probes the scene from the maximum and minimum incidence angles available to the synthetic aperture footprint, different synthetic aperture paths would not extend the limits of the k space coverage unless the overall footprint of the synthetic aperture was expanded. In this work, due to experimental configuration restrictions, we limit our analysis to the system shown in Fig. 1.

To better illustrate the utility of the proposed imaging system, we briefly compare our imaging system with a more conventional architecture. In our system, we use two 1D metasurface apertures, each spanning 25λ, which are positioned side by side. If the transmit and receive metasurfaces were replaced with arrays of independent antennas, 50 independent antennas would be required for transmission and another 50 for reception, assuming Nyquist (λ/2) sampling. Thus, the metasurface hardware allows for a reduction from 100 to 2 independent antennas. If the full area covered by the mechanical translation of the metasurface antennas were instead replaced with a solid-state array of antennas, 5000 independent antennas would be required.
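The element counts above follow directly from λ/2 sampling; a quick check of the arithmetic (our own, using the 19.03 GHz operating frequency and the 40 cm synthetic path described later) is sketched below.

```python
# Quick arithmetic check (our own) of the antenna-count comparison above.
wavelength = 3e8 / 19.03e9                            # ~1.58 cm at 19.03 GHz
aperture = 25 * wavelength                            # each 1D metasurface spans 25 wavelengths
n_per_row = int(round(aperture / (wavelength / 2)))   # Nyquist (lambda/2) sampling -> 50 elements
print(2 * n_per_row, "conventional antennas replaced by 2 metasurfaces")

n_rows = int(0.40 // (wavelength / 2))                # ~50 rows over the 40 cm synthetic path
print(2 * n_per_row * n_rows, "antennas for a fully populated 2D array")
```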

Incorporating motion along the height dimension results in a different set of relations governing the k space coverage of the imaging system. To account for this behavior, the different k space contributions must be accounted for within the reconstruction method, which can be accomplished by deriving the signal in Fourier space. Ultimately, full 3D information can be obtained with this operation, in which range and cross-range information is extracted from the aperture size and height information is extracted from mechanical motion.

3. SIGNAL DESCRIPTION

Due to the large data sets that result from single-frequency SAR operation, we make use of the range migration algorithm (RMA) for post-processing. To implement the RMA, the spatial frequency components probed by the system must be known and the nontraditional illumination strategy (highlighted in Fig. 3) must be accounted for [5,18,43,44,50]. In this section, these two considerations are addressed, leading to an implementation of the RMA which is specific to the imaging system described in this work.

To understand the pre-processing step required for measurements taken with metasurface apertures, a brief description of these measurements is provided here. Measurements taken with a metasurface-based, single-frequency SAR imaging system are a function of transmit tuning state, receive tuning state, and synthetic aperture position [38]. To facilitate reconstruction of measurements taken this way with the RMA, a pre-processing step must be used to isolate the independent contributions of each virtual source [44]. Each metamaterial radiator may be modeled as an infinitesimal effective dipole which may be considered as an independent source/receiver [23,59–63]. The transmitted fields can then be written in decomposed matrix form as $E_t = \phi G$, where $\phi$ is a matrix of the fields at the surface of the aperture and $G$ is the propagation matrix of those fields to the scene (and analogously for the received fields). Through a matrix decomposition of $\phi$, the multiplexed measurements taken as a function of transmit/receive tuning state can be transformed into the equivalent measurements taken as a function of independent transmit/receive elements [64]. This pre-processing step thus generates a measurement for each combination of transmit/receive element [44]. In summary, the pre-processing step transforms the signal in the following manner:

$$S(T_t, T_r, z_s) \rightarrow S_T(y_t, y_r, z_s). \qquad (1)$$
Here, $T_t$ and $T_r$ represent the tuning state of the transmitter and receiver, respectively, $y_t$ and $y_r$ represent the positions of the transmitting and receiving virtual sources, respectively, and $z_s$ is the synthetic aperture position.

In order to carry out the described pre-processing step, the fields radiated by each aperture must be known for each tuning state. To achieve this step in the experimental system described subsequently, we perform a near-field scan using a reference antenna to obtain the response of each aperture [65]. A set of pseudorandom tuning states is measured for both the transmit and receive metasurfaces, with a subset of the measured tuning states used for each experiment [24]. The measured fields are used to convert the measured signals to those of effective, independent sources characterized by the near-field scan [22].
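As an illustration of this decomposition, the sketch below (our own, with synthetic random data in place of the measured near-field scans, and with array sizes assumed from Section 4) inverts the measured field matrices with pseudo-inverses to convert tuning-state-indexed measurements into virtual-source-indexed measurements, in the spirit of Eq. (1).

```python
# Minimal sketch (synthetic data, assumed sizes) of the Eq. (1) pre-processing:
# measurements indexed by Tx/Rx tuning state are converted to measurements
# indexed by effective dipole (virtual source) positions via pseudo-inverses
# of the field matrices characterized by the near-field scan.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_dipoles, n_sar = 65, 112, 54        # tuning states, elements, SAR positions

# One row per tuning state: fields of each aperture at its own surface
# (in the experiment these come from the near-field scan).
phi_t = rng.standard_normal((n_states, n_dipoles)) + 1j * rng.standard_normal((n_states, n_dipoles))
phi_r = rng.standard_normal((n_states, n_dipoles)) + 1j * rng.standard_normal((n_states, n_dipoles))

# Multiplexed measurements S(T_t, T_r, z_s); random placeholders here.
S = rng.standard_normal((n_states, n_states, n_sar)) + 1j * rng.standard_normal((n_states, n_states, n_sar))

# Assumed bilinear model: S[t, r, s] = sum_ij phi_t[t, i] * S_T[i, j, s] * phi_r[r, j],
# so both sides are inverted with pseudo-inverses of the field matrices.
phi_t_pinv = np.linalg.pinv(phi_t)               # (n_dipoles, n_states)
phi_r_pinv = np.linalg.pinv(phi_r)
S_T = np.einsum('it,trs,jr->ijs', phi_t_pinv, S, phi_r_pinv, optimize=True)

print(S_T.shape)                                 # (112, 112, 54): S_T(y_t, y_r, z_s)
```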

After applying the pre-processing step described in Eq. (1), the signal as represented by the transmit and receive dipoles can be reconstructed. To properly reconstruct signals measured by such an imaging system using the RMA formulation, the k space relations of the imaging system must be properly described; this is done through the dispersion relation of the system. To determine the dispersion relation of the system, a derivation of the signal in Fourier space must be conducted (in a similar manner to the derivations described in [5,44]). For a signal which is a function of the transmit element locations ($y_t$), receive element locations ($y_r$), and synthetic aperture positions ($z_s$), the phase term of the transmitted field, $E_t$, can be isolated according to the propagation by the Green's function as $e^{-jkR_t}$ (and similarly for the received field, $E_r$). Here, and throughout this work, $k$ is the free space wavenumber, defined as $k = 2\pi f/c$, which also satisfies $k = \sqrt{k_x^2 + k_y^2 + k_z^2}$. By combining $E_t$ and $E_r$ (and neglecting the amplitude term) along with the scene reflectivity ($\sigma$), the measured signal can be written as

$$S_T = \int_V \sigma(x, y, z)\, e^{-jkR_t}\, e^{-jkR_r}\, dV, \qquad (2)$$
where
$$R_t = \sqrt{x^2 + (y - y_t)^2 + (z - z_t)^2}, \qquad R_r = \sqrt{x^2 + (y - y_r)^2 + (z - z_r)^2}. \qquad (3)$$
Here, $z_t$ and $z_r$ are the positions in the z direction of the transmitter and receiver, respectively. The Fourier transform of the transformed signal ($S_T$), in the $z_t$, $z_r$, $y_t$, and $y_r$ dimensions, can then be calculated as
$$\hat{S}_T = \int_V \sigma \int_{y_t}\!\int_{z_t} e^{-jkR_t}\, e^{-jk_{y_t} y_t}\, e^{-jk_{z_t} z_t} \int_{y_r}\!\int_{z_r} e^{-jkR_r}\, e^{-jk_{y_r} y_r}\, e^{-jk_{z_r} z_r}\, dz_r\, dy_r\, dz_t\, dy_t\, dV. \qquad (4)$$

Since the transmit and receive locations have been distinctly defined, the contributions of the transmitter and receiver are separable. To solve this equation, we first address the contribution of the transmitter, which is

$$\int_{y_t}\!\int_{z_t} e^{-j(kR_t + k_{y_t} y_t + k_{z_t} z_t)}\, dy_t\, dz_t. \qquad (5)$$
This integral can be solved by applying the method of stationary phase, which yields the stationary points $y_{t,s}$ and $z_{t,s}$:
$$y_{t,s} = y \pm \frac{k_{y_t}\, x}{\sqrt{k^2 - k_{y_t}^2 - k_{z_t}^2}}, \qquad (6)$$
$$z_{t,s} = z \pm \frac{k_{z_t}\, x}{\sqrt{k^2 - k_{y_t}^2 - k_{z_t}^2}}. \qquad (7)$$
Thus, the solution to the integral shown in Eq. (5) can be written by substituting Eqs. (6) and (7) to obtain
$$e^{-j\left(\sqrt{k^2 - k_{y_t}^2 - k_{z_t}^2}\, x + k_{y_t} y + k_{z_t} z\right)}. \qquad (8)$$
In an analogous manner, the contribution of the received signal can be solved to be
$$e^{-j\left(\sqrt{k^2 - k_{y_r}^2 - k_{z_r}^2}\, x + k_{y_r} y + k_{z_r} z\right)}. \qquad (9)$$

Using the solutions from Eqs. (8) and (9), Eq. (4) can then be written as

$$\hat{S}_T = \int_V \sigma\, e^{-j\left(\sqrt{k^2 - k_{y_t}^2 - k_{z_t}^2}\, x + k_{y_t} y + k_{z_t} z\right)}\, e^{-j\left(\sqrt{k^2 - k_{y_r}^2 - k_{z_r}^2}\, x + k_{y_r} y + k_{z_r} z\right)}\, dV. \qquad (10)$$
From this solution, the dispersion relations of the imaging system can be regrouped to write Eq. (10) as
$$\hat{S}_T = \int_V \sigma\, e^{-j(k_x x + k_y y + k_z z)}\, dV, \qquad (11)$$
where $k_x$, $k_y$, and $k_z$ are
$$k_x = k_{x_t} + k_{x_r} = \sqrt{k^2 - k_{y_t}^2 - k_{z_t}^2} + \sqrt{k^2 - k_{y_r}^2 - k_{z_r}^2}, \qquad (12)$$
$$k_y = k_{y_t} + k_{y_r}, \qquad (13)$$
$$k_z = k_{z_t} + k_{z_r}. \qquad (14)$$

At this point, the relations governing the k space sampling of the imaging system have been identified. Properly modeling the dispersion relation of the system is paramount to successfully reconstructing SAR measurements [5,44]. In this imaging configuration, $k_{y_t} = k_{y_r} = [-\pi/\Delta y_a,\ \pi/\Delta y_a]$, which is determined by the spacing, $\Delta y_a$, between the neighboring virtual dipole sources representing the metasurface. Meanwhile, $k_{z_t} = k_{z_r} = [-\pi/(2\Delta z),\ \pi/(2\Delta z)]$, with $\Delta z$ determined by the synthetic aperture step size. In this work, the transmitter and receiver are moved concurrently, meaning that measurements are only taken when $z_t = z_r$. Once these dispersion equations have been defined, Stolt interpolation can be performed within the range migration processing to map the signal from the $k_{y_t}$, $k_{y_r}$, and $k_z$ domain to the $k_x$, $k_y$, and $k_z$ domain, following Eqs. (12)–(14).
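The sketch below (our own, with illustrative grids and a single fixed $k_z$ slice for brevity) shows how the dispersion relation of Eqs. (12)–(14) maps the nonuniform $(k_{y_t}, k_{y_r})$ samples onto $(k_x, k_y)$, and how Stolt interpolation then resamples the spectrum onto a regular grid.

```python
# Minimal sketch (illustrative grids, single k_z slice) of the dispersion mapping
# in Eqs. (12)-(14) followed by Stolt interpolation onto a regular (k_x, k_y) grid.
import numpy as np
from scipy.interpolate import griddata

c = 3e8
k = 2 * np.pi * 19.03e9 / c
dy_a = 3.33e-3                                   # assumed virtual-source spacing (m)

kyt = np.linspace(-np.pi / dy_a, np.pi / dy_a, 64)
kyr = np.linspace(-np.pi / dy_a, np.pi / dy_a, 64)
KYT, KYR = np.meshgrid(kyt, kyr, indexing='ij')
kzt = kzr = 0.0                                  # fixed height-wavenumber slice

# Keep only propagating components (k_y^2 + k_z^2 < k^2) before the mapping.
valid = (KYT**2 + kzt**2 < k**2) & (KYR**2 + kzr**2 < k**2)
kxt = np.sqrt(k**2 - KYT[valid]**2 - kzt**2)
kxr = np.sqrt(k**2 - KYR[valid]**2 - kzr**2)
KX, KY = kxt + kxr, KYT[valid] + KYR[valid]      # Eqs. (12) and (13)

# Placeholder spectrum on the nonuniform (KX, KY) samples; in practice this is
# the Fourier-transformed signal after the demodulation and matched filtering
# described below.
spectrum = np.exp(-1j * KX * 0.35)               # point-like target 0.35 m downrange

kx_reg = np.linspace(KX.min(), KX.max(), 128)
ky_reg = np.linspace(KY.min(), KY.max(), 128)
KXg, KYg = np.meshgrid(kx_reg, ky_reg, indexing='ij')
stolt = griddata((KX, KY), spectrum, (KXg, KYg), method='linear', fill_value=0)
print(stolt.shape)                               # regular grid, ready for the inverse FFT
```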

Note that the signal must also be spatially demodulated to account for the center positions of the transmitter and receiver, following [34,42]; mathematically, this is implemented as

$$S^{*} = S\, e^{j(k_x x_{c,t} + k_y y_{c,t} + k_z z_{c,t})}, \qquad S^{*} = S\, e^{j(k_x x_{c,r} + k_y y_{c,r} + k_z z_{c,r})}. \qquad (15)$$
The positions of the centers of the transmitter and receiver are $(x_{c,t}, y_{c,t}, z_{c,t})$ and $(x_{c,r}, y_{c,r}, z_{c,r})$, respectively.

To summarize the specific RMA implementation used in this work, the following steps are performed. First, a matrix decomposition is applied to pre-process the measured signal, following Eq. (1). Then a Fourier transform of the signal is taken to move the data into the wavenumber domain. Once in the wavenumber domain, the signal is mapped into the $k_x$, $k_y$, and $k_z$ directions using the dispersion relation described in Eqs. (12)–(14). After that, the signal is spatially demodulated and a matched filter is applied. Next, Stolt interpolation is applied to map the signal onto a regular grid in $k_x$, $k_y$, and $k_z$. Finally, a 3D inverse Fourier transform of the signal is taken to form an image [5,18,44,62]. This process represents an implementation of the RMA which is specific to the imaging system at hand, and can efficiently reconstruct images taken with our single-frequency, metasurface-based SAR imaging system.

4. EXPERIMENTAL SETUP

To experimentally demonstrate single-frequency, 3D synthetic aperture imaging, two dynamic metasurfaces were deployed, one as a transmitter and one as a receiver, as shown in Fig. 4. A detailed description of the specific dynamic metasurface used can be found in [24]; a brief description is provided here. This dynamic metasurface is a microstrip waveguide loaded with 112 metamaterial elements, spaced at 3.33 mm, spanning 40 cm, and designed to operate in the lower K band (17.5–21 GHz); in this work, the frequency used is 19.03 GHz. Each metamaterial element can be tuned on or off independently by biasing two integrated PIN diodes with an Arduino microcontroller and DC control circuitry. As different combinations of elements are turned on or off by the PIN diodes, the aperture generates different spatially diverse radiation patterns. Each aperture is fed from a combination of three feeds using power dividers: two end launch feeds and one coaxial feed. This feed architecture results in four propagating waves, all with equal magnitude [24], as shown in Fig. 5.

Fig. 4. Experimental setup consisting of two metasurface apertures. One is used as a transmitter (labeled Tx) and the other as a receiver (labeled Rx).

Fig. 5. Diagram of the metasurface aperture. The aperture has three feeds, which generate a total of four propagating waves within the microstrip, shown with the blue arrows.

The experiments conducted in this work involve two metasurfaces oriented along the same axis (the y axis as shown in Fig. 1), with edge-to-edge separation of 20 cm. This arrangement leads to an overall aperture size of 100 cm. The imaging system is controlled by a PC which uses Matlab to coordinate the two Arduino microcontrollers (one for each metasurface), the vector network analyzer used as the RF source/detector (Keysight E8364C), and the mechanical stage. Each measurement is acquired by setting the position of the synthetic aperture (by way of a linear stage), applying a new tuning state to the transmit and receive metasurfaces (by way of the microcontrollers), and measuring the signal between the transmitter and receiver. In this manner, the collection of signals as a function of transmit tuning state, receive tuning state, and synthetic aperture position can be obtained.

Measurements taken for this work involved cycling through a number of transmit and receive tuning states at each position on the synthetic aperture. In this case, due to physical constraints, the apertures remained stationary while the target(s) moved, consistent with inverse SAR operation. The synthetic aperture path spanned 40 cm and was sampled at 7.5 mm (λ/2), leading to 54 locations. The number of transmit/receive tuning states ranged from 25 to 65, as described in further detail in Section 5. With 25 tuning states for the transmitter and receiver, a total of 25×25×54 = 33,750 measurements were taken. For each of these tuning states, a random set of 24 elements was radiating simultaneously. This choice ensures that sufficient energy can reach all portions of the metasurface, while radiating efficiently to maintain an adequate SNR [24].
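The acquisition loop just described can be summarized with the sketch below; the driver objects are mock placeholders (the actual system coordinates the Arduinos, the Keysight VNA, and the linear stage from Matlab), so this illustrates the measurement ordering rather than the real control code.

```python
# Sketch of the measurement loop with mock hardware drivers (placeholders for
# the Matlab-coordinated Arduinos, VNA, and linear stage used in the experiment).
import itertools
import numpy as np

class MockStage:
    def move_to(self, z_m): pass                  # position the synthetic aperture

class MockMetasurface:
    def apply_state(self, index): pass            # bias the PIN diodes for one on/off mask

class MockVNA:
    def measure_s21(self, freq_hz): return 0j     # single-frequency transmission reading

stage, tx, rx, vna = MockStage(), MockMetasurface(), MockMetasurface(), MockVNA()

n_positions, n_states = 54, 25                    # aperture steps and tuning states per aperture
positions = np.arange(n_positions) * 7.5e-3       # lambda/2 steps over the 40 cm path

S = np.zeros((n_states, n_states, n_positions), dtype=complex)
for i_z, z in enumerate(positions):
    stage.move_to(z)
    for i_t, i_r in itertools.product(range(n_states), range(n_states)):
        tx.apply_state(i_t)
        rx.apply_state(i_r)
        S[i_t, i_r, i_z] = vna.measure_s21(19.03e9)
# S(T_t, T_r, z_s) is then pre-processed and reconstructed as in Section 3.
```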

5. EXPERIMENTAL RESULTS AND DISCUSSION

Using the imaging configuration described in Section 4, a variety of targets were measured and reconstructed to demonstrate 3D imaging at a single frequency. The first feature of the system experimentally investigated was resolution. The resolution of an imaging system can be determined by a point spread function (PSF) analysis, in which a sub-resolution point scatterer is reconstructed to reveal an imaging system’s resolution and possible aliasing effects [66]. It should be noted that since the operation is based in the near field, the resolution of the system is spatially varying and does not follow traditionally modeled equations [42]. As a result, an expectation for the resolution performance of the system was determined from a simulation of a PSF.

The signal for a simulated PSF was calculated by propagating the fields radiated by the transmitter (as measured by the near-field scan) to the scene, applying the scattering off of a point target, and propagating the scattered fields back to the receive aperture to form a simulated signal. The simulated signal for a point object located at (x = 0.35 m, y = 0 m, z = 0 m) was reconstructed with the range migration algorithm, as described in Section 3. Here, 65 transmit and receive tuning states were used. From the simulated PSF, the expected resolution in x, y, and z was calculated from the full width at half-maximum (FWHM) to be 3.5, 1.6, and 0.8 cm, respectively. Using the same imaging parameters as in simulation, an experimental PSF measurement was also performed, with the resulting resolution (as determined by the FWHM of the reconstruction) in x, y, and z found to be 3.8, 2.7, and 3.0 cm, respectively. Images of the simulated and experimental PSF are shown in Figs. 6 and 7, respectively. It should be noted that the metallic sphere used to experimentally characterize the PSF was 1.75 cm in diameter, which is slightly above the expected resolution in the y and z directions. Here, the object is well resolved in range, cross range, and height, which validates the idea that using a metasurface antenna in a SAR context enables single-frequency 3D imaging.
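For completeness, the FWHM extraction used for these resolution estimates is sketched below on a synthetic Gaussian cut (our own stand-in for a 1D cut through the reconstructed PSF at the target location).

```python
# Minimal sketch (synthetic Gaussian stand-in for a PSF cut) of the FWHM-based
# resolution estimate; in practice this is applied to 1D cuts through the 3D
# reconstruction along x, y, and z at the target location.
import numpy as np

def fwhm(profile, coords):
    """Full width at half-maximum of a 1D magnitude profile."""
    p = np.abs(profile)
    above = coords[p >= 0.5 * p.max()]
    return above[-1] - above[0]

y = np.linspace(-0.1, 0.1, 401)                   # cross-range axis (m)
sigma = 0.016 / 2.355                             # Gaussian with a 1.6 cm FWHM
psf_cut = np.exp(-y**2 / (2 * sigma**2))
print(f"estimated resolution: {fwhm(psf_cut, y) * 100:.2f} cm")
```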

Fig. 6. 3D image reconstruction of simulated measurements of a point scatterer. The target is well-resolved in all three directions with measurements at a single frequency.

Fig. 7. 3D image reconstruction of experimental measurements of a point scatterer. Here, a metallic marble of diameter 1.75 cm is used as the target and is well-resolved in all three directions with measurements at a single frequency.

Having demonstrated single-frequency SAR imaging, we now compare the performance of single-frequency imaging with that of bandwidth imaging. To conduct similar studies with bandwidth, single-frequency measurements are reconstructed as described in Section 3. Then, reconstructions of measurements taken at different frequencies are coherently summed. In this study, introducing multiple frequency points across a bandwidth increases the number of total measurements. Without compensating for the total number of measurements, the bandwidth case could have an unfair advantage since using more measurements can suppress noise. With this consideration in mind, we demonstrate test cases with different numbers of transmit and receive tuning states at each synthetic aperture position. First, we show a single-frequency result which uses 64 transmit states, 64 receive states, and one frequency point (4,096 total measurement modes per synthetic aperture location). Second, we show a bandwidth result which uses 16 transmit states, 16 receive states, and 16 frequency points (4,096 total measurement modes per synthetic aperture location). Third, we show a bandwidth result which uses 64 transmit states, 64 receive states, and 16 frequency points (65,536 total measurement modes per synthetic aperture location). This case is included to provide a direct comparison from single-frequency to bandwidth when the number of measurement modes per frequency is the same. Each of the bandwidth cases uses a frequency sweep from 17.68 to 19.03 GHz, sampled at 90 MHz. This bandwidth is chosen because the resolution in certain directions is determined by the maximum frequency; using the single-frequency operating point as the upper bound of the bandwidth ensures a fair comparison.
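A minimal sketch of this comparison procedure follows; the per-frequency reconstruction is a hypothetical placeholder standing in for the Section 3 processing, and the frequency grid matches the 17.68–19.03 GHz sweep sampled at 90 MHz.

```python
# Minimal sketch (hypothetical per-frequency reconstruction) of the bandwidth
# comparison: each frequency point is reconstructed with the single-frequency
# RMA and the complex 3D images are coherently summed.
import numpy as np

freqs = np.linspace(17.68e9, 19.03e9, 16)         # 16 points, 90 MHz spacing

def reconstruct_single_frequency(f_hz):
    # Placeholder for the Section 3 reconstruction at one frequency point.
    return np.zeros((64, 64, 64), dtype=complex)

image = sum(reconstruct_single_frequency(f) for f in freqs)   # coherent (complex) sum
magnitude = np.abs(image)                          # magnitude image used for display
```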

Each of the described cases was used to reconstruct an experimental PSF. The resulting images are plotted in Figs. 8–10, which show a 3D image and the corresponding cross sections at the target’s location. For these studies, the target used was the same as in Fig. 7. From the cross-section plots in Figs. 8–10, the resolution of the three cases can be extracted by measuring the FWHM in each direction; these results are shown in Table 1. From these results, it can be seen that the resolution of the bandwidth cases remains consistent with the single-frequency case, indicating that operating at a single frequency does not degrade resolving power. We expect that if the standoff distance were increased substantially, the bandwidth case would begin to demonstrate superior resolution, specifically in the x direction. At standoff distances comparable to the length of the aperture, however, the single-frequency imaging system can match the performance of bandwidth systems. The similar resolution stems from the fact that within the near field, the resolution in both range and cross range is dominated by the geometry of the system rather than by the bandwidth [42]. Additionally, Fig. 10 shows suppressed background noise compared to the other cases; this effect can be attributed to a larger number of overall measurement modes.

Fig. 8. Experimental PSF with 64 transmit masks, 64 receive masks, and a single frequency (19.03 GHz). Cross sections are plotted with a linear colormap.

Fig. 9. Experimental PSF with 16 transmit masks, 16 receive masks, and 16 frequency points, spanning 17.68–19.03 GHz. Cross sections are plotted with a linear colormap.

Fig. 10. Experimental PSF with 64 transmit masks, 64 receive masks, and 16 frequency points, spanning 17.68–19.03 GHz. Cross sections are plotted with a linear colormap.

Table 1. Resolution of Single-Frequency and Bandwidth Imaging Cases

Returning to single-frequency imaging demonstrations, a more complex target was also measured, comprising four copper tubes. These tubes were oriented along the y and z directions, as shown in Fig. 11, and had a diameter of approximately 2 cm and length of approximately 10 cm. This target highlights the ability of this imaging system to resolve multiple targets in all three directions; the experimentally reconstructed image is shown in Fig. 11. The number of tuning states and synthetic aperture positions remained consistent with those used for the experimental PSF shown in Fig. 7. The reconstructed image shows the ability of this system to image an object spanning a large extent in y and z.

Fig. 11. Reconstructed 3D image of four copper cylinders with approximate diameter 2 cm and length of 10 cm.

In addition to the two objects just described, a larger, extended object was also imaged. The target in this case was a metallic cylinder with diameter of 15 cm and height of 70 cm. The resulting image, which highlights the ability of this imaging system to image objects which span a larger extent in z, is shown in Fig. 12. Furthermore, when using such a large object, the signal returned is much greater, leading to higher SNR. As a result, this study allowed us to reduce the number of transmit and receive tuning states from 65 to 25 to demonstrate the ability of our imaging system to reconstruct using fewer measurements.

Fig. 12. Reconstructed 3D image of a cylinder of diameter 15 cm and height 70 cm.

Another study performed involved investigating the particular choice of frequency used. For any imaging system performing single-frequency imaging, choosing the optimal frequency is paramount to achieving the best performance. Different factors can affect the optimal frequency choice, most of which are specific to the imaging hardware being used. As a result, the frequency choice study performed here applies specifically to the metasurface antennas used throughout this work, though the concepts and procedure can be readily applied to other imaging hardware. For all single-frequency imaging systems, choosing a higher frequency will result in higher resolution.

The metasurfaces used in this work (as described in Section 4 and detailed in [24]) have two element geometries, resonant at 17.8 and 18.9 GHz, respectively. Each element radiates more strongly close to these frequencies. When the elements radiate strongly, there is a higher contrast ratio between the on/off states, but the strong radiation from each element results in a quick attenuation of the guided wave. When the guided wave is attenuated quickly, there is not sufficient energy left to excite elements far from the feeds, leading to portions of the aperture which do not radiate. With such a nonuniform illumination of the aperture, aliasing can become significant. To ensure that sufficient energy can reach all elements in the aperture, the operational frequency can be chosen to be off-resonance, either above or below the resonant frequencies of the two elements. When choosing such an operating frequency, the guided wave attenuates slower, allowing the full extent of the aperture to radiate. If the frequency choice is too far away from the resonances of the two element designs, however, the contrast between the on and off states becomes small, leading to more correlation among radiation patterns, and the elements radiate too little energy, reducing the efficiency of the overall aperture. These behaviors can be seen by analyzing the power-radiated spectrum shown in [24]. All of these factors (radiation efficiency, illumination uniformity, and contrast ratio) must be balanced by the frequency choice.

To balance these factors, we predicted that best performance would be achieved with a frequency slightly above 18.9 GHz (the higher resonant frequency among the two elements in the metasurface). We then tested this prediction by imaging the target from Fig. 12 at different frequency points. From the results shown in Fig. 13, 19.03 GHz (the frequency used throughout this work) yields the best reconstructed image. Operating at 17.5 GHz results in the elements coupling too strongly to the guided wave: radiation efficiency is high (as evidenced by the low noise in the reconstructed image), but the metasurfaces only radiate from elements close to the feeds. Since the metasurface has nonuniform illumination, spatial aliasing occurs, leading to strong sidelobes in the reconstructed image. Operating too far above the resonance of each element (at 20.02 or 21.01 GHz) produces uniform illumination, which can be seen from the small sidelobes. However, this frequency choice also leads to low radiation efficiency and correlation among radiation patterns, which contribute to the noisy image. When these considerations are properly balanced, at 19.03 GHz, the aperture demonstrates adequate radiation efficiency, uniform aperture illumination, and low correlation among measurement modes, resulting in a high fidelity reconstruction.

Fig. 13. Single-frequency SAR images of a large cylinder at five different frequency points.

To further investigate the behavior of this single-frequency SAR imaging system, an analysis of the number of transmit/receive tuning states was performed. If fewer states can be used on the transmit/receive metasurfaces, the acquisition time can be significantly reduced. In this case, the four copper tubes (shown in Fig. 11) were used as the target, while the number of transmit/receive states was varied from 25 to 65. There are two major consequences of the number of transmit/receive states (and thereby the number of measurements). First, with more measurements, noise in the system is better averaged across the measurement set. Second, a larger set of measurements results in a larger matrix representing the effective dipolar sources that characterize the radiation of a metasurface antenna. As the number of measurements increases, the corresponding pseudo-inverse of this ill-conditioned matrix will yield a better transformation from the measured signal in terms of transmit/receive states to the effective independent dipolar sources [following Eq. (1)]. On top of these effects, more measurements may mean that the system is more likely to span the entire scene’s basis. Selecting a set of tailored transmit/receive states that sample this basis with a reduced set of measurements is the focus of compressive sampling techniques which are discussed in other works (see [67], for example). Our goal here is to demonstrate the viability of the proposed imaging configuration; further analysis and optimization of tuning states is left for future works.
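The second effect noted above can be illustrated numerically: the sketch below (our own, using random complex matrices as stand-ins for the measured field matrices) shows that the pseudo-inverse in Eq. (1) recovers a source-domain signal more faithfully as the number of tuning states grows toward the number of virtual sources.

```python
# Minimal sketch (random stand-ins for the measured field matrices) of why more
# tuning states improve the Eq. (1) inversion: with more states the field matrix
# spans more of the 112-dimensional virtual-source space, so the pseudo-inverse
# recovers a source-domain vector more faithfully.
import numpy as np

rng = np.random.default_rng(1)
n_dipoles = 112
x_true = rng.standard_normal(n_dipoles) + 1j * rng.standard_normal(n_dipoles)

for n_states in (25, 45, 65):
    phi = rng.standard_normal((n_states, n_dipoles)) + 1j * rng.standard_normal((n_states, n_dipoles))
    x_est = np.linalg.pinv(phi) @ (phi @ x_true)   # measure with phi, then invert
    err = np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true)
    print(f"{n_states} tuning states -> relative recovery error {err:.2f}")
```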

The consequences of the number of measurements are highlighted by the results in Fig. 14, which show reconstructions for 25, 45, and 65 tuning states. From these images, it can be seen that using 65 states results in a cleaner and higher-fidelity image as compared to using 45 states. With 25 states, the image is significantly degraded.

Fig. 14. Reconstructions of four metallic cylinders. Each image uses a different number of transmit/receive tuning states, $N_T$, resulting in $N_T^2 N_s$ total measurements ($N_s$ is the number of synthetic aperture positions).

6. CONCLUSION

We have demonstrated that 3D microwave imaging can be conducted with measurements taken at a single frequency using dynamic metasurface antennas. This demonstration reveals new capabilities for existing hardware. Within the imaging paradigm and methodology discussed in this work, one can envision future single-frequency 3D imaging systems using 2D dynamic metasurfaces, sparse multistatic antenna arrays, and multiple unique 1D and/or 2D synthetic apertures. In all of these cases, costly RF components are avoided because spectral bandwidth is not required for these systems to produce 3D images.

In demonstrating single-frequency SAR, we have shown an imaging system that incorporates metasurface hardware with aperture synthesis in order to reconstruct 3D images from backscattered measurements. Metasurfaces stand to offer comparable performance to traditional antennas with a significantly reduced cost, size, and weight. Image reconstruction has been conducted in an efficient manner by adapting the range-migration algorithm to our specific configuration, leading to a system which has high performance in both hardware and software. The results presented here have all been demonstrated without requiring complex RF components, with single-frequency measurements. Ultimately, the high performance of a single-frequency, metasurface SAR imaging system presents a low-cost, accessible hardware platform with 3D imaging capabilities for applications in through-wall imaging, security screening, and biomedical diagnosis.

Funding

Air Force Office of Scientific Research (AFOSR) (FA9550-12-1-0491, FA9550-18-1-0187).

REFERENCES

1. D. M. Sheen, D. L. McMakin, and T. E. Hall, “Three-dimensional millimeter-wave imaging for concealed weapon detection,” IEEE Trans. Microwave Theory Tech. 49, 1581–1592 (2001). [CrossRef]  

2. F. Ahmad, M. G. Amin, and S. A. Kassam, “Synthetic aperture beamformer for imaging through a dielectric wall,” IEEE Trans. Aerosp. Electron. Syst. 41, 271–283 (2005). [CrossRef]  

3. M. Dehmollaian and K. Sarabandi, “Refocusing through building walls using synthetic aperture radar,” IEEE Trans. Geosci. Remote Sens. 46, 1589–1599 (2008). [CrossRef]  

4. Q. Huang, L. Qu, B. Wu, and G. Fang, “UWB through-wall imaging based on compressive sensing,” IEEE Trans. Geosci. Remote Sens. 48, 1408–1415 (2010). [CrossRef]  

5. X. Zhuge and A. Yarovoy, “Three-dimensional near-field MIMO array imaging using range migration techniques,” IEEE Trans. Image Process. 21, 3026–3033 (2012). [CrossRef]  

6. B. Sun, M. P. Edgar, R. Bowman, L. E. Vittert, S. Welsh, A. Bowman, and M. Padgett, “3D computational imaging with single-pixel detectors,” Science 340, 844–847 (2013). [CrossRef]  

7. W. Zhang and A. Hoorfar, “Three-dimensional synthetic aperture radar imaging through multilayered walls,” IEEE Trans. Antennas Propag. 62, 459–462 (2014). [CrossRef]  

8. Y. Alvarez, Y. Rodriguez-Vaqueiro, B. Gonzalez-Valdes, C. Rappaport, F. Las-Heras, and J. Martinez-Lorenzo, “Three-dimensional compressed sensing-based millimeter-wave imaging,” IEEE Trans. Antennas Propag. 63, 5868–5873 (2015). [CrossRef]  

9. T. Fromenteze, X. Liu, M. Boyarsky, J. Gollub, and D. R. Smith, “Phaseless computational imaging with a radiating metasurface,” Opt. Express 24, 16760–16776 (2016). [CrossRef]  

10. M. G. Amin, Through-the-Wall Radar Imaging (CRC Press, 2016).

11. S. Depatla, C. R. Karanam, and Y. Mostofi, “Robotic through-wall imaging: radio-frequency imaging possibilities with unmanned vehicles,” IEEE Antennas Propag. Mag. 59(5), 47–60 (2017). [CrossRef]  

12. M. Soumekh, Synthetic Aperture Radar Signal Processing (Wiley, 1999).

13. E. C. Fear, S. C. Hagness, P. M. Meaney, M. Okoniewski, and M. A. Stuchly, “Enhancing breast tumor detection with near-field imaging,” IEEE Microwave Mag. 3(1), 48–56 (2002). [CrossRef]  

14. R. Eils and C. Athale, “Computational imaging in cell biology,” J. Cell Biol. 161, 477–481 (2003). [CrossRef]  

15. E. J. Bond, X. Li, S. C. Hagness, and B. D. Van Veen, “Microwave imaging via space-time beamforming for early detection of breast cancer,” IEEE Trans. Antennas Propag. 51, 1690–1705 (2003). [CrossRef]  

16. A. W. Doerry, D. F. Dubbert, M. Thompson, and V. D. Gutierrez, “A portfolio of fine resolution Ka-band SAR images: part I,” in Defense and Security (International Society for Optics and Photonics, 2005), pp. 13–24.

17. S. Ochs and W. Pitz, “The TerraSAR-X and Tandem-X satellites,” in 3rd International Conference on Recent Advances in Space Technologies (IEEE, 2007), pp. 294–298.

18. X. Zhuge and A. G. Yarovoy, “A sparse aperture MIMO-SAR-based UWB imaging system for concealed weapon detection,” IEEE Trans. Geosci. Remote Sens. 49, 509–518 (2011). [CrossRef]  

19. T. P. Ager, “An introduction to synthetic aperture radar imaging,” Oceanography 26, 20–33 (2013). [CrossRef]  

20. S. S. Ahmed, Electronic Microwave Imaging with Planar Multistatic Arrays (Verlag, 2014).

21. G. Krieger, “Mimo-SAR: opportunities and pitfalls,” IEEE Trans. Geosci. Remote Sens. 52, 2628–2645 (2014). [CrossRef]  

22. J. Hunt, T. Driscoll, A. Mrozack, G. Lipworth, M. Reynolds, D. Brady, and D. R. Smith, “Metamaterial apertures for computational imaging,” Science 339, 310–313 (2013). [CrossRef]  

23. J. Hunt, J. Gollub, T. Driscoll, G. Lipworth, A. Mrozack, M. S. Reynolds, D. J. Brady, and D. R. Smith, “Metamaterial microwave holographic imaging system,” J. Opt. Soc. Am. A 31, 2109–2119 (2014). [CrossRef]  

24. T. Sleasman, M. Boyarsky, M. F. Imani, J. Gollub, and D. Smith, “Design considerations for a dynamic metamaterial aperture for computational imaging at microwave frequencies,” J. Opt. Soc. Am. B 33, 1098–1111 (2016). [CrossRef]  

25. J. N. Gollub, O. Yurduseven, K. P. Trofatter, D. Arnitz, M. F. Imani, T. Sleasman, M. Boyarsky, A. Rose, A. Pedross-Engel, H. Odabasi, T. Zvolensky, G. Lipworth, D. Brady, D. L. Marks, M. S. Reynolds, and D. R. Smith, “Large metasurface aperture for millimeter wave computational imaging at the human-scale,” Sci. Rep. 7, 42650 (2017). [CrossRef]  

26. A. Pedross-Engel, D. Arnitz, J. N. Gollub, O. Yurduseven, K. P. Trofatter, M. F. Imani, T. Sleasman, M. Boyarsky, X. Fu, D. Marks, D. R. Smith, and M. S. Reynolds, “Orthogonal coded active illumination for millimeter wave, massive-MIMO computational imaging with metasurface antennas,” IEEE Trans. Comput. Imaging, doi: 10.1109/TCI.2018.2808762. [CrossRef]   (to be published).

27. A. V. Diebold, M. F. Imani, T. Sleasman, and D. R. Smith, “Phaseless computational ghost imaging at microwave frequencies using a dynamic metasurface aperture,” Appl. Opt. 57, 2142–2149 (2018). [CrossRef]  

28. K.-W. Yu, Y.-L. Lu, D.-C. Chang, V. Liang, and M. F. Chang, “K-band low-noise amplifiers using 0.18/spl mu/m CMOS technology,” IEEE Microwave Wireless Compon. Lett. 14, 106–108 (2004). [CrossRef]  

29. D. Porcino and W. Hirt, “Ultra-wideband radio technology: potential and challenges ahead,” IEEE Commun. Mag. 41(7), 66–74 (2003). [CrossRef]  

30. H. Odabasi, M. F. Imani, G. Lipworth, J. Gollub, and D. R. Smith, “Investigation of alignment errors on multi-static microwave imaging based on frequency-diverse metamaterial apertures,” Prog. Electromagn. Res. 70, 101–112 (2016). [CrossRef]  

31. O. Yurduseven, J. N. Gollub, K. P. Trofatter, D. L. Marks, A. Rose, and D. R. Smith, “Software calibration of a frequency-diverse, multistatic, computational imaging system,” IEEE Access 4, 2488–2497 (2016). [CrossRef]  

32. M. T. Ghasr, M. A. Abou-Khousa, S. Kharkovsky, R. Zoughi, and D. Pommerenke, “Portable real-time microwave camera at 24 GHz,” IEEE Trans. Antennas Propag. 60, 1114–1125 (2012). [CrossRef]  

33. M. T. Ghasr, S. Kharkovsky, R. Bohnert, B. Hirst, and R. Zoughi, “30 GHz linear high-resolution and rapid millimeter wave imaging system for NDE,” IEEE Trans. Antennas Propag. 61, 4733–4740 (2013). [CrossRef]  

34. T. Fromenteze, M. Boyarsky, J. Gollub, T. Sleasman, M. F. Imani, and D. R. Smith, “Single-frequency near-field MIMO imaging,” in 11th European Conference on Antennas and Propagation (EuCAP), Paris, France (2017), pp. 1415–1418.

35. T. A. Sleasman, M. F. Imani, M. Boyarsky, J. Gollub, and D. R. Smith, “Reconfigurable metasurface apertures for computational imaging,” in Mathematics in Imaging (Optical Society of America, 2017), paper MM2C.4.

36. T. Sleasman, M. F. Imani, J. N. Gollub, and D. R. Smith, “Dynamic metamaterial aperture for microwave imaging,” Appl. Phys. Lett. 107, 204104 (2015). [CrossRef]  

37. M. Boyarsky, T. Sleasman, L. Pulido-Mancera, T. Fromenteze, A. Pedross-Engel, C. M. Watts, M. F. Imani, M. S. Reynolds, and D. R. Smith, “Synthetic aperture radar with dynamic metasurface antennas: a conceptual development,” J. Opt. Soc. Am. A 34, A22–A36 (2017).

38. T. Sleasman, M. Boyarsky, L. Pulido-Mancera, T. Fromenteze, M. F. Imani, M. S. Reynolds, and D. R. Smith, “Experimental synthetic aperture radar with dynamic metasurfaces,” IEEE Trans. Antennas Propag. 65, 6864–6877 (2017).

39. M. Boyarsky, T. Sleasman, L. Pulido-Mancera, M. F. Imani, M. S. Reynolds, and D. R. Smith, “Alternative synthetic aperture radar (SAR) modalities using a 1D dynamic metasurface antenna,” Proc. SPIE 10189, 101890H (2017).

40. M. C. Johnson, S. L. Brunton, N. B. Kundtz, and J. N. Kutz, “Sidelobe canceling for reconfigurable holographic metamaterial antenna,” IEEE Trans. Antennas Propag. 63, 1881–1886 (2015).

41. M. Johnson, “Opening satellite capacity to consumers with metamaterial antennas,” in Proceedings of the International Conference on Metamaterials, Photonic Crystals and Plasmonics (2016).

42. T. Sleasman, M. Boyarsky, M. F. Imani, T. Fromenteze, J. N. Gollub, and D. R. Smith, “Single-frequency microwave imaging with dynamic metasurface apertures,” J. Opt. Soc. Am. B 34, 1713–1726 (2017).

43. C. Cafforio, C. Prati, and F. Rocca, “SAR data focusing using seismic migration techniques,” IEEE Trans. Aerosp. Electron. Syst. 27, 194–207 (1991).

44. L. Pulido-Mancera, T. Fromenteze, T. Sleasman, M. Boyarsky, M. F. Imani, M. Reynolds, and D. Smith, “Application of range migration algorithms to imaging with a dynamic metasurface antenna,” J. Opt. Soc. Am. B 33, 2082–2092 (2016).

45. H. J. Callow, M. P. Hayes, and P. T. Gough, “Wavenumber domain reconstruction of SAR/SAS imagery using single transmitter and multiple-receiver geometry,” Electron. Lett. 38, 336–338 (2002).

46. Y. Qi, W. Tan, Y. Wang, W. Hong, and Y. Wu, “3D bistatic omega-K imaging algorithm for near range microwave imaging systems with bistatic planar scanning geometry,” Prog. Electromagn. Res. 121, 409–431 (2011).

47. R. Zhu, J. Zhou, L. Tang, Y. Kan, and Q. Fu, “Frequency-domain imaging algorithm for single-input-multiple-output array,” IEEE Geosci. Remote Sens. Lett. 13, 1747–1751 (2016).

48. Y. Alvarez, Y. Rodriguez-Vaqueiro, B. Gonzalez-Valdes, S. Mantzavinos, C. M. Rappaport, F. Las-Heras, and J. A. Martinez-Lorenzo, “Fourier-based imaging for multistatic radar systems,” IEEE Trans. Microwave Theory Tech. 62, 1798–1810 (2014).

49. J. M. Lopez-Sanchez and J. Fortuny-Guasch, “3-D radar imaging using range migration techniques,” IEEE Trans. Antennas Propag. 48, 728–737 (2000).

50. A. V. Diebold, L. Pulido-Mancera, T. Sleasman, M. Boyarsky, M. F. Imani, and D. R. Smith, “Generalized range migration algorithm for synthetic aperture radar image reconstruction of metasurface antenna measurements,” J. Opt. Soc. Am. B 34, 2610–2623 (2017).

51. X. Li, S. K. Davis, S. C. Hagness, D. W. Van der Weide, and B. D. Van Veen, “Microwave imaging via space-time beamforming: experimental investigation of tumor detection in multilayer breast phantoms,” IEEE Trans. Microwave Theory Tech. 52, 1856–1865 (2004).

52. B. Gonzalez-Valdes, G. Allan, Y. Rodriguez-Vaqueiro, Y. Alvarez, S. Mantzavinos, M. Nickerson, B. Berkowitz, J. Martinez-Lorenzo, F. Las-Heras, and C. M. Rappaport, “Sparse array optimization using simulated annealing and compressed sensing for near-field millimeter wave imaging,” IEEE Trans. Antennas Propag. 62, 1716–1722 (2014).

53. T. Fromenteze, O. Yurduseven, M. Boyarsky, J. Gollub, D. L. Marks, and D. R. Smith, “Computational polarimetric microwave imaging,” Opt. Express 25, 27488–27505 (2017).

54. S. S. Ahmed, A. Schiessl, and L. P. Schmidt, “A novel fully electronic active real-time imager based on a planar multistatic sparse array,” IEEE Trans. Microwave Theory Tech. 59, 3567–3576 (2011).

55. M. Soumekh, Fourier Array Imaging (Prentice-Hall, 1994).

56. D. L. Marks, J. Gollub, and D. R. Smith, “Spatially resolving antenna arrays using frequency diversity,” J. Opt. Soc. Am. A 33, 899–912 (2016).

57. J. C. Curlander and R. N. McDonough, Synthetic Aperture Radar (Wiley, 1991).

58. T. Sleasman, M. F. Imani, W. Xu, J. Hunt, T. Driscoll, M. S. Reynolds, and D. R. Smith, “Waveguide-fed tunable metamaterial element for dynamic apertures,” IEEE Antennas Wireless Propag. Lett. 15, 606–609 (2016).

59. G. Lipworth, A. Mrozack, J. Hunt, D. L. Marks, T. Driscoll, D. Brady, and D. R. Smith, “Metamaterial apertures for coherent computational imaging on the physical layer,” J. Opt. Soc. Am. A 30, 1603–1612 (2013).

60. G. Lipworth, A. Rose, O. Yurduseven, V. R. Gowda, M. F. Imani, H. Odabasi, P. Trofatter, J. Gollub, and D. R. Smith, “Comprehensive simulation platform for a metamaterial imaging system,” Appl. Opt. 54, 9343–9353 (2015).

61. L. M. Pulido-Mancera, T. Zvolensky, M. F. Imani, P. T. Bowen, M. Valayil, and D. R. Smith, “Discrete dipole approximation applied to highly directive slotted waveguide antennas,” IEEE Antennas Wireless Propag. Lett. 15, 1823–1826 (2016).

62. L. Pulido-Mancera, M. F. Imani, and D. R. Smith, “Discrete dipole approximation for simulation of unusually tapered leaky wave antennas,” in IEEE MTT-S International Microwave Symposium (IMS) (2017), pp. 409–412.

63. L. Pulido-Mancera, P. T. Bowen, M. F. Imani, N. Kundtz, and D. Smith, “Polarizability extraction of complementary metamaterial elements in waveguides for aperture modeling,” Phys. Rev. B 96, 235402 (2017).

64. T. Fromenteze, E. L. Kpré, D. Carsenat, C. Decroze, and T. Sakamoto, “Single-shot compressive multiple-inputs multiple-outputs radar imaging using a two-port passive device,” IEEE Access 4, 1050–1060 (2016).

65. A. D. Yaghjian, “An overview of near-field antenna measurements,” IEEE Trans. Antennas Propag. 34, 30–45 (1986).

66. O. Yurduseven, M. F. Imani, H. Odabasi, J. Gollub, G. Lipworth, A. Rose, and D. R. Smith, “Resolution of the frequency diverse metamaterial aperture imager,” Prog. Electromagn. Res. 150, 97–107 (2015).

67. M. Liang, Y. Li, H. Meng, M. Neifeld, and H. Xin, “Reconfigurable array design to realize principal component analysis (PCA)-based microwave compressive sensing imaging system,” IEEE Antennas Wireless Propag. Lett. 14, 1039–1042 (2015).

Figures (14)

Fig. 1. Conceptual diagram of a single-frequency imaging system using metasurface antennas. One metasurface antenna serves as the transmitter while another receives the backscattered signal. At each synthetic aperture position, measurements are taken by cycling through combinations of spatially diverse radiation patterns on the transmitter and receiver.
Fig. 2. Schematic description of two styles of microwave imaging systems. (a) Monostatic SAR system covering an area in k-space as a function of SAR position and frequency ($B$ is the spectral bandwidth). (b) Multistatic array covering an area in k-space as a function of the transmit and receive aperture size, while using only a single frequency point.
Fig. 3. Diagram of the illumination strategy for a single-frequency, metasurface antenna imaging system. The metasurfaces measure the scene by cycling through combinations of transmit and receive radiation patterns (each aperture is set with a tuning state, $T_n$). This process is repeated at each synthetic aperture position as the metasurfaces are moved together, with the synthetic aperture path oriented along the $z$ direction. Note that the transmitter and receiver do not necessarily apply the same tuning state during a given measurement.
Fig. 4. Experimental setup consisting of two metasurface apertures. One is used as a transmitter (labeled Tx) and the other as a receiver (labeled Rx).
Fig. 5. Diagram of the metasurface aperture. The aperture has three feeds, which together launch a total of four propagating guided waves within the microstrip, shown with the blue arrows.
Fig. 6. 3D image reconstruction of simulated measurements of a point scatterer. The target is well resolved in all three directions with measurements at a single frequency.
Fig. 7. 3D image reconstruction of experimental measurements of a point scatterer. Here, a metallic marble of diameter 1.75 cm is used as the target and is well resolved in all three directions with measurements at a single frequency.
Fig. 8. Experimental PSF with 64 transmit masks, 64 receive masks, and a single frequency (19.03 GHz). Cross sections are plotted with a linear colormap.
Fig. 9. Experimental PSF with 16 transmit masks, 16 receive masks, and 16 frequency points spanning 17.68–19.03 GHz. Cross sections are plotted with a linear colormap.
Fig. 10. Experimental PSF with 64 transmit masks, 64 receive masks, and 16 frequency points spanning 17.68–19.03 GHz. Cross sections are plotted with a linear colormap.
Fig. 11. Reconstructed 3D image of four copper cylinders with an approximate diameter of 2 cm and a length of 10 cm.
Fig. 12. Reconstructed 3D image of a cylinder of diameter 15 cm and height 70 cm.
Fig. 13. Single-frequency SAR images of a large cylinder at five different frequency points.
Fig. 14. Reconstructions of four metallic cylinders. Each image uses a different number of transmit/receive tuning states, $N_T$, resulting in $N_T^2 N_s$ total measurements ($N_s$ is the number of synthetic aperture positions).
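
As Figs. 3, 8, and 14 indicate, the acquisition reduces to a triple loop: at each of $N_s$ synthetic aperture positions, every pairing of $N_T$ transmit and $N_T$ receive tuning states is measured at the single operating frequency, giving $N_T^2 N_s$ measurements in total (with the 64 transmit and 64 receive masks of Fig. 8, each position contributes 64 × 64 = 4096 measurements). The short Python sketch below only illustrates this bookkeeping; the function measure_s21, the random masks, and the numeric values for the element count and number of positions are hypothetical placeholders rather than the experimental hardware interface.

import itertools
import numpy as np

# Bookkeeping sketch of the acquisition in Figs. 3 and 14: at each of N_s
# synthetic aperture positions, cycle through every pairing of N_T transmit
# and N_T receive tuning states at one frequency, for N_T**2 * N_s
# measurements in total. All names and numbers below are placeholders.

N_T = 64               # tuning states (masks) per aperture, as in Fig. 8
N_s = 101              # synthetic aperture positions along z (hypothetical)
f0 = 19.03e9           # single operating frequency in Hz (Fig. 8)
n_elements = 96        # metamaterial elements per aperture (hypothetical)

rng = np.random.default_rng(0)
tx_masks = rng.integers(0, 2, size=(N_T, n_elements))   # placeholder on/off masks
rx_masks = rng.integers(0, 2, size=(N_T, n_elements))

def measure_s21(tx_mask, rx_mask, z_s, freq):
    """Placeholder for a single VNA S21 reading with the given masks applied."""
    return 0.0 + 0.0j

S = np.zeros((N_T, N_T, N_s), dtype=complex)
for s, z_s in enumerate(np.linspace(-0.5, 0.5, N_s)):          # aperture path along z
    for t, r in itertools.product(range(N_T), range(N_T)):
        S[t, r, s] = measure_s21(tx_masks[t], rx_masks[r], z_s, f0)

print(S.size)   # equals N_T**2 * N_s

Fig. 14 compares reconstructions obtained with different values of $N_T$, which sets the trade-off between acquisition time and image quality for a fixed synthetic aperture path.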

Tables (1)

Table 1. Resolution of Single-Frequency and Bandwidth Imaging Cases

Equations (15)


$S(T_t, T_r, z_s) \rightarrow S_T(y_t, y_r, z_s).$
$S_T = \int_V \sigma(x, y, z)\, e^{jkR_t}\, e^{jkR_r}\, \mathrm{d}V,$
$R_t = \sqrt{x^2 + (y - y_t)^2 + (z - z_t)^2}, \qquad R_r = \sqrt{x^2 + (y - y_r)^2 + (z - z_r)^2}.$
$\hat{S}_T = \int_V \sigma \iint_{y_t, z_t} e^{jkR_t}\, e^{jk_{yt}y_t}\, e^{jk_{zt}z_t} \iint_{y_r, z_r} e^{jkR_r}\, e^{jk_{yr}y_r}\, e^{jk_{zr}z_r}\, \mathrm{d}z_r\, \mathrm{d}y_r\, \mathrm{d}z_t\, \mathrm{d}y_t\, \mathrm{d}V.$
$\iint_{y_t, z_t} e^{j(kR_t + k_{yt}y_t + k_{zt}z_t)}\, \mathrm{d}y_t\, \mathrm{d}z_t.$
$y_{t,s} = y \pm \dfrac{k_{yt}\, x}{\sqrt{k^2 - k_{yt}^2 - k_{zt}^2}},$
$z_{t,s} = z \pm \dfrac{k_{zt}\, x}{\sqrt{k^2 - k_{yt}^2 - k_{zt}^2}}.$
$e^{j\left(\sqrt{k^2 - k_{yt}^2 - k_{zt}^2}\, x + k_{yt} y + k_{zt} z\right)}.$
$e^{j\left(\sqrt{k^2 - k_{yr}^2 - k_{zr}^2}\, x + k_{yr} y + k_{zr} z\right)}.$
$\hat{S}_T = \int_V \sigma\, e^{j\left(\sqrt{k^2 - k_{yt}^2 - k_{zt}^2}\, x + k_{yt} y + k_{zt} z\right)}\, e^{j\left(\sqrt{k^2 - k_{yr}^2 - k_{zr}^2}\, x + k_{yr} y + k_{zr} z\right)}\, \mathrm{d}V.$
$\hat{S}_T = \int_V \sigma\, e^{j(k_x x + k_y y + k_z z)}\, \mathrm{d}V,$
$k_x = k_{xt} + k_{xr} = \sqrt{k^2 - k_{yt}^2 - k_{zt}^2} + \sqrt{k^2 - k_{yr}^2 - k_{zr}^2},$
$k_y = k_{yt} + k_{yr},$
$k_z = k_{zt} + k_{zr}.$
$S^{*} = S\, e^{j(k_x x_{c,t} + k_y y_{c,t} + k_z z_{c,t})}, \qquad S^{*} = S\, e^{j(k_x x_{c,r} + k_y y_{c,r} + k_z z_{c,r})}.$
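
Taken together, these relations define a Fourier-domain (range migration) reconstruction: transform the data into the spectral domain of the effective transmit and receive coordinates, map each spectral sample to $(k_x, k_y, k_z)$ through the dispersion relations, resample onto a uniform grid (Stolt interpolation), and apply an inverse Fourier transform to estimate the reflectivity $\sigma$. The NumPy/SciPy sketch below is a minimal illustration of that flow, reduced to a single frequency and one physical cross-range dimension; the array sizes, spacings, and the placeholder data array S_T are assumptions, and the full 3D reconstructions in this work use the generalized range migration algorithm of Refs. [44,50] rather than this simplified form.

import numpy as np
from scipy.interpolate import griddata

# Minimal 2D sketch (cross-range y, range x) of the Fourier-domain
# reconstruction implied by the relations above: FFT the multistatic data
# over the transmit/receive coordinates, map each sample to (k_x, k_y)
# with the single-frequency dispersion relations, resample onto a uniform
# grid (Stolt-like interpolation), and inverse-FFT.
# All sizes, spacings, and the data array S_T are illustrative assumptions.

c = 3e8
f0 = 19.03e9
k = 2 * np.pi * f0 / c

Ny = 64                                    # effective sources per aperture (placeholder)
dy = 0.005                                 # source spacing in meters (placeholder)
S_T = np.zeros((Ny, Ny), dtype=complex)    # S_T(y_t, y_r): substitute measured data here

# 2D FFT over the transmit and receive aperture coordinates
S_k = np.fft.fftshift(np.fft.fft2(S_T))
k_yt = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(Ny, d=dy))
k_yr = k_yt.copy()
KYT, KYR = np.meshgrid(k_yt, k_yr, indexing="ij")

# Dispersion relations restricted to 2D:
#   k_y = k_yt + k_yr,   k_x = sqrt(k^2 - k_yt^2) + sqrt(k^2 - k_yr^2)
valid = (np.abs(KYT) < k) & (np.abs(KYR) < k)      # keep propagating components only
KY = KYT + KYR
KX = np.sqrt(np.clip(k**2 - KYT**2, 0, None)) + np.sqrt(np.clip(k**2 - KYR**2, 0, None))

# Stolt-like resampling onto a uniform (k_x, k_y) grid
kx_u = np.linspace(KX[valid].min(), KX[valid].max(), Ny)
ky_u = np.linspace(KY[valid].min(), KY[valid].max(), Ny)
KXU, KYU = np.meshgrid(kx_u, ky_u, indexing="ij")
S_stolt = griddata(
    (KX[valid], KY[valid]), S_k[valid], (KXU, KYU), method="linear", fill_value=0.0
)

# Inverse FFT yields the (range, cross-range) reflectivity estimate
sigma = np.fft.ifft2(np.fft.ifftshift(S_stolt))
image = np.abs(sigma)

With real data, S_T would be replaced by the processed measurements after the transformation from tuning states to effective sources indicated in the first relation above, and the synthetic aperture dimension along $z$ would enter as an additional transform axis before the final inverse transform.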