
Fluorescence Scheimpflug LiDAR developed for the three-dimensional profiling of plants


Abstract

This work proposes a novel fluorescence Scheimpflug LiDAR (SLiDAR) technique based on the Scheimpflug principle for three-dimensional (3D) plant profile measurements. A 405 nm laser diode was employed as the excitation light source to generate a light sheet. Both the elastic and inelastic/fluorescence signals from a target object (e.g., a plant) can be measured simultaneously by the fluorescence SLiDAR system, which employs a color image sensor with blue, green and red detection channels. The 3D profile is obtained from the elastic signal recorded by the blue pixels through elevation-scanning measurements, while the fluorescence of the target object is acquired mainly by the red and green pixels. The normalized fluorescence intensity of the red channel, related to the chlorophyll distribution of the plant, can be utilized for the classification of leaves, branches and trunks. The promising results demonstrated in this work show the great potential of the fluorescence SLiDAR technique for 3D fluorescence profiling of plants in agriculture and forestry applications.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

The Paris Agreement set a target of “holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C above pre-industrial levels.” In order to achieve this goal, methods such as afforestation/reforestation, avoided deforestation, and biomass energy with carbon capture and storage (BECCS) are widely employed. It has been determined that forest-based mitigation could be more efficient for atmospheric CO2 removal than BECCS [1]. Forests, which contain up to 80% of all aboveground carbon [2], are thus one of the keys to achieving the target set by the Paris Agreement. Accurate estimation of forest biomass provides an important basis for assessing and tracking the current state of forest carbon pools and their rates of change [3]. It is also helpful for predicting how much industrial carbon emission can be offset by forest carbon sequestration in the future. The traditional approach for estimating carbon stocks is to calculate the aboveground biomass according to allometric equations using the mean diameter at breast height (DBH) of the trunk or bole of standing trees, and then multiply it by the number of trees in a forest [4]. Although this method is convenient, its accuracy is limited.

In recent years, light detection and ranging (LiDAR) has become one of the major tools for measuring the three-dimensional (3D) structure of forests [5–9]. Conventional LiDAR is mainly based on the time-of-flight (ToF) principle, with which up to four-dimensional data can be collected, including the 3D coordinates and the intensity of the collected scattered light [10]. There are mainly two approaches for separating tree branches from leaves, based on intensity and on geometric relationships, respectively. Traditionally, the distance-normalized apparent reflectance intensity is utilized for the separation of foliage and woody parts [11]. A full-waveform LiDAR instrument can measure not only the range and the peak intensity, but also the shape of the reflected waveform, which can be utilized for trunk and foliage classification [5,12]. However, the waveform reflected from trees may vary with tree species, distance, surface characteristics, incident angle, etc. [13]. Full-waveform multispectral LiDAR can provide multispectral 3D point clouds by using a supercontinuum laser [14,15]. Livny et al. proposed an automatic algorithm employing a series of global optimizations to consolidate a point cloud into skeletal structures; however, dense crowns remain a great challenge for such 3D reconstruction [16]. Methods based on 3D coordinate points are promising for wood-leaf separation, but require a high-quality point cloud [17]. Fluorescence LiDAR techniques, mainly based on the Newtonian telescope structure, have been used in atmospheric aerosol measurements. When using fluorescence LiDAR for plant detection at short distances, a data acquisition card with a sampling rate higher than 1 GHz has to be used to resolve the fine profile [18]. If the position information as well as the laser-induced fluorescence (LIF) signal, or even multi-channel fluorescence signals, can be measured by the LiDAR technique, more accurate and direct methods can be developed for leaf/branch separation and tree reconstruction.

Since 2015, the Scheimpflug LiDAR (SLiDAR) technique, which employs image sensors as detectors [19], has been developed for atmospheric aerosol sensing [20,21], insect monitoring [22], and underwater detection [23–25]. By employing inexpensive image sensors as detectors and low-cost multimode laser diodes as light sources, the SLiDAR technique features low cost and low maintenance. In this paper, a terrestrial 2D fluorescence SLiDAR, employing a multimode high-power laser diode as the light source and a color complementary metal–oxide–semiconductor (CMOS) image sensor as the detector, has been developed to obtain the 3D structure as well as the fluorescence distribution of plants. Both the elastic and inelastic/fluorescence signals from plants can be measured by the CMOS sensor with blue, green and red detection channels. The 3D profile can be obtained from the elastic signal recorded by the blue pixels through scanning measurements, while the fluorescence intensity of the target object is acquired mainly by the red and green pixels. A fluorescence-based method has been proposed and implemented to evaluate the elastic/fluorescence point clouds for the classification of tree leaves, branches and even fruits.

2. Principles

2.1 Principle of the 2D Scheimpflug LiDAR (SLiDAR)

Unlike the traditional LiDAR technique that employs the arrival time of the backscattering light to obtain the distance information of the measured target, the SLiDAR technique utilizes the corresponding pixel position of the backscattering light on the image sensor to calculate the distance. According to the Scheimpflug principle, when the lens plane, the object plane, and the image plane intersect at the same line, and the angle between the image plane and the object plane β satisfies Eq. (1), the object can be sharply focused on the image plane with an infinite depth-of-focus (DoF), as shown in Fig. 1(a).

$$\beta = \arctan \left( \frac{\sin^{2}\alpha \cdot L}{\sin\alpha \cdot \cos\alpha \cdot L - f} \right)$$
Here α represents the angle between the lens plane and the object plane, L is the distance from the center of the lens to the object plane, and f is the focal length of the receiving lens. The longitudinal distance (s) from the object to the SLiDAR system can be calculated from the pixel position (u) in the image plane according to Eq. (2).
$$s = \frac{u \cdot f \cdot \sin(\beta - \alpha)}{[u \cdot \sin(\beta - \alpha) - f] \cdot \sin\alpha}$$
When an area image sensor is used as the detector, a sector region can be detected, as shown in Fig. 1(b). The position of the object (s, t) is uniquely determined by its image position (u, v) in the image plane. The transverse distance (t), corresponding to the distance from the object to the center of the object plane, can be described by
$$t = -\frac{f \cdot v}{u \cdot \sin(\beta - \alpha) - f}$$
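
For readers who wish to experiment with the geometry, a minimal Python sketch of Eqs. (1)–(3) follows; the function names are ours, angles are in radians, and u and v are taken as metric coordinates on the sensor measured from the principal point, which is an assumption about the pixel-to-coordinate mapping.

```python
import numpy as np

def scheimpflug_beta(alpha, L, f):
    """Eq. (1): angle between the image plane and the object plane."""
    return np.arctan(np.sin(alpha) ** 2 * L /
                     (np.sin(alpha) * np.cos(alpha) * L - f))

def pixel_to_object(u, v, alpha, beta, f):
    """Eqs. (2)-(3): map an image-plane position (u, v), in meters on the
    sensor, to the object-plane coordinates (s, t)."""
    d = u * np.sin(beta - alpha) - f                # shared denominator term
    s = u * f * np.sin(beta - alpha) / (d * np.sin(alpha))
    t = -f * v / d
    return s, t

# Illustrative values only: f and L follow Section 3.1, alpha is hypothetical.
alpha = np.deg2rad(80.0)
beta = scheimpflug_beta(alpha, L=0.1, f=0.018)
s, t = pixel_to_object(u=0.002, v=0.001, alpha=alpha, beta=beta, f=0.018)
```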


Fig. 1. (a) Principle of the 1D SLiDAR: the object plane, the lens plane and the image plane intersect along the same line. L is the distance from the center of the lens to the intersection line, α is the angle between the lens plane and the object plane, β is the angle between the image plane and the object plane, and f is the focal length of the lens. (b) Principle of the 2D SLiDAR. Each column (u-axis) of the area image sensor in the image plane corresponds to a single line in the object plane, as indicated by the lines with arrows. All the lines intersect at one point, which is the projection of the principal point of the lens onto the object plane.


Thus, the position of the target in the object plane can be calculated from pixel positions according to Eqs. (1)–(3). On the other hand, each column of the area image sensor in the image plane corresponds to a single line of sight in the object plane, as indicated by lines with arrows shown in Fig. 1(b). Meanwhile, all lines of sight in the object plane intersect at the projection of the principal point of the lens. This implies that an area image sensor samples a trapezoidal area in the object plane with a spreading angle γ, which is given by

$$\gamma = \arctan \left( \frac{N^{+} \cdot W_{pix}}{u \cdot \sin(\beta - \alpha)} \right) + \arctan \left( \frac{N^{-} \cdot W_{pix}}{u \cdot \sin(\beta - \alpha)} \right)$$
Here Wpix represents the width of a pixel, and N+ and N− represent the numbers of pixels on each side of the central column. When α approaches 90° and N+ equals N− (both denoted as N), Eq. (4) can be simplified to
$$\gamma = 2 \arctan \left( \frac{N \cdot W_{pix}}{f} \right)$$
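
As a quick consistency check, Eq. (5) with the sensor parameters of Section 3.1 reproduces the roughly 52° field of view quoted in Section 3.2. A short sketch (our naming; it assumes the central column splits the 4656-pixel side symmetrically):

```python
import numpy as np

def fov_angle(N, w_pix, f):
    """Eq. (5): full spreading angle for alpha near 90 deg, with N pixels
    on each side of the central column."""
    return 2.0 * np.arctan(N * w_pix / f)

# 4656 pixels of 3.8 um pitch along the v-axis, f = 18 mm
gamma_deg = np.degrees(fov_angle(4656 / 2, 3.8e-6, 18e-3))
print(f"{gamma_deg:.1f} deg")  # ~52.3 deg
```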

2.2 Principle of the 2D fluorescence SLiDAR

Laser-induced fluorescence spectroscopy is a widely used optical sensing technology, which has been applied to, e.g., tea classification [26], estimation of leaf biochemical contents [27,28], chlorophyll detection [29], and pre-symptomatic detection of wheat leaf rust [30]. Short-wavelength lasers are often used to induce fluorescence with high quantum efficiencies. Although various constituents of plants show characteristic fluorescence, the main and most easily observed one is chlorophyll, which is closely related to the photosynthesis process [31]. Chlorophyll absorbs ultraviolet and blue light, and emits fluorescence with two peaks centered at 685 nm and 740 nm, respectively [32].

Modern color image sensors often place an RGB filter array in front of the photodiodes, allowing spectral detection of the incoming light. By utilizing a color image sensor as the detector, a 2D fluorescence SLiDAR system can be realized for detecting range-resolved elastic and inelastic (fluorescence) light from objects. The elastic light is mainly detected by the blue pixels, which have a blue filter placed in front. The fluorescence signals, on the other hand, are mainly acquired by the red and green channels. As the intensity of the elastic light is often much higher than that of the inelastic/fluorescence light, the received elastic light should be suppressed by, e.g., placing a long-pass filter in front of the image sensor.

Figure 2 shows the spectrum of the excitation laser together with the fluorescence signal from a green leaf, indicated by the red shaded area. Strong fluorescence peaks can be observed around 685 nm and 740 nm. According to the detection sensitivities of the red, green, and blue channels shown in Fig. 2, the blue channel mainly responds to the excitation light at 405 nm, while the red channel has a large responsivity for chlorophyll fluorescence. The sensitivity of the green channel, on the other hand, is much lower in both the blue and the red/near-infrared regions. The detected light intensity of each channel is theoretically described by

$$In_{i} = c_{i} \int spec(\lambda) \cdot sens_{i}(\lambda) \, d\lambda$$
Here In represents the detected light intensity, the subscript i denotes the red, green, or blue channel, c is a constant coefficient for each channel, spec is the spectrum of the received light from the measured object (e.g., a leaf or another part of a plant), and sens is the relative sensitivity of each channel.
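
In practice Eq. (6) is evaluated on sampled curves; the sketch below (our naming) approximates the integral with the trapezoidal rule, assuming the spectrum and the channel sensitivity are given on a common wavelength grid:

```python
import numpy as np

def channel_intensity(wl, spec, sens, c=1.0):
    """Eq. (6), discretized: In_i = c_i * integral of spec * sens_i d(lambda).
    wl: ascending wavelength grid; spec, sens: values sampled on wl."""
    y = spec * sens
    return c * float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(wl)))
```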


Fig. 2. Spectra of the excitation and the fluorescence light, together with the relative sensitivities of the RGB channels. The red shaded area indicates the spectrum of the excitation light at 405 nm and the fluorescence of a green leaf.


According to the above discussion, each pixel in the image plane, located at (u, v), corresponds to a single point (s, t) in the object plane. The excitation and fluorescence intensities for the object located at (s, t) are evaluated from the RGB values of the corresponding pixel, which can be used for the classification of leaves and branches. In summary, a data matrix of M×N×3, representing the elastic and fluorescence light intensities for M×N pixels, is obtained for each measurement. A three-dimensional reconstruction of the target object can thus be achieved by performing scanning measurements at different elevation angles with a pan-tilt unit.

3. Apparatus and data processing method

3.1 Apparatus

The schematic of the fluorescence SLiDAR is shown in Fig. 3(a). The system employs a commercial blue laser diode with a central wavelength of 405 nm and a maximum output power of 1 W. A cylindrical lens is mounted to generate a light sheet with a spreading angle of 50° in the s-t plane; the sector angle can be adjusted by replacing the cylindrical lens with one of a different focal length. The backscattered light from the illuminated surface is collected by an imaging lens (Canon, Japan) with a focal length of 18 mm and an aperture of 58 mm. The distance from the center of the lens to the light-sheet plane is set to 100 mm. The angle between the CMOS camera and the lens is 10.2°, calculated according to Eq. (1). The CMOS camera (Panasonic, Japan; 4656×3520 pixels, 3.8 µm pixel size) measures the incoming light with RGB channels. The light intensity of the laser diode is modulated with a square wave to enable measurements under ambient illumination: in each measurement period, the camera captures two images, with the laser diode on and off, respectively, and the background light recorded with the laser diode off is subtracted dynamically.
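
The on/off modulation amounts to a frame-difference background subtraction; a minimal sketch (our naming, assuming two frames of equal exposure taken within one modulation period):

```python
import numpy as np

def subtract_background(frame_on, frame_off):
    """Subtract the laser-off frame from the laser-on frame to remove the
    ambient background; negative values caused by noise are clipped."""
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint16)
```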


Fig. 3. (a) Schematic diagram of the fluorescence SLiDAR; (b) schematic diagram of a 3D profile measurement. Inset: coordinate conversion of each sectional image; Φ is the elevation angle, R is the distance from the rotation axis to the plane of the light sheet, and the t-axis and y-axis are parallel. The 3D position of each sectional image can be calculated by trigonometry.


The fluorescence SLiDAR system is mounted on a pan-tilt unit with a maximum elevation angle of 70°, as shown in Fig. 3(b). For each measurement at an elevation angle Φ, a data matrix of M×N×3 is recorded by a customized LabVIEW-based program. The 3D position (x, y, z) of the measured target object can be obtained by

$$x = s \cdot \cos \Phi - R \cdot \sin \Phi $$
$$y = t$$
$$z = s \cdot \sin \Phi - R + R \cdot \cos \Phi $$
Here R is the distance from the rotation axis of the pan-tilt unit to the plane of the light sheet, as shown in the inset of Fig. 3(b). A 3D profile of the target object can be obtained by transforming the pixel and elevation-angle information into x-y-z positions.
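
Eqs. (7)–(9) transcribe directly into code; the sketch below uses our function name, with Φ in radians:

```python
import numpy as np

def section_to_xyz(s, t, phi, R):
    """Eqs. (7)-(9): rotate one light-sheet section, measured at elevation
    angle phi, into the fixed x-y-z frame; R is the axis-to-sheet distance."""
    x = s * np.cos(phi) - R * np.sin(phi)
    y = t
    z = s * np.sin(phi) - R + R * np.cos(phi)
    return x, y, z
```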

The RGB values of each individual pixel were utilized for spectral analysis. The intensity of the blue channel (InB) was employed to normalize the signals of the red (InR) and green (InG) channels according to Eq. (10), where ch denotes the red or green channel. The normalized fluorescence intensities of the red and green channels, indicating the relative fluorescence efficiency, are less influenced by the excitation light intensity. The normalized fluorescence intensity of the red channel is related to the chlorophyll content, which is abundant in leaves but much lower in tree branches.

$$F_{norm} = In_{ch} / In_{B}$$
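
Per pixel, Eq. (10) is a simple channel ratio; a sketch (our naming, assuming the point cloud stores RGB values in an (..., 3) array in R, G, B order):

```python
import numpy as np

def normalized_fluorescence(rgb):
    """Eq. (10): red- and green-channel intensities normalized by the blue
    (elastic) channel; points with zero blue signal are returned as NaN."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    b = np.where(b > 0, b.astype(float), np.nan)
    return r / b, g / b
```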

3.2 Resolution of the fluorescence SLiDAR

The wider side of the CMOS sensor (4656 pixels) was mounted along the v-axis to achieve a larger field of view (FOV); a spreading angle of 52° is thus reached according to Eq. (5), with a nominal angular resolution of 0.01° along the t-axis. The number of pixels in each column (along the u-axis) determines the measurement range of the fluorescence SLiDAR system. According to the Scheimpflug principle, a detection range from 0.152 m to 7 m is achieved, and the range resolution along the s-axis degrades with the square of the measurement distance: it is 2 mm at 1 m, 18 mm at 3 m, and 52 mm at 5 m. The elevation-angle resolution, which mainly determines the height resolution, is given by the rotation step of the pan-tilt unit, namely 0.28° in our case. The relationships between the spatial resolution and the measurement distance along the s-axis and t-axis are shown in Fig. 4.
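
The quadratic scaling of the range resolution can be verified by differencing Eq. (2) between adjacent pixel columns; a sketch under our naming, where placing u = 0 at the optical axis is an assumption about the sensor geometry:

```python
import numpy as np

def range_resolution(n_pixels, w_pix, alpha, beta, f):
    """Evaluate Eq. (2) at adjacent pixel centers along the u-axis and take
    the finite difference as the per-pixel range resolution."""
    u = (np.arange(n_pixels) + 0.5) * w_pix   # assumed: u = 0 at the optical axis
    d = u * np.sin(beta - alpha) - f
    s = u * f * np.sin(beta - alpha) / (d * np.sin(alpha))
    return s[:-1], np.abs(np.diff(s))
```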


Fig. 4. The resolution at different measurement distances in each dimension. The dark area indicates the blind distance.


4. Results and discussion

4.1 Reconstruction of an herbaceous plant

A common house plant, Epipremnum aureum, as shown in Fig. 5(a), was measured in the laboratory under in-room illumination to demonstrate the feasibility of performing 3D reconstruction and fluorescence detection with the fluorescence SLiDAR system. The plant, with four large leaves, was approximately 0.1 m in width and 0.3 m in height, and was placed 1.1 m away from the system. In a scanning process, 80 sectional images were acquired for 3D reconstruction and spectral analysis. Each acquisition took 200 ms for the two exposures and 40 ms for system elevation, while the time for switching the laser diode on and off was negligible. Thus, the total measurement time was around 20 s.


Fig. 5. (a) Picture of the Epipremnum aureum plant, which was placed 1.1 m away from the SLiDAR system; raw 3D profiles with signals detected from the (b) blue, (c) green, and (d) red channels; (e) and (f) intensity-normalized 3D profiles of the red and green channels, respectively.


The blue channel, which had a high quantum efficiency for the elastic reflection of the blue laser but a low quantum efficiency for the inelastic fluorescence, showed a much stronger intensity than the red or green channels, as shown in Figs. 5(b)-5(d). In the red and green channels, the intensities of most data points were below 100. After normalizing the red and green channels according to Eq. (10), distributions of the normalized fluorescence intensity, indicating the relative fluorescence efficiency, were obtained, as shown in Figs. 5(e) and 5(f). The relative fluorescence efficiency obtained from the red channel qualitatively indicates the chlorophyll distribution of the plant.

4.2 Reconstruction of a woody plant

To evaluate the feasibility of classifying leaves and branches with the proposed fluorescence SLiDAR system, a Pritchardia gaudichaudii was scanned in the laboratory under in-room illumination, as shown in Fig. 6(a). The plant was placed about 1 m away from the SLiDAR system. The three-dimensional structure was built from the blue-channel data points, and the red-channel intensities were normalized to the blue channel. Three dominant colors can be observed in the normalized fluorescence intensity of the red channel shown in Fig. 6(b): dark blue (0.05), light blue (0.15), and yellow (0.4). The trunks, containing less chlorophyll, have very weak fluorescence and can readily be distinguished from stems and leaves, as shown in Figs. 6(c) and 6(d). The stems contain a higher chlorophyll concentration than the trunks, but lower than the leaves; their normalized red-channel fluorescence intensity was around 0.15, and they could thus be distinguished from the leaves, which had much larger values (0.4). Some of the leaves in the middle-top region showed weaker normalized fluorescence intensities, possibly owing to strong specular reflection from the wax layer; in these cases, the penetration efficiency of the excitation light could be lower, leading to a lower fluorescence efficiency and thus a lower normalized fluorescence intensity.
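
The classification described above reduces to thresholding the red-channel normalized fluorescence; a sketch using the cluster values read from Fig. 6, where the 0.3 leaf/stem cut is our assumption placed between the 0.15 and 0.4 clusters:

```python
import numpy as np

def classify_points(f_red):
    """Label each point as trunk, stem, or leaf from the red-channel
    normalized fluorescence (clusters near 0.05, 0.15 and 0.4)."""
    labels = np.full(f_red.shape, "stem", dtype=object)
    labels[f_red < 0.1] = "trunk"    # threshold used in Figs. 6(c)/(d)
    labels[f_red > 0.3] = "leaf"     # assumed cut between stem and leaf clusters
    return labels
```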


Fig. 6. (a) Picture of the Pritchardia gaudichaudii plant; (b) the red-channel intensity-normalized 3D profile, in which the trunks, stems and leaves can be classified according to intensity, indicated by different colors; the 3D profiles of points with a red-channel normalized fluorescence intensity (c) above 0.1 and (d) below 0.1.


4.3 Reconstruction of a field fruit plant

The fluorescence SLiDAR system was also used to measure a grapefruit tree with mature fruits, located 6 m away from the system, as shown in Figs. 7(a) and 7(b). The tree has a height of more than 3 m. Under field conditions, sunlight is usually much stronger than laboratory illumination, which can significantly deteriorate the signal-to-noise ratio (SNR) or even saturate the CMOS sensor. Meanwhile, leaves and branches can be moved back and forth by wind, which introduces a large measurement bias on the fluorescence signals, particularly during daytime measurements. Thus, the outdoor scanning measurement was conducted at nighttime under still atmospheric conditions. Because of the longer measurement distance, the exposure time was set to 200 ms and the gain of the camera was increased to 100. In total, 250 RGB images were recorded to obtain the full profile of the tree. The normalized red-channel fluorescence 3D profile is shown in Fig. 7(c); the reconstructed shape is consistent with the real tree. The normalized fluorescence intensity of the branches was mostly below 0.1, while it was around 0.1 for the fruits. Figure 7(d) shows a zoomed-in view, in which up to seven fruits, colored light blue, can be identified. The normalized fluorescence intensities of the leaves were above 0.15, lower than the values reported in Section 4.2. This could be due to differences in the chlorophyll concentration of the leaves, the dependence of the fluorescence efficiency on the measurement angle, and the decreased range resolution of the fluorescence SLiDAR system at longer distances.


Fig. 7. Pictures of the grapefruit tree, at a horizontal distance of 6 m from the SLiDAR system, during the nighttime measurement (a) and in the daytime (b); (c) the red-channel intensity-normalized 3D profile, in which the branches, leaves and fruits can be classified; (d) a zoomed-in 3D profile showing the fruits colored light blue.


5. Conclusion and outlook

A novel fluorescence SLiDAR technique, based on the Scheimpflug principle and the laser-induced fluorescence technique, has been proposed and implemented for 3D plant profile measurements as well as fluorescence distribution detection. A 405 nm laser diode was employed as the excitation light source to generate a light sheet. Both the elastic and inelastic/fluorescence signals from various plants at horizontal distances from 1 m to 6 m were measured simultaneously by the fluorescence SLiDAR system, which employs a color image sensor with RGB channels. The 3D profile can be obtained from the elastic signal recorded by the blue pixels through an elevation scanning measurement, while the fluorescence intensity of the object is acquired mainly by the red and green pixels. The normalized fluorescence intensity of the red channel enabled the classification of leaves, branches, trunks and even fruits. The results demonstrated in this work show the great potential of the fluorescence SLiDAR technique as a supplement to current ToF-based terrestrial LiDAR techniques, and could contribute to biomass evaluation, fruit yield estimation, photosynthesis monitoring, etc.

The fluorescence spectra could be better resolved by employing an image sensor with specific on-chip filters (e.g., 4 or even 16 bands) in the future. In that case, the red fluorescence, the far-red fluorescence, etc., could be measured independently without interference from other spectral bands. The ratio of the far-red to the red fluorescence intensity is then independent of the illumination intensity, the orientation of the leaves, etc., and can be quantified for a more accurate evaluation of the chlorophyll concentration, as has been done in traditional LIF applications. In addition, on-chip narrowband filters could help suppress the sunlight background, which may provide a much better SNR for outdoor daytime operation. The exposure time can also be reduced by increasing the aperture of the receiving lens, the laser power, etc. Moreover, the optical parameters of the system, i.e., f, L, and α, can be optimized to extend the detection range or improve the range resolution of the fluorescence SLiDAR system.

Funding

Natural Science Foundation of Zhejiang Province (LQ20F050006, LY20D010004); National Natural Science Foundation of China (61705030).

Acknowledgements

The authors want to thank Fengnong Chen and Jingcheng Zhang for providing the facilities for the experiment.

Disclosures

The authors declare no conflicts of interest.

References

1. A. B. Harper, T. Powell, P. M. Cox, J. House, C. Huntingford, T. M. Lenton, S. Sitch, E. Burke, S. E. Chadburn, W. J. Collins, E. Comyn-Platt, V. Daioglou, J. C. Doelman, G. Hayman, E. Robertson, D. van Vuuren, A. Wiltshire, C. P. Webber, A. Bastos, L. Boysen, P. Ciais, N. Devaraju, A. K. Jain, A. Krause, B. Poulter, and S. Shu, “Land-use emissions play a critical role in land-based mitigation for Paris climate targets,” Nat. Commun. 9(1), 2938 (2018).

2. R. K. Dixon, A. M. Solomon, S. Brown, R. A. Houghton, M. C. Trexier, and J. Wisniewski, “Carbon pools and flux of global forest ecosystems,” Science 263(5144), 185–190 (1994).

3. C. Chen, T. Park, X. Wang, S. Piao, B. Xu, R. K. Chaturvedi, R. Fuchs, V. Brovkin, P. Ciais, R. Fensholt, H. Tommervik, G. Bala, Z. Zhu, R. R. Nemani, and R. B. Myneni, “China and India lead in greening of the world through land-use management,” Nat. Sustain. 2(2), 122–129 (2019).

4. T. Hu, Y. Su, B. Xue, J. Liu, X. Zhao, J. Fang, and Q. Guo, “Mapping Global Forest Aboveground Biomass with Spaceborne LiDAR, Optical Imagery, and Forest Inventory Data,” Remote Sens. 8(7), 565 (2016).

5. T. Yao, X. Yang, F. Zhao, Z. Wang, Q. Zhang, D. Jupp, J. Lovell, D. Culvenor, G. Newnham, W. Ni-Meister, C. Schaaf, C. Woodcock, J. Wang, X. Li, and A. Strahler, “Measuring forest structure and biomass in New England forest stands using Echidna ground-based lidar,” Remote Sens. Environ. 115(11), 2965–2974 (2011).

6. F. Fiorani and U. Schurr, “Future scenarios for plant phenotyping,” Annu. Rev. Plant Biol. 64(1), 267–291 (2013).

7. G. Zheng and L. M. Moskal, “Retrieving Leaf Area Index (LAI) Using Remote Sensing: Theories, Methods and Sensors,” Sensors 9(4), 2719–2745 (2009).

8. J. R. Rosell, J. Llorens, R. Sanz, J. Arnó, M. Ribes-Dasi, J. Masip, A. Escolà, F. Camp, F. Solanelles, F. Gràcia, E. Gil, L. Val, S. Planas, and J. Palacín, “Obtaining the three-dimensional structure of tree orchards from remote 2D terrestrial LIDAR scanning,” Agric. For. Meteorol. 149(9), 1505–1515 (2009).

9. J. W. Atkins, G. Bohrer, R. T. Fahey, B. S. Hardiman, T. H. Morin, A. E. L. Stovall, N. Zimmerman, C. M. Gough, and S. Goslee, “Quantifying vegetation and canopy structural complexity from terrestrial LiDAR data using the forestr r package,” Methods Ecol. Evol. 9(10), 2057–2066 (2018).

10. K. Omasa, F. Hosoi, and A. Konishi, “3D lidar imaging for detecting and understanding plant responses and canopy structure,” J. Exp. Bot. 58(4), 881–898 (2006).

11. M. Béland, D. D. Baldocchi, J.-L. Widlowski, R. A. Fournier, and M. M. Verstraete, “On seeing the wood from the leaves and the role of voxel size in determining leaf area distribution of forests with terrestrial LiDAR,” Agric. For. Meteorol. 184, 82–97 (2014).

12. X. Yang, A. H. Strahler, C. B. Schaaf, D. L. B. Jupp, T. Yao, F. Zhao, Z. Wang, D. S. Culvenor, G. J. Newnham, J. L. Lovell, R. O. Dubayah, C. E. Woodcock, and W. Ni-Meister, “Three-dimensional forest reconstruction and structural parameter retrievals using a terrestrial full-waveform lidar instrument (Echidna®),” Remote Sens. Environ. 135, 36–51 (2013).

13. J.-F. Côté, R. A. Fournier, and R. Egli, “An architectural model of trees to estimate forest structural attributes using terrestrial LiDAR,” Environ. Modell. Softw. 26(6), 761–777 (2011).

14. T. Hakala, J. Suomalainen, S. Kaasalainen, and Y. Chen, “Full waveform hyperspectral LiDAR for terrestrial laser scanning,” Opt. Express 20(7), 7119–7127 (2012).

15. S. Kaasalainen and T. Malkamäki, “Potential of active multispectral lidar for detecting low reflectance targets,” Opt. Express 28(2), 1408–1416 (2020).

16. Y. Livny, F. Yan, M. Olson, B. Chen, H. Zhang, and J. El-Sana, “Automatic reconstruction of tree skeletal structures from point clouds,” ACM Trans. Graph. 29(6), 1 (2010).

17. S. L. Tao, Q. H. Guo, S. W. Xu, Y. J. Su, Y. M. Li, and F. F. Wu, “A Geometric Method for Wood-Leaf Separation Using Terrestrial and Simulated Lidar Data,” Photogramm. Eng. Remote Sens. 81(10), 767–776 (2015).

18. G. Zhao, Z. Duan, L. Ming, Y. Li, R. Chen, J. Hu, S. Svanberg, and Y. Han, “Reflectance and fluorescence characterization of maize species using field laboratory measurements and lidar remote sensing,” Appl. Opt. 55(19), 5273–5279 (2016).

19. L. Mei and M. Brydegaard, “Atmospheric aerosol monitoring by an elastic Scheimpflug lidar system,” Opt. Express 23(24), A1613–A1628 (2015).

20. G. Sun, L. Qin, Z. Hou, X. Jing, F. He, F. Tan, and S. Zhang, “Small-scale Scheimpflug lidar for aerosol extinction coefficient and vertical atmospheric transmittance detection,” Opt. Express 26(6), 7423–7436 (2018).

21. Z. Kong, T. Ma, K. Chen, Z. Gong, and L. Mei, “Three-wavelength polarization Scheimpflug lidar system developed for remote sensing of atmospheric aerosols,” Appl. Opt. 58(31), 8612–8621 (2019).

22. S. Zhu, E. Malmqvist, W. Li, S. Jansson, Y. Li, Z. Duan, K. Svanberg, H. Feng, Z. Song, G. Zhao, M. Brydegaard, and S. Svanberg, “Insect abundance over Chinese rice fields in relation to environmental parameters, studied with a polarization-sensitive CW near-IR lidar system,” Appl. Phys. B 123(7), 211 (2017).

23. F. Gao, J. Li, H. Lin, and S. He, “Oil pollution discrimination by an inelastic hyperspectral Scheimpflug lidar system,” Opt. Express 25(21), 25515–27188 (2017).

24. G. Zhao, M. Ljungholm, E. Malmqvist, G. Bianco, L.-A. Hansson, S. Svanberg, and M. Brydegaard, “Inelastic hyperspectral lidar for profiling aquatic ecosystems,” Laser Photonics Rev. 10(5), 807–813 (2016).

25. K. Chen, F. Gao, X. Chen, Q. Huang, and S. He, “Overwater light-sheet Scheimpflug lidar system for an underwater three-dimensional profile bathymetry,” Appl. Opt. 58(27), 7643–7648 (2019).

26. L. Mei, P. Lundin, M. Brydegaard, S. Y. Gong, D. S. Tang, G. Somesfalean, S. L. He, and S. Svanberg, “Tea classification and quality assessment using laser-induced fluorescence and chemometric evaluation,” Appl. Opt. 51(7), 803–811 (2012).

27. W. Li, Z. Niu, G. Sun, S. Gao, and M. Wu, “Deriving backscatter reflective factors from 32-channel full-waveform LiDAR data for the estimation of leaf biochemical contents,” Opt. Express 24(5), 4771–4785 (2016).

28. J. Yang, J. Sun, L. Du, B. Chen, Z. Zhang, S. Shi, and W. Gong, “Effect of fluorescence characteristics and different algorithms on the estimation of leaf nitrogen content based on laser-induced fluorescence lidar in paddy rice,” Opt. Express 25(4), 3743–3755 (2017).

29. W. Wan, D. Hua, J. Le, T. He, Z. Yan, and C. Zhou, “Study of laser-induced chlorophyll fluorescence lifetime measurement and its correction,” Measurement 60, 64–70 (2015).

30. C. Römer, K. Bürling, M. Hunsche, T. Rumpf, G. Noga, and L. Plümer, “Robust fitting of fluorescence spectra for pre-symptomatic wheat leaf rust detection with Support Vector Machines,” Comput. Electron. Agr. 79(2), 180–188 (2011).

31. P. Talamond, J. L. Verdeil, and G. Conejero, “Secondary metabolite localization by autofluorescence in living plant cells,” Molecules 20(3), 5024–5037 (2015).

32. H. Lin, Z. Li, H. Lu, S. Sun, F. Chen, K. Wei, and D. Ming, “Robust Classification of Tea Based on Multi-Channel LED-Induced Fluorescence and a Convolutional Neural Network,” Sensors 19(21), 4687 (2019).
