
High-speed telescope autofocus for UAV detection and tracking

Open Access

Abstract

This paper presents the analysis, implementation and experimental evaluation of a high-speed automatic focus module for a telescope-based UAV detection and tracking system. An existing optical drone detection system, consisting of two telescopes and deep learning-based object detection, is supplemented by suitable linear stages and passive focus algorithms to enable fast automatic focus adjustment. Field tests with the proposed system demonstrate that UAVs flying at speeds of up to 24 m/s towards the system are successfully tracked and kept in focus from more than 4500 m down to 150 m. Furthermore, different search functions and contrast measures are evaluated, and it is shown that the Tenengrad operator combined with the Hill Climbing search function achieves the best performance for focusing on fast-moving small UAVs.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

In recent years, the necessity of optical detection systems capable of detecting and tracking small, fast and agile unmanned aerial vehicles (UAVs) has become evident, as numerous incidents illustrate the growing threat to public safety and infrastructure. A prominent example is the shutdown of London Gatwick Airport for more than a day in 2018 due to a nearby UAV; in 2023 the airport again had to cease operations for an hour due to a suspected UAV [1]. In 2022, UAVs were sighted close to nuclear power plants and government buildings in Sweden [2]. Other incidents involve smuggling across state borders and into prisons [3,4] or the disruption of sporting events [5]. Early detection of UAVs illegally intruding into safety-critical areas is paramount to preparing an appropriate response to a possibly hazardous situation.

UAV detection is generally performed in a multispectral manner, combining various sensor technologies, such as RADAR [6], radio frequency [7], acoustics [8] and electro-optics [9], into a holistic system [10,11]. An essential component of such a system is the electro-optical sensor, as it allows the most conclusive situational assessment through visual images. These optical sensors have a narrow field of view (FoV) and use mounts for pan and tilt movement [12]. Detection of UAVs is facilitated via computer vision algorithms, with convolutional neural networks such as YOLO [13], FRCNN [14] or RetinaNet [15] outperforming conventional methods [16]. To achieve long detection ranges for small UAVs, the latest research incorporates telescopes, which allow larger apertures and thus better resolution, extending the operational range to more than 4 km [9]. The use of optics with long focal lengths and large apertures results in a narrow depth of field (DoF), which increases the demand for a fast automatic focus, especially when tracking high-speed UAVs.

Automatic focusing methods fall into two categories: active and passive. Active focus methods measure the distance to the object through time-of-flight or ultrasonic sensors and adjust the focus accordingly [17,18]. The most common passive focus technique is phase detection, which divides the incoming light rays into pairs of beams; by comparing them, the focus direction and the distance to the best focus position are determined [19]. Although this method is fast and does not require much computational power, additional sensors and optical elements as well as precise alignment are necessary.

Passive contrast-based methods find the optimal focus position by comparing the contrast values of multiple images after shifting the camera sensor with respect to the optics in the axial direction [20] or, in the case of telescope systems, the mirrors with respect to each other [21]. Contrast functions can be divided into five major groups: gradient-, Laplace-, wavelet-, statistics- and discrete cosine transform-based methods [22]. A well-known example of the gradient-based group is the Tenengrad operator [23], which has shown promising results when applied in telescope systems focusing on terrestrial objects such as buildings [24]. Likewise, the normalized variance [25], also a gradient-based approach, achieves good results for astronomical observations [26]. To focus on very small image regions, as necessary for the detection and tracking of small UAVs, the variance of Laplacian [27] and the discrete wavelet transform operator have proven to be good choices [28]. A variety of algorithms exist that try to maximize the contrast value and thus the sharpness of the image, such as the Hill Climb [29], Fibonacci and Binary algorithms [20,30], curve fitting [31] or Gaussian fitting [32].

Passive autofocus with contrast detection is frequently used in telescope-based imaging systems. However, the main applications concentrate on focusing on large-scale objects such as buildings or on astronomical observations, which do not require fast focusing but rather high precision [24,26]. Passive high-speed autofocus methods for long focal lengths have been implemented for time delay integration (TDI) cameras, which take continuous measurements at 250 Hz and estimate the maximum focus position through polynomials [33]. For the use case of telescope-based UAV detection and tracking, challenges arise from the movement of the UAV itself during the focusing phase, which introduces noise into the passive focus measurement. Furthermore, high UAV velocities require a fast focusing speed to keep the UAV in focus.

The contribution of this paper is the analysis, implementation and evaluation of a fast passive autofocus module for a telescope-based UAV detection and tracking system. A high-speed linear stage is used to reposition the camera with respect to the telescope in accordance with the focus algorithm output. The article is organized as follows. Section 2 presents the analysis of the requirements for the autofocus module. In Section 3 the implemented hardware and software system is described in detail. The experiments and results are shown in Section 4, followed by a conclusion in Section 5.

2. System analysis

To derive the hardware specifications for the automatic focus, the requirements of the telescope-based UAV detection and tracking system are considered. The operational range of the system is given as 150 m to 5000 m, and it shall be possible to track, for example, a DJI Mavic 3, which has a diameter of 0.3 m and a maximum speed of 21 m/s. For reliable long-distance detection of UAVs, at least 15 × 15 pixels covering the object are required [9]. Therefore, the focal length of the optical system should range from approximately 400 mm for closer distances to 2500 mm to resolve objects at long distances.

2.1 Travel range

To enable focusing over the entire operational range, either the camera has to be moved with respect to the telescope or the telescope mirrors have to be repositioned with respect to each other. For the analysis it is assumed that the camera is moved. To calculate the travel range required for focusing, the thin lens model [34] is used, where the image distance $I$ is determined by

$$I = \frac{1}{\frac{1}{f} - \frac{1}{x}},$$
with $f$ being the focal length of the telescope and $x$ the distance to the object. The necessary focus travel range is calculated to be 1 mm for a focal length of 400 mm and 41.3 mm for a focal length of 2500 mm.
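As a quick plausibility check, these values can be reproduced directly from Eq. (1). The following minimal Python sketch (an illustration, not part of the original publication) evaluates the thin lens model at the limits of the specified operational range:

```python
# Thin lens model, Eq. (1): image distance for focal length f and
# object distance x (all values in metres).
def image_distance(f, x):
    return 1.0 / (1.0 / f - 1.0 / x)

# Required stage travel: difference of the image distances at the near
# and far limits of the operational range (150 m to 5000 m).
for f in (0.4, 2.5):
    travel = image_distance(f, 150.0) - image_distance(f, 5000.0)
    print(f"f = {f} m: travel = {travel * 1e3:.1f} mm")
# -> approximately 1 mm for f = 0.4 m and 41 mm for f = 2.5 m
```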

2.2 Focus velocity

To keep fast UAVs in focus, a high focus velocity is crucial. Using Eq. (1) and assuming the challenging case of a DJI Mavic 3 flying at 21 m/s towards the telescope system at the closest specified distance of 150 m, the required minimum focus velocity is 0.15 mm/s for an $f =$ 0.4 m telescope and 6 mm/s for an $f =$ 2.5 m telescope.
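These numbers follow from differentiating Eq. (1) with respect to the object distance, which yields $\dot{I} = (I/x)^{2}\,\dot{x}$. The sketch below (an illustration under these assumptions, not the authors' code) evaluates this bound at the stated worst case:

```python
def image_distance(f, x):
    return 1.0 / (1.0 / f - 1.0 / x)

# dI/dt = (I/x)^2 * v, obtained by differentiating Eq. (1); the worst case
# is the closest specified distance (150 m) at the UAV's top speed (21 m/s).
def required_focus_velocity(f, x, v):
    return (image_distance(f, x) / x) ** 2 * v

for f in (0.4, 2.5):
    vel = required_focus_velocity(f, 150.0, 21.0)
    print(f"f = {f} m: {vel * 1e3:.2f} mm/s")
# -> approximately 0.15 mm/s and 6 mm/s, matching Section 2.2
```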

2.3 Focus accuracy

The required focus accuracy is deduced from the DoF, given by

$$DoF = \frac{2 x D f c (x - f)}{D^2 f^2 - c^2(x-f)^2},$$
where $D$ is the aperture, $f$ is the focal length, $c$ is the circle of confusion and $x$ is the distance to the object [35]. Following guidelines of photography, $c$ is determined by $d/1500$, with $d$ being the camera sensor diagonal [36]. Figure 1 shows the DoF for the two focal lengths and an aperture of 0.3 m; assuming a 1 inch camera sensor with a diagonal of 15.86 mm, the resulting $c$ is 0.01 mm. The minimum required focus accuracy is calculated using the DoF at the closest object distance of 150 m. For a focal length of 0.4 m the DoF is 4 m, which translates to a necessary focus accuracy of 28 $\mu$m. Within these 4 m, the optical system captures an image that appears sharp to a human observer [36]. For the longer focal length of 2500 mm the minimum accuracy is 285 $\mu$m.
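A hedged sketch of this calculation is given below: it evaluates Eq. (2) with the stated parameters and maps the object-side DoF at 150 m to an image-side stage tolerance via Eq. (1). The mapping step is an assumption, as the paper does not spell it out; it reproduces the 4 m DoF and the roughly 28 $\mu$m accuracy for the short focal length, while the exact figure for the long focal length depends on the assumed aperture and sensor parameters.

```python
# Eq. (2): depth of field for aperture D, focal length f,
# circle of confusion c and object distance x (metres).
def depth_of_field(D, f, c, x):
    return (2 * x * D * f * c * (x - f)) / (D**2 * f**2 - c**2 * (x - f) ** 2)

def image_distance(f, x):
    return 1.0 / (1.0 / f - 1.0 / x)

D, x = 0.3, 150.0
c = 15.86e-3 / 1500            # c = d/1500 for a 1 inch sensor -> ~0.01 mm
for f in (0.4, 2.5):
    dof = depth_of_field(D, f, c, x)
    # assumed mapping: image-side shift spanned by the object-side DoF
    acc = image_distance(f, x - dof / 2) - image_distance(f, x + dof / 2)
    print(f"f = {f} m: DoF = {dof:.1f} m, accuracy = {acc * 1e6:.0f} um")
```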

Fig. 1. DoF over distance to the observed object for the given focal lengths of the optical system, assuming a 1 inch camera sensor.

3. System implementation

The implemented system is depicted in Fig. 2 and consists of two telescopes: an f/10 Meade Schmidt-Cassegrain telescope (LX200-ACF, Meade Acquisition Corp., Watsonville, CA, USA) with a focal length of 2.54 m, and an ASA UWF300 telescope (ASA Astrosysteme GmbH, Neumarkt, Austria) with a focal length of 0.39 m and an aperture of 0.3 m. The telescopes are attached to a DDM100 mount (ASA Astrosysteme GmbH) to enable pan and tilt motion. To capture images, the Meade telescope is equipped with a Moment scientific CMOS camera (Teledyne Photometrics, USA) with a sensor diagonal of 17.5 mm, and an ASI 385 MC-Cool camera (ZWO Company, Suzhou, China) with a sensor diagonal of 8.4 mm is attached to the UWF300 telescope.

Fig. 2. The implemented dual-telescope system with the custom autofocus module, consisting of a linear stage and an ASI camera for the ASA UWF300 telescope and an additional stage for the Meade telescope, which adjusts the focus for the Moment camera.

Following the analysis presented in Section 2, two linear stages of type LSM050B-E03T4 (Zaber Technologies Inc., Vancouver, Canada), together with the corresponding X-MCB1 controllers, are used to reposition the cameras with respect to the telescopes. Each stage offers a travel range of 50.8 mm and a travel speed of 104 mm/s while maintaining an accuracy of 3.5 $\mu$m. Both cameras are mounted to the stages using custom-made adapters. Implementing the passive focus by repositioning the cameras, rather than the telescope mirrors, is chosen because the latter approach is too slow for the intended application.

3.1 Software architecture

The system architecture is depicted in Fig. 3, in which the software components, namely the camera, the object detector and the autofocus, run in parallel [37]. The camera captures images that are provided to the deep learning object detector, developed and trained to detect UAVs in [37], and to the autofocus component. For object detection, a fine-tuned FRCNN object detector is used, which is trained on a custom UAV dataset [37]. The autofocus component issues position commands to the stage driver, which controls the linear stage to position the camera with respect to the telescope.
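A hypothetical sketch of such a parallel structure is shown below; the authors' implementation is described in [37], so the queue-based hand-off and the callables `grab_frame`, `detect` and `optimize` are purely illustrative assumptions standing in for the camera API, the FRCNN detector and the focus search:

```python
import queue
import threading

# One bounded queue per hand-off; a full queue drops the stale item so each
# consumer always works on the newest data (favoring latency over completeness).
frames = queue.Queue(maxsize=1)   # camera -> detector
boxes = queue.Queue(maxsize=1)    # detector -> autofocus

def publish_latest(q, item):
    if q.full():
        q.get_nowait()            # discard the outdated entry
    q.put(item)

def camera_worker(grab_frame):
    while True:
        publish_latest(frames, grab_frame())

def detector_worker(detect):
    while True:
        box = detect(frames.get())        # inference on the newest frame
        if box is not None:
            publish_latest(boxes, box)

def autofocus_worker(optimize):
    while True:
        optimize(boxes.get())             # issues stage position commands

# The workers would be started as daemon threads, e.g.:
#   threading.Thread(target=camera_worker, args=(grab_frame,), daemon=True).start()
```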

Fig. 3. Overview of the system architecture. The camera provides images to the object detector and the autofocus algorithm, which sends commands to the stage driver. The stage driver controls the linear stage, which physically moves the camera with respect to the optics to set the focus.

The automatic focus consists of two phases. First, the operational range is scanned in the $z$ direction in search of a UAV. Once a UAV is detected by the object detection algorithm and a bounding box indicates the location of the UAV within an image, the focus optimization phase starts. During the optimization phase, the search algorithm tries to maximize the contrast value within the bounding box by moving the camera with respect to the telescope and comparing the contrast of multiple images. For the calculation of the contrast, three functions are implemented and evaluated, namely the Tenengrad operator [23]

$$FV_{Tenengrad} = \sum_{n}^{N} \sum_{m}^{M} \nabla_{x} I(n,m)^{2} + \nabla_{y} I(n,m)^{2},$$
the variance of Laplacian [27]
$$FV_{VarLap} = \frac{\sum_{n}^{N} \sum_{m}^{M} ( \Delta I(n,m) - \overline{ \Delta I})^{2}}{\overline{ \Delta I}},$$
and the normalized variance [25]
$$FV_{NVar} = \frac{\frac{1}{N M}\sum_{n}^{N} \sum_{m}^{M} ( I(n,m) - \overline{I})^{2}}{\overline{I}},$$
where $I(n,m)$ is the image intensity of the pixel at position $(n,m)$, $N$ and $M$ are the dimensions of the cropped image and $\overline{I}$ is the average image intensity. Three search algorithms, responsible for maximizing the contrast value, are selected and evaluated. The Hill Climb algorithm [29] moves the camera stepwise in one direction until the contrast value starts to decrease, and then moves it in the other direction with a smaller step size. The Binary and Fibonacci algorithms [30] divide the search space in a binary and Fibonacci pattern, respectively. The focus optimization is triggered periodically every 3 s to keep the UAV in focus and ensure a continuous track.
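For illustration, the three measures of Eqs. (3) to (5) can be written compactly with OpenCV and NumPy. This is a sketch, not the authors' code; in particular, the Sobel and Laplacian kernels are assumed as discretizations of $\nabla$ and $\Delta$, which the equations leave open:

```python
import cv2
import numpy as np

def tenengrad(img):
    """Eq. (3): sum of squared gradient magnitudes over the cropped image."""
    gx = cv2.Sobel(img, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(img, cv2.CV_64F, 0, 1)
    return float(np.sum(gx**2 + gy**2))

def variance_of_laplacian(img):
    """Eq. (4) as printed: squared deviation of the Laplacian from its mean,
    normalized by the mean of the Laplacian."""
    lap = cv2.Laplacian(img, cv2.CV_64F)
    return float(np.sum((lap - lap.mean()) ** 2) / lap.mean())

def normalized_variance(img):
    """Eq. (5): intensity variance normalized by the mean intensity."""
    img = np.asarray(img, dtype=np.float64)
    return float(img.var() / img.mean())

# img is the grayscale crop defined by the detector's bounding box, e.g.:
#   crop = frame[y0:y1, x0:x1]; fv = tenengrad(crop)
```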

4. Experiments and results

To evaluate the system, various field tests are performed with UAVs to analyse the contrast functions, the search algorithms and the required speed of the autofocus.

4.1 Contrast functions

To evaluate the contrast functions, a scan with the linear stage is performed while filming a drone, as seen in Fig. 4. Then, the contrast functions from Eqs. (3) to (5) are applied to the images to obtain a contrast value per stage position and contrast function. The results of each function are normalized to 1, as seen in Fig. 5.
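The scan itself reduces to stepping the stage and recording one contrast value per position. A hypothetical sketch follows, where `move_to`, `grab_frame` and `contrast_fn` are stand-ins for the stage driver, the camera API and one of Eqs. (3) to (5):

```python
import numpy as np

def focus_scan(positions, move_to, grab_frame, contrast_fn):
    """Step the linear stage through 'positions' and record one contrast
    value per position, normalized to 1 as in Fig. 5."""
    values = []
    for p in positions:
        move_to(p)                       # reposition camera w.r.t. telescope
        values.append(contrast_fn(grab_frame()))
    values = np.asarray(values, dtype=float)
    return values / values.max()         # per-function normalization
```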

Fig. 4. A sequence of nine selected images from a linear stage scan used to obtain the contrast measurement, shown for the evaluation in Table 1. The white numbers within the images indicate the current stage position.

Fig. 5. Contrast values calculated using the three different contrast measures for a linear stage scan containing a single UAV in front of a clear sky.

The metrics used for the analysis are the sensitivity, the mean number of local maxima and the position error [26]. It is desirable to minimize the number of local maxima across the curve, as this reduces the probability that the search algorithm stops focusing at a wrong position. Ideally, only one global maximum is present, which corresponds to the object to be tracked. The sensitivity of the curve is defined as the ratio of the maximum value to the mean of the contrast values that are smaller than the mean of the contrast curve itself. A high sensitivity therefore manifests itself as a prominent peak. Finally, the position error is the offset between the stage position of the maximum of the measured contrast curve and the manually selected best focus position.
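Under these definitions, the metrics can be computed directly from a sampled contrast curve. The following sketch is an interpretation of the textual definitions, not the authors' evaluation code:

```python
import numpy as np
from scipy.signal import argrelextrema

def curve_metrics(positions, values, best_focus_position):
    """Metrics of Section 4.1 for a contrast curve sampled at stage positions."""
    positions = np.asarray(positions, dtype=float)
    values = np.asarray(values, dtype=float)
    # number of local maxima across the curve
    n_maxima = len(argrelextrema(values, np.greater)[0])
    # sensitivity: peak value over the mean of the below-mean samples
    below = values[values < values.mean()]
    sensitivity = values.max() / below.mean()
    # position error w.r.t. the manually selected best focus position
    position_error = abs(positions[np.argmax(values)] - best_focus_position)
    return n_maxima, sensitivity, position_error
```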

Table 1 shows the mean of 8 linear stage scan measurements performed with each telescope. For both telescopes, the normalized variance shows the worst performance in terms of all three evaluation metrics. The Tenengrad operator and the variance of Laplacian achieve similar results in terms of position error; however, Tenengrad shows a higher sensitivity and a lower number of maxima and is therefore found to be the best-performing contrast function for the task of focusing on small objects such as UAVs. Furthermore, the position error is below the required accuracy determined using Eq. (2) in Section 2.

Table 1. Results of the evaluation of the contrast functions for the UWF and the Meade telescope after each contrast function is normalized to 1. The data shows the mean of 8 measurements per telescope. The best results are marked in bold.

4.2 Search algorithms

Another crucial aspect of a reliable contrast-based automatic focus is a deterministic optimization function to maximize the contrast value and thus find the best focus position. Fast-moving UAVs, paired with the large apertures and narrow DoF of the optical detection system, require quick termination of the optimization algorithm.
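For reference, a minimal version of the Hill Climbing search described in Section 3.1 is sketched below; the step sizes and the `move_to` and `contrast_at` callables (stage driver and, e.g., Tenengrad measurement) are illustrative assumptions:

```python
def hill_climb(position, step, min_step, move_to, contrast_at):
    """Climb towards the contrast maximum: advance while the contrast rises,
    reverse with a halved step on a decrease, and stop below min_step."""
    move_to(position)
    best = contrast_at()
    direction = 1
    while step >= min_step:
        candidate = position + direction * step
        move_to(candidate)
        value = contrast_at()
        if value > best:
            best, position = value, candidate   # keep climbing
        else:
            direction = -direction              # passed the peak: turn around
            step /= 2                           # and refine the step size
    return position                             # best focus position found
```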

As evaluation metrics, the number of linear stage steps, the total distance travelled, the elapsed time from start to end of the optimization, and the accuracy, defined as the offset from the manually set best focus position, are selected. For the experiments, the UAV maintains a stationary position and the manually determined optimal focus position is noted. Then, the image is defocused by offsetting the linear stage from the optimal focus position by up to 0.5 mm for the UWF and 5 mm for the Meade telescope. Next, the object detector is started, which triggers the search algorithm and thus the contrast maximization for the detected bounding box. During the evaluation of the search algorithms, the Tenengrad operator is used to obtain the contrast value.

Table 2 shows the results of the optimization functions for both telescopes, where each value represents the mean of approximately 450 tests and the standard deviation is given in brackets. For both telescopes, the Hill Climbing optimization algorithm scores the best or close-to-best results according to all four metrics.

Table 2. Results of approximately 450 tests of the search algorithms per telescope for the UWF and the Meade telescope. The values represent the mean, with the standard deviation shown in brackets. The best results are marked in bold.

4.3 Speed and DoF

Finally, the speed and the apparent deep learning DoF (DL-DoF) of the system are experimentally evaluated. The DoF calculated for the UWF telescope at a distance of 150 m is 4 m, which represents the depth over which the image is completely sharp. However, UAVs in slightly defocused images can still be detected using deep learning algorithms [9]. To experimentally evaluate the DL-DoF of the deep learning algorithm, a UAV maintains a certain distance to the system and the focus is set manually to obtain a sharp image. Then, without adjusting the focus, the UAV flies towards or away from the system, slowly going out of focus, until the deep learning algorithm fails to detect it, as seen in Fig. 6. Using this method, the DL-DoF of the UWF telescope for the most challenging case, at a close distance of 120 m, is determined to be approximately 75 m. The distances are obtained by extracting the GPS position from the UAV remote controller. Within this DL-DoF the UAV is considerably defocused, yet still detectable by the deep learning algorithm. Therefore, the requirements on the stage accuracy can be relaxed. As the measured DL-DoF represents the maximum DoF before the detector fails, a DL-DoF of 15 m is assumed for the calculation of the more tolerant accuracy to ensure robust detection of objects. Referring back to Eq. (2), the more tolerant circle of confusion for object detection is calculated to be 0.06 mm, resulting in a necessary positioning accuracy of 0.16 mm for the UWF and 1.3 mm for the Meade telescope. These relaxed constraints explain the relatively large standard deviations of the accuracy measured in Table 2.
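The arithmetic behind the relaxed tolerance can be reconstructed as follows; this is a hedged reading, since the paper does not show the intermediate steps. Eq. (2) is solved for $c$ at the 120 m test distance, neglecting the small $c^{2}$ term, and the stage tolerance is then approximated by the image-side depth of focus $2cN$, with $N$ the working f-number of each telescope:

```python
# Solve Eq. (2) for c (dropping the c^2 term) at the DL-DoF test geometry.
D, f_uwf, x, dl_dof = 0.3, 0.39, 120.0, 15.0
c = dl_dof * D * f_uwf / (2 * x * (x - f_uwf))      # -> ~0.06 mm

# Assumed tolerance model: image-side depth of focus 2*c*N.
for name, N in (("UWF", 0.39 / 0.3), ("Meade", 10.0)):   # f/1.3 and f/10
    print(f"{name}: accuracy ~ {2 * c * N * 1e3:.2f} mm")
# -> ~0.16 mm for the UWF and ~1.2 mm for the Meade telescope,
#    close to the values quoted in the text
```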

Fig. 6. Image sequence showing the DL-DoF test for the object detection algorithm. The initial focus is set manually, as depicted in the two left images showing a distance to the UAV of 120 m. Then, the UAV flies towards (a) or away from (b) the system, while the focus is not adjusted. To obtain the DL-DoF, the distance is recorded when the object detector fails, as seen in the rightmost images.

To evaluate the speed required of the focus module to keep fast UAVs in focus, experiments are performed with quad-copter and fixed-wing UAVs. The DJI Mini 2, with a maximum speed of 16 m/s and a width of 289 mm, and the DJI Mavic 3 are used as quad-copters. The utilized fixed-wing drone achieves a maximum speed of 41 m/s with a wingspan of 5 m. Figure 7 shows two example flight trajectories, together with example images in Fig. 8, where a UAV is tracked continuously and the automatic focus ensures that the UAV stays in focus. During the tests, the Tenengrad operator together with the Hill Climbing search algorithm is used, and the focus optimization is triggered every 3 s. The maximum speed of a UAV flying towards the telescope system is recorded at 24 m/s, resulting in a maximum stage velocity of 12 mm/s for the Meade telescope, as shown in Fig. 7(a). The necessary speed of the linear stage is higher than the speed calculated in Section 2, as the contrast-based method requires multiple measurements at different stage positions to find the best focus. Also, the UAV speeds during the tests are slightly higher than the values assumed in the system analysis. Figure 7(a) confirms the travel range calculated for the linear stage of the Meade telescope in Section 2.

Fig. 7. Example data of successful continuous tracking and automatic focusing on UAVs flying at speeds of up to 24 m/s towards the telescope system. The black dashed line shows the thin lens model, which matches the actual stage position well. In Fig. 7(a) the nonlinear dependency between the UAV distance and the linear stage position, as defined by Eq. (1), is visible. The apparent noise of the stage position around the thin lens model stems from the optimization by the Hill Climb algorithm, which is triggered periodically to keep the UAV in focus as it moves.

Fig. 8. Example image sequence depicting a UAV flying at up to 15 m/s towards the telescope system, while the automatic focus keeps the UAV in focus.

In summary, by periodically running the automatic focus, continuous detection and tracking of UAVs flying at speeds of up to 24 m/s towards the telescope system is possible from more than 4500 m down to 150 m, as the focus module ensures a sharp image.

5. Conclusion

An automatic focus module for a telescope system is presented, which enables it to keep small, fast-moving UAVs in focus. Following the analysis and the integration of suitable linear stages, it is experimentally validated that the Tenengrad contrast measure and the Hill Climb search algorithm show the best performance for the task of focusing quickly and precisely on small UAVs within a telescope image. Furthermore, it is demonstrated that the designed focus module keeps a UAV flying at a speed of 24 m/s in focus at a worst-case distance of 150 m when using the f/10 Meade telescope with a focal length of 2540 mm, resulting in a stage velocity of up to 12 mm/s. This enables detection and continuous tracking of UAVs flying at high speeds towards the telescope system, as the focus module ensures that the UAV remains in focus. Future work will centre on an efficient search pattern, which includes the pan and tilt motion of the mount and the focus of the camera, to reduce the time needed until the object detector finds a UAV and the optimization phase is triggered. Furthermore, the effect of the bounding box size on the focusing performance, especially in front of complex backgrounds, will be investigated.

Funding

Technische Universität Wien Bibliothek; Österreichische Forschungsförderungsgesellschaft (873504).

Acknowledgement

This publication is funded by the Austrian defence research programme FORTE of the Federal Ministry of Finance (BMF). The authors acknowledge TU Wien Bibliothek for financial support through its Open Access Funding Program.

The authors gratefully acknowledge the cooperation with ASA Astrosysteme GmbH and thank them for their support and valuable expertise.

Disclosures

The authors declare that there are no conflicts of interest related to this paper.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. “‘Suspected drone’ disrupts Gatwick Airport flights,” BBC News (2023). Accessed Aug 2023.

2. “Sweden drones: Sightings reported over nuclear plants and palace,” BBC News (2022). Accessed Aug 2023.

3. “Mexican drug cartels using drones to smuggle heroin, meth, cocaine into U.S.,” The Washington Times (2015). Accessed Feb 2022.

4. “Charges over drone drug smuggling into prisons,” BBC News (2018). Accessed Feb 2022.

5. “Drones crashing big sporting events, including U.S. Open, college football,” CNN US (2015). Accessed Aug 2023.

6. A. D. De Quevedo, F. I. Urzaiz, J. G. Menoyo, et al., “Drone Detection and RCS Measurements with Ubiquitous Radar,” in International Conference on Radar (RADAR), (2018), pp. 1–6.

7. S. Yang, H. Qin, X. Liang, et al., “An improved unauthorized unmanned aerial vehicle detection algorithm using radiofrequency-based statistical fingerprint analysis,” Sensors 19(2), 274 (2019). [CrossRef]  

8. V. Baron, S. Bouley, M. Muschinowski, et al., “Drone localization and identification using an acoustic array and supervised learning,” in Artificial Intelligence and Machine Learning in Defense Applications, vol. 11169 (SPIE, 2019), pp. 129–137.

9. D. Ojdanić, A. Sinn, C. Naverschnigg, et al., “Feasibility analysis of optical UAV detection over long distances using robotic telescopes,” IEEE Trans. Aerosp. Electron. Syst. 59(5), 5148–5157 (2023). [CrossRef]  

10. Aselsan, “IHTAR anti-drone system - datasheet,” ASELSAN A.S., Ankara, Türkiye (2018).

11. J. Farlik, M. Kratky, J. Casar, et al., “Multispectral detection of commercial unmanned aerial vehicles,” Sensors 19(7), 1517 (2019). [CrossRef]  

12. H. U. Unlu, P. S. Niehaus, D. Chirita, et al., “Deep learning-based visual tracking of UAVs using a PTZ camera system,” in IECON 2019 - 45th Annual Conference of the IEEE Industrial Electronics Society, (IEEE, 2019), pp. 638–644.

13. A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, “YOLOv4: Optimal Speed and Accuracy of Object Detection,” arXiv, arXiv:2004.10934 (2020). [CrossRef]  

14. S. Ren, K. He, R. Girshick, et al., “Faster R-CNN: towards real-time object detection with region proposal networks,” in Advances in Neural Information Processing Systems, vol. 28 (2015).

15. T.-Y. Lin, P. Goyal, and R. Girshick, “Focal loss for dense object detection,” IEEE Trans. Pattern Anal. Mach. Intell. 42(2), 318–327 (2020). [CrossRef]  

16. L. Liu, W. Ouyang, and X. Wang, “Deep learning for generic object detection: A survey,” Int. J. Comput. Vis. 128(2), 261–318 (2020). [CrossRef]  

17. N. L. Stauffer, “Active auto focus system improvement,” (1983). US Patent 4,367,027.

18. Z. Zhou, C. Li, and T. He, “Facile large-area autofocusing Raman mapping system for 2D material characterization,” Opt. Express 26(7), 9071–9080 (2018). [CrossRef]  

19. T. Sasakura, “Automatic focusing device using phase difference detection,” (1999). US Patent 5,995,144.

20. Y. Yao, B. Abidi, N. Doggaz, et al., “Evaluation of sharpness measures and search algorithms for the auto-focusing of high-magnification images,” in Visual Information Processing XV, vol. 6246 (SPIE, 2006), pp. 132–143.

21. F. Bortoletto, C. Bonoli, D. Fantinel, et al., “An active telescope secondary mirror control system,” Rev. Sci. Instrum. 70(6), 2856–2860 (1999). [CrossRef]  

22. S. Pertuz, D. Puig, and M. A. Garcia, “Analysis of focus measure operators for shape-from-focus,” Pattern Recognit. 46(5), 1415–1432 (2013). [CrossRef]  

23. J. M. Tenenbaum, Accommodation in computer vision (Stanford University, 1971).

24. C. Yang, M. Chen, and F. Zhou, “Accurate and rapid auto-focus methods based on image quality assessment for telescope observation,” Appl. Sci. 10(2), 658 (2020). [CrossRef]  

25. J. Vaquero, J. Pena, and N. Malpica, “Evaluation of autofocus functions in molecular cytogenetic analysis,” J. Microsc. 188(3), 264–272 (1997). [CrossRef]  

26. I. Helmy, F. Elnagahy, and A. Hamdy, “Focus measures assessment for astronomical images,” in International Conference on Innovative Trends in Communication and Computer Engineering (ITCE), (IEEE, 2020), pp. 244–250.

27. J. Pech-Pacheco, G. Cristobal, J. Chamorro-Martinez, et al., “Diatom autofocusing in brightfield microscopy: a comparative study,” in Proceedings 15th International Conference on Pattern Recognition (ICPR-2000), vol. 3 (2000), pp. 314–317.

28. G. Yang and B. J. Nelson, “Wavelet-based autofocusing and unsupervised segmentation of microscopic images,” in Proceedings IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 3 (IEEE, 2003), pp. 2143–2148.

29. J. He, R. Zhou, and Z. Hong, “Modified fast climbing search auto-focus algorithm with adaptive step size searching technique for digital camera,” IEEE Trans. Consumer Electron. 49(2), 257–262 (2003). [CrossRef]  

30. S. S. Rao, Engineering optimization: theory and practice (John Wiley & Sons, 2019).

31. Y. Xiong and S. Shafer, “Depth from focusing and defocusing,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 1993), pp. 68–73.

32. P. DiMeo, L. Sun, and X. Du, “Fast and accurate autofocus control using gaussian standard deviation and gradient-based binning,” Opt. Express 29(13), 19862–19878 (2021). [CrossRef]  

33. D. Wang, X. Ding, T. Zhang, et al., “A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications,” Opt. Laser Technol. 45, 190–197 (2013). [CrossRef]

34. A. Recknagel, Elementarphysik (Elektrik Optik) (P.E. Blank Verlag, 1953).

35. E. Krotkov, “Focusing,” Int. J. Comput. Vis. 1(3), 223–237 (1988). [CrossRef]  

36. H. Nasse, “Depth of field and bokeh,” Carl Zeiss camera lens division report 1, 4 (2010).

37. D. Ojdanić, C. Naveschnigg, A. Sinn, et al., “Parallel architecture for low latency UAV detection and tracking using robotic telescopes,” IEEE Trans. Aerosp. Electron. Syst. (2023). Submitted Jul. 2023.
