## Abstract

To improve the wavefront distortion correction performance of the classical stochastic parallel gradient descent (SPGD) algorithm, an optimized algorithm based on Nesterov-accelerated adaptive momentum estimation is proposed. It adopts a modified second-order momentum and a linearly varying gain coefficient to improve iterative stability, and it integrates the Nesterov momentum term with the modified Adam optimizer to accelerate convergence, correct the direction of gradient descent in a timely fashion, and avoid falling into local extrema. In addition, to demonstrate the algorithm's performance, a wavefront sensorless adaptive optics system model is established using a $6 \times 6$ element deformable mirror as the wavefront corrector. Simulation results show that, compared with the SPGD algorithm, the proposed algorithm converges faster, and its Strehl ratio after convergence is nearly 6.25 times that of the SPGD algorithm. The effectiveness and superiority of the proposed algorithm are also verified by comparison with two existing optimization algorithms.
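The combination described above can be illustrated with a minimal sketch: a classical SPGD bipolar-perturbation gradient estimate fed into a Nadam-style (Nesterov-accelerated Adam) update. This is not the paper's exact formulation; the metric function, step sizes, and moment hyperparameters below are illustrative assumptions.

```python
import numpy as np

def spgd_nadam_step(u, metric, state, delta=0.05, lr=0.1,
                    beta1=0.9, beta2=0.999, eps=1e-8):
    """One SPGD iteration with a Nadam-style update (illustrative sketch).

    u      : control voltages of the deformable-mirror actuators
    metric : callable mapping voltages to the performance metric J
             (e.g. Strehl ratio, to be maximized)
    state  : dict holding moment estimates "m", "v" and step counter "t"
    """
    # Random bipolar perturbation, as in classical SPGD
    du = delta * np.random.choice([-1.0, 1.0], size=u.shape)
    dJ = metric(u + du) - metric(u - du)
    # Stochastic gradient estimate of -J (we descend on -J to maximize J)
    g = -dJ * du / (2.0 * delta**2)

    state["t"] += 1
    t = state["t"]
    # First- and second-moment estimates with bias correction (Adam)
    state["m"] = beta1 * state["m"] + (1 - beta1) * g
    state["v"] = beta2 * state["v"] + (1 - beta2) * g**2
    m_hat = state["m"] / (1 - beta1**t)
    v_hat = state["v"] / (1 - beta2**t)
    # Nesterov look-ahead applied to the first moment (Nadam)
    m_nes = beta1 * m_hat + (1 - beta1) * g / (1 - beta1**t)
    return u - lr * m_nes / (np.sqrt(v_hat) + eps)
```

In an actual wavefront sensorless loop, `metric` would be evaluated from the far-field image after applying the voltages to the deformable mirror; here any scalar objective stands in for it.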

© 2021 Optical Society of America
