CLC number: O221.2    Document code: A    Article ID: 1673-3851 (2023) 03-0279-06
Reference Format: LI Yamin. A modified DL-type spectral conjugate gradient method with sufficient descent property[J]. Journal of Zhejiang Sci-Tech University, 2023, 49(2): 279-284.
A modified DL-type spectral conjugate gradient method with sufficient descent property
LI Yamin
(School of Economics, Technology & Media University of Henan Kaifeng, Kaifeng 475001, China)
Abstract: A method for solving large-scale unconstrained optimization problems is proposed. By modifying the conjugate parameter of the Dai-Liao (DL) conjugate gradient method and the spectral parameter of the spectral conjugate gradient method, a modified DL-type spectral conjugate gradient method is constructed. The spectral parameter is chosen so that every iteration automatically generates a descent direction that does not depend on any line search. Under standard assumptions, the method is proved to be globally convergent for uniformly convex functions under the strong Wolfe line search.
Key words: unconstrained optimization; strong Wolfe line search; spectral conjugate gradient method; spectral parameter; global convergence
0 Introduction
The conjugate gradient method (CGM) has a simple iterative scheme: it requires only objective function values and gradient values, needs no second-order derivatives, and has low storage requirements, which makes it one of the most widely used iterative methods for solving the unconstrained optimization problem (1):
\[
\min_{x \in \mathbb{R}^n} f(x), \tag{1}
\]
where $f: \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable.
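For reference, the generic conjugate gradient iteration for problem (1) takes the following classical form, shared by the FR [1], PRP [2-3], HS [4] and DY [5] methods (this is the standard scheme, not the specific method proposed in this paper):
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0, \\
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]
where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is the step length obtained by a line search, and the conjugate parameter $\beta_k$ is what distinguishes the different conjugate gradient methods.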
3 Conclusion
Building on the DL method and the spectral conjugate gradient method, this paper modifies the first term $\frac{g_k^{\mathrm T} y_{k-1}}{d_{k-1}^{\mathrm T} y_{k-1}}$ of the DL conjugate parameter $\beta_k^{\mathrm{DL}}$ to $\frac{g_k^{\mathrm T}(g_k - g_{k-1})}{\|d_{k-1}\|^2}$, which yields the conjugate parameter
\[
\beta_k^{\mathrm{DL\text{-}R}} = \frac{g_k^{\mathrm T}(g_k - g_{k-1})}{\|d_{k-1}\|^2} - t\,\frac{g_k^{\mathrm T} s_{k-1}}{d_{k-1}^{\mathrm T}(g_k - g_{k-1})},
\]
and modifies the spectral parameter to
\[
\theta_k = 1 + \beta_k^{\mathrm{DL\text{-}R}}\,\frac{g_k^{\mathrm T} d_{k-1}}{\|g_k\|^2},
\]
so that the search direction $d_k$ is sufficiently descent regardless of the line search used. This gives a modified DL-type spectral conjugate gradient method for solving large-scale unconstrained optimization problems, and the method is proved to be globally convergent for uniformly convex functions under the strong Wolfe line search.
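As a brief check of the sufficient descent claim (a sketch assuming, as in standard spectral conjugate gradient methods, that the search direction is $d_k = -\theta_k g_k + \beta_k^{\mathrm{DL\text{-}R}} d_{k-1}$ for $k \ge 1$), substituting the spectral parameter $\theta_k$ chosen above gives
\[
g_k^{\mathrm T} d_k
= -\theta_k \|g_k\|^2 + \beta_k^{\mathrm{DL\text{-}R}} g_k^{\mathrm T} d_{k-1}
= -\Big(1 + \beta_k^{\mathrm{DL\text{-}R}} \frac{g_k^{\mathrm T} d_{k-1}}{\|g_k\|^2}\Big)\|g_k\|^2 + \beta_k^{\mathrm{DL\text{-}R}} g_k^{\mathrm T} d_{k-1}
= -\|g_k\|^2,
\]
so $g_k^{\mathrm T} d_k = -\|g_k\|^2 < 0$ whenever $g_k \neq 0$, independently of the step length produced by the line search.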
References:
[1]Fletcher R, Reeves C M. Function minimization by conjugate gradients[J]. The Computer Journal, 1964, 7(2): 149-154.
[2]Polak E, Ribière G. Note sur la convergence de méthodes de directions conjuguées[J]. Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, 1969, 3(16): 35-43.
[3]Polyak B T. The conjugate gradient method in extremal problems[J]. USSR Computational Mathematics and Mathematical Physics, 1969, 9(4): 94-112.
[4]Hestenes M R, Stiefel E. Methods of conjugate gradients for solving linear systems[J]. Journal of Research of the National Bureau of Standards, 1952, 49(6): 409-436.
[5]Dai Y H, Yuan Y. A nonlinear conjugate gradient method with a strong global convergence property[J]. SIAM Journal on Optimization, 1999, 10(1): 177-182.
[6]Liu Y, Storey C. Efficient generalized conjugate gradient algorithms, part 1: Theory[J]. Journal of Optimization Theory and Applications, 1991, 69(1):129-137.
[7]Fletcher R. Practical Methods of Optimization, Vol 1: Unconstrained Optimization[M]. New York: John Wiley & Sons,1987:80-92.
[8]Li D D, Li Y F, Wang S H. A modified three-term Hestenes-Stiefel conjugate gradient projection algorithm and its application[J]. Journal of Jilin University (Science Edition), 2022, 60(1): 64-72. (in Chinese)
[9]Jiang X Z, Liao W, Yin J H, et al. A new family of hybrid three-term conjugate gradient methods with applications in image restoration[J]. Numerical Algorithms, 2022, 91(1):161-191.
[10]Waziri M Y, Ahmed K, Halilu A S. A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint[J]. Journal of Computational and Applied Mathematics, 2022,407:114035.
[11]Lotfi M, Hosseini S M. An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation[J]. Journal of Computational and Applied Mathematics, 2020, 371:112708.
[12]Mtagulwa P, Kaelo P. An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems[J]. Applied Numerical Mathematics, 2019, 145:111-120.
[13]Dai Y H, Liao L Z. New conjugacy conditions and related nonlinear conjugate gradient methods[J]. Applied Mathematics and Optimization,2001,43(1):87-101.
[14]Hager W W, Zhang H C. A new conjugate gradient method with guaranteed descent and an efficient line search[J]. SIAM Journal on Optimization, 2005, 16(1): 170-192.
[15]Duan F J, Yang C, Li X L. A self-adaptive Dai-Liao conjugate gradient method[J]. Mathematica Applicata, 2022, 35(2): 336-342. (in Chinese)
[16]Rivaie M, Mamat M, June L W,et al. A new class of nonlinear conjugate gradient coefficients with global convergence properties[J]. Applied Mathematics and Computation, 2012, 218(22): 11323-11332.
[17]Birgin E G, Martínez J M. A spectral conjugate gradient method for unconstrained optimization[J]. Applied Mathematics and Optimization, 2001, 43(2): 117-128.
[18]Awwal A M, Kumam P, Abubakar A B. Spectral modified Polak-Ribière-Polyak projection conjugate gradient method for solving monotone systems of nonlinear equations[J]. Applied Mathematics and Computation, 2019, 362: 124514.
[19]Koorapetse M, Kaelo P, Lekoko S, et al. A derivative-free RMIL conjugate gradient projection method for convex constrained nonlinear monotone equations with applications in compressive sensing[J]. Applied Numerical Mathematics,2021,165:431-441.
[20]Fang X W. A class of new derivative-free gradient type methods for large-scale nonlinear systems of monotone equations[J]. Journal of Inequalities and Applications, 2020, 2020: 93.
[21]Liu H, Yao Y, Qian X Y, et al. Some nonlinear conjugate gradient methods based on spectral scaling secant equations[J]. Computational and Applied Mathematics, 2016, 35(2): 639-651.
[22]Gao L. Numerical Optimization Methods[M]. Beijing: Peking University Press, 2014: 107-108. (in Chinese)
(Responsible editor: KANG Feng)
Received: 2022-08-01    Published online: 2022-11-01
About the author: LI Yamin (1992- ), female, from Kaifeng, Henan; teaching assistant, master's degree; her research focuses on optimization theory.