Abstract: The spectral conjugate gradient method, as an extension of the classical conjugate gradient method, is one of the effective methods for solving large-scale unconstrained optimization problems. Based on the standard Wolfe line search criterion and the sufficient descent condition, an FR-type spectral conjugate gradient method with the sufficient descent property is proposed. Under mild assumptions, the algorithm is globally convergent. Finally, the new algorithm is compared with an existing modified FR-type spectral conjugate gradient method; the numerical results show that the proposed algorithm is highly effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; sufficient descent property; standard Wolfe line search criterion; global convergence
CLC number: O221.2    Document code: A    Article ID: 1009-3583(2024)-0080-05
FR-Type Spectral Conjugate Gradient Method Improved by the Wolfe Line Search
WANG Sen-sen1, HAN Xin2*, WU Xiang-biao3
(1.School of Mathematics and Information Science, Xinjiang Hetian College, Hetian 848000, China; 2. School of Mathematics, Sichuan University of Arts and Sciences, Dazhou 635000, China; 3. School of Mathematics, Zunyi Normal University, Zunyi 563006, China)
Abstract: The spectral conjugate gradient method, as an extension of the classical conjugate gradient method, is one of the effective methods for solving large-scale unconstrained optimization problems. Based on the standard Wolfe line search criterion and the sufficient descent condition, an FR-type spectral conjugate gradient method with the sufficient descent property is proposed. Under mild assumptions, the algorithm is globally convergent. Finally, the new algorithm is compared with an existing modified FR-type spectral conjugate gradient method, and the numerical results show that the proposed algorithm is highly effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; sufficient descent property; standard Wolfe line search criterion; global convergence
As an optimization method, the conjugate gradient method, owing to its simple algorithmic structure, ease of implementation, and low storage requirements, is often used to solve large-scale optimization problems arising in engineering applications such as aerospace, atmospheric simulation, petroleum exploration, and signal recovery [1-4]. The conjugate gradient (CG) method was first proposed by Hestenes and Stiefel for solving systems of linear equations; in 1964, Fletcher and Reeves applied it to nonlinear optimization problems. Unconstrained optimization is an important class of problems in the optimization field, whose general form is given below.
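For reference, the standard formulation and the generic search directions involved can be written as follows; these are the usual textbook forms, stated under common notation (g_k = \nabla f(x_k), search direction d_k, step size \alpha_k, conjugacy parameter \beta_k, spectral coefficient \theta_k, constants 0 < \delta < \sigma < 1 and c > 0), and are not quoted from the formulas omitted in the source text. The unconstrained problem is
\[
\min_{x \in \mathbb{R}^n} f(x), \qquad f : \mathbb{R}^n \to \mathbb{R} \ \text{continuously differentiable},
\]
solved by iterates x_{k+1} = x_k + \alpha_k d_k. The Fletcher-Reeves (FR) method takes
\[
d_k = -g_k + \beta_k^{FR} d_{k-1}, \qquad \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad d_0 = -g_0,
\]
while a spectral conjugate gradient method scales the gradient term, d_k = -\theta_k g_k + \beta_k d_{k-1}. The standard Wolfe line search requires the step size \alpha_k to satisfy
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{T} d_k, \qquad g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma g_k^{T} d_k,
\]
and the sufficient descent property referred to throughout means g_k^{T} d_k \le -c \|g_k\|^2 for all k.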
3 Numerical experiments and conclusions
To examine the numerical performance of algorithm WSFR, a number of functions were selected from common unconstrained optimization test collections, and the results were compared with those of the CZFR algorithm from reference [22]. The programs were written in Matlab, and all algorithms were run on a PC (Honor MagicBook, AMD Ryzen 7 3750H with Radeon Vega Mobile Gfx, 2.30 GHz) under Windows 10. The parameters of algorithm WSFR were set to fixed values; algorithm CZFR also uses the standard Wolfe line search criterion to determine the step size, with the same parameter settings as WSFR. Each run terminates when the gradient-based stopping criterion is satisfied or the number of iterations exceeds 10000. The detailed results are reported in Table 1.
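To make the experimental setup concrete, a minimal Python sketch of a generic FR-type spectral conjugate gradient loop with a Wolfe line search and the same style of stopping rule is given below. It is an illustrative skeleton rather than the WSFR or CZFR implementation: the spectral coefficient theta used here is one common choice that enforces sufficient descent, and the tolerance tol, the line search constants c1 and c2, and the use of scipy.optimize.line_search (which enforces the strong Wolfe conditions) are assumptions of this sketch, not values taken from the paper.

import numpy as np
from scipy.optimize import line_search

def spectral_cg_fr(f, grad, x0, c1=1e-4, c2=0.4, tol=1e-6, max_iter=10000):
    # Generic FR-type spectral conjugate gradient loop (illustrative only).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    k = 0
    while np.linalg.norm(g) > tol and k < max_iter:   # gradient tolerance or iteration cap
        # step size from a Wolfe line search (SciPy's routine enforces strong Wolfe)
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                              # line search failed: restart with steepest descent
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        gg_new = g_new.dot(g_new)
        if gg_new <= tol**2:                           # converged; skip the direction update
            x, g = x_new, g_new
            k += 1
            break
        beta = gg_new / g.dot(g)                       # Fletcher-Reeves parameter ||g_{k+1}||^2 / ||g_k||^2
        # illustrative spectral coefficient: chosen so that g_{k+1}^T d_{k+1} = -||g_{k+1}||^2
        theta = 1.0 + beta * g_new.dot(d) / gg_new
        d = -theta * g_new + beta * d                  # spectral FR search direction
        x, g = x_new, g_new
        k += 1
    return x, np.linalg.norm(g), k

# example call: minimize the simple quadratic f(x) = 0.5*||x||^2 from the all-ones start
x_star, gnorm, iters = spectral_cg_fr(lambda x: 0.5 * x.dot(x), lambda x: x, np.ones(100))

Such a run returns the same kind of summary quantities reported in Table 1 (iteration count and final gradient value), although the numbers in the table come from the authors' Matlab code, not from this sketch.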
In Table 1, Function is the name of the test function, Dim is its dimension, Iter is the total number of iterations at termination, Time is the CPU time of the algorithm, gk is the gradient value of the objective function at termination, and "-" indicates that the algorithm failed on that problem. The results show that algorithm WSFR is effective on all 40 test problems, whereas algorithm CZFR is effective on only 60% of them. Moreover, comparing the number of iterations, the CPU time, and the gradient value at termination on the 24 test problems solved by both algorithms, algorithm CZFR outperforms algorithm WSFR on only two problems; on the remaining problems, algorithm WSFR is superior to algorithm CZFR in terms of the number of iterations, the CPU time, and the gradient value at termination.
In summary, algorithm WSFR clearly improves on algorithm CZFR in numerical performance, and it possesses the sufficient descent property under the standard Wolfe line search criterion. In addition, for different test problems, the numerical efficiency of algorithm WSFR can be further improved by adjusting the parameter in the spectral coefficient.
References:
[1] Wang S S, Zhang J R, Han X, et al. A Class of Hybrid Spectral Conjugate Gradient Methods with Sufficient Descent Property[J]. Journal of Southwest University (Natural Science Edition), 2017, 39(5): 139-144.
[2] Han X, Zhang J, Chen J. A New Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization[J]. Bulletin of the Iranian Mathematical Society, 2017, 43(6): 2067-2084.
[3] Zheng Z J, Han X. A New Hybrid Conjugate Gradient Method with Sufficient Descent Property[J]. Journal of Southwest China Normal University (Natural Science Edition), 2022, 47(1): 1-7.
[4] Lu Y, He J. A Method for Solving the TSP Based on a Genetic Algorithm with Parallel and Memory-Access Optimization[J]. Journal of Sichuan University of Arts and Science, 2017, 27(2): 11-17.
[5] Fletcher R, Reeves C M. Function Minimization by Conjugate Gradients[J]. The Computer Journal, 1964, 7(2): 149-154.
[6] Dai Y H, Yuan Y. A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property[J]. SIAM Journal on Optimization, 1999, 10(2): 177-182.
[7] Fletcher R. Practical Methods of Optimization: Unconstrained Optimization[J]. Journal of the Operational Research Society, 1980, 33(7): 675-676.
[8] Hestenes M R, Stiefel E. Methods of Conjugate Gradients for Solving Linear Systems[J]. Journal of Research of the National Bureau of Standards, 1952, 49(6): 409-436.
[9] Polak E, Ribière G. Note sur la convergence de méthodes de directions conjuguées[J]. Revue Française d'Informatique et de Recherche Opérationnelle, 1969, 3(16): 35-43.
[10] Polyak B T. The Conjugate Gradient Method in Extremal Problems[J]. USSR Computational Mathematics and Mathematical Physics, 1969, 9(4): 94-112.
[11] Liu Y, Storey C. Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory[J]. Journal of Optimization Theory and Applications, 1991, 69(1): 129-137.
[12] Wei Z X, Yao S W, et al. The Convergence Properties of Conjugate Gradient Method for Optimization[J]. Applied Mathematics and Computation, 2006, 183(2): 1341-1350.
[13] Yao S W, Wei Z X, et al. A Note about WYL's Conjugate Gradient Method and Its Application[J]. Applied Mathematics and Computation, 2007, 191(2): 381-388.
[14] Lin S H. Convergence of an Improved Conjugate Gradient Method[J]. Journal of Southwest University (Natural Science Edition), 2021, 43(7): 81-88.
[15] Wang Y, Huang J P, Shao H, et al. A Modified PRP-HS Hybrid Conjugate Gradient Method with Global Convergence[J]. Mathematical Theory and Applications, 2022, 42(4): 58-70.
[16] Zhang P, Du X W. A Hybrid PRP-WYL Conjugate Gradient Method under the Strong Wolfe Line Search[J]. Journal of Chongqing Normal University (Natural Science Edition), 2020, 37(1): 41-51.
[17] Birgin E G, Martinez J M. A Spectral Conjugate Gradient Method for Unconstrained Optimization[J]. Applied Mathematics and Optimization, 2001, 43(2): 117-128.
[18] Liu P J, Jiang X Z, Song D. A Class of Spectral Conjugate Gradient Methods with Sufficient Descent Property[J]. Operations Research Transactions, 2022, 26(4): 87-97.
[19] Jiang X Z, Liao W, Jian J B, et al. An Improved PRP-Type Spectral Conjugate Gradient Method with a Restart Step[J]. Acta Mathematica Scientia, 2022, 42(1): 216-227.
[20] Jian J B, Liu P J, Jiang X Z. A Sufficient Descent Spectral Three-Term Conjugate Gradient Method[J]. Acta Mathematicae Applicatae Sinica, 2020, 43(6): 1000-1012.
[21] Wang S S, Zhang J R, Han X. A Class of Modified FR-Type Spectral Conjugate Gradient Methods[J]. Journal of Southwest University (Natural Science Edition), 2018, 40(2): 49-55.
[22] Chao L J, Zhang Y F. Application of a Modified FR-Type Spectral Conjugate Gradient Method to Signal Recovery[J]. Journal of Chongqing Normal University, 2023, 40(5): 11-18.
[23] Zoutendijk G. Nonlinear Programming, Computational Methods[J]. Integer and Nonlinear Programming, 1970, 143(1): 37-86.
[24] Liu P J, Wu Y Q, Shao F, et al. Two Improved HS-Type Conjugate Gradient Methods with Restart Directions[J]. Acta Mathematica Scientia, 2023, 43(2): 570-580.
[25] Andrei N. An Unconstrained Optimization Test Functions Collection[J]. Advanced Modeling and Optimization, 2008, 10(1): 147-161.