A Modified DL Conjugate Gradient Method with Disturbance Factors
WU Shuangjiang, DU Xuewu
(School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China)
This paper studies the unconstrained optimization problem

min f(x), x ∈ R^n,  (1)

where f: R^n → R is a smooth function. Problems of this kind can be solved by nonlinear conjugate gradient methods, which usually take the form

x_{k+1} = x_k + α_k d_k,  (2)
d_k = −g_k + β_k d_{k−1},  d_0 = −g_0,  (3)

where g_k = ∇f(x_k), α_k > 0 is a step length obtained by some line search, and β_k is a scalar parameter whose choice determines the particular conjugate gradient method.
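As a concrete illustration, the iteration (2)–(3) can be sketched in a few lines of Python. The quadratic test function, the Armijo backtracking search (used here in place of a Wolfe-type search), the FR choice of β_k, and the restart safeguard are all illustrative assumptions, not the method proposed in this paper.

```python
# A minimal sketch of x_{k+1} = x_k + alpha_k d_k, d_k = -g_k + beta_k d_{k-1}
# on a small quadratic test function. The test function, the Armijo
# backtracking search, the FR beta_k, and the restart safeguard are all
# illustrative stand-ins, not the paper's DMHSDL method.

def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2          # toy objective

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]             # its gradient

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def backtracking(x, d, g, c1=1e-4, rho=0.5):
    # Armijo condition: f(x + a d) <= f(x) + c1 * a * g^T d
    a = 1.0
    while f([xi + a * di for xi, di in zip(x, d)]) > f(x) + c1 * a * dot(g, d):
        a *= rho
    return a

def cg(x0, tol=1e-8, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                        # d_0 = -g_0
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:               # stop when ||g_k|| is small
            break
        if dot(g, d) >= 0:                       # safeguard: restart with -g_k
            d = [-gi for gi in g]                # if d_k is not a descent direction
        a = backtracking(x, d, g)
        x = [xi + a * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)     # Fletcher-Reeves beta_k
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cg([3.0, 1.0])                          # approaches the minimizer (0, 0)
```

Swapping in a different β_k formula changes only one line, which is why the β_k choices surveyed below define entire families of methods.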
Conjugate gradient methods have been studied extensively; six early classical choices of β_k are

β_k^{FR} = ‖g_k‖² / ‖g_{k−1}‖²  [1],
β_k^{HS} = g_k^T y_{k−1} / (d_{k−1}^T y_{k−1})  [3],
β_k^{PRP} = g_k^T y_{k−1} / ‖g_{k−1}‖²,
β_k^{CD} = −‖g_k‖² / (d_{k−1}^T g_{k−1})  [4],
β_k^{LS} = −g_k^T y_{k−1} / (d_{k−1}^T g_{k−1})  [5],
β_k^{DY} = ‖g_k‖² / (d_{k−1}^T y_{k−1})  [6],

where y_{k−1} = g_k − g_{k−1}.
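For concreteness, the six classical formulas can be written as plain functions of g_k, g_{k−1}, and d_{k−1}, with y_{k−1} = g_k − g_{k−1}. The numerical inputs below are made-up vectors chosen only to exercise the formulas.

```python
# The six classical beta_k choices in their standard textbook forms.
# Inputs: g = g_k, g_prev = g_{k-1}, d_prev = d_{k-1}; y = g_k - g_{k-1}.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def betas(g, g_prev, d_prev):
    y = [a - b for a, b in zip(g, g_prev)]
    return {
        "FR":  dot(g, g) / dot(g_prev, g_prev),
        "HS":  dot(g, y) / dot(d_prev, y),
        "PRP": dot(g, y) / dot(g_prev, g_prev),
        "CD":  -dot(g, g) / dot(d_prev, g_prev),
        "LS":  -dot(g, y) / dot(d_prev, g_prev),
        "DY":  dot(g, g) / dot(d_prev, y),
    }

# made-up illustrative vectors
vals = betas([1.0, 0.0], [0.0, 1.0], [0.0, -1.0])
```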
In [7], Yao et al. proposed a modified HS method, denoted MHS, in which the numerator g_k^T y_{k−1} of β_k^{HS} is replaced by g_k^T (g_k − (‖g_k‖/‖g_{k−1}‖) g_{k−1}).
Dai and Liao [8] proposed the DL conjugate gradient method, with

β_k^{DL} = (g_k^T y_{k−1} − t g_k^T s_{k−1}) / (d_{k−1}^T y_{k−1}),  t ≥ 0,

where s_{k−1} = x_k − x_{k−1}.
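The DL formula is a one-line computation; the sketch below uses made-up vectors, and the default t = 0.1 is an arbitrary illustrative value (the paper does not prescribe it).

```python
# beta_k^{DL} = (g_k^T y_{k-1} - t * g_k^T s_{k-1}) / (d_{k-1}^T y_{k-1}), t >= 0,
# with s_{k-1} = x_k - x_{k-1}. The value t = 0.1 is an arbitrary example.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def beta_dl(g, g_prev, d_prev, s_prev, t=0.1):
    y = [a - b for a, b in zip(g, g_prev)]
    return (dot(g, y) - t * dot(g, s_prev)) / dot(d_prev, y)

# made-up illustrative vectors
b = beta_dl([1.0, 0.0], [0.0, 1.0], [0.0, -1.0], [0.0, -0.5])
```

Note that setting t = 0 recovers the HS formula, which is why DL is often viewed as an HS variant with an extra (disturbance-like) correction term.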
By part ② of Assumption A, ‖g_k‖ ≤ γ.
Lemma 1 [11]. Suppose Assumption A holds. Consider a conjugate gradient method of the form (2)–(3) in which d_k satisfies the sufficient descent condition and α_k is obtained by the strong Wolfe line search. If

∑_{k≥1} 1/‖d_k‖² = ∞,

then lim inf_{k→∞} ‖g_k‖ = 0.
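The strong Wolfe line search used throughout requires a sufficient-decrease condition and a strong curvature condition, f(x + αd) ≤ f(x) + c₁α g^T d and |∇f(x + αd)^T d| ≤ c₂|g^T d| with 0 < c₁ < c₂ < 1. The checker below is a sketch; the values c₁ = 10⁻⁴ and c₂ = 0.1 are common illustrative defaults, not taken from this paper.

```python
# Check whether a step length a satisfies the strong Wolfe conditions:
#   f(x + a d) <= f(x) + c1 * a * g^T d        (sufficient decrease)
#   |grad f(x + a d)^T d| <= c2 * |g^T d|      (strong curvature)
# c1, c2 here are common illustrative defaults, not the paper's values.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def satisfies_strong_wolfe(f, grad, x, d, a, c1=1e-4, c2=0.1):
    g = grad(x)
    x_new = [xi + a * di for xi, di in zip(x, d)]
    decrease = f(x_new) <= f(x) + c1 * a * dot(g, d)
    curvature = abs(dot(grad(x_new), d)) <= c2 * abs(dot(g, d))
    return decrease and curvature

# 1-D example: f(x) = x^2 at x = 1 with descent direction d = -2
ok = satisfies_strong_wolfe(lambda v: v[0] ** 2, lambda v: [2.0 * v[0]],
                            [1.0], [-2.0], 0.5)   # lands exactly at the minimizer
bad = satisfies_strong_wolfe(lambda v: v[0] ** 2, lambda v: [2.0 * v[0]],
                             [1.0], [-2.0], 1.0)  # overshoots to x = -1
```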
From (3) and (4) we obtain the required estimate, where θ_k is the angle between g_k and g_{k−1}. This completes the proof.
Property (*) [11]. Consider a conjugate gradient method of the form (2)–(3), and suppose that 0 < γ ≤ ‖g_k‖ ≤ γ̄ holds for all k ≥ 0. If there exist two constants b > 0 and λ > 0 such that for all k ≥ 0,

|β_k| ≤ b,  and  ‖s_{k−1}‖ ≤ λ implies |β_k| ≤ 1/(2b),

then the conjugate gradient method is said to have Property (*).
Lemma 3. Suppose Assumption A holds, and consider the DMHSDL conjugate gradient method of the form (2)–(3). If the step length is obtained by the strong Wolfe line search, then the DMHSDL method has Property (*).
The proofs of Theorem 1, Lemma 4, and Theorem 2 below are similar to those of Lemma 6, Lemma 7, and Theorem 8 in [9], respectively, and are therefore omitted.
Theorem 1. Suppose Assumption A holds, and consider the DMHSDL conjugate gradient method with step lengths obtained by the strong Wolfe line search. If there exists a constant γ > 0 such that ‖g_k‖ ≥ γ for all k ≥ 0, then d_k ≠ 0 for each k and ∑_{k≥1} ‖u_k − u_{k−1}‖² < ∞, where u_k = d_k/‖d_k‖.
Figures 1–4 and Table 1 show that the DMHSDL method outperforms the other methods in the number of function evaluations, the number of gradient evaluations, the number of iterations, and CPU time.
Tab. 1 The relative efficiency of the DL, MHS, MHSDL and DMHSDL methods
Fig. 1 Performance profile on the number of function evaluations
Fig. 2 Performance profile on the number of gradient evaluations
Fig. 3 Performance profile on the number of iterations
Fig. 4 Performance profile on CPU time
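Performance profiles of the kind shown in Figs. 1–4 follow the standard Dolan–Moré construction: for solver s on problem p with cost t_{p,s}, the ratio is r_{p,s} = t_{p,s} / min_s t_{p,s}, and ρ_s(τ) is the fraction of problems with r_{p,s} ≤ τ. The cost matrix below is made-up illustrative data, not the paper's results.

```python
# Dolan-More performance profile sketch: rho_s(tau) is the fraction of
# problems on which solver s is within a factor tau of the best solver.
# The cost matrix here is made-up data, not the paper's measurements.

def performance_profile(costs, tau):
    # costs[p][s]: cost (e.g. CPU time or iteration count) of solver s on problem p
    n_problems = len(costs)
    n_solvers = len(costs[0])
    rho = []
    for s in range(n_solvers):
        count = 0
        for p in range(n_problems):
            best = min(costs[p])
            if costs[p][s] / best <= tau:
                count += 1
        rho.append(count / n_problems)
    return rho

# three problems, two solvers (illustrative numbers)
costs = [[10.0, 12.0],
         [20.0, 18.0],
         [30.0, 90.0]]
rho_at_1 = performance_profile(costs, 1.0)   # fraction of wins for each solver
```

A solver whose curve ρ_s(τ) lies above the others for small τ wins on more problems, which is how the figures compare DL, MHS, MHSDL, and DMHSDL.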
[1] FLETCHER R, REEVES C M. Function minimization by conjugate gradients[J]. The Computer Journal, 1964, 7(2): 149-154.
[2] NOCEDAL J, WRIGHT S J. Numerical Optimization[M]. 2nd ed. New York: Springer, 2006.
[3] HESTENES M R, STIEFEL E. Methods of conjugate gradients for solving linear systems[J]. Journal of Research of the National Bureau of Standards, 1952, 49(6): 409-436.
[4] FLETCHER R. Practical Methods of Optimization[M]. 2nd ed. Chichester: John Wiley & Sons, 1987.
[5] LIU Y, STOREY C. Efficient generalized conjugate gradient algorithms[J]. Journal of Optimization Theory and Applications, 1991, 69(1): 129-152.
[6] DAI Y H, YUAN Y X. A nonlinear conjugate gradient method with a strong global convergence property[J]. SIAM Journal on Optimization, 1999, 10(1): 177-182.
[7] YAO S W, WEI Z X, HUANG H. A note about WYL's conjugate gradient method and its applications[J]. Applied Mathematics and Computation, 2007, 191(2): 381-388.
[8] DAI Y H, LIAO L Z. New conjugacy conditions and related nonlinear conjugate gradient methods[J]. Applied Mathematics and Optimization, 2001, 43(1): 87-101.
[9] YAO S W, LU X W, WEI Z X. A conjugate gradient method with global convergence for large-scale unconstrained optimization problems[J]. Journal of Applied Mathematics, 2013, Article ID 734454.
[10] JIANG X Z, JIAN J B. Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization[J]. Nonlinear Dynamics, 2014, 77(1/2): 387-397.
[11] GILBERT J C, NOCEDAL J. Global convergence properties of conjugate gradient methods for optimization[J]. SIAM Journal on Optimization, 1992, 2(1): 21-42.
[12] MORÉ J J, GARBOW B S, HILLSTROM K E. Testing unconstrained optimization software[J]. ACM Transactions on Mathematical Software, 1981, 7(1): 17-41.
Executive editor: SHI Ling
Keywords: conjugate gradient method; strong Wolfe line search; disturbance factors; global convergence
CLC number: O224    Document code: A    Article ID: 1008-8423(2015)04-0375-04    DOI: 10.13501/j.cnki.42-1569/n.2015.12.005
Received: 2015-11-11.
Foundation item: National Natural Science Foundation of China (11171363).
About the author: WU Shuangjiang (1990- ), male, master's student; research interest: optimization theory and algorithms.