数学进展
ADVANCES IN MATHEMATICS
2006, No. 3, pp. 265-274 (10 pages)
unconstrained optimization; super-memory gradient method; global convergence; numerical experiment
This paper proposes a new super-memory gradient algorithm for unconstrained optimization. The algorithm takes a linear combination of the negative gradient at the current point and the negative gradient at the previous point as the search direction, and determines the step size by exact line search and Armijo line search. Under very weak conditions the algorithm is shown to be globally convergent with a linear rate of convergence. Because the algorithm avoids storing and computing matrices associated with the objective function, it is suitable for solving large-scale unconstrained optimization problems. Numerical experiments show that the algorithm is more effective than standard conjugate gradient algorithms.
A new super-memory gradient method for unconstrained optimization problems is proposed. The algorithm uses a linear combination of the current negative gradient and the previous negative gradient as the search direction, and uses exact or inexact line search to define the step size at each iteration. It is suitable for solving large-scale unconstrained optimization problems because it avoids the computation and storage of matrices associated with the Hessian of the objective function. The convergence of the algorithm with exact line search is proved. Furthermore, global convergence is also proved under the Armijo line search. Numerical experiments show that the algorithm is efficient in practical computation in many situations.
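The iteration described in the abstract can be illustrated with a short sketch: the search direction combines the current and previous negative gradients, and the step size is chosen by a backtracking Armijo rule. This is a minimal illustration under stated assumptions, not the paper's exact method; the fixed weight beta, the Armijo parameters, and the descent safeguard below are choices made only for the example.

import numpy as np

def armijo_step(f, x, d, g, alpha0=1.0, rho=0.5, sigma=1e-4):
    # Backtracking Armijo rule: shrink alpha until the sufficient-decrease condition holds.
    alpha = alpha0
    fx = f(x)
    while f(x + alpha * d) > fx + sigma * alpha * g.dot(d):
        alpha *= rho
    return alpha

def super_memory_gradient(f, grad, x0, beta=0.4, tol=1e-6, max_iter=1000):
    # Sketch of a super-memory gradient iteration: the search direction is a
    # linear combination of the current and previous negative gradients.
    # The fixed weight beta is an illustrative assumption, not the paper's rule.
    x = np.asarray(x0, dtype=float)
    g_prev = grad(x)  # only the current gradient is available at the first step
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -(1.0 - beta) * g - beta * g_prev   # combined negative-gradient direction
        if g.dot(d) >= 0:
            d = -g                              # safeguard: fall back to steepest descent
        alpha = armijo_step(f, x, d, g)
        x = x + alpha * d
        g_prev = g
    return x

# Usage on a small strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = super_memory_gradient(f, grad, np.zeros(2))

Because only the two most recent gradient vectors are kept, the storage cost per iteration stays linear in the problem dimension, which is the property the abstract highlights for large-scale problems.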