
A New Adaptive Nonmonotone Newton Algorithm

2023-06-29

Mathematica Applicata (应用数学), 2023, No. 3

YUAN Liuyang (袁柳洋), JIN Huihui (晋慧慧), WAN Zhongping (万仲平)

(1. College of Science, Wuhan University of Science and Technology, Wuhan 430065, China; 2. Hubei Province Key Laboratory of Systems Science in Metallurgical Process, Wuhan University of Science and Technology, Wuhan 430081, China; 3. School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China)

Abstract: In this paper, a new adaptive nonmonotone line search technique is proposed for unconstrained optimization. Based on the new nonmonotone line search technique, an adaptive nonmonotone Newton algorithm is developed. Under suitable assumptions, the global convergence of the new algorithm is proved. The numerical results show the feasibility and efficiency of the proposed algorithm.

Key words: Nonmonotone line search; Adaptive; Global convergence

1. Introduction

In this paper, the following unconstrained optimization problem is considered:

min_{x∈R^n} f(x),  (1.1)

where f(x) : R^n → R is a continuously differentiable function. Let g = ∇f(x) denote the gradient of f(x).

For solving problem (1.1), there are many iterative methods. The general iterative scheme has the following form:

x_{k+1} = x_k + α_k d_k,  (1.2)

where x_k is the current iteration point, d_k is the search direction of the k-th iteration, and α_k is the step-size of the k-th iteration along d_k.

For a given descent direction d_k, to obtain the step-size α_k along d_k, there are mainly two kinds of line search rules: the exact line search and the inexact one. However, the exact line search requires solving α_k = argmin_{α>0} f(x_k + αd_k), which is difficult to realize in the actual computation. The inexact line search finds a step-size close to the optimal one by some inexact criteria, so that an adequate reduction in f is achieved at less cost. Therefore, more and more researchers have begun to study the inexact line search. The most famous inexact line search is the Armijo line search [1], which has the following form:

f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,  (1.3)

where δ ∈ (0,1) and α_k = ρ^{h_k}, with h_k the smallest nonnegative integer such that (1.3) holds.
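The backtracking realization of the Armijo rule can be sketched as follows in Python; the quadratic test function and all parameter values are illustrative choices, not taken from the paper:

```python
import numpy as np

def armijo_step(f, g, x, d, delta=1e-3, rho=0.5, max_backtracks=50):
    """Backtracking Armijo search: try alpha = rho**h for h = 0, 1, 2, ...
    and accept the first alpha with f(x + alpha*d) <= f(x) + delta*alpha*g(x)^T d."""
    fx = f(x)
    slope = g(x) @ d              # directional derivative; negative for a descent d
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + delta * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

# Illustrative use on f(x) = ||x||^2 with the steepest-descent direction d = -g(x).
f = lambda x: float(x @ x)
g = lambda x: 2.0 * x
x0 = np.array([3.0, -4.0])
alpha = armijo_step(f, g, x0, -g(x0))
```

For this example the full step alpha = 1 overshoots the minimizer and is rejected, while alpha = 0.5 lands exactly at the origin and is accepted.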

Inexact line searches include monotone and nonmonotone line searches. The Armijo line search belongs to the monotone class: as d_k is a descent direction, the objective function value decreases monotonically at each iteration [2]. Since a monotone line search rule forces the objective function to decrease strictly during the iteration, when the objective function has narrow curved valleys in the feasible region, the iteration steps become very small and even a sawtooth phenomenon occurs, so the numerical efficiency of the algorithm is greatly reduced. To avoid such problems, nonmonotone line searches are often adopted, and the numerical results in [3-7] show that a nonmonotone line search can improve both the likelihood of finding a global optimal solution and the convergence speed.

The nonmonotone line search strategy was first proposed by Grippo et al. [7]. Their approach is roughly as follows: let M be a fixed nonnegative integer and let δ, ρ be constants in (0,1). Take an initial point x_0 and choose α_k = ᾱ_k ρ^{h_k} satisfying the following condition

f(x_k + α_k d_k) ≤ max_{0≤j≤m(k)} f(x_{k−j}) + δ α_k g_k^T d_k,  (1.4)

where 0 ≤ m(k) ≤ min{m(k−1)+1, M}, m(0) = 0, ᾱ_k is the prior trial step-size, and h_k is the smallest nonnegative integer that makes (1.4) hold.
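The acceptance test of Grippo et al. compares the trial value against the largest recent objective value instead of the current one. A minimal sketch, with illustrative parameter values:

```python
def grippo_accepts(f_hist, f_trial, alpha, slope, delta=1e-3, M=10):
    """Nonmonotone acceptance test in the spirit of (1.4): compare the trial
    value with the maximum of the last min(len(f_hist), M+1) objective values.
    slope stands for the directional derivative g_k^T d_k."""
    f_ref = max(f_hist[-(M + 1):])    # max over f(x_{k-j}), 0 <= j <= m(k) <= M
    return f_trial <= f_ref + delta * alpha * slope

# With history [5.0, 4.0, 4.5], a trial value 4.6 is rejected by the monotone
# Armijo test (reference 4.5) but accepted here (reference max = 5.0).
accepted = grippo_accepts([5.0, 4.0, 4.5], 4.6, 1.0, -1.0)
```

This temporary increase of f is exactly what allows the iterates to cut across a narrow curved valley instead of creeping along its bottom.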

Although the nonmonotone line search technique in (1.4) has many advantages, it cannot fully utilize the value of the current objective function. In addition, [7-8] pointed out that the numerical efficiency of the algorithm depends on the selection of M. To overcome these disadvantages, ZHANG and Hager [9] developed another nonmonotone line search technique. Specifically, let f_k = f(x_k), C_0 = f(x_0), Q_0 = 1, δ ∈ (0,1), 0 ≤ η_min ≤ η_max < 1 with η_k ∈ [η_min, η_max]. Then the nonmonotone line search condition (1.4) is replaced by

f(x_k + α_k d_k) ≤ C_k + δ α_k g_k^T d_k,  (1.5)

where

Q_{k+1} = η_k Q_k + 1,  C_{k+1} = (η_k Q_k C_k + f_{k+1}) / Q_{k+1}.  (1.6)
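The reference value C_k of ZHANG and Hager is a weighted average of all past function values that can be maintained at O(1) cost per iteration. A minimal sketch of the update (1.6), with illustrative values:

```python
def zhang_hager_update(C_k, Q_k, f_next, eta=0.85):
    """One step of the Zhang-Hager reference-value update (1.6):
    Q_{k+1} = eta_k*Q_k + 1,  C_{k+1} = (eta_k*Q_k*C_k + f_{k+1}) / Q_{k+1}.
    C_k is a weighted average of all past function values; eta_k controls the
    degree of nonmonotonicity (eta_k = 0 recovers the monotone Armijo rule)."""
    Q_next = eta * Q_k + 1.0
    C_next = (eta * Q_k * C_k + f_next) / Q_next
    return C_next, Q_next

# Starting from C_0 = f(x_0) = 10 and Q_0 = 1, a new value f_1 = 8 pulls the
# reference C_1 strictly between 8 and 10.
C1, Q1 = zhang_hager_update(10.0, 1.0, 8.0)
```

Note that no fixed window M appears: the memory of past values decays geometrically at rate η_k.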

Subsequently, HUANG et al. [10] proposed a new nonmonotone line search strategy, which is proved to be an improved version of that of ZHANG and Hager [9].

It is well known that the degree of nonmonotonicity plays an important role in nonmonotone line search techniques. In [9,11], it has been shown that the nonmonotonicity of the algorithm can be controlled. In recent years, some scholars have begun to study how to adaptively control the degree of nonmonotonicity in nonmonotone techniques [12-14].

Inspired by the above discussion, in this paper we propose a new adaptive nonmonotone line search rule based on (1.5) and prove that the new line search rule has some good properties. Based on this new line search rule, an adaptive nonmonotone Newton algorithm is proposed. Under some mild assumptions, the global convergence of the algorithm is obtained, and the numerical results show that the new method is feasible and effective.

The structure of this paper is as follows. In Section 2, a new nonmonotone line search technique is proposed, an adaptive nonmonotone Newton algorithm based on this new line search rule is developed, and some good properties of the new algorithm are given. In Section 3, the global convergence of the new algorithm is proved. In Section 4, several numerical experiments are carried out. Some conclusions are drawn in Section 5.

2. A New Adaptive Nonmonotone Newton Algorithm

In this section, a new nonmonotone line search technique is first introduced. Then, based on the new line search technique, a new adaptive nonmonotone Newton algorithm is proposed.

The new nonmonotone line search technique can be given as follows.

where C_k is calculated by (1.6), m_{k+1} is calculated as follows, and ‖·‖ and ‖·‖_∞ denote the Euclidean and infinity norms of a vector, respectively,

and 0 < b_0 < 1 < b_1.

As mentioned in [13], the gradient norm is related to the form of the objective function. Therefore, in the new nonmonotone line search rule, the value of m_k is adaptively adjusted by the gradient norm. If the gradient norm is large, the first condition in (2.3) is satisfied, as in the case that the iterative sequence follows the bottom of a narrow valley; hence, in order to avoid the iterative sequence creeping along the bottom of a narrow and curved valley, the value of m_k should be increased. If the gradient norm is moderate, the second condition in (2.3) is satisfied, and the value of m_k remains unchanged. If the gradient norm is small, the third condition in (2.3) is satisfied, which indicates that the current iterate is in a flat region; hence, in order to further reduce the function value, the value of m_k should be decreased.
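The three-branch adjustment described above can be sketched as follows. Since the precise rule (2.3) is not reproduced here, the thresholds on the gradient norm, the use of b_0, b_1 as shrink/growth factors, and the floor N are assumptions made purely for illustration (only the values b_0 = 0.5, b_1 = 1.5, N = 1 appear in Section 4):

```python
def update_m(m_k, g_inf_norm, b0=0.5, b1=1.5, low=1e-3, high=1.0, N=1):
    """Adaptive adjustment of the nonmonotonicity parameter m_k by the
    gradient norm, in the spirit of (2.3).  Thresholds low/high and the
    roles of b0, b1, N are illustrative assumptions, not the paper's rule."""
    if g_inf_norm > high:        # large gradient: narrow curved valley -> grow m_k
        return b1 * m_k
    elif g_inf_norm >= low:      # moderate gradient: keep m_k unchanged
        return m_k
    else:                        # small gradient: flat region -> shrink m_k
        return max(b0 * m_k, N)
```

The point of the design is that the degree of nonmonotonicity is no longer a fixed tuning constant but reacts to the local landscape through ‖g_k‖_∞.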

Based on the new line search technique, we propose a new adaptive nonmonotone Newton algorithm, stated as Algorithm 2.1.

Algorithm 2.1 (A new adaptive nonmonotone Newton algorithm)

Remark 2.1  The sufficient descent condition is very important for establishing the global convergence of Algorithm 2.1. Actually, Step 1 of Algorithm 2.1 ensures that there exist two positive constants c_1 and c_2 such that, for k ≥ 0, the search direction d_k satisfies

g_k^T d_k ≤ −c_1‖g_k‖^2,  (2.4)

‖d_k‖ ≤ c_2‖g_k‖.  (2.5)

Now we prove the existence of δ_k in (2.1).

By the mean value theorem, there is a ξ_k ∈ (0,1) such that

As m → ∞, it is obtained that

From 0 < δ_k < 1, it follows that g_k^T d_k ≥ 0. This contradicts the fact that d_k is a descent direction. Therefore, there is a step-size α_k that makes (2.1) hold.

Therefore, under the conditions of Theorem 2.1, there is always a step-size that makes (2.1) hold, which means that Algorithm 2.1 is well defined.

3. The Convergence of Algorithm 2.1

In this section, the global convergence of Algorithm 2.1 is proved. Firstly, the following assumptions are given.

Assumption 3.1  Assume that f(x) is continuously differentiable and bounded below on R^n.

Assumption 3.2  Assume that the gradient g(x) of f(x) is Lipschitz continuous, that is, there is a constant L > 0 such that

‖g(x) − g(y)‖ ≤ L‖x − y‖,  ∀ x, y ∈ R^n.

Before stating the following theorem, for convenience, we set m(k) = min(k, m_k), and let M be the largest positive integer attained by m_k.

Theorem 3.1  Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1 and that g_k^T d_k < 0 for k ≥ 0; then, ∀ l ≥ 1,

Proof  In order to prove (3.1), we first prove that the following inequality holds.

According to the line search condition (2.1) and Theorem 2.1, it follows that

From (3.4), (3.3) holds for j = 1. Suppose that (3.3) holds for 1 ≤ j ≤ M − 1. It follows from the inductive hypothesis and the descent property of d_k that

By (3.5) and (2.1), we obtain

Thus, (3.3) holds for j + 1. By induction, (3.3) holds for 1 ≤ j ≤ M. Therefore, (3.1) holds. Since f(x) is bounded below, it follows that

By summing (3.1) over l, we have

Therefore, (3.2) holds.

Lemma 3.1  Suppose that Assumption 3.2 holds. If α_k is the iterative step-size generated by Algorithm 2.1, then under the conditions (2.4) and (2.5), for k ≥ 0,

Proof  For the obtained step-size α_k, if the trial step-size α_k/ρ does not satisfy (2.1), then

By the mean value theorem, there is a ξ_k ∈ (0,1) such that

We conclude from Assumption 3.2, the Cauchy-Schwarz inequality, (2.5) and (3.7) that

We can obtain from (2.4) and (2.5) that

Therefore (3.6) holds.

Theorem 3.2  Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1. Under the conditions of Assumptions 3.1 and 3.2 together with (2.4) and (2.5), it holds that

Proof  Firstly, let us prove that there exists a constant γ such that

From Algorithm 2.1, we get that α_k ≤ 1. By (1.2) and (2.5), we have that

From Assumption 3.2, it follows that

Setting γ = 1 + Lc_2, we obtain

Thus, (3.10) holds.
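Written out, the chain of inequalities behind (3.10) (using Assumption 3.2, the iteration (1.2), α_k ≤ 1 and the bound ‖d_k‖ ≤ c_2‖g_k‖ from Step 1) is:

```latex
\begin{aligned}
\|g_{k+1}\| &\le \|g_k\| + \|g_{k+1} - g_k\| \\
            &\le \|g_k\| + L\,\|x_{k+1} - x_k\|        && \text{(Assumption 3.2)} \\
            &=   \|g_k\| + L\,\alpha_k\,\|d_k\|        && \text{(by (1.2))} \\
            &\le (1 + L c_2)\,\|g_k\| = \gamma\,\|g_k\|. && (\alpha_k \le 1)
\end{aligned}
```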

From (3.1), (3.6) and (2.4), (2.5), it follows that

In view of (3.12), we get

Therefore, (3.9) holds.

We now replace (2.4) and (2.5) with a milder Assumption 3.3 and establish the global convergence of Algorithm 2.1 under it.

Assumption 3.3  Assume that there exist three positive constants c_1, τ_1 and τ_2 such that, for all k,

Theorem 3.3  Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1. Under Assumptions 3.1, 3.2 and 3.3, it holds that

Proof  Suppose that (3.15) does not hold. Then there exists a constant β > 0 such that, for each k,

By Assumption 3.3, we can get

Since (2.4) holds under Assumption 3.3, (3.8) still holds under Assumption 3.3. Therefore, for all k,

Then, it follows from Assumption 3.3 and (3.17) that

This contradicts (3.2); therefore (3.15) holds.

4. The Numerical Experiments

In this section, the numerical efficiency of the proposed nonmonotone line search is tested. We select 24 unconstrained optimization test problems from [15] for the numerical experiments and compare our algorithm with algorithms based on different line search rules. All the numerical experiments are carried out on a personal computer with the Windows 10 operating system, an Intel(R) Core(TM) i5-7200U CPU @ 2.50GHz 2.71GHz and 12GB of memory, and are implemented in MATLAB 9.8.0.1323502 (R2020a). In the new algorithm, we set c_1 = 10^{-5}, c_2 = 10^{5}, b_0 = 0.5, b_1 = 1.5.

In order to show the numerical efficiency of the new algorithm, we mainly compare the following four methods:

M1: the search direction in Algorithm 2.1 combined with the Armijo line search in [1];

M2: the search direction in Algorithm 2.1 combined with the nonmonotone line search in [7];

M3: the search direction in Algorithm 2.1 combined with the nonmonotone line search in [9];

M4: Algorithm 2.1.

In our implementation, the initial trial step-size of all algorithms is set to 1. All algorithms stop when the termination condition ‖g_k‖ ≤ 10^{-8} is satisfied or the number of iterations exceeds 10,000. The parameters in all algorithms are selected as follows:

M1: δ = 10^{-3}, ρ = 0.5;

M2: δ = 10^{-3}, ρ = 0.5, M = 10;

M3: δ = 10^{-3}, η_k = 0.85, ρ = 0.5;

M4: δ_k = 10^{-3}, η_k = 0.85, ρ = 0.5, N = 1.
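For reference, the experimental loop for a method of type M1 can be sketched as follows. The steepest-descent direction −g(x) is used here purely to keep the sketch self-contained (Algorithm 2.1 itself uses a Newton-type direction), and the test function is an illustrative choice:

```python
import numpy as np

def minimize_armijo(f, g, x0, delta=1e-3, rho=0.5, tol=1e-8, max_iter=10_000):
    """Skeleton of an M1-type driver: a descent direction combined with Armijo
    backtracking, stopped when ||g_k|| <= tol or after max_iter iterations.
    The Newton-type direction of Algorithm 2.1 is replaced by -g(x) here
    only to keep the example self-contained."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        gk = g(x)
        if np.linalg.norm(gk) <= tol:
            return x, k
        d = -gk                                   # illustrative descent direction
        fx, slope, alpha = f(x), gk @ d, 1.0
        while f(x + alpha * d) > fx + delta * alpha * slope:
            alpha *= rho                          # backtrack by the factor rho
        x = x + alpha * d
    return x, max_iter

# Illustrative run on the strictly convex quadratic f(x) = ||x||^2.
x_star, iters = minimize_armijo(lambda x: float(x @ x), lambda x: 2.0 * x, [3.0, -4.0])
```

Replacing the Armijo test inside the while-loop with the condition against the max of recent values (M2) or against C_k (M3, M4) yields the other three methods compared in Tab.4.1.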

The results of the numerical experiments are shown in Tab.4.1, where the following notations are used:

Tab.4.1 Numerical performance of the four methods

Problem: the name of the test problem from [15];

Dim: the dimension of the problem;

Ni: the number of iterations;

Ng: the number of gradient calculations;

Mi: the method Mi (i = 1, 2, 3, 4).

From the number of iterations and the number of gradient calculations of each algorithm in Tab.4.1, it can be seen that:

There is one problem where M1 is superior to M4 and eight problems where M4 is superior to M1.

There are no problems where M2 is superior to M4 and four problems where M4 is superior to M2.

There are no problems where M3 is superior to M4 and three problems where M4 is superior to M3.

Therefore, the numerical results in Tab.4.1 show that the new nonmonotone line search algorithm is feasible and effective.

5. Conclusion

Based on the nonmonotone line search technique proposed by ZHANG and Hager [9], a new adaptive nonmonotone line search rule is proposed, which adaptively adjusts the value of m_k through the gradient norm at each iteration. A new adaptive nonmonotone Newton algorithm is developed by utilizing the proposed nonmonotone line search rule, and the global convergence of the proposed algorithm is proved. Finally, 24 unconstrained optimization problems are numerically tested, and the new method is compared with three existing classical line search algorithms. The numerical results show that the new method is feasible and effective.