

        A New Adaptive Nonmonotone Newton Algorithm

        2023-06-29

        Applied Mathematics (應用數學), 2023, No. 3

        YUAN Liuyang (袁柳洋), JIN Huihui (晉慧慧), WAN Zhongping (萬仲平)

        (1. College of Science, Wuhan University of Science and Technology, Wuhan 430065, China; 2. Hubei Province Key Laboratory of Systems Science in Metallurgical Process, Wuhan University of Science and Technology, Wuhan 430081, China; 3. School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China)

        Abstract: In this paper, a new adaptive nonmonotone line search technique is proposed for unconstrained optimization. Based on the new nonmonotone line search technique, an adaptive nonmonotone Newton algorithm is developed. Under suitable assumptions, the global convergence of the new algorithm is proved. The numerical results show the feasibility and efficiency of the proposed algorithm.

        Key words: Nonmonotone line search; Adaptive; Global convergence

        1.Introduction

        In this paper, the following unconstrained optimization problem is considered:

        min_{x∈R^n} f(x),    (1.1)

        where f(x): R^n → R is a continuously differentiable function. Let g = ∇f(x) denote the gradient of f(x).

        For solving problem (1.1), there are many iterative methods. The general iterative scheme has the following form:

        x_{k+1} = x_k + α_k d_k,    (1.2)

        where x_k is the current iterate, d_k is the search direction of the k-th iteration, and α_k is the step-size of the k-th iteration along that direction.

        For a given descent direction d_k, in order to obtain the step-size α_k along d_k, there are mainly two kinds of line search rules: the exact line search and the inexact one. However, the exact line search requires α_k = arg min_{α>0} f(x_k + α d_k), which is difficult to realize in the actual calculation process. The inexact line search finds a step-size close to the optimal one by some inexact line search criteria, so that an adequate reduction in f is achieved at less cost. Therefore, more and more researchers have begun to study the inexact line search. The most famous inexact line search is the Armijo line search [1], and its form is as follows:

        f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,    (1.3)

        where δ ∈ (0,1) and g_k = ∇f(x_k).
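As a concrete illustration, the Armijo rule (1.3) amounts to a simple backtracking loop. The sketch below is a minimal Python version, using the values δ = 10^{-3} and ρ = 0.5 chosen in Section 4; the function names are illustrative, not from the paper.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, delta=1e-3, rho=0.5, max_backtracks=50):
    """Armijo rule (1.3): shrink alpha by rho until
    f(x + alpha*d) <= f(x) + delta*alpha*g^T d holds."""
    g = grad(x)
    fx = f(x)
    slope = delta * g.dot(d)   # delta * g_k^T d_k (negative for a descent direction)
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + alpha * slope:
            break
        alpha *= rho
    return alpha
```

For example, for f(x) = ‖x‖² at x = (2) with d = -∇f(x), the loop rejects α = 1 and accepts α = 0.5.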

        Inexact line searches include monotone and nonmonotone line searches. The Armijo line search is a monotone line search: since d_k is a descent direction, the objective function value decreases monotonically at each iteration [2]. Because the monotone line search rule forces the objective function to decrease strictly in the iterative process, when the objective function has narrow and curved canyons in the feasible region, the iteration steps become very small and even a sawtooth phenomenon occurs, so the numerical efficiency of the algorithm is greatly reduced. In order to avoid such problems, nonmonotone line search is often adopted, and the numerical results in [3-7] show that nonmonotone line search can improve the likelihood of finding a global optimal solution and the convergence speed.

        The nonmonotone line search strategy was first proposed by Grippo et al. [7] Their approach is roughly as follows: let M be a fixed nonnegative integer and let δ, ρ be constants in (0,1). Take an initial point x_0 and choose α_k = ᾱ_k ρ^{h_k} satisfying the following condition:

        f(x_k + α_k d_k) ≤ max_{0≤j≤m(k)} f(x_{k-j}) + δ α_k g_k^T d_k,    (1.4)

        where 0 ≤ m(k) ≤ min{m(k-1)+1, M}, m(0) = 0, ᾱ_k is the prior trial step-size, and h_k is the smallest nonnegative integer that makes (1.4) hold.
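The Grippo-type condition (1.4) compares the new function value against the maximum over a sliding window of recent values rather than against f(x_k) alone. A minimal sketch (illustrative names; the memory window is kept in a deque of length at most M):

```python
import numpy as np
from collections import deque

def grippo_nonmonotone_search(f, grad, x, d, history, delta=1e-3, rho=0.5,
                              max_backtracks=50):
    """Nonmonotone (Grippo-type) backtracking as in (1.4): accept alpha once
    f(x + alpha*d) <= max(recent f-values) + delta*alpha*g^T d.
    `history` is a deque(maxlen=M) of recent objective values."""
    g = grad(x)
    slope = delta * g.dot(d)
    f_ref = max(history)              # reference value over the memory window
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + alpha * slope:
            break
        alpha *= rho
    x_new = x + alpha * d
    history.append(f(x_new))          # deque(maxlen=M) discards the oldest value
    return alpha, x_new
```

With a window of one value this reduces to the Armijo rule; a larger M lets the iterates temporarily increase f, which helps in narrow curved valleys.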

        Although the nonmonotone line search technique in (1.4) has many advantages, it cannot fully utilize the value of the current objective function. In addition, [7-8] pointed out that the numerical efficiency of the algorithm depends on the selection of the memory length M. To overcome these disadvantages, ZHANG and Hager [9] developed another nonmonotone line search technique. Specifically, let f_k = f(x_k), C_0 = f(x_0), Q_0 = 1, δ ∈ (0,1), and 0 ≤ η_min ≤ η_max < 1 with η_k ∈ [η_min, η_max]. Then the nonmonotone line search condition (1.4) is replaced by

        f(x_k + α_k d_k) ≤ C_k + δ α_k g_k^T d_k,    (1.5)

        where

        Q_{k+1} = η_k Q_k + 1,  C_{k+1} = (η_k Q_k C_k + f_{k+1}) / Q_{k+1}.    (1.6)

        Subsequently, HUANG et al. [10] proposed a new nonmonotone line search strategy, which is proved to be an improved version of that of ZHANG and Hager [9].
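The reference value C_k in (1.5) is updated by the averaging formulas in (1.6); a direct transcription in Python (η_k = 0.85 as used in Section 4):

```python
def zhang_hager_update(C_k, Q_k, f_next, eta_k=0.85):
    """Zhang-Hager averaging (1.6): Q_{k+1} = eta_k*Q_k + 1 and
    C_{k+1} = (eta_k*Q_k*C_k + f_{k+1}) / Q_{k+1}.
    C_{k+1} is a convex combination of all function values seen so far;
    eta_k = 0 recovers the monotone reference C_{k+1} = f_{k+1}."""
    Q_next = eta_k * Q_k + 1.0
    C_next = (eta_k * Q_k * C_k + f_next) / Q_next
    return C_next, Q_next
```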

        It is well known that the degree of nonmonotonicity plays an important role in nonmonotone line search techniques. In [9,11], it has been shown that the nonmonotonicity of the algorithm can be controlled. In recent years, some scholars have begun to study how to adaptively control the degree of nonmonotonicity in nonmonotone techniques [12-14].

        Inspired by the above discussion, in this paper we propose a new adaptive nonmonotone line search rule based on (1.5) and prove that the new line search rule has some good properties. Based on this new line search rule, an adaptive nonmonotone Newton algorithm is proposed. Under some mild assumptions, the global convergence of the algorithm is obtained, and the numerical results show that the new method is feasible and effective.

        The remainder of this paper is organized as follows. In Section 2, a new nonmonotone line search technique is proposed; based on this new line search rule, an adaptive nonmonotone Newton algorithm is developed, and some good properties of the new algorithm are given. In Section 3, the global convergence of the new algorithm is proved. In Section 4, several numerical experiments are carried out. Some conclusions are drawn in Section 5.

        2.A New Adaptive Nonmonotone Newton Algorithm

        In this section, a new nonmonotone line search technique is first introduced. Then, based on the new line search technique, a new adaptive nonmonotone Newton algorithm is proposed.

        The new nonmonotone line search technique can be given as follows.

        where C_k is calculated by (1.6), m_{k+1} is calculated as follows, and ‖·‖ and ‖·‖_∞ represent the Euclidean and infinity norms of a vector, respectively:

        and 0

        As mentioned in [13], the gradient norm is related to the shape of the objective function. Therefore, in the new nonmonotone line search rule, the value of m_k is adaptively adjusted by the gradient norm. If the gradient norm is large, the first condition in (2.3) is satisfied, just as in the case that the iterative sequence follows the bottom of a narrow valley; in order to avoid the iterative sequence creeping along the bottom of a narrow and curved valley, the value of m_k should be increased. If the gradient norm is moderate, the second condition in (2.3) is satisfied, and the value of m_k remains unchanged. If the gradient norm is small, the third condition in (2.3) is satisfied, which indicates that the current iterate is in a flat region; in order to further reduce the function value, the value of m_k should be decreased.
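The exact thresholds and update factors in (2.3) are not legible in this copy; the sketch below only illustrates the three-case idea, with hypothetical thresholds tau_low and tau_high, bounds m_min and m_max, and shrink/grow factors b0 = 0.5 and b1 = 1.5 as set in Section 4.

```python
def update_memory(m_k, grad_norm, tau_low=1e-2, tau_high=1.0,
                  b0=0.5, b1=1.5, m_min=1, m_max=50):
    """Three-case adaptive adjustment of the nonmonotonicity degree m_k by the
    gradient norm, in the spirit of (2.3). tau_low, tau_high, m_min and m_max
    are illustrative assumptions; b0 < 1 < b1 shrink or grow the memory."""
    if grad_norm > tau_high:          # steep region: increase nonmonotonicity
        m_next = b1 * m_k
    elif grad_norm >= tau_low:        # moderate gradient: keep m_k unchanged
        m_next = m_k
    else:                             # flat region: decrease nonmonotonicity
        m_next = b0 * m_k
    return min(max(m_next, m_min), m_max)
```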

        Based on the new line search technique, we propose a new adaptive nonmonotone Newton algorithm, stated as Algorithm 2.1.

        Algorithm 2.1(A new adaptive nonmonotone Newton algorithm)

        Remark 2.1 The sufficient descent condition is very important for establishing the global convergence of Algorithm 2.1. Actually, Step 1 of Algorithm 2.1 ensures that there exist two positive constants c_1 and c_2 such that for k ≥ 0 the search direction d_k satisfies:

        -g_k^T d_k ≥ c_1 ‖g_k‖²,    (2.4)

        ‖d_k‖ ≤ c_2 ‖g_k‖.    (2.5)
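Putting the pieces together, Algorithm 2.1 can be sketched as a Newton iteration guarded by a nonmonotone acceptance test. The sketch below uses the Zhang-Hager average C_k from (1.6) for the acceptance test and falls back to steepest descent whenever the Newton direction violates (2.4) or (2.5); both choices are assumptions for illustration, not necessarily the exact safeguards of Algorithm 2.1, and the adaptive m_k update of (2.3) is omitted.

```python
import numpy as np

def adaptive_newton(f, grad, hess, x0, delta=1e-3, rho=0.5, eta=0.85,
                    c1=1e-5, c2=1e5, tol=1e-8, max_iter=10000):
    """Newton method with a nonmonotone acceptance test
    f(x + alpha*d) <= C_k + delta*alpha*g^T d, in the spirit of Algorithm 2.1."""
    x = np.asarray(x0, dtype=float)
    C, Q = f(x), 1.0
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        try:
            d = np.linalg.solve(hess(x), -g)          # Newton direction
        except np.linalg.LinAlgError:
            d = -g
        # enforce (2.4)-(2.5): -g^T d >= c1*||g||^2 and ||d|| <= c2*||g||
        if -g.dot(d) < c1 * gnorm**2 or np.linalg.norm(d) > c2 * gnorm:
            d = -g                                    # steepest-descent fallback
        alpha, bt = 1.0, 0
        while f(x + alpha * d) > C + delta * alpha * g.dot(d) and bt < 60:
            alpha *= rho                              # backtrack
            bt += 1
        x = x + alpha * d
        Q_new = eta * Q + 1.0                         # averaging as in (1.6)
        C = (eta * Q * C + f(x)) / Q_new
        Q = Q_new
    return x
```

On a strictly convex quadratic, the first full Newton step is accepted and the loop terminates at the minimizer.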

        Now we prove the existence of a step-size satisfying (2.1).

        By the mean-value theorem, there is a ξ_k ∈ (0,1) such that

        Asm →∞,it is obtained that

        From 0 < δ_k < 1, it follows that g_k^T d_k ≥ 0. This contradicts the fact that d_k is a descent direction. Therefore, there is a step-size α_k that makes (2.1) hold.

        Therefore, under the conditions of Theorem 2.1, there is always a step-size that makes (2.1) hold, which means that Algorithm 2.1 is well defined.

        3.The Convergence of Algorithm 2.1

        In this section,the global convergence of Algorithm 2.1 is proved.Firstly,the following assumptions are given.

        Assumption 3.1 Assume that f(x) is continuously differentiable and bounded below on R^n.

        Assumption 3.2 Assume that the gradient g(x) of f(x) is Lipschitz continuous, that is, there is a constant L > 0 such that

        ‖g(x) - g(y)‖ ≤ L‖x - y‖,  ∀ x, y ∈ R^n.

        Before stating the following theorem, for convenience, we set m̂(k) = min(k, m_k), and let M be the largest positive integer attained by m_k.

        Theorem 3.1 Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1 and that d_k is a descent direction for k ≥ 0. Then for all l ≥ 1,

        ProofIn order to prove (3.1),we first prove that the following inequality holds.

        According to the line search condition (2.1) and Theorem 2.1, it follows that

        From (3.4), (3.3) holds for j = 1. Suppose that (3.3) holds for 1 ≤ j ≤ M - 1. It follows from the inductive hypothesis and the descent property of d_k that

        By (3.5) and (2.1), we obtain

        Thus, (3.3) holds for j + 1. By induction, (3.3) holds for 1 ≤ j ≤ M. Therefore, (3.1) holds. Since f(x) is bounded below, it follows that

        By summing (3.1) over l, we have

        Therefore,(3.2) holds.

        Lemma 3.1 Suppose that Assumption 3.2 holds. If α_k is the step-size generated by Algorithm 2.1, then under the conditions (2.4) and (2.5), for k ≥ 0,

        Proof For the obtained step-size α_k, if the trial step α_k/ρ does not satisfy (2.1), then

        By the mean-value theorem, there is a ξ_k ∈ (0,1) such that

        We conclude from Assumption 3.2, the Cauchy-Schwarz inequality, (2.5) and (3.7) that

        We can obtain from (2.4) and (2.5) that

        Therefore (3.6) holds.
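The explicit lower bound in (3.6) did not survive extraction. Under Assumption 3.2 and the conditions (2.4)-(2.5), the standard backtracking argument yields a bound of the following form (a hedged reconstruction, not necessarily the paper's exact constant):

```latex
% If the trial step \alpha_k/\rho violates (2.1), then since C_k \ge f_k,
%   f(x_k) + \tfrac{\alpha_k}{\rho} g_k^T d_k
%     + \tfrac{L}{2}\bigl(\tfrac{\alpha_k}{\rho}\bigr)^{2} \|d_k\|^{2}
%   > f(x_k) + \delta_k \tfrac{\alpha_k}{\rho}\, g_k^T d_k ,
% and rearranging with (2.4)-(2.5) gives
\alpha_k \;\ge\; \min\!\left\{\,1,\; \frac{2\rho\,(1-\delta_k)\,c_1}{L\,c_2^{2}}\right\}.
```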

        Theorem 3.2 Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1. Under Assumptions 3.1 and 3.2 and the conditions (2.4) and (2.5), it holds that

        Proof Firstly, let us prove that there exists a constant γ such that

        From Algorithm 2.1, we get that α_k ≤ 1. By (1.2) and (2.5), we have that

        From Assumption 3.2, it follows that

        Setting γ = 1 + Lc_2, we obtain

        Thus,(3.10) holds.
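The chain of inequalities establishing (3.10) did not survive extraction; it can be reconstructed from (1.2), (2.5), α_k ≤ 1 and Assumption 3.2 as follows:

```latex
\|g_{k+1}\| \le \|g_k\| + \|g_{k+1} - g_k\|
            \le \|g_k\| + L\,\|x_{k+1} - x_k\|      % Assumption 3.2
            =   \|g_k\| + L\,\alpha_k \|d_k\|       % by (1.2)
            \le (1 + L c_2)\,\|g_k\|                % \alpha_k \le 1 and (2.5)
            =   \gamma\,\|g_k\| .
```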

        From (3.1), (3.6), (2.4) and (2.5), it follows that

        In view of (3.12),we get

        Therefore,(3.9) holds.

        We will replace (2.4) and (2.5) with a milder Assumption 3.3 and establish the global convergence of Algorithm 2.1.

        Assumption 3.3 Assume that there exist three positive constants c_1, τ_1 and τ_2 such that for all k,

        Theorem 3.3 Suppose that {x_k} is the iterative sequence generated by Algorithm 2.1. Under Assumptions 3.1, 3.2 and 3.3, it holds that

        Proof Suppose that (3.15) does not hold. Then there exists a constant β > 0 such that for each k,

        By Assumption 3.3, we obtain

        Since (2.4) holds under Assumption 3.3, the bound (3.8) remains valid. Therefore, for all k,

        Then,it follows from Assumption 3.3 and (3.17) that

        This contradicts (3.2); therefore (3.15) holds.

        4.The Numerical Experiments

        In this section, the numerical efficiency of the proposed nonmonotone line search is tested. We select 24 unconstrained optimization test problems from [15] for numerical experiments and compare our algorithm with other algorithms using different line search rules. All numerical experiments are carried out on a personal computer with the Windows 10 operating system, an Intel(R) Core(TM) i5-7200U CPU @ 2.50 GHz and 12 GB of memory, and are implemented in MATLAB 9.8.0.1323502 (R2020a). In the new algorithm, we set c_1 = 10^{-5}, c_2 = 10^5, b_0 = 0.5, b_1 = 1.5.

        In order to show the numerical efficiency of the new algorithm, we mainly compare the following four methods:

        M1: the search direction in Algorithm 2.1 combined with the Armijo line search in [1];

        M2: the search direction in Algorithm 2.1 combined with the nonmonotone line search in [7];

        M3: the search direction in Algorithm 2.1 combined with the nonmonotone line search in [9];

        M4: Algorithm 2.1.

        In our implementation, the initial trial step-size of all algorithms is set to 1. All algorithms stop if the termination condition ‖g_k‖ ≤ 10^{-8} is satisfied or the number of iterations exceeds 10,000. The parameters of the algorithms are selected as follows:

        M1: δ = 10^{-3}, ρ = 0.5;

        M2: δ = 10^{-3}, ρ = 0.5, M = 10;

        M3: δ = 10^{-3}, η_k = 0.85, ρ = 0.5;

        M4: δ_k = 10^{-3}, η_k = 0.85, ρ = 0.5, N = 1.

        The numerical results are shown in Tab. 4.1, where the following notation is used:

        Tab.4.1 Numerical performance of the four methods

        Problem: the name of the test problem from [15];

        Dim: the dimension of the problem;

        Ni: the number of iterations;

        Ng: the number of gradient calculations;

        Mi: method i (i = 1, 2, 3, 4).

        From the number of iterations and the number of gradient calculations of each algorithm in Tab. 4.1, it can be seen that:

        There is one problem where M1 is superior to M4 and eight problems where M4 is superior to M1.

        There are no problems where M2 is superior to M4 and four problems where M4 is superior to M2.

        There are no problems where M3 is superior to M4 and three problems where M4 is superior to M3.

        Therefore,the numerical results in Tab.4.1 show that the new nonmonotone line search algorithm is feasible and effective.

        5.Conclusion

        Based on the nonmonotone line search technique proposed by ZHANG and Hager [9], a new adaptive nonmonotone line search rule is proposed, which adaptively adjusts the value of m_k through the gradient norm at each iteration. A new adaptive nonmonotone Newton algorithm is developed by utilizing the proposed nonmonotone line search rule, and the global convergence of the proposed algorithm is proved. Finally, 24 unconstrained optimization problems are numerically tested, and the new method is compared with three existing classical line search algorithms. The numerical results show that the new method is feasible and effective.
