Solving a Large-Scale Nonlinear System of Monotone Equations by Using a Projection Technique
Related papers
Solving Systems of Nonlinear Monotone Equations by Using a New Projection Approach
IOP Publishing, 2021
The projection technique is a well-known and highly useful method for solving optimization problems and nonlinear systems of equations. In this work, a new projection approach for solving systems of nonlinear monotone equations is proposed, combined with conjugate gradient directions because of their low storage requirements. The new algorithm can be applied to large-scale nonlinear systems of equations and satisfies the sufficient descent condition. It generates an appropriate direction and then employs a line search along this direction to reach a new point. If this point solves the problem, the algorithm stops; otherwise, it constructs a suitable hyperplane that strictly separates the current point from the solution set. The next iterate is obtained by projecting the new point onto this separating hyperplane. We prove that the line search of the new projection algorithm is well defined, and we establish global convergence under some mild conditions. Numerical experiments indicate that the new method is effective and efficient.
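The iterate-then-project loop described in this abstract (generate a direction, backtrack along it, and project onto the separating hyperplane) can be sketched as follows. This is a minimal illustration of the generic Solodov–Svaiter hyperplane-projection framework, not the specific algorithm of the paper: the steepest-descent-like direction `d = -F(x)` and the parameter values are placeholder assumptions, where the papers listed here would substitute conjugate-gradient-type directions.

```python
import numpy as np

def projection_method(F, x0, sigma=1e-4, beta=0.5, tol=1e-6, max_iter=500):
    """Hyperplane-projection sketch for a monotone system F(x) = 0.

    Assumptions (not from the paper): direction d = -F(x), backtracking
    factor beta, and line-search constant sigma.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x                      # current point (approximately) solves the system
        d = -Fx                           # placeholder direction; CG variants differ here
        # Backtracking line search: find t with -F(x + t*d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= beta
        z = x + t * d                     # trial point
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # By monotonicity, the hyperplane {y : F(z)^T (y - z) = 0} strictly
        # separates x from the solution set; project x onto it.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x
```

For a trivially monotone test function such as `F(x) = x`, the iterates contract geometrically toward the unique zero.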
Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations
Numerical Algorithms, 2013
This study proposes two derivative-free approaches for solving systems of large-scale nonlinear equations, where the underlying functions of the systems are continuous and satisfy a monotonicity condition. First, the framework generates a specific direction and then employs a backtracking line search along this direction to construct a new point. If the new point solves the problem, the process stops. Otherwise, the projection technique constructs an appropriate hyperplane strictly separating the current iterate from the solutions of the problem, and the projection of the new point onto this hyperplane determines the next iterate. Thanks to the low memory requirements of derivative-free conjugate gradient approaches, this work takes advantage of two new derivative-free conjugate gradient directions. Under appropriate conditions, the global convergence of the recommended procedures is established. Preliminary numerical results indicate that the proposed approaches are interesting and remarkably promising.
A Projection Hestenes-Stiefel-Like Method for Monotone Nonlinear Equations with Convex Constraints
2018
The Hestenes-Stiefel (HS) conjugate gradient (CG) method is generally regarded as one of the most efficient methods for large-scale unconstrained optimization problems. In this paper, we extend a modified Hestenes-Stiefel conjugate gradient method based on the projection technique and present a new projection method for solving nonlinear monotone equations with convex constraints. The search direction obtained satisfies the sufficient descent condition. Because it is derivative-free, the method can also be applied to nonsmooth monotone problems. Under appropriate assumptions, the method is shown to be globally convergent. Preliminary numerical results show that the proposed method works well and is very efficient.
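For reference, the classical HS coefficient that this abstract's modified method starts from can be written down in a few lines. Note this is only the textbook formula, with the residual `F` playing the role of the gradient as is usual in derivative-free CG methods; the paper's modified coefficient is not reproduced here.

```python
import numpy as np

def beta_hs(F_new, F_old, d_old):
    """Classical Hestenes-Stiefel coefficient (textbook version only):
    beta = F_k^T y_{k-1} / (d_{k-1}^T y_{k-1}), with y_{k-1} = F_k - F_{k-1}.

    The cited paper uses a *modified* HS formula; this function is
    illustrative, not the paper's method.
    """
    y = F_new - F_old
    return (F_new @ y) / (d_old @ y)
```

The new search direction would then be `d_new = -F_new + beta * d_old`, as in standard CG schemes.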
FR type methods for systems of large-scale nonlinear monotone equations
A large class of iterative methods for solving nonlinear monotone systems has been developed in recent years. In this paper we propose some new FR-type directions within an algorithmic framework that combines the conjugate gradient approach with the hyperplane projection technique. A derivative-free, function-value-based line search combined with the projection procedure is used as the globalization strategy. The numerical performance of methods with different search directions is compared.
A method with inertial extrapolation step for convex constrained monotone equations
Journal of Inequalities and Applications
In recent times, various algorithms have been incorporated with the inertial extrapolation step to speed up the convergence of the sequence generated by these algorithms. As far as we know, very few results exist regarding algorithms of the inertial derivative-free projection method for solving convex constrained monotone nonlinear equations. In this article, the convergence analysis of a derivative-free iterative algorithm (Liu and Feng in Numer. Algorithms 82(1):245–262, 2019) with an inertial extrapolation step for solving large scale convex constrained monotone nonlinear equations is studied. The proposed method generates a sufficient descent direction at each iteration. Under some mild assumptions, the global convergence of the sequence generated by the proposed method is established. Furthermore, some experimental results are presented to support the theoretical analysis of the proposed method.
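The inertial extrapolation step this abstract refers to is itself simple to state. The sketch below is a generic illustration, not the cited algorithm: the constant `theta` is an assumption, whereas inertial methods typically update it per iteration under conditions that guarantee convergence.

```python
import numpy as np

def inertial_step(x_curr, x_prev, theta=0.5):
    """Inertial extrapolation: w_k = x_k + theta * (x_k - x_{k-1}).

    The extrapolated point w_k, rather than x_k itself, is then fed to
    the derivative-free projection step. A fixed theta is an assumption
    here, not the cited paper's update rule.
    """
    return x_curr + theta * (x_curr - x_prev)
```

Intuitively, the step adds momentum in the direction the iterates are already moving, which is what speeds up the convergence of the generated sequence.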
A New Hybrid Approach for Solving Large-scale Monotone Nonlinear Equations
Journal of Mathematical and Fundamental Sciences
In this paper, a new hybrid conjugate gradient method for solving monotone nonlinear equations is introduced. The scheme is a combination of the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) conjugate gradient methods with the Solodov and Svaiter projection strategy. Under suitable assumptions, the global convergence of the scheme with a monotone line search is established. Lastly, numerical experiments were used to demonstrate the suitability of the proposed scheme for large-scale problems.
A descent derivative-free algorithm for nonlinear monotone equations with convex constraints
RAIRO - Operations Research
In this paper, we present a derivative-free algorithm for nonlinear monotone equations with convex constraints. The search direction is a product of a positive parameter and the negation of a residual vector. At each iteration step, the algorithm generates a descent direction independent of the line search used. Under appropriate assumptions, the global convergence of the algorithm is given. Numerical experiments show that the algorithm has advantages over the recently proposed algorithms by Gao and He (Calcolo 55 (2018) 53) and Liu and Li (Comput. Math. Appl. 70 (2015) 2442–2453).
Two derivative‐free algorithms for constrained nonlinear monotone equations
Computational and Mathematical Methods, 2021
We propose two positive parameters based on the choice of the Birgin and Martínez search direction. Using the two classical choices of the Barzilai-Borwein parameters, the two positive parameters were derived by minimizing, in the Frobenius norm, the distance between the matrix corresponding to the proposed search direction and the scaled memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) matrix. Moreover, the resulting direction is descent independent of any line search condition. We establish the global convergence of the proposed algorithm under appropriate assumptions. In addition, numerical experiments on some benchmark test problems are reported in order to show the efficiency of the proposed algorithm.
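The two classical Barzilai-Borwein choices mentioned in this abstract have closed forms, shown below with the residual difference `y = F(x_k) - F(x_{k-1})` replacing the gradient difference, as is standard for monotone equations. The derived parameters of the cited paper are modifications of these; only the classical formulas are sketched here.

```python
import numpy as np

def bb_parameters(s, y):
    """The two classical Barzilai-Borwein step lengths.

    s = x_k - x_{k-1},  y = F(x_k) - F(x_{k-1}).
    BB1 (long):  s^T s / s^T y
    BB2 (short): s^T y / y^T y
    For monotone F, s^T y >= 0, so both are nonnegative when defined.
    """
    alpha_long = (s @ s) / (s @ y)    # BB1
    alpha_short = (s @ y) / (y @ y)   # BB2
    return alpha_long, alpha_short
```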
In this paper, based on the projection technique of Solodov and Svaiter (Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods; 355–369, 1995), we propose an improved descent three-term derivative-free approach for solving nonlinear monotone equations with convex constraints. The algorithm combines the spectral gradient parameter with a new PRP-like CG coefficient in the search direction. The global convergence of the proposed approach is established under standard conditions. Additionally, on some benchmark problems, the numerical results highlight the outstanding performance of this approach compared with popular conjugate gradient methods. The experiments also show its effectiveness in solving large-scale nonlinear equations with convex constraints.