Linearly Convergent First-Order Algorithms for Semidefinite Programming

First and second-order methods for semidefinite programming

Mathematical Programming, 2003

In this paper, we survey the most recent methods that have been developed for the solution of semidefinite programs. We first concentrate on the methods that have been primarily motivated by the interior point (IP) algorithms for linear programming, placing special emphasis on the class of primal-dual path-following algorithms. We also survey methods that have been developed for solving large-scale SDP problems. These include first-order nonlinear programming (NLP) methods and more specialized path-following IP methods which use the (preconditioned) conjugate gradient or residual scheme to compute the Newton direction and the notion of matrix completion to exploit data sparsity.

ON FORMULATING SEMIDEFINITE PROGRAMMING PROBLEMS AS SMOOTH CONVEX NONLINEAR OPTIMIZATION PROBLEMS

2000

Consider the diagonal entries d_j, j = 1, 2, ..., n, of the matrix D in an LDL^T factorization of an n × n matrix X. As a function of X, each d_j is well-defined on the closed domain of positive semidefinite matrices. We show that these functions are twice continuously differentiable and concave throughout the interior of this domain. Using these facts, we show how to formulate semidefinite programming problems as standard convex optimization problems that can be solved using an interior-point method for nonlinear programming.
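
A small numerical illustration of this construction (a sketch under our own assumptions, not code from the paper): for a positive definite X, the diagonal d_j of D in X = L D L^T can be read off from the Cholesky factor, and positive semidefiniteness of X corresponds to the constraints d_j(X) >= 0 for all j. The helper name and the random test matrix below are hypothetical.

    # Sketch: LDL^T diagonal entries d_j of a positive definite matrix X,
    # obtained from the Cholesky factor (d_j = C[j, j]^2 when X = C C^T).
    import numpy as np

    def ldl_diagonal(X):
        """Diagonal of D in X = L D L^T with L unit lower triangular."""
        C = np.linalg.cholesky(X)        # requires X positive definite
        return np.diag(C) ** 2

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    X = A @ A.T + 5.0 * np.eye(5)        # a random positive definite test matrix
    d = ldl_diagonal(X)
    print(d, bool(np.all(d > 0)))        # all d_j > 0, consistent with X being positive definite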

Sequential constant rank constraint qualifications for nonlinear semidefinite programming with applications

2021

We present new constraint qualification conditions for nonlinear semidefinite programming that extend some of the constant rank-type conditions from nonlinear programming. As an application of these conditions, we provide a unified global convergence proof for a class of algorithms to stationary points, without assuming either uniqueness of the Lagrange multiplier or boundedness of the set of Lagrange multipliers. This class of algorithms includes, for instance, general forms of augmented Lagrangian, sequential quadratic programming, and interior point methods. We also compare these new conditions with some of the existing ones, including the nondegeneracy condition, Robinson's constraint qualification, and the metric subregularity constraint qualification.

Interior-point algorithms for semidefinite programming based on a nonlinear formulation

2002

Recently, the authors of this paper introduced a nonlinear transformation to convert the positive definiteness constraint on an n × n matrix function of a certain form into the positivity constraint on n scalar variables while keeping the number of variables unchanged [4]. Based on this transformation, they proposed interior point algorithms for solving a special class of linear semidefinite programs [5]. In this paper, we extend this approach and apply the transformation to general linear semidefinite programs, producing nonlinear programs that have not only the n positivity constraints, but also n additional nonlinear inequality constraints. Despite this complication, the transformed problems still retain most of the desirable properties. We propose interior-point algorithms for this type of nonlinear program and establish their global convergence.

On the Local Convergence of a Predictor-Corrector Method for Semidefinite Programming

SIAM Journal on Optimization, 1999

We study the local convergence of a predictor-corrector algorithm for semidefinite programming problems based on the Monteiro-Zhang unified direction whose polynomial convergence was recently established by Monteiro. We prove that the sufficient condition for superlinear convergence of Potra and Sheng applies to this algorithm and is independent of the scaling matrices. Under strict complementarity and nondegeneracy assumptions, superlinear convergence with Q-order 1.5 is proved if the scaling matrices in the corrector step have bounded condition number. A version of the predictor-corrector algorithm enjoys quadratic convergence if the scaling matrices in both predictor and corrector steps have bounded condition numbers. The latter results apply in particular to algorithms using the AHO direction since there the scaling matrix is the identity matrix.

Block Coordinate Descent Methods for Semidefinite Programming

International Series in Operations Research & Management Science, 2011

We consider in this chapter block coordinate descent (BCD) methods for solving semidefinite programming (SDP) problems. These methods are based on sequentially minimizing the SDP problem's objective function over blocks of variables corresponding to the elements of a single row (and column) of the positive semidefinite matrix X; hence, we will also refer to these methods as row-by-row (RBR) methods. Using properties of the (generalized) Schur complement with respect to the remaining fixed (n − 1)-dimensional principal submatrix of X, the positive semidefiniteness constraint on X reduces to a simple second-order cone constraint. It is well known that without certain safeguards, BCD methods cannot be guaranteed to converge in the presence of general constraints. Hence, to handle linear equality constraints, the methods that we describe here use an augmented Lagrangian approach. Since BCD methods are first-order methods, they are likely to work well only if each subproblem minimization can be performed very efficiently. Fortunately, this is the case for several important SDP problems, including the maxcut SDP relaxation and the minimum nuclear norm matrix completion problem, since closed-form solutions for the BCD subproblems that arise in these cases are available. We also describe how BCD can be applied to solve the sparse inverse covariance estimation problem by considering a dual formulation of this problem. The BCD approach is further generalized by using a rank-two update so that the coordinates can be changed in more than one row and column at each iteration. Finally, numerical results on the maxcut SDP relaxation and matrix completion problems are presented to demonstrate the robustness and efficiency of the BCD approach, especially if only moderately accurate solutions are desired.
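
The Schur-complement reduction behind the RBR subproblems can be checked numerically. The sketch below (our own illustration, not code from the chapter) fixes a positive definite (n-1) × (n-1) block B, picks a candidate first row/column (xi, y), and compares the sign of the generalized Schur complement xi - y^T B^{-1} y with positive semidefiniteness of the assembled matrix X; the feasible (xi, y) form exactly the second-order-cone-representable set described above. The problem data are assumptions.

    # Sketch: for X = [[xi, y^T], [y, B]] with B fixed and positive definite,
    # X is PSD exactly when the Schur complement xi - y^T B^{-1} y is nonnegative.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 6
    M = rng.standard_normal((n - 1, n - 1))
    B = M @ M.T + np.eye(n - 1)                  # fixed (n-1) x (n-1) block, B > 0
    y = rng.standard_normal(n - 1)               # candidate off-diagonal row/column
    xi = 2.0                                     # candidate diagonal entry

    X = np.block([[np.array([[xi]]), y[None, :]],
                  [y[:, None],       B]])

    schur = xi - y @ np.linalg.solve(B, y)       # generalized Schur complement
    psd = bool(np.all(np.linalg.eigvalsh(X) >= -1e-10))
    print(f"Schur complement = {schur:.4f}, X PSD: {psd}")  # the two tests agree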

Sequential constant rank constraint qualifications for nonlinear semidefinite and second-order cone programming with applications

2021

We present new constraint qualification conditions for nonlinear semidefinite and second-order cone programming that extend some of the constant rank-type conditions from nonlinear programming. As an application of these conditions, we provide a unified global convergence proof for a class of algorithms to stationary points, without assuming either uniqueness of the Lagrange multiplier or boundedness of the set of Lagrange multipliers. This class of algorithms includes, for instance, general forms of augmented Lagrangian and sequential quadratic programming methods. We also compare these new conditions with some of the existing ones, including the nondegeneracy condition, Robinson's constraint qualification, and the metric subregularity constraint qualification.

Optimality conditions and global convergence for nonlinear semidefinite programming

Mathematical Programming

Sequential optimality conditions have played a major role in unifying and extending global convergence results for several classes of algorithms for general nonlinear optimization. In this paper, we extend these concepts to nonlinear semidefinite programming. We define two sequential optimality conditions for nonlinear semidefinite programming. The first is a natural extension of the so-called Approximate Karush-Kuhn-Tucker (AKKT) condition, well known in nonlinear optimization. The second, called Trace-AKKT (TAKKT), is more natural in the context of semidefinite programming since the computation of eigenvalues is avoided. We propose an Augmented Lagrangian algorithm that generates these types of sequences, and we propose new constraint qualifications, weaker than previously considered ones, that are sufficient for the global convergence of the algorithm to a stationary point.
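
To make the algorithmic ingredient concrete, the sketch below shows one common form of augmented Lagrangian machinery for a semidefinite constraint G(x) <= 0 (in the Loewner order), built from the projection onto the PSD cone. It is a hedged illustration of the general approach, not necessarily the exact update rules analysed in the paper, and all function names are assumptions.

    # Sketch: PSD-cone augmented Lagrangian pieces for a constraint G(x) <= 0
    # (Loewner order), analogous to the classical PHR function for inequalities.
    import numpy as np

    def proj_psd(S):
        """Projection of a symmetric matrix onto the PSD cone."""
        w, V = np.linalg.eigh((S + S.T) / 2.0)
        return (V * np.maximum(w, 0.0)) @ V.T

    def aug_lagrangian_value(f_x, G_x, Lam, rho):
        """L_rho(x, Lam) = f(x) + (||P||_F^2 - ||Lam||_F^2) / (2 rho),
        where P = Proj_PSD(Lam + rho * G(x))."""
        P = proj_psd(Lam + rho * G_x)
        return f_x + (np.sum(P ** 2) - np.sum(Lam ** 2)) / (2.0 * rho)

    def multiplier_update(G_x, Lam, rho):
        """First-order multiplier update Lam+ = Proj_PSD(Lam + rho * G(x))."""
        return proj_psd(Lam + rho * G_x)

Driving the associated stationarity and complementarity residuals to zero along the iterates of such a scheme is what sequential optimality conditions of AKKT/TAKKT type formalize.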

A Decomposition Augmented Lagrangian Method for Low-rank Semidefinite Programming

2021

We develop a decomposition method based on the augmented Lagrangian framework to solve a broad family of semidefinite programming problems, possibly with nonlinear objective functions, nonsmooth regularization, and general linear equality/inequality constraints. In particular, the positive semidefinite variable along with a group of linear constraints can be decomposed into a variable on a smooth manifold. The nonsmooth regularization and other general linear constraints are handled by the augmented Lagrangian method. Therefore, each subproblem can be solved by a semismooth Newton method on a manifold. Theoretically, we show that the first- and second-order necessary optimality conditions for the factorized subproblem are also sufficient for the original subproblem under certain conditions. Convergence analysis is established for the Riemannian subproblem and the augmented Lagrangian method. Extensive numerical experiments on large-scale semidefinite programming such as max-cut, neares...
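
As a concrete instance of the decomposition idea, the sketch below applies the low-rank factorization X = R R^T to the max-cut SDP relaxation, in which the constraint diag(X) = 1 becomes the requirement that every row of R have unit norm, i.e. R lives on a smooth (oblique) manifold. It uses a plain Riemannian gradient step rather than the semismooth Newton solver developed in the paper, and the step size, rank, and problem data are assumptions.

    # Sketch: factorized max-cut relaxation min <C, R R^T>, rows of R on the unit sphere.
    import numpy as np

    def riemannian_gradient_step(C, R, step=1e-2):
        """One Riemannian gradient step on the oblique manifold {R : ||row_i(R)|| = 1}."""
        G = 2.0 * C @ R                                      # Euclidean gradient of <C, R R^T>
        G -= R * np.sum(G * R, axis=1, keepdims=True)        # project rows onto tangent spaces
        R = R - step * G
        return R / np.linalg.norm(R, axis=1, keepdims=True)  # retract rows back to the sphere

    rng = np.random.default_rng(2)
    n, p = 20, 4
    A = rng.standard_normal((n, n))
    C = (A + A.T) / 2.0                                      # symmetric cost matrix
    R = rng.standard_normal((n, p))
    R /= np.linalg.norm(R, axis=1, keepdims=True)
    for _ in range(300):
        R = riemannian_gradient_step(C, R)
    print("objective <C, R R^T> =", float(np.sum(C * (R @ R.T))))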