Decomposition-based Method for Sparse Semidefinite Relaxations of Polynomial Optimization Problems

Sums of Squares and Semidefinite Program Relaxations for Polynomial Optimization Problems with Structured Sparsity

SIAM Journal on Optimization, 2006

Unconstrained and inequality constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums of squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) relaxations are obtained. Numerical results from various test problems are included to show the improved performance of the SOS and SDP relaxations.
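The correlative sparsity pattern graph described above can be sketched in a few lines (a hypothetical helper, not the authors' code): nodes are the variables, with an edge whenever two variables appear together in a monomial of the objective or in the same constraint. Maximal cliques of a chordal extension of this graph then determine the supports of the SOS polynomials.

```python
def csp_graph(n, objective_supports, constraint_vars):
    """Edges of the correlative sparsity pattern graph on variables 0..n-1.

    objective_supports: iterable of sets, each the variable indices of one
                        monomial of the objective.
    constraint_vars:    iterable of sets, each the variable indices appearing
                        in one constraint g_k(x) >= 0.
    Two variables are linked if they share a monomial of the objective or
    appear in the same constraint."""
    edges = set()
    for group in list(objective_supports) + list(constraint_vars):
        for i in group:
            for j in group:
                if i < j:
                    edges.add((i, j))
    return edges
```

For example, with f = x0^2*x1 + x1*x2 + x3^2 and one constraint involving {x2, x3}, the monomial supports are {0,1}, {1,2}, {3}, and the graph has edges (0,1), (1,2), (2,3); its cliques {0,1}, {1,2}, {2,3} would replace the full variable set in the SOS supports.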

Generalized Lagrangian Duals and Sums of Squares Relaxations of Sparse Polynomial Optimization Problems

SIAM Journal on Optimization, 2005

Sequences of generalized Lagrangian duals and their SOS (sums of squares of polynomials) relaxations for a POP (polynomial optimization problem) are introduced. Sparsity of polynomials in the POP is used to reduce the sizes of the Lagrangian duals and their SOS relaxations. It is proved that the optimal values of the Lagrangian duals in the sequence converge to the optimal value of the POP using a method from the penalty function approach. The sequence of SOS relaxations is transformed into a sequence of SDP (semidefinite program) relaxations of the POP, which correspond to duals of modification and generalization of SDP relaxations given by Lasserre for the POP.

Convergent SDP‐Relaxations in Polynomial Optimization with Sparsity

SIAM Journal on Optimization, 2006

We consider a polynomial programming problem P on a compact semi-algebraic set K ⊂ R^n, described by m polynomial inequalities g_j(X) ≥ 0, and with criterion f ∈ R[X]. We propose a hierarchy of semidefinite relaxations in the spirit of those of Waki et al. [9]. In particular, the SDP relaxation of order r has the following two features: (a) The number of variables is O(κ^(2r)), where κ = max[κ_1, κ_2], with κ_1 (resp. κ_2) being the maximum number of variables appearing in the monomials of f (resp. appearing in a single constraint g_j(X) ≥ 0). (b) The largest size of the LMIs (Linear Matrix Inequalities) is O(κ^r). This is to be compared with the respective number of variables O(n^(2r)) and LMI size O(n^r) in the original SDP relaxations defined in [11]. Therefore, great computational savings are expected in the case of sparsity in the data {g_j, f}, i.e., when κ is small, a frequent case in practical applications of interest. The novelty with respect to [9] is that we prove convergence to the global optimum of P when the sparsity pattern satisfies a condition often encountered in large-scale problems of practical interest, known as the running intersection property in graph theory. In such cases, and as a by-product, we also obtain a new representation result for polynomials positive on a basic closed semialgebraic set, a sparse version of Putinar's Positivstellensatz [16].
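The running intersection property invoked above can be checked directly on an ordered list of variable cliques. A minimal sketch, assuming each clique is given as a set of variable indices:

```python
def has_rip(cliques):
    """Running intersection property: for each k >= 1, the intersection of
    C_k with the union of all earlier cliques must be contained in some
    single earlier clique C_j, j < k."""
    for k in range(1, len(cliques)):
        union_prev = set().union(*cliques[:k])
        inter = set(cliques[k]) & union_prev
        if not any(inter <= set(cliques[j]) for j in range(k)):
            return False
    return True
```

A chain of cliques such as {1,2}, {2,3}, {3,4} satisfies the property, while {1,2}, {2,3}, {1,3,4} does not: the last clique meets the earlier union in {1,3}, which sits inside no single earlier clique. Note that the property depends on the ordering; for cliques coming from a chordal graph, a perfect elimination ordering always yields it.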

Equality Based Contraction of Semidefinite Programming Relaxations in Polynomial Optimization

Journal of the Operations Research Society of Japan, 2008

The SDP (semidefinite programming) relaxation for general POPs (polynomial optimization problems), which was proposed by Lasserre as a method for computing global optimal solutions of POPs, has recently become an active research subject. We propose a new heuristic method exploiting the equality constraints in a given POP, and strengthen the SDP relaxation so as to achieve faster convergence to the global optimum of the POP. We can apply this method to both the dense SDP relaxation originally proposed by Lasserre, and the sparse SDP relaxation later proposed by Kim, Kojima, Muramatsu and Waki. In particular, our heuristic method incorporated into the sparse SDP relaxation method has shown promising performance in numerical experiments on large-scale sparse POPs. Roughly speaking, we derive valid equality constraints from the original equality constraints of the POP, and then use them to convert the dense or sparse SDP relaxation into a new, stronger SDP relaxation. Our method is inspired by strong theoretical results on the convergence of SDP relaxations for POPs with equality constraints provided by Lasserre, Parrilo and Laurent, but we place the main emphasis on the practical aspect of computing more accurate lower bounds for larger sparse POPs.
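One simple way to derive valid equalities from an equality constraint h(x) = 0, in the spirit of the contraction described above (a generic sketch under assumed conventions, not necessarily the paper's exact construction), is to multiply h by every monomial x^a up to a chosen degree: each product x^a · h(x) also vanishes on the feasible set and can be added to the relaxation.

```python
from itertools import product

def shift_equality(h, n, d):
    """Given h(x) = 0 in n variables, with h encoded as a dict mapping
    exponent tuples to coefficients, return the shifted valid equalities
    x^a * h(x) = 0 for every monomial x^a of total degree <= d."""
    shifted = []
    for a in product(range(d + 1), repeat=n):
        if sum(a) > d:
            continue  # keep only monomials of total degree <= d
        shifted.append({tuple(ai + ei for ai, ei in zip(a, e)): c
                        for e, c in h.items()})
    return shifted
```

For h = x0^2 + x1 - 1 (encoded as {(2,0): 1, (0,1): 1, (0,0): -1}) with n = 2 and d = 1, the three monomials 1, x0, x1 produce three valid equalities; the x0-shift, for instance, is x0^3 + x0*x1 - x0 = 0.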

SDP relaxations for sparse Polynomial Optimization Problems (Decision Theory and Optimization Algorithms)

2005

POPs (polynomial optimization problems, or optimization problems with polynomial objective and constraints) represent a broad range of applications in science and engineering. Recently, an important theoretical development has been made by Lasserre [6] toward achieving optimal values of POPs. According to the paper [4], his method for obtaining a sequence of SDP relaxations can be considered a primal approach. He proved that

A note on sparse SOS and SDP relaxations for polynomial optimization problems over symmetric cones

Computational Optimization and Applications, 2009

This short note extends the sparse SOS (sum of squares) and SDP (semidefinite programming) relaxation proposed by Waki, Kim, Kojima and Muramatsu for normal POPs (polynomial optimization problems) to POPs over symmetric cones, and establishes its theoretical convergence based on the recent convergence result by Lasserre on the sparse SOS and SDP relaxation for normal POPs. A numerical example is also given to exhibit its high potential.

Exploiting Sparsity in SDP Relaxation of Polynomial Optimization Problems

Handbook on Semidefinite, Conic and Polynomial Optimization, 2011

We present a survey of the sparse SDP relaxation proposed as a sparse variant of Lasserre's SDP relaxation of polynomial optimization problems. We discuss the primal and dual approaches to deriving the sparse SDP and SOS relaxations, and their relationship. In particular, exploiting structured sparsity in both approaches is described in view of the quality and the size of the SDP relaxations. In addition, numerical techniques used in the Matlab package SparsePOP for solving POPs are included. We report numerical results on SparsePOP and on the application of the sparse SDP relaxation to sensor network localization problems.

Solving polynomial least squares problems via semidefinite programming relaxations

Journal of Global Optimization, 2010

A polynomial optimization problem whose objective function is represented as a sum of positive and even powers of polynomials, called a polynomial least squares problem, is considered. Methods to transform a polynomial least squares problem into polynomial semidefinite programs to reduce the degrees of the polynomials are discussed. The computational efficiency of solving the original polynomial least squares problem and the transformed polynomial semidefinite programs is compared. Numerical results on selected polynomial least squares problems show better computational performance of a transformed polynomial semidefinite program, especially when the degrees of the polynomials are larger.
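A standard degree-reduction device for squared terms (a generic Schur-complement trick, not necessarily the exact transformation used in the paper) replaces each p_i(x)^2 in the objective by an epigraph variable t_i constrained through a 2×2 linear matrix inequality: [[t_i, p_i(x)], [p_i(x), 1]] ⪰ 0 holds iff t_i ≥ p_i(x)^2, so a degree-2d term becomes a degree-d matrix entry.

```python
def psd_2x2(t, p):
    """[[t, p], [p, 1]] is positive semidefinite iff t >= 0 and, by the
    Schur complement with respect to the (2,2) entry, t - p*p >= 0."""
    return t >= 0 and t - p * p >= 0

# Epigraph reformulation of min sum_i p_i(x)^2:
#   min sum_i t_i   s.t.   [[t_i, p_i(x)], [p_i(x), 1]] >= 0 for each i,
# which trades the degree-2d polynomial p_i(x)^2 for the degree-d entry p_i(x).
```

At the optimum each t_i is active, t_i = p_i(x)^2, so the two formulations have the same optimal value; the gain is that the SDP relaxation then only needs moments up to degree d rather than 2d for these terms.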