Optimization Challenge Problems (2-D and Single OF)
The code is in the Excel VBA file Two-D Optimization Examples 2013-09-03.xlsm.

This set of objective functions (OF) was created to provide sample challenges for testing optimizers. The examples are all two-dimensional, with two decision variables (DV), so that the issues they embody can be understood visually. Most are relatively simple to program and quick to compute, for user convenience. Most represent physically meaningful situations, for the person who wants to see utility and relevance. All are presented with minimization as the objective, and all DVs and OF values are scaled to a 0-to-10 basis for common presentation.

Classic challenges to optimizers are objective functions that have:

1. non-quadratic behavior,
2. multiple optima,
3. stochastic responses,
4. asymptotic approach to optima at infinity,
5. hard inequality constraints, or infeasible regions,
6. slope discontinuities (sharp valleys),
7. a gently sagging channel (effectively slope discontinuities),
8. level discontinuities (cliffs),
9. flat spots,
10. nearly flat spots,
11. a very thin global optimum in a large surface, improbable to find,
12. discrete, integer, or class variables mixed with continuous variables,
13. underspecified problems with an infinite number of equal solutions, and
14. discontinuous response to seemingly continuous DVs because of discretization in a numerical integration.

In all equations that follow, x1 and x2 are the DVs, and f_of_x is the OF value. The DVs are programmed for the range [0, 10]; however, not all functions use DV values in that range, so the DVs are scaled to the appropriate range and labeled x11 and x22. The OF value f_of_x is similarly scaled to a [0, 10] range.

Any solution depends on the optimizer algorithm, the coefficients of the algorithm, and the convergence criteria. For instance, a multi-player optimizer has an increased chance of finding the global optimum.
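The scaling convention above (DVs presented on [0, 10], mapped to each function's native range as x11 and x22, with the OF mapped back to [0, 10]) can be sketched as follows. The workbook itself is VBA; this is a Python sketch, and the two-minimum example function and its native range are my own illustration, not one of the workbook's functions.

```python
def scale_dv(x, lo, hi):
    """Map a DV from the [0, 10] presentation range to the native range [lo, hi]."""
    return lo + (hi - lo) * x / 10.0

def scale_of(f, f_min, f_max):
    """Map an OF value from its native range [f_min, f_max] to [0, 10]."""
    return 10.0 * (f - f_min) / (f_max - f_min)

def example_of(x1, x2):
    """Hypothetical two-minimum surface, used only to illustrate the scaling."""
    x11 = scale_dv(x1, -2.0, 2.0)    # native DV range assumed for illustration
    x22 = scale_dv(x2, -2.0, 2.0)
    f = (x11**2 - 1.0)**2 + x22**2   # minima at x11 = +/-1, x22 = 0
    return scale_of(f, 0.0, 13.0)    # 13 is the worst case on this native range

# The presentation-range minima are at (x1, x2) = (2.5, 5) and (7.5, 5),
# where the scaled OF value is 0.
```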
An optimizer based on a quadratic surface assumption (such as successive quadratic or Newton's method) will jump to the optimum when near it, but can jump in the wrong direction when not in its proximity. The values of the optimizer coefficients (scaling, switching, number of players, number of replicates, initial step size, tempering or acceleration) can make an optimizer efficient in one application, yet with the same values it may be sluggish or divergent in an application with other features. Convergence criteria that are right for one optimizer may stop another long before it arrives at an optimum. When you are
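The quadratic-jump behavior described above can be demonstrated with one Newton step on a non-quadratic surface. This is a Python sketch with a one-dimensional test function of my own choosing, f(x) = -exp(-x^2), whose minimum is at x = 0: where the curvature is positive (near the minimum) the step lands close to the optimum, but where the curvature is negative (far from it) the step moves away.

```python
import math

def newton_step(x, df, d2f):
    """One Newton step toward a stationary point of f: x - f'(x)/f''(x)."""
    return x - df(x) / d2f(x)

# Illustrative non-quadratic function (not from the workbook):
# f(x) = -exp(-x^2), minimum at x = 0.
df  = lambda x: 2.0 * x * math.exp(-x * x)           # f'(x)
d2f = lambda x: (2.0 - 4.0 * x * x) * math.exp(-x * x)  # f''(x)

near = newton_step(0.1, df, d2f)  # near the minimum: lands very close to 0
far  = newton_step(2.0, df, d2f)  # f''(2) < 0: the step moves away from 0
```

Starting at x = 0.1 the step lands within about 0.002 of the minimum, while starting at x = 2 the step moves to roughly 2.29, farther from the optimum.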