OPTAC: a portable software package for analyzing and comparing optimization methods by visualization
Related papers
Review of Modern Optimization Techniques
International Journal of Engineering Research and Technology (IJERT), 2015
Many difficulties are associated with the optimization of large-scale problems, the major ones being multi-modality, dimensionality and differentiability. Traditional techniques generally fail to solve such large-scale problems, especially those with nonlinear objective functions. Because most traditional techniques require gradient information, they cannot handle non-differentiable functions. Moreover, such techniques often fail on optimization problems that have many local optima. To overcome these limitations, more powerful optimization techniques are needed; these are known as modern optimization techniques. This paper explains the theory needed to understand modern optimization techniques, which are used to solve linear, nonlinear, differentiable and non-differentiable optimization problems. Although various optimization methods have been proposed in recent years, some of the more popular techniques, such as the Genetic Algorithm, Simulated Annealing, the Ant Colony method and the Honey Bee Algorithm, are presented here and broadly reviewed.
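The abstract above contrasts gradient-based methods with modern, gradient-free techniques such as Simulated Annealing. As a minimal sketch (the test function, step size and linear cooling schedule below are illustrative assumptions, not taken from the paper), the following shows how annealing can minimize a non-differentiable, multimodal function using only function values:

```python
import math
import random

def anneal(f, x0, lo, hi, iters=20000, t0=2.0, seed=0):
    """Minimize f on [lo, hi] by simulated annealing.

    No gradient is used: a random step is always accepted if it
    improves f, and accepted with probability exp(-delta/T) otherwise,
    which lets the search climb out of local minima.
    """
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9              # linear cooling schedule
        cand = min(hi, max(lo, x + rng.gauss(0, 0.5)))
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:                           # track best-ever point
                best, fbest = x, fx
    return best, fbest

# A non-differentiable, multimodal test function: global minimum near x = 2.16.
f = lambda x: abs(x - 2) + math.sin(5 * x)
x, fx = anneal(f, x0=-4.0, lo=-5.0, hi=5.0)
print(f"x* = {x:.3f}, f(x*) = {fx:.3f}")
```

Because of the kink at x = 2 and the oscillating sine term, a gradient-based method could not even be applied here directly, while the annealer only ever evaluates f.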
Optimization with Scilab, present and future
2009
We present in this paper an overview of optimization algorithms available in the Scilab software. We focus on the user's point of view, that is, we have to minimize or maximize an objective function and must find a solver suitable for the problem. The aim of this paper is to give a simple but accurate view of what problems can be solved by Scilab and what behavior can be expected for those solvers.
In this document, we give an overview of the optimization features in Scilab. The goal is to present all existing features, as well as those that are missing, so that a user who wants to solve a particular optimization problem knows what to look for. In the introduction, we analyse a classification of optimization problems. In the first chapter, we analyse the flagship of Scilab in terms of nonlinear optimization: the optim function. We analyse its features, the management of the cost function, the linear algebra and the management of memory. Then we consider the algorithms which are used behind optim, depending on the type of algorithm and the constraints. In the remaining chapters, we present the algorithms available to solve quadratic problems, nonlinear least squares problems, semidefinite programming, genetic algorithms, simulated annealing and linear matrix inequalities. A chapter focuses on the optimization data files managed by Scilab, especially MPS and SIF files. Some optimization features are available in the form of toolboxes, the most important of which are the Quapro and CUTEr toolboxes. The final chapter is devoted to the optimization features missing from Scilab.
Advanced Optimization Techniques
Many difficulties such as multi-modality, dimensionality and differentiability are associated with the optimization of large-scale problems. Traditional techniques such as steepest descent, linear programming and dynamic programming generally fail to solve such large-scale problems, especially those with nonlinear objective functions. Most of the traditional techniques require gradient information, and hence it is not possible to solve non-differentiable functions with them. Moreover, such techniques often fail to solve optimization problems that have many local optima. To overcome these problems, more powerful optimization techniques are needed, and research into effective optimization techniques has been ongoing for the last three decades.
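As a toy illustration of the local-optima problem described above (the quartic function and learning rate are illustrative assumptions, not from this text), plain gradient descent converges to whichever minimum lies downhill from its starting point:

```python
def grad_descent(df, x0, lr=0.01, steps=2000):
    # Follow the negative gradient; the method only sees the local slope,
    # so the result depends entirely on the starting point.
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# f(x) = x^4 - 3x^2 + x has two minima; its derivative is 4x^3 - 6x + 1.
f = lambda x: x**4 - 3 * x**2 + x
df = lambda x: 4 * x**3 - 6 * x + 1

left = grad_descent(df, x0=-2.0)   # drains into the basin near x ≈ -1.30
right = grad_descent(df, x0=2.0)   # drains into the basin near x ≈ 1.13
```

Starting from x0 = 2.0 the method settles in the shallower basin (f ≈ -1.07) even though a better minimum exists (f ≈ -3.51), which is exactly why purely gradient-based techniques struggle on multimodal problems.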
Optimization Techniques – A Review
Optimization problem formulation, methods of optimization and solution techniques are presented. Population-based methods are also explained. Optimization using reliability-based constraints is found to be the best option for optimizing structures with discrete parameters.
Global optimization using Mathematica: A test of software tools
2006
Mathematica provides a suite of built-in and third-party tools for nonlinear optimization. These tools are tested on a set of hard problems. The built-in Mathematica functions are tested, as well as the tools in the MathOptimizer and Global Optimization packages. The problems tested represent classes of problems that cause difficulties for global solvers, including those with local minima, discontinuous and black-box modules, problems with non-real regions, and constrained problems with complicated and wavy constraints. In addition, scaling of performance with problem size is tested. In general, no tool could solve all problems, but every problem could be solved by at least one tool. All of the tools except the Global Optimization tools GlobalSearch and GlobalPenaltyFn were prone to returning infeasible solutions on discontinuous and black-box modules, problems with non-real regions, and constrained problems with complicated and wavy constraints. The GlobalSearch and GlobalPenaltyFn tools were thus the most robust, and were in many cases also the fastest.
Computational Optimization, Methods and Algorithms, 2011
Design optimization in engineering tends to be very challenging, partly due to the complexity and high nonlinearity of the problem of interest, and partly due to stringent design codes in engineering practice. Conventional algorithms are not the best tools for highly nonlinear global optimization: as local search algorithms, they often miss the global optimum. In addition, design solutions have to be robust, subject to uncertainty in parameters and the tolerances of available components and materials. Metaheuristic algorithms have become increasingly popular in the last two decades. This chapter reviews some of the latest metaheuristics.
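To make the metaheuristic idea concrete, here is a minimal real-coded genetic algorithm sketch (the population size, operators and double-well test function are illustrative assumptions, not taken from the chapter): selection, crossover and mutation let a population of candidates explore several basins instead of following a single local slope.

```python
import random

def ga_minimize(f, lo, hi, pop=40, gens=100, seed=1):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and best-ever tracking."""
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    best = min(population, key=f)

    def tournament():
        a, b = rng.sample(population, 2)
        return a if f(a) < f(b) else b

    for _ in range(gens):
        children = []
        for _ in range(pop):
            w = rng.random()
            child = w * tournament() + (1 - w) * tournament()  # blend crossover
            child += rng.gauss(0, 0.05 * (hi - lo))            # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        population = children
        best = min(children + [best], key=f)
    return best

# Double-well function: local minimum near x ≈ 1.97, global minimum near x ≈ -2.03.
f = lambda x: (x * x - 4) ** 2 + x
best = ga_minimize(f, -5.0, 5.0)
```

A local search started in the right basin would stop near x ≈ 1.97; the population-based search keeps candidates in both basins and returns a point in the global one.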