A review of Hopfield neural networks for solving mathematical programming problems
Related papers
Hopfield neural networks for optimization: study of the different dynamics
Neurocomputing, 2002
In this paper the application of arbitrary-order Hopfield-like neural networks to optimization problems is studied. These networks are classified into three categories according to their dynamics, and the energy function for each category is made explicit. The main problems affecting practical applications of these networks are brought to light: (a) incoherence between the network dynamics and the associated energy function; (b) error due to discrete simulation on a digital computer of the continuous dynamics equations; (c) existence of local minima; (d) convergence that depends on the coefficients weighting the cost function terms. The effect of these problems on each network is analysed and simulated, and possible solutions are indicated. Finally, the so-called continuous dynamics II is dealt with, proving that the integral term in the energy function is bounded, in contrast with Hopfield's statement, and proposing an efficient local-minima avoidance strategy. Experimental results are obtained solving Diophantine equation, Hamiltonian cycle and k-colorability problems.
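Problem (b) above can be made concrete with a small simulation. The sketch below (our own toy energy and weights, not taken from the paper) integrates continuous Hopfield dynamics with an Euler step; with a sufficiently small step the energy stays non-increasing, while too large a step would break this guarantee.

```python
import numpy as np

# Minimal sketch (toy weights, not from the paper) of continuous Hopfield
# dynamics simulated by Euler discretisation. For the energy
#   E(v) = -1/2 v^T W v - b^T v,  with v_i = sigmoid(u_i),
# the continuous flow du/dt = W v + b never increases E; a discrete step
# that is too large breaks this property, which is exactly problem (b).
W = np.array([[0.0, -2.0],
              [-2.0, 0.0]])                 # symmetric, zero diagonal
b = np.array([1.0, 1.0])

def sigmoid(u, gain=5.0):
    return 1.0 / (1.0 + np.exp(-gain * u))

def energy(v):
    return -0.5 * v @ W @ v - b @ v

u = np.array([0.1, -0.1])                   # initial neuron potentials
dt = 0.01                                   # small Euler step
energies = [energy(sigmoid(u))]
for _ in range(1000):
    u = u + dt * (W @ sigmoid(u) + b)       # du/dt = -dE/dv (gradient flow)
    energies.append(energy(sigmoid(u)))

# With this step size the energy is (numerically) non-increasing.
assert all(e2 <= e1 + 1e-7 for e1, e2 in zip(energies, energies[1:]))
```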
Convergence and stability study of Hopfield's neural network for linear programming
1992
Parameters that affect the stability and convergence of the Hopfield model were identified by simulation. The Hopfield model used to solve optimization problems was defined by an analog electrical circuit. The authors show that introducing one additional amplifier yields convergence with good stability, without oscillations. This novel model was used to solve a linear programming problem, and some results are presented.
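A Hopfield-style linear-programming circuit descends an energy that combines the cost with penalty terms for violated constraints. The following is a minimal software sketch of such penalty dynamics on a toy LP (our own example and parameters, not the circuit from the paper):

```python
import numpy as np

# Sketch of penalty dynamics for an LP: minimise c^T x subject to A x <= b
# by descending the energy  c^T x + (k/2) * ||max(0, A x - b)||^2.
# Problem data is a hypothetical toy example.
c = np.array([-1.0, -1.0])                  # i.e. maximise x1 + x2
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 1.5])
k = 50.0                                    # penalty weight

x = np.zeros(2)
lr = 0.005                                  # discretisation step
for _ in range(5000):
    viol = np.maximum(0.0, A @ x - b)       # violated constraints only
    x -= lr * (c + k * A.T @ viol)          # energy gradient step

# x settles near the LP optimum, where x1 + x2 is close to 1.5.
```

A finite penalty weight leaves a small constraint violation at the fixed point (here about 0.02), which mirrors the trade-off between convergence and accuracy discussed for these analog models.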
Computational Properties of Generalized Hopfield Networks Applied to Nonlinear Optimization
1990
A nonlinear neural framework, called the Generalized Hopfield Network, is proposed, which is able to solve in a parallel distributed manner systems of nonlinear equations. The method is applied to the general nonlinear optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for the parallelization of the optimization computations, thus significantly extending the practical limits of problems that can be formulated as optimization problems and that can gain from the introduction of nonlinearities in their structure (e.g. pattern recognition, supervised learning, design of content-addressable memories).
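The dynamic view of optimization mentioned here can be illustrated with Augmented Lagrangian dynamics: gradient descent in the primal variables coupled with gradient ascent in the multipliers. The sketch below is our own illustrative rendering on a toy problem, not the authors' GHN implementation.

```python
import numpy as np

# Illustrative Augmented Lagrangian dynamics (hypothetical toy problem):
# minimise f(x) = x1^2 + x2^2 subject to h(x) = x1 + x2 - 1 = 0, using
#   L(x, lam) = f(x) + lam * h(x) + (rho/2) * h(x)^2,
# with descent in x and ascent in lam, as in a continuous-time network.

def f_grad(x):
    return 2.0 * x

def h(x):
    return x[0] + x[1] - 1.0

x = np.zeros(2)
lam = 0.0
rho = 10.0                           # penalty parameter
lr = 0.01                            # discretisation step
for _ in range(2000):
    g = f_grad(x) + (lam + rho * h(x)) * np.array([1.0, 1.0])
    x -= lr * g                      # gradient descent in the primal
    lam += lr * rho * h(x)           # gradient ascent in the dual

# Settles at the constrained optimum x = (0.5, 0.5) with lam = -1.
```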
Journal of Engineering and Technology, 2012
This paper discusses the implementation of Hopfield neural networks for solving constraint satisfaction problems using field-programmable gate arrays (FPGAs). It presents techniques for formulating such problems as discrete neural networks and then describes the N-Queen problem in this formulation. Finally, results are presented comparing the computation times of the custom computer against a simulation of the Hopfield network run on a high-end workstation. The resulting speed-up shows that gains of two to three orders of magnitude are possible using current FPGA devices. Key words: Hopfield neural network, field programmable gate arrays (FPGA), N-Queen problem.
Neural Networks and Optimization Problems
Citeseer
An optimization problem can be defined as a pair of an objective function and a set of constraints on the variables of the function. The goal is to find the values of the variables that lead to an optimal value of the function (either minimum or maximum) while satisfying all the constraints. During the last decades, an alternative model of computation has been explored, namely the neural network model. It turns out that several Hopfield-type networks can be employed successfully to provide approximate (near-optimal or even optimal) solutions to hard optimization problems. This is due to their property of reducing their "energy function" during evolution, leading to a local or global minimum. In this report, the general methodology of the approach is described, as well as the different network models usually employed as optimizers. Then, a case study involving the Minimum Cost Spare Allocation Problem (or, equivalently, Vertex Cover in bipartite graphs) is presented. Finally, the experimental results (using a simulation implemented in C) clearly demonstrate the advantages and the limitations of the approach in terms of solution quality and computation time.
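The energy-reduction property this report relies on is easy to demonstrate for the discrete Hopfield model: with a symmetric weight matrix and zero self-couplings, asynchronous threshold updates never increase the energy. A minimal sketch with random (hypothetical) weights:

```python
import numpy as np

# Discrete Hopfield network as an optimiser (illustrative random instance):
# asynchronous updates s_i = sign(W_i . s - theta_i) never increase
#   E(s) = -1/2 s^T W s + theta^T s,
# so the state drifts to a local (possibly global) energy minimum.
rng = np.random.default_rng(0)
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2.0                 # symmetric weights
np.fill_diagonal(W, 0.0)            # no self-coupling
theta = rng.normal(size=n)

def energy(s):
    return -0.5 * s @ W @ s + theta @ s

s = rng.choice([-1.0, 1.0], size=n) # random initial state
trace = [energy(s)]
for _ in range(200):
    i = rng.integers(n)             # pick one neuron at random
    s[i] = 1.0 if W[i] @ s - theta[i] >= 0 else -1.0
    trace.append(energy(s))

# The energy trace is monotonically non-increasing.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(trace, trace[1:]))
```

This monotone descent is also why the approach can stall in a local minimum, which is the main limitation the report's experiments quantify.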
A New Method to Solve the Constraint Satisfaction Problem Using the Hopfield Neural Network
The constraint satisfaction problem is constituted by several condition formulas, which makes it difficult to solve. In this paper, using the Hopfield neural network, a new method is proposed to solve the constraint satisfaction problem by simplifying its condition formula. In this method, all restriction conditions of a constraint satisfaction problem are divided into two restrictions: restriction I and restriction II. In the processing step, restriction II is satisfied by setting its value to 0, and the value of restriction I is always driven in the decreasing direction. The optimum solution is obtained when the values of the energy, restriction I and restriction II become 0 at the same time. To verify the validity of the proposed method, we apply it to two typical constraint satisfaction problems: the N-queens problem and the four-coloring problem. The simulation results show that the optimum solution can be obtained at high speed with a high convergence rate. Moreover, compared ...
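To make the "all terms reach 0 at once" criterion concrete, here is a toy rendering of two restriction values for the N-queens problem (our own split into placement constraints and attack constraints; the paper's exact partition may differ). A valid placement drives both terms, and hence the energy, to zero.

```python
import numpy as np

# Toy N-queens restriction terms (hypothetical formulation): v is an n x n
# 0/1 board. Term r1 penalises rows/columns without exactly one queen;
# term r2 penalises diagonals holding more than one queen.
def restrictions(v):
    n = v.shape[0]
    r1 = ((v.sum(axis=1) - 1) ** 2).sum()        # one queen per row
    r1 += ((v.sum(axis=0) - 1) ** 2).sum()       # one queen per column
    r2 = 0.0
    for d in range(-n + 1, n):
        r2 += max(0.0, np.trace(v, offset=d) - 1) ** 2        # diagonals
        r2 += max(0.0, np.trace(v[::-1], offset=d) - 1) ** 2  # anti-diagonals
    return r1, r2

# A known 4-queens solution: queens in columns (1, 3, 0, 2), row by row.
v = np.zeros((4, 4))
for row, col in enumerate([1, 3, 0, 2]):
    v[row, col] = 1.0

r1, r2 = restrictions(v)
assert r1 == 0.0 and r2 == 0.0       # both restriction values vanish
```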
A wide variety of combinatorial problems can be viewed as Weighted Constraint Satisfaction Problems (WCSPs). All resolution methods have an exponential time complexity for large instances. Moreover, they combine several techniques and use a wide variety of concepts and notations that are difficult to understand and implement. In this paper, we model this problem as an original 0-1 quadratic program subject to linear constraints. This model is validated by a theorem that we propose and prove. To evaluate its performance, we use the Hopfield neural network to solve the obtained model based on an original energy function. To validate our model, we solve several benchmark WCSP instances. Our approach has the same memory complexity as the HNN and the same time complexity as the Euler-Cauchy method. In this regard, our approach finds the optimal solution of these instances.
Modified Hopfield Neural Network Approach for Solving Nonlinear Algebraic Equations
Engineering Letters, 2007
In this paper, we present a neural network approach to solving a set of nonlinear equations. A modified Hopfield network has been developed to optimize an energy function. This approach provides faster convergence and extremely accurate solutions for all solvable systems. We solved and discussed several illustrative examples in order to demonstrate the power of the proposed method.
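The core idea common to such approaches can be sketched as follows (our own toy system and step size, not the paper's modified network): to solve F(x) = 0, descend the residual energy E(x) = ½‖F(x)‖², whose gradient is Jᵀ(x)F(x); a root of the system is a zero-energy minimum.

```python
import numpy as np

# Residual-energy descent for a nonlinear system (hypothetical example):
# solve  x^2 + y^2 = 4  and  x - y = 0  by gradient flow on
#   E(x) = 1/2 * ||F(x)||^2,   grad E = J(x)^T F(x).
def F(x):
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                     x[0] - x[1]])

def J(x):                            # Jacobian of F
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.5])             # initial guess
lr = 0.02                            # discretisation step
for _ in range(5000):
    x -= lr * J(x).T @ F(x)          # gradient step on the residual energy

# x converges to (sqrt(2), sqrt(2)), a root of the system.
```

The gradient flow can still stall at points where JᵀF = 0 with F ≠ 0, which is the kind of failure mode a modified network aims to mitigate.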