A New Method to Solve the Constraint Satisfaction Problem Using the Hopfield Neural Network
Related papers
A wide variety of combinatorial problems can be viewed as Weighted Constraint Satisfaction Problems (WCSPs). Existing resolution methods have exponential time complexity on large instances; moreover, they combine several techniques and rely on a wide range of concepts and notations that are difficult to understand and implement. In this paper, we model the WCSP as an original 0-1 quadratic program subject to linear constraints, and we validate this model with a theorem that we state and prove. To assess its performance, we use the Hopfield neural network (HNN) to solve the resulting model based on an original energy function. To validate the approach, we solve several benchmark WCSP instances. Our approach has the same memory complexity as the HNN and the same time complexity as the Euler-Cauchy method, and it finds the optimal solution of the tested instances.
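As a rough illustration of the kind of pipeline described in this abstract, the sketch below integrates continuous Hopfield dynamics for a 0-1 quadratic objective with a plain Euler (Euler-Cauchy) step. The matrix W and vector b are assumed to already encode the quadratic objective together with penalty terms for the linear constraints; this construction, the gain, and the step size are illustrative assumptions, not the formulation of the paper.

    import numpy as np

    def chn_euler(W, b, steps=2000, dt=0.01, tau=1.0, gain=5.0, seed=0):
        # Minimise E(v) = -1/2 v^T W v - b^T v with a continuous Hopfield network,
        # integrating du/dt = -u/tau + W v + b by the Euler-Cauchy method.
        # W and b are assumed to encode the 0-1 QP objective plus constraint penalties.
        rng = np.random.default_rng(seed)
        u = rng.normal(scale=0.1, size=len(b))      # internal neuron potentials
        for _ in range(steps):
            v = 1.0 / (1.0 + np.exp(-gain * u))     # sigmoid outputs in (0, 1)
            u += dt * (-u / tau + W @ v + b)        # one Euler-Cauchy step
        v = 1.0 / (1.0 + np.exp(-gain * u))
        return (v > 0.5).astype(int)                # rounded 0-1 assignment

Under this sign convention, lower energy means a better and more feasible assignment, so the QP objective and the penalty terms would be folded into W and b with negated signs.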
Journal of Engineering and Technology, 2012
This paper discusses the implementation of Hopfield neural networks for solving constraint satisfaction problems on field programmable gate arrays (FPGAs). It presents techniques for formulating such problems as discrete neural networks and then describes the N-Queen problem in this formulation. Finally, results are presented that compare the computation times of the custom computer against a simulation of the Hopfield network run on a high-end workstation. The measured speed-up shows that gains of two to three orders of magnitude are possible with current FPGA devices. Key words: Hopfield neural network, field programmable gate arrays (FPGA), N-Queen problem.
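A minimal sketch of the N-Queen formulation such a network targets is shown below, assuming a standard quadratic penalty energy (exactly one queen per row and column, at most one per diagonal) and greedy single-neuron 0/1 updates; the coefficients and update order are illustrative, not the FPGA design of the paper.

    import numpy as np

    def n_queens_energy(V):
        # Quadratic penalty: each row and column holds exactly one queen,
        # each diagonal at most one (t*(t-1) vanishes only for t in {0, 1}).
        N = V.shape[0]
        E = np.sum((V.sum(axis=1) - 1) ** 2) + np.sum((V.sum(axis=0) - 1) ** 2)
        for d in range(-N + 1, N):
            t = np.trace(V, offset=d)
            E += t * (t - 1)
            t = np.trace(np.fliplr(V), offset=d)
            E += t * (t - 1)
        return E

    def discrete_hopfield_nqueens(N=8, sweeps=200, seed=0):
        # Asynchronous 0/1 updates: keep a neuron flip only if it does not raise
        # the energy, mirroring the threshold update of a discrete Hopfield net.
        rng = np.random.default_rng(seed)
        V = (rng.random((N, N)) < 1.0 / N).astype(int)
        for _ in range(sweeps):
            for i in range(N):
                for j in range(N):
                    E0 = n_queens_energy(V)
                    V[i, j] ^= 1
                    if n_queens_energy(V) > E0:
                        V[i, j] ^= 1              # undo a worsening flip
            if n_queens_energy(V) == 0:
                return V                          # all constraints satisfied
        return V                                  # may stall in a local minimum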
A new approach to solve the maximum constraint satisfaction problem
IAES International Journal of Artificial Intelligence (IJ-AI)
Premature convergence of the simulated annealing algorithm, when applied to many complex problems of artificial intelligence, is a failure mode in which the process stops at a stable point that does not represent a global solution. Accelerating convergence while avoiding local solutions is the concern of this work. To overcome this weakness and improve solution quality, a new hybrid approach is proposed that takes the state of the system into account during convergence through the use of Hopfield neural networks. To implement the proposed approach, the maximum constraint satisfaction problem is modeled as a quadratic program, which is then solved with the new approach. The approach is compared with other methods to show its effectiveness.
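For context on the baseline being improved, the following is a minimal Metropolis-style simulated annealing loop over a 0-1 quadratic objective; the objective f(x) = x^T Q x + c^T x, the bit-flip neighbourhood, and the geometric cooling schedule are generic assumptions, not the Hopfield-guided hybrid of the paper.

    import math
    import random

    def anneal_qp(Q, c, T0=2.0, alpha=0.995, iters=20000, seed=0):
        # Plain simulated annealing on f(x) = x^T Q x + c^T x over x in {0,1}^n.
        rnd = random.Random(seed)
        n = len(c)
        f = lambda x: sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n)) \
            + sum(c[i] * x[i] for i in range(n))
        x = [rnd.randint(0, 1) for _ in range(n)]
        cur = f(x)
        best, best_val, T = list(x), cur, T0
        for _ in range(iters):
            i = rnd.randrange(n)
            x[i] ^= 1                      # propose a single bit flip
            new = f(x)                     # full re-evaluation keeps the sketch simple
            if new <= cur or rnd.random() < math.exp((cur - new) / T):
                cur = new                  # accept: downhill, or uphill with prob e^(-delta/T)
                if cur < best_val:
                    best, best_val = list(x), cur
            else:
                x[i] ^= 1                  # reject: undo the flip
            T *= alpha                     # geometric cooling
        return best, best_val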
PROCEEDINGS OF INTERNATIONAL CONFERENCE ON ADVANCES IN MATERIALS RESEARCH (ICAMR - 2019), 2020
Hopfield Neural Network (HNN) is a sort of neural network that is strongly dependent to energy minimization of solution. Although HNN managed to solve various optimization problem, the output of HNN suffered from a lack of interpretability and variation. This has severely limited the practical usability of HNN in doing logic programming. Inspired by random neuron perturbation, Estimation of Distribution Algorithm (EDA) has been proposed to explore various optimal neuron state. EDAs employs a probabilistic model to sample the neuron state in order to move toward the various optimal location of global minimum energy. In this paper, a new Mutation Hopfield Neural Network (MHNN) will be proposed to do k Satisfiability programming. Based on the experimental result, the proposed MHNN has outperformed conventional HNN in various performance metric.
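The EDA ingredient mentioned above can be illustrated with a compact univariate-marginal sketch over truth assignments; the clause encoding, population sizes, and smoothing below are assumptions for illustration and do not reproduce the proposed MHNN.

    import numpy as np

    def umda_ksat(clauses, n_vars, pop=100, elite=20, gens=200, seed=0):
        # Univariate-marginal EDA: sample assignments from per-variable
        # probabilities, keep those satisfying the most clauses, and re-estimate
        # the probabilities from them. A clause like [1, -3, 4] means x1 or not-x3 or x4.
        rng = np.random.default_rng(seed)
        def satisfied(x):
            return sum(any(x[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses)
        p = np.full(n_vars, 0.5)                       # marginal P(x_i = True)
        best, best_fit = None, -1
        for _ in range(gens):
            P = rng.random((pop, n_vars)) < p          # sample the population
            fits = np.array([satisfied(x) for x in P])
            top = P[np.argsort(fits)[-elite:]]         # elite assignments
            p = 0.9 * top.mean(axis=0) + 0.05          # smoothed re-estimation
            if fits.max() > best_fit:
                best_fit, best = fits.max(), P[fits.argmax()].copy()
            if best_fit == len(clauses):
                break                                  # all clauses satisfied
        return best, best_fit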
Maximal constraint satisfaction problems solved by continuous hopfield networks
2013
In this paper, we propose a new approach to solve maximal constraint satisfaction problems (Max-CSP) using the continuous Hopfield network. The approach consists of two steps: first, the maximal constraint satisfaction problem is modeled as a 0-1 quadratic program subject to linear constraints (QP); second, the continuous Hopfield network (CHN) is applied to solve the QP. The generalized energy function associated with the CHN and an appropriate parameter-setting procedure for Max-CSP problems are given in detail. Finally, the proposed algorithm and some computational experiments on Max-CSP instances are presented. Key-Words: Maximal constraint satisfaction problems, quadratic 0-1 programming, continuous Hopfield network, energy function
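As a hedged sketch of how such a generalized energy function is typically identified with a penalized 0-1 QP (the actual parameter-setting procedure is the one given in the paper), one common mapping reads:

    E(v) \;=\; -\tfrac{1}{2}\, v^{\top} T v \;-\; (i^{b})^{\top} v,
    \qquad
    \min_{v \in [0,1]^n} \; \tfrac{1}{2}\, v^{\top} Q v + q^{\top} v
      + \tfrac{\phi}{2}\, \lVert A v - b \rVert^{2}
    \;\Longrightarrow\;
    T = -\bigl(Q + \phi A^{\top} A\bigr), \qquad i^{b} = \phi A^{\top} b - q,

where phi > 0 is a penalty weight chosen large enough that constraint-violating vertices cannot be stable minima; Q, q, A, b here denote a generic QP, not the Max-CSP coefficients of the paper.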
Convergence and stability study of Hopfield's neural network for linear programming
1992
Parameters that affect the stability and convergence of the Hopfield model were identified by simulation. The Hopfield model used to solve optimization problems was defined by an analog electrical circuit. The authors show that introducing one additional amplifier yields convergence with good stability and without oscillations. This novel model was used to solve a linear programming problem, and some results are presented.
A Connectionist Approach for Solving Large Constraint Satisfaction Problems
1997
An efficient neural network technique is presented for the solution of binary constraint satisfaction problems. The method is based on applying a double-update technique to the operation of the discrete Hopfield-type neural network that can be constructed for such problems. This operation scheme ensures that the network moves only between consistent states, in which each problem variable is assigned exactly one value, and it leads to a fast and efficient search of the problem state space. Extensions of the proposed method are considered in order to include several optimisation criteria in the search. Experimental results on many real-sized instances of the Radio Links Frequency Assignment Problem demonstrate very good performance.
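A rough sketch of the double-update idea, under the assumption of one 0/1 neuron per variable-value pair and a weight matrix W in which violated binary constraints contribute negative weights, is given below; the encoding and update order are illustrative, not the exact network of the paper.

    import numpy as np

    def double_update_csp(W, groups, sweeps=100, seed=0):
        # Keep exactly one active neuron per variable group: each step switches a
        # variable's active value to the value with the highest net input
        # (a "double update": one neuron switched off, one switched on).
        rng = np.random.default_rng(seed)
        v = np.zeros(W.shape[0])
        for g in groups:                                # random consistent start
            v[rng.choice(g)] = 1.0
        for _ in range(sweeps):
            changed = False
            for gi in rng.permutation(len(groups)):
                idx = np.asarray(groups[gi])
                cur = idx[int(np.argmax(v[idx]))]       # currently active value
                best = idx[int(np.argmax(W[idx] @ v))]  # value with highest input
                if best != cur:
                    v[cur], v[best] = 0.0, 1.0          # the double update
                    changed = True
            if not changed:                             # consistent fixed point reached
                break
        return v

Here groups is a list of index lists, one per CSP variable, so every intermediate state assigns exactly one value to each variable, as the abstract describes.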