Constraints Satisfaction through Recursive Neural Networks with Mixed Penalties: a Case Study

Neural networks for finite constraint satisfaction

Neural Computing & Applications, 1995

Constraint Satisfaction Problems (CSPs) are NP-hard in general, and no deterministic polynomial-time algorithm for them is known. They play a central role in many real-life problems. Satisfiability of a formula in Conjunctive Normal Form (CNF-SAT) is the core of any CSP. We present a new modelling technique for any CSP with finite variable domains and, in particular, for solving CNF-SAT. The knowledge representation is based on two fundamental types of constraint: the choice constraint and the exclusion constraint. These models are then implemented by means of several different neural networks, some based on backpropagation learning and others on different procedures. All these networks are trained through a supervised procedure and learn to solve CNF-SAT efficiently. The results of significant tests are described: they show that some networks can effectively solve the proposed problems.
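
As a rough illustration of casting CNF-SAT as a minimization problem (this is not the paper's supervised networks), the Python sketch below encodes a small hypothetical formula as DIMACS-style clause lists and runs a greedy, Hopfield-style descent on an energy that counts violated clauses. The formula, variable count, and flip budget are all assumed for the example.

    import random

    # A CNF formula as clause lists; each literal is a variable index,
    # negative for negation (DIMACS-style). Hypothetical 3-variable example.
    CLAUSES = [[1, -2], [2, 3], [-1, -3], [-2, -3]]
    NUM_VARS = 3

    def violated(assign, clauses):
        """Energy of a truth assignment: the number of unsatisfied clauses."""
        sat = lambda cl: any(assign[abs(l)] == (l > 0) for l in cl)
        return sum(not sat(cl) for cl in clauses)

    def greedy_sat(clauses, num_vars, max_flips=1000):
        """Hopfield-style descent: flip the variable whose flip lowers
        the clause-violation energy the most; restart at local minima."""
        a = {v: random.random() < 0.5 for v in range(1, num_vars + 1)}
        for _ in range(max_flips):
            e = violated(a, clauses)
            if e == 0:
                return a
            best_v, best_e = None, e
            for v in a:                       # try every single flip
                a[v] = not a[v]
                e_new = violated(a, clauses)
                a[v] = not a[v]
                if e_new < best_e:
                    best_v, best_e = v, e_new
            if best_v is None:                # stuck: random restart
                a = {v: random.random() < 0.5 for v in a}
            else:
                a[best_v] = not a[best_v]
        return None

    print(greedy_sat(CLAUSES, NUM_VARS))

The paper's choice and exclusion constraints can loosely be read as structured energy terms of this kind, with choice terms enforcing one value per variable and exclusion terms ruling out forbidden combinations.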

Minimization of the 0-1 linear programming problem under linear constraints by using neural networks: synthesis and analysis

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 1996

In this brief, we propose a new design: a Boolean Neural Network (BNN) for the 0-1 linear programming problem under inequality constraints, exploiting the connection between concave programming and integer programming. This connection is based on concavity and penalty-function methods. The resulting general objective function, which combines the original objective with the constraints, is taken as the energy of the system. Simulation results for the new BNN show that the system converges rapidly, within a few neural time constants.
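
As a hedged sketch of this kind of energy construction (the instance, penalty weights, step size, and rounding below are illustrative assumptions, not the paper's): the 0-1 requirement is relaxed to the unit box, constraint violations enter through a quadratic penalty, and a concave term x(1-x), which vanishes only at binary points, discourages interior solutions.

    import numpy as np

    # Hypothetical covering-style instance:
    # minimize c.x subject to A x >= b, x in {0,1}^n.
    c = np.array([3.0, 5.0, 4.0])
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0]])
    b = np.array([1.0, 1.0])
    lam, mu = 10.0, 2.0                  # assumed penalty weights

    def energy(x):
        """Objective + quadratic penalty on violated inequalities
        + concave term x(1-x) that vanishes only at binary points."""
        v = np.maximum(0.0, b - A @ x)
        return c @ x + lam * (v @ v) + mu * (x @ (1.0 - x))

    def grad(x):
        v = np.maximum(0.0, b - A @ x)
        return c - 2.0 * lam * (A.T @ v) + mu * (1.0 - 2.0 * x)

    # Gradient descent on the relaxation, clipped to the unit box,
    # then rounded to the nearest binary point.
    x = np.full(3, 0.5)
    for _ in range(5000):
        x = np.clip(x - 0.005 * grad(x), 0.0, 1.0)

    x_bin = np.round(x)
    print(x_bin, energy(x_bin))          # expect (0, 1, 0) with cost 5

For the rounded point to be feasible, lam has to dominate mu and the objective scale; the concavity argument in the paper is what motivates combining the two terms in a single energy.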

A Connectionist Approach for Solving Large Constraint Satisfaction Problems

1997

An efficient neural network technique is presented for the solution of binary constraint satisfaction problems. The method is based on the application of a double-update technique to the operation of the discrete Hopfield-type neural network that can be constructed for the solution of such problems. This operation scheme ensures that the network moves only between consistent states, in which each problem variable is assigned exactly one value, and leads to a fast and efficient search of the problem state space. Extensions of the proposed method are considered in order to include several optimisation criteria in the search. Experimental results on many real-sized instances of the Radio Link Frequency Assignment Problem demonstrate very good performance.
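
The consistent-state idea can be illustrated with a toy binary CSP: keep exactly one value "neuron" active per variable, and make every move a paired switch-off/switch-on, so the search never leaves a consistent state. The Python sketch below is a simplified, min-conflicts-style rendering of that scheme on a hypothetical colouring instance; the paper's actual update criterion and network weights are not reproduced.

    import random

    # Binary CSP: variables with finite domains and pairwise constraints.
    # Hypothetical toy instance: a 3-node path to be 2-coloured.
    DOMAINS = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
    CONSTRAINTS = [("x", "y", lambda a, b: a != b),
                   ("y", "z", lambda a, b: a != b)]

    def conflicts(assign, var, val):
        """Constraints violated if var took value val, others unchanged."""
        n = 0
        for u, v, ok in CONSTRAINTS:
            if u == var and not ok(val, assign[v]):
                n += 1
            elif v == var and not ok(assign[u], val):
                n += 1
        return n

    def double_update_search(max_steps=1000):
        # Start consistent: every variable holds exactly one value.
        assign = {v: random.choice(d) for v, d in DOMAINS.items()}
        for _ in range(max_steps):
            bad = [v for v in assign if conflicts(assign, v, assign[v])]
            if not bad:
                return assign
            var = random.choice(bad)
            # Double update: deactivate the current value and activate
            # the least-conflicting one in a single, consistent move.
            assign[var] = min(DOMAINS[var],
                              key=lambda d: conflicts(assign, var, d))
        return None

    print(double_update_search())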

An analysis of a class of neural networks for solving linear programming problems

1999

A class of neural networks that solve linear programming problems is analyzed. The neural networks considered are modeled by dynamic gradient systems that are constructed using a parametric family of exact (nondifferentiable) penalty functions. It is proved that for a given linear programming problem and sufficiently large penalty parameters, any trajectory of the neural network converges in finite time to its solution set.
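
A representative member of such a family is the L1 exact penalty E(x) = c.x + lam * sum_i max(0, a_i.x - b_i), whose nonsmoothness is precisely what allows exactness at finite lam. The Python sketch below integrates the corresponding subgradient system with an Euler rule and diminishing steps on a trivial assumed instance; note that finite-time convergence is a property of the continuous-time trajectories, which the discretisation only approximates.

    import numpy as np

    # LP in inequality form: minimize c.x subject to A x <= b.
    # Trivial assumed instance whose solution is the vertex (0, 0).
    c = np.array([1.0, 1.0])
    A = np.array([[-1.0, 0.0],
                  [0.0, -1.0]])       # rows encode x1 >= 0, x2 >= 0
    b = np.zeros(2)
    lam = 10.0                        # assumed sufficiently large

    def subgrad(x):
        """A subgradient of the exact (nondifferentiable) penalty
        E(x) = c.x + lam * sum_i max(0, a_i.x - b_i)."""
        active = (A @ x - b > 0).astype(float)   # violated constraints
        return c + lam * (A.T @ active)

    # Euler discretisation of the gradient system dx/dt in -dE(x),
    # with diminishing steps so the iterates settle on the solution set.
    x = np.array([2.0, 3.0])
    for k in range(4000):
        x = x - (0.5 / (k + 1)) * subgrad(x)

    print(np.round(x, 2))             # approximately (0, 0)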

Neural Networks for Solving Constrained Optimization Problems

In this paper we consider several neural network architectures for solving constrained optimization problems with inequality constraints. We present a new architecture based on the exact penalty function approach. Simulation results based on SIMULINK® models are given and compared.

Amelioration of Modeling and Solving the Weighted Constraint Satisfaction Problems via the Hopfield neural network approach

A wide variety of combinatorial problems can be viewed as Weighted Constraint Satisfaction Problems (WCSPs). Existing resolution methods have exponential time complexity on large instances; moreover, they combine several techniques and use a wide variety of concepts and notations that are difficult to understand and implement. In this paper, we model the problem as an original 0-1 quadratic program subject to linear constraints, and we validate this model with a theorem that we state and prove. Given its performance, we use the Hopfield neural network to solve the resulting model, based on an original energy function. To validate our approach, we solve several WCSP benchmark instances. Our approach has the same memory complexity as the Hopfield neural network and the same time complexity as the Euler-Cauchy method, and it finds the optimal solution of the tested instances.
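
To make the pipeline concrete, here is a schematic Python rendering on a hypothetical two-variable WCSP: one-hot 0-1 variables, a quadratic energy that adds weighted constraint costs to "choice" penalties forcing one value per variable, and continuous Hopfield dynamics integrated with the Euler-Cauchy (explicit Euler) method. The instance, penalty weight, sigmoid gain, and step size are assumptions; the paper's actual energy function and theorem are not reproduced here.

    import numpy as np

    # Toy WCSP: two variables A and B with two values each, encoded
    # one-hot as x = (x_A0, x_A1, x_B0, x_B1) in [0,1]^4.
    # Soft constraints: cost 3 if A=0 and B=0, cost 1 if A=1 and B=1.
    np.random.seed(0)
    alpha = 5.0                       # weight of the "choice" penalty
    W = np.zeros((4, 4))              # symmetric quadratic costs
    W[0, 2] = W[2, 0] = 3.0           # A=0 & B=0 costs 3
    W[1, 3] = W[3, 1] = 1.0           # A=1 & B=1 costs 1

    def energy(x):
        choice = (x[0] + x[1] - 1.0)**2 + (x[2] + x[3] - 1.0)**2
        return 0.5 * x @ W @ x + alpha * choice

    def grad(x):
        g = np.array([x[0] + x[1] - 1.0] * 2 + [x[2] + x[3] - 1.0] * 2)
        return W @ x + 2.0 * alpha * g

    # Continuous Hopfield dynamics du/dt = -dE/dx with x = sigmoid(u),
    # integrated by the Euler-Cauchy (explicit Euler) method.
    u, dt = np.random.randn(4) * 0.1, 0.05
    for _ in range(2000):
        x = 1.0 / (1.0 + np.exp(-4.0 * u))          # steep sigmoid
        u = np.clip(u - dt * grad(x), -20.0, 20.0)  # bounded potentials

    print(np.round(x), energy(np.round(x)))   # a zero-cost one-hot state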

A new neural network for solving linear programming problems

European Journal of Operational Research, 1996

We propose and analyse a new class of neural network models for solving linear programming (LP) problems in real time. We introduce a novel energy function that transforms linear programming into a system of nonlinear differential equations. This system of differential equations can be solved on-line by a simplified low-cost analog neural network containing only a single artificial neuron with adaptive synaptic weights. The network architecture is suitable for currently available CMOS VLSI implementations. An important feature of the proposed neural network architecture is its flexibility and universality. The correctness and performance of the proposed neural network are illustrated by extensive computer simulation experiments.