Heuristic Search Research Papers - Academia.edu
Particle Swarm Optimization (PSO) is a popular heuristic search algorithm inspired by the social learning of bird flocks and fish schools. It is a swarm intelligence technique for optimization developed by Eberhart and Kennedy [1] in 1995. The inertia weight is an important parameter in PSO that significantly affects convergence and the exploration-exploitation trade-off during the search. Since the inception of the inertia weight in PSO, a large number of inertia weight strategies have been proposed. In order to propose one or more ...
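The role of the inertia weight can be seen directly in the PSO velocity update. Below is a minimal one-dimensional sketch (the update generalises per dimension); the parameter names `w`, `c1`, `c2` and the linearly decreasing schedule are common conventions from the literature, not the specifics of any one strategy surveyed in the paper.

```python
import random

def pso_step(positions, velocities, pbest, gbest, w, c1=2.0, c2=2.0):
    """One PSO update in 1-D. w is the inertia weight: large w preserves
    momentum (exploration), small w favours attraction to the personal
    best pbest[i] and global best gbest (exploitation)."""
    new_pos, new_vel = [], []
    for x, v, p in zip(positions, velocities, pbest):
        r1, r2 = random.random(), random.random()
        v_new = w * v + c1 * r1 * (p - x) + c2 * r2 * (gbest - x)
        new_vel.append(v_new)
        new_pos.append(x + v_new)
    return new_pos, new_vel

def linear_inertia(t, t_max, w_max=0.9, w_min=0.4):
    # A widely used strategy: decrease w linearly from w_max to w_min,
    # so the swarm explores early and exploits late in the run.
    return w_max - (w_max - w_min) * t / t_max
```

With `c1 = c2 = 0` the update reduces to pure momentum `v_new = w * v`, which makes the damping effect of the inertia weight easy to verify by hand.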
From 1955 to 1965, the most well-known checker-playing program was Samuel's (1967, 1959). This work remains a milestone in AI research. Samuel's program reportedly beat a master and solved the game of checkers. Both journalistic claims were false, but they ...
We propose an exact method that combines resolution search and branch & bound for solving the 0-1 Multidimensional Knapsack Problem. This algorithm is able to solve large-scale strongly correlated instances to proven optimality. The optimal values of the 10-constraint, 500-variable instances of the OR-Library are reported. These values were previously unknown.
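For reference, the 0-1 multidimensional knapsack problem and a plain depth-first branch & bound over it can be sketched as follows. This toy version uses a deliberately loose upper bound (the sum of all remaining profits); the resolution-search hybrid proposed in the paper relies on far stronger bounds, so this sketch only illustrates the branching and pruning scheme, not the paper's method.

```python
def mkp_branch_and_bound(profits, weights, capacities):
    """Exact 0-1 multidimensional knapsack by depth-first branch & bound.
    weights[i][k] is the consumption of resource k by item i; capacities[k]
    is the budget of resource k. Returns the optimal total profit."""
    n, m = len(profits), len(capacities)
    # suffix[i] = total profit of items i..n-1, a (loose) upper bound
    suffix = [0] * (n + 1)
    for i in range(n - 1, -1, -1):
        suffix[i] = suffix[i + 1] + profits[i]
    best = 0

    def dfs(i, value, used):
        nonlocal best
        if value + suffix[i] <= best:
            return  # prune: even taking every remaining item cannot win
        if i == n:
            best = value
            return
        # branch 1: take item i if it fits all m resource constraints
        if all(used[k] + weights[i][k] <= capacities[k] for k in range(m)):
            dfs(i + 1, value + profits[i],
                [used[k] + weights[i][k] for k in range(m)])
        # branch 0: skip item i
        dfs(i + 1, value, used)

    dfs(0, 0, [0] * m)
    return best
```

For example, with profits `[6, 5, 4]`, two resources, and capacities `[5, 4]`, the search proves that taking the first two items (profit 11) is optimal.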
This paper is a review of the approaches developed to solve 2D packing problems with meta-heuristic algorithms. As packing tasks are combinatorial problems with very large search spaces, the recent literature encourages the use of meta-heuristic search methods, in ...
CREAM (“Constraints with Rules to EAse Modelling”) is a general rule-based modelling language designed to make it easy for engineers to formulate combinatorial problems and to integrate domain-specific knowledge into libraries.
This work had a practical application within the European STReP Net-WMS, which aimed at tackling complex real-world 3D packing problems (with rotations) subject to weight and stability constraints from the automotive industry (PSA, FIAT).
- by Todd Zenger and +1
- Management, Marketing, Knowledge sharing, Theory of the Firm
Dynamic real-time scheduling can be applied to a broad range of industrial problems to provide a suitable allocation of resources. The majority of published work concentrates on systems in which a separate off-line monitoring system provides the knowledge that aids and reduces the complexity of the operations of a real-time scheduling system, whose main requirement is to produce a suitable schedule within a specified critical time limit. This paper introduces a solution to the real-time lift scheduling problem using a prioritised form of A* heuristic search, supported by a lift monitoring system. System and traffic models obtained from an existing lift installation are used to generate results showing that the performance of the new intelligent lift scheduling system compares favourably with that of a conventional scheduler.
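The A* machinery underlying such a scheduler can be sketched as a generic best-first search with f(n) = g(n) + h(n). This is not the paper's prioritised, lift-specific variant; the graph and heuristic interfaces below are illustrative assumptions, and a real-time scheduler would additionally bound the search by the critical time limit.

```python
import heapq

def a_star(start, goal, neighbours, h):
    """Generic A*: expands nodes in order of f(n) = g(n) + h(n).
    neighbours(n) yields (successor, edge_cost) pairs; h must be
    admissible (never overestimate) for the result to be optimal.
    Returns (cost, path) or None if the goal is unreachable."""
    open_heap = [(h(start), 0, start)]
    best_g = {start: 0}
    parent = {start: None}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            path = []
            while node is not None:       # walk parents back to start
                path.append(node)
                node = parent[node]
            return g, path[::-1]
        if g > best_g.get(node, float("inf")):
            continue                      # stale queue entry, skip
        for nxt, cost in neighbours(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                parent[nxt] = node
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None
```

A prioritised variant, as described above, would typically bias the queue ordering by call priority rather than by pure path cost.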
Software module clustering is the problem of automatically partitioning the structure of a software system, using low-level dependencies in the source code, in order to understand and improve the system's architecture. Munch, a clustering tool based on search-based software engineering techniques, was used to modularise a unique dataset of sequential source-code software versions. This paper employs a seeding technique, based on the results of previous modularisations, to improve the effectiveness and efficiency of the procedure. To reduce the running time further, a statistic for controlling the number of iterations of the modularisation, based on the similarity between time-adjacent graphs, is introduced. We examine the convergence of the heuristic search technique and estimate and evaluate a number of stopping criteria. The paper reports the results of extensive experiments conducted on our comprehensive time-series dataset and provides evidence to support our proposed techniques.
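Search-based module clustering with seeding can be sketched as a simple hill climb over module assignments. The fitness below (count of intra-module dependencies) is an assumed stand-in; Munch itself uses richer modularisation metrics. The point of the sketch is the seeding idea: starting the climb from the previous version's assignment rather than from a random one.

```python
import random

def hill_climb_clustering(deps, n_modules, seed_assignment=None,
                          iters=2000, rng=None):
    """Hill-climb a node -> module assignment. deps is a list of
    (from, to) dependency pairs. seed_assignment, if given, seeds the
    search (e.g. with the previous version's clustering); otherwise a
    random assignment is used. Returns (assignment, best_fitness)."""
    rng = rng or random.Random(0)
    nodes = sorted({a for a, b in deps} | {b for a, b in deps})
    assign = dict(seed_assignment) if seed_assignment else \
             {n: rng.randrange(n_modules) for n in nodes}

    def fitness(a):
        # Assumed fitness: dependencies kept inside a single module.
        return sum(1 for u, v in deps if a[u] == a[v])

    best = fitness(assign)
    for _ in range(iters):
        node = rng.choice(nodes)
        old = assign[node]
        assign[node] = rng.randrange(n_modules)
        f = fitness(assign)
        if f >= best:
            best = f          # keep improving (or plateau) moves
        else:
            assign[node] = old  # revert worsening moves
    return assign, best
```

Seeding with a good previous assignment guarantees the climb never falls below the seed's fitness, which is exactly why reusing earlier modularisations speeds up later ones.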
- by Jonathan Schaeffer and +1
- Heuristic Search, Problem Solving, Cg, Search Space
Most work in heuristic search considers problems where a low-cost solution is preferred (MIN problems). In this paper, we investigate the complementary setting, where a solution of high reward is preferred (MAX problems). Example MAX problems include finding the longest simple path in a graph, maximal coverage, and various constraint optimization problems. We examine several popular search algorithms for MIN problems (optimal, suboptimal, and bounded suboptimal) and discover the curious ways in which they misbehave on MAX problems. We propose modifications that preserve the original intentions behind the algorithms but allow them to solve MAX problems, and compare them theoretically and empirically. Interesting results include the failure of bidirectional search and the close relationships discovered between Dijkstra's algorithm, weighted A*, and depth-first search. This work demonstrates that MAX problems demand their own heuristic search algorithms, which are worthy objects of ...
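A concrete illustration of why MIN-problem machinery breaks on MAX problems is the longest simple path: subpaths of an optimal solution need not themselves be optimal, so Dijkstra-style label correction is unsound and some form of enumeration is required. The exhaustive sketch below is illustrative only and is not one of the paper's proposed algorithms.

```python
def longest_simple_path(graph, start, goal):
    """Exhaustive DFS for the longest simple start-goal path (a MAX
    problem). graph maps a node to (successor, edge_cost) pairs.
    Returns the best total cost, or -1 if goal is unreachable.
    Unlike shortest paths, we cannot settle a node once and for all:
    a cheaper prefix may exclude the long detour we actually want."""
    best = -1

    def dfs(node, length, visited):
        nonlocal best
        if node == goal:
            best = max(best, length)
            return
        for nxt, cost in graph.get(node, []):
            if nxt not in visited:        # "simple" = no repeated nodes
                dfs(nxt, length + cost, visited | {nxt})

    dfs(start, 0, {start})
    return best
```

On the tiny graph A→B, A→C, B→C, C→D (unit costs), the longest simple A-to-D path is A-B-C-D with cost 3, even though A-C-D reaches the goal more cheaply.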
This paper proposes a weighted double-heuristic search algorithm to find the shortest path between two points. It can be used in numerous fields such as graph theory, game theory, and networking. This algorithm, called T*, uses the weighted heuristic evaluation function f(x) = α × t(x) + β × h1(x) + γ × h2(x). It selects the path which minimises f(x), where x is the current node on the path, t(x) is the cost of the path from the start to x, h1(x) is a heuristic that estimates the cost from x to the straight line passing through the start and the target, and h2(x) is a heuristic that estimates the cost of the cheapest path from x to the target. The coefficients α, β, and γ indicate the effective weight of each sub-function in f(x). The T* algorithm is compared to the Greedy and A* algorithms in terms of hit rate and the number of processed nodes. The comparison results show that the proposed algorithm achieves high efficiency compared to the other algorithms.
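Under a common geometric reading of the two heuristics (h1 as the perpendicular distance from x to the start-target line, h2 as the Euclidean distance from x to the target), the evaluation function can be sketched as follows. The exact heuristics used by T* may differ, so treat this as an assumption-labelled illustration of f(x) = α × t(x) + β × h1(x) + γ × h2(x).

```python
import math

def t_star_f(x, start, target, g_cost, alpha=1.0, beta=1.0, gamma=1.0):
    """Evaluate f(x) = alpha*t(x) + beta*h1(x) + gamma*h2(x) for a 2-D
    point x. g_cost plays the role of t(x); h1 is the (assumed)
    perpendicular distance from x to the line through start and target,
    pulling the search back toward the direct corridor; h2 is the
    straight-line estimate of the remaining cost to the target."""
    (sx, sy), (tx, ty), (px, py) = start, target, x
    # h1: point-to-line distance |(ty-sy)px - (tx-sx)py + tx*sy - ty*sx| / |target-start|
    num = abs((ty - sy) * px - (tx - sx) * py + tx * sy - ty * sx)
    den = math.hypot(tx - sx, ty - sy)
    h1 = num / den if den else 0.0
    # h2: Euclidean distance from x to the target
    h2 = math.hypot(tx - px, ty - py)
    return alpha * g_cost + beta * h1 + gamma * h2
```

Setting β = 0 and α = γ = 1 recovers the ordinary A* evaluation with a straight-line heuristic, which makes the role of the extra corridor term easy to isolate when tuning the weights.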