Todd Wareham - Academia.edu

Papers by Todd Wareham

A Change for The Better? Assessing the Computational Cost of Re-representation

PsycEXTRA Dataset

In this supplementary material, we provide proofs of the complexity-theoretic statements appearing in our paper. In order to prove intractability results for the input-output mappings associated with the computational-level models of re-representation formulated in our paper, we must formalize the various entities mentioned in these input-output mappings. We first review the graph-based formalization of analogy derivation given in [6] (Section 1). This is followed by a brief summary of techniques for establishing polynomial-time and fixed-parameter intractability (Section 2) and the proofs of the various intractability and tractability results presented in our paper (Section 3).

Similarity as Tractable Transformation

Proceedings of the Annual Meeting of the Cognitive Science Society

According to the transformational approach to similarity, two objects are judged to be more similar the simpler the transformation of one of the object representations into the other. This approach draws inspiration from the mathematical theory of Kolmogorov complexity, but otherwise remains an informal theory to this day. In this paper we investigate several different ways in which the informal theory of transformational similarity can be understood, providing a formalization for each possible reading. We then study the computational (in)tractability of each formalization for a variety of parameter settings. Our results have both theoretical and empirical implications for transformational approaches to similarity.
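
To make the Kolmogorov-inspired intuition concrete, one natural reading (an illustrative sketch of ours, not necessarily any of the paper's specific formalizations) fixes a set T of admissible transformation operations and measures distance as the length of the shortest operation sequence mapping one representation to the other:

```latex
% Illustrative reading of transformational similarity (our sketch; the paper
% formalizes several distinct readings): distance is the length of the
% shortest transformation sequence over operation set T mapping x to y,
% mirroring conditional Kolmogorov complexity K(y | x); similarity is then
% taken to decrease monotonically with this distance.
d_T(x, y) = \min \{\, |t| \;:\; t \in T^{*},\ t(x) = y \,\}
```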

The parameterized complexity of sequence alignment and consensus (Extended Abstract)

The Longest Common Subsequence problem is examined from the point of view of parameterized computational complexity. There are several ways in which parameters enter the problem: the number of sequences to be analyzed, the length of the common subsequence, and the size of the alphabet. Lower bounds on the complexity of this basic problem imply lower bounds on more general sequence alignment and consensus problems. At issue in the theory of parameterized complexity is whether a problem can be solved in time O(n^α) for each fixed parameter value k, where α is a constant independent of k (termed fixed-parameter tractability). It can be argued that this is the appropriate asymptotic model of feasible computability for problems for which a small range of parameter values covers important applications — a situation which certainly holds for many problems in sequence analysis. Our main results show that: (1) The Longest Common Subsequence (LCS) parameterized by the number of sequences to be ...
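
To see why the number of sequences k is a natural parameter here, consider the textbook dynamic program for LCS generalized to k sequences (our sketch, not the paper's): it is correct for every k, but its state space, and hence its running time, is O(n^k), so the exponent grows with the parameter. The parameterized question is whether that dependence of the exponent on k is avoidable.

```python
from functools import lru_cache

def lcs_length(seqs):
    """Length of the longest common subsequence of k sequences.

    The memoized recursion touches O(n^k) index tuples for k sequences of
    length n, so the exponent of the running time grows with k -- exactly
    the dependence that fixed-parameter tractability asks to eliminate.
    """
    k = len(seqs)

    @lru_cache(maxsize=None)
    def rec(pos):
        # pos is a tuple of current indices, one per sequence.
        if any(p == len(s) for p, s in zip(pos, seqs)):
            return 0
        take = 0
        if all(s[p] == seqs[0][pos[0]] for p, s in zip(pos, seqs)):
            # All sequences agree on the current symbol: option to match it.
            take = 1 + rec(tuple(p + 1 for p in pos))
        # Alternatively, advance any one sequence past its current symbol.
        skip = max(rec(pos[:i] + (pos[i] + 1,) + pos[i + 1:]) for i in range(k))
        return max(take, skip)

    return rec(tuple(0 for _ in seqs))

print(lcs_length(["chimpanzee", "champagne", "chaperone"]))  # prints 5
```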

Introducing Phon: A Software Solution for the Study of Phonological Acquisition

Boston University Conference on Language Development. Proceedings, 2006

Quartet Cleaning: Improved Algorithms and Simulations

A critical step in all quartet methods for constructing evolutionary trees is the inference of the topology for each set of four species (i.e., quartet). All quartet topology inference methods are known to make mistakes that result in incorrectly inferred quartet topologies; these mistakes are called quartet errors. In this paper, two efficient algorithms for correcting bounded numbers of quartet errors are presented. These "quartet cleaning" algorithms are shown to be optimal in that no algorithm can correct more quartet errors. An extensive simulation study reveals that sets of quartet topologies inferred by three popular methods (Neighbor Joining [15], Ordinal Quartet [14] and Maximum Parsimony [10]) almost always contain quartet errors and that a large portion of these quartet errors are corrected by the quartet cleaning algorithms.
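
For readers unfamiliar with the objects involved: a quartet on four leaves {a, b, c, d} has three possible resolved topologies (ab|cd, ac|bd, ad|bc), and the topology a tree induces on a quartet can be read off from leaf-to-leaf distances via the four-point condition. The sketch below is our toy illustration of how quartet topologies and quartet errors can be represented; it is not one of the paper's cleaning algorithms, and the tree and names are made up.

```python
from collections import deque
from itertools import combinations

def bfs_distances(adj, src):
    """Hop distances from src in a tree given as an adjacency dict."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def induced_quartet(dist, a, b, c, d):
    """Topology the tree induces on {a,b,c,d}, via the four-point condition.

    Of the three resolved topologies ab|cd, ac|bd, ad|bc, the induced one
    is the split minimizing the sum of within-pair distances.
    """
    splits = [((a, b), (c, d)), ((a, c), (b, d)), ((a, d), (b, c))]
    return min(splits, key=lambda s: dist[s[0][0]][s[0][1]] + dist[s[1][0]][s[1][1]])

# Toy reference tree ((a,b),c,(d,e)) with internal nodes x, y, z.
adj = {
    'a': ['x'], 'b': ['x'], 'c': ['y'], 'd': ['z'], 'e': ['z'],
    'x': ['a', 'b', 'y'], 'y': ['x', 'c', 'z'], 'z': ['y', 'd', 'e'],
}
leaves = ['a', 'b', 'c', 'd', 'e']
dist = {u: bfs_distances(adj, u) for u in leaves}
true_topology = {q: induced_quartet(dist, *q) for q in combinations(leaves, 4)}

def count_quartet_errors(inferred):
    """Quartets whose inferred split disagrees with the induced one."""
    return sum(inferred[q] != true_topology[q] for q in inferred)
```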

Identifying sources of intractability in cognitive models: An illustration using analogical structure mapping

Many computational models in cognitive science and artificial intelligence face the problem of computational intractability when assumed to operate for unrestricted input domains. Tractability may be achieved by restricting the input domain, but some degree of generality is typically required to model human-like intelligence. Moreover, it is often not obvious which restrictions will render a model tractable. We present an analytical tool that can be used to identify sources of intractability in a model’s input domain. For our illustration, we use Gentner’s Structure-Mapping Theory of analogy as a running example.

Exploring Algorithmic Options for the Efficient Design and Reconfiguration of Reactive Robot Swarms

Proceedings of the 9th EAI International Conference on Bio-inspired Information and Communications Technologies (formerly BIONETICS)

A key challenge in robot swarm engineering is the design of individual robot controllers such that the robots as a group can perform a specified task. In this paper, we explore algorithmic options for designing and reconfiguring swarms of synchronous reactive robots to perform a joint navigation / morphogenesis task in a known world. Our results show that neither of these problems can be solved both efficiently and correctly either in general or relative to a surprisingly large number of restrictions on robot and swarm architecture. We also give restrictions under which these problems can be solved both efficiently and correctly.

My Thesis Notes

We can now talk about algorithm efficiency. Typical computational resources of interest are time and space, which correspond to the number of instructions executed or the amount of memory used by the algorithm when it is implemented on some standard type of computer, e.g., a deterministic Turing machine (for detailed descriptions of the various kinds of Turing machines, see [2, 3, 4]). For some resource R and problem Π, let R_A : D_Π → N be the function that gives the amount of resource R that is used by algorithm A to solve a given instance of Π. The resource-usage behavior of an algorithm over all possible instances of its associated problem is typically stated in terms of a function of instance size that summarizes this behavior in some useful manner. The creation of such functions has three steps:
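
Although the three steps themselves are truncated above, the conventional endpoint of such a construction is worth recording (a standard worst-case definition, supplied here as an assumption about where the notes are heading, not a quotation of them):

```latex
% Worst-case resource-usage function for algorithm A on problem \Pi:
% summarize R_A over all instances of a given size n by taking the maximum,
% which is then typically bounded asymptotically, e.g. T_A(n) = O(n^2).
T_A(n) \;=\; \max_{x \in D_\Pi,\ |x| = n} R_A(x)
```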

Assessing the computational adequacy of the General Problem Solver model

Cognitive Science, 2014

Zahra Sajedinia and Todd Wareham (Memorial University of Newfoundland, St. John’s, NL, Canada). Abstract: Problem solving is a core cognitive ability. Human performance varies widely in solving different types of problems. Ideally, cognitive models of problem solving should explain these variations in two ways: (1) the model should reproduce the sequences of actions applied by humans during problem solving (empirical adequacy), and (2) the time required by the model should match that required by humans, i.e., the model should be fast (slow) when humans are fast (slow) (computational adequacy). The former can be assessed by traditional psychological experiments; however, the latter requires the application of techniques from computational complexity theory. In this poster, we describe the first formal assessment of the computational adequacy of Newell and Simo...

Bridging the communicative gap between robots and humans, by analogy

Cognitive Science, 2015

The ability to create and understand novel communicative signals is exemplary of people’s creative and inferential abilities. For example, when traveling and unable to speak the local language, we can make ourselves understood by creating novel gestures. This ability is a form of abductive inference, and requires people to generate novel hypotheses about possible meanings of signals (abduction proper). We propose that novel hypotheses may be generated from scratch by re-conceptualizing perceptual and conceptual representations through analogical augmentation. We plan to use robotics methodology to assess the plausibility of this model. By enhancing a robot with analogical augmentation we aim to enable it to generate novel gestures based on analogies. This lays the groundwork for more natural human-robot interaction. Furthermore, by studying the robot’s gestures and to what extent people can understand them, we gain better understanding of the abduction-based computational processes ...

Natural Language Syllable Alignment: From Conception to Implementation

Several types of linguistic analyses, e.g., studies of first or second language acquisition, require the alignment of the (actual) syllables produced by a speaker with those in the corresponding intended (target) utterance. Maddocks (2005) described a dynamic programming algorithm for performing such alignments. In this paper, we will describe (1) the implementation of this algorithm and its integration into Phon, a software program for analyzing language acquisition data (Rose et al. 2005) and (2) the development of a genetic algorithm for optimizing the performance of this algorithm relative to a user-provided corpus of actual / target utterance pairs. Keywords: Software Design; User Interface Design; Dynamic Programming; Genetic Algorithm
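
As a concrete picture of what such an alignment computes, here is a generic Needleman-Wunsch-style dynamic program over syllable sequences. This is a minimal sketch with made-up scores; Maddocks' (2005) algorithm and Phon's actual scoring scheme differ in their details.

```python
def align_syllables(actual, target, match=2, mismatch=-1, gap=-1):
    """Global alignment of two syllable sequences by dynamic programming.

    Returns the optimal score and one optimal alignment as a list of
    (actual_syllable_or_None, target_syllable_or_None) pairs.
    """
    n, m = len(actual), len(target)
    # score[i][j] = best score aligning actual[:i] with target[:j].
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if actual[i - 1] == target[j - 1] else mismatch
            score[i][j] = max(score[i - 1][j - 1] + sub,   # align the pair
                              score[i - 1][j] + gap,       # syllable deleted
                              score[i][j - 1] + gap)       # syllable inserted
    # Trace back one optimal alignment.
    pairs, i, j = [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + \
                (match if actual[i - 1] == target[j - 1] else mismatch):
            pairs.append((actual[i - 1], target[j - 1])); i, j = i - 1, j - 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            pairs.append((actual[i - 1], None)); i -= 1
        else:
            pairs.append((None, target[j - 1])); j -= 1
    return score[n][m], pairs[::-1]

# A child's production "nana" aligned against the target "banana":
print(align_syllables(["na", "na"], ["ba", "na", "na"]))
```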

How hard is cognitive science?

This is a preprint version of a published paper. Please cite as: Rich, P., de Haan, R., Wareham, T., & van Rooij, I. (2021). How hard is cognitive science? Proceedings of the Annual Meeting of the Cognitive Science Society, 43. Retrieved from https://escholarship.org/uc/item/8cr8x1c4

Exploring Algorithmic Options for the Efficient Design and Reconfiguration of Reactive Robot Swarms

A key challenge in robot swarm engineering is the design of individual robot controllers such that the robots as a group can perform a specified task. In this paper, we explore algorithmic options for designing and reconfiguring swarms of synchronous reactive robots to perform a joint navigation / morphogenesis task in a known world. Our results show that neither of these problems can be solved both efficiently and correctly either in general or relative to a surprisingly large number of restrictions on robot and swarm architecture. We also give restrictions under which these problems can be solved both efficiently and correctly.

Computational complexity analysis for cognitive scientists

Iris van Rooij [contact person], Johan Kwisthout, and Mark Blokpoel (Radboud University Nijmegen, Donders Institute for Brain, Cognition and Behaviour, Montessorilaan 3, 6525 HR Nijmegen, The Netherlands, i.vanrooij@donders.ru.nl) and Todd Wareham (Department of Computer Science, Memorial University of Newfoundland, St. John’s, NL, Canada). Keywords: computational complexity theory, computational modeling, intractability, NP-hard, scalability, algorithms, fixed-parameter tractability, approximation. ... computational complexity analysis, and (c) learn about the philosophical foundations of, and debate surrounding, the use of computational complexity theory for analyzing computational-level theories of cognition. The tutorial will assume a basic level of knowledge of cognitive psychology and an affinity with computational considerations. Aims and Motivation: Many computational- or rational-level models of cognition postulate computations that app...

On the Computational Complexity of Software (Re)Modularization: Elaborations and Opportunities

Proceedings of the 9th EAI International Conference on Bio-inspired Information and Communications Technologies (formerly BIONETICS), 2016

Software system modularization and remodularization are key challenges in software engineering. All previous research has assumed that these problems are computationally intractable and has hence focused on heuristic methods such as hill-climbing, evolutionary algorithms, and simulated annealing, which are fast but are not guaranteed to produce solutions of optimal or even near-optimal quality. However, this intractability has never been formally established. In this paper, we give the first proofs of the NP-hardness of software modularization and remodularization relative to several models of module-internal connectivity. We also review three popular algorithmic approaches for producing provably optimal or near-optimal solutions efficiently, discuss the applicability of these approaches to practical software modularization and remodularization, and list relevant results in the literature.
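
To ground the contrast the abstract draws, here is the shape of the heuristic approach prior work relied on: a bare-bones hill climber over module assignments with a toy cohesion-versus-coupling score. All names and the scoring function are our simplifications for illustration, not a method or measure from the paper.

```python
import random

def score(deps, assign):
    """Toy quality measure: intra-module edges minus inter-module edges.

    A stand-in for real measures such as MQ (our simplification); deps is a
    list of (u, v) dependency edges and assign maps each unit to a module.
    """
    intra = sum(1 for u, v in deps if assign[u] == assign[v])
    return intra - (len(deps) - intra)

def hill_climb(units, deps, n_modules, iters=10000, seed=0):
    """Naive local search over module assignments (sideways moves allowed).

    Fast, but with no guarantee of optimal or near-optimal quality; this is
    exactly the trade-off the heuristic remodularization literature accepts.
    """
    rng = random.Random(seed)
    assign = {u: rng.randrange(n_modules) for u in units}
    best = score(deps, assign)
    for _ in range(iters):
        u = rng.choice(units)
        old = assign[u]
        assign[u] = rng.randrange(n_modules)
        new = score(deps, assign)
        if new >= best:
            best = new           # accept a non-worsening move
        else:
            assign[u] = old      # revert a worsening move
    return assign, best

# Hypothetical system: five units, four dependencies.
units = ["parser", "lexer", "ast", "ui", "render"]
deps = [("parser", "lexer"), ("parser", "ast"), ("ui", "render"), ("ui", "ast")]
print(hill_climb(units, deps, n_modules=2))
```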

Designing Robot Teams for Distributed Construction, Repair, and Maintenance

ACM Transactions on Autonomous and Adaptive Systems

Designing teams of autonomous robots that can create target structures or repair damage to those structures on either a one-off or ongoing basis is an important problem in distributed robotics. However, it is not known if a team design algorithm for any of these tasks can both have low runtime and produce teams that will always perform their specified tasks quickly and correctly. In this article, we give the first computational and parameterized complexity analyses of several robot team design problems associated with creating, repairing, and maintaining target structures in given environments. Our goals are to establish whether efficient design algorithms exist that operate reliably on all possible inputs and, if not, under which restrictions such algorithms are and are not possible. We prove that all of our design problems are not efficiently solvable in general for heterogeneous robot teams and remain so under a number of plausible restrictions on robot controllers, environments,...

Demons of Ecological Rationality

Cognitive Science, Jan 2, 2017

The Adaptive Toolbox theory answers this question by defining human rationality in terms of a degree of adaptation of decision strategies (heuristics) to different environments (Gigerenzer & Todd, 1999; Todd & Gigerenzer, 2012). When heuristics are adapted to the environment and lead to "good enough" (or even high-quality) decisions, they are said to be ecologically rational. For almost two decades, this theory has been considered a tractable alternative to classical theories of human rationality based on logic or probability theory (Gigerenzer, 2015; Gigerenzer & Todd, 1999). These classical theories have been criticized for postulating intractable (e.g., NP-hard) computations (Arkes, Gigerenzer, & Hertwig, 2016; Gigerenzer, 2008; Oaksford & Chater, 1998), which suggests that humans must possess demonic computational powers in order to make rational decisions (so-called demons of rationality; Gigerenzer & Todd, 1999; Goldstein & Gigerenzer, 1999). It is widely assumed that the Adaptive Toolbox theory circumvents the intractability problem that plagues classical accounts of human rationality, because heuristics are by definition tractable. Yet the notion of ecological rationality hinges on the existence of tractable adaptation processes. Here, we present an argument that, contrary to common belief, the Adaptive Toolbox theory has not yet tamed the intractability demon. Rather, the demon is hiding in the theory's cornerstone assumption that ecological rationality is achieved by processes of adaptation, such as evolution, development, or learning.

Natural Language Syllable Alignment: From Conception to Implementation (2005)

Several types of linguistic analyses, e.g., studies of first or second language acquisition, require the alignment of the (actual) syllables produced by a speaker with those in the corresponding intended (target) utterance. Maddocks (2005) described a dynamic programming algorithm for performing such alignments. In this paper, we will describe (1) the implementation of this algorithm and its integration into Phon, a software program for analyzing language acquisition data (Rose et al. 2005) and (2) the development of a genetic algorithm for optimizing the performance of this algorithm relative to a user-provided corpus of actual / target utterance pairs.

The Parameterized Complexity of Some Problems in Logic and Linguistics (Extended Abstract)

Optimal Algorithms for Local Vertex Quartet
