Stephyn Butcher | Johns Hopkins University

Papers by Stephyn Butcher

Comparative performance and scaling of the Pareto improving particle swarm optimization algorithm

Proceedings of the Genetic and Evolutionary Computation Conference Companion, 2018

The Pareto Improving Particle Swarm Optimization algorithm (PI-PSO) has been shown to perform better than Global Best PSO on a variety of benchmark problems. However, these experiments used benchmark problems of a single dimensionality, namely 32 dimensions. Here we compare global best PSO and PI-PSO on benchmark problems of varying dimensions and with varying numbers of particles. The experiments show that PI-PSO generally achieves better performance than PSO as the number of dimensions increases. PI-PSO also outperforms PSO on problems with the same dimension but with the same or fewer particles.

An actor model implementation of distributed factored evolutionary algorithms

Proceedings of the Genetic and Evolutionary Computation Conference Companion, 2018

With the rise of networked multi-core machines, we have seen an increased emphasis on parallel and distributed programming. In this paper we describe an implementation of Factored Evolutionary Algorithms (FEA) and Distributed Factored Evolutionary Algorithms (DFEA) using the Actor model. FEA and DFEA are multi-population algorithms, which make them good candidates for distributed implementation. The Actor model is a robust architecture for implementing distributed, reactive programs. After walking through the translation of the serial pseudocode into an Actor implementation, we run validation experiments against an FEA baseline. The evidence supports the claim that the Actor versions preserve the semantics and operational performance of the FEA baseline. We also discuss some of the nuances of translating serial pseudocode into an actual distributed implementation.
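As a rough illustration of the actor-style message passing the abstract describes, the sketch below uses Python threads and queues in place of a full Actor framework: each subpopulation actor owns a subset of variables, receives the current global solution, and sends proposed values back to a coordinator. The class names, ownership split, and sphere objective are my own illustrative choices, not the paper's implementation.

```python
# Illustrative sketch only: threads + queues standing in for a real Actor framework.
import queue
import random
import threading

class SubpopulationActor(threading.Thread):
    """Optimizes its subset of variables and reports proposals to the coordinator."""

    def __init__(self, variables, coordinator_inbox):
        super().__init__(daemon=True)
        self.variables = variables              # indices this factor owns
        self.inbox = queue.Queue()
        self.coordinator_inbox = coordinator_inbox

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg["type"] == "stop":
                break
            # Any other message carries the current global solution; perturb only
            # the variables this actor owns (a stand-in for a real EA step).
            solution = list(msg["global_solution"])
            for i in self.variables:
                solution[i] += random.gauss(0.0, 0.1)
            self.coordinator_inbox.put({"variables": self.variables,
                                        "values": [solution[i] for i in self.variables]})

def sphere(x):
    return sum(v * v for v in x)

# Coordinator: holds the global solution and merges improving updates.
dim = 4
global_solution = [random.uniform(-5, 5) for _ in range(dim)]
coordinator_inbox = queue.Queue()
actors = [SubpopulationActor([0, 1], coordinator_inbox),
          SubpopulationActor([2, 3], coordinator_inbox)]
for actor in actors:
    actor.start()

for _ in range(50):
    for actor in actors:
        actor.inbox.put({"type": "step", "global_solution": global_solution})
    for _ in actors:                            # one reply expected per actor
        update = coordinator_inbox.get()
        candidate = list(global_solution)
        for i, value in zip(update["variables"], update["values"]):
            candidate[i] = value
        if sphere(candidate) < sphere(global_solution):
            global_solution = candidate

for actor in actors:
    actor.inbox.put({"type": "stop"})
print("best fitness found:", sphere(global_solution))
```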

Information Exchange and Conflict Resolution in Particle Swarm Optimization Variants

Single population, biologically-inspired algorithms such as the Genetic Algorithm and Particle Swarm Optimization are effective tools for solving a variety of optimization problems. Like many such algorithms, however, they fall victim to the curse of dimensionality. Additionally, these algorithms often suffer from a phenomenon known as hitchhiking, where improved solutions are not unequivocally better for all variables. Insofar as individuals within these populations are deemed to be competitive, one solution to both the curse of dimensionality and the problem of hitchhiking has been to introduce more cooperation. These multi-population algorithms cooperate by decomposing a problem into parts and assigning a population to each part. Factored Evolutionary Algorithms (FEA) generalize this decomposition and cooperation to any evolutionary algorithm. A key element of FEA is a global solution that provides missing information to individual populations and coordinates them. This dissertation e...

Pareto Improving Selection of the Global Best in Particle Swarm Optimization

2018 IEEE Congress on Evolutionary Computation (CEC), 2018

Particle Swarm Optimization is an effective stochastic optimization technique that simulates a swarm of particles that fly through a problem space. In the process of searching the problem space for a solution, the individual variables of a candidate solution will often take on inferior values, a phenomenon characterized as “Two Steps Forward, One Step Back.” Several approaches to solving this problem have introduced varying notions of cooperation and competition. Instead, we characterize the success of these multi-swarm techniques as reconciling conflicting information through a mechanism that makes successive candidates Pareto improvements. We use this analysis to construct a variation of PSO that applies this mechanism to gbest selection. Experiments show that this algorithm performs better than the standard gbest PSO algorithm.
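To make the gbest mechanism concrete, the sketch below shows one plausible reading of Pareto-improving gbest selection: candidate values are merged into the current gbest one variable at a time and kept only when they do not worsen fitness, so each accepted gbest is a Pareto improvement over the previous one. This is an illustration of the general idea, not the exact update rule published in the paper; the sphere objective is a placeholder.

```python
# A minimal sketch of Pareto-improving gbest selection (not the exact published rule).
import random

def sphere(x):
    return sum(v * v for v in x)

def pareto_improving_gbest(gbest, candidate, fitness):
    new_gbest = list(gbest)
    for i, value in enumerate(candidate):
        trial = list(new_gbest)
        trial[i] = value
        if fitness(trial) <= fitness(new_gbest):   # accept only non-worsening changes
            new_gbest = trial
    return new_gbest

dim, n_particles = 8, 20
particles = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
gbest = min(particles, key=sphere)

# One conflict-resolution pass over the swarm's current positions:
for particle in particles:
    gbest = pareto_improving_gbest(gbest, particle, sphere)
print("gbest fitness after Pareto-improving selection:", sphere(gbest))
```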

Information sharing and conflict resolution in distributed factored evolutionary algorithms

Proceedings of the Genetic and Evolutionary Computation Conference, 2018

Competition and cooperation are powerful metaphors that have informed improvements in multi-population algorithms such as the Cooperative Coevolutionary Genetic Algorithm, Cooperative Particle Swarm Optimization, and Factored Evolutionary Algorithms (FEA). However, we suggest a different perspective can give a finer grained understanding of how multi-population algorithms come together to avoid problems like hitchhiking and pseudo-minima. In this paper, we apply the concepts of information sharing and conflict resolution through Pareto improvements to analyze the distributed version of FEA (DFEA). As a result, we find the original DFEA failed to implement FEA with complete fidelity. We then revise DFEA and examine the differences between it and FEA and the new implications for relaxing consensus in the distributed algorithm.

A formal approach to deriving factored evolutionary algorithm architectures

2017 IEEE Symposium Series on Computational Intelligence (SSCI), 2017

Factored Evolutionary Algorithms (FEA) are a class of evolutionary search-based optimization algorithms that have been applied successfully to various problems, such as training neural networks and performing abductive inference in graphical models. An FEA is unique in that it factors the objective function by creating overlapping subpopulations that optimize over a subset of variables of the function. One consideration in using an FEA is determining the appropriate factor architecture, which determines the set of variables each factor will optimize. In this paper, we provide a formal method for deriving factor architectures and give theoretical justification for its use. Specifically, we utilize factor graphs of variables in probabilistic graphical models as a way to define factor architectures. We also prove that a class of problems, such as maximizing NK landscapes, is equivalent to abductive inference in probabilistic graphical models. This allows us to take a factor graph architecture and apply it to NK landscapes and a set of commonly used benchmark functions. Finally, we show empirically that using the factor graph representation to derive factors for FEA provides the best performance in the majority of cases studied.
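As a toy illustration of deriving a factor architecture from a problem's dependency structure, the sketch below builds one factor per NK-landscape subfunction, grouping each variable with the K variables it interacts with. The helper names and random neighborhood generation are my own; the paper derives architectures from factor graphs of probabilistic graphical models more generally.

```python
# Toy illustration: one factor (one subpopulation's variable set) per NK subfunction.
import random

def nk_neighbors(n, k, seed=0):
    """For each variable i, choose the K other variables it interacts with."""
    rng = random.Random(seed)
    return {i: sorted(rng.sample([j for j in range(n) if j != i], k)) for i in range(n)}

def factor_architecture(n, k, seed=0):
    """Group each variable with its interacting variables, mirroring the factor graph."""
    neighbors = nk_neighbors(n, k, seed)
    return [sorted([i] + neighbors[i]) for i in range(n)]

for index, factor in enumerate(factor_architecture(n=8, k=2)):
    print(f"factor {index}: optimizes variables {factor}")
```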

A New Discrete Particle Swarm Optimization Algorithm

Proceedings of the Genetic and Evolutionary Computation Conference 2016, 2016

Particle Swarm Optimization (PSO) has been shown to perform very well on a wide range of optimization problems. One of the drawbacks to PSO is that the base algorithm assumes continuous variables. In this paper, we present a version of PSO that is able to optimize over discrete variables. This new PSO algorithm, which we call Integer and Categorical PSO (ICPSO), incorporates ideas from Estimation of Distribution Algorithms (EDAs) in that particles represent probability distributions rather than solution values, and the PSO update modifies the probability distributions. We describe our new algorithm and compare its performance against other discrete PSO algorithms. In our experiments, we demonstrate that our algorithm outperforms comparable methods on both discrete benchmark functions and NK landscapes, a mathematical framework that generates tunable fitness landscapes for evaluating EAs.
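The following sketch illustrates the distribution-as-particle idea in a heavily simplified form: each variable carries a categorical distribution, concrete solutions are sampled from the particle, and the update shifts probability mass toward the sampled personal best. The update rule, rate, and toy objective here are assumptions for illustration, not the published ICPSO equations.

```python
# Heavily simplified sketch of a distribution-valued particle for discrete variables.
import random

def normalize(dist):
    total = sum(dist.values())
    return {value: p / total for value, p in dist.items()}

def sample(distributions):
    """Draw one concrete solution from a particle's per-variable distributions."""
    return [random.choices(list(d.keys()), weights=list(d.values()))[0]
            for d in distributions]

def move_toward(distributions, solution, rate=0.2):
    """Shift probability mass toward the values appearing in `solution`."""
    updated = []
    for dist, value in zip(distributions, solution):
        shifted = {v: (1 - rate) * p + (rate if v == value else 0.0)
                   for v, p in dist.items()}
        updated.append(normalize(shifted))
    return updated

# Toy discrete problem: maximize the number of variables set to "b".
values, n_vars = ["a", "b", "c"], 6
fitness = lambda solution: solution.count("b")

particle = [normalize({v: random.random() for v in values}) for _ in range(n_vars)]
pbest = sample(particle)
for _ in range(100):
    candidate = sample(particle)
    if fitness(candidate) > fitness(pbest):
        pbest = candidate
    particle = move_toward(particle, pbest)     # the "velocity" acts on distributions
print("best sampled solution:", pbest, "fitness:", fitness(pbest))
```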

Relaxing Consensus in Distributed Factored Evolutionary Algorithms

Proceedings of the Genetic and Evolutionary Computation Conference 2016, 2016

Factored Evolutionary Algorithms (FEA) have proven to be fast and efficient optimization methods, often outperforming established methods using single populations. One restriction to FEA is that it requires a central communication point between all of the factors, making FEA difficult to use in completely distributed settings. The Distributed Factored Evolutionary Algorithm (DFEA) relaxes this requirement on central communication by having neighboring factors communicate directly with one another. While DFEA has been effective at finding good solutions, there is often an increase in computational complexity due to the communication between factors. In previous work on DFEA, the authors required the algorithm reach full consensus between factors during communication. In this paper, we demonstrate that even without full consensus, the performance of DFEA was not statistically different on problems with low epistasis. Additionally, we found that there is a relationship between the convergence of consensus between factors and the convergence of fitness of DFEA.
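As a rough sketch of what relaxing consensus can mean, the code below gives each factor its own copy of the solution and runs a single neighbor-exchange round on a ring, adopting a neighbor's value only when it improves the local copy; full consensus would iterate until every copy agreed. The topology, acceptance rule, and objective are my own simplifications, not the DFEA communication protocol itself.

```python
# Rough sketch of one relaxed (single-round) consensus step between factors.
import random

def sphere(x):
    return sum(v * v for v in x)

def communicate_once(local_copies, fitness):
    n = len(local_copies)
    updated = [list(copy) for copy in local_copies]
    for f in range(n):
        for neighbor in ((f - 1) % n, (f + 1) % n):       # ring topology
            for i, value in enumerate(local_copies[neighbor]):
                trial = list(updated[f])
                trial[i] = value
                if fitness(trial) < fitness(updated[f]):  # adopt only improvements
                    updated[f] = trial
    return updated

dim, n_factors = 6, 4
copies = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_factors)]
copies = communicate_once(copies, sphere)                 # relaxed: a single round
in_consensus = all(copy == copies[0] for copy in copies)  # typically still False
print("fitnesses:", [round(sphere(copy), 3) for copy in copies], "consensus:", in_consensus)
```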

Standard Diagnostic Services for the ATS framework

2009 IEEE AUTOTESTCON, 2009

The US Navy has been supporting the demonstration of several IEEE standards with the intent of implementing these standards for future automatic test system procurement. In this paper, we discuss the second phase of a demonstration focusing on the IEEE P1232 AI-ESTATE standard. This standard specifies exchange formats and service interfaces for diagnostic reasoners. The first phase successfully demonstrated the ability to exchange diagnostic models through semantically enriched XML files. The second phase is focusing on the services and has been implemented using a web-based, service-oriented architecture. Here, we discuss implementation issues and preliminary results.

Not-So-Naïve Bayesian Networks and Unique Identification in Developing Advanced Diagnostics

2006 IEEE Aerospace Conference

Problems in accuracy and effectiveness in system diagnosis and prognosis arise from constructing models from design data that do not match implementation, failing to account for inherent uncertainty in test data, and failing to account for characteristics unique to specific units due to variations in usage, environment, or other factors. Large sums of money have been expended by owners of these systems, but little improvement in measures such as retest-OK rate and cannot duplicate rate has been reported. In fact, simply losing track of where specific units are located has resulted in substantial losses of money. In this paper, we study the problem of performing diagnosis and prognosis on systems and describe an approach to building models based on data collected about specific units. We rely on the emerging Department of Defense (DoD) Unique Identification (UID) program that is focusing on obtaining this data and apply Bayesian methods for constructing such diagnostic models. Specifically, we discuss an alternative class of Bayesian model that we call the "not-so-naive" Bayesian network (NBN). We also discuss the concept of the NBN in the context of the UID program as a means of tracking and deriving probabilities for creating the network. Finally, we focus on the specific problems encountered and lessons learned from working with a large, real-world database for the US Navy's STANDARD Missile.

Asset-specific Bayesian diagnostics in mixed contexts

2007 IEEE Autotestcon, 2007

In this paper we build upon previous work to examine the efficacy of blending probabilities in asset-specific classifiers to improve diagnostic accuracy for a fleet of assets. In previous work we also introduced the idea of using split probabilities. We add environmental differentiation to asset differentiation in the experiments and assume that data is acquired in the context of online health monitoring. We hypothesize that overall diagnostic accuracy will be increased with the blending approach relative to the single aggregate classifier or split probability asset-specific classifiers. The hypothesis is largely supported by the results. Future work will concentrate on improving the blending mechanism and working with small data sets.

On the Linear Separability of Diagnostic Models

2006 IEEE Autotestcon, 2006

As new approaches and algorithms are developed for system diagnosis, it is important to reflect on existing approaches to determine their strengths and weaknesses. Of concern is identifying potential reasons for false pulls during maintenance. Within the aerospace community, one approach to system diagnosis, based on the D-matrix derived from test dependency modeling, is used widely, yet little has been done to perform any theoretical assessment of the merits of the approach. Past assessments have been limited, largely, to empirical analysis and case studies. In this paper, we provide a theoretical assessment of the representation power of the D-matrix and suggest algorithms and model types for which the D-matrix is appropriate. Finally, we relate the processing of the D-matrix with several diagnostic approaches and suggest how to extend the power of the D-matrix to take advantage of the power of those approaches. The extended version has been submitted to the Journal of Electronic Testing: Theory and Application and includes an analysis of the complexity of building optimal fault trees using the linearly separable formalism.

Experiments in Bayesian Diagnostics with IUID-Enabled Data

2006 IEEE Autotestcon, 2006

The Department of Defense (DOD) has recognized the importance of improving asset management and has created Item Unique Identification numbers (IUIDs) to improve the situation. IUIDs will be used to track financial and contract records and obtain location and status information about parts in DoD inventory. IUIDs will also support data collection for weapon systems from build, test, operations, maintenance, repair, and overhaul histories. In addition to improving the overall logistics process, IUIDs offer an opportunity to utilize asset-specific data to improve system maintenance and support. An Office of the Secretary of Defense (OSD) Pilot Project to implement IUID on a Navy weapon system presents an immediate opportunity to evaluate this use of IUID data. This paper reports on experiments conducted to see if a set of asset-specific diagnostic classifiers trained on subsets of data is more accurate than a general, composite classifier trained on all of the data. In general, it is determined that the set is more accurate than the single classifier given enough data. However, other factors play an important role such as system complexity and noise levels in the data. Additionally, the improvements found do not arise until larger amounts of data are available. This suggests that future work should concentrate on tying the process of data collection to the estimation of the associated probabilities.

Demonstrating semantic interoperability of diagnostic models via AI-ESTATE

2009 IEEE Aerospace conference, 2009

The Institute for Electrical and Electronics Engineers (IEEE), through its Standards Coordinating Committee 20 (SCC20), is developing interface standards focusing on Automatic Test System-related elements in cooperation with a Department of Defense (DoD) initiative to define, demonstrate, and mandate such standards. One of these standards, IEEE Std 1232-2002 Artificial Intelligence Exchange and Service Tie to All Test Environments (AI-ESTATE), has been chosen for demonstration prior to mandate. In this paper, we discuss the results of the first phase of the AI-ESTATE demonstration, focusing on semantic interoperability of diagnostic models. The results of this demonstration successfully showed the effectiveness of semantic modeling in information exchange. In addition, the engineering burden was demonstrated to be manageable: all applications were constructed in less than four months by three graduate students working part time.

A Formal Analysis of Fault Diagnosis with D-matrices

Journal of Electronic Testing, 2007

As new approaches and algorithms are developed for system diagnosis, it is important to reflect on existing approaches to determine their strengths and weaknesses. Of concern is identifying potential reasons for false pulls during maintenance. Within the aerospace community, one approach to system diagnosis, based on the D-matrix derived from test dependency modeling, is used widely, yet little has been done to perform any theoretical assessment of the merits of the approach. Past assessments have been limited, largely, to empirical analysis and case studies. In this paper, we provide a theoretical assessment of the representation power of the D-matrix and suggest algorithms and model types for which the D-matrix is appropriate. We also prove a surprising result relative to the difficulty of generating optimal diagnostic strategies from D-matrices. Finally, we relate the processing of the D-matrix with several diagnostic approaches and suggest how to extend the power of the D-matrix to take advantage of the power of those approaches.
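For readers unfamiliar with D-matrices, the short example below uses toy data (not from the paper) to show the basic representation: rows are faults, columns are tests, and a 1 means the test detects the fault. Under a single-fault reading, the consistent diagnoses are those whose row matches the observed pass/fail pattern.

```python
# Toy D-matrix example with invented data: match fault signatures to test outcomes.
D = {
    "fault_A": [1, 1, 0],
    "fault_B": [1, 0, 1],
    "fault_C": [0, 1, 1],
}
outcomes = {0: "fail", 1: "pass", 2: "fail"}     # observed results per test index

def consistent_faults(d_matrix, outcomes):
    candidates = []
    for fault, signature in d_matrix.items():
        matches = all((signature[test] == 1) == (result == "fail")
                      for test, result in outcomes.items())
        if matches:
            candidates.append(fault)
    return candidates

print(consistent_faults(D, outcomes))            # -> ['fault_B']
```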

Employment and Earnings of Performing Artists, 1970-1990. Draft Report, May 1995

Major data sources are described. A summary of findings indicates that during the period from 1970-1990: (1) performing artists as a work force grew 50% faster than the national labor force; (2) intermittent work periods, which are characteristic of the performing arts profession, increase the likelihood that the size of this labor force is being undercounted; (3) performers' unemployment rates consistently exceed the national average, and long-term unemployment and frequent cycles between employment hamper the full utilization of this work force; (4) technologies which substitute for live performances have affected employment in the performing arts industry; (5) performers as a group increased their income levels, but gains are not equally shared and earning levels depend on the type of work performers do; and (6) performing artists tend to cluster in the West and Northwest of the United States, areas which also have the highest rate of unemployment for performing artists. An increase in the work force in the South is a significant trend. Recommendations are given for improved data collection. An appendix includes discussions about general economic and technical issues, an annotated bibliography, and data tables produced from the Artist Extract Files. Additional thanks are due Tony Phipps of SAG and Pam Varricchio and Tony Andretta of Standard Data Corporation, who drew the data we needed from the pension records of the three actor unions. Thanks are also appropriate for Bill Di Paulo of Abruzzi Corporation who transferred the Artist Extract Files from tape to disk. Tom Bailey assisted us with research. We also wish to acknowledge the continuing support we received from Tom Bradshaw, director of the Endowment's Research Division, who oversaw our work and ensured our access to the Artist Extract Files as well as published studies and draft reports.

Improving diagnostic accuracy by blending probabilities: Some initial experiments

Proceedings of the 18th International Workshop on Principles of Diagnosis (DX-07), 2007

Inspired by the impending availability of asset specific data on several US Department of Defense programs, in a previous paper we looked at the possibility that a set of Bayesian diagnostic models constructed from asset specific data would outperform a single Bayesian diagnostic model constructed from all of the data. There were situations where a set of asset-specific classifiers was superior to a single composite classifier but it wasn't universally the case. The hypothesis in this paper is that a blended classifier can be ...

Distributional Smoothing in Bayesian Fault Diagnosis

IEEE Transactions on Instrumentation and Measurement, 2009

Previously, we demonstrated the potential value of constructing asset-specific models for fault diagnosis. We also examined the effects of using split probabilities, where prior probabilities come from asset-specific statistics and likelihoods from fleet-wide statistics. In this paper, we build upon that work to examine the efficacy of smoothing probability distributions between asset-specific and fleet-wide distributions to further improve diagnostic accuracy. In the current experiments, we also add environmental differentiation to asset differentiation under the assumption that data are acquired in the context of online health monitoring. We hypothesize that the overall diagnostic accuracy will be increased with the smoothing approach relative to a fleet-wide model or a set of asset-specific models. The hypothesis is largely supported by the results. Future work will concentrate on improving the smoothing mechanism and on working with small data sets.
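A minimal sketch of the smoothing idea, under my own parameterization rather than the paper's: the blended distribution is a convex combination of asset-specific frequencies and fleet-wide probabilities, with the asset's weight growing as more asset-specific observations accumulate.

```python
# Convex blending of asset-specific frequencies with fleet-wide probabilities.
def smooth(asset_counts, fleet_probs, pseudo_count=10.0):
    """Blend asset-specific fault frequencies with fleet-wide probabilities."""
    n = sum(asset_counts.values())
    weight = n / (n + pseudo_count)            # more asset data -> trust the asset more
    return {fault: weight * (asset_counts.get(fault, 0) / n if n else 0.0)
                   + (1 - weight) * fleet_probs[fault]
            for fault in fleet_probs}

fleet = {"fault_A": 0.7, "fault_B": 0.2, "fault_C": 0.1}
asset = {"fault_B": 4, "fault_C": 1}           # this particular unit mostly shows fault_B
print(smooth(asset, fleet))                    # shifts mass toward fault_B for this unit
```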

Employment and Earnings of Performing Artists, 1970-1990

A Formal Analysis of Fault Diagnosis with D-matrices

As new approaches and algorithms are developed for system diagnosis, it is important to reflect on existing approaches to determine their strengths and weaknesses. Of concern is identifying potential reasons for false pulls during maintenance. Within the aerospace community, one approach to system diagnosis, based on the D-matrix derived from test dependency modeling, is used widely, yet little has been done to perform any theoretical assessment of the merits of the approach. Past assessments have been limited, largely, to empirical analysis and case studies. In this paper, we provide a theoretical assessment of the representation power of the D-matrix and suggest algorithms and model types for which the D-matrix is appropriate. We also prove a surprising result relative to the difficulty of generating optimal diagnostic strategies from D-matrices. Finally, we relate the processing of the D-matrix with several diagnostic approaches and suggest how to extend the power of the D-matrix to take advantage of the power of those approaches.
