Ralph Gailis | Defence Science & Technology Organisation
Papers by Ralph Gailis
Atmospheric Environment, 2016
2012 15th International Conference on Information Fusion, 2012
The Hazard Prediction and Assessment Capability (HPAC) is a hazard modelling tool that predicts the effects of the release of toxic materials to the atmosphere. Detailed meteorological data are required to obtain reliable modelling results, including surface observations, upper-air profiles and gridded forecast data. HPAC is a US-developed model and cannot interpret locally available meteorological data. This report describes the DSTO-built computer program AWSMSTR, which retrieves the relevant data from a remote server and reformats it for use in HPAC.
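As a rough illustration of the retrieve-and-reformat pattern AWSMSTR implements, the Python sketch below downloads CSV surface observations and rewrites them into a fixed-column text file. The URL, the CSV field names and the output layout are all hypothetical placeholders, not the actual AWSMSTR endpoint or the real HPAC input format.

```python
import csv
import urllib.request

SERVER_URL = "https://example.org/met/surface_obs.csv"  # hypothetical endpoint

def fetch_surface_obs(url):
    """Download CSV surface observations from a remote server."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    return list(csv.DictReader(text.splitlines()))

def write_fixed_column(obs, path):
    """Rewrite observations into a fixed-column text file of the kind a
    dispersion model ingests (this layout is illustrative only)."""
    with open(path, "w") as f:
        f.write("# station  lat       lon       wdir  wspd  temp\n")
        for o in obs:  # CSV field names below are assumed, not AWSMSTR's
            f.write(f"{o['station']:<10}{float(o['lat']):<10.3f}"
                    f"{float(o['lon']):<10.3f}{float(o['wdir']):<6.0f}"
                    f"{float(o['wspd']):<6.1f}{float(o['temp']):.1f}\n")

if __name__ == "__main__":
    write_fixed_column(fetch_surface_obs(SERVER_URL), "surface.obs")
```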
This paper outlines the technical solution to be adopted when integrating the Hazard Prediction and Assessment Capability (HPAC) with DSTO's Joint Seminar Wargaming Adjudication Tool (jSWAT). Opportunities to conduct "path of least resistance" integration between the two applications are explored to support an eventual Proof of Concept demonstration. The report concludes with some observations on achievable longer-term integration goals.
This paper undertakes a technical review of the properties of three diseases often associated with the threat of bioterrorism: anthrax, plague and smallpox. While the literature on these agents is extensive, the information is not generally available in a concise form for the purpose of modelling the course of spread of disease through a population, including the emergence of observable indicators of community infection. The focus here is to extract from the literature information describing these agents for the purpose of developing models of symptom emergence and disease progression in individual people, as well as parameters describing how the diseases may spread in a community and be observable. This information can be used to parameterise models of disease diagnosis in a community, for the purpose of analysing candidate syndromic surveillance systems. Information is also provided on common diseases that may be misdiagnosed in the early stages of an outbreak due to an act of bioterrorism, namely influenza, chickenpox, and community-acquired pneumonia. Where information is not available in precise quantitative form, semi-quantitative graphs are provided as a useful summary for developers of probabilistic models of disease diagnosis.
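A minimal sketch of how such semi-quantitative agent descriptions might be structured for a diagnosis model is given below. The field names are invented for illustration, and the example values are placeholders, not figures taken from the report.

```python
from dataclasses import dataclass, field

@dataclass
class DiseaseProfile:
    """Container for the per-agent quantities a diagnosis model needs.
    Field names are invented for illustration."""
    name: str
    incubation_days: tuple          # plausible (low, high) range, placeholder
    symptoms: list = field(default_factory=list)
    confusable_with: list = field(default_factory=list)  # early misdiagnoses

# Example entry: the numbers are placeholders, not values from the report.
anthrax = DiseaseProfile(
    name="inhalational anthrax",
    incubation_days=(1.0, 7.0),
    symptoms=["fever", "cough", "dyspnoea"],
    confusable_with=["influenza", "community-acquired pneumonia"],
)
print(anthrax)
```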
This report summarises recent research conducted by the DSTO in plume model development for urban environments, with an emphasis on establishing clear physical grounds for the models, yet maintaining enough simplicity to be treated numerically in an operationally viable way. The aim is not to replace existing operational models with a new generation of more accurate models, but to provide a more physics-based framework for flow and dispersion in an urban environment that can reconcile the empirically based approach of current operational models, and the more sophisticated computational fluid dynamics techniques now gaining popularity for atmospheric dispersion applications. A key feature of the model framework developed in this report is the definition of a single parameter that describes canopy morphology, and links this to canopy flow variables. A simple canopy dispersion model is then developed, based on flow parameters generated by the canopy model. In relevant areas the well-known Urban Dispersion Model by the UK Defence Science and Technology Laboratory is used as a benchmark for comparison. Supporting evidence for the models developed here is supplied through comparison with experimental data from a water-channel simulation.
Figure: An example of one of the obstacle arrays.
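The report defines its own single canopy-morphology parameter, which is not reproduced here; the sketch below instead uses the commonly cited frontal-area density λ_f with an exponential in-canopy wind profile (the fit a ≈ 9.6 λ_f is one published closure of this kind) to show how a morphology parameter can feed a simple canopy transport speed.

```python
import numpy as np

def canopy_wind(z, U_h, H, lam_f):
    """Exponential in-canopy profile U(z) = U_h * exp(a * (z/H - 1)),
    with the attenuation coefficient tied to morphology via a ~ 9.6 * lam_f
    (one published fit; not the closure used in the report)."""
    a = 9.6 * lam_f
    return U_h * np.exp(a * (np.asarray(z) / H - 1.0))

z = np.linspace(0.0, 10.0, 101)                  # canopy of height H = 10 m
U = canopy_wind(z, U_h=4.0, H=10.0, lam_f=0.2)   # lam_f: frontal-area density
print(f"depth-averaged in-canopy wind ~ {U.mean():.2f} m/s")
```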
The higher-order correlation functions for the concentration fluctuations arising from a two-point-source configuration have been calculated analytically within the context of the phenomenology of a fluctuating plume model (viz., a meandering plume model that explicitly incorporates internal fluctuations). Explicit expressions for the second-, third-, and fourth-order correlation functions between the concentration fluctuations produced by two point sources are given in terms of the source separation d and the five physically based parameters that define the generalized fluctuating plume model: namely, the absolute plume dispersion, σ_a, which determines the outer plume length scale; the relative plume dispersion, σ_r, which determines the inner plume length scale; the fluctuation intensity, i_r, in relative coordinates, which determines the internal concentration fluctuation level; the correlation coefficient, r, between the positions of the centroids of the two interfering plumes; and the correlation coefficient, r*, between the concentration fluctuations of the two plumes in relative coordinates, which determines the degree of internal mixing of the two scalars. Furthermore, the form of the total concentration probability density function arising from the interference produced by two point sources is presented. Predictions for the second-order correlation function, ρ, and for the total concentration probability density function have been compared with some new experimental data for a two-point-source configuration in grid turbulence generated in a water-channel simulation. These results are in good agreement with the data and suggest that the analytical model for the second-order correlation function and the total concentration probability density function can reproduce many qualitative trends in the interaction of plumes from two sources.
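A Monte Carlo analogue of this phenomenology is sketched below: two plumes with correlated meandering centroids and correlated unit-mean internal fluctuation factors, from which the second-order correlation ρ is estimated numerically. The paper's closed-form expressions are not reproduced, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
d, sig_a, sig_r = 1.0, 1.0, 0.5   # source separation, absolute/relative dispersion
i_r, r, r_star = 0.8, 0.7, 0.3    # fluctuation intensity, centroid and internal corr.

# Correlated crosswind centroid positions of the two meandering plumes.
cov = sig_a**2 * np.array([[1.0, r], [r, 1.0]])
y1, y2 = rng.multivariate_normal([-d / 2, d / 2], cov, size=N).T

# Correlated unit-mean internal fluctuation factors with intensity i_r
# (lognormal construction; their correlation is approximately r_star).
s2 = np.log(1.0 + i_r**2)
g = rng.multivariate_normal([0, 0], s2 * np.array([[1, r_star], [r_star, 1]]), size=N)
f1, f2 = np.exp(g - s2 / 2).T

x = 0.0  # receptor midway between the mean plume axes
c1 = f1 * np.exp(-(x - y1) ** 2 / (2 * sig_r**2))
c2 = f2 * np.exp(-(x - y2) ** 2 / (2 * sig_r**2))
print(f"estimated second-order correlation rho ~ {np.corrcoef(c1, c2)[0, 1]:.3f}")
```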
IIE Transactions, 2014
The onset of an epidemic can be foreshadowed theoretically through observation of a number of syndromic signals, such as absenteeism or rising sales of particular pharmaceuticals. The success of such approaches depends on how well the uncertainty associated with the early stages of an epidemic can be managed. This article uses scenario trees to summarize the uncertainty in the parameters defining an epidemiological process and the future path the epidemic might take. Extensive simulations are used to generate various syndromic and epidemic time series, which are then summarized in scenario trees, creating a simple data structure that can be explored quickly at surveillance time without the need to fit models. Decisions can be made based on the subset of the uncertainty (the subtree) that best fits the current observed syndromic signals. Simulations are performed to investigate how well an underlying dynamic model of an epidemic with inhomogeneous mixing and noise fluctuations can capture the effects of social interactions. Two noise terms are introduced to capture the observable fluctuations in the social network connectivity and variation in some model parameters (e.g., infectious time). Finally, it is shown how the entire framework can be used to compare syndromic surveillance systems against each other; to evaluate the effect of lag and noise on accuracy; and to evaluate the impact that differences in syndromic behavior among susceptible and infected populations have on accuracy.
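The scenario-tree idea can be caricatured in a few lines: simulate many trajectories offline over the uncertain parameters, then at surveillance time keep the subset whose early segment best matches the observed signal. The toy stochastic SIR model, noise terms and parameter ranges below are illustrative stand-ins for the paper's full framework.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_syndromic(beta, gamma, T=60, n=10_000, i0=5):
    """Stochastic SIR; returns a noisy daily syndromic-style count."""
    S, I, series = n - i0, i0, []
    for _ in range(T):
        new_inf = rng.binomial(S, 1 - np.exp(-beta * I / n))
        new_rec = rng.binomial(I, 1 - np.exp(-gamma))
        S, I = S - new_inf, I + new_inf - new_rec
        series.append(I + rng.poisson(5))    # signal = infections + background noise
    return np.array(series)

# Offline: a bank of scenarios spanning the uncertain parameters.
scenarios = [(b, g, simulate_syndromic(b, g))
             for b in np.linspace(0.2, 0.6, 9)
             for g in np.linspace(0.1, 0.3, 5)]

# Surveillance time: prune to the "subtree" consistent with the observed prefix.
observed = simulate_syndromic(0.4, 0.2)[:14]          # first two weeks of data
best = sorted(scenarios, key=lambda s: np.linalg.norm(s[2][:14] - observed))[:10]
print("plausible (beta, gamma):", [(round(b, 2), round(g, 2)) for b, g, _ in best])
```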
Boundary-Layer Meteorology, 2006
The dispersion of tracers within a large array of obstacles is investigated in a boundary-layer wind-tunnel simulation. The experiment models at 1:50 scale a large outdoor field study that simulated an urban boundary layer with an array of shipping containers, known as MUST (Mock Urban Setting Test – Yee and Biltoft, 2004, Boundary-Layer Meteorology 111, 363–415). The wind-tunnel results are…
The paper derives the theoretical Cramér-Rao lower bound for parameter estimation of a source (emitting energy, gas or aerosol) monitored by a network of sensors providing binary measurements. The theoretical bound is studied in the context of a continuous release of hazardous gas or aerosol in the atmosphere. Numerical results show good agreement with the empirical errors obtained using an MCMC parameter estimation technique.
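A numerical sketch of such a bound follows. For independent binary sensors with detection probabilities p_k(θ), the Fisher information is I(θ) = Σ_k (∂p_k/∂θ)(∂p_k/∂θ)ᵀ / (p_k(1 − p_k)), and the CRB is its inverse. The plume shape, threshold model and sensor layout here are illustrative, not the paper's setup.

```python
import numpy as np
from scipy.stats import norm

def plume(xy, src, q=1.0, u=2.0):
    """Illustrative 2-D Gaussian plume, wind along +x."""
    dx, dy = xy[:, 0] - src[0], xy[:, 1] - src[1]
    s = 0.1 * np.maximum(dx, 1e-3) + 0.5
    c = q / (np.sqrt(2 * np.pi) * u * s) * np.exp(-dy**2 / (2 * s**2))
    return np.where(dx > 0, c, 0.0)

def fisher(src, sensors, tau=0.02, sn=0.01, h=1e-4):
    """Fisher information for the 2-D source location from binary sensors
    with detection probability p_k = Phi((C_k - tau) / sn)."""
    p = np.clip(norm.cdf((plume(sensors, src) - tau) / sn), 1e-9, 1 - 1e-9)
    J = np.zeros((len(sensors), 2))
    for j in range(2):                        # dp/dtheta by finite differences
        e = np.zeros(2); e[j] = h
        J[:, j] = (norm.cdf((plume(sensors, src + e) - tau) / sn) - p) / h
    w = 1.0 / (p * (1.0 - p))
    return J.T @ (w[:, None] * J)

sensors = np.array([[x, y] for x in np.arange(2.0, 12.0, 2.0) for y in (-2.0, 0.0, 2.0)])
crb = np.linalg.inv(fisher(np.array([0.0, 0.0]), sensors))
print("CRB standard deviations for (x0, y0):", np.sqrt(np.diag(crb)))
```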
Signal Processing, 2015
Localisation of a source of a toxic release of biochemical aerosols in the atmosphere is a problem of great importance for public safety. Two main practical difficulties are encountered in this problem: the lack of knowledge of the likelihood function of measurements collected by biochemical sensors, and the plethora of candidate dispersion models, developed under various assumptions (e.g. meteorological conditions, terrain). Aiming to overcome these two difficulties, the paper proposes a likelihood-free approximate Bayesian computation method, which simultaneously uses a set of candidate dispersion models, to localise the source. This estimation framework is implemented via the Monte Carlo method and tested using two experimental datasets.
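A minimal rejection-ABC sketch in this spirit follows: sample a (model, source) pair from the prior, simulate sensor readings, and accept when the simulation lands within a tolerance of the data. The two toy dispersion "models", the sensor line and the tolerance are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
sensors = np.linspace(1.0, 10.0, 8)           # downwind sensor positions

def model_a(src, x):                          # toy model: fast lateral growth
    s = 0.2 * (x - src) + 0.3
    return np.where(x > src, 1.0 / s, 0.0)

def model_b(src, x):                          # toy model: slow lateral growth
    s = 0.1 * (x - src) + 0.3
    return np.where(x > src, 1.0 / s, 0.0)

models = [model_a, model_b]
data = model_b(0.0, sensors) + rng.normal(0, 0.05, sensors.size)

# Rejection ABC over (model index, source location).
accepted = []
for _ in range(50_000):
    m = rng.integers(len(models))
    src = rng.uniform(-3.0, 3.0)
    if np.linalg.norm(models[m](src, sensors) - data) < 0.25:   # ABC tolerance
        accepted.append((m, src))

ms, srcs = np.array(accepted).T
print(f"P(model B | data) ~ {np.mean(ms == 1):.2f}; "
      f"source ~ {srcs.mean():.2f} +/- {srcs.std():.2f}")
```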
2014 IEEE Workshop on Statistical Signal Processing (SSP), 2014
The Gaussian plume model is the core of most regulatory atmospheric dispersion models. The parameters of the model include the source characteristics (e.g. location, strength, size) and environmental parameters (wind speed, direction, atmospheric stability conditions). A sensor network is available to measure the concentration of a biological pathogen or chemical substance within the plume. This paper presents a theoretical analysis of the best achievable accuracy in estimation of Gaussian plume model parameters. Numerical results illustrate how parameter estimation accuracy depends on sensor measurement accuracy, the density of sensors and the quality of (prior) meteorological advice. The theoretical bounds are compared with empirical errors obtained using an importance sampling parameter estimation technique.
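The importance-sampling side of that comparison can be sketched as below: draw candidate plume parameters from a prior proposal, weight each by the Gaussian measurement likelihood, and form weighted posterior estimates. The plume closure, sensor layout and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.array([[2., 0.], [4., 1.], [4., -1.], [6., 0.], [8., 2.], [8., -2.]])

def plume(theta, xy):
    """theta = (x0, y0, q): source location and strength; illustrative closure."""
    x0, y0, q = theta
    dx, dy = xy[:, 0] - x0, xy[:, 1] - y0
    s = 0.15 * np.maximum(dx, 1e-3) + 0.4
    return np.where(dx > 0, q / s * np.exp(-dy**2 / (2 * s**2)), 0.0)

y = plume(np.array([0.0, 0.0, 1.0]), X) + rng.normal(0, 0.02, len(X))  # sensor data

M = 50_000                                             # proposal = prior samples
samples = np.column_stack([rng.uniform(-2, 2, M),      # x0
                           rng.uniform(-2, 2, M),      # y0
                           rng.uniform(0.2, 2.0, M)])  # q
resid = np.array([y - plume(t, X) for t in samples])
logw = -0.5 * np.sum(resid**2, axis=1) / 0.02**2       # Gaussian log-likelihood
w = np.exp(logw - logw.max()); w /= w.sum()            # self-normalised weights
print("posterior mean (x0, y0, q):", samples.T @ w)
```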
Information Fusion, 2014
The Gaussian plume model is the core of most regulatory atmospheric dispersion models. The parameters of the model include the source characteristics (e.g. location, strength) and environmental parameters (wind speed, direction, atmospheric stability conditions). The paper presents a theoretical analysis of the best achievable accuracy in estimation of Gaussian plume parameters in the context of a continuous point-source release and using a binary sensor network for acquisition of measurements. The problem is relevant for automatic localisation of atmospheric pollutants with applications in public health and defence. The theoretical bounds of achievable accuracy provide a guideline for sensor network deployment and its performance under various environmental conditions. The bounds are compared with empirical errors obtained using a Markov chain Monte Carlo (MCMC) parameter estimation technique.
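A minimal Metropolis-Hastings sketch for this binary-sensor setting is shown below, with a Bernoulli likelihood over detect/no-detect readings and random-walk proposals over the source location. Constants and geometry are illustrative, not those of the paper's experiments.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
sensors = np.array([[x, y] for x in range(1, 9) for y in (-2, 0, 2)], float)

def p_detect(src, tau=0.05, sn=0.05):
    """Bernoulli detection probability at each sensor for a source at src."""
    dx, dy = sensors[:, 0] - src[0], sensors[:, 1] - src[1]
    s = 0.15 * np.maximum(dx, 1e-3) + 0.4
    c = np.where(dx > 0, np.exp(-dy**2 / (2 * s**2)) / s, 0.0)
    return np.clip(norm.cdf((c - tau) / sn), 1e-6, 1 - 1e-6)

z = rng.random(len(sensors)) < p_detect(np.array([0.0, 0.5]))   # binary data

def loglik(src):
    p = p_detect(src)
    return np.sum(np.where(z, np.log(p), np.log1p(-p)))

theta = np.array([-1.0, -1.0])
ll, chain = loglik(theta), []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.2, 2)      # random-walk proposal
    llp = loglik(prop)
    if np.log(rng.random()) < llp - ll:       # Metropolis accept/reject
        theta, ll = prop, llp
    chain.append(theta)
chain = np.array(chain[5_000:])               # discard burn-in
print("posterior mean source location:", chain.mean(axis=0))
```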
Journal of Theoretical Biology, Jan 28, 2015
Evaluating whether a disease outbreak has occurred based on limited information in medical records is inherently a probabilistic problem. This paper presents a methodology for consistently analysing the probability that a disease targeted by a surveillance system has appeared in the population, based on the medical records of the individuals within the target population, using a Bayesian network. To enable the system to produce a probability density function of the fraction of the population that is infected, a mathematically consistent conjoining of Bayesian networks and particle filters is used. This approach is tested against the default algorithm of ESSENCE Desktop Edition (which adaptively uses Poisson, Exponentially Weighted Moving Average and Linear Regression techniques as needed), and is shown, for the simulated test data used, to give significantly shorter detection times at false alarm rates of practical interest. This methodology shows promise to greatly improve detection…
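The particle-filter half of that conjunction can be caricatured as follows: particles carry a hypothesised infected fraction, are propagated by a simple growth model, and are reweighted by how well they explain each day's case counts. The Bayesian-network record model is collapsed here to a Poisson observation, and every number is an illustrative placeholder.

```python
import numpy as np

rng = np.random.default_rng(5)
N_POP, N_PART = 100_000, 5_000

particles = rng.uniform(0.0, 1e-3, N_PART)    # infected-fraction hypotheses
for day_cases in [3, 5, 4, 9, 14, 22]:        # observed daily counts (toy data)
    # Propagate: toy multiplicative growth with process noise.
    particles = np.clip(particles * rng.lognormal(np.log(1.1), 0.1, N_PART), 0, 1)
    # Reweight: Poisson likelihood of the day's count given each particle
    # (reporting fraction 0.1 and background rate 1.0 are placeholders).
    lam = particles * N_POP * 0.1 + 1.0
    logw = day_cases * np.log(lam) - lam
    w = np.exp(logw - logw.max()); w /= w.sum()
    particles = particles[rng.choice(N_PART, N_PART, p=w)]   # resample

print(f"posterior mean infected fraction ~ {particles.mean():.2e}")
print(f"P(fraction > 1e-4 | data) ~ {(particles > 1e-4).mean():.2f}")
```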
2007 Information, Decision and Control, 2007
The problem is to localise a source of gamma radiation using dose-rate measurements taken with a gamma probe at various points in space. A statistical model of dose-rate counts is developed using experimental data recorded in a laboratory. The problem is then studied using theoretical Cramér-Rao bound (CRB) analysis, which quantifies the accuracy with which it is possible to localise the source and estimate its intensity. Three estimation algorithms are implemented and their performance investigated by simulations. The algorithms are then applied to field-trial radiological survey data. Of the algorithms considered, the maximum likelihood algorithm performs best.
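A maximum-likelihood sketch of this kind of localisation is given below, assuming Poisson counts with mean A/r² plus background and a grid search over source position and intensity. The geometry, rates and the inverse-square observation model are illustrative assumptions, not the statistical model fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
pts = rng.uniform(-10, 10, (30, 2))           # survey measurement points

def mean_counts(src, A, bkg=2.0):
    """Expected counts: inverse-square source term plus flat background.
    The +0.5 regularises the distance as r -> 0."""
    r2 = np.sum((pts - src) ** 2, axis=1) + 0.5
    return A / r2 + bkg

counts = rng.poisson(mean_counts(np.array([2.0, -3.0]), 500.0))  # synthetic data

def loglik(src, A):
    lam = mean_counts(src, A)
    return np.sum(counts * np.log(lam) - lam)  # Poisson, constants dropped

xs = ys = np.linspace(-10, 10, 81)
As = np.linspace(100, 1000, 10)
best = max(((loglik(np.array([x, y]), A), x, y, A)
            for x in xs for y in ys for A in As), key=lambda t: t[0])
print(f"ML estimate: source=({best[1]:.2f}, {best[2]:.2f}), intensity={best[3]:.0f}")
```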