Denis Reilly | Liverpool John Moores University
Papers by Denis Reilly
Contemporary Issues in Information Systems - A Global Perspective
Digest of Papers. Fourth International Symposium on Wearable Computers
The paper describes a system level design approach to the power and performance of a dedicated application wearable computer, the Speech Translator Smart Module. While processor speed and type affect power consumption and performance, memory size and the type of secondary storage have an even greater influence. In particular, the time to swap information in/out of main memory can reduce the performance of a wearable computer by more than half.
2011 Developments in E-systems Engineering, 2011
Many large enterprises are wasting vast amounts of electricity due to computers that are powered up yet unused for considerable periods of time. Although the facilities for power management are already available, many administrators are reluctant to adopt power saving features. This paper first highlights the scale of electricity wastage and goes on to consider the issues associated with computer power management, before providing a detailed description of the development of a framework that can deliver power savings to large organisations. The proposed framework effectively integrates standard network services, standard applications and off-the-shelf power management software together with policies and procedures to enforce power saving measures on individual computers.
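The paper itself gives no code, but the kind of per-machine policy check such a framework might enforce can be sketched as follows; the class, field names and thresholds are illustrative assumptions rather than the framework's actual interface:

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical policy record: names and thresholds are illustrative only.
@dataclass
class PowerPolicy:
    idle_threshold_minutes: int   # suspend after this much idle time
    quiet_start: time             # start of out-of-hours window (evening)
    quiet_end: time               # end of out-of-hours window (morning)

def should_suspend(idle_minutes: int, now: time, policy: PowerPolicy) -> bool:
    """Return True if the machine should be suspended under the policy.

    A machine is a candidate for suspension when it has been idle longer
    than the threshold, or when the current time falls inside the
    overnight out-of-hours window, regardless of idle time.
    """
    out_of_hours = (now >= policy.quiet_start) or (now <= policy.quiet_end)
    return idle_minutes >= policy.idle_threshold_minutes or out_of_hours

if __name__ == "__main__":
    policy = PowerPolicy(idle_threshold_minutes=30,
                         quiet_start=time(19, 0), quiet_end=time(6, 0))
    print(should_suspend(idle_minutes=45, now=time(14, 30), policy=policy))  # True
    print(should_suspend(idle_minutes=5,  now=time(14, 30), policy=policy))  # False
```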
Proceedings IEEE/WIC International Conference on Web Intelligence (WI 2003)
Next generation software applications will be required to run on globally distributed heterogeneous assemblies of disparate resources, including emerging computing grids. Such applications call for seamless integration and interoperation between varieties of service standards and ...
IEEE Access
Machine learning is well developed amongst the scientific community in terms of theoretical foundations (statistics and algorithms) and frameworks (Tensorflow, PyTorch, H2O). However, machine learning is heavily focused on numerical data, or numerical data mixed with some categorical data. For numerical datasets, scientists and engineers can enjoy reasonable success with only a limited knowledge of theoretical foundations and the inner workings of machine learning frameworks. However, it is a different story when dealing with purely categorical datasets, which require a deeper understanding of machine learning frameworks and associated encodings and algorithms in order to achieve success. This paper addresses the issues in handling purely categorical datasets for multi-classification problems and provides a set of heuristics for dealing with purely categorical data. In particular, issues such as pre-processing, feature encoding and algorithm selection are considered. The heuristics are then demonstrated through a case study, based on a categorical data set of domestic fire injuries, covering a 10-year period. Novel contributions are made through the heuristics and the performance analysis of different encoding techniques. The case study itself also makes a novel contribution through the classification of different types of injuries, based on related features.
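As a rough illustration of the encoding and algorithm-selection heuristics discussed, the following sketch one-hot encodes a toy, purely categorical dataset and fits a tree ensemble with scikit-learn; the column names, values and model choice are assumptions, not the paper's data or code:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Toy, purely categorical data (columns and values are illustrative only).
df = pd.DataFrame({
    "room":   ["kitchen", "lounge", "bedroom", "kitchen", "lounge",  "bedroom"],
    "source": ["cooker",  "candle", "heater",  "cooker",  "smoking", "heater"],
    "injury": ["burn",    "smoke",  "smoke",   "burn",    "smoke",   "other"],
})
X, y = df[["room", "source"]], df["injury"]

# One-hot encode every categorical feature, then fit a tree ensemble;
# other encoders (ordinal, target) can be swapped in for comparison.
pipeline = Pipeline([
    ("encode", ColumnTransformer(
        [("onehot", OneHotEncoder(handle_unknown="ignore"), ["room", "source"])])),
    ("model", RandomForestClassifier(n_estimators=100, random_state=0)),
])
pipeline.fit(X, y)
print(pipeline.predict(pd.DataFrame({"room": ["kitchen"], "source": ["cooker"]})))
```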
Abstract—This research has innovated an emerging
The next generation of software applications will be required to run on globally distributed heterogeneous assemblies of disparate resources, including emerging computing grids. Such applications call for seamless integration and interoperation between varieties of service standards and architectures developed and deployed using existing service middleware standards and architectures such as DCOM, CORBA, Jini, Web services and UPnP. Whilst such middleware adequately provides different APIs and programming models for distributed components and for service integration and interoperation at both design time and runtime, there is still a need for an additional middleware service to support runtime service invocation regardless of the component/service standards and the type of middleware used. Based on ongoing research focusing on self-adaptive software for adaptive middleware, this paper describes a proposed on-demand (runtime) service invocation mechanism and the associated service interoperation protocol. ...
An essential feature of dependable software is its adaptive capability to respond to changes that occur in its operating environment through the dynamic transformation and reconfiguration of its components and/or services. Such adaptive capability is often a design aspect derived from the software architecture model, which describes the software components and their interactions, the properties and policies that regulate the composition of the components, and the norms that limit the allowable system adaptation operations. Research in reflective middleware architectures and policy-based distributed systems management has focused on the use of managerial or meta-level protocols to attain reactive adaptive behaviour. However, reflective and policy-based management approaches alone cannot address all of the needs of self-adaptive software due to their inability to maintain a faithful runtime model of the system. This paper considers the development of a control architecture for self-adaptive s...
IEEE Transactions on Emerging Topics in Computational Intelligence
Gynaecologists and obstetricians visually interpret cardiotocography (CTG) traces using the International Federation of Gynaecology and Obstetrics (FIGO) guidelines to assess the wellbeing of the foetus during antenatal care. This approach has raised concerns among professionals concerning inter- and intra-variability, where clinical diagnosis only has a 30% positive predictive value when classifying pathological outcomes. Machine learning models, trained with FIGO and other user-derived features extracted from CTG traces, have been shown to increase positive predictive capacity and minimise variability. This is only possible, however, when class distributions are equal, which is rarely the case in clinical trials where case-control observations are heavily skewed. Classes can be balanced using either synthetic data derived from resampled case training data or by decreasing the number of control instances. However, this introduces bias and removes valuable information. Concerns have also been raised regarding machine learning studies and their reliance on manually handcrafted features. While this has led to some interesting results, deriving an optimal set of features is considered to be an art as well as a science and is often an empirical and time-consuming process. In this paper, we address both of these issues and propose a novel CTG analysis methodology that a) splits CTG time series signals into n-size windows with equal class distributions, and b) automatically extracts features from time-series windows using a one-dimensional convolutional neural network (1DCNN) and multilayer perceptron (MLP) ensemble. Collectively, the proposed approach normally distributes classes and removes the requirement for handcrafted features from CTG traces. The 1DCNN-MLP models trained with several windowing strategies are evaluated to determine how well they can distinguish between normal and pathological birth outcomes. Our proposed method achieved good results using a window size of 200 (Sens=0.7981, Spec=0.7881, F1=0.7830, Kappa=0.5849, AUC=0.8599, and Logloss=0.4791). The results were compared with a Support Vector Machine (SVM), a Random Forest (RF) and a Fisher's Linear Discriminant Analysis (FLDA) classifier, all of which failed to improve on the windowing 1DCNN strategy proposed in this study.
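A minimal sketch of the windowing step described above, assuming NumPy and the window size of 200 quoted in the results; the function and variable names are illustrative, not the authors' code:

```python
import numpy as np

def make_windows(signal: np.ndarray, label: int, window_size: int = 200):
    """Split a 1-D CTG trace into non-overlapping fixed-size windows.

    Each window inherits the trace-level outcome label, so a single long
    trace yields many training examples; window counts per class can then
    be matched to give equal class distributions.
    """
    n = (len(signal) // window_size) * window_size   # drop the ragged tail
    windows = signal[:n].reshape(-1, window_size)
    labels = np.full(len(windows), label)
    return windows, labels

if __name__ == "__main__":
    trace = np.random.default_rng(0).normal(140, 10, size=4800)  # synthetic FHR-like trace
    X, y = make_windows(trace, label=1, window_size=200)
    print(X.shape, y.shape)   # (24, 200) (24,)
```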
2020 International Joint Conference on Neural Networks (IJCNN)
The study in this paper presents a one-dimensional convolutional neural network (1DCNN) model, designed for the automated detection of obstructive sleep apnoea (OSA) captured from single-channel electrocardiogram (ECG) signals. The system provides mechanisms in clinical practice that help diagnose patients suffering from OSA. Using the state of the art in 1DCNNs, a model is constructed using convolutional and max pooling layers and a fully connected Multilayer Perceptron (MLP) consisting of a hidden layer and a SoftMax output for classification. The 1DCNN extracts prominent features, which are used to train an MLP. The model is trained using segmented ECG signals grouped into 5 unique datasets of set window sizes. 35 ECG signal recordings were selected from an annotated database containing 70 night-time ECG recordings (Group A: a01 to a20 (apnoea breathing), Group B: b01 to b05 (moderate), and Group C: c01 to c10 (normal)). A total of 6514 minutes of apnoea was recorded. Evaluation of the model is performed using a set of standard metrics, which show that the proposed model achieves high classification results in both training and validation using our windowing strategy, particularly W=500.
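A minimal 1DCNN-plus-MLP sketch in Keras along the lines described, assuming 500-sample single-channel windows and two output classes; the layer sizes are assumptions rather than the paper's exact architecture:

```python
import tensorflow as tf

def build_model(window_size: int = 500, n_classes: int = 2) -> tf.keras.Model:
    """Convolution and pooling for feature extraction, then an MLP head."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(window_size, 1)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),           # MLP hidden layer
        tf.keras.layers.Dense(n_classes, activation="softmax"), # SoftMax output
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
model.summary()
```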
IEEE Access
Bayesian Networks are probabilistic graph models that can be used for classification, prediction, diagnosis and parameter learning. Probabilities can be inferred from the models and missing values can be imputed, based on probability theory. Missing person cases place a strain on the already overstretched resources of police forces. Such cases predominantly come from at-risk groups such as children in care and people suffering from depression or dementia. Current approaches for dealing with such cases are manual and rely upon empirical studies and domain knowledge. This paper proposes the use of a Bayesian Network model, which can be used to predict the likely location of a missing person (misper) for a number of at-risk groups. The model is evaluated using a set of misper cases and the results compare very favourably with those of the manual processes currently used by UK police forces. The novel approach described provides both a theoretical foundation and a practical framework for the future development of a decision support system. In addition to the model, a contribution is made through guidelines, which recount experiences in learning a Bayesian Network from data. Index Terms: algorithms, Bayesian methods, computer and information processing, decision support systems, modeling, probability.
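A toy Bayesian Network of the kind described, assuming the pgmpy library; the structure, states and probabilities below are invented for illustration and are not the paper's model:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# One parent node (risk group) influencing one child node (likely location).
model = BayesianNetwork([("RiskGroup", "Location")])

cpd_risk = TabularCPD("RiskGroup", 2, [[0.6], [0.4]],
                      state_names={"RiskGroup": ["child_in_care", "dementia"]})
cpd_loc = TabularCPD(
    "Location", 3,
    # P(Location | RiskGroup): columns follow the RiskGroup state order.
    [[0.5, 0.1],    # friend's address
     [0.3, 0.2],    # town centre
     [0.2, 0.7]],   # former home address
    evidence=["RiskGroup"], evidence_card=[2],
    state_names={"Location": ["friend", "town_centre", "former_home"],
                 "RiskGroup": ["child_in_care", "dementia"]})

model.add_cpds(cpd_risk, cpd_loc)
inference = VariableElimination(model)
print(inference.query(["Location"], evidence={"RiskGroup": "dementia"}))
```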
Personal Communications, …, 2001
Carnegie Mellon's Wearable Computers Laboratory has built four generations of real-time speech translation wearable computers, culminating in the Speech Translator Smart Module. Smart Modules use a family of interoperable modules supporting real-time ...
Interoperation between heterogeneous services results in a variety of serious security concerns, from privacy through to authentication and policy enforcement. We look at composition analysis techniques, enabled using instrumentation, as a means of improving security in interoperating systems. The techniques described harness the system-of-systems nature inherent in all interoperating system configurations. We present the ongoing development of a framework for combining instrumentation and composition analysis capabilities in a novel manner and discuss a case study involving the prevention of data leakage through access control analysis.
IEEE Access
One of the most important challenges in the analysis of high-throughput genetic data is the development of efficient computational methods to identify statistically significant Single Nucleotide Polymorphisms (SNPs). Genome-wide association studies (GWAS) use single-locus analysis, where each SNP is independently tested for association with phenotypes. The limitation with this approach, however, is its inability to explain genetic variation in complex diseases. Alternative approaches are required to model the intricate relationships between SNPs. Our proposed approach extends GWAS by combining deep learning stacked autoencoders (SAEs) and association rule mining (ARM) to identify epistatic interactions between SNPs. Following traditional GWAS quality control and association analysis, the most significant SNPs are selected and used in the subsequent analysis to investigate epistasis. SAERMA controls the classification results produced in the final fully connected multi-layer perceptron neural network (MLPNN) by manipulating the interestingness measures, support and confidence, in the rule generation process. The best classification results were achieved with 204 SNPs compressed to 100 units (77% AUC, 77% SE, 68% SP, 53% Gini, logloss = 0.58, and MSE = 0.20), although it was possible to achieve 73% AUC (77% SE, 63% SP, 45% Gini, logloss = 0.62, and MSE = 0.21) with 50 hidden units, both supported by close model interpretation.
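A minimal Keras sketch of the autoencoder-plus-classifier idea, using the 204-to-100 compression quoted above; the activations, training regime and the association-rule-mining stage are assumptions or omitted:

```python
import tensorflow as tf

# 204 numerically coded SNP inputs compressed to 100 hidden units, with the
# trained encoder reused as the front end of a case/control classifier.
n_snps, n_hidden = 204, 100

inputs = tf.keras.Input(shape=(n_snps,))
encoded = tf.keras.layers.Dense(n_hidden, activation="relu")(inputs)
decoded = tf.keras.layers.Dense(n_snps, activation="sigmoid")(encoded)

autoencoder = tf.keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
# autoencoder.fit(X_snps, X_snps, epochs=50)   # unsupervised pre-training step

# Reuse the encoder and attach a small MLP head for binary classification.
encoder = tf.keras.Model(inputs, encoded)
clf_head = tf.keras.layers.Dense(1, activation="sigmoid")(encoder.output)
classifier = tf.keras.Model(encoder.input, clf_head)
classifier.compile(optimizer="adam", loss="binary_crossentropy",
                   metrics=[tf.keras.metrics.AUC()])
```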
Games and Culture
Designing and developing computer games can be a complex activity that may involve professionals from a variety of disciplines. In this article, we examine the use of game theory for supporting the design of gameplay within the different sections of a computer game and demonstrate its application in practice via adapted high-level decision trees for modeling the flow in gameplay and payoff matrices for modeling skill or challenge levels.
Marketing Intelligence & Planning
Purpose: The purpose of this paper is to use game theory combined with Monte Carlo simulation modelling to support the analysis of different retail marketing strategies, in particular, using payoff matrices for modelling the likely outcomes from different retail marketing strategies. Design/methodology/approach: Theoretical research was utilised to develop a practical approach for applying game theory to retail marketing strategies via payoff matrices combined with Monte Carlo simulation modelling. Findings: Game theory combined with Monte Carlo simulation modelling can provide a formal approach to understanding consumer decision making in a retail environment, which can support the development of retail marketing strategies. Research limitations/implications: Game theory combined with Monte Carlo simulation modelling can support the modelling of the interaction between retail marketing actions and consumer responses in a practical formal probabilistic manner, which can inform marke...
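A small Monte Carlo sketch over a payoff matrix of the kind described, using NumPy; the strategies, payoffs and consumer response probabilities are illustrative assumptions only:

```python
import numpy as np

# Rows are retailer strategies, columns are consumer responses.
payoffs = np.array([
    #  buy   browse  leave
    [  5.0,   1.0,  -1.0],   # strategy: discount promotion
    [  8.0,   0.5,  -2.0],   # strategy: premium positioning
])
response_probs = np.array([
    [0.5, 0.3, 0.2],          # response mix under the discount strategy
    [0.3, 0.3, 0.4],          # response mix under premium positioning
])

rng = np.random.default_rng(0)
n_trials = 100_000
for strategy, name in enumerate(["discount", "premium"]):
    # Sample consumer responses and average the realised payoff.
    responses = rng.choice(3, size=n_trials, p=response_probs[strategy])
    expected = payoffs[strategy, responses].mean()
    print(f"{name}: simulated expected payoff = {expected:.2f}")
```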
Journal of Strategic Marketing
Internet-enabled consumer devices are beginning to be developed by manufacturers. In this paper, we examine how the internet of things can support marketing activities including customer relationship management, business intelligence and product design. In particular, the research reported in this paper examines how the internet of things can provide communication channels to support targeted marketing for product owners and enhance customer relationship management and product support. In addition, we examine how data gained from the operational use of internet-enabled devices can support business intelligence in terms of how consumers actually use a product, and can also support new product design in terms of what features of current internet-enabled products are most commonly used, and how they are used.
Network Security
Journal of Systems and Information Technology