Luis Javier Mena Camare | Upsin

Papers by Luis Javier Mena Camare

Research paper thumbnail of Services Composition Modeling for Migrating Users in Intelligent Spaces

IEEE Latin America Transactions, 2018

This research paper addresses the problem of modeling service composition between heterogeneous intelligent spaces, essentially to share complex services. In the last decade, the development of health care systems for intelligent environments has attracted great interest, especially through systems that have proven useful for the medical monitoring of users. However, these approaches are limited to small geographic areas such as a home or a hospital. Current methods in the cooperative coordination of multi-agent systems have investigated the problem of composing services, but only among the agents of a single society, leaving open the question of service composition between different societies of agents. The purpose of this research is the development of a service composition model that can be used in the migration of users between smart spaces.

Research paper thumbnail of A Novel Framework for Generating Personalized Network Datasets for NIDS, Based on Traffic Aggregation

Sensors, 2022

In this paper, we addressed the problem of dataset scarcity for the task of network intrusion detection. Our main contribution was to develop a framework that provides a complete process for generating network traffic datasets based on the aggregation of real network traces. In addition, we proposed a set of tools for attribute extraction and labeling of traffic sessions. A new dataset with botnet network traffic was generated by the framework to assess our proposed method with machine learning algorithms suitable for unbalanced data. The performance of the classifiers was evaluated in terms of the macro-averaged F1-score (0.97) and the Matthews Correlation Coefficient (0.94), indicating good overall performance.
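
The two reported metrics can be computed directly from a confusion matrix. The sketch below uses invented toy labels (not the paper's dataset) to show how the macro-averaged F1-score and the Matthews Correlation Coefficient weigh a minority botnet class as heavily as the benign majority.

```python
import math

# Toy session labels (illustrative, not the paper's data):
# 0 = benign (majority), 1 = botnet (minority).
y_true = [0] * 90 + [1] * 10
y_pred = [0] * 88 + [1, 1] + [1] * 9 + [0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

def f1(tp_, fp_, fn_):
    # F1 for one class treated as "positive".
    return 2 * tp_ / (2 * tp_ + fp_ + fn_)

# Macro-average: unweighted mean of per-class F1 scores, so the
# minority class counts as much as the majority class.
macro_f1 = (f1(tp, fp, fn) + f1(tn, fn, fp)) / 2

# MCC summarizes the whole confusion matrix in a single balanced score.
mcc = (tp * tn - fp * fn) / math.sqrt(
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
)
```

Because both metrics penalize errors on the rare class, they are better suited to unbalanced traffic data than plain accuracy, which would score 97% here even for a fairly weak detector.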

Research paper thumbnail of Role of Digital Transformation for Achieving Sustainability: Mediated Role of Stakeholders, Key Capabilities, and Technology

Sustainability, 2023

Sustainability through digital transformation is essential for contemporary businesses. By embracing sustainability, micro-, small-, and medium-sized enterprises (MSMEs) can gain a competitive advantage, attracting customers and investors who share these values. Moreover, incorporating sustainable practices empowers MSMEs to drive innovation, reduce costs, and enhance their reputation. This study aims to identify how owners or senior managers of MSMEs can initiate a sustainable digital transformation project. A systematic literature review was carried out, covering 59 publications from 2019 to 2023. As a result, this research identifies the first steps MSME owners can take to begin the transition by identifying the critical organizational capabilities necessary for successful transformation, explores the technologies that can support MSMEs in their sustainability goals, and emphasizes the significance of stakeholders in achieving a successful digital transformation journey. Firstly, owners or senior managers should change the organizational culture to support decisions and strategies focused on sustainability. Secondly, stakeholders play a leading role in the innovation process that allows businesses to be more competitive locally and globally. Finally, big data is the technology that can provide the most significant benefit to MSMEs, because it enables the analysis of data of all kinds and contributes disruptively to decision-making.

Research paper thumbnail of Sustainable Digital Transformation for SMEs: A Comprehensive Framework for Informed Decision-Making

Sustainability, 2024

This study presents a sustainable digital transformation framework to integrate sustainable practices into digital transformation initiatives within small and medium enterprises (SMEs). The methodology includes a literature review, the creation of a framework, and a case study with passive participation. The framework was structured to help industries implement responsible digitalization initiatives in five key stages: setting objectives, fostering stakeholder-focused engagement, defining sustainable objectives and dimensions, creating a sustainable model, and executing the project. Validating the proposal in the context of an SME allowed us to discern the tangible benefits of sustainability practices for the organization, further reinforcing the relevance and applicability of the framework. In conclusion, this research offers valuable information for SMEs considering starting a sustainable digitalization process.

Research paper thumbnail of Enhancing Financial Risk Prediction with Symbolic Classifiers: Addressing Class Imbalance and the Accuracy-Interpretability Trade-off

Humanities and Social Sciences Communications, 2024

Machine learning for financial risk prediction has garnered substantial interest in recent decades. However, the class imbalance problem and the dilemma of gaining accuracy at the cost of interpretability have yet to be widely studied. Symbolic classifiers have emerged as a promising solution for forecasting banking failures and estimating creditworthiness, as they address class imbalance while maintaining both accuracy and interpretability. This paper aims to evaluate the effectiveness of REMED, a symbolic classifier, in the context of financial risk management, focusing on its ability to handle class imbalance and provide interpretable decision rules. Through an empirical analysis of a real-world imbalanced financial dataset from the Federal Deposit Insurance Corporation, we demonstrate that REMED effectively handles class imbalance, improving accuracy metrics while ensuring interpretability through a concise and easily understandable rule system. A comparative analysis is conducted against two well-known rule-generating approaches, J48 and JRip. The findings suggest that, with further development and validation, REMED can be implemented as a competitive approach to improve predictive accuracy on imbalanced financial datasets without compromising model interpretability.
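
The appeal of a symbolic classifier is that its output is a small, human-readable rule system. The sketch below shows what such a rule looks like when applied to toy bank records; the attribute names, thresholds, and records are invented for illustration and are not REMED's actual output.

```python
# Hypothetical rule of the kind a symbolic classifier produces:
# flag a bank when capital adequacy is low AND non-performing
# loans are high (thresholds invented for illustration).
def at_risk(bank):
    return bank["capital_ratio"] <= 0.06 and bank["npl_ratio"] >= 0.08

# Toy records; "failed" marks the rare minority class.
banks = [
    {"capital_ratio": 0.05, "npl_ratio": 0.10, "failed": True},
    {"capital_ratio": 0.12, "npl_ratio": 0.02, "failed": False},
    {"capital_ratio": 0.04, "npl_ratio": 0.09, "failed": True},
    {"capital_ratio": 0.09, "npl_ratio": 0.07, "failed": False},
]

predictions = [at_risk(b) for b in banks]
failed = [b["failed"] for b in banks]

# Recall on the minority (failed) class: the key metric under
# class imbalance, since missing a failure is the costly error.
recall = sum(p and f for p, f in zip(predictions, failed)) / sum(failed)
```

Unlike a black-box model, every positive prediction here can be traced back to the exact condition that fired, which is the interpretability property the paper emphasizes.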

Research paper thumbnail of Assessing Employee Susceptibility to Cybersecurity Risks

International Journal of Information Security and Privacy, 2024

This research challenges assumptions about cybersecurity risk factors, revealing that age, gender, and educational background are not significant determinants of employee susceptibility. It highlights the importance of inclusive cybersecurity training programs that cater to individuals of all age groups, dispelling the misconception that older employees are inherently less tech-savvy and more susceptible to cybersecurity threats. The findings show that cybersecurity teams within organizations significantly impact the adoption of security policies and data handling practices among employees, even though their influence on password and account security practices is limited. Organizations can adopt a holistic approach to cybersecurity training and awareness programs by leveraging these insights. This approach transcends traditional demographics and focuses on enhancing password and account security, ultimately strengthening cybersecurity postures, fostering a culture of cybersecurity consciousness, and fortifying defenses against the evolving landscape of digital threats.

Research paper thumbnail of Computer Simulation of Catheter Cryoablation for Pulmonary Vein Isolation

Healthcare, 2024

Cryoablation is a well-established medical procedure for surgically treating atrial fibrillation. Cryothermal catheter therapy induces cellular necrosis by freezing the inside of the pulmonary veins, with the goal of disrupting abnormal electrical heart signals. Nevertheless, tissue damage induced by cold temperatures may also lead to other complications after cardiac surgery. In this sense, the simulation of catheter ablation can provide safer environments for training and for the performance of cryotherapy interventions. Therefore, in this paper, we propose a novel approach to help better understand how temperature rates can affect this procedure, using computer tools to develop a simulation framework that predicts lesion size and determines optimal temperature conditions for reducing the risk of major complications. The results showed that a temperature profile of around −40 °C caused less penetration, reduced necrotic damage, and a smaller lesion size in the tissue. By contrast, cryotherapy close to −60 °C achieved a greater depth of temperature flow inside the tissue and a larger cross-sectional area of the lesion. With further development and validation, the framework could represent a cost-effective strategy for providing personalized modeling, better planning of cryocatheter-based treatment, and prevention of surgical complications.
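
A minimal one-dimensional heat-diffusion sketch illustrates the kind of computation such a simulation framework performs. The explicit finite-difference scheme, diffusivity, grid, and boundary temperatures below are illustrative assumptions, not the paper's model, which would also need to account for phase change and real tissue properties.

```python
# Illustrative 1-D heat conduction with a fixed cold catheter-tip
# boundary (explicit finite differences; parameters are assumptions).
alpha = 1.4e-7      # thermal diffusivity (m^2/s), illustrative
dx, dt = 1e-3, 1.0  # grid spacing (m) and time step (s)
r = alpha * dt / dx ** 2
assert r < 0.5      # stability condition of the explicit scheme

temps = [37.0] * 21          # tissue starts at body temperature (°C)
for _ in range(600):         # simulate 10 minutes of freezing
    temps[0] = -60.0         # catheter tip held at -60 °C
    temps = [temps[0]] + [
        t + r * (left - 2 * t + right)   # discrete heat equation
        for left, t, right in zip(temps, temps[1:], temps[2:])
    ] + [temps[-1]]          # far boundary stays at body temperature

# Crude lesion-depth proxy: nodes cooled below 0 °C (1 node = 1 mm).
frozen_depth = sum(t < 0 for t in temps)
```

Re-running the loop with the tip held at −40 °C instead of −60 °C shallows the sub-zero front, which is the qualitative effect the paper reports for lesion size versus tip temperature.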

Research paper thumbnail of A Reliable Index for the Prognostic Significance of Blood Pressure Variability

Journal of Hypertension, 2005

Objectives: This study presents a reliable index for the prognostic significance of blood pressure variability (BPV), inspired by the total variation concept of real analysis in mathematics and called average real variability (ARV), which overcomes the pitfalls of the commonly used standard deviation (SD). Background: Recent studies have suggested that an increase in BPV is associated with an increase in subsequent cardiovascular events/complications. However, there are other studies where the cited association was not found or was lost in the presence of other well-known risk factors. An explanation for these apparently contradictory results may be the choice of variability index (SD). Methods: Ambulatory blood pressure monitoring was performed in 312 subjects aged ≥55 years. Logistic regression models and survival methods were used to establish the prognostic significance of awake systolic BPV: in particular, (i) the performance of ARV versus SD, and (ii) the value of BPV relative to other well-known risk factors. Results: The analyses using the ARV index show a statistically significant relative risk of 4.548 (P = 0.006) for the high-BPV group with respect to the low-BPV group (reference level); in contrast, the corresponding relative risk associated with the SD index was not statistically significant. Furthermore, ARV exhibited a predictive value similar to that of systolic blood pressure. Conclusions: The proposed ARV index is a more reliable representation of time-series variability than SD and may be less sensitive to the relatively low sampling frequency of ambulatory blood pressure monitoring devices. The results suggest that ARV adds prognostic value to ambulatory blood pressure monitoring and could prompt the use of therapeutic measures to control BPV.
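
The ARV index is simple to state: the average of the absolute differences between consecutive readings. The sketch below, on invented toy readings, shows the property that motivates it: two recordings with identical values (and therefore identical SD) can have very different ARV once the ordering in time is taken into account.

```python
import statistics

def arv(readings):
    # Average real variability: mean absolute difference between
    # consecutive readings, so the order of measurements matters.
    return sum(abs(b - a) for a, b in zip(readings, readings[1:])) \
        / (len(readings) - 1)

smooth = [110, 115, 120, 125, 130, 135]   # gradual drift (toy values)
erratic = [110, 135, 115, 130, 120, 125]  # same values, reshuffled

# Identical values => identical SD, regardless of order...
assert statistics.stdev(smooth) == statistics.stdev(erratic)

# ...but ARV separates the smooth drift from the erratic swings.
arv_smooth, arv_erratic = arv(smooth), arv(erratic)
```

This order sensitivity is exactly what the SD discards, and is the paper's argument for why ARV better represents time-series variability.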

Research paper thumbnail of Noise-tolerant Modular Neural Network System for Classifying ECG Signal

Informatica, 2019

Millions of electrocardiograms (ECG) are interpreted every year, requiring specialized training for accurate interpretation. Because automated and accurate classification of ECG signals can improve the early diagnosis of heart conditions, several neural network (NN) approaches have been proposed for classifying ECG signals. Current strategies for a critical step, preprocessing for noise removal, are still unsatisfactory. We propose a modular NN approach based on artificial noise injection to improve the generalization capability of the resulting model. The NN classifier initially performed fairly accurate recognition of four types of cardiac anomalies in simulated ECG signals with minor, moderate, severe, and extreme noise, with average accuracies of 99.2%, 95.1%, 91.4%, and 85.2%, respectively. Ultimately, we discriminated normal and abnormal heartbeat patterns in single-lead raw ECG signals, obtaining 95.7% overall accuracy and 99.5% precision. Therefore, the proposed approach is a useful tool for the detection and diagnosis of cardiac abnormalities. Summary (translated from Slovenian): This article describes a modular neural network method for recognizing noisy ECG signals.
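
Artificial noise injection as a training-set augmentation can be sketched in a few lines. The noise model (additive Gaussian) and the severity levels below are illustrative assumptions, not the paper's exact settings.

```python
import random

def inject_noise(signal, sigma, rng):
    # Add zero-mean Gaussian noise of standard deviation sigma
    # to every sample of the signal.
    return [s + rng.gauss(0.0, sigma) for s in signal]

rng = random.Random(42)                        # fixed seed for repeatability
clean_beat = [0.0, 0.1, 0.9, -0.4, 0.2, 0.0]   # toy ECG beat samples

# One augmented copy per severity, so the classifier is trained on the
# same beat under minor, moderate, severe, and extreme noise.
severities = (0.01, 0.05, 0.1, 0.2)
augmented = [inject_noise(clean_beat, sigma, rng) for sigma in severities]
```

Training on such noisy copies pushes the model to learn features that survive the noise, which is the generalization effect the abstract describes.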

Research paper thumbnail of Extracting New Patterns for Cardiovascular Disease Prognosis

Expert Systems, 2009

Cardiovascular diseases constitute one of the main causes of mortality in the world, and machine learning has become a powerful tool for analysing medical data in recent years. In this paper we present an interdisciplinary work based on an ambulatory blood pressure study and the development of a new classification algorithm named REMED. We focused on the discovery of new patterns for abnormal blood pressure variability as a possible cardiovascular risk factor. We compared our results with those of other classification algorithms based on Bayesian methods, decision trees, and rule induction techniques. In the comparison, REMED showed accuracy similar to these algorithms, but it has the advantage of a superior capacity to classify sick people correctly. Therefore, our method could represent an innovative approach that might be useful in medical decision support for cardiovascular disease prognosis.

Research paper thumbnail of Machine Learning Approach to Extract Diagnostic and Prognostic Thresholds: Application in Prognosis of Cardiovascular Mortality

Computational and Mathematical Methods in Medicine, 2012

Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% overall accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses.
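
One simple way to frame threshold extraction, shown on invented toy data below, is to scan candidate cut-points on a single risk marker and keep the one that meets a specificity floor while maximizing sensitivity. This is a generic sketch of the idea, not REMED's actual procedure.

```python
def best_threshold(values, outcomes, min_specificity=0.8):
    # Scan each observed value as a candidate cut-point; keep the one
    # with the highest sensitivity among those meeting the
    # specificity floor. Returns (threshold, sensitivity, specificity).
    best = None
    for t in sorted(set(values)):
        pred = [v >= t for v in values]
        tp = sum(p and o for p, o in zip(pred, outcomes))
        tn = sum((not p) and (not o) for p, o in zip(pred, outcomes))
        fp = sum(p and (not o) for p, o in zip(pred, outcomes))
        fn = sum((not p) and o for p, o in zip(pred, outcomes))
        spec = tn / (tn + fp)
        sens = tp / (tp + fn)
        if spec >= min_specificity and (best is None or sens > best[1]):
            best = (t, sens, spec)
    return best

# Hypothetical marker levels and fatal-outcome flags (toy data).
values = [4, 5, 6, 7, 8, 9, 10, 11]
outcomes = [False, False, False, False, True, False, True, True]
threshold, sensitivity, specificity = best_threshold(values, outcomes)
```

Fixing specificity first mirrors the trade-off reported in the abstract, where a high-specificity operating point is chosen and the achievable sensitivity follows from it.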

Research paper thumbnail of Mobile Personal Health Monitoring for Automated Classification of Electrocardiogram Signals in Elderly

Computational and Mathematical Methods in Medicine, 2018

Mobile electrocardiogram (ECG) monitoring is an emerging area that has received increasing attention in recent years, but real-life validation for the elderly residing in low- and middle-income countries remains scarce. We developed a wearable ECG monitor integrated with a self-designed wireless sensor for ECG signal acquisition. It is used with a native, purposely designed smartphone application, based on machine learning techniques, for the automated classification of captured ECG beats from aged people. When tested on 100 older adults, the monitoring system discriminated normal and abnormal ECG signals with a high degree of accuracy (97%), sensitivity (100%), and specificity (96.6%). With further verification, the system could be useful for detecting cardiac abnormalities in the home environment and contribute to prevention, early diagnosis, and effective treatment of cardiovascular diseases, while keeping costs down and increasing access to healthcare services for older persons.

Research paper thumbnail of Mobile Personal Health System for Ambulatory Blood Pressure Monitoring

Computational and Mathematical Methods in Medicine, 2013

The ARVmobile v1.0 is a multiplatform mobile personal health monitor (PHM) application for ambulatory blood pressure (ABP) monitoring that has the potential to aid in the acquisition and analysis of a detailed profile of ABP and heart rate (HR), improve the early detection of and intervention in hypertension, and detect potentially abnormal BP and HR levels for timely medical feedback. The PHM system consisted of an ABP sensor to detect BP and HR signals and a smartphone as a receiver to collect the transmitted digital data and process them to provide immediate personalized information to the user. Applications for the Android and BlackBerry platforms were developed to detect and alert on potentially abnormal values, offer a friendly graphical user interface for elderly people, and provide feedback to professional healthcare providers via e-mail. ABP data were obtained from twenty-one healthy individuals (>51 years) to test the utility of the PHM application. The ARVmobile v1.0 was able to reliably receive and process the ABP readings from the volunteers. The preliminary results demonstrate that the ARVmobile v1.0 application could be used to build a detailed profile of ABP and HR in an ordinary daily-life environment, besides estimating potential diagnostic thresholds of abnormal BP variability measured as average real variability.

Research paper thumbnail of How Many Measurements Are Needed to Estimate Blood Pressure Variability Without Loss of Prognostic Information?

American Journal of Hypertension, 2014

BACKGROUND: Average real variability (ARV) is a recently proposed index for short-term blood pressure (BP) variability. We aimed to determine the minimum number of BP readings required to compute ARV without loss of prognostic information.
METHODS: ARV was calculated from a discovery dataset that included 24-hour ambulatory BP measurements for 1,254 residents (mean age = 56.6 years; 43.5% women) of Copenhagen, Denmark. Concordance between ARV from full (≥80 BP readings) and randomly reduced 24-hour BP recordings was examined, as was prognostic accuracy. A test dataset that included 5,353 subjects (mean age = 54.0 years; 45.6% women) with at least 48 BP measurements from 11 randomly recruited population cohorts was used to validate the results.
RESULTS: In the discovery dataset, a minimum of 48 BP readings allowed an accurate assessment of the association between cardiovascular risk and ARV. In the test dataset, over a median of 10.2 years, 806 participants died (335 cardiovascular deaths, 206 cardiac deaths) and 696 experienced a major fatal or nonfatal cardiovascular event. Standardized multivariable-adjusted hazard ratios (HRs) were computed for associations between outcome and BP variability. Higher diastolic ARV in 24-hour ambulatory BP recordings predicted (P < 0.01) total (HR = 1.12), cardiovascular (HR = 1.19), and cardiac (HR = 1.19) mortality and fatal combined with nonfatal cerebrovascular events (HR = 1.16). Higher systolic ARV in 24-hour ambulatory BP recordings predicted (P < 0.01) total (HR = 1.12), cardiovascular (HR = 1.17), and cardiac (HR = 1.24) mortality.
CONCLUSIONS: Forty-eight BP readings over 24 hours were observed to be adequate to compute ARV without meaningful loss of prognostic information.
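
The reduction step can be sketched as thinning a full recording down to a fixed number of readings and recomputing ARV on the reduced series. The synthetic readings and the even-spacing rule below are assumptions for illustration; the study used randomly reduced real ambulatory recordings.

```python
def arv(readings):
    # Average real variability: mean absolute consecutive difference.
    return sum(abs(b - a) for a, b in zip(readings, readings[1:])) \
        / (len(readings) - 1)

def thin(readings, n):
    # Keep n approximately evenly spaced readings from the recording.
    step = (len(readings) - 1) / (n - 1)
    return [readings[round(i * step)] for i in range(n)]

# Synthetic 80-reading, 24-hour series (toy values, mmHg).
full = [120 + (i % 7) - 3 for i in range(80)]

reduced = thin(full, 48)                  # the study's 48-reading minimum
arv_full, arv_reduced = arv(full), arv(reduced)
```

Comparing `arv_full` against `arv_reduced` over many recordings is the concordance check the paper describes: if the two agree closely at 48 readings, the shorter protocol loses little prognostic information.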

Research paper thumbnail of A Pilot Study of the Use of Emerging Computer Technologies to Improve the Effectiveness of Reading and Writing therapies in Children with Down Syndrome

British Journal of Educational Technology, 2017

Despite the potential benefits that computer approaches could provide for children with cognitive disabilities, research on and implementation of emerging approaches to learning supported by computing technology have not received adequate attention. We conducted a pilot study to assess the effectiveness of a computer-assisted learning tool, named "HATLE," for children with Down syndrome. The tool helps to improve reading and writing abilities in Spanish through mobile computing, multimedia design, and computer speech-recognition techniques. An experimental design with nonequivalent groups was used to assess the effectiveness of HATLE. The treatment group was taught using HATLE; the control group received typical instruction with the same material. Individual literacy achievement was assessed for both groups before and after the therapy sessions. The dependent variables in all analyses were posttest scores, adjusted via analysis of covariance (ANCOVA) for pretest variance. Differences between the treatment and control groups were statistically significant in favor of the HATLE group on measures of Single-Word Reading (p = 0.048) and Handwriting-Form (p = 0.046), with large effect sizes (d > 0.8). Results indicate that HATLE might be effective in supporting computer-aided learning for children with intellectual disabilities. The results are discussed in terms of limitations and implications.
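
The ANCOVA adjustment used here can be sketched numerically: posttest means are adjusted for pretest differences using the pooled within-group regression slope, and the groups are compared on the adjusted means. The scores below are invented, not the HATLE trial data, and this sketch omits the significance test itself.

```python
from statistics import mean

def within_slope(groups):
    # Pooled within-group regression slope of posttest on pretest,
    # computed from centered sums over all groups.
    num = den = 0.0
    for pre, post in groups:
        mp, mq = mean(pre), mean(post)
        num += sum((x - mp) * (y - mq) for x, y in zip(pre, post))
        den += sum((x - mp) ** 2 for x in pre)
    return num / den

# Toy (pretest, posttest) scores for each group.
treatment = ([10, 12, 14, 16], [18, 21, 24, 27])
control = ([11, 13, 15, 17], [14, 16, 18, 20])

b = within_slope([treatment, control])
grand_pre = mean(treatment[0] + control[0])

def adjusted_mean(pre, post):
    # Classic ANCOVA adjustment: shift each group's posttest mean as
    # if its pretest mean had equaled the grand pretest mean.
    return mean(post) - b * (mean(pre) - grand_pre)

adj_diff = adjusted_mean(*treatment) - adjusted_mean(*control)
```

Adjusting for the covariate removes the part of the posttest gap explained by pre-existing pretest differences, which is why ANCOVA is the standard analysis for a nonequivalent-groups design like this one.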

Research paper thumbnail of 24-Hour Blood Pressure Variability Assessed by Average Real Variability: A Systematic Review and Meta-analysis

Journal of the American Heart Association, 2017

Background: Although 24-hour blood pressure (BP) variability (BPV) is predictive of cardiovascular outcomes independent of absolute BP levels, it is not regularly assessed in clinical practice. One possible limitation to routine BPV assessment is the lack of standardized methods for accurately estimating 24-hour BPV. We conducted a systematic review to assess the predictive power of reported BPV indexes and to address the appropriate quantification of 24-hour BPV, including the average real variability (ARV) index. Methods and Results: Studies chosen for review were those that presented data for 24-hour BPV in adults from meta-analyses or longitudinal or cross-sectional designs and examined BPV in terms of the following issues: (1) the methods used to calculate and evaluate ARV; (2) assessment of 24-hour BPV determined using noninvasive ambulatory BP monitoring; (3) multivariate analysis adjusted for covariates, including some measure of BP; (4) the association of 24-hour BPV with subclinical organ damage; and (5) the predictive value of 24-hour BPV for target organ damage and the rate of cardiovascular events. Of the 19 assessed studies, 17 reported significant associations between high ARV and the presence and progression of subclinical organ damage, as well as the incidence of hard end points such as cardiovascular events. In all these cases, ARV remained a significant independent predictor (P < 0.05) after adjustment for BP and other clinical factors. In addition, increased ARV in systolic BP was associated with the risk of all cardiovascular events (hazard ratio, 1.18; 95% confidence interval, 1.09-1.27). Only 2 cross-sectional studies did not find high ARV to be a significant risk factor. Conclusions: Current evidence suggests that the ARV index adds significant prognostic information to 24-hour ambulatory BP monitoring and is a useful approach for studying the clinical value of BPV.

Research paper thumbnail of Mobile Personal Health Care System for Noninvasive, Pervasive, and Continuous Blood Pressure Monitoring: Development and Usability Study

JMIR Mhealth Uhealth, 2020

Background: Smartphone-based blood pressure (BP) monitoring using photoplethysmography (PPG) technology has emerged as a promising approach to empower users with self-monitoring for the effective diagnosis and control of hypertension. Objective: This study aimed to develop a mobile personal health care system for noninvasive, pervasive, and continuous estimation of BP level and variability that is user friendly for elderly people. Methods: The proposed approach integrated a self-designed cuffless, calibration-free, wireless, and wearable PPG-only sensor and a native, purposely designed smartphone app that applies multilayer perceptron machine learning techniques to raw signals. We performed a development and usability study with three older adults (mean age 61.3 years, SD 1.5 years; 66% women) to test the usability and accuracy of the smartphone-based BP monitor. Results: The employed artificial neural network model had good average accuracy (>90%) and a very strong correlation (>0.90; P<.001) in predicting the reference BP values of our validation sample (n=150). Bland-Altman plots showed that most of the errors in BP prediction were less than 10 mmHg. However, according to the Association for the Advancement of Medical Instrumentation and British Hypertension Society standards, only the diastolic blood pressure predictions met the clinically accepted accuracy thresholds. Conclusions: With further development and validation, the proposed system could provide a cost-effective strategy to improve the quality and coverage of health care, particularly in rural zones, areas lacking physicians, and areas with solitary elderly populations.
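
The Bland-Altman agreement check behind those plots reduces to the bias (mean difference) and the 95% limits of agreement between predicted and reference readings. The paired readings below are invented examples, not the study's measurements.

```python
from statistics import mean, stdev

# Toy paired readings (mmHg): reference device vs. model prediction.
reference = [118, 124, 131, 140, 122, 128]
predicted = [120, 121, 134, 136, 125, 127]

diffs = [p - r for p, r in zip(predicted, reference)]

bias = mean(diffs)                       # systematic over/under-estimation
loa_low = bias - 1.96 * stdev(diffs)     # lower 95% limit of agreement
loa_high = bias + 1.96 * stdev(diffs)    # upper 95% limit of agreement

# The paper's informal criterion: most prediction errors under 10 mmHg.
within_10 = sum(abs(d) < 10 for d in diffs) / len(diffs)
```

Clinical standards such as the AAMI protocol then judge whether the bias and spread of these differences fall within accepted tolerances, which is the test the diastolic predictions passed and the systolic predictions did not.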

Research paper thumbnail of Marco para Automatizar la Selección de un Mecanismo de Coordinación entre Sociedades de Agentes

DYNA INGENIERIA E INDUSTRIA, 2017

The growing availability of new and better technology is leading homes to be transformed into environments where technological devices are inserted naturally and are assimilated quickly and transparently by users. In this sense, the automation of processes or services such as turning lights on and off, controlling the indoor climate temperature, and opening or unlocking doors is increasingly common [1]. Consequently, the next step in this technology/spaces evolution is the development of intelligent environments equipped with a network of interconnected devices (sensors and actuators) that work as a single element and whose operation is interrelated, thus making it possible to integrate complex services, understood as the combination of new functionalities that are available but not exploited by the system [2].

Research paper thumbnail of Symbolic One-class Learning from Imbalanced Datasets: Application in Medical Diagnosis

International Journal on Artificial Intelligence Tools, 2009

When working with real-world applications we often find imbalanced datasets: those for which there exists a majority class with normal data and a minority class with abnormal or important data. In this work, we give an overview of the class imbalance problem; we review the consequences, possible causes, and existing strategies for coping with the inconveniences associated with this problem. As an effort to contribute to the solution of this problem, we propose a new rule induction algorithm named Rule Extraction for MEdical Diagnosis (REMED), as a symbolic one-class learning approach. For the evaluation of the proposed method, we use different medical diagnosis datasets, taking into account quantitative metrics, comprehensibility, and reliability. We performed a comparison of REMED versus C4.5 and RIPPER combined with over-sampling and cost-sensitive strategies. This empirical analysis showed the REMED algorithm to be quantitatively competitive with C4.5 and RIPPER in terms of the area under the receiver operating characteristic curve (AUC) and the geometric mean, but to outperform them in terms of comprehensibility and reliability. The results of our experiments show that REMED generated rule systems with a larger degree of abstraction and patterns closer to the well-known abnormal values associated with each considered medical dataset.
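
The geometric mean used in that comparison is an imbalance-aware summary: the square root of the product of sensitivity and specificity, which collapses to zero if either class is ignored. The confusion counts below are toy values for illustration.

```python
import math

# Toy confusion counts for a diagnosis task:
# minority (abnormal) class on top, majority (normal) class below.
tp, fn = 18, 2
tn, fp = 70, 10

sensitivity = tp / (tp + fn)   # recall on the abnormal (minority) class
specificity = tn / (tn + fp)   # recall on the normal (majority) class

# Geometric mean: high only when BOTH classes are recognized well,
# unlike plain accuracy, which a majority-class guesser can inflate.
g_mean = math.sqrt(sensitivity * specificity)
```

A degenerate classifier that labels everything "normal" would score 88% accuracy on these counts but a geometric mean of zero, which is why the metric is preferred for imbalanced medical datasets.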

Research paper thumbnail of Services Composition Modeling for Migrating Users in Intelligent Spaces

IEEE Latin America Transactions , 2018

This research paper addresses the problem of modeling service between heterogeneous intelligent s... more This research paper addresses the problem of modeling service between heterogeneous intelligent spaces, essentially to share complex services. In the last decade, development-focused health care systems in intelligent environments have taken great interest, especially through the development of systems that have proven useful for medical monitoring of users. However, these approaches are limited to small geographic areas such as a home or hospital. Current methods in cooperative coordination of multi-agent systems have investigated the problem of composing services, but only among the agents of a single society, leaving open the question of composition of services between different societies of agents. The purpose of this research is the development of a model service composition that can be used in the migration of users between smart spaces.

Research paper thumbnail of A Novel Framework for Generating Personalized Network Datasets for NIDS, Based on Traffic Aggregation

Sensors, 2022

In this paper, we addressed the problem of dataset scarcity for the task of network intrusion det... more In this paper, we addressed the problem of dataset scarcity for the task of network intrusion detection. Our main contribution was to develop a framework that provides a complete process for generating network traffic datasets based on the aggregation of real network traces. In addition, we proposed a set of tools for attribute extraction and labeling of traffic sessions. A new dataset with botnet network traffic was generated by the framework to assess our proposed method with machine learning algorithms suitable for unbalanced data. The performance of the classifiers was evaluated in
terms of macro-averages of F1-score (0.97) and the Matthews Correlation Coefficient (0.94), showing a good overall performance average.

Research paper thumbnail of Role of Digital Transformation for Achieving Sustainability: Mediated Role of Stakeholders, Key Capabilities, and Technology

Sustainability, 2023

Sustainability through digital transformation is essential for contemporary businesses. Embracing... more Sustainability through digital transformation is essential for contemporary businesses. Embracing sustainability, micro-, small-, and medium-sized enterprises (MSMEs) can gain a competitive advantage, attracting customers and investors who share these values. Moreover, incorporating sustainable practices empowers MSMEs to drive innovation, reduce costs, and enhance their reputation. This study aims to identify how owners or senior managers of MSMEs can initiate a sustainable digital
transformation project. A systematic literature review was carried out, including 59 publications from 2019 to 2023. As a result, this research identifies the first steps owners of MSMEs can take to begin the transition by identifying critical organizational capabilities necessary for successful transformation, explores the technologies that can support MSMEs in their sustainability goals, and emphasizes the significance of stakeholders in achieving a successful digital transformation journey. Firstly, owners or senior managers should change the organizational culture to support decisions and strategies focus on sustainability. Secondly, the leading role of stakeholders is in the innovation process that allows businesses to be more competitive locally and globally. Finally, big data is the technology that can provide the most significant benefit to MSMEs because it will enable analyzing data of all kinds and contributes disruptively to decision-making.

Research paper thumbnail of Sustainable Digital Transformation for SMEs: A Comprehensive Framework for Informed Decision-Making

Sustainability, 2024

This study presents a sustainable digital transformation framework to integrate sustainable practices into digital transformation initiatives within Small and Medium Enterprises (SMEs). The methodology includes a literature review, framework creation, and a case study with passive participation. The framework was structured to help industries implement responsible digitalization initiatives in five key stages: setting objectives, fostering stakeholder-focused engagement, defining sustainable objectives and dimensions, creating a sustainable model, and executing the project. Validating the proposal in the context of an SME allowed us to discern the tangible benefits of sustainability practices for the organization, further reinforcing the relevance and applicability of the framework. In conclusion, this research offers valuable information for SMEs considering starting a sustainable digitalization process.

Research paper thumbnail of Enhancing Financial Risk Prediction with Symbolic Classifiers: Addressing Class Imbalance and the Accuracy–Interpretability Trade-off

Humanities and Social Sciences Communications , 2024

Machine learning for financial risk prediction has garnered substantial interest in recent decades. However, the class imbalance problem and the dilemma of gaining accuracy at the cost of interpretability have yet to be widely studied. Symbolic classifiers have emerged as a promising solution for forecasting banking failures and estimating creditworthiness, as they address class imbalance while maintaining both accuracy and interpretability. This paper aims to evaluate the effectiveness of REMED, a symbolic classifier, in the context of financial risk management, and focuses on its ability to handle class imbalance and provide interpretable decision rules. Through empirical analysis of a real-world imbalanced financial dataset from the Federal Deposit Insurance Corporation, we demonstrate that REMED effectively handles class imbalance, improving accuracy metrics while ensuring interpretability through a concise and easily understandable rule system. A comparative analysis is conducted against two well-known rule-generating approaches, J48 and JRip. The findings suggest that, with further development and validation, REMED can be implemented as a competitive approach to improve predictive accuracy on imbalanced financial datasets without compromising model interpretability.

Research paper thumbnail of Assessing Employee Susceptibility to Cybersecurity Risks

International Journal of Information Security and Privacy, 2024

This research challenges assumptions about cybersecurity risk factors, revealing that age, gender, and educational background are not significant determinants of employee susceptibility. It highlights the importance of inclusive cybersecurity training programs that cater to individuals of all age groups, dispelling the misconception that older employees are inherently less tech-savvy and more susceptible to cybersecurity threats. The findings show that cybersecurity teams within organizations significantly impact the adoption of security policies and data handling practices among employees, even though their influence on password and account security practices is limited. Organizations can adopt a holistic approach to cybersecurity training and awareness programs by leveraging these insights. This approach transcends traditional demographics and focuses on enhancing password and account security, ultimately strengthening cybersecurity postures, fostering a culture of cybersecurity consciousness, and fortifying defenses against the evolving landscape of digital threats.

Research paper thumbnail of Computer Simulation of Catheter Cryoablation for Pulmonary Vein Isolation

Healthcare, 2024

Cryoablation is a well-established medical procedure for surgically treating atrial fibrillation. Cryothermal catheter therapy induces cellular necrosis by freezing the insides of pulmonary veins, with the goal of disrupting abnormal electrical heart signals. Nevertheless, tissue damage induced by cold temperatures may also lead to other complications after cardiac surgery. In this sense, the simulation of catheter ablation can provide safer environments for training and the performance of cryotherapy interventions. Therefore, in this paper, we propose a novel approach to help better understand how temperature rates can affect this procedure by using computer tools to develop a simulation framework to predict lesion size and determine optimal temperature conditions for reducing the risk of major complications. The results showed that a temperature profile of around −40 °C caused less penetration, reduced necrotic damage, and a smaller lesion size in the tissue. In contrast, cryotherapy close to −60 °C achieved a greater depth of temperature flow inside the tissue and a larger cross-sectional area of the lesion. With further development and validation, the framework could represent a cost-effective strategy for providing personalized modeling, better planning of cryocatheter-based treatment, and preventing surgical complications.
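The paper's simulation framework is not reproduced here, but its qualitative result — a colder probe pushes the freeze front deeper into the tissue — can be illustrated with a minimal 1D transient heat-conduction sketch. All constants (diffusivity, domain depth, duration) are illustrative assumptions; a faithful model would solve the bioheat equation with perfusion and latent-heat terms:

```python
import numpy as np

def tissue_temperature(probe_temp_c, minutes=4.0, depth_cm=1.0, n=101):
    """Explicit finite-difference solution of 1D heat conduction in tissue.
    Left boundary: cryoprobe tip held at probe_temp_c; right boundary: body
    core at 37 C. Thermal diffusivity of soft tissue ~1.4e-3 cm^2/s
    (illustrative constant only)."""
    alpha = 1.4e-3                      # cm^2/s
    dx = depth_cm / (n - 1)
    dt = 0.4 * dx * dx / alpha          # stable: dt <= dx^2 / (2*alpha)
    steps = int(minutes * 60 / dt)
    T = np.full(n, 37.0)                # start at body temperature
    T[0] = probe_temp_c
    for _ in range(steps):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0], T[-1] = probe_temp_c, 37.0  # re-impose boundary conditions
    return T

x = np.linspace(0, 1.0, 101)           # cm from the probe tip
for probe in (-40.0, -60.0):
    T = tissue_temperature(probe)
    frozen = x[T <= 0.0].max()         # deepest point at or below 0 C
    print(f"probe {probe:.0f} C -> freeze front ~{frozen:.2f} cm deep")
```

Running the sketch shows the −60 °C profile reaching a deeper freeze front than −40 °C, consistent with the abstract's finding of greater penetration and larger lesion cross-section at colder probe temperatures.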

Research paper thumbnail of A Reliable Index for the Prognostic Significance of Blood Pressure Variability

Journal of Hypertension, 2005

Objectives: This study presents a reliable index, inspired by the total variability concept of real analysis in mathematics, called average real variability (ARV), for the prognostic significance of blood pressure variability (BPV), overcoming the pitfalls of the commonly used standard deviation (SD).
Background: Recent studies have suggested that an increase in BPV is associated with an increase in subsequent cardiovascular events/complications. However, there are other studies where the cited association was not found or was lost in the presence of other well-known risk factors. An explanation for these apparently contradictory results may be the choice of variability index (SD).
Methods: Ambulatory blood pressure monitoring (ABPM) in 312 subjects aged ≥55 years. Logistic regression models and survival methods were used to establish the prognostic significance of awake systolic BPV: in particular, (i) the performance of ARV versus SD, and (ii) the value of BPV relative to other well-known risk factors.
Results: The analyses using the ARV index showed a statistically significant relative risk of 4.548 (P = 0.006) for the group with high BPV with respect to the low BPV group (reference level); in contrast, the corresponding relative risk associated with the SD index was not statistically significant. Furthermore, ARV exhibited a predictive value similar to that of systolic blood pressure.
Conclusions: The proposed ARV index is a more reliable representation of time series variability than SD and may be less sensitive to the relatively low sampling frequency of ambulatory blood pressure monitoring devices. The results suggest that ARV adds prognostic value to ABPM and could prompt the use of therapeutic measures to control BPV.
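The ARV index has a simple closed form — the mean absolute difference between successive readings — which makes its contrast with SD easy to demonstrate on synthetic readings (not study data):

```python
import numpy as np

def arv(bp):
    """Average real variability: mean absolute difference between
    successive readings, so it tracks reading-to-reading swings
    rather than spread around the mean."""
    bp = np.asarray(bp, dtype=float)
    return np.mean(np.abs(np.diff(bp)))

# Two synthetic systolic BP series (mmHg) with the SAME values (hence
# identical mean and SD) but very different reading-to-reading behaviour.
smooth = np.array([105, 110, 115, 120, 125, 130, 135, 140])   # slow trend
erratic = np.array([105, 140, 110, 135, 115, 130, 120, 125])  # reshuffled

print(f"SD:  smooth={smooth.std(ddof=1):.1f}  erratic={erratic.std(ddof=1):.1f}")
print(f"ARV: smooth={arv(smooth):.1f}  erratic={arv(erratic):.1f}")
```

Because the two series contain the same values, SD cannot distinguish the gradual trend from the erratic swings (12.2 in both cases), while ARV separates them clearly (5.0 vs 20.0) — the property the abstract argues makes ARV a more reliable variability measure.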

Research paper thumbnail of Noise-tolerant Modular Neural Network System for Classifying ECG Signal

Informatica, 2019

Millions of electrocardiograms (ECG) are interpreted every year, requiring specialized training for accurate interpretation. Because automated and accurate classification of ECG signals will improve early diagnosis of heart conditions, several neural network (NN) approaches have been proposed for classifying ECG signals. Current strategies for a critical step, the preprocessing for noise removal, are still unsatisfactory. We propose a modular NN approach based on artificial noise injection to improve the generalization capability of the resulting model. The NN classifier initially performed a fairly accurate recognition of four types of cardiac anomalies in simulated ECG signals with minor, moderate, severe, and extreme noise, with average accuracies of 99.2%, 95.1%, 91.4%, and 85.2%, respectively. Ultimately, we discriminated normal and abnormal heartbeat patterns for single-lead raw ECG signals, obtaining 95.7% overall accuracy and 99.5% precision. Therefore, the proposed approach is a useful tool for the detection and diagnosis of cardiac abnormalities. Summary (translated from Slovenian): The article describes a modular neural network method for recognizing noisy ECG signals.
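The artificial-noise-injection idea can be sketched as a data-augmentation step: clean training beats are corrupted with Gaussian noise at graded severities before training. The SNR levels and toy sine-wave "beats" below are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_noise(beats, snr_db):
    """Augment clean training beats with white Gaussian noise at a target
    signal-to-noise ratio. The paper's four severity levels (minor to
    extreme) would correspond to four SNR settings like those below."""
    signal_power = np.mean(beats ** 2, axis=-1, keepdims=True)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=beats.shape)
    return beats + noise

# Toy "beats": 20 one-second traces sampled at 250 Hz.
t = np.linspace(0, 1, 250)
clean = np.tile(np.sin(2 * np.pi * 1.2 * t), (20, 1))

# Build an augmented training set spanning minor to extreme noise.
augmented = np.concatenate(
    [inject_noise(clean, snr) for snr in (20, 10, 5, 0)], axis=0)
print(augmented.shape)  # → (80, 250)
```

Training the classifier on such graded-noise copies of each beat is what lets it keep useful accuracy as real-world noise grows, instead of depending entirely on a preprocessing denoising stage.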

Research paper thumbnail of Extracting New Patterns for Cardiovascular Disease Prognosis

Expert Systems, 2009

Cardiovascular diseases constitute one of the main causes of mortality in the world, and machine learning has become a powerful tool for analysing medical data in the last few years. In this paper we present an interdisciplinary work based on an ambulatory blood pressure study and the development of a new classification algorithm named REMED. We focused on the discovery of new patterns for abnormal blood pressure variability as a possible cardiovascular risk factor. We compared our results with other classification algorithms based on Bayesian methods, decision trees, and rule induction techniques. In the comparison, REMED showed similar accuracy to these algorithms but it has the advantage of being superior in its capacity to classify sick people correctly. Therefore, our method could represent an innovative approach that might be useful in medical decision support for cardiovascular disease prognosis.

Research paper thumbnail of Machine Learning Approach to Extract Diagnostic and Prognostic Thresholds: Application in Prognosis of Cardiovascular Mortality

Computational and Mathematical Methods in Medicine, 2012

Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% general accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses.
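REMED derives its thresholds symbolically, but the underlying idea — choosing a cutoff on a continuous risk factor that trades sensitivity against a specificity target such as the 80% reported above — can be illustrated with a brute-force sweep on synthetic data (all values hypothetical):

```python
import numpy as np

def threshold_for_specificity(values, outcome, min_specificity=0.80):
    """Scan candidate cutoffs on a continuous risk factor (higher = riskier)
    and return the lowest cutoff whose specificity meets the target, with
    the sensitivity achieved there. Illustrative only -- REMED extracts
    its thresholds symbolically, not by this brute-force search."""
    values, outcome = np.asarray(values, float), np.asarray(outcome, int)
    for cut in np.sort(np.unique(values)):
        predicted_pos = values >= cut
        tn = np.sum(~predicted_pos & (outcome == 0))
        specificity = tn / np.sum(outcome == 0)
        if specificity >= min_specificity:
            tp = np.sum(predicted_pos & (outcome == 1))
            sensitivity = tp / np.sum(outcome == 1)
            return cut, sensitivity, specificity
    return None

# Synthetic risk factor: deaths (1) tend to have higher values.
rng = np.random.default_rng(1)
alive = rng.normal(10.0, 2.0, 200)
dead = rng.normal(13.0, 2.0, 40)
values = np.concatenate([alive, dead])
outcome = np.concatenate([np.zeros(200, int), np.ones(40, int)])
cut, sens, spec = threshold_for_specificity(values, outcome)
print(f"cutoff={cut:.2f}  sensitivity={sens:.2f}  specificity={spec:.2f}")
```

Fixing specificity first, as in the abstract, keeps false alarms bounded on the healthy majority; the sensitivity that remains (36% in the study) is then the honest measure of how many adverse outcomes the threshold can anticipate.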

Research paper thumbnail of Mobile Personal Health Monitoring for Automated Classification of Electrocardiogram Signals in Elderly

Computational and Mathematical Methods in Medicine, 2018

Mobile electrocardiogram (ECG) monitoring is an emerging area that has received increasing attention in recent years, but real-life validation for the elderly residing in low- and middle-income countries is still scarce. We developed a wearable ECG monitor that is integrated with a self-designed wireless sensor for ECG signal acquisition. It is used with a native, purposely designed smartphone application, based on machine learning techniques, for automated classification of captured ECG beats from aged people. When tested on 100 older adults, the monitoring system discriminated normal and abnormal ECG signals with a high degree of accuracy (97%), sensitivity (100%), and specificity (96.6%). With further verification, the system could be useful for detecting cardiac abnormalities in the home environment and contribute to prevention, early diagnosis, and effective treatment of cardiovascular diseases, while keeping costs down and increasing access to healthcare services for older persons.

Research paper thumbnail of Mobile Personal Health System for Ambulatory Blood Pressure Monitoring

Computational and Mathematical Methods in Medicine, 2013

The ARVmobile v1.0 is a multiplatform mobile personal health monitor (PHM) application for ambulatory blood pressure (ABP) monitoring that has the potential to aid in the acquisition and analysis of detailed profiles of ABP and heart rate (HR), improve the early detection of and intervention in hypertension, and detect potentially abnormal BP and HR levels for timely medical feedback. The PHM system consisted of an ABP sensor to detect BP and HR signals and a smartphone as receiver to collect the transmitted digital data and process them to provide immediate personalized information to the user. Android and BlackBerry versions were developed to detect and alert of potentially abnormal values, offer a friendly graphical user interface for elderly people, and provide feedback to professional healthcare providers via e-mail. ABP data were obtained from twenty-one healthy individuals (>51 years) to test the utility of the PHM application. The ARVmobile v1.0 was able to reliably receive and process the ABP readings from the volunteers. The preliminary results demonstrate that the ARVmobile v1.0 application could be used to perform a detailed profile of ABP and HR in an ordinary daily life environment, besides estimating potential diagnostic thresholds of abnormal BP variability measured as average real variability.

Research paper thumbnail of How Many Measurements Are Needed to Estimate Blood Pressure Variability Without Loss of Prognostic Information

American Journal of Hypertension, 2014

BACKGROUND: Average real variability (ARV) is a recently proposed index for short-term blood pressure (BP) variability. We aimed to determine the minimum number of BP readings required to compute ARV without loss of prognostic information.
METHODS: ARV was calculated from a discovery dataset that included 24-hour ambulatory BP measurements for 1,254 residents (mean age = 56.6 years; 43.5% women) of Copenhagen, Denmark. Concordance between ARV from full (≥80 BP readings) and randomly reduced 24-hour BP recordings was examined, as was prognostic accuracy. A test dataset that included 5,353 subjects (mean age = 54.0 years; 45.6% women) with at least 48 BP measurements from 11 randomly recruited population cohorts was used to validate the results.
RESULTS: In the discovery dataset, a minimum of 48 BP readings allowed an accurate assessment of the association between cardiovascular risk and ARV. In the test dataset, over 10.2 years (median), 806 participants died (335 cardiovascular deaths, 206 cardiac deaths) and 696 experienced a major fatal or nonfatal cardiovascular event. Standardized multivariable-adjusted hazard ratios (HRs) were computed for associations between outcome and BP variability. Higher diastolic ARV in 24-hour ambulatory BP recordings predicted (P < 0.01) total (HR = 1.12), cardiovascular (HR = 1.19), and cardiac (HR = 1.19) mortality and fatal combined with nonfatal cerebrovascular events (HR = 1.16). Higher systolic ARV in 24-hour ambulatory BP recordings predicted (P < 0.01) total (HR = 1.12), cardiovascular (HR = 1.17), and cardiac (HR = 1.24) mortality.
CONCLUSIONS: Forty-eight BP readings over 24 hours were observed to be adequate to compute ARV without meaningful loss of prognostic information.
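The concordance question can be sketched by randomly thinning simulated full recordings to 48 readings and correlating the two ARV estimates; the simulation below is a toy stand-in for the Copenhagen data, not a reproduction of it:

```python
import numpy as np

rng = np.random.default_rng(42)

def arv(bp):
    """Average real variability: mean absolute successive difference."""
    return np.mean(np.abs(np.diff(bp)))

def thin(recording, n_keep):
    """Randomly drop readings but keep chronological order, mimicking a
    sparser ambulatory monitoring schedule."""
    idx = np.sort(rng.choice(len(recording), size=n_keep, replace=False))
    return recording[idx]

# Simulate 500 subjects with full 80-reading 24-hour recordings whose
# true between-reading variability differs from subject to subject.
full_arv = np.empty(500)
reduced_arv = np.empty(500)
for i in range(500):
    sigma = rng.uniform(2.0, 12.0)            # subject-specific variability
    recording = 120 + rng.normal(0.0, sigma, 80)
    full_arv[i] = arv(recording)
    reduced_arv[i] = arv(thin(recording, 48))

r = np.corrcoef(full_arv, reduced_arv)[0, 1]
print(f"correlation between 80-reading and 48-reading ARV: r = {r:.2f}")
```

In this toy setting the between-subject differences in variability dominate the subsampling noise, so the 48-reading ARV ranks subjects almost identically to the full-recording ARV — the kind of concordance the study verified before testing prognostic accuracy.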

Research paper thumbnail of A Pilot Study of the Use of Emerging Computer Technologies to Improve the Effectiveness of Reading and Writing therapies in Children with Down Syndrome

British Journal of Educational Technology, 2017

Despite the potential benefits that computer approaches could provide for children with cognitive disabilities, research on and implementation of emerging approaches to learning supported by computing technology have not received adequate attention. We conducted a pilot study to assess the effectiveness of a computer-assisted learning tool, named "HATLE," for children with Down syndrome. The tool helps to improve reading and writing abilities in Spanish through mobile computing, multimedia design, and computer speech-recognition techniques. An experimental design with nonequivalent groups was used to assess the effectiveness of HATLE. The treatment group was taught using HATLE; the control group received typical instruction with the same material. Individual literacy achievement was assessed for both groups, before and after therapy sessions. The dependent variables in all analyses were posttest scores, adjusted via analysis of covariance (ANCOVA) for pretest variance. Differences between treatment and control groups were statistically significant in favor of the HATLE group on measures of Single-Word Reading (p = 0.048) and Handwriting-Form (p = 0.046) with large effect sizes (d > 0.8). Results indicate that HATLE might be effective in supporting computer-aided learning for children with intellectual disabilities. The results are discussed in terms of limitations and implications.
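The ANCOVA adjustment used in this design is equivalent to regressing posttest scores on pretest scores plus a group indicator, with the group coefficient giving the pretest-adjusted treatment effect. The sketch below uses simulated scores (the study's data are not public) and omits the significance test:

```python
import numpy as np

def ancova_group_effect(pre, post, group):
    """ANCOVA as linear regression: post = b0 + b1*pre + b2*group.
    b2 is the pretest-adjusted difference between treatment (1) and
    control (0) -- the quantity the study tests. Sketch only; a full
    analysis would also report the F test and p-value."""
    X = np.column_stack([np.ones_like(pre), pre, group])
    coef, *_ = np.linalg.lstsq(X, post, rcond=None)
    return coef[2]

rng = np.random.default_rng(7)
n = 30
pre = rng.normal(50, 10, n)
group = np.array([0] * 15 + [1] * 15)          # 0 = control, 1 = HATLE
# Simulate a true adjusted treatment effect of +6 points.
post = 5 + 0.9 * pre + 6.0 * group + rng.normal(0, 4, n)

effect = ancova_group_effect(pre, post, group)
print(f"adjusted treatment effect ~ {effect:.1f} points")
```

Adjusting for pretest variance this way is what makes the nonequivalent-groups design defensible: any baseline literacy difference between the two intact groups is absorbed by the pretest covariate rather than inflating the apparent treatment effect.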

Research paper thumbnail of 24-Hour Blood Pressure Variability Assessed by Average Real Variability: A Systematic Review and Meta-analysis

Journal of the American Heart Association, 2017

Background: Although 24-hour blood pressure (BP) variability (BPV) is predictive of cardiovascular outcomes independent of absolute BP levels, it is not regularly assessed in clinical practice. One possible limitation to routine BPV assessment is the lack of standardized methods for accurately estimating 24-hour BPV. We conducted a systematic review to assess the predictive power of reported BPV indexes to address appropriate quantification of 24-hour BPV, including the average real variability (ARV) index.
Methods and Results: Studies chosen for review were those that presented data for 24-hour BPV in adults from meta-analysis, longitudinal, or cross-sectional designs, and examined BPV in terms of the following issues: (1) methods used to calculate and evaluate ARV; (2) assessment of 24-hour BPV determined using noninvasive ambulatory BP monitoring; (3) multivariate analysis adjusted for covariates, including some measure of BP; (4) association of 24-hour BPV with subclinical organ damage; and (5) the predictive value of 24-hour BPV on target organ damage and rate of cardiovascular events. Of the 19 assessed studies, 17 reported significant associations between high ARV and the presence and progression of subclinical organ damage, as well as the incidence of hard end points, such as cardiovascular events. In all these cases, ARV remained a significant independent predictor (P < 0.05) after adjustment for BP and other clinical factors. In addition, increased ARV in systolic BP was associated with risk of all cardiovascular events (hazard ratio, 1.18; 95% confidence interval, 1.09-1.27). Only 2 cross-sectional studies did not find that high ARV was a significant risk factor.
Conclusions: Current evidence suggests that the ARV index adds significant prognostic information to 24-hour ambulatory BP monitoring and is a useful approach for studying the clinical value of BPV.

Research paper thumbnail of Mobile Personal Health Care System for Noninvasive, Pervasive, and Continuous Blood Pressure Monitoring: Development and Usability Study

JMIR Mhealth Uhealth, 2020

Background: Smartphone-based blood pressure (BP) monitoring using photoplethysmography (PPG) technology has emerged as a promising approach to empower users with self-monitoring for effective diagnosis and control of hypertension.
Objective: This study aimed to develop a mobile personal health care system for noninvasive, pervasive, and continuous estimation of BP level and variability that is user friendly for elderly people.
Methods: The proposed approach integrated a self-designed cuffless, calibration-free, wireless, and wearable PPG-only sensor and a native, purposely designed smartphone app that applies multilayer perceptron machine learning techniques to raw signals. We performed a development and usability study with three older adults (mean age 61.3 years, SD 1.5 years; 66% women) to test the usability and accuracy of the smartphone-based BP monitor.
Results: The employed artificial neural network model had good average accuracy (>90%) and very strong correlation (>0.90) (P<.001) for predicting the reference BP values of our validation sample (n=150). Bland-Altman plots showed that most of the errors for BP prediction were less than 10 mmHg. However, according to the Association for the Advancement of Medical Instrumentation and British Hypertension Society standards, only diastolic blood pressure prediction met the clinically accepted accuracy thresholds.
Conclusions: With further development and validation, the proposed system could provide a cost-effective strategy to improve the quality and coverage of health care, particularly in rural zones, areas lacking physicians, and areas with solitary elderly populations.
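The Bland-Altman analysis mentioned in the abstract reduces to the mean bias and 95% limits of agreement of the prediction errors; a sketch on simulated cuff-vs-PPG readings (not the study's validation sample) shows the computation:

```python
import numpy as np

def bland_altman_limits(reference, predicted):
    """Bland-Altman agreement statistics: mean bias and 95% limits of
    agreement (bias +/- 1.96 * SD of the differences). The abstract's
    observation that most errors were below 10 mmHg amounts to these
    limits falling inside the +/-10 mmHg band."""
    diff = np.asarray(predicted, float) - np.asarray(reference, float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Toy validation sample: cuff reference vs. PPG-predicted diastolic BP,
# with a small systematic bias and ~4 mmHg random spread (assumed values).
rng = np.random.default_rng(5)
reference = rng.normal(80, 8, 150)
predicted = reference + rng.normal(1.0, 4.0, 150)

bias, low, high = bland_altman_limits(reference, predicted)
print(f"bias={bias:.1f} mmHg, 95% limits of agreement [{low:.1f}, {high:.1f}]")
```

Reporting both the bias and the limits, rather than a correlation alone, is what the AAMI and BHS protocols require: a device can correlate strongly with the reference and still fail if its error spread exceeds the clinical bands.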

Research paper thumbnail of Marco para Automatizar la Selección de un Mecanismo de Coordinación entre Sociedades de Agentes

DYNA INGENIERIA E INDUSTRIA, 2017

The growing availability of new and better technology is turning homes into environments where technological devices are integrated naturally and assimilated quickly and transparently by users. In this sense, the automation of processes or services such as switching lights on and off, controlling the home climate temperature, and opening or unlocking doors is increasingly common [1]. Consequently, the next step in this technology/spaces evolution is the development of intelligent environments equipped with a network of interconnected devices (sensors and actuators) that work as a single element and whose operation is interrelated, thus allowing the integration of complex services, understood as the combination of new functionalities that are available but not yet exploited by the system [2].

Research paper thumbnail of Symbolic One-class Learning from Imbalanced Datasets: Application in Medical Diagnosis

International Journal on Artificial Intelligence Tools, 2009

When working with real-world applications we often find imbalanced datasets, those for which there exists a majority class with normal data and a minority class with abnormal or important data. In this work, we give an overview of the class imbalance problem; we review the consequences, possible causes, and existing strategies to cope with the inconveniences associated with this problem. As an effort to contribute to the solution of this problem, we propose a new rule induction algorithm named Rule Extraction for MEdical Diagnosis (REMED), as a symbolic one-class learning approach. For the evaluation of the proposed method, we use different medical diagnosis datasets, taking into account quantitative metrics, comprehensibility, and reliability. We performed a comparison of REMED versus C4.5 and RIPPER combined with over-sampling and cost-sensitive strategies. This empirical analysis of the REMED algorithm showed it to be quantitatively competitive with C4.5 and RIPPER in terms of the area under the Receiver Operating Characteristic curve (AUC) and the geometric mean, but to outperform them in terms of comprehensibility and reliability. Results of our experiments show that REMED generated rule systems with a larger degree of abstraction and patterns closer to well-known abnormal values associated with each considered medical dataset.
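The geometric mean used in the comparison is the square root of sensitivity times specificity, which collapses to zero for a classifier that ignores the minority class — exactly the failure mode on imbalanced medical data. A small sketch with toy labels (not the paper's datasets):

```python
import numpy as np

def g_mean(y_true, y_pred):
    """Geometric mean of sensitivity and specificity -- high only when the
    classifier does well on BOTH classes, which is why it suits the
    imbalanced medical datasets discussed above."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    sens = np.mean(y_pred[y_true == 1] == 1)
    spec = np.mean(y_pred[y_true == 0] == 0)
    return np.sqrt(sens * spec)

# A "majority-class" classifier looks 95% accurate but has g-mean 0.
y_true = [0] * 95 + [1] * 5
always_healthy = [0] * 100
balanced = [0] * 90 + [1] * 5 + [0] * 1 + [1] * 4

print(g_mean(y_true, always_healthy))  # → 0.0
print(g_mean(y_true, balanced))
```

This is why the comparison with C4.5 and RIPPER reports AUC and g-mean rather than plain accuracy: both metrics penalize a model that sacrifices the rare (sick) class to look good on the common (healthy) one.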