Probabilistic risk modeling: Research Papers
A good understanding of climate change damages is vital to design effective adaptation policies and measures. Using a dataset of probabilistic sea-level rise projections and of flood damage and protection cost curves for the 600 largest European coastal cities, we generate stochastic damage curves and their distributions with and without adaptation. We apply the Generalized Extreme Value distribution to characterize the distributions and calculate two risk measures, the Value at Risk and the Expected Shortfall, which help quantify the magnitude and probability of high-end sea-level rise represented by the upper tail of the distribution. This allows the costs of sea-level rise to be estimated (that is, in addition to other costs related to coastal extreme events) and supports decision-makers in integrating the high uncertainty of future projections. This knowledge is necessary for adequate risk management that does not underestimate risk. Furthermore, it allows city planners to tailor their risk tolerance. A great number of cities in Europe are currently undertaking adaptation plans or have already done so; making these findings available should therefore be of great value in informing these processes.
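As a hedged illustration of the two tail-risk measures named above (not the authors' code), a fitted GEV can yield the Value at Risk as a quantile and the Expected Shortfall as the mean of the tail beyond it; the shape, location, and scale parameters below are hypothetical placeholders.

```python
# Sketch: VaR and Expected Shortfall from a fitted GEV damage distribution.
# Parameters are hypothetical placeholders, not results from the paper.
import numpy as np
from scipy import stats

shape, loc, scale = -0.2, 100.0, 30.0    # hypothetical GEV fit (damages, M EUR)
gev = stats.genextreme(c=shape, loc=loc, scale=scale)

alpha = 0.95
var = gev.ppf(alpha)                     # Value at Risk: the 95th percentile

# Expected Shortfall: mean damage conditional on exceeding the VaR,
# estimated here by Monte Carlo sampling of the fitted distribution.
samples = gev.rvs(size=1_000_000, random_state=np.random.default_rng(0))
es = samples[samples >= var].mean()

print(f"VaR(95%) = {var:.1f} M EUR, ES(95%) = {es:.1f} M EUR")
```

Note that SciPy's `genextreme` shape parameter uses the opposite sign convention to the usual GEV xi, so a negative `c` here corresponds to a heavy upper tail.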
African swine fever is a viral disease of pigs that can cause up to 100% mortality among infected animals and causes huge economic losses in the Russian Federation. A space-time analysis of the disease spread, conducted by means of ArcGIS tools, allowed us to develop a model that detects high-risk areas of infection diffusion from an affected site. Recommendations were proposed for veterinary services to improve preventive measures by adjusting the sizes and shapes of surveillance zones.
The theory of dual numerical means of random and experienced variables is briefly described in the framework of the new theory of experience and chance, which arises as an axiomatic synthesis of two dual theories: the Kolmogorov theory of probability and the theory of believability. A new term is introduced for the numerical mean of an experienced variable, the mathematical reflection, which is dual to the mathematical expectation of a random variable within the framework of the new theory. The basic properties and examples of dual numerical means are considered.
Accounting for the corrosion rate during the remaining-life assessment of aging pipelines is fundamental to calculating the interval between two consecutive inspections. A total of 798 internal and external corrosion defects were detected, using a Magnetic Flux Leakage intelligent inspection tool, over 48 km of a pipeline located in the western region of Algeria. Statistical analysis showed a strong correlation between corrosion defect length and corrosion defect circumferential width, with a significant correlation coefficient of 82.87%. A probabilistic methodology is presented for assessing the remaining life of a corroded pipeline transporting gas, and the finite element method (FEM) was used to assess the pipeline failure pressure. The numerical FEM results were compared with commonly used code models for calculating the limit pressure, in order to establish a more realistic and accurate engineering model. The reliability analysis of an Algerian natural gas pipeline made of API 5L X60 steel, in service for thirty years and exposed to active corrosion attack, is presented. A sensitivity analysis of the basic random variables within the nonlinear limit state function was carried out to bring out the relative contribution of each variable affecting the remaining life of corroded pipelines. The reliability analysis uses Breitung's formula, based on the second-order reliability method (SORM). The reliability assessment of the corroded pipeline relies on the notched failure assessment diagram (NFAD), different codes for the calculation of the failure pressure, and numerical results from finite element analyses (FEA) using ANSYS.
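As a hedged sketch of the limit-state idea behind such an assessment (not the paper's SORM model or any specific code formula), a Monte Carlo estimate of the failure probability can compare a simplified burst pressure against the operating pressure; every distribution and constant below is an invented placeholder.

```python
# Sketch: limit state g = P_fail - P_op for a corroded pipe section; failure
# when g < 0. Simplified burst expression and placeholder values throughout.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1_000_000

t  = rng.normal(12.7, 0.4, n)               # wall thickness, mm
d  = rng.lognormal(np.log(3.0), 0.3, n)     # defect depth, mm
su = rng.normal(565.0, 20.0, n)             # ultimate tensile strength, MPa
D  = 914.0                                  # outer diameter, mm
p_op = 10.0                                 # operating pressure, MPa

# Illustrative burst-pressure expression (not a specific code model)
p_fail = (2.0 * su * t / D) * (1.0 - d / t)

pf = np.mean(p_fail - p_op < 0.0)           # Monte Carlo failure probability
beta = -norm.ppf(pf)                        # Cornell-style reliability index
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")
```

In a FORM/SORM analysis the same limit state would instead be linearised (or quadratically approximated) at the design point; the Monte Carlo version is simply the most direct way to make the idea concrete.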
Corrosion is one of the major causes of failure in onshore and offshore oil and gas operations. Microbiologically influenced corrosion (MIC) is inherently more complex to predict, detect and measure because, for instance, the presence of biofilm and/or bacterial products is not sufficient to indicate active microbiological corrosion. The major challenge for current MIC models is to correlate the factors that influence corrosion (i.e. chemical, physical, biological and molecular variables) with the potential for MIC. Previous work has treated the potential for MIC as a simple product of multiple factors, without fully considering the synergy or interference among them. The present work proposes a network-based approach to analyse and predict MIC potential, considering the complex interactions among a total of 60 influencing factors and 20 screening parameters. The proposed model captures the complex interdependences and synergistic interactions of the factors used to assess MIC potential, and uses an object-oriented approach based on a Bayesian network. The model has been tested and verified using real data from a pipeline leakage incident caused by MIC. The proposed model constitutes a significant step in deepening the understanding of when MIC occurs and of its predictability.
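As a toy illustration of the Bayesian-network approach (the paper's model covers 60 influencing factors and 20 screening parameters; this sketch has two), a minimal network can be assembled with the pgmpy library; every node name and probability below is a made-up placeholder.

```python
# Toy Bayesian network for MIC potential (placeholder structure and CPDs).
# Requires: pip install pgmpy
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Biofilm", "MIC"), ("Nutrients", "MIC")])

cpd_biofilm = TabularCPD("Biofilm", 2, [[0.7], [0.3]])      # P(absent), P(present)
cpd_nutrients = TabularCPD("Nutrients", 2, [[0.6], [0.4]])
cpd_mic = TabularCPD(                                        # P(MIC | parents)
    "MIC", 2,
    [[0.99, 0.90, 0.85, 0.40],   # MIC potential = low
     [0.01, 0.10, 0.15, 0.60]],  # MIC potential = high
    evidence=["Biofilm", "Nutrients"], evidence_card=[2, 2],
)
model.add_cpds(cpd_biofilm, cpd_nutrients, cpd_mic)
assert model.check_model()

# Update the MIC potential after observing biofilm presence
posterior = VariableElimination(model).query(["MIC"], evidence={"Biofilm": 1})
print(posterior)
```

The appeal of the network form, as the abstract argues, is exactly this conditioning step: interacting factors update each other instead of being multiplied independently.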
For a long time, one of my dreams was to describe the nature of uncertainty axiomatically, and it looks like I've finally done it in my co∼eventum mechanics! Now it remains for me to explain the co∼eventum mechanics to everyone in the most approachable way. This is what I'm trying to do in this work. The co∼eventum mechanics is another name for the co∼event theory, i.e., for the theory of experience and chance which I axiomatized in 2016 [1, 2]. In my opinion, this name best reflects the co∼event-based idea of the new dual theory of uncertainty, which combines probability theory, as a theory of chance, with its dual half, believability theory, as a theory of experience. In addition, I like that this new name indicates a direct connection between the co∼event theory and quantum mechanics, which is intended for the physical explanation and description of the conflict between quantum observers and quantum observations [4]. Since my theory of uncertainty satisfies the Kolmogorov axioms of probability theory, to explain the co∼eventum mechanics I will use a way analogous to the already tested one, which explains the theory of probability as a theory of chance describing the results of a random experiment. The simplest example of a random experiment in probability theory is "tossing a coin". Therefore, I decided to use this simplest random experiment itself, as well as its two analogies, "flipping a coin" and "spinning a coin", to explain the co∼eventum mechanics, which describes the results of a combined experienced random experiment. I would like to resort to the "coin-based" analogy that is usual for probability theory in order to explain, first of all for myself, the logic of the co∼eventum mechanics as a logic of experience and chance. Of course, this analogy may seem strange, if not crazy. But I did not come up with a better way of tying the explanations of the logic of the co∼eventum mechanics to the coin-based explanations commonly used in probability theory: a simple visual "coin-based" model that clarifies what occurs as a result of a combined experienced random experiment, in which the experience of the observer faces the chance of the observation. I hope this analogy can be useful not only for me in understanding the co∼eventum mechanics.
In this work we carry out a task that is fundamentally one of information gathering, in order to give an account of the public policies in El Salvador related to risk management and to preventive and community-organization measures against vulnerabilities in its territory and habitat deriving from climate change and seismic activity, especially disasters originating in hurricanes, cyclones, floods and earthquakes. We collect the deliberations and opinions of Salvadoran officials in the framework of recent regional meetings concerning risk prevention and the vulnerability of the Central American region, and of El Salvador in particular.
We aim to develop and improve imbalanced business risk modeling by jointly using proper evaluation criteria, resampling, cross-validation, classifier regularization, and ensembling techniques. The Area Under the Receiver Operating Characteristic Curve (AUC of ROC) is used for model comparison, based on 10-fold cross-validation. Two undersampling strategies, random undersampling (RUS) and cluster centroid undersampling (CCUS), as well as two oversampling methods, random oversampling (ROS) and the Synthetic Minority Oversampling Technique (SMOTE), are applied. Three highly interpretable classifiers are implemented: logistic regression without regularization (LR), L1-regularized LR (L1LR), and decision tree (DT). Two ensembling techniques, Bagging and Boosting, are applied to the DT classifier for further model improvement. The results show that Boosting on DT using oversampled data containing 50% positives via SMOTE is the optimal model, achieving AUC, recall, and F1 score values of 0.8633, 0.9260, and 0.8907, respectively.
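A minimal sketch of the kind of pipeline the abstract describes (SMOTE oversampling inside cross-validation, plus boosted decision trees scored by ROC AUC); the synthetic dataset and all hyperparameters are placeholders, not the study's setup.

```python
# Sketch: SMOTE + boosted decision trees, evaluated by ROC AUC with 10-fold
# stratified cross-validation. Synthetic data stands in for the study's.
# Requires: pip install scikit-learn imbalanced-learn
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline

X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)

pipe = Pipeline([
    # sampling_strategy=1.0 balances classes, i.e. ~50% positives after SMOTE
    ("smote", SMOTE(sampling_strategy=1.0, random_state=0)),
    ("boost", AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=3),
        n_estimators=200, random_state=0)),
])

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(pipe, X, y, scoring="roc_auc", cv=cv)
print(f"Mean AUC over 10 folds: {auc.mean():.4f}")
```

Using the imblearn `Pipeline` matters here: it applies SMOTE only to each training fold, so the cross-validated AUC is not inflated by synthetic points leaking into the test folds.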
After the nuclear accident at Fukushima Daiichi, Japan, in 2011, many countries identified the need to improve the environmental and nuclear licensing procedures for their nuclear reactors, in order to consolidate the lessons learned from this event. Generally, the safety assessment during the licensing process is performed in a deterministic way, supplemented by probabilistic methods. In Brazil, nuclear licensing is conducted by the Comissão Nacional de Energia Nuclear (CNEN), and environmental licensing is the responsibility of the Instituto Brasileiro de Meio Ambiente e dos Recursos Naturais Renováveis (IBAMA). The licensing model of these two agencies, in the aspects of safety analysis and risk management, is deterministic in essence. In nuclear licensing, the behavior of the nuclear power plant after an initiating event is studied with calculation and simulation methods describing the physical phenomena that occur in nuclear reactors. The Probabilistic Safety Assessment (PSA) focuses on the sequences of events that can lead to reactor core meltdown and on reliability studies of safety systems. In environmental licensing, a Quantitative Risk Analysis (QRA) is required, including probabilistic assessments, to support the Risk Analysis Study (EAR), the Risk Management Program, and the Emergency Plan of the plant. This master's thesis describes the existing methodologies in Brazil and other countries regarding PSA and its implications for the environmental and nuclear licensing of nuclear power plants, in order to identify opportunities for improving the licensing process for Brazilian nuclear reactors and to improve the safety assessment of systems and the reliability of the reactors.
You yourself, or what is the same, your experience, are such a ``coin'' that, while you are not questioned, rotates all the time in ``free flight''. Only when you answer the question does the ``coin'' fall on one of its sides, ``Yes'' or ``No'', with the believability that your experience tells you.
Manufacturers are choosing direct marketing channels over conventional marketing channels to keep their distribution costs low. Network marketing, or multilevel marketing (MLM), is a technique by which companies recruit and use part-time representatives to sell their products on a commission basis. Much debate exists as to whether MLM is really the cheaper option and whether, in reality, it gives channel members the returns promised. In the present study, an attempt has been made to mathematically explain the behavior of such networks and to predict the probability of a network's success at various levels.
An interest rate is a price that applies to a sum of money lent or borrowed. Broadly, interest rates are formed on a market (the capital market) through the meeting of a supply of and a demand for capital. In the context of this internship, we focus mainly on the zero-coupon rate.
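For context (a standard definition, not a formula quoted from this work): the zero-coupon rate for maturity T is the rate implied by the price of a zero-coupon bond paying 1 at T; under continuous compounding,

```latex
% Zero-coupon bond price P(0,T) and zero-coupon (spot) rate R(0,T),
% with continuous compounding:
P(0,T) = e^{-T \, R(0,T)}
\qquad\Longleftrightarrow\qquad
R(0,T) = -\frac{\ln P(0,T)}{T}.
```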
The paper proposes a probabilistic model for the quantitative assessment of the impact of risk events on the costs and completion time of road infrastructure projects. To build the probabilistic model, including accounting for the uncertainty of the main variables and risk events, the Monte Carlo simulation method was applied using the specialized software product @RISK 7.5.1. The creation and introduction of such a model will enable the participants in the preparation and execution of a road project, such as contractors, contracting authorities, and beneficiaries, to have maximum access to advance information and influence over the potential events threatening the project's objectives.
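A minimal open-source sketch in the spirit of the model described (not @RISK itself): base cost items drawn from triangular distributions plus a discrete risk event, with the total cost distribution summarized by its mean and an upper percentile; all numbers are placeholders.

```python
# Monte Carlo cost-risk sketch for a road project (placeholder figures).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Base cost items as triangular(low, mode, high), in millions
earthworks = rng.triangular(2.0, 2.5, 3.5, n)
pavement   = rng.triangular(4.0, 5.0, 7.0, n)

# A discrete risk event: Bernoulli occurrence times a triangular impact
occurs = rng.random(n) < 0.15
landslide_cost = np.where(occurs, rng.triangular(0.3, 0.8, 2.0, n), 0.0)

total = earthworks + pavement + landslide_cost
print(f"Mean cost: {total.mean():.2f}")
print(f"P80 cost:  {np.quantile(total, 0.80):.2f}")
```

Reading off a P80-style contingency from the simulated distribution is exactly the kind of advance information the abstract argues contractors, contracting authorities, and beneficiaries should have.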
The NRC is tasked with the assessment, inspection, and regulation of commercial nuclear facilities. In recent years, the scientific community has been able, through analysis of operating events both in the United States and abroad, to determine that management and organization, including safety culture, may play an important role in event initiation, management, and recovery. Moreover, there may be a need to deepen the understanding of processes at nuclear power plants (NPPs) that relate to risk, and to develop tools that assist staff in inspecting and assessing the management and organization factors that influence safety, including safety culture. Since safety culture is believed to be a leading indicator of human and plant performance, its assessment will contribute to more effective and proactive regulation.
Data from nuclear power plant (NPP) simulator environments reflect important aspects of human performance and, potentially, the causes of human error. However, the information collected is constrained by the uniqueness of each particular plant and the design of each particular simulator study. Thus, the ability to apply the findings of a given simulator study to other studies of human performance at NPPs is often limited. A well-defined approach to experimental design, a common language or measurement technique for describing and classifying errors and human actions, and an associated theoretical underpinning for the design and resulting statistical analyses are required to allow extrapolation of the results of simulator studies across plant designs, crew make-ups, and scenarios.
The purpose of this report is threefold: to propose an approach for collecting human performance data from NPP simulators that employs the reliability engineering (RE) concept of limit state; to describe the process for collecting the data; and to present illustrative examples of data analyses. This approach borrows from techniques applied in traditional RE while using experimental techniques derived from the behavioral sciences.
The ultimate aim of the proposed approach is to achieve greater consensus, consistency and convergence of human reliability analysis (HRA) methods by stimulating simulator data exchange, greater communication, and review within the HRA research and practitioner community. The overall objective of this work is in accordance with SRM-M090204B.
This paper presents a review of available information related to human performance in support of Human Reliability Analysis (HRA) performed for nuclear power plants (NPPs). A number of data sources are identified as potentially useful. These include NPP licensee event reports, augmented inspection team reports, operator requalification data, results from the literature in experimental psychology, and the Aviation Safety Reporting System. The paper discusses how utilizing such information improves our capability to model and quantify human performance. In particular, it discusses how information related to performance shaping factors can be extracted from empirical data to determine their effect sizes, their relative effects, and their interactions. The paper concludes that appropriate use of existing sources can help address some of the important issues we currently face in HRA.
Introduction to Probability and Statistics, thirteenth edition, retains the straightforward presentation and traditional outline for descriptive and inferential statistics, and incorporates useful learning aids, such as the My Personal Trainer, My Applet, and My Tip features, to ensure that students learn and understand the relevance of the material. In addition to showing how to apply statistical procedures, the authors explain how to meaningfully describe real data sets, what statistical tests mean in terms of their practical applications, how to evaluate the validity of the assumptions behind statistical tests, and what to do when statistical assumptions have been violated.
FEATURES • Broad coverage: offers a more rigorous treatment, with traditional coverage of probability. More than 35 years of teaching and writing experience contribute to the clear exposition, interesting examples, and effective exercises. • Real data: the first to incorporate case studies and real data, Mendenhall/Beaver/Beaver continues to set the standard. Many examples and exercises use authentic data sets, helping students see the connections between their studies and their lives.
A review of the Reactor Oversight Process (ROP) and its characterization of human performance was performed by the Idaho National Engineering and Environmental Laboratory (INEEL) to describe the means by which the Nuclear Regulatory Commission (NRC) monitors, analyzes and feeds back information on human performance. Review of detailed human performance findings and trends observed in 37 operating events identified through the Accident Sequence Precursor (ASP) program served as the sample of operating experience. All events reviewed had a conditional core damage probability of 1.0E-5 or greater and indicated the influence of human performance. Reviews also considered Individual Plant Examinations (IPEs) and Augmented Inspection Team (AIT) reports.
These reviews were then compared to ROP source materials. The ROP source materials included SECY-99-007/007A, SECY-00-0049, NRC manual chapters and inspection procedures, inspection and supplementary inspection reports, plant issues matrices (PIMs), risk-informed inspection notebooks, and the Significance Determination Process (SDP) for Operator Requalification. Insights regarding the characterization of human performance in the ROP are presented.
This report presents the findings of a study of the contributions of human performance to risk in operating events at commercial nuclear power plants. The Nuclear Regulatory Commission (NRC) Accident Sequence Precursor (ASP) Program and the Human Performance Events Database (HPED) were used to identify safety-significant events in which human performance was a major contributor to risk. Conditional core damage probabilities (CCDPs) were calculated for these events using Systems Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) software and Standardized Plant Analysis Risk (SPAR) models.
Forty-eight events described in licensee event reports and augmented inspection team reports were reviewed. Human performance did not play a role in 11 of the events, so they were excluded from the sample. The remaining 37 events were qualitatively analyzed. Twenty-three of these 37 events were also analyzed using SPAR models and methods. Fourteen events were excluded from the SPAR analyses because they involved operating modes or conditions outside the scope of the SPAR models.
The results showed that human performance contributed significantly to the analyzed events. Two hundred and seventy human errors were identified in the events reviewed, and multiple human errors were involved in every event. Latent errors (i.e., errors committed prior to the event whose effects are not discovered until an event occurs) were present four times more often than active errors (i.e., those occurring during event response). The latent errors included failures to correct known problems and errors committed during design, maintenance, and operations activities. The results of this study indicate that multiple errors in events contribute to the probabilistic risk assessment (PRA) basic events present in SPAR models, and that the underlying models of dependency in HRA may warrant further attention.
We present an exact probability distribution (meta-distribution) for p-values across ensembles of statistically identical phenomena, as well as the distribution of the minimum p-value among m independent tests. We derive the distribution...
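The abstract is truncated above, but the second object it names has a standard closed form worth recording (a well-known fact, not quoted from the paper): if the m p-values are independent and uniform on [0, 1] under the null, their minimum follows a Beta(1, m) law.

```latex
% CDF of the minimum of m independent null p-values, each U(0,1):
\Pr\!\left(\min_{1 \le i \le m} p_i \le x\right) = 1 - (1 - x)^m,
\qquad x \in [0,1].
```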
Failure scenario analysis constitutes one of the cornerstones of risk assessment and availability analysis. After a detailed review of available methods, this paper identifies two distinct formalisms for analyzing failure scenarios and system availability: generalized stochastic Petri nets (GSPN) and fault tree driven Markov processes (FTDMP). The FTDMP formalism is a combination of the Markov process and the fault tree; it aims to overcome fault tree limitations while maintaining the use of deductive logic. The GSPN is a Petri net with probabilistic analysis using Monte Carlo simulation. The effectiveness of both methods is studied through an emergency flare system including a knockout drum. It is observed that GSPN provides a robust and reliable mechanism for accident scenario analysis. It provides additional information, such as event frequencies in operating and failing modes and the expected occurrence timing and durations resulting from different complex sequences, even for multi-state variables, which could be used to design a safety management system. Although FTDMP is a powerful formalism, it provides more limited information.
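A stripped-down flavor of the stochastic-simulation side (a single repairable component rather than a full GSPN): Monte Carlo availability with exponential failure and repair times; the rates are placeholders.

```python
# Minimal Monte Carlo availability sketch for one repairable component
# (a toy stand-in for GSPN simulation; rates are placeholders).
import numpy as np

rng = np.random.default_rng(7)
lam, mu = 1e-3, 1e-1        # failure and repair rates (per hour)
horizon = 10_000.0          # mission time (hours)
n_runs = 2_000

up_time = 0.0
for _ in range(n_runs):
    t, up = 0.0, True
    while t < horizon:
        dt = rng.exponential(1.0 / (lam if up else mu))
        dt = min(dt, horizon - t)
        if up:
            up_time += dt
        t += dt
        up = not up

print(f"Estimated availability: {up_time / (n_runs * horizon):.4f}")
# Steady-state value mu / (lam + mu) ≈ 0.9901 serves as a sanity check.
```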
Reliability analysis and multivariate statistical fitting are valuable techniques that enhance the scientific basis of regulatory decisions in geotechnical problems. This study introduces the use of several R packages specifically developed to assist risk assessors in their geotechnical projects. First, it presents the fitting of parameterised models to the distribution of observed samples, to the dependence structures among variables, or to both. Second, the most popular reliability analysis methods, such as the first- and second-order reliability methods and the random sampling simulation method, are implemented in R. The efficiency of implementing these classical approximation methods is demonstrated through two example problems.
A methodology is developed in this paper to estimate the time-dependent reliability of a pipeline in Algeria, which has been in operation since 1981 and in which internal and external localized corrosion defects were detected in 2009. A probabilistic approach was applied to a pipeline made of API 5L X60 steel for the long-distance transportation of natural gas (55 km), which crosses several geographical reliefs in different areas of the country. The analysis was carried out by taking into consideration the stochastic variables that drastically alter the reliability of the pipeline, i.e., the dimensions of the localized corrosion defects, their locations and distribution, and the corrosion rate. The correlations between the spatial distribution of the pipeline's corrosion defects and their dimensions, obtained using an intelligent inspection tool, are used to analyze the actual in-service corrosion rate and to enhance the accuracy of the reliability assessment. The failure scenario was taken as the moment when the pipeline operating pressure exceeds the failure pressure defined analytically in accordance with different commonly used standards. The reliability index of the corroded pipeline subjected to internal pressure and the corresponding failure probability were assessed. The detailed reliability analysis is carried out using the second-order reliability method (SORM) for basic random variables with different probability densities within a nonlinear limit state function. The latter is based on the limit analysis of the failure pressure model for different standards and on numerical analysis using the finite element method and ANSYS software.
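A hedged sketch of the time-dependent element only (the paper uses SORM; this uses plain Monte Carlo): defect depth grows as d(t) = d0 + v·t and the failure probability is re-evaluated on a yearly grid. Every number and the burst expression are placeholders.

```python
# Sketch: time-dependent failure probability with linear corrosion growth
# d(t) = d0 + v*t (placeholder model and values).
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
t_wall, D, p_op = 12.7, 914.0, 7.0             # mm, mm, MPa (placeholders)
su = rng.normal(565.0, 20.0, n)                # tensile strength, MPa
d0 = rng.lognormal(np.log(2.0), 0.4, n)        # initial defect depth, mm
v  = rng.lognormal(np.log(0.1), 0.5, n)        # corrosion rate, mm/year

for year in (0, 10, 20, 30):
    depth = np.clip(d0 + v * year, 0.0, t_wall)
    p_fail = (2.0 * su * t_wall / D) * (1.0 - depth / t_wall)
    pf = np.mean(p_fail < p_op)
    print(f"t = {year:2d} y: Pf ≈ {pf:.2e}")
```

The monotone growth of Pf(t) is what drives the re-inspection interval logic mentioned in the related pipeline abstract above.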
Clinical workflows are safety-critical workflows, as they have the potential to cause harm or death to patients. Their safety needs to be considered as early as possible in the development process. Effective safety analysis methods are required to ensure the safety of these high-risk workflows, because errors that happen in a routine workflow can propagate within it and result in harmful failures of the system's output. This paper shows how to apply an approach for the safety analysis of clinical workflows to the workflow within a radiology department, and evaluates the approach in terms of usability and benefits. The outcomes of using this approach include identification of the root causes of hazardous workflow failures that may put patients' lives at risk. We show that the approach is applicable to this area of healthcare and adds value through detailed information on possible failures, both their causes and their effects; it therefore has the potential to improve the safety of radiology and other clinical workflows.
Modern financial modeling makes extensive use of scaling laws to model market fluctuations, through the stochastic processes in the equations of stock price behavior. Initially implicit, present in Brownian motion but not considered as such, scaling laws resurfaced explicitly as a modeling issue when the financial market supervision authorities drew the attention of banking institutions to the problem of risk regulation, which is poorly captured by conventional Gaussian distributions. The components of this discussion are introduced by following the Mandelbrot fractal models as a guideline, in the wake of which the main conflicts in the modelling of stock market fluctuations over the last forty years have arisen. It is shown that the issue of the existence of scaling laws is related to the more profound issue of the temporal characteristics of markets.
Background: The novel coronavirus (2019-nCoV) outbreak has been a serious concern around the globe. Since people are alarmed by the massive spread of the coronavirus across major parts of the world, it is necessary to predict the risk of this infectious disease. In this situation, we develop a model to measure the risk of infectious disease and predict the risk of 2019-nCoV transmission using data from four countries: the United States, Australia, Canada and China. Methods: The model assumes that the higher the population density, the higher the risk of human-to-human transmission of an infectious disease. Population size, case identification rate and the travel of infected passengers in different regions are also incorporated into the model. Results: According to the calculated risk index, our study identifies New York State in the United States (US) as the most vulnerable area affected by the novel coronavirus. Other areas (province/state/territory) such as Hubei (China, 2nd), Massachusetts (US, 3rd), District of Columbia (US, 4th), New Jersey (US, 5th), Quebec (Canada, 20th) and the Australian Capital Territory (Australia, 29th) are also found to be the riskiest areas in the US, China, Australia and Canada. Conclusion: The study suggests avoiding any kind of mass gathering, maintaining recommended physical distances and restricting inbound and outbound flights of highly risk-prone areas in order to tackle 2019-nCoV transmission.
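The abstract names the model's inputs but not its formula, so the sketch below is an invented weighted-index illustration of how such factors might combine, nothing more.

```python
# Hypothetical composite risk index: the abstract lists the factors
# (population density, population size, case identification rate, travel of
# infected passengers) but not its formula; weights and inputs are invented.
def risk_index(density, population, case_rate, infected_travellers,
               weights=(0.4, 0.2, 0.2, 0.2)):
    """Toy weighted sum; inputs are assumed pre-normalised to [0, 1]."""
    factors = (density, population, case_rate, infected_travellers)
    return sum(w * f for w, f in zip(weights, factors))

# Two made-up regions
print(risk_index(0.9, 0.8, 0.6, 0.9))   # dense hub, many imported cases
print(risk_index(0.2, 0.3, 0.4, 0.1))   # sparse region, little travel
```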
MEMS reliability analysis is a challenging area of research, comprising various physics of failure and diverse failure mechanisms. Reliability issues are critical in both the design and fabrication phases of MEMS devices, as their commercialization is still delayed by these problems. In this research, a hybrid methodology is developed for the reliability evaluation of MEMS devices. Its first step is the identification of dominant failure modes by FMEA, followed by the evaluation of failure mechanisms and an updated lifetime estimation by Bayesian methods. The reliability of MEMS devices is studied using probabilistic physics of failure (PPoF) by determining the dominant failure mechanism. Accordingly, a deterministic model is selected for the analysis of the life and reliability associated with the dominant failure mechanism. To convert the deterministic model into a probabilistic one, the uncertainty sources affecting the dielectric lifetime are determined, and the model is simulated using the Monte Carlo method. In the final stage, the results of the life estimation are updated using the Bayesian method. Considering the wide applications and advantages of RF MEMS capacitive switches, they were selected as a case study, and a framework was developed for evaluating the reliability of these switches against failures due to the stiction mechanism. The results include an FMEA table, lifetime estimates at different voltages and numbers of duty cycles, and, finally, updated life estimates obtained with the Bayesian method.
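A hedged sketch of the deterministic-to-probabilistic step only: a placeholder power-law lifetime model with uncertain parameters propagated by Monte Carlo (the model form and numbers are invented, not the paper's dielectric-charging physics).

```python
# Sketch: propagate parameter uncertainty through a placeholder lifetime
# model L(V) = A * V**(-n) by Monte Carlo (invented model and values).
import numpy as np

rng = np.random.default_rng(5)
n_samples = 200_000

A = rng.lognormal(np.log(1e9), 0.5, n_samples)   # scale factor (cycles)
n = rng.normal(2.5, 0.2, n_samples)              # voltage-acceleration exponent
V = 30.0                                         # actuation voltage (V)

life = A * V ** (-n)                             # lifetime in cycles
print(f"Median life: {np.median(life):.3e} cycles")
print(f"B10 life (10% failed): {np.quantile(life, 0.10):.3e} cycles")
```

The Bayesian updating stage the abstract mentions would then treat such simulated (or tested) lifetimes as evidence for revising the parameter distributions.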
Probabilistic methods to evaluate the seismic vulnerability of reinforced concrete (RC) frames are widely used in the context of performance-based design and assessment, often describing the structural response with global engineering demand parameters (EDPs) such as the maximum interstory drift. While such EDPs can synthetically describe the structural behavior, local EDPs are necessary to provide a more realistic and thorough description of the failure mechanisms of low-ductility frames lacking seismic details. The objective of this paper is to investigate viable probabilistic seismic demand models for local EDPs, which may be used in developing fragility curves for the assessment of low-ductility RC frames. The present work explores adequate regression models, probability distributions, and the variation of uncertainty in the demand models. In addition, the adequacy of several ground motion intensity measures (IMs) for the predictive modeling of local EDPs is investigated. A realistic benchmark three-story RC frame, representative of non-ductile buildings, is used as a case study to identify key considerations.
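For concreteness, the standard log-linear demand model commonly used in this literature is ln(EDP) = ln(a) + b·ln(IM) with lognormal dispersion; the sketch below fits it to synthetic data (placeholders, not the paper's analyses).

```python
# Sketch: fit the log-linear probabilistic seismic demand model
# ln(EDP) = ln(a) + b*ln(IM) and estimate the dispersion from residuals.
import numpy as np

rng = np.random.default_rng(11)
im = rng.uniform(0.05, 1.5, 200)                 # e.g. Sa(T1) in g
true_a, true_b, beta = 0.02, 1.1, 0.35
edp = true_a * im**true_b * rng.lognormal(0.0, beta, im.size)

b_hat, ln_a_hat = np.polyfit(np.log(im), np.log(edp), 1)  # OLS in log space
resid = np.log(edp) - (ln_a_hat + b_hat * np.log(im))
beta_hat = resid.std(ddof=2)                     # lognormal dispersion

print(f"a = {np.exp(ln_a_hat):.4f}, b = {b_hat:.3f}, beta = {beta_hat:.3f}")
```

The fitted (a, b, beta) triple is exactly what feeds a lognormal fragility curve for a chosen EDP capacity threshold.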
Objectives. We developed a survival scorecard from the Titanic passenger dataset using statistical scoring methods. We discuss the scorecard development process, assess the effectiveness of statistical scorecards, and analyze the characteristics of Titanic passengers that led to survival. Methods. From the Titanic dataset of 1,309 passengers and a binary dependent variable representing survival, we assessed nine (9) features using chi-square, Weight of Evidence (WoE), and Information Value. A logistic regression was fitted on the feature WoEs to predict survival, feature coefficients were used to determine score weights for each attribute, and an additive model on attribute scores per passenger determined survival scores. The resulting scorecards were assessed for risk ranking, and the characteristics of the passenger population were assessed for survivability and population shifts. Results. The resulting survival scorecard was able to rank survivability amongst Titanic passengers (K-S 0.558, ROC 0.892, CAP 0.765) with survival classification accuracy varying by score cutoff (AR 0.63-0.84). Sex was the strongest predictor of survivability (IV 145), followed by fare amount (IV 53), cabin class (IV 52), and passenger class (IV 51). Women passengers had four times higher survivability compared to men (72% vs. 16%). Passengers who paid $100 or more for their trip had nearly ten times higher survivability compared to free passengers (75% vs. 7.6%). Cabin passengers had higher survivability compared to non-cabin passengers with cabin B having nearly three times higher survivability compared to non-cabin passengers (75% vs. 27%). Class 1 passengers had nearly three times higher survivability compared to Class 3 passengers (62% vs. 22%). Conclusion. This paper illustrates the benefits of statistical scoring methods compared to other machine learning approaches in the analysis of event likelihood risks and performing population risk segmentation. Machine learning approaches usually focus on prediction accuracy while scorecards allow for cross-sectional analysis of population risks. Scorecard cutoffs provide avenues for decision-making on populations accounting for tradeoffs in accuracy, recall, precision, and specificity.
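The Weight of Evidence and Information Value calculations at the heart of the scorecard have standard definitions; the sketch below applies them to one feature with illustrative counts in the spirit of the dataset (not the paper's exact tallies).

```python
# Sketch: WoE and IV for one categorical feature with two attributes.
# WoE = ln(%good / %bad); IV = sum over attributes of (%good - %bad) * WoE.
import numpy as np

# (survivors, non-survivors) per attribute; illustrative counts
attrs = {"female": (339, 127), "male": (161, 682)}

tot_good = sum(g for g, _ in attrs.values())
tot_bad = sum(b for _, b in attrs.values())

iv = 0.0
for name, (good, bad) in attrs.items():
    pg, pb = good / tot_good, bad / tot_bad
    woe = np.log(pg / pb)
    iv += (pg - pb) * woe
    print(f"{name}: WoE = {woe:+.3f}")
print(f"Information Value = {iv:.3f}")
```

The raw definition above yields values near 1; the abstract's feature IVs (e.g., 145 for sex) appear to be quoted on a percentage-style scale, which would be roughly consistent.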
This paper discusses some features of modeling robotic technology modules. An adapted model of a PTM with a parallel structure is presented. With more than two machines with different handling times serviced by one robot, waiting-for-service losses accumulate, which affects the overall performance of the PTM.
The agri-food supply chain currently requires more attention because it relates to critical issues such as food supply availability and food security. The agri-food supply chain is vulnerable to risks that affect its performance and sustainability, and these risks propagate from upstream to downstream in the supply chain. This study aimed to identify risks and to develop a risk propagation model for the agri-food industry supply chain. The first objective is achieved by reviewing the relevant literature; the second through the development of a Bayesian Network (BN) model. The results of the study show that the main risks in the agri-food supply chain include natural risk, supply risk, process risk, quality risk, transportation risk, price risk, and demand risk. The model indicates the propagation of one risk to another through dependency links, and the risk network shows how any change in the probability of a node (risk) gradually affects other risks and contributes to an impact (loss). The model requires further development through a real study in a specific agri-food supply chain.
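A toy numeric illustration of propagation along one dependency chain (invented probabilities; the paper's network covers seven risk categories):

```python
# Toy propagation along a chain: natural risk -> supply risk -> loss.
# All probabilities are invented placeholders.
p_natural = 0.10                  # P(natural risk event)
p_supply_given_nat = 0.60         # P(supply disruption | natural event)
p_supply_given_not = 0.05         # baseline supply disruption probability
p_loss_given_supply = 0.40        # P(loss | supply disruption)
p_loss_given_not = 0.02

p_supply = (p_natural * p_supply_given_nat
            + (1 - p_natural) * p_supply_given_not)
p_loss = p_supply * p_loss_given_supply + (1 - p_supply) * p_loss_given_not
print(f"P(supply disruption) = {p_supply:.3f}, P(loss) = {p_loss:.3f}")
```

Raising `p_natural` and re-running shows the gradual downstream effect on the loss probability that the abstract describes.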
Under abnormal conditions, this research was conducted to measure the extent to which an optimistic or pessimistic outlook exists among students of Administration and Economics at the University of Mustansiriya, and how it varies by gender (male/female). To approach the targeted objectives of the research, an Optimism-Pessimism scale was applied to a mixed sample of 300 university students. Statistical treatment of the collected data revealed that about two thirds of the sample, exactly 194 (74.9%), range between optimistic and moderately pessimistic positions. This attitude predominates among males (62.7%) over females (37.3%).
The paper illustrates a probabilistic methodology for assessing the vulnerability of existing r.c. buildings with limited ductility capacity retrofitted by means of dissipative braces. The methodology is based on the development of single-component and system fragility curves before and after the retrofit. The proposed approach makes it possible to highlight possible changes in the most significant collapse modalities before and after the retrofit, and to evaluate the effectiveness of the retrofit by taking into account the probabilistic properties of the seismic behaviour of the considered systems. A benchmark two-dimensional reinforced concrete frame with low ductility capacity is considered as a case study. The frame is designed for gravity loads only and does not comply with modern anti-seismic code requirements. It is retrofitted by introducing elasto-plastic dissipative braces designed for different levels of their target base-shear capacity, following a design method involving pushover analysis of the system before and after the retrofit. The obtained results show that the use of braces yields a significant increase in seismic capacity, though an increased dispersion of the behaviour is observed in the retrofitted system.
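For reference, fragility curves of the kind developed here are conventionally summarized in lognormal form; the sketch below evaluates P(exceedance | IM) = Phi(ln(IM/theta)/beta) with placeholder median capacity and dispersion.

```python
# Sketch: lognormal fragility curve with placeholder parameters.
import numpy as np
from scipy.stats import norm

theta, beta = 0.45, 0.5      # median capacity (g) and lognormal dispersion

def fragility(im):
    """P(exceeding the damage state) at intensity measure im (in g)."""
    return norm.cdf(np.log(im / theta) / beta)

for im in (0.1, 0.3, 0.45, 0.8):
    print(f"IM = {im:.2f} g -> P(exceed) = {fragility(im):.3f}")
```

Comparing pre- and post-retrofit curves amounts to comparing their fitted (theta, beta) pairs: the abstract's finding reads as a larger theta (more capacity) alongside a larger beta (more dispersion).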
Just In Time (JIT) is a production organization system for factories, of Japanese origin, which basically focuses on limiting stored materials or products in order to meet current demand. In this article I will take the liberty of borrowing this concept to talk about something that is the result of long reflection after 35 years of experience in the mountains, more than 20 of which I have spent researching, analyzing, consulting and training in safety for the different matters that involve activities at height (sports, recreational, industrial, rescue, etc.), and, of course, of the great exchange with prestigious friends and colleagues whom I have the privilege and pleasure of having around the world.
Objective: This study analyses the effect of overall Brazilian mortality on the normal cost and the mathematical reserve. Background: The need to develop models arises in a context of major changes in mortality patterns and population composition, in which classical models often no longer hold the same predictive power, generally because of the constancy of the assumptions underlying their use (Oliveira et al., 2012). Regarding the behavior of mortality in recent times, levels have fallen far below what was predicted decades ago (Myrrha et al., 2014). The inconstancy of the event came to be seen as a problem for producing reliable estimates, attributed to random fluctuations, in which both deaths and population, the two components of the mortality rate, influence the estimates (Souza, 2014). Method: The AA Scale, the Extended Lee-Carter Model and the Multiple Quadratic Model are used, based on data made available by the IBGE tables (2011-2016), together with data from a fictitious own social security regime (Regime Próprio de Previdência Social, RPPS). Results: Among the main results: for both sexes, the probability of death by age fell annually between 2011 and 2016; for both sexes, there were age and period effects on mortality, but no cohort effect; there were longevity gains for both sexes; female mortality is lower than male mortality; the reduction in mortality is greater for men than for women, and for the elderly than for the young; the normal cost and the mathematical reserve derived from the AA and LC tables were underestimated relative to the observed 2016 table, while those derived from the MQM table were overestimated. Contributions: Although the present study is concerned with measuring the impact of different mortality estimates on actuarial calculations, their effects in fact echo across several areas: from a demographic perspective, mortality is one of the three components (together with fertility and migration) that determine the pace of population growth; from the standpoint of public policy, reductions in mortality in some or all age groups should guide the supply of public services; and from the social security perspective, mortality defines the relationship between active and inactive members. Thus, in a broader sense, it is possible to appreciate the importance of mortality studies and their numerous applications.
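For reference, the classical Lee-Carter specification that the extended model builds on (a standard fact, not a formula quoted from this paper):

```latex
% Lee-Carter model: log central death rate at age x in year t decomposed
% into an age profile a_x, age sensitivity b_x, and period index k_t,
% with the usual identifiability constraints.
\ln m_{x,t} = a_x + b_x \, k_t + \varepsilon_{x,t},
\qquad \sum_x b_x = 1, \quad \sum_t k_t = 0.
```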