Risk based uncertainty quantification to improve robustness of manufacturing operations
Related papers
Proc. of …, 2009
The need to assess robust performance for complex systems has led to a considerable rise in industrial interest in simulation challenges: treating and propagating uncertainty through complex physical and numerical simulation frameworks. The industrial stakes require that both the methodology and the numerical methods for uncertainty treatment be openly validated and enriched by the academic world and the certification authorities. A general methodology has emerged from the joint effort of industrial companies and academic institutions, and a list of well-established numerical methods has been selected to support this methodology. EDF R&D, EADS Innovation Works and PhiMECA have developed an open-source software platform dedicated to uncertainty treatment by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk 'N Statistics.
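The platform implements the generic quantify/propagate/analyse loop of the shared methodology. As a minimal illustration, the NumPy sketch below mirrors the plain Monte Carlo propagation step that OpenTURNS automates; the model `g` and the input distributions are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical deterministic simulation code: y = g(x1, x2).
# Stands in for any physics-based model whose inputs are uncertain.
def g(x1, x2):
    return x1 ** 2 + np.sin(x2)

# Step A of the methodology: quantify input uncertainty as distributions
# (assumed here: x1 ~ Normal(1.0, 0.1), x2 ~ Uniform(0, pi)).
n = 100_000
x1 = rng.normal(1.0, 0.1, n)
x2 = rng.uniform(0.0, np.pi, n)

# Step B: propagate the uncertainty through the model by Monte Carlo.
y = g(x1, x2)

# Step C: summarize the output uncertainty for the quantity of interest.
print(f"mean = {y.mean():.4f}, std = {y.std(ddof=1):.4f}")
print(f"95% quantile = {np.quantile(y, 0.95):.4f}")
```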
(Value, Risk)-Based Performance Evaluation of Manufacturing Processes
14th IFAC Symposium on Information Control Problems in Manufacturing, 2012
A value/risk-based performance evaluation framework is proposed for manufacturing processes at the industrialization phase of product development. Risk factors of the manufacturing process are identified through Failure Mode and Effect Analysis (FMEA) and then embedded in the process plan models. Modelling and simulation are then employed to determine the value a process plan can create and the risk it is exposed to. Alternative scenarios are developed, simulated and compared with a reference scenario. The methodology is illustrated with a case study drawn from parts manufacturing but is applicable to a wide range of other processes.
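FMEA conventionally ranks failure modes by a Risk Priority Number (RPN = severity × occurrence × detection). The sketch below, with invented failure modes and ratings, shows how such scores could be attached to process-plan steps before simulation; the paper's actual scoring and embedding scheme may differ.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str        # process-plan step the mode belongs to
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Conventional FMEA Risk Priority Number.
        return self.severity * self.occurrence * self.detection

# Invented example data for a machining process plan.
modes = [
    FailureMode("turning", severity=7, occurrence=3, detection=4),
    FailureMode("drilling", severity=5, occurrence=6, detection=2),
    FailureMode("deburring", severity=2, occurrence=4, detection=5),
]

# Rank failure modes so the riskiest steps get embedded in the
# simulation model (e.g. as scrap/rework branches) first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.step:10s} RPN = {m.rpn}")
```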
Characterizing and modelling uncertainties in production systems
Numerous decisions must be taken in the design, optimization and management of a production system or supply chain. To support these decisions, managers need to model their systems. Unfortunately, data can be imprecise at the design stage or can evolve during the operation of the system, and such data can have a deep impact on system performance. Decision-makers therefore have to take these uncertainties into account and use appropriate tools and methodologies to model them. In this paper we propose a conceptual framework for characterizing and modelling imperfect data. The goal of this framework is to provide a decision tool that helps users choose the most suitable modelling approach for their manufacturing system problems.
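Which representation fits depends on what is known about the data. As an illustration (not the paper's framework itself), the sketch below encodes the same uncertain processing time in three common ways: a crisp value, an interval for imprecision, and a probability distribution for variability; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# One uncertain quantity, three modelling choices:

# 1. Crisp value: ignores imperfection entirely.
t_crisp = 12.0  # minutes

# 2. Interval: imprecise knowledge at the design stage,
#    "somewhere between 10 and 15 minutes".
t_interval = (10.0, 15.0)

# 3. Probability distribution: variability observed during operation.
t_samples = rng.lognormal(mean=np.log(12.0), sigma=0.15, size=10_000)

# A downstream performance model reacts differently to each choice:
def throughput(cycle_time_min):
    return 60.0 / cycle_time_min  # parts per hour

print("crisp:          ", throughput(t_crisp))
print("interval:       ", (throughput(t_interval[1]), throughput(t_interval[0])))
print("stochastic mean: %.2f" % throughput(t_samples).mean())
```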
Risk Analysis, 2010
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, probability distributions of input event likelihoods are assumed instead. These distributions are often hard to come by and, even when available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. Two case studies demonstrate the approach.
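For intuition, the sketch below propagates triangular fuzzy probabilities through FTA-style AND/OR gates using interval arithmetic on alpha-cuts, under the usual independence assumption; the paper's dependency-coefficient extension and its evidence-theory treatment are not reproduced, and all event likelihoods are invented.

```python
import numpy as np

def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tfn
    return a + alpha * (m - a), b - alpha * (b - m)

def and_gate(cuts):
    # Output occurs only if all inputs occur: P = prod(p_i).
    # Monotone increasing in each p_i, so bounds map to bounds.
    return (np.prod([c[0] for c in cuts]),
            np.prod([c[1] for c in cuts]))

def or_gate(cuts):
    # Output occurs if any input occurs: P = 1 - prod(1 - p_i).
    return (1.0 - np.prod([1.0 - c[0] for c in cuts]),
            1.0 - np.prod([1.0 - c[1] for c in cuts]))

# Invented basic-event likelihoods as triangular fuzzy numbers.
pump_fails = (0.01, 0.02, 0.04)
valve_fails = (0.005, 0.01, 0.02)
alarm_fails = (0.02, 0.05, 0.10)

for alpha in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(e, alpha) for e in (pump_fails, valve_fails)]
    loss_of_flow = or_gate(cuts)                     # intermediate event
    top = and_gate([loss_of_flow, alpha_cut(alarm_fails, alpha)])
    print(f"alpha={alpha:.1f}: top event in [{top[0]:.5f}, {top[1]:.5f}]")
```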
Process Reliability and Six-Sigma
Reliability of manufacturing processes can be assessed from daily production data once process failure criteria are established. Results of the analysis are displayed on Weibull probability plots. Losses are categorized and identified for corrective action against a demonstrated production criterion, which gives a point estimate of the daily production value. Concepts from six-sigma methodology are used to establish the effective nameplate capacity rating for the process. The differences between the nameplate rating and the demonstrated production are labeled as efficiency and utilization losses.
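The core mechanic is plotting daily production on Weibull probability coordinates and reading off a characteristic production level. A minimal sketch of that fit, using median-rank plotting positions and invented production data, follows; the paper's loss-categorization and nameplate-rating steps are not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented daily production data (units/day); real data would come
# from the plant historian, filtered by the process failure criteria.
production = rng.weibull(8.0, 250) * 1000.0

# Weibull probability plot: sort ascending, assign median ranks
# (Benard's approximation), and fit a line in transformed coordinates.
x = np.sort(production)
n = len(x)
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

X = np.log(x)                      # ln(production)
Y = np.log(-np.log(1.0 - ranks))   # ln(-ln(1 - F))

beta, intercept = np.polyfit(X, Y, 1)   # slope = Weibull shape
eta = np.exp(-intercept / beta)         # scale = characteristic value

print(f"shape beta = {beta:.2f} (low beta => erratic production)")
print(f"characteristic production eta = {eta:.0f} units/day")
```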
Uncertainty aspects in process safety analysis
Journal of Loss Prevention in the Process Industries, 2010
Uncertainties in input data, as well as in the simulation models used in process safety analysis (PSA), are key issues in the application of risk analysis results. They are mostly connected with the incomplete and uncertain identification of a representative accident scenario (RAS) and with other vague and ambiguous information required for assessing particular elements of risk, especially the frequency and the severity of the consequences of the RAS. The authors discuss the sources and types of uncertainties encountered in PSA and methods to deal with them. Several approaches can improve such analysis, including sensitivity analysis, expert methods, statistics and fuzzy logic: the statistical approach uses probability distributions of the input data, whereas the fuzzy approach uses fuzzy sets. This paper takes the fuzzy approach and presents a proposal for fuzzy risk assessment. It combines a traditional part, where methods within process hazard analysis (PHA) are used, with a quantitative "fuzzy part", where a fuzzy logic system (FLS) is involved. It covers the frequency and severity of the consequences of the RAS and the risk evaluation. In addition, a new element called the risk correction index (RCI) is introduced to account for uncertainty in the identification of the RAS. Preliminary tests confirmed that the final risk index is determined more precisely and realistically.
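To make the "fuzzy part" concrete, here is a minimal Mamdani-style fuzzy risk index in Python: frequency and severity are fuzzified, a rule base is fired with min, aggregated with max, and defuzzified by centroid. The membership functions, rule base, and the multiplicative use of an RCI-like factor are all invented for illustration; the paper's FLS will differ in detail.

```python
import numpy as np

def tri(x, a, m, b):
    """Triangular membership function with peak at m."""
    return np.maximum(np.minimum((x - a) / (m - a), (b - x) / (b - m)), 0.0)

# Invented linguistic terms on normalized 0..1 universes (shoulders
# extended past [0, 1] so edge values get full membership).
terms = {"low": (-0.5, 0.0, 0.5), "med": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.5)}

# Invented rule base: (frequency term, severity term) -> risk term.
rules = {
    ("low", "low"): "low",  ("low", "med"): "low",   ("low", "high"): "med",
    ("med", "low"): "low",  ("med", "med"): "med",   ("med", "high"): "high",
    ("high", "low"): "med", ("high", "med"): "high", ("high", "high"): "high",
}

def risk_index(freq, sev, rci=1.0):
    z = np.linspace(0.0, 1.0, 201)          # risk universe
    aggregated = np.zeros_like(z)
    for (f_t, s_t), r_t in rules.items():
        # Mamdani inference: fire each rule with min, clip the
        # consequent, and aggregate with max.
        w = min(tri(freq, *terms[f_t]), tri(sev, *terms[s_t]))
        aggregated = np.maximum(aggregated, np.minimum(w, tri(z, *terms[r_t])))
    crisp = (z * aggregated).sum() / aggregated.sum()   # centroid
    # RCI-like correction: inflate risk when RAS identification is
    # itself uncertain (rci > 1), capped at 1.
    return min(crisp * rci, 1.0)

print(risk_index(freq=0.3, sev=0.8))            # base fuzzy risk index
print(risk_index(freq=0.3, sev=0.8, rci=1.2))   # with RAS uncertainty
```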
Tolerance allocation: A reliability based optimisation approach
Procedia Manufacturing, 2020
Tolerance analysis and allocation are two activities of great importance in product development. The mathematical formulation of the latter concerns the establishment and solution of a constrained optimisation problem. In this work, taking one step further, a probabilistic framework is developed and the tolerance synthesis problem is reformulated as a reliability-based optimisation problem by introducing probabilistic constraints. Advanced reliability methods are merged with professional computer-aided tolerancing tools to estimate the distribution of the assembly key characteristic. Cost-tolerance relationships based on the variability of the manufacturing resources, rather than on empirical formulas, were adopted in a process-based cost modelling methodology. The suggested framework is compared with the classical tolerance allocation approaches of the worst-case scenario and the root sum square. It was found that, despite the increased computational cost, further relaxation of the design tolerances can be achieved using reliability-based optimisation techniques, driving down the product cost.
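To see why probabilistic constraints buy relaxation, the sketch below compares the two classical stack-up rules with a Monte Carlo estimate of the assembly key characteristic for an invented linear tolerance chain; the paper's CAT-tool integration and cost model are out of scope here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented linear stack: gap deviation = sum of dimension deviations.
tol = np.array([0.05, 0.08, 0.03, 0.06])   # mm, symmetric tolerances
spec = 0.15                                 # mm, allowed gap deviation

# Worst case: every dimension simultaneously at its limit.
wc = tol.sum()

# Root sum square: statistical stack assuming independent,
# centered variation.
rss = np.sqrt((tol ** 2).sum())

# Reliability view: estimate P(|gap| > spec) by Monte Carlo,
# assuming each dimension ~ Normal with tolerance = 3 sigma.
n = 1_000_000
devs = rng.normal(0.0, tol / 3.0, size=(n, len(tol))).sum(axis=1)
p_fail = np.mean(np.abs(devs) > spec)

print(f"worst case stack : {wc:.3f} mm (> spec -> reject tolerances)")
print(f"RSS stack        : {rss:.3f} mm")
print(f"estimated P(fail): {p_fail:.2e} (probabilistic constraint)")
```

With these invented numbers the worst case rejects the tolerance set while the probabilistic failure estimate stays small, which is exactly the slack a reliability-based formulation can trade for cheaper tolerances.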
Should industrial uncertainty analysis be Bayesian?
2011
Quantitative simulation is probably the major tool of industrial R&D studies. Computer codes are widely used to predict the behavior and the reliability of a complex system in given operating conditions or, in the design phase, to ensure that it will fulfill given required performances. Whatever their complexity, quantitative models are essentially physics-based and, consequently, deterministic. On the other hand, their inputs and/or the code itself may be affected by uncertainties of various nature which affect the final results and must be properly taken into account. Uncertainty analysis has gained more and more importance in industrial practice in recent years and is at the heart of several working groups and funded projects involving industrial and academic researchers. We present hereby how the common industrial approach to uncertainty analysis can be formulated in a full Bayesian and decisional setting. First, we will introduce the methodological approach to industrial unce...
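As a toy version of the decisional setting described, the sketch below performs a conjugate Bayesian update of an uncertain model input from a handful of invented measurements and then propagates the posterior through a stand-in deterministic code; it illustrates the structure of the argument, not the paper's full formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for a deterministic physics-based code y = g(theta).
def g(theta):
    return 4.0 * theta ** 1.5

# Prior on the uncertain input: theta ~ Normal(mu0, s0); measurement
# noise std s is assumed known (all values invented).
mu0, s0 = 2.0, 0.5
s = 0.3

data = np.array([2.31, 2.18, 2.45, 2.27])   # invented observations

# Conjugate normal-normal update of the posterior on theta.
n = len(data)
post_var = 1.0 / (1.0 / s0**2 + n / s**2)
post_mu = post_var * (mu0 / s0**2 + data.sum() / s**2)

# Propagate the *posterior* uncertainty through the code.
theta = rng.normal(post_mu, np.sqrt(post_var), 50_000)
y = g(theta)
print(f"posterior theta: {post_mu:.3f} +/- {np.sqrt(post_var):.3f}")
print(f"predicted y: mean {y.mean():.2f}, 95% CI "
      f"[{np.quantile(y, 0.025):.2f}, {np.quantile(y, 0.975):.2f}]")
```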
Management of measurement uncertainty for effective statistical process control
… and Measurement Technology …, 2002
In the context of quality assurance strategies, statistical process control techniques and conformance testing are necessary to perform a correct quality audit of process outcomes. However, data collection is based on measurements, and every measurement is intrinsically affected by uncertainty. Even if the adopted instruments are in a state of metrological confirmation, random and systematic measurement errors cannot be completely eliminated. Moreover, wrong measurement-based decisions can seriously decrease company profits because of higher repair and shipping costs, as well as the loss of reputation due to customer dissatisfaction. This paper presents a theoretical analysis aimed at estimating the growth in decisional risk due to both random and systematic errors. It also provides some useful guidelines on choosing the Test Uncertainty Ratio (TUR) of industry-rated measurement instruments so as to bound the risk of making wrong decisions below a preset maximum value.
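The decisional risks in question are the false-accept and false-reject probabilities at a conformance test. The Monte Carlo sketch below estimates both as a function of the TUR, using one common convention for the ratio and invented spec limits and process parameters; the paper's analytical treatment and its handling of systematic errors are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

lsl, usl = 9.0, 11.0              # invented spec limits
proc_mu, proc_sigma = 10.0, 0.4   # invented process distribution

def decision_risks(tur, n=1_000_000):
    # One common convention: TUR = (USL - LSL) / (2 * U), with
    # expanded uncertainty U = 2 * u (k = 2). Solve for the
    # measurement-error std u.
    u = (usl - lsl) / (4.0 * tur)

    true = rng.normal(proc_mu, proc_sigma, n)
    measured = true + rng.normal(0.0, u, n)   # random error only

    conforming = (lsl <= true) & (true <= usl)
    accepted = (lsl <= measured) & (measured <= usl)

    false_accept = np.mean(~conforming & accepted)   # consumer risk
    false_reject = np.mean(conforming & ~accepted)   # producer risk
    return false_accept, false_reject

for tur in (2.0, 4.0, 10.0):
    fa, fr = decision_risks(tur)
    print(f"TUR {tur:4.1f}: false accept {fa:.4f}, false reject {fr:.4f}")
```

Raising the TUR shrinks both risks at the cost of a more expensive instrument, which is the trade-off the paper's guidelines are meant to bound.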