The compilation of decision models

Decision making involving imperfect knowledge

IEEE Transactions on Systems, Man, and Cybernetics, 1993

The human capability of making complex decisions is one of the most fascinating facets of human intelligence, especially if vague, judgemental, default, or uncertain knowledge is involved. This paper focuses on the automation of this capability. It advocates treating rule-based decision making as a problem of evidence combination: rules are considered to provide evidence for or against a certain decision; after the rules have been processed, the evidence associated with the individual decisions is combined and the best decision is selected. Advantages and problems of this approach are discussed, and experiences gained from applying it in a Computer Bridge project are surveyed. Finally, a language construct, called a decision block, that facilitates the automation of rule-based decision making involving imperfect knowledge is introduced. The benefits of decision blocks for writing complex decision support systems are evaluated, and their integration with a rule-based forward-chaining language is discussed.
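To make the evidence-combination view concrete, the following minimal Python sketch (entirely an illustration, not the paper's decision-block construct or syntax) lets rules contribute signed evidence weights to candidate decisions and then selects the decision with the strongest combined support; the bridge-bidding facts, rule conditions, and weights are invented.

```python
# Illustrative sketch (not the paper's construct): rules contribute signed
# evidence weights to candidate decisions; the block combines them additively
# and returns the best-supported decision.

class DecisionBlock:
    def __init__(self, candidates):
        # Start every candidate decision with zero accumulated evidence.
        self.evidence = {c: 0.0 for c in candidates}

    def rule(self, condition, decision, weight):
        # A rule fires on a fact base and adds evidence for (weight > 0)
        # or against (weight < 0) a candidate decision.
        def apply(facts):
            if condition(facts):
                self.evidence[decision] += weight
        return apply

    def decide(self, facts, rules):
        for r in rules:
            r(facts)
        # Select the decision with the strongest combined evidence.
        return max(self.evidence, key=self.evidence.get)


# Hypothetical bridge-bidding style usage with invented weights.
block = DecisionBlock(["bid_1NT", "pass"])
rules = [
    block.rule(lambda f: f["hcp"] >= 15, "bid_1NT", 0.8),
    block.rule(lambda f: f["hcp"] < 12, "pass", 1.0),
    block.rule(lambda f: not f["balanced"], "bid_1NT", -0.5),
]
print(block.decide({"hcp": 16, "balanced": True}, rules))  # -> bid_1NT
```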

Implementable Prescriptive Decision Making

2016

The need to inspect (ir)rationality in decision making (DM), i.e. the observed discrepancy between real and prescriptive DM, stems from the omnipresence of DM in the lives of individuals and society. Active approaches try to diminish this discrepancy either by changing the behaviour of participants (DM subjects) or by modifying prescriptive theories, as is done in this text. The text provides the core of a unified merging methodology for probabilities, serving knowledge fusion and information sharing in cooperative DM. Specifically, it unifies merging methodologies that support flat cooperation among interacting, self-interested DM participants. These participants act without a facilitator and are unwilling to spend non-negligible deliberation effort on merging. They are assumed to solve their DM tasks via the fully probabilistic design (FPD) of decision strategies. This choice is motivated by the fact that FPD is axiomatically justified and extends standard Bayesian DM. Merging is itself a supporting DM task and is also solved via FPD. The proposed merging formulation aims to be as general as possible without entering into the technicalities of measure theory. The results generalise and unify earlier work and open a pathway to systematic solutions of specific, less general problems.
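As a rough illustration of what a flat, facilitator-free merge of participants' beliefs can look like, the sketch below uses a weighted geometric (log-linear) pool over a shared discrete outcome set. The pooling operator, weights, and example distributions are assumptions made for illustration; they are not the paper's FPD-based merging operator.

```python
import numpy as np

# Minimal sketch (an assumption, not the paper's exact operator): merge the
# probability distributions of several participants over the same discrete
# outcome set with a weighted geometric (log-linear) pool, the kind of
# KL-divergence-motivated merge that FPD-style formulations lead to.

def merge_geometric(distributions, weights):
    distributions = np.asarray(distributions, dtype=float)  # shape (k, n)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    # Weighted product of the sources, then renormalise.
    merged = np.prod(distributions ** weights[:, None], axis=0)
    return merged / merged.sum()

# Two self-interested participants sharing beliefs over three outcomes.
p1 = [0.7, 0.2, 0.1]
p2 = [0.4, 0.4, 0.2]
print(merge_geometric([p1, p2], weights=[0.5, 0.5]))
```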

Individualizing generic decision models using assessments as evidence

Journal of Biomedical Informatics, 2005

Complex decision models in expert systems often depend upon a number of utilities and subjective probabilities for an individual. Although these values can be estimated for entire populations or demographic subgroups, a model should be customized to the individual's specific parameter values. This process can be onerous and inefficient for practical decisions. We propose an interactive approach for incrementally improving our knowledge about a specific individual's parameter values, including utilities and probabilities, given a decision model and a prior joint probability distribution over the parameter values. We define the concept of value of elicitation and use it to determine dynamically the next most informative elicitation for a given individual. We evaluated the approach using an example model and demonstrated that we can improve decision quality by focusing on those parameter values most material to the decision.
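A minimal sketch of the general idea, with a toy two-action model and invented parameter names: the value of eliciting one parameter is approximated as the expected gain in decision quality from learning it exactly (a Monte Carlo, partial-EVPI-style estimate). This illustrates the concept only and is not the paper's elicitation algorithm.

```python
import numpy as np

# Toy model: two actions whose utilities depend on an individual's disease
# probability p and treatment-burden utility u (both names and numbers are
# invented for illustration).

rng = np.random.default_rng(0)
ACTIONS = ("treat", "wait")

def utility(action, p, u):
    return p * 0.9 + (1 - p) * u if action == "treat" else p * 0.3 + (1 - p) * 1.0

def decision_value(p_vals, u_vals):
    # Expected utility of the single best action under the given uncertainty.
    return max(np.mean([utility(a, p, u) for p, u in zip(p_vals, u_vals)])
               for a in ACTIONS)

def value_of_elicitation(p_samples, u_samples, elicit):
    base = decision_value(p_samples, u_samples)
    informed = []
    for i in range(len(p_samples)):
        if elicit == "p":   # p becomes known exactly, u stays at its prior
            informed.append(decision_value(np.full_like(u_samples, p_samples[i]), u_samples))
        else:               # u becomes known exactly, p stays at its prior
            informed.append(decision_value(p_samples, np.full_like(p_samples, u_samples[i])))
    return np.mean(informed) - base

p_prior = rng.beta(2, 5, size=200)        # prior over disease probability
u_prior = rng.uniform(0.4, 0.9, 200)      # prior over treatment-burden utility
for param in ("p", "u"):
    print(param, value_of_elicitation(p_prior, u_prior, param))
```

The parameter with the larger value would be elicited next, and the procedure repeated until further elicitation no longer changes the preferred action materially.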

Strategies for Combining Decision Procedures

2003

Implementing efficient algorithms for combining decision procedures has been a challenge, and establishing their correctness has proven precarious. In this paper we describe an inference system that has the classical Nelson-Oppen procedure at its core and includes several optimizations: variable abstraction with sharing, canonization of terms at the theory level, and Shostak's streamlined generation of new equalities for theories with solvers. The transitions of our system are fine-grained enough to model most of the mechanisms currently used in designing combination procedures. In particular, with a simple language of regular expressions we are able to describe several combination algorithms as strategies for our inference system, from the basic Nelson-Oppen procedure to the highly optimized one recently given by Shankar and Rueß. Presenting the basic system at a high level of generality and non-determinism allows transparent correctness proofs that can be extended in a modular fashion when new features are introduced into the system.
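For readers unfamiliar with variable abstraction, the sketch below purifies a mixed term by introducing fresh shared variables for alien subterms, so that every resulting constraint belongs to a single theory. The term representation and theory tagging are assumptions and are far simpler than the paper's inference system.

```python
# Illustrative sketch of the variable-abstraction (purification) step used in
# Nelson-Oppen style combination: mixed terms are split into theory-pure parts
# by introducing fresh shared variables for alien subterms.

from itertools import count

ARITH = {"+", "-", "*"}            # symbols of the arithmetic theory
fresh = (f"v{i}" for i in count())

def theory_of(term):
    return "arith" if isinstance(term, tuple) and term[0] in ARITH else "uf"

def purify(term, parent_theory, defs):
    if not isinstance(term, tuple):            # a variable or constant is pure
        return term
    op, *args = term
    this_theory = theory_of(term)
    args = [purify(a, this_theory, defs) for a in args]
    pure = (op, *args)
    if this_theory != parent_theory:
        v = next(fresh)                        # abstract the alien subterm
        defs.append((v, pure))
        return v
    return pure

# Purify f(x + y) = g(x) into theory-pure constraints over shared variables.
defs = []
lhs = purify(("f", ("+", "x", "y")), "uf", defs)
rhs = purify(("g", "x"), "uf", defs)
print("equality:", lhs, "=", rhs)   # f(v0) = g(x)
print("definitions:", defs)          # [('v0', ('+', 'x', 'y'))]
```

After purification, each theory's decision procedure works on its own pure constraints and the procedures exchange equalities between the shared variables.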

Inferential modelling and decision making with data

2017 23rd International Conference on Automation and Computing (ICAC), 2017

In this paper, we introduce the main concepts of a new maximum likelihood evidential reasoning (MAKER) framework for data-driven inferential modelling and decision making under different types of uncertainty. It consists of two types of model: a state space model (SSM) and an evidence space model (ESM), both driven by data that reflect the relationships between system inputs and outputs. The SSM is constructed to describe different system states and their changes. The ESM is established by mapping data to a set of evidence that is partitioned into evidential elements, each pointing to a set of system states, which together represent system behaviours in a probabilistic and distributed manner. The reliability of evidence and the interdependence between pairs of evidence are explicitly measured. It is in the joint evidence-state space that multiple pieces of evidence with different degrees of interdependence and reliability are acquired from system inputs and combined to infer system outputs. A general optimal learning model is constructed, in which evidence reliability can be learnt from historical data by maximising the likelihood of the true state. In the MAKER framework, different types of uncertainty can be taken into account for inferential modelling, probabilistic prediction, and decision making.
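A heavily simplified illustration of reliability-weighted evidence combination (not the MAKER algorithms themselves): each piece of evidence is taken as a distribution over system states, discounted towards the uniform distribution according to its reliability, and independent pieces are fused by a normalised pointwise product. All states, distributions, and reliabilities below are invented.

```python
import numpy as np

# Simplified illustration only: discount each piece of evidence by its
# reliability towards a vacuous (uniform) distribution, then fuse the
# discounted pieces by normalised pointwise product (Bayes-style fusion).

def discount(dist, reliability):
    dist = np.asarray(dist, dtype=float)
    uniform = np.full_like(dist, 1.0 / len(dist))
    return reliability * dist + (1.0 - reliability) * uniform

def combine(evidence, reliabilities):
    fused = np.ones(len(evidence[0]))
    for dist, r in zip(evidence, reliabilities):
        fused *= discount(dist, r)
    return fused / fused.sum()

# Three system states; two pieces of evidence acquired from different inputs.
e1 = [0.7, 0.2, 0.1]
e2 = [0.5, 0.3, 0.2]
print(combine([e1, e2], reliabilities=[0.9, 0.6]))
```

In the framework described above, such reliabilities would not be fixed by hand but learnt from historical data by maximising the likelihood of the true state.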

An empirical test of the evidential reasoning approach’s synthesis axioms

Expert Systems with Applications, 2012

This paper reports results from two empirical tests of the descriptive validity of the synthesis axioms used by the evidential reasoning (ER) approach to aggregate performance over multiple criteria. These show that evaluations which invoke the axioms frequently violate them. The two most systematic aspects of the violations are that aggregate evaluations tend to be more favourable than basic evaluations, and that small amounts of ignorance on one attribute may be compensated for by complete assessments on other attributes. The implications for prescriptive use of the ER approach are discussed and some practical assessment procedures are suggested. The ER approach is computationally simple to implement (special-purpose software is readily available) and many applications have been reported in the literature. At the heart of the approach is an algorithm for aggregating information across different criteria (hereafter referred to as the ER algorithm). To draw an analogy with decision methods based on utility functions: the belief functions act like marginal utility functions and contain information about intra-criterion preferences, while the ER algorithm acts like the additive, multiplicative, or multilinear aggregation of marginal performance over criteria.
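For reference, the sketch below implements the recursive ER aggregation step in the form commonly attributed to Yang and Xu; the criterion weights and belief degrees are invented for illustration and are unrelated to the empirical tests reported in the paper.

```python
import numpy as np

# Sketch of the recursive ER aggregation of belief degrees over criteria.
# Inputs are invented for illustration only.

def er_aggregate(beliefs, weights):
    # beliefs: (L criteria) x (N grades) matrix of basic belief degrees,
    # each row summing to at most 1 (the remainder is local ignorance).
    beliefs = np.asarray(beliefs, dtype=float)
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    L, N = beliefs.shape

    # Basic probability masses for the first criterion.
    m = weights[0] * beliefs[0]
    m_bar = 1.0 - weights[0]                         # unassigned due to weight
    m_tilde = weights[0] * (1.0 - beliefs[0].sum())  # unassigned due to ignorance

    for i in range(1, L):
        mi = weights[i] * beliefs[i]
        mi_bar = 1.0 - weights[i]
        mi_tilde = weights[i] * (1.0 - beliefs[i].sum())
        mi_H = mi_bar + mi_tilde
        m_H = m_bar + m_tilde
        # Normalising factor removes the conflict between different grades.
        k = 1.0 / (1.0 - (np.outer(m, mi).sum() - np.dot(m, mi)))
        m_new = k * (m * mi + m * mi_H + m_H * mi)
        m_tilde = k * (m_tilde * mi_tilde + m_tilde * mi_bar + m_bar * mi_tilde)
        m_bar = k * (m_bar * mi_bar)
        m = m_new

    # Back-transform masses into combined belief degrees.
    beta = m / (1.0 - m_bar)
    beta_H = m_tilde / (1.0 - m_bar)                 # remaining global ignorance
    return beta, beta_H

# Three criteria assessed on three grades (poor, average, good).
beliefs = [[0.0, 0.3, 0.7],
           [0.1, 0.5, 0.4],
           [0.0, 0.2, 0.6]]   # last row: 0.2 unassigned (partial ignorance)
print(er_aggregate(beliefs, weights=[0.4, 0.3, 0.3]))
```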

Modeling by construction: a new methodology for constructing models for decision support

Proceedings of HICSS-29: 29th Hawaii International Conference on System Sciences, 1996

When comparing the fields of model building and model solving/analyzing, it is the latter that is more widely researched and understood. This is particularly true with respect to the construction of solving-paradigm-free representations (i.e. representations not created for a particular solution method such as linear programming), which are needed as problem descriptions in general models for decision support. The process of formulating such models is poorly supported, both conceptually and technically. In this paper we focus on the conceptual side. Model building is viewed as a creative act that requires constructive effort and is not merely a mapping of reality. We provide a framework for the step-by-step conception and formulation of general models for decision support in an easy-to-use and problem-driven manner. Modeling by construction is based on a theoretical 'concept of independence' whose definition is derived directly from the element types of a problem. It is shown that a distinction between semantic and conceptual modeling can be argued on this basis. This concept of independence is the foundation for the definition of eight modeling levels, which can function as components of the modeling process. We discuss these levels in detail and supply an example that uses a graphical modeling language carefully suited to them. Some implications for the transformation operations of the graphical model are considered. We view our methodology not as a replacement for Geoffrion's structured modeling, but as a complementary approach to the difficult task of general decision model construction.

Making Decisions: Using Bayesian Nets and MCDA

Knowledge Based Systems, 2001

Bayesian Belief Nets (BBNs) have proven to be an extremely powerful technique for reasoning under uncertainty. We have used them in a range of real applications concerned with predicting properties of critical systems. In most of these applications we are interested in a single attribute of the system, such as safety or reliability. Although such BBNs provide important support for decision making, in many circumstances we need to make decisions based on multiple criteria. For example, a BBN for predicting the safety of a critical system cannot on its own be used to decide whether or not the system should be deployed, because such a decision must be based on criteria other than just safety (cost, politics, and environmental factors being obvious examples). In such situations the BBN must be complemented by other decision-making techniques such as those of multi-criteria decision aid (MCDA). In this article we explain the role of BBNs in such decision making and describe a generic decision-making procedure that uses BBNs and MCDA in a complementary way. The procedure consists of identifying the objective and perspective for the decision problem, as well as the stakeholders. This in turn leads to a set of possible actions and a set of criteria and constraints. We distinguish between uncertain and certain criteria. The BBN links all the criteria and enables us to calculate a value (within some probability distribution in the case of the uncertain criteria) for each criterion for a given action. This means that we can apply traditional MCDA techniques to combine the values for a given action and then to rank the set of actions. The techniques described are demonstrated on real examples, including a safety assessment example that is being used by a major transportation organisation.
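A toy sketch of the complementary use of the two techniques (all names and numbers are invented, not the paper's procedure or case study): a tiny hand-coded Bayesian net predicts the uncertain criterion (safety) for each candidate action, and a weighted-sum MCDA step combines it with a certain criterion (cost) to rank the actions.

```python
# Illustrative toy example: a two-node Bayesian net (fault -> safe) supplies
# the uncertain safety criterion per action; a weighted sum then combines it
# with the certain cost criterion to rank candidate deployment actions.

P_FAULT = 0.1
P_SAFE = {  # (fault, action) -> probability the system is safe
    (True,  "deploy_now"):      0.60,
    (False, "deploy_now"):      0.95,
    (True,  "deploy_hardened"): 0.85,
    (False, "deploy_hardened"): 0.99,
}
COST = {"deploy_now": 1.0, "deploy_hardened": 3.0}   # certain criterion

def p_safe(action):
    # Marginalise the fault node out of the two-node net.
    return (P_FAULT * P_SAFE[(True, action)]
            + (1 - P_FAULT) * P_SAFE[(False, action)])

def mcda_score(action, w_safety=0.7, w_cost=0.3):
    # Normalise cost to [0, 1] (lower cost is better) and take a weighted sum.
    cost_score = 1.0 - COST[action] / max(COST.values())
    return w_safety * p_safe(action) + w_cost * cost_score

ranking = sorted(COST, key=mcda_score, reverse=True)
print([(a, round(mcda_score(a), 3)) for a in ranking])
```

In a realistic setting the BBN would of course be far larger and would return full probability distributions for the uncertain criteria, but the division of labour is the same: the net quantifies uncertainty per criterion, and the MCDA step combines criteria and ranks the actions.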