Nonconglomerative coherent conditional probabilities in statistical inference
Conditional Random Quantities and Iterated Conditioning in the Setting of Coherence
Lecture Notes in Computer Science, 2013
We consider conditional random quantities (c.r.q.'s) in the setting of coherence. Given a numerical r.q. X and a non-impossible event H, based on a betting scheme we represent the c.r.q. X|H as the unconditional r.q. XH + µH^c, where µ is the prevision assessed for X|H and H^c is the negation of H. We develop some elements of an algebra of c.r.q.'s by giving a condition under which two c.r.q.'s X|H and Y|K coincide. We show that X|HK coincides with a suitable c.r.q. Y|K, and we apply this representation to the Bayesian updating of probabilities, also deepening some aspects of Bayes' formula. Then, we introduce a notion of iterated c.r.q. (X|H)|K, analyzing its relationship with X|HK. Our notion of iterated conditional cannot formalize Bayesian updating, but it has an economic rationale. Finally, we define coherence for prevision assessments on iterated c.r.q.'s and give an illustrative example.
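As a concrete reading of this representation (our gloss, using the abstract's notation with H also standing for its own indicator):
\[
X|H \;=\; XH + \mu H^c \;=\;
\begin{cases}
X & \text{if } H \text{ is true},\\
\mu & \text{if } H \text{ is false},
\end{cases}
\]
so the bet on X|H is called off, and the assessed prevision µ is returned, exactly when the conditioning event fails.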
On coherent conditional probabilities and disintegrations
Annals of Mathematics and Artificial Intelligence, 2002
Existence of coherent extensions of coherent conditional probabilities is one of the major merits of de Finetti's theory of probability. However, coherent extensions which satisfy some special property, like σ-additivity or disintegrability, can fail to exist. An example is given ...
Locally strong coherence and inference with lower-upper probabilities
Soft Computing, 2003
We introduce an operational way to reduce the spatial complexity of inference processes based on conditional lower–upper probability assessments. To reach this goal we must suitably exploit zero probabilities, taking into account the logical conditions characterizing locally strong coherence. We re-formulate for conditional lower–upper probabilities the notion of locally strong coherence already introduced for conditional precise probabilities. Thanks to this characterization, we avoid building all the atoms, so that several real problems become feasible; in fact, the real complexity problem is connected to the number of atoms. Since an inferential process with lower–upper probabilities must fulfil several sequences of constraints, our simplification can have either a "global" or a "partial" effect, being applicable to all or just to some of the sequences. The whole procedure has been implemented in the XLisp-Stat language. A comparison with other approaches is carried out through an example.
International Journal of Approximate Reasoning, 2013
We contrast Williams' and Walley's theories of coherent lower previsions in the light of conglomerability. These are two of the most credited approaches to a behavioural theory of imprecise probability. Conglomerability is the notion that distinguishes them the most: Williams' theory does not consider it, while Walley aims at embedding it in his theory. The question is important, as conglomerability has been a major point of disagreement at the foundations of probability ever since it was first defined by de Finetti in 1930. We show that Walley's notion of joint coherence (the single axiom of his theory) for conditional lower previsions does not take all the implications of conglomerability into account. Taking into account also some previous results in the literature, we deduce that Williams' theory should be the one to use when conglomerability is not required; for the opposite case, we define the new theory of conglomerably coherent lower previsions, which is arguably the one to use, and of which Walley's theory can be understood as an approximation. We show that this approximation is exact in two important cases: when all conditioning events have positive lower probability, and when the conditioning partitions are nested.
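For orientation, de Finetti's conglomerative property can be stated as follows (a standard precise-probability formulation, not quoted from the paper): a prevision P is conglomerable in a partition \(\mathcal{B}\) when
\[
\inf_{B \in \mathcal{B}} P(X \mid B) \;\le\; P(X) \;\le\; \sup_{B \in \mathcal{B}} P(X \mid B)
\]
for every bounded random quantity X; the imprecise case imposes an analogous requirement on the lower prevision.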
Conditional Probability and Defeasible Inference
Journal of Philosophical Logic, 2005
We offer a probabilistic model of rational consequence relations (Lehmann and Magidor, 1990) by appealing to the extension of the classical Ramsey-Adams test proposed by Vann McGee in (McGee, 1994). Previous and influential models of non-monotonic consequence relations have been produced in terms of the dynamics of expectations (Gärdenfors and Makinson, 1994), (Gärdenfors, 1993). 'Expectation' is a term of art in these models, which should not be confused with the notion of expected utility. The expectations of an agent are some form of belief weaker than absolute certainty. Our model offers a modified and extended version of an account of qualitative belief in terms of conditional probability, first presented in (van Fraassen, 1995). We use this model to relate probabilistic and qualitative models of nonmonotonic relations in terms of expectations. In doing so we propose a probabilistic model of the notion of expectation. We provide characterization results both for logically finite languages and for logically infinite, but countable, languages. The latter case shows the relevance of the axiom of countable additivity for our probability functions. We show that a rational logic defined over a logically infinite language can only be fully characterized in terms of finitely additive conditional probability.
How Sets of Coherent Probabilities May Serve as Models for Degrees of Incoherence
International Journal of Uncertainty, Fuzziness and Knowledge-based Systems, 2000
We introduce two indices for the degree of incoherence in a set of lower and upper previsions: the maximal rate of loss the incoherent bookmaker experiences in a Dutch Book, and the maximal rate of profit achieved by the gambler who makes a Dutch Book against the incoherent bookmaker. We report how efficient bookmaking against these two indices is achieved in the case of incoherent previsions for events on a finite partition, and for incoherent previsions that also include a simple random variable. We relate the epsilon-contamination model to efficient bookmaking in the case of the rate of profit.
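A toy computation may clarify the second index (our example; the paper's exact normalization of "rate" may differ). If the bookmaker incoherently posts \(P(A) = P(A^c) = 0.6\), a gambler who sells him a unit bet on each event collects \(0.6 + 0.6 = 1.2\) while paying out exactly \(1\), since exactly one of \(A\) and \(A^c\) occurs; the guaranteed profit is \(0.2\), a rate of \(0.2/1.2 = 1/6\) per unit staked.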
Against Probabilistic Measures of Coherence
Erkenntnis, 2005
It is shown that the probabilistic theories of coherence proposed up to now produce a number of counter-intuitive results. The last section provides some reasons for believing that no probabilistic measure will ever be able to adequately capture coherence. First, there can be no function whose arguments are nothing but tuples of probabilities, and which assigns different values to pairs of propositions {A, B} and {A, C} if A implies both B and C, or their negations, and if P(B)=P(C). But such sets may indeed differ in their degree of coherence. Second, coherence is sensitive to explanatory relations between the propositions in question. Explanation, however, can hardly be captured solely in terms of probability.
On the Coherence Between Probability and Possibility Measures
2007
The purpose of this paper is to study possibility and probability measures in continuous universes, taking a different line from the one proposed and dealt with by other authors. We study the coherence between the probability measure and the possibility measure determined by a function that is both a possibility density and a distribution function. For this purpose, we first examine functions that satisfy this condition and then analyze the coherence for some notable probability distributions.
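A simple instance of such a function (our illustration, not necessarily one analyzed in the paper): \(F(x) = x\) on \([0,1]\) is the distribution function of the uniform law and, since \(\sup_x F(x) = 1\), it is also a normalized possibility density, inducing the possibility measure \(\Pi(A) = \sup_{x \in A} F(x)\).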
Information Sciences, 2000
We refer to an arbitrary family H = {H_1, H_2, ..., H_n} of events (hypotheses), i.e., H has neither any particular algebraic structure nor is it a partition of the certain event Ω. We detect logical relations among the given events (the latter could represent some possible diseases), and some further information is carried by probability assessments, relative to an event E (e.g., a symptom) conditionally on some of the H_i's ("partial likelihood"). If we assess (prior) probabilities for the events H_i, then the ensuing problems are: (i) is this assessment coherent? (ii) is the partial likelihood coherent "per se"? (iii) is the global assignment (the initial one together with the likelihood) coherent? If the relevant answers are all YES, then we may try to "update" (coherently) the priors P(H_i) into the posteriors P(H_i|E). This is an instance of a more general issue, the problem of coherent extensions: a very particular case is Bayes' updating for exhaustive and mutually exclusive hypotheses, in which this extension is unique. In the general case the lack of uniqueness gives rise to upper and lower updated probabilities, and we could now update again the latter, given a new event F and a corresponding (possibly partial) likelihood. In this paper, many relevant features of this problem are discussed, keeping an eye on the distinction between semantic and syntactic aspects.
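In the very particular case recalled above, exhaustive and pairwise mutually exclusive hypotheses H_1, ..., H_n, the unique coherent extension is given by the standard Bayes formula
\[
P(H_i \mid E) \;=\; \frac{P(E \mid H_i)\, P(H_i)}{\sum_{j=1}^{n} P(E \mid H_j)\, P(H_j)},
\]
whereas with an arbitrary family H only upper and lower bounds for P(H_i|E) are determined.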
Coherent probability from incoherent judgment
Journal of Experimental Psychology: Applied, 2001
People often have knowledge about the chances of events but are unable to express the knowledge in the form of coherent probabilities. We propose to correct incoherent judgment via an optimization procedure that seeks the (coherent) probability distribution nearest to the judge's estimates of chance. Our method was applied to the chances of simple and complex meteorological events, as estimated by college undergraduates. No judge responded coherently but our optimization method found close (coherent) approximations to their estimates. Moreover, the approximations were reliably more accurate than the original estimates, as measured by the quadratic scoring rule. Methods for correcting incoherence facilitate the analysis of expected utility, and allow human judgment to be more easily exploited in the construction of expert systems.
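A minimal sketch of the nearest-coherent-distribution idea under a squared-error criterion (our reconstruction; the paper's distance measure, events, and solver may differ, and all names and numbers below are invented for illustration):

    import numpy as np
    from scipy.optimize import minimize

    # Atoms of a toy weather space, ordered: (rain&wind, rain&~wind, ~rain&wind, ~rain&~wind).
    # Each row marks the atoms making up a judged event: rain, wind, rain&wind, rain-or-wind.
    A = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [1, 0, 0, 0],
                  [1, 1, 1, 0]], dtype=float)
    q = np.array([0.5, 0.4, 0.3, 0.5])  # judged chances; incoherent, since 0.5 + 0.4 - 0.3 = 0.6 != 0.5

    # Seek the distribution over atoms whose event probabilities are nearest to the judgments q.
    res = minimize(lambda p: np.sum((A @ p - q) ** 2),
                   x0=np.full(4, 0.25),
                   bounds=[(0.0, 1.0)] * 4,
                   constraints=({'type': 'eq', 'fun': lambda p: p.sum() - 1.0},),
                   method='SLSQP')
    print("coherent atom probabilities:", res.x.round(3))
    print("corrected event estimates:  ", (A @ res.x).round(3))

Because any nonnegative assignment over the atoms summing to one satisfies the probability axioms, the event estimates recomputed from res.x are coherent by construction.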