On a simple method for testing independencies in Bayesian networks

An empirical study of testing independencies in Bayesian networks using rp-separation

International Journal of Approximate Reasoning, 2018

Directed separation (d-separation) played a fundamental role in the founding of Bayesian networks (BNs) and continues to be useful today in a wide range of applications. Given an independence to be tested, current implementations of d-separation explore the active part of a BN. On the other hand, an overlooked property of d-separation implies that d-separation need only consider the relevant part of a BN. We propose a new method for testing independencies in BNs, called relevant path separation (rp-separation), which explores the intersection between the active and relevant parts of a BN. Favourable experimental results are reported.
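Both the active and relevant parts mentioned above are defined relative to an underlying d-separation test. As a point of reference (this is the classic criterion, not the paper's rp-separation), a minimal sketch of d-separation via the standard ancestral-moral-graph construction, with an assumed parent-list representation of the DAG:

```python
from collections import deque

def d_separated(parents, x, y, z):
    """Test whether x and y are d-separated given z in a DAG.

    parents: dict mapping node -> list of parent nodes.
    Classic criterion: restrict to ancestors of {x, y} and z,
    moralize (marry co-parents, drop directions), delete z,
    and check whether x can still reach y.
    """
    # Collect the ancestral set of the query nodes (inclusive).
    relevant, stack = set(), [x, y, *z]
    while stack:
        n = stack.pop()
        if n not in relevant:
            relevant.add(n)
            stack.extend(parents.get(n, []))
    # Build the moralized, undirected subgraph over that set.
    adj = {n: set() for n in relevant}
    for child in relevant:
        ps = parents.get(child, [])
        for p in ps:                      # parent-child edges
            adj[p].add(child); adj[child].add(p)
        for i in range(len(ps)):          # marry co-parents
            for j in range(i + 1, len(ps)):
                adj[ps[i]].add(ps[j]); adj[ps[j]].add(ps[i])
    # Remove the conditioning set and test reachability x -> y.
    blocked = set(z)
    seen, queue = {x}, deque([x])
    while queue:
        n = queue.popleft()
        if n == y:
            return False                  # an active path survives
        for m in adj[n]:
            if m not in seen and m not in blocked:
                seen.add(m); queue.append(m)
    return True
```

On the v-structure A -> C <- B, `d_separated({'C': ['A', 'B']}, 'A', 'B', [])` is true, while conditioning on the collider C activates the path and makes it false.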

Context-specific independence in Bayesian networks

Proceedings of the Twelfth …, 1996

Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms. It is well-known, however, that there are certain independencies that we cannot capture qualitatively within the Bayesian network structure: independencies that hold only in certain contexts, i.e., given a specific assignment of values to certain variables. In this paper, we propose a formal notion of context-specific independence (CSI), based on regularities in the conditional probability tables (CPTs) at a node. We present a technique, analogous to (and based on) d-separation, for determining when such independence holds in a given network. We then focus on a particular qualitative representation scheme---tree-structured CPTs---for capturing CSI. We suggest ways in which this representation can be used to support effective inference algorithms. In particular, we present a structural decomposition of the resulting network which can improve the performance of clustering algorithms, and an alternative algorithm based on cutset conditioning.
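The tree-structured CPTs mentioned above can be illustrated with a toy example (the probabilities below are invented for illustration, not taken from the paper): a CPT for P(y=1 | A, B) whose tree branches on A first and never consults B when A = 0, making the context-specific independence visible in the structure itself.

```python
def tree_cpt(a, b):
    """Tree-structured CPT for P(y=1 | A, B), illustrative numbers.

    The tree first branches on A; when A == 0, B is never consulted,
    so Y is independent of B in the context A = 0 -- a CSI statement
    that a full 4-row table would obscure.
    """
    if a == 0:
        return 0.2          # single leaf shared by both values of B
    return 0.9 if b == 1 else 0.4

# CSI: the rows (A=0, B=0) and (A=0, B=1) collapse to one tree leaf,
# while the context A = 1 still depends on B.
assert tree_cpt(0, 0) == tree_cpt(0, 1)
assert tree_cpt(1, 0) != tree_cpt(1, 1)
```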

p-d-Separation - A Concept for Expressing Dependence/Independence Relations in Causal Networks

ArXiv, 2020

Spirtes, Glymour and Scheines formulated a Conjecture that a direct dependence test and a head-to-head meeting test would suffice to construe directed acyclic graph decompositions of a joint probability distribution (Bayesian network) for which Pearl's d-separation applies. This Conjecture was later shown to be a direct consequence of a result of Pearl and Verma. This paper is intended to prove this Conjecture in a new way, by exploiting the concept of p-d-separation (partial dependency separation). While Pearl's d-separation works with Bayesian networks, p-d-separation is intended to apply to causal networks: that is, partially oriented networks in which orientations are given only to those edges that express statistically confirmed causal influence, whereas undirected edges express the existence of direct influence without the possibility of determining the direction of causation. As a consequence of the particular way of proving the validity of this Conjecture, an algorithm for the construction of all the directed acyclic graphs carrying the available independence information is also presented.

Partial Dependency Separation - a New Concept for Expressing Dependence-Independence Relations in Causal Networks

Demonstratio Mathematica, 1999

Spirtes, Glymour and Scheines [19] formulated a Conjecture that a direct dependence test and a head-to-head meeting test would suffice to construe directed acyclic graph decompositions of a joint probability distribution (Bayesian network) for which Pearl's d-separation [2] applies. This Conjecture was later shown to be a direct consequence of a result of Pearl and Verma [21] (cited as Theorem 1 in [13]; see also Theorem 3.4 in [20]). This paper is intended to prove this Conjecture in a new way, by introducing the concept of p-d-separation (partial dependency separation). While Pearl's d-separation works with Bayesian networks, p-d-separation is intended to apply to causal networks: that is, partially oriented networks in which orientations are given only to those edges that express statistically confirmed causal influence, whereas undirected edges express the existence of direct influence without the possibility of determining the direction of causation. As a consequence of the particular way of proving the validity of this Conjecture, an algorithm for the construction of all the directed acyclic graphs (dags) carrying the available independence information is also presented. The notion of a partially oriented graph (pog) is introduced, and within this graph the notion of p-d-separation is defined. It is demonstrated that p-d-separation within the pog is equivalent to d-separation in all derived dags.

Exploiting causal independence in Bayesian network inference

Arxiv preprint cs/9612101, 1996

A new method is proposed for exploiting causal independencies in exact Bayesian network inference. A Bayesian network can be viewed as representing a factorization of a joint probability into the multiplication of a set of conditional probabilities. We present a notion of causal independence that enables one to further factorize the conditional probabilities into a combination of even smaller factors and consequently obtain a finer-grain factorization of the joint probability. The new formulation of causal independence lets us specify the conditional probability of a variable given its parents in terms of an associative and commutative operator, such as "or", "sum" or "max", on the contribution of each parent. We start with a simple algorithm VE for Bayesian network inference that, given evidence and a query variable, uses the factorization to find the posterior distribution of the query. We show how this algorithm can be extended to exploit causal independence. Empirical studies, based on the CPCS networks for medical diagnosis, show that this method is more efficient than previous methods and allows for inference in larger networks than previous algorithms.
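The best-known instance of such an associative, commutative combination is the noisy-OR model, where each parent contributes independently to turning the effect on. A minimal sketch (parameter values are illustrative, not from the paper):

```python
def noisy_or(weights, states):
    """P(effect = 1 | parent states) under a noisy-OR model.

    weights[i] is the probability that an active parent i alone
    turns the effect on. Because "or" is associative and commutative,
    the CPT factorizes into one small per-parent factor instead of a
    table exponential in the number of parents.
    """
    p_off = 1.0
    for w, s in zip(weights, states):
        if s:                # active parent i fails to fire w.p. (1 - w)
            p_off *= 1.0 - w
    return 1.0 - p_off
```

With `weights = [0.8, 0.5]`, both parents active gives 1 - 0.2 * 0.5 = 0.9, a single active first parent gives 0.8, and no active parents gives 0.0.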

A simple approach to Bayesian network computations

PROCEEDINGS OF THE BIENNIAL CONFERENCE- …, 1994

The general problem of computing posterior probabilities in Bayesian networks is NP-hard (Cooper 1990). However, efficient algorithms are often possible for particular applications by exploiting problem structures. It is well understood that the key to the materialization of such a possibility is to make use of conditional independence and work with factorizations of joint probabilities rather than joint probabilities themselves. Different exact approaches can be characterized in terms of their choices of factorizations. We propose a new approach which adopts a straightforward way for factorizing joint probabilities. In comparison with the clique tree propagation approach, our approach is very simple. It allows the pruning of irrelevant variables, it accommodates changes to the knowledge base more easily, and it is easier to implement. More importantly, it can be adapted to utilize both intercausal independence and conditional independence in one uniform framework. On the other hand, clique tree propagation is better in terms of facilitating precomputations.
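The factorization idea can be made concrete on the smallest non-trivial case: in a chain A -> B -> C, the joint factors as P(A) P(B|A) P(C|B), and P(C) is obtained by summing out A and then B, one local factor at a time. A dict-based sketch (a stand-in for real factor operations, with assumed binary variables):

```python
def marginal_chain(p_a, p_b_given_a, p_c_given_b):
    """Compute P(C) in the chain A -> B -> C by eliminating A, then B.

    p_a: dict a -> P(a); conditionals are keyed as (child, parent).
    Each elimination step touches only the factors mentioning the
    variable being summed out, never the full joint table.
    """
    # Eliminate A: f(b) = sum_a P(a) * P(b | a)
    f_b = {b: sum(p_a[a] * p_b_given_a[(b, a)] for a in (0, 1))
           for b in (0, 1)}
    # Eliminate B: P(c) = sum_b f(b) * P(c | b)
    return {c: sum(f_b[b] * p_c_given_b[(c, b)] for b in (0, 1))
            for c in (0, 1)}

p_a = {0: 0.7, 1: 0.3}
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}
p_c_given_b = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.4, (1, 1): 0.6}
p_c = marginal_chain(p_a, p_b_given_a, p_c_given_b)
```

Irrelevant-variable pruning is visible in the same setting: a query on B would never touch `p_c_given_b` at all, because C is a barren descendant of the query.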

From Arguments to Constraints on a Bayesian Network

2016

In this paper, we propose a way to derive constraints for a Bayesian Network from structured arguments. Argumentation and Bayesian networks can both be considered decision support techniques, but are typically used by experts with different backgrounds. Bayesian network experts have the mathematical skills to understand and construct such networks, but lack expertise in the application domain; domain experts may feel more comfortable with argumentation approaches. Our proposed method allows us to check Bayesian networks given arguments constructed for the same problem, and also allows for transforming arguments into a Bayesian network structure, thereby facilitating Bayesian network construction.

Reasoning about Independence in Probabilistic Models of Relational Data

Bayesian networks leverage conditional independence to compactly encode joint probability distributions. Many learning algorithms exploit the constraints implied by observed conditional independencies to learn the structure of Bayesian networks. The rules of d-separation provide a theoretical and algorithmic framework for deriving conditional independence facts from model structure. However, this theory only applies to Bayesian networks. Many real-world systems, such as social or economic systems, are characterized by interacting heterogeneous entities and probabilistic dependencies that cross the boundaries of entities. Consequently, researchers have developed extensions to Bayesian networks that can represent these relational dependencies. We show that the theory of d-separation inaccurately infers conditional independence when applied directly to the structure of probabilistic models of relational data. We introduce relational d-separation, a theory for deriving conditional independence facts from relational models. We provide a new representation, the abstract ground graph, that enables a sound, complete, and computationally efficient method for answering d-separation queries about relational models, and we present empirical results that demonstrate effectiveness.

Causal independence for probability assessment and inference using Bayesian networks

IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 1996


An Algorithm for the Construction of Bayesian Network Structures from Data

Uncertainty in Artificial Intelligence, 1993

Previous algorithms for the construction of Bayesian belief network structures from data have been either highly dependent on conditional independence (CI) tests, or have required an ordering on the nodes to be supplied by the user. We present an algorithm that integrates these two approaches: CI tests are used to generate an ordering on the nodes from the database, which is then used to recover the underlying Bayesian network structure using a non-CI-based method. Results of a preliminary evaluation of the algorithm on two networks (ALARM and LED) are presented. We also discuss some algorithm performance issues and open problems.
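The CI tests such algorithms run on a database are typically chi-square tests on contingency tables of observed counts. A minimal sketch of the marginal (unconditional) 2x2 case, the building block called repeatedly during structure learning (the counts below are invented for illustration):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table[i][j] holds observed counts. The statistic is compared
    against the 95% critical value for 1 degree of freedom (~3.84)
    to decide the independence test; conditional tests stratify the
    data on the conditioning set and pool per-stratum statistics.
    """
    row = [sum(r) for r in table]                # row totals
    col = [sum(c) for c in zip(*table)]          # column totals
    n = sum(row)                                 # grand total
    stat = 0.0
    for i in (0, 1):
        for j in (0, 1):
            expected = row[i] * col[j] / n       # counts under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Uniform counts are consistent with independence (statistic 0);
# strongly associated counts push the statistic far above 3.84.
independent = chi_square_2x2([[50, 50], [50, 50]])
dependent = chi_square_2x2([[90, 10], [10, 90]])
```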