Evaluating the difference between graph structures in Gaussian Bayesian networks

Dealing with uncertainty in Gaussian Bayesian networks from a regression perspective

Several sensitivity analyses have been developed to evaluate the impact of uncertainty about the mean vector and the covariance matrix that specify the joint distribution of the variables at the nodes of a Gaussian Bayesian network (GBN). Nevertheless, uncertainty about the alternative conditional specification of a GBN, based on the regression coefficients of each variable given its parents in the directed acyclic graph (DAG), has received little attention in the literature. Along this line, we focus on evaluating the effect of misspecified regression coefficients by means of the Kullback-Leibler (KL) divergence.
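The approach described above can be illustrated concretely: the regression (conditional) specification maps back to the joint distribution via the standard identity Σ = (I − B)⁻¹ D (I − B)⁻ᵀ for a topologically ordered DAG, after which a misspecified coefficient can be scored with the KL divergence between the two joint normals. A minimal sketch in Python/NumPy — the three-node network and the perturbed coefficient are illustrative, not taken from the paper:

```python
import numpy as np

def gbn_joint(B, D, m):
    """Joint N(mu, Sigma) implied by the regression (conditional) specification.

    B[i, j] = regression coefficient of X_j in the equation for X_i
    (strictly lower triangular for a topologically ordered DAG),
    D = conditional variances, m = intercepts.
    """
    n = len(m)
    A = np.linalg.inv(np.eye(n) - B)      # (I - B)^{-1}
    mu = A @ m
    Sigma = A @ np.diag(D) @ A.T
    return mu, Sigma

def kl_gaussian(mu0, S0, mu1, S1):
    """Kullback-Leibler divergence KL(N0 || N1) between multivariate normals."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Illustrative three-node chain X1 -> X2 -> X3 with one misspecified coefficient.
B = np.array([[0.0, 0.0, 0.0],
              [2.0, 0.0, 0.0],
              [0.0, 1.5, 0.0]])
D = np.array([1.0, 1.0, 1.0])
m = np.array([0.0, 0.0, 0.0])

B_pert = B.copy()
B_pert[2, 1] = 1.0                         # perturbed regression coefficient

mu0, S0 = gbn_joint(B, D, m)
mu1, S1 = gbn_joint(B_pert, D, m)
print(kl_gaussian(mu0, S0, mu1, S1))
```

A zero divergence indicates that the perturbation leaves the joint distribution unchanged; larger values flag coefficients the network is sensitive to.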

Sensitivity Analysis in Gaussian Bayesian Networks Using a Divergence Measure

Communications in Statistics - Theory and Methods, 2007

This article develops a method for computing the sensitivity analysis in a Gaussian Bayesian network. The measure presented is based on the Kullback-Leibler divergence and is useful to evaluate the impact of prior changes on the posterior marginal density of the target variable in the network. We find that some changes do not disturb the posterior marginal density of interest. Finally, we describe a method to compare the different sensitivity measures obtained depending on where the inaccuracy was introduced. An example is used to illustrate the concepts and methods presented.
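For a univariate target variable, as considered in this line of work, the divergence between the posterior marginal density under the original parameters, $N(\mu_0, \sigma_0^2)$, and under the perturbed ones, $N(\mu_1, \sigma_1^2)$, has the standard closed form

$$
\mathrm{KL}\big(f \,\|\, f'\big) \;=\; \frac{1}{2}\left[\ln\frac{\sigma_1^2}{\sigma_0^2} \;+\; \frac{\sigma_0^2}{\sigma_1^2} \;+\; \frac{(\mu_0-\mu_1)^2}{\sigma_1^2} \;-\; 1\right],
$$

which vanishes exactly when the perturbation leaves the posterior marginal unchanged, explaining why some prior changes produce no disturbance at all.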

Sensitivity of Gaussian Bayesian networks to inaccuracies in their parameters

To determine the effect of a set of inaccurate parameters in Gaussian Bayesian networks, it is necessary to study the sensitivity of the model. With this aim, we propose a sensitivity analysis based on comparing two different models: the original model, with the initial parameters assigned to the Gaussian Bayesian network, and the perturbed model, obtained after perturbing a set of inaccurate parameters with specific characteristics.

The effect of block parameter perturbations in Gaussian Bayesian networks: Sensitivity and robustness

Information Sciences, 2013

In this work we study the effects of model inaccuracies on the description of a Gaussian Bayesian network with a set of variables of interest and a set of evidential variables. Using the Kullback-Leibler divergence measure, we compare the output of two different networks after evidence propagation: the original network, and a network with perturbations representing uncertainties in the quantitative parameters. We describe two methods for analyzing the sensitivity and robustness of a Gaussian Bayesian network on this basis. In the sensitivity analysis, different expressions are obtained depending on which set of parameters is considered inaccurate. This fact makes it possible to determine the set of parameters that most strongly disturbs the network output. If all of the divergences are small, we can conclude that the network output is insensitive to the proposed perturbations. The robustness analysis is similar, but considers all potential uncertainties jointly. It thus yields only one divergence, which can be used to confirm the overall sensitivity of the network. Some practical examples of this method are provided, including a complex, real-world problem.
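The comparison after evidence propagation described above amounts to conditioning two multivariate normals on the same evidence and measuring the divergence of the resulting posteriors for the variables of interest. A short sketch of the standard Gaussian conditioning formulas, with an illustrative covariance perturbation (the numbers are hypothetical, not from the paper):

```python
import numpy as np

def condition(mu, S, ev_idx, ev_val):
    """Propagate evidence X[ev_idx] = ev_val through a multivariate normal
    using the standard Gaussian conditioning formulas."""
    keep = [i for i in range(len(mu)) if i not in ev_idx]
    gain = S[np.ix_(keep, ev_idx)] @ np.linalg.inv(S[np.ix_(ev_idx, ev_idx)])
    mu_post = mu[keep] + gain @ (np.asarray(ev_val) - mu[ev_idx])
    S_post = S[np.ix_(keep, keep)] - gain @ S[np.ix_(ev_idx, keep)]
    return mu_post, S_post

def kl_normal(m0, v0, m1, v1):
    """KL( N(m0, v0) || N(m1, v1) ) for univariate normals."""
    return 0.5 * (np.log(v1 / v0) + v0 / v1 + (m0 - m1) ** 2 / v1 - 1.0)

mu = np.array([0.0, 0.0, 0.0])
S_orig = np.array([[1.0, 0.5, 0.3],
                   [0.5, 1.0, 0.4],
                   [0.3, 0.4, 1.0]])
S_pert = S_orig.copy()
S_pert[0, 2] = S_pert[2, 0] = 0.1      # perturbed covariance entry

# Propagate the same evidence X3 = 1 through both networks.
mu_o, S_o = condition(mu, S_orig, [2], [1.0])
mu_p, S_p = condition(mu, S_pert, [2], [1.0])

# Divergence between the posterior marginals of the target variable X1.
print(kl_normal(mu_o[0], S_o[0, 0], mu_p[0], S_p[0, 0]))
```

Repeating this for each perturbed block of parameters identifies the block that most strongly disturbs the output; evaluating all perturbations jointly gives the single robustness divergence.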

Perturbing the structure in Gaussian Bayesian networks

2009

This paper introduces an n-way sensitivity analysis for Gaussian Bayesian networks that studies the joint effect of variations in a set of similar parameters. The aim is to determine the sensitivity of the model when the parameters that describe the quantitative part are given ...

Sensitivity analysis of extreme inaccuracies in Gaussian Bayesian Networks

2006

We study the behavior, under extreme parameter perturbations, of a sensitivity measure defined to evaluate the impact of model inaccuracies on the posterior marginal density of the variable of interest after evidence propagation in Gaussian Bayesian networks. This sensitivity measure is based on the Kullback-Leibler divergence and yields different expressions depending on the type of parameter (mean, variance or covariance) to be perturbed. This analysis is useful for assessing the extreme effect of uncertainty about some of the initial parameters of a Gaussian Bayesian network. These concepts and methods are illustrated with some examples.
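The "extreme" behavior referred to above can be seen already in the simplest case: for an additive perturbation δ of a posterior variance, the KL divergence vanishes at δ = 0 but grows without bound at both ends of the admissible range. A tiny illustrative check (values are hypothetical):

```python
import numpy as np

def kl_variance_perturbation(v, delta):
    """KL( N(mu, v) || N(mu, v + delta) ): sensitivity of a normal posterior
    marginal to an additive perturbation delta of its variance."""
    return 0.5 * (np.log((v + delta) / v) + v / (v + delta) - 1.0)

# The divergence is zero at delta = 0 and diverges for extreme perturbations,
# both as delta -> infinity and as delta -> -v (variance collapsing to zero).
for delta in (0.0, 1.0, 10.0, 1e3, -0.999):
    print(delta, kl_variance_perturbation(1.0, delta))
```

The same limiting analysis, carried out for means and covariances, gives the type-dependent expressions the abstract mentions.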

Sensitivity to hyperprior parameters in Gaussian Bayesian networks

Journal of Multivariate Analysis, 2014

Our focus is on learning Gaussian Bayesian networks (GBNs) from data. In GBNs the multivariate normal joint distribution can be alternatively specified by the normal regression models of each variable given its parents in the DAG (directed acyclic graph). In the latter representation the parameters are the mean vector, the regression coefficients and the corresponding conditional variances. The problem of Bayesian learning in this context has been handled with different approaches, all of them concerning the use of different priors for the parameters considered. We work with the most usual prior, given by the normal/inverse gamma form. In this setting we are interested in evaluating the effect of the choice of prior hyperparameters on the posterior distribution. The Kullback-Leibler divergence measure is used as a tool to define local sensitivity, comparing the prior and posterior deviations. This method can be useful for deciding which values to choose for the hyperparameters.
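As a rough sketch of the setting (not the authors' implementation), each regression equation of the GBN admits the standard conjugate normal/inverse-gamma update, and the divergence between the prior and posterior of the conditional variance can be compared across hyperprior choices. The data, hyperparameter values and helper names below are illustrative assumptions:

```python
import numpy as np
from scipy.special import digamma, gammaln

def nig_posterior(X, y, b0, V0, a0, d0):
    """Conjugate normal/inverse-gamma update for one regression equation of a
    GBN: beta | s2 ~ N(b0, s2 * V0) and s2 ~ InvGamma(a0, d0) a priori."""
    V0_inv = np.linalg.inv(V0)
    Vn = np.linalg.inv(V0_inv + X.T @ X)
    bn = Vn @ (V0_inv @ b0 + X.T @ y)
    an = a0 + len(y) / 2.0
    dn = d0 + 0.5 * (y @ y + b0 @ V0_inv @ b0 - bn @ np.linalg.inv(Vn) @ bn)
    return bn, Vn, an, dn

def kl_inv_gamma(a0, d0, a1, d1):
    """Closed-form KL( InvGamma(a0, d0) || InvGamma(a1, d1) )."""
    return ((a0 - a1) * digamma(a0) - gammaln(a0) + gammaln(a1)
            + a1 * np.log(d0 / d1) + a0 * (d1 - d0) / d0)

# Toy data: one node regressed on a single parent.
X = np.array([[1.0], [1.0]])
y = np.array([1.0, 1.0])

# Prior-to-posterior divergence of the conditional variance for two
# hyperprior choices: a large divergence flags hyperparameters to which
# the posterior is highly sensitive.
for a0, d0 in ((2.0, 1.0), (20.0, 10.0)):
    bn, Vn, an, dn = nig_posterior(X, y, np.array([0.0]), np.eye(1), a0, d0)
    print(a0, d0, kl_inv_gamma(a0, d0, an, dn))
```

Tighter hyperpriors move less under the data, so their prior-to-posterior divergence is smaller; comparing these divergences is one way to operationalize the local sensitivity the abstract describes.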

Inaccurate parameters in Gaussian Bayesian networks

2008

To determine the effect of a set of inaccurate parameters in Gaussian Bayesian networks, it is necessary to study the sensitivity of the model. With this aim, we propose a sensitivity analysis based on comparing two different models: the original model with the ...

Extreme inaccuracies in Gaussian Bayesian networks

Journal of Multivariate Analysis, 2008

To evaluate the impact of model inaccuracies on the network's output after evidence propagation in a Gaussian Bayesian network, a sensitivity measure is introduced. This sensitivity measure is based on the Kullback-Leibler divergence and yields different expressions depending on the type of parameter to be perturbed, i.e., on the inaccurate parameter.