On compatible priors for Bayesian networks


Abstract:

Given a Bayesian network of discrete random variables with a hyper-Dirichlet prior, a method is proposed for assigning Dirichlet priors to the conditional probabilities of structurally different networks. It defines a distance measure between priors which is to be minimized for the assignment process. Intuitively, one would expect that if two models' priors are to qualify as being 'close' in some sense, then their posteriors should also be close after an observation. However, one does not know in advance what will be observed next. Thus we are led to propose an expectation of Kullback-Leibler distances over all possible next observations to define a measure of distance between priors. In conjunction with the additional assumptions of global and local independence of the parameters, a number of theorems emerge which are usually taken as reasonable assumptions in the Bayesian network literature. A simple example is given to illustrate the technique.
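The core idea of the abstract — averaging a KL distance between updated priors over all possible next observations, weighted by their predictive probability — can be sketched as follows. This is an illustrative simplification, not the paper's construction: it compares the posterior predictive (categorical) distributions of two Dirichlet priors rather than the Dirichlet posteriors themselves, and the function names are hypothetical.

```python
import math

def predictive(alpha):
    # Posterior predictive of a Dirichlet prior: normalized pseudo-counts.
    s = sum(alpha)
    return [a / s for a in alpha]

def kl_categorical(p, q):
    # KL divergence between two categorical distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def expected_kl(alpha, beta):
    # Expected KL distance between the two models after one observation:
    # for each possible next outcome x, update both Dirichlet priors with x,
    # compare their predictive distributions, and weight by the probability
    # of x under the first model's predictive distribution.
    p_pred = predictive(alpha)
    total = 0.0
    for x in range(len(alpha)):
        post_a = [a + (1 if i == x else 0) for i, a in enumerate(alpha)]
        post_b = [b + (1 if i == x else 0) for i, b in enumerate(beta)]
        total += p_pred[x] * kl_categorical(predictive(post_a), predictive(post_b))
    return total

# Identical priors have zero expected distance; differing priors a positive one.
print(expected_kl([1, 1, 1], [1, 1, 1]))
print(expected_kl([2, 1], [1, 2]) > 0)
```

Under the paper's approach this expectation would use the KL divergence between the full Dirichlet posteriors, and it is this quantity that is minimized when transferring priors to a structurally different network.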
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence ( Volume: 18, Issue: 9, September 1996)
Page(s): 901 - 911
Date of Publication: 06 August 2002
