KSL-91-53
## Approximating Probabilistic Inference in Bayesian Belief Networks is NP-Hard

**Reference:**
Dagum, P. & Luby, M. Approximating Probabilistic Inference in Bayesian Belief Networks is NP-Hard. April 1993.

**Abstract:**
It is known that exact computation of probabilistic inference in belief
networks is NP-hard [Cooper90]. For inferences that are not conditioned on any
evidence nodes, it is relatively straightforward to obtain approximations in
polynomial time with absolute error ε and failure probability δ. However,
efforts to find efficient approximation algorithms for general inferences,
i.e., inferences conditioned on evidence, have met with little success.
Polynomial-time algorithms that approximate inferences with absolute or
relative error bounds have been constructed only for special-case belief
networks. We prove that even approximating probabilistic inference conditioned
on evidence in general belief networks is NP-hard. Specifically, we prove that
if P ≠ NP, there is no algorithm that outputs an estimate Z satisfying
Pr[x|y] - ε ≤ Z ≤ Pr[x|y] + ε for some ε < 1/2, for inferences Pr[x|y] with
single-node instantiations x and y. Furthermore, we present evidence that it
is intractable to determine whether Z satisfies
Pr[x|y] - ε ≤ Z ≤ Pr[x|y] + ε with probability greater than 1/2, for ε < 1/2.
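The tractable unconditioned case mentioned above can be illustrated with a
minimal sketch: forward sampling on a toy two-node network, with the sample
size set by the Hoeffding bound so the estimate Z lands within absolute error
ε of the true probability with failure probability at most δ. The network,
its parameter values, and the function names are illustrative assumptions,
not the paper's construction.

```python
import math
import random

def forward_sample(rng):
    # Toy network A -> B with assumed CPTs (illustrative values only).
    a = rng.random() < 0.3                  # Pr[A=1] = 0.3
    b = rng.random() < (0.9 if a else 0.2)  # Pr[B=1|A=1]=0.9, Pr[B=1|A=0]=0.2
    return b

def estimate_pr_b(epsilon, delta, seed=0):
    # Hoeffding bound: n >= ln(2/delta) / (2 * epsilon^2) samples suffice
    # for |Z - Pr[B=1]| <= epsilon with probability >= 1 - delta.
    n = math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))
    rng = random.Random(seed)
    hits = sum(forward_sample(rng) for _ in range(n))
    return hits / n

# Exact value for comparison: 0.3*0.9 + 0.7*0.2 = 0.41.
```

The sample count n is polynomial in 1/ε and log(1/δ), which is why the
unconditioned case is easy; the paper's result is that no such guarantee is
possible for conditional inferences Pr[x|y] in general networks unless P = NP.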

