
Introduction

Bayesian network

A Bayesian network, also called Bayes network, belief network, Bayes(ian) model or probabilistic directed acyclic graphical model is a probabilistic graphical model (a type of statistical model) that represents a set of random variables and their conditional dependencies via a directed acyclic graph (DAG). [From Wikipedia]

Bayesian inference

Bayesian inference is an approach to answering probability queries about the underlying probability distribution.

Junction tree algorithm

The junction tree algorithm (also known as 'Clique Tree' or 'Join Tree') is a method to extract marginalization. In essence, it entails performing belief propagation on a modified graph called a junction tree. [From Wikipedia]

The purpose of this site

This site is intended to help you understand the junction tree algorithm in terms of Hugin propagation and Shafer-Shenoy propagation.

This site is for tutorial purposes, not for real-world applications.

If you require any further information, feel free to contact me at linden.quan@hotmail.com.

Draw a DAG and enter CPTs

For the sake of simplicity, the name of a variable must be a single uppercase letter, which means there are at most 26 variables. All variables are binary-valued, and all values are written in lowercase.

For example: A = {a, -a}, T = {t, -t}.
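
To make the convention concrete, here is a minimal sketch of how such variables and CPTs could be represented in Python; the dictionary layout, names, and probability numbers are illustrative only, not the site's internal representation.

    # Hypothetical representation of binary variables and their CPTs.
    # Each variable X has two values: "x" (true) and "-x" (false).

    # Prior for A: P(A)
    cpt_A = {("a",): 0.6, ("-a",): 0.4}

    # Conditional table for T given A: P(T | A)
    cpt_T = {
        ("t", "a"): 0.7, ("-t", "a"): 0.3,
        ("t", "-a"): 0.1, ("-t", "-a"): 0.9,
    }

    # Each conditional distribution sums to 1 over the child's values.
    assert abs(cpt_T[("t", "a")] + cpt_T[("-t", "a")] - 1.0) < 1e-9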

Instructions:

  • Select a variable name from A to Z, and create a new variable by clicking the "spawn" button.
  • Click on a node, then hold and drag it within the dotted area.
  • Double-click on a node to display its CPT.
  • Right-click a node to show the context menu, where you can choose to edit the CPT or delete the node.

Moralization

To answer queries, we need to generate a junction tree.

The first step of generating a junction tree is moralizing the original DAG.

To transform the DAG into the moral graph, add edges between all pairs of nodes that have a common child, and then remove all edge directions.
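
A minimal sketch of moralization in Python, assuming the DAG is stored as a dictionary mapping each node to the set of its parents; the data structure and the function name moralize are illustrative, not the site's code.

    from itertools import combinations

    def moralize(parents):
        """parents: dict mapping each node to the set of its parents in the DAG.
        Returns the moral graph as a dict mapping each node to its neighbours."""
        nodes = set(parents) | {p for ps in parents.values() for p in ps}
        moral = {v: set() for v in nodes}

        def add_edge(u, v):
            moral[u].add(v)
            moral[v].add(u)

        for child, ps in parents.items():
            # Drop directions: connect each node to its parents.
            for p in ps:
                add_edge(child, p)
            # "Marry" the parents: connect every pair sharing this child.
            for u, v in combinations(ps, 2):
                add_edge(u, v)
        return moral

    # Example: C has parents A and B, so the edge A-B is added.
    print(moralize({"A": set(), "B": set(), "C": {"A", "B"}}))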

Triangulation

In this step, we transform the moralized graph into a triangulated graph. To do so, we perform the maximum cardinality search (MCS) algorithm to determine an ordering. After obtaining the ordering, we make it a perfect elimination ordering by adding fill-in edges to the moral graph. The new graph, for which the ordering is perfect, is the triangulated graph.

The MCS algorithm and the concept of a perfect elimination ordering can be found online. You can also read the source code of all the algorithms.
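
For reference, here is a rough sketch of MCS and the fill-in step, assuming the moral graph is a dictionary mapping each node to the set of its neighbours; this mirrors the textbook procedure and is not necessarily how this site implements it.

    from itertools import combinations

    def max_cardinality_search(graph):
        """At each step, pick the unvisited node with the most visited neighbours."""
        order, visited = [], set()
        while len(order) < len(graph):
            v = max((n for n in graph if n not in visited),
                    key=lambda n: len(graph[n] & visited))
            order.append(v)
            visited.add(v)
        return order

    def triangulate(graph, order):
        """Add fill-in edges so that the ordering becomes a perfect
        elimination ordering; returns the triangulated graph."""
        tri = {v: set(nbrs) for v, nbrs in graph.items()}
        pos = {v: i for i, v in enumerate(order)}
        # Eliminate nodes in reverse MCS order; when eliminating a node,
        # connect its remaining (earlier-numbered) neighbours pairwise.
        for v in reversed(order):
            earlier = [u for u in tri[v] if pos[u] < pos[v]]
            for a, b in combinations(earlier, 2):
                tri[a].add(b)
                tri[b].add(a)
        return tri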

Junction Tree

A junction tree is constructed from a triangulated graph. The process contains three steps.

  • Step 1: Transform the triangulated graph to the clique graph.
  • Step 2: Construct a maximum-weight spanning tree from the clique graph.
  • Step 3: Assign CPTs to cliques to form their potentials Φ.

The weight of an edge in the clique graph is the number of variables shared by the two cliques it connects.

If a clique includes all of the variables in a CPT, then that CPT can be assigned to that clique. For example, if the CPT of C is P(C|B,A) and a clique consists of the variables A, B, C, D, then that CPT is assigned to the clique, because {A,B,C} ⊆ {A,B,C,D}.
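
Here is a sketch of steps 2 and 3, assuming the cliques of the triangulated graph are already available as frozensets of variable names; the Kruskal-style spanning-tree construction and the function names are illustrative choices, not necessarily the site's implementation.

    def max_weight_spanning_tree(cliques):
        """cliques: list of frozensets. Returns (i, j) index pairs forming a
        maximum-weight spanning tree of the clique graph, where an edge's
        weight is the number of variables the two cliques share."""
        edges = [(len(cliques[i] & cliques[j]), i, j)
                 for i in range(len(cliques))
                 for j in range(i + 1, len(cliques))
                 if cliques[i] & cliques[j]]
        edges.sort(reverse=True)              # heaviest edges first (Kruskal)

        parent = list(range(len(cliques)))    # union-find over cliques
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        tree = []
        for w, i, j in edges:
            ri, rj = find(i), find(j)
            if ri != rj:                      # keep the edge if it joins two components
                parent[ri] = rj
                tree.append((i, j))
        return tree

    def assign_cpts(cliques, cpt_domains):
        """cpt_domains: dict mapping a CPT's name to the frozenset of its
        variables. Each CPT goes to the first clique containing all of them
        (such a clique always exists after moralization and triangulation)."""
        return {name: next(i for i, c in enumerate(cliques) if dom <= c)
                for name, dom in cpt_domains.items()}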

Propagation

After the junction tree is obtained, we can start to propagate messages to get marginals. In this tutorial, two propagation algorithms are covered: Hugin and Shafer-Shenoy.

In Hugin propagation, each edge has a separator whose domain is the intersection of the two cliques it connects. There is a root node, and propagation involves two phases, inward and outward. After the propagation is done, all clique and separator potentials become marginals. For example, Φ(C,D) => P(C,D).
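
Below is a minimal sketch of one Hugin update across a single separator, using a toy potential representation of my own (a potential is a pair of a variable tuple and a table mapping value tuples to numbers); a full inward/outward schedule over the whole tree is omitted.

    def marginalize(phi, keep):
        """Sum a potential down to the variables in `keep`."""
        vars_, table = phi
        idx = [i for i, v in enumerate(vars_) if v in keep]
        out = {}
        for assignment, value in table.items():
            key = tuple(assignment[i] for i in idx)
            out[key] = out.get(key, 0.0) + value
        return (tuple(vars_[i] for i in idx), out)

    def absorb(target, old_sep, new_sep):
        """Hugin absorption: multiply the target clique by new_sep / old_sep."""
        t_vars, t_table = target
        o_vars, old = old_sep
        n_vars, new = new_sep
        o_idx = [t_vars.index(v) for v in o_vars]
        n_idx = [t_vars.index(v) for v in n_vars]
        updated = {}
        for assignment, value in t_table.items():
            denom = old[tuple(assignment[i] for i in o_idx)]
            numer = new[tuple(assignment[i] for i in n_idx)]
            # Hugin convention: 0 / 0 is treated as 0.
            updated[assignment] = 0.0 if denom == 0 else value * numer / denom
        return (t_vars, updated)

Passing a message from clique S to clique T over their separator then amounts to: compute new_sep = marginalize(phi_S, separator variables), update phi_T with absorb, and store new_sep as the separator's potential. The inward phase applies this from the leaves toward the root, and the outward phase applies it back out from the root.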

In Shafer-Shenoy propagation, each edge has two separators, one for each direction. There is no root node. After propagation is done, all clique potentials become marginals.
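
For comparison, here is a sketch of a single Shafer-Shenoy message, reusing the potential representation and marginalize from the Hugin sketch above; multiply and the (source, target) keying of messages are illustrative choices.

    def multiply(phi, psi):
        """Pointwise product of two potentials; psi's variables must be a
        subset of phi's, which holds for separator messages."""
        p_vars, p_table = phi
        s_vars, s_table = psi
        idx = [p_vars.index(v) for v in s_vars]
        return (p_vars, {a: v * s_table[tuple(a[i] for i in idx)]
                         for a, v in p_table.items()})

    def shafer_shenoy_message(potentials, neighbours, messages, i, j):
        """Message i -> j: combine clique i's potential with the messages from
        all of i's neighbours except j, then marginalize onto the separator.
        (marginalize is defined in the Hugin sketch above.)"""
        phi = potentials[i]
        for k in neighbours[i]:
            if k != j:
                phi = multiply(phi, messages[(k, i)])  # must already be computed
        sep = set(potentials[i][0]) & set(potentials[j][0])
        return marginalize(phi, sep)

The marginal for clique i is its original potential multiplied by all of its incoming messages. Unlike Hugin, no division is needed and the clique potentials themselves are never overwritten.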

Query

Drag variables into the query set or the evidence set. For the sake of simplicity, you cannot specify particular values for them.
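
As an illustration of what such a query computes, here is a sketch that turns a joint marginal over the query and evidence variables into a conditional table P(Query | Evidence), assuming all of those variables appear together in one clique marginal and using the potential representation from the sketches above; this is an assumption for illustration, not a description of the site's query code.

    def conditional(joint, evidence_vars):
        """joint: a marginal potential over the query and evidence variables.
        Returns P(query | evidence) as a potential over the same variables.
        With an empty evidence set this simply normalizes the joint."""
        vars_, table = joint
        e_idx = [i for i, v in enumerate(vars_) if v in evidence_vars]
        totals = {}                      # P(evidence) per evidence assignment
        for a, p in table.items():
            key = tuple(a[i] for i in e_idx)
            totals[key] = totals.get(key, 0.0) + p
        cond = {}
        for a, p in table.items():
            z = totals[tuple(a[i] for i in e_idx)]
            cond[a] = 0.0 if z == 0 else p / z
        return (vars_, cond)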
