An EliminationTree represents the computational structure of sequential variable elimination (like Gaussian elimination) on a FactorGraph, given a specific Ordering.
Each node in the tree corresponds to a variable being eliminated. The children of a node represent variables that were eliminated earlier and produced factors involving the parent variable. The factors originally involving the variable at a node are stored at that node.
Eliminating an EliminationTree yields a BayesNet.
While fundamental to the theory, direct manipulation of EliminationTree objects in Python is less common than calling the eliminateSequential method on a FactorGraph, which uses the EliminationTree internally.
import gtsam
import numpy as np
# EliminationTree is templated; we need the concrete Gaussian types
from gtsam import GaussianFactorGraph, Ordering, GaussianEliminationTree, GaussianBayesNet
from gtsam import symbol_shorthand
X = symbol_shorthand.X
L = symbol_shorthand.L
Creating an EliminationTree
An EliminationTree is constructed from a FactorGraph and an Ordering.
# Create a graph (same as BayesTree example)
graph = GaussianFactorGraph()
model = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)
graph.add(X(0), -np.eye(1), np.zeros(1), model)
graph.add(X(0), -np.eye(1), X(1), np.eye(1), np.zeros(1), model)
graph.add(X(1), -np.eye(1), X(2), np.eye(1), np.zeros(1), model)
graph.add(L(1), -np.eye(1), X(0), np.eye(1), np.zeros(1), model)
graph.add(L(1), -np.eye(1), X(1), np.eye(1), np.zeros(1), model)
graph.add(L(2), -np.eye(1), X(1), np.eye(1), np.zeros(1), model)
graph.add(L(2), -np.eye(1), X(2), np.eye(1), np.zeros(1), model)
# Define an ordering
ordering = Ordering([X(0), L(1), X(1), L(2), X(2)])
# Construct the Elimination Tree
elimination_tree = GaussianEliminationTree(graph, ordering)
elimination_tree.print("Elimination Tree: ")
Elimination Tree: -(x2)
Elimination Tree: | -(l2)
Elimination Tree: | -
A[l2] = [
-1
]
A[x2] = [
1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | -(x1)
Elimination Tree: | | -
A[x1] = [
-1
]
A[x2] = [
1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | -
A[l2] = [
-1
]
A[x1] = [
1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | | -(l1)
Elimination Tree: | | | -
A[l1] = [
-1
]
A[x1] = [
1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | | | -(x0)
Elimination Tree: | | | | -
A[x0] = [
-1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | | | -
A[x0] = [
-1
]
A[x1] = [
1
]
b = [ 0 ]
Noise model: unit (1)
Elimination Tree: | | | | -
A[l1] = [
-1
]
A[x0] = [
1
]
b = [ 0 ]
Noise model: unit (1)
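The printed tree matches the structure predicted by the classic rule: the parent of a variable is the variable of its separator (the variables remaining in the factor produced by eliminating it) that comes earliest in the ordering. A pure-Python sketch illustrating this rule (the helper `elimination_tree_parents` is hypothetical, not a GTSAM API):

```python
# Hypothetical helper: recover each variable's parent in the elimination
# tree by simulating symbolic elimination on factor adjacency sets.
def elimination_tree_parents(factors, ordering):
    """factors: list of variable-name sets; ordering: elimination order."""
    position = {v: i for i, v in enumerate(ordering)}
    pending = [set(f) for f in factors]   # factors not yet consumed
    parents = {}
    for var in ordering:
        # Collect every pending factor that touches `var` ...
        involved = [f for f in pending if var in f]
        pending = [f for f in pending if var not in f]
        # ... eliminating `var` leaves a new factor on the separator.
        separator = set().union(*involved) - {var}
        if separator:
            # The parent is the separator variable eliminated next.
            parents[var] = min(separator, key=position.get)
            pending.append(separator)
    return parents

# The same graph and ordering as above, written symbolically.
factors = [{"x0"}, {"x0", "x1"}, {"x1", "x2"},
           {"l1", "x0"}, {"l1", "x1"}, {"l2", "x1"}, {"l2", "x2"}]
order = ["x0", "l1", "x1", "l2", "x2"]
print(elimination_tree_parents(factors, order))
# {'x0': 'l1', 'l1': 'x1', 'x1': 'l2', 'l2': 'x2'}  (x2 is the root)
```

The resulting parent map is exactly the chain x0 → l1 → x1 → l2 → x2 shown in the print above.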
Elimination
The primary use of an EliminationTree is to perform sequential elimination to produce a BayesNet.
# The eliminate function (e.g., EliminateGaussian) must be specified in C++,
# where EliminationTree exposes an eliminate method directly; in Python this
# is handled internally. Here we call the graph's eliminatePartialSequential,
# which builds and eliminates the tree internally; since the ordering covers
# all variables, the remaining graph is empty.
bayes_net, remaining_graph = graph.eliminatePartialSequential(ordering)
print("BayesNet from EliminationTree:")
bayes_net.print()
BayesNet from EliminationTree:
size: 5
conditional 0: p(x0 | l1 x1)
R = [ 1.73205 ]
S[l1] = [ -0.57735 ]
S[x1] = [ -0.57735 ]
d = [ 0 ]
logNormalizationConstant: -0.369632
No noise model
conditional 1: p(l1 | x1)
R = [ 1.29099 ]
S[x1] = [ -1.0328 ]
d = [ 0 ]
logNormalizationConstant: -0.663526
No noise model
conditional 2: p(x1 | l2 x2)
R = [ 1.61245 ]
S[l2] = [ -0.620174 ]
S[x2] = [ -0.620174 ]
d = [ 0 ]
logNormalizationConstant: -0.441183
No noise model
conditional 3: p(l2 | x2)
R = [ 1.27098 ]
S[x2] = [ -1.08941 ]
d = [ 0 ]
logNormalizationConstant: -0.679152
No noise model
conditional 4: p(x2)
R = [ 0.654654 ]
d = [ 0 ]
mean: 1 elements
x2: 0
logNormalizationConstant: -1.34259
No noise model