A HybridEliminationTree is the hybrid equivalent of gtsam.GaussianEliminationTree. It represents the structure of sequential variable elimination performed on a HybridGaussianFactorGraph given a specific Ordering.
Each node in the tree corresponds to one variable being eliminated, and each factor from the original graph is stored at the node of its earliest-ordered variable. The tree structure reflects the dependencies created during sequential elimination: the parent of the node for variable j is the variable eliminated earliest among those appearing in the factor produced by eliminating j.
Eliminating a HybridEliminationTree yields a HybridBayesNet.
While the elimination tree is fundamental to sequential elimination, direct manipulation of HybridEliminationTree objects is uncommon: calling the eliminateSequential method on a HybridGaussianFactorGraph builds and uses this structure internally. Its main value is in understanding the computational flow of sequential elimination in the hybrid context.
import gtsam
import numpy as np
import graphviz
from gtsam import (
HybridGaussianFactorGraph,
Ordering,
HybridEliminationTree,
JacobianFactor,
DecisionTreeFactor,
HybridGaussianFactor,
)
from gtsam.symbol_shorthand import X, D

Creating a HybridEliminationTree
It is constructed from a HybridGaussianFactorGraph and an Ordering.
# --- Create a HybridGaussianFactorGraph (same as HybridBayesTree example) ---
hgfg = HybridGaussianFactorGraph()
dk0 = (D(0), 2) # Binary discrete variable
# Prior on D0: P(D0=0)=0.6, P(D0=1)=0.4
prior_d0 = DecisionTreeFactor([dk0], "0.6 0.4")
hgfg.add(prior_d0) # Factor 0
# Prior on X0: P(X0) = N(0, 1)
prior_x0 = JacobianFactor(X(0), np.eye(1), np.zeros(1), gtsam.noiseModel.Isotropic.Sigma(1, 1.0))
hgfg.add(prior_x0) # Factor 1
# Conditional measurement on X1: P(X1 | D0)
# Mode 0: P(X1 | D0=0) = N(1, 0.25)
gf0 = JacobianFactor(X(1), np.eye(1), np.array([1.0]), gtsam.noiseModel.Isotropic.Sigma(1, 0.5))
# Mode 1: P(X1 | D0=1) = N(5, 1.0)
gf1 = JacobianFactor(X(1), np.eye(1), np.array([5.0]), gtsam.noiseModel.Isotropic.Sigma(1, 1.0))
meas_x1_d0 = HybridGaussianFactor(dk0, [gf0, gf1])
hgfg.add(meas_x1_d0) # Factor 2
# Factor connecting X0 and X1: P(X1 | X0) = N(X0+1, 0.1)
odom_x0_x1 = JacobianFactor(X(0), -np.eye(1), X(1), np.eye(1), np.array([1.0]), gtsam.noiseModel.Isotropic.Sigma(1, np.sqrt(0.1)))
hgfg.add(odom_x0_x1) # Factor 3
print("Original HybridGaussianFactorGraph:")
# hgfg.print()
graphviz.Source(hgfg.dot())

Original HybridGaussianFactorGraph:
# --- Define an Ordering ---
# Eliminate continuous first, then discrete (matches default HybridOrdering)
ordering = Ordering([X(0), X(1), D(0)])
print(f"\nElimination Ordering: {ordering}")
# --- Construct the Elimination Tree ---
het = HybridEliminationTree(hgfg, ordering)
print("\nResulting HybridEliminationTree:")
# Printing shows the tree structure and factors stored at each node
het.print()
Elimination Ordering: Position 0: x0, x1, d0
Resulting HybridEliminationTree:
-(d0)
- f[ (d0,2), ]
Choice(d0)
0 Leaf 0.6
1 Leaf 0.4
| -(x1)
| -
Hybrid [x1; d0]{
Choice(d0)
0 Leaf :
A[x1] = [
1
]
b = [ 1 ]
isotropic dim=1 sigma=0.5
scalar: 0
1 Leaf :
A[x1] = [
1
]
b = [ 5 ]
Noise model: unit (1)
scalar: 0
}
| | -(x0)
| | -
A[x0] = [
1
]
b = [ 0 ]
Noise model: unit (1)
| | -
A[x0] = [
-1
]
A[x1] = [
1
]
b = [ 1 ]
isotropic dim=1 sigma=0.316228
The primary use for elimination trees is in sequential elimination, yielding a HybridBayesNet. This is typically done via the eliminateSequential method of the factor graph itself.