HybridGaussianFactorGraph is a specialized HybridFactorGraph designed to represent linearized hybrid systems or systems that are inherently Conditional Linear Gaussian (CLG). It primarily contains:
- gtsam.GaussianFactor (or derived types like JacobianFactor, HessianFactor)
- gtsam.DiscreteFactor (or derived types like DecisionTreeFactor, TableFactor)
- gtsam.HybridGaussianFactor (including HybridGaussianConditional)
This graph type is typically the result of linearizing a HybridNonlinearFactorGraph. It supports specialized elimination methods (eliminateSequential and eliminateMultifrontal, inherited from the EliminateableFactorGraph base class) that correctly handle the mix of discrete and continuous variables, producing a HybridBayesNet or HybridBayesTree respectively.
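For context, here is a minimal sketch of how such a graph typically arises from linearization. The prior factor and linearization point are illustrative only, and the sketch assumes PriorFactorDouble and HybridNonlinearFactorGraph.linearize are available in your GTSAM Python build:
import gtsam
from gtsam.symbol_shorthand import X
# Illustrative nonlinear graph with a single prior on X0 (assumed wrappers)
nonlinear_graph = gtsam.HybridNonlinearFactorGraph()
sketch_model = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)
nonlinear_graph.push_back(gtsam.PriorFactorDouble(X(0), 0.0, sketch_model))
# Linearize at a chosen point to obtain a HybridGaussianFactorGraph
values = gtsam.Values()
values.insert(X(0), 0.5)  # arbitrary linearization point
linearized = nonlinear_graph.linearize(values)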
import gtsam
import numpy as np
import graphviz
from gtsam import (
JacobianFactor,
DecisionTreeFactor,
HybridGaussianFactorGraph,
HybridGaussianFactor,
GaussianConditional,
HybridValues,
)
from gtsam.symbol_shorthand import X, D
Initialization and Adding Factors
hgfg = HybridGaussianFactorGraph()
# Gaussian Factor (e.g., prior on X0)
prior_model = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)
prior_factor = JacobianFactor(X(0), -np.eye(1), np.zeros(1), prior_model)
hgfg.add(prior_factor)
# Discrete Factor (e.g., prior on D0)
dk0 = (D(0), 2) # D0 has 2 states
discrete_prior = DecisionTreeFactor([dk0], "0.7 0.3") # P(D0=0)=0.7, P(D0=1)=0.3
hgfg.add(discrete_prior)
# Hybrid Gaussian Factor (e.g., measurement on X1 depends on D0)
dk1 = (D(1), 2)
# Measurement model 0: N(X1; 0, 1)
gf0 = JacobianFactor(X(1), np.eye(1), np.zeros(1), gtsam.noiseModel.Isotropic.Sigma(1, 1.0))
# Measurement model 1: N(X1; 2, 0.5)
gf1 = JacobianFactor(X(1), np.eye(1), np.array([2.0]), gtsam.noiseModel.Isotropic.Sigma(1, 0.5))
hybrid_meas = HybridGaussianFactor(dk1, [gf0, gf1])
hgfg.add(hybrid_meas)
hgfg.print("\nHybrid Gaussian Factor Graph:")
Hybrid Gaussian Factor Graph:
size: 3
Factor 0
GaussianFactor:
A[x0] = [
-1
]
b = [ 0 ]
Noise model: unit (1)
Factor 1
DiscreteFactor:
f[ (d0,2), ]
Choice(d0)
0 Leaf 0.7
1 Leaf 0.3
Factor 2
HybridGaussianFactor:
Hybrid [x1; d1]{
Choice(d1)
0 Leaf :
A[x1] = [
1
]
b = [ 0 ]
Noise model: unit (1)
scalar: 0
1 Leaf :
A[x1] = [
1
]
b = [ 2 ]
isotropic dim=1 sigma=0.5
scalar: 0
}
# We can visualize the graph using Graphviz
graphviz.Source(hgfg.dot())
Key Inspection and Choosing a Gaussian Subgraph
Besides the standard key accessors (keys(), discreteKeySet(), continuousKeySet()), you can extract the GaussianFactorGraph corresponding to a specific discrete assignment with choose().
print(f"\nDiscrete Keys: {hgfg.discreteKeySet()}")
print(f"Continuous Keys: {hgfg.continuousKeySet()}")
# Choose the GaussianFactorGraph for a specific assignment
assignment = gtsam.DiscreteValues()
assignment[D(0)] = 0
assignment[D(1)] = 1 # Selects gf1 from hybrid_meas
gaussian_subgraph = hgfg.choose(assignment)
print(f"\nChosen GaussianFactorGraph for D0=0, D1=1:")
gaussian_subgraph.print()
Discrete Keys: d0d1
Continuous Keys: x0x1
Chosen GaussianFactorGraph for D0=0, D1=1:
size: 2
factor 0:
A[x0] = [
-1
]
b = [ 0 ]
Noise model: unit (1)
factor 1:
A[x1] = [
1
]
b = [ 2 ]
isotropic dim=1 sigma=0.5
# Note: The discrete factor is ignored in the chosen graph.
# The HybridGaussianFactor contributes its gf1 component.
# The prior on X0 is included.
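Since the result of choose() is an ordinary GaussianFactorGraph, it can be solved directly with the standard optimize() method. A quick sketch, reusing the objects defined above:
# Solve the fixed-mode linear system for D0=0, D1=1
sub_solution = gaussian_subgraph.optimize()
sub_solution.print("Solution for D0=0, D1=1: ")
# Expect x0 near 0 (from the prior) and x1 near 2 (the mean of gf1)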
graphviz.Source(gaussian_subgraph.dot())
Computing Errors (error, errorTree, probPrime, discretePosterior)
Several methods exist to evaluate the graph under different assumptions.
# --- Error methods ---
hybrid_values = HybridValues()
hybrid_values.insert(X(0), np.array([0.1]))
hybrid_values.insert(X(1), np.array([2.1])) # Close to mean of gf1
hybrid_values.insert(assignment) # Use D0=0, D1=1
# Sum of errors (Gaussian parts) + neg log prob (Discrete parts)
total_error = hgfg.error(hybrid_values)
print(f"\nTotal Hybrid Error (Gaussian+Discrete): {total_error}") # Should be low
# Error tree: an AlgebraicDecisionTree over the discrete keys; each leaf is the
# total graph error (Gaussian residuals + discrete neg-log) for that assignment,
# evaluated at the given continuous values
continuous_values = hybrid_values.continuous()
error_tree = hgfg.errorTree(continuous_values)
print("\nError Tree (Gaussian factor errors):")
error_tree.print()
Total Hybrid Error (Gaussian+Discrete): 0.38167494393873247
Error Tree (per-assignment errors):
Choice(d1)
 0 Choice(d0)
 0 0 Leaf 2.5666749
 0 1 Leaf 3.4139728
 1 Choice(d0)
 1 0 Leaf 0.38167494
 1 1 Leaf 1.2289728
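As a sanity check (a hand computation, not an API call), the total hybrid error decomposes into the two Gaussian residual terms plus the discrete negative log-probability; note that it matches the error-tree leaf for d0=0, d1=1 exactly:
# Prior on X0: residual -0.1 with sigma 1; gf1 on X1: residual 0.1 with sigma 0.5
gaussian_part = 0.5 * (0.1 / 1.0) ** 2 + 0.5 * (0.1 / 0.5) ** 2
discrete_part = -np.log(0.7)  # -log P(D0=0)
print(gaussian_part + discrete_part)  # ~0.381675, matching total_error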
# probPrime (unnormalized posterior density for a full hybrid assignment)
prob_prime = hgfg.probPrime(hybrid_values)
print(f"\nProbability Prime (unnormalized density): {prob_prime}") # exp(-error)
# discretePosterior (P(M | X=x), needs only continuous values)
discrete_posterior_tree = hgfg.discretePosterior(continuous_values)
print("\nDiscrete Posterior Tree P(M | X=x):")
discrete_posterior_tree.print() # ADT representing normalized discrete posterior
Probability Prime (unnormalized density): 0.6827169384198328
Discrete Posterior Tree P(M | X=x):
Choice(d1)
 0 Choice(d0)
 0 0 Leaf 0.070773923
 0 1 Leaf 0.030331681
 1 Choice(d0)
 1 0 Leaf 0.62922608
 1 1 Leaf 0.26966832
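These quantities are mutually consistent, which we can verify by hand using only numpy and the leaf values printed above:
# probPrime is exp(-error) for the chosen hybrid assignment
print(np.exp(-total_error))  # ~0.68272, matching prob_prime
# The discrete posterior is the normalized exp(-error) over all assignments
leaf_errors = np.array([2.5666749, 3.4139728, 0.38167494, 1.2289728])
weights = np.exp(-leaf_errors)
print(weights / weights.sum())  # ~[0.0708, 0.0303, 0.6292, 0.2697]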
Elimination (eliminateSequential, eliminateMultifrontal)
Elimination methods handle the hybrid nature of the graph, producing a HybridBayesNet or HybridBayesTree. By default, HybridOrdering places the continuous keys before the discrete keys, so the continuous variables are eliminated first.
# Default ordering eliminates continuous then discrete
ordering = gtsam.HybridOrdering(hgfg)
print(f"\nDefault Hybrid Ordering: {ordering}") # Expect X(0), X(1), D(0), D(1)...
# Partial sequential elimination: returns a HybridBayesNet plus any factors not eliminated
hybrid_bayes_net: gtsam.HybridBayesNet
hybrid_bayes_net, remaining_factors = hgfg.eliminatePartialSequential(ordering)
print("\nResulting HybridBayesNet:")
graphviz.Source(hybrid_bayes_net.dot())
Default Hybrid Ordering: Position 0: x0, x1, d0, d1
Resulting HybridBayesNet:
# Multifrontal elimination (often preferred)
hybrid_bayes_tree = hgfg.eliminateMultifrontal(ordering)
print("\nResulting HybridBayesTree:")
graphviz.Source(hybrid_bayes_tree.dot())
Resulting HybridBayesTree:
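Once eliminated, the resulting HybridBayesNet can be solved for the best hybrid estimate. A minimal sketch, assuming optimize() is wrapped in your GTSAM build (it returns a HybridValues holding both the discrete assignment and the continuous solution):
# Solve for the most probable discrete assignment and matching continuous values
solution = hybrid_bayes_net.optimize()
solution.print("Optimal hybrid solution: ")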