
HybridGaussianISAM


HybridGaussianISAM implements the Incremental Smoothing and Mapping (ISAM) algorithm for hybrid factor graphs, specifically HybridGaussianFactorGraphs. It inherits from gtsam.ISAM&lt;HybridBayesTree&gt;, meaning it maintains an underlying HybridBayesTree representing the smoothed posterior distribution P(X, M | Z) over continuous variables X and discrete variables M, given measurements Z.

The key feature is the update method, which efficiently incorporates new factors (measurements) into the existing HybridBayesTree without re-processing the entire history. This involves:

  1. Identifying the portion of the Bayes tree affected by the new factors.

  2. Removing the affected cliques (orphans).

  3. Re-eliminating the variables in the orphaned cliques along with the new factors.

  4. Merging the newly created Bayes sub-tree back into the main tree.

It provides an incremental solution for problems involving both continuous and discrete variables, where the underlying system dynamics are linear or have been linearized (resulting in a HybridGaussianFactorGraph).

import gtsam
import numpy as np

from gtsam import (
    HybridGaussianISAM, HybridGaussianFactorGraph, HybridBayesTree,
    JacobianFactor, DecisionTreeFactor, HybridGaussianFactor,
    DiscreteValues, VectorValues, HybridValues, Ordering
)
from gtsam.symbol_shorthand import X, D

Initialization

A HybridGaussianISAM can be initialized empty or from an existing HybridBayesTree.

# 1. Empty ISAM
hisam1 = gtsam.HybridGaussianISAM()
print("Empty HybridGaussianISAM created.")

# 2. From existing HybridBayesTree
# Create a minimal initial graph and Bayes tree P(D0), P(X0)
initial_graph = gtsam.HybridGaussianFactorGraph()
dk0 = (D(0), 2)
initial_graph.add(DecisionTreeFactor([dk0], "0.6 0.4")) # P(D0)
initial_graph.add(JacobianFactor(X(0), np.eye(1), np.zeros(1), gtsam.noiseModel.Unit.Create(1))) # P(X0)
ordering = gtsam.Ordering([X(0), D(0)])
initial_hbt = initial_graph.eliminateMultifrontal(ordering)

hisam2 = gtsam.HybridGaussianISAM(initial_hbt)
print("\nHybridGaussianISAM from initial HybridBayesTree:")
hisam2.print()

Empty HybridGaussianISAM created.

HybridGaussianISAM from initial HybridBayesTree:
HybridBayesTree
: cliques: 2, variables: 2
HybridBayesTree
-p(x0)
  R = [ 1 ]
  d = [ 0 ]
  mean: 1 elements
  x0: 0
  logNormalizationConstant: -0.918939
  No noise model
HybridBayesTree
- P( d0 ):
 f[ (d0,2), ]
(d0, 0) | 0.6        | 0
(d0, 1) | 0.4        | 1
number of nnzs: 2

Incremental Updates

The update method takes a HybridGaussianFactorGraph containing new factors to be added.

# Start with hisam2 from above
hisam = hisam2

# --- Update 1: Add factors connecting X0, X1, D0 ---
update1_graph = gtsam.HybridGaussianFactorGraph()
# Add P(X1 | X0) = N(X0+1, 0.1)
update1_graph.add(JacobianFactor(X(0), -np.eye(1), X(1), np.eye(1), np.array([1.0]), gtsam.noiseModel.Isotropic.Sigma(1, np.sqrt(0.1))))
# Add a mixture measurement on X1: N(1, 0.25) if D0=0, N(5, 1.0) if D0=1 (means, variances)
gf0 = JacobianFactor(X(1), np.eye(1), np.array([1.0]), gtsam.noiseModel.Isotropic.Sigma(1, 0.5))
gf1 = JacobianFactor(X(1), np.eye(1), np.array([5.0]), gtsam.noiseModel.Isotropic.Sigma(1, 1.0))
update1_graph.add(HybridGaussianFactor(dk0, [gf0, gf1]))

print("\nAdding Update 1 Factors:")
update1_graph.print()

hisam.update(update1_graph)
print("\nISAM state after Update 1:")
hisam.print()

# --- Update 2: Add factor connecting X1, X2 ---
update2_graph = gtsam.HybridGaussianFactorGraph()
update2_graph.add(JacobianFactor(X(1), -np.eye(1), X(2), np.eye(1), np.array([2.0]), gtsam.noiseModel.Isotropic.Sigma(1, 1.0)))

print("\nAdding Update 2 Factors:")
update2_graph.print()

hisam.update(update2_graph)
print("\nISAM state after Update 2:")
hisam.print()

Adding Update 1 Factors:

size: 2
Factor 0
GaussianFactor:

  A[x0] = [
	-1
]
  A[x1] = [
	1
]
  b = [ 1 ]
isotropic dim=1 sigma=0.316228

Factor 1
HybridGaussianFactor:
Hybrid [x1; d0]{
 Choice(d0) 
 0 Leaf :
  A[x1] = [
	1
]
  b = [ 1 ]
isotropic dim=1 sigma=0.5
scalar: 0

 1 Leaf :
  A[x1] = [
	1
]
  b = [ 5 ]
  Noise model: unit (1) 
scalar: 0

}


ISAM state after Update 1:
HybridBayesTree
: cliques: 3, variables: 3
HybridBayesTree
- P( d0 ):
 f[ (d0,2), ]
(d0, 0) | 0.976859   | 0
(d0, 1) | 0.0231405  | 1
number of nnzs: 2

HybridBayesTree
| - P( x1 | d0)
 Discrete Keys = (d0, 2), 
 logNormalizationConstant: -0.123394

 Choice(d0) 
 0 Leaf p(x1)
  R = [ 2.21565 ]
  d = [ 2.21565 ]
  mean: 1 elements
  x1: 1
  logNormalizationConstant: -0.123394
  No noise model

 1 Leaf p(x1)
  R = [ 1.3817 ]
  d = [ 4.27669 ]
  mean: 1 elements
  x1: 3.09524
  logNormalizationConstant: -0.595625
  No noise model

HybridBayesTree
| | -p(x0 | x1)
  R = [ 3.31662 ]
  S[x1] = [ -3.01511 ]
  d = [ -3.01511 ]
  logNormalizationConstant: 0.280009
  No noise model
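The updated P(d0) above can be reproduced by hand. Marginalizing X0 gives a predictive density N(1, 1.1) on X1 (prior variance 1 plus transition variance 0.1), and each mixture leaf contributes its unnormalized factor value exp(-(x1-mu)^2 / (2 sigma^2)) (note the scalar: 0 entries in the printout, i.e. no per-mode normalization constants are applied). A small numpy check under exactly that convention:

```python
import numpy as np

def gauss_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

prior_d0 = np.array([0.6, 0.4])                            # P(D0)
mus, sigmas = np.array([1.0, 5.0]), np.array([0.5, 1.0])   # mixture leaves on x1
pred_mu, pred_var = 1.0, 1.0 + 0.1                         # predictive N(1, 1.1) on x1

# int N(x1; pred) * exp(-(x1-mu)^2 / (2 s^2)) dx1
#   = sqrt(2 pi s^2) * N(mu; pred_mu, pred_var + s^2)
weights = prior_d0 * np.sqrt(2 * np.pi * sigmas**2) * gauss_pdf(mus, pred_mu, pred_var + sigmas**2)
posterior = weights / weights.sum()
print(posterior)  # close to [0.976859, 0.0231405], as in the printout above

# Per-mode posterior means of x1 (precision-weighted combination)
x1_means = (pred_mu / pred_var + mus / sigmas**2) / (1 / pred_var + 1 / sigmas**2)
print(x1_means)  # close to [1.0, 3.09524], matching the two leaves
```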

Adding Update 2 Factors:

size: 1
Factor 0
GaussianFactor:

  A[x1] = [
	-1
]
  A[x2] = [
	1
]
  b = [ 2 ]
  Noise model: unit (1) 


ISAM state after Update 2:
HybridBayesTree
: cliques: 3, variables: 4
HybridBayesTree
- P( d0 ):
 f[ (d0,2), ]
(d0, 0) | 0.976859   | 0
(d0, 1) | 0.0231405  | 1
number of nnzs: 2

HybridBayesTree
| - P( x1 x2 | d0)
 Discrete Keys = (d0, 2), 
 logNormalizationConstant: -1.04233

 Choice(d0) 
 0 Leaf p(x1 x2 )
  R = [   2.43086 -0.411377 ]
      [         0  0.911465 ]
  d = [ 1.19673  2.7344 ]
  mean: 2 elements
  x1: 1
  x2: 3
  logNormalizationConstant: -1.04233
  No noise model

 1 Leaf p(x1 x2 )
  R = [   1.70561 -0.586302 ]
      [         0  0.810093 ]
  d = [ 2.29191 4.12761 ]
  mean: 2 elements
  x1: 3.09524
  x2: 5.09524
  logNormalizationConstant: -1.51456
  No noise model

HybridBayesTree
| | -p(x0 | x1)
  R = [ 3.31662 ]
  S[x1] = [ -3.01511 ]
  d = [ -3.01511 ]
  logNormalizationConstant: 0.280009
  No noise model
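The mode-conditioned cliques after Update 2 can also be checked: the odometry factor x2 - x1 = 2 (unit noise) adds the information block [[1, -1], [-1, 1]], and the printed R is the upper-triangular Cholesky factor of the resulting joint information matrix. For mode 0 (a numpy sketch; the precisions 1/1.1 and 1/0.25 come from the model defined above):

```python
import numpy as np

lam_x1 = 1 / 1.1 + 1 / 0.25             # mode-0 information on x1: predictive + measurement
Lam = np.array([[lam_x1 + 1.0, -1.0],   # odometry adds [[1, -1], [-1, 1]]
                [-1.0,          1.0]])
R = np.linalg.cholesky(Lam).T  # upper-triangular square root: R^T R = Lam
print(R)  # close to [[2.43086, -0.411377], [0, 0.911465]], the mode-0 leaf above
```

The means follow the same pattern: the new factor adds no information about x1, so each leaf keeps its x1 mean and shifts it by 2 for x2 (1 becomes 3, and 3.09524 becomes 5.09524).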

Solution and Marginals

After updates, the underlying HybridBayesTree can be used to obtain the current MAP estimate or calculate marginals, similar to the batch case.

# Get the current MAP estimate from the ISAM object
# ISAM inherits optimize() from HybridBayesTree
current_map_solution = hisam.optimize()
print("\nCurrent MAP Solution from ISAM:")
current_map_solution.print()

Current MAP Solution from ISAM:
HybridValues: 
  Continuous: 3 elements
  x0: 2.67796e-16
  x1: 1
  x2: 3
  Discrete: (d0, 0)
  Nonlinear
Values with 0 values:

# Access the underlying HybridBayesTree methods
# Get a specific GaussianBayesTree for an MPE assignment
mpe = hisam.mpe()
print("\nMPE Assignment:", mpe)
gbt_mpe = hisam.choose(mpe)
print("\nGaussianBayesTree for MPE assignment:")
gbt_mpe.print()

MPE Assignment: DiscreteValues{7205759403792793600: 0}

GaussianBayesTree for MPE assignment:
: cliques: 3, variables: 3
- p()
  R = Empty (0x0)
  d = Empty (0x1)
  mean: 0 elements
  logNormalizationConstant: -0
  No noise model
| - p(x1 x2 )
  R = [   2.43086 -0.411377 ]
      [         0  0.911465 ]
  d = [ 1.19673  2.7344 ]
  mean: 2 elements
  x1: 1
  x2: 3
  logNormalizationConstant: -1.04233
  No noise model
| | - p(x0 | x1)
  R = [ 3.31662 ]
  S[x1] = [ -3.01511 ]
  d = [ -3.01511 ]
  logNormalizationConstant: 0.280009
  No noise model
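Finally, the leaf clique p(x0 | x1) can be reconstructed from first principles: x0 sees the unit prior (precision 1) and the odometry factor x1 - x0 = 1 with variance 0.1 (precision 10). A numpy check against the printed R, S, and d:

```python
import numpy as np

lam_x0 = 1.0 + 10.0    # information on x0: prior + odometry
cross = -10.0          # off-diagonal information between x0 and x1
eta_x0 = -10.0 * 1.0   # information vector: A_x0^T (1/0.1) b with A_x0 = -1, b = 1

R = np.sqrt(lam_x0)      # 3.31662, as printed
S = cross / R            # -3.01511
d = eta_x0 / R           # -3.01511
x0 = (d - S * 1.0) / R   # back-substitute the MAP value x1 = 1  ->  x0 = 0
```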