
ISAM

gtsam.ISAM (Incremental Smoothing and Mapping) is a class that inherits from BayesTree and adds an update method. This method allows for efficient incremental updates to the solution when new factors (e.g., new measurements) are added to the problem.

Instead of re-eliminating the entire factor graph from scratch, iSAM identifies the part of the Bayes tree affected by the new factors, removes that part, and re-eliminates only the necessary variables, merging the results back into the existing tree.

Like BayesTree, it’s templated (e.g., GaussianISAM, which inherits from GaussianBayesTree). For practical applications requiring incremental updates, ISAM2 is often preferred due to further optimizations such as fluid relinearization and support for variable removal, but ISAM demonstrates the core incremental update concept based on the Bayes tree.
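As context, here is a minimal sketch of the ISAM2 interface for comparison (the rest of this page sticks with the Gaussian ISAM variant). The poses and noise sigmas below are illustrative assumptions, not values from this example:

import gtsam
from gtsam import symbol_shorthand
X = symbol_shorthand.X

isam2 = gtsam.ISAM2()

# Initial update: a prior on x0 plus its linearization point
graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()
prior_model = gtsam.noiseModel.Isotropic.Sigma(3, 0.1)
graph.add(gtsam.PriorFactorPose2(X(0), gtsam.Pose2(0, 0, 0), prior_model))
values.insert(X(0), gtsam.Pose2(0, 0, 0))
isam2.update(graph, values)

# Incremental update: pass only the new factor and the linearization
# point of the new variable; ISAM2 relinearizes as needed
graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()
odom_model = gtsam.noiseModel.Isotropic.Sigma(3, 0.2)
graph.add(gtsam.BetweenFactorPose2(X(0), X(1), gtsam.Pose2(1, 0, 0), odom_model))
values.insert(X(1), gtsam.Pose2(1, 0, 0))
isam2.update(graph, values)

estimate = isam2.calculateEstimate()  # current best estimate as Values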


import gtsam
import numpy as np

# Use Gaussian variants for demonstration
from gtsam import GaussianFactorGraph, Ordering, GaussianISAM, GaussianBayesTree
from gtsam import symbol_shorthand

X = symbol_shorthand.X
L = symbol_shorthand.L

Initialization

An ISAM object can be created empty or initialized from an existing BayesTree.

# Create an empty ISAM object
isam1 = GaussianISAM()

# Create from an existing Bayes Tree (e.g., from an initial batch solve)
initial_graph = GaussianFactorGraph()
model = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)
initial_graph.add(X(0), -np.eye(1), np.zeros(1), model) # Prior on x0

initial_bayes_tree = initial_graph.eliminateMultifrontal(Ordering([X(0)]))
print("Initial BayesTree:")
initial_bayes_tree.print()

isam2 = GaussianISAM(initial_bayes_tree)
print("ISAM from BayesTree:")
isam2.print()
Initial BayesTree:
: cliques: 1, variables: 1
- p(x0)
  R = [ 1 ]
  d = [ 0 ]
  mean: 1 elements
  x0: 0
  logNormalizationConstant: -0.918939
  No noise model
ISAM from BayesTree:
GaussianISAM: : cliques: 1, variables: 1
GaussianISAM: - p(x0)
  R = [ 1 ]
  d = [ 0 ]
  mean: 1 elements
  x0: 0
  logNormalizationConstant: -0.918939
  No noise model
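The empty object isam1 created above is an equally valid starting point: its first update performs the initial elimination. A brief sketch, reusing initial_graph from the cell above:

# Seed the empty ISAM object by passing the prior through update
isam1.update(initial_graph)
print("ISAM seeded via update:")
isam1.print()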

Incremental Update

The core functionality is the update(newFactors) method.

# Start with the ISAM object containing the prior on x0
isam = GaussianISAM(initial_bayes_tree)
model = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)

# --- First Update ---
new_factors1 = GaussianFactorGraph()
new_factors1.add(X(0), -np.eye(1), X(1), np.eye(1), np.zeros(1), model) # x0 -> x1
isam.update(new_factors1)

print("ISAM after first update (x0, x1):")
isam.print()

# --- Second Update ---
new_factors2 = GaussianFactorGraph()
new_factors2.add(X(1), -np.eye(1), X(2), np.eye(1), np.zeros(1), model) # x1 -> x2
isam.update(new_factors2)

print("\nISAM after second update (x0, x1, x2):")
isam.print()
ISAM after first update (x0, x1):
GaussianISAM: : cliques: 1, variables: 2
GaussianISAM: - p(x1 x0 )
  R = [  1 -1 ]
      [  0  1 ]
  d = [ 0 0 ]
  mean: 2 elements
  x0: 0
  x1: 0
  logNormalizationConstant: -1.83788
  No noise model

ISAM after second update (x0, x1, x2):
GaussianISAM: : cliques: 2, variables: 3
GaussianISAM: - p(x0 x1 )
  R = [   1.41421 -0.707107 ]
      [         0  0.707107 ]
  d = [ 0 0 ]
  mean: 2 elements
  x0: 0
  x1: 0
  logNormalizationConstant: -1.83788
  No noise model
GaussianISAM: | - p(x2 | x1)
  R = [ 1 ]
  S[x1] = [ -1 ]
  d = [ 0 ]
  logNormalizationConstant: -0.918939
  No noise model
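When a new factor touches variables spread across multiple cliques, update removes and re-eliminates the entire affected subtree. As a sketch, a factor connecting x0 and x2 spans both cliques above; its zero right-hand side keeps the optimum at zero, so the solution in the next section is unchanged:

# A factor between x0 and x2 touches both cliques, so the affected
# part of the Bayes tree covers all three variables
loop_factors = GaussianFactorGraph()
loop_factors.add(X(0), -np.eye(1), X(2), np.eye(1), np.zeros(1), model)
isam.update(loop_factors)
print("ISAM after a factor spanning x0 and x2:")
isam.print()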

Solution and Marginals

Since ISAM inherits from BayesTree, you can use inherited methods such as optimize() and marginalFactor() after performing updates.

# Get the solution from the final ISAM state
solution = isam.optimize()
print("Optimized Solution after updates:")
solution.print()
Optimized Solution after updates:
VectorValues: 3 elements
  x0: 0
  x1: 0
  x2: 0
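A brief sketch of marginalFactor(), mentioned above; this assumes the Python wrapper exposes it with a single key argument, as in the C++ BayesTree interface:

# Query the marginal density on x1 (returned as a conditional with no parents)
marginal_x1 = isam.marginalFactor(X(1))
marginal_x1.print("Marginal on x1:\n")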