
HybridGaussianConditional


A HybridGaussianConditional represents a conditional probability distribution P(X | M, Z) where the frontal variables X are continuous, and the parent variables include both discrete variables M and potentially other continuous variables Z. It inherits from HybridGaussianFactor and Conditional<HybridGaussianFactor, HybridGaussianConditional>.

Essentially, it’s a collection of GaussianConditionals, where the choice of which GaussianConditional applies depends on the assignment m of the discrete parent variables M:

P(X | M=m, Z) = P_m(X | Z)

where P_m(X | Z) is a standard GaussianConditional.
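The mode-dispatch idea can be sketched with plain NumPy (hypothetical names, not the GTSAM API): keep one Gaussian per discrete assignment and select by m before evaluating.

```python
import numpy as np

# Hypothetical stand-in for P(X | M): one (mean, sigma) Gaussian per mode m.
components = {
    0: (0.0, 1.0),  # m = 0 -> N(0, 1)
    1: (5.0, 4.0),  # m = 1 -> N(5, sigma=4)
}

def log_density(x, m):
    """Select the component for mode m, then evaluate its 1-D Gaussian log-density."""
    mean, sigma = components[m]
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mean) / sigma) ** 2

print(log_density(0.1, 0))  # log N(0.1; 0, 1) ~ -0.92394
print(log_density(4.0, 1))  # log N(4.0; 5, sigma=4) ~ -2.33648
```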

Internally, it uses a DecisionTree<Key, GaussianConditional::shared_ptr> to store the different conditional components, indexed by the discrete parent DiscreteKeys M. Note that unlike HybridGaussianFactor, the scalar value stored alongside the factor in the base class represents the negative log normalization constant for that branch, ensuring each P_m(X | Z) integrates to 1.
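For a scalar branch that constant is just 0.5·log(2πσ²); a quick NumPy hand check (sigma values chosen to match the branches printed later in this notebook):

```python
import numpy as np

def neg_log_constant(sigma):
    # Negative log of the 1-D Gaussian normalization constant 1/sqrt(2*pi*sigma^2)
    return 0.5 * np.log(2 * np.pi * sigma**2)

print(neg_log_constant(1.0))  # ~0.918939 (unit-noise branch)
print(neg_log_constant(4.0))  # ~2.305233 (branch with conditional sigma 4)
```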

These conditionals are the standard result of eliminating continuous variables when discrete variables are present in the separator during hybrid elimination.

import gtsam
import numpy as np

from gtsam import HybridGaussianConditional, GaussianConditional
from gtsam.symbol_shorthand import X, D

Initialization

A HybridGaussianConditional can be initialized with one or more discrete parent keys and the corresponding GaussianConditional components, one per discrete assignment.

# --- Example: Single Discrete Parent P(X0 | D0) ---
dk0 = (D(0), 2) # Binary discrete parent D0

# Gaussian conditional for mode D0=0: P(X0|D0=0) = N(0, 1)
# R=1, d=0
gc0 = GaussianConditional(X(0), np.zeros(1), np.eye(1), gtsam.noiseModel.Unit.Create(1))

# Gaussian conditional for mode D0=1: P(X0|D0=1) = N(5, 16)
# R = 0.5, d = R*mean = 0.5*5 = 2.5
# With a sigma=2.0 noise model, the conditional sigma is sigma/R = 4
noise1 = gtsam.noiseModel.Isotropic.Sigma(1, 2.0)
gc1 = GaussianConditional(X(0), np.array([2.5]), np.eye(1)*0.5, noise1)

hgc1 = HybridGaussianConditional(dk0, [gc0, gc1])
print("HybridGaussianConditional P(X0 | D0):")
hgc1.print()
HybridGaussianConditional P(X0 | D0):
HybridGaussianConditional

 P( x0 | d0)
 Discrete Keys = (d0, 2), 
 logNormalizationConstant: -0.918939

 Choice(d0) 
 0 Leaf p(x0)
  R = [ 1 ]
  d = [ 0 ]
  mean: 1 elements
  x0: 0
  logNormalizationConstant: -0.918939
  Noise model: unit (1) 

 1 Leaf p(x0)
  R = [ 0.5 ]
  d = [ 2.5 ]
  mean: 1 elements
  x0: 5
  logNormalizationConstant: -2.30523
isotropic dim=1 sigma=2

Accessing Components and Properties

You can retrieve the underlying decision tree of conditionals and select a specific GaussianConditional based on a discrete assignment.

# Using hgc1 from Example 1: P(X0 | D0) with modes N(0,1) and N(5,16)
# Get the GaussianConditional for a specific assignment using choose() or ()
assignment0 = gtsam.DiscreteValues()
assignment0[D(0)] = 0
selected_gc0 = hgc1.choose(assignment0) # or hgc1(assignment0)
print("\nSelected GaussianConditional for D0=0:")
selected_gc0.print() # Should be N(0, 1)

assignment1 = gtsam.DiscreteValues()
assignment1[D(0)] = 1
selected_gc1 = hgc1(assignment1)
print(f"\nSelected GaussianConditional for D0=1\n: {selected_gc1}") # Should be N(5, 16)

# Access conditional properties
print(f"Discrete Parents (part of HybridFactor): {hgc1.discreteKeys()}")
print(f"Continuous Parents: {hgc1.continuousParents()}") # Convenience method

Selected GaussianConditional for D0=0:
GaussianConditional p(x0)
  R = [ 1 ]
  d = [ 0 ]
  mean: 1 elements
  x0: 0
  logNormalizationConstant: -0.918939
  Noise model: unit (1) 

Selected GaussianConditional for D0=1
: (GaussianConditional p(x0)
  R = [ 0.5 ]
  d = [ 2.5 ]
  mean: 1 elements
  x0: 5
  logNormalizationConstant: -2.30523
isotropic dim=1 sigma=2
, 1.3862943611198908)
Discrete Parents (part of HybridFactor): 
d0 2

Continuous Parents: []

Evaluation (logProbability, evaluate)

These methods evaluate the probability density P_m(X | Z) corresponding to the selected discrete mode m. They require a HybridValues object containing assignments for all involved variables (frontal X, discrete parents M, continuous parents Z).
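As a hand check, independent of GTSAM, both log-probabilities computed in the cell below are ordinary 1-D Gaussian log-densities:

```python
import numpy as np

def gaussian_log_density(x, mean, sigma):
    # log of the 1-D normal density N(x; mean, sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mean) / sigma) ** 2

# Mode 0: N(0.1; 0, 1); Mode 1: N(4.0; 5, sigma=4)
print(gaussian_log_density(0.1, 0.0, 1.0))  # ~ -0.923939
print(gaussian_log_density(4.0, 5.0, 4.0))  # ~ -2.336483
```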

# Using hgc1: P(X0 | D0) with modes N(0,1) and N(5,16)

# --- Evaluate mode 0: P(X0=0.1 | D0=0) ---
hybrid_vals0 = gtsam.HybridValues()
hybrid_vals0.insert(X(0), np.array([0.1])) # Frontal value
hybrid_vals0.insert(D(0), 0)             # Discrete parent assignment

log_prob0 = hgc1.logProbability(hybrid_vals0)
prob0 = hgc1.evaluate(hybrid_vals0)
# Expected: log density of N(0.1; 0, 1)
print(f"\nP(X0=0.1 | D0=0): LogProb={log_prob0}, Prob={prob0}")

# --- Evaluate mode 1: P(X0=4.0 | D0=1) ---
hybrid_vals1 = gtsam.HybridValues()
hybrid_vals1.insert(X(0), np.array([4.0])) # Frontal value
hybrid_vals1.insert(D(0), 1)             # Discrete parent assignment

log_prob1 = hgc1.logProbability(hybrid_vals1)
prob1 = hgc1.evaluate(hybrid_vals1)
# Expected: log density of N(4.0; 5, 16)
print(f"P(X0=4.0 | D0=1): LogProb={log_prob1}, Prob={prob1}")

# Note: The error() method inherited from HybridGaussianFactor returns the
# Gaussian negative log-likelihood *minus* the negative log normalization
# constant stored in the tree, i.e. just the quadratic term 0.5*||R x - d||^2:
# error = -log P(x|m,z) - negLogConstant
err0 = hgc1.error(hybrid_vals0)
print(f"Error(X0=0.1|D0=0): {err0} (Should be approx -log_prob0 - negLogConstant(gc0))")
neg_log_const_0 = gc0.negLogConstant() # Should be ~0.9189
print(f"  Check: {-log_prob0 - neg_log_const_0}")

P(X0=0.1 | D0=0): LogProb=-0.9239385332046728, Prob=0.39695254747701175
P(X0=4.0 | D0=1): LogProb=-2.3364828943245635, Prob=0.0966670292007123
Error(X0=0.1|D0=0): 0.005000000000000001 (Should be approx -log_prob0 - negLogConstant(gc0))
  Check: 0.0050000000000000044
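That relationship can be verified with plain arithmetic (a NumPy sketch under the same N(0, 1) assumption as mode 0):

```python
import numpy as np

x, mean, sigma = 0.1, 0.0, 1.0
log_prob = -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mean) / sigma) ** 2
neg_log_const = 0.5 * np.log(2 * np.pi * sigma**2)

# error() drops the normalization constant, leaving only the quadratic term:
error = -log_prob - neg_log_const
print(error)  # 0.5 * 0.1**2 = 0.005
```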