
Conditional

gtsam.Conditional is the base class for conditional probability distributions or densities that result from variable elimination.

Let $F$ be the set of frontal variables and $S$ be the set of parent (separator) variables. A conditional represents:

$$P(F \mid S)$$

The methods evaluate, logProbability, and error are related:

$$\text{evaluate}(F, S) = P(F \mid S)$$

$$\text{logProbability}(F, S) = \log P(F \mid S)$$

$$\text{logProbability}(F, S) = -(\text{negLogConstant} + \text{error}(F, S))$$

where negLogConstant is $-\log k$ for the normalization constant $k$ ensuring $\int P(F \mid S)\, dF = 1$.
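As a quick sanity check of this identity, here is a minimal sketch using a standalone GaussianConditional built via FromMeanAndStddev. It assumes a recent GTSAM build whose Python wrapper exposes negLogConstant(); older builds expose the same quantity, with opposite sign, as logNormalizationConstant().

import gtsam
import numpy as np

X0 = gtsam.symbol_shorthand.X(0)

# P(x0) = N(0, 1), a conditional with no parents
c = gtsam.GaussianConditional.FromMeanAndStddev(X0, np.zeros(1), 1.0)

v = gtsam.VectorValues()
v.insert(X0, np.array([1.0]))

lhs = c.logProbability(v)                 # log N(1; 0, 1)
rhs = -(c.negLogConstant() + c.error(v))  # -(0.9189... + 0.5)
print(lhs, rhs)                           # both ≈ -1.4189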

Like gtsam.Factor, you typically don’t instantiate gtsam.Conditional directly. Instead, you work with derived classes obtained from elimination, such as:

  • gtsam.GaussianConditional
  • gtsam.DiscreteConditional
  • gtsam.HybridGaussianConditional
  • gtsam.SymbolicConditional
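While elimination is the usual source of these objects (as in the example below), most can also be constructed directly. For instance, here is a minimal sketch of a DiscreteConditional P(A|B) built from a signature string; the keys and cardinalities are made up for illustration:

import gtsam

# Discrete keys are (id, cardinality) pairs
A = (0, 2)
B = (1, 2)

# P(A|B): one "p0/p1" column per parent assignment
pA_given_B = gtsam.DiscreteConditional(A, [B], "4/6 3/7")
pA_given_B.print("P(A|B):\n")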

This notebook demonstrates the common interface provided by the base class.


import gtsam
import numpy as np

# We need concrete graph types and elimination to get a Conditional
from gtsam import GaussianFactorGraph, Ordering
from gtsam import symbol_shorthand

X = symbol_shorthand.X
L = symbol_shorthand.L

Example: Obtaining and Inspecting a Conditional

We’ll create a simple GaussianFactorGraph and eliminate one variable to get a GaussianConditional.

# Create a simple Gaussian Factor Graph P(x0) P(x1|x0)
graph = GaussianFactorGraph()
model1 = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)
model2 = gtsam.noiseModel.Isotropic.Sigma(1, 1.0)

# Prior on x0
graph.add(X(0), -np.eye(1), np.zeros(1), model1)
# Factor between x0 and x1
graph.add(X(0), -np.eye(1), X(1), np.eye(1), np.zeros(1), model2)

print("Eliminating x0 from graph:")
graph.print()

# Eliminate x0
ordering = Ordering([X(0)])
bayes_net, remaining_graph = graph.eliminatePartialSequential(ordering)

print("\nResulting BayesNet:")
bayes_net.print()

# Get the resulting conditional P(x0 | x1)
# In this case, it's a GaussianConditional
conditional = bayes_net.at(0) # or bayes_net[0]

# Access methods from the Conditional base class
print(f"Conditional Keys (all): {conditional.keys()}")
print(f"First Frontal Key: {conditional.firstFrontalKey()} ({gtsam.DefaultKeyFormatter(conditional.firstFrontalKey())})")

# Conditional objects can also be printed
# conditional.print("P(x0 | x1): ")
Eliminating x0 from graph:

size: 2
factor 0: 
  A[x0] = [
	-1
]
  b = [ 0 ]
  Noise model: unit (1) 
factor 1: 
  A[x0] = [
	-1
]
  A[x1] = [
	1
]
  b = [ 0 ]
  Noise model: unit (1) 

Resulting BayesNet:

size: 1
conditional 0:  p(x0 | x1)
  R = [ 1.41421 ]
  S[x1] = [ -0.707107 ]
  d = [ 0 ]
  logNormalizationConstant: -0.572365
  No noise model
Conditional Keys (all): [8646911284551352320, 8646911284551352321]
First Frontal Key: 8646911284551352320 (x0)
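The base class also reports how the keys split into frontal and parent variables. A short sketch, assuming nrFrontals() and nrParents() are exposed by the Python wrapper (they are part of the C++ Conditional interface):

print(f"nrFrontals: {conditional.nrFrontals()}")  # 1 (x0)
print(f"nrParents: {conditional.nrParents()}")    # 1 (x1)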

Evaluation (Derived Class Methods)

Concrete conditional classes provide methods like logProbability(values) or evaluate(values) to compute the conditional probability (or density), given values for both the frontal and parent variables. These methods are defined in the derived classes, not the Conditional base class itself.

# Example for GaussianConditional (requires VectorValues)
vector_values = gtsam.VectorValues()
vector_values.insert(X(0), np.array([0.0])) # Value for frontal variable
vector_values.insert(X(1), np.array([1.0])) # Value for parent variable

# These methods are specific to GaussianConditional / other concrete types
try:
    log_prob = conditional.logProbability(vector_values)
    print(f"\nLog Probability P(x0|x1=1.0): {log_prob}")
    prob = conditional.evaluate(vector_values)
    print(f"Probability P(x0|x1=1.0): {prob}")
except AttributeError:
    # If the wrapper only returned the Conditional base class, these
    # derived-class methods would not be available. (In C++, you would
    # typically hold a shared_ptr<GaussianConditional>.)
    print("\nNote: logProbability/evaluate require the derived type, not the base Conditional.")

# You can also evaluate through the enclosing BayesNet/BayesTree directly
bayes_net.logProbability(vector_values)

Log Probability P(x0|x1=1.0): -0.8223649429247
Probability P(x0|x1=1.0): 0.43939128946772243
-0.8223649429247
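As a final consistency check, exponentiating the Bayes net's log-probability should reproduce the density computed by evaluate above:

# exp(logProbability) recovers the density from evaluate
print(np.exp(bayes_net.logProbability(vector_values)))  # ≈ 0.4394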