HybridConditional acts as a type-erased wrapper for the different kinds of conditional distributions that can appear in a HybridBayesNet or HybridBayesTree. It allows these containers to hold conditionals resulting from eliminating different types of variables (discrete, continuous, or mixtures) without needing to be templated on the specific conditional type.
A HybridConditional object internally holds a shared pointer to one of the following concrete conditional types:
gtsam.GaussianConditional
gtsam.DiscreteConditional
gtsam.HybridGaussianConditional
It inherits from HybridFactor and Conditional<HybridFactor, HybridConditional>, providing access to both factor-like properties (keys) and conditional-like properties (frontals, parents).
import gtsam
import numpy as np
from gtsam import (
    GaussianConditional,
    DiscreteConditional,
    HybridConditional,
    HybridGaussianConditional,
)
from gtsam.symbol_shorthand import X, D
Initialization¶
A HybridConditional is created by wrapping a shared pointer to one of the concrete conditional types. These concrete conditionals are usually obtained from factor graph elimination.
# --- Create concrete conditionals (examples) ---
# 1. GaussianConditional P(X0 | X1)
gc = GaussianConditional(X(0), np.array([1.0]), np.eye(1) * 2.0,   # frontal X0: d, R
                         X(1), np.array([[0.5]]),                  # parent X1: S
                         gtsam.noiseModel.Diagonal.Sigmas([0.5]))  # noise sigma = 0.5
# 2. DiscreteConditional P(D0 | D1) (D0, D1 binary)
dk0 = (D(0), 2)
dk1 = (D(1), 2)
dc = DiscreteConditional(dk0, [dk1], "4/1 1/4") # P(D0|D1=0) = 80/20, P(D0|D1=1) = 20/80
# 3. HybridGaussianConditional P(X2 | D2) (X2 1D, D2 binary)
dk2 = (D(2), 2)
# Mode 0: P(X2 | D2=0) = N(mean=0, sigma=1) -> R=1, d=0, unit noise
hgc_gc0 = GaussianConditional(X(2), np.zeros(1), np.eye(1), gtsam.noiseModel.Unit.Create(1))
# Mode 1: P(X2 | D2=1) = N(mean=5, sigma=0.25) -> R=2, d=10, noise sigma=0.5
#         (mean = d/R = 10/2, sigma = noise_sigma/R = 0.5/2)
hgc_gc1 = GaussianConditional(X(2), np.array([10.0]), np.eye(1) * 2.0, gtsam.noiseModel.Isotropic.Sigma(1, 0.5))
# This constructor takes a vector of conditionals directly if their frontals and continuous parents match
hgc = HybridGaussianConditional(dk2, [hgc_gc0, hgc_gc1])
# --- Wrap them into HybridConditionals ---
hybrid_cond_g = HybridConditional(gc)
hybrid_cond_d = HybridConditional(dc)
hybrid_cond_h = HybridConditional(hgc)
print("HybridConditional from GaussianConditional:")
hybrid_cond_g.print()
print("\nHybridConditional from DiscreteConditional:")
hybrid_cond_d.print()
print("\nHybridConditional from HybridGaussianConditional:")
hybrid_cond_h.print()
HybridConditional from GaussianConditional:
Hybrid Conditional
p(x0 | x1)
R = [ 2 ]
S[x1] = [ 0.5 ]
d = [ 1 ]
logNormalizationConstant: 0.467356
isotropic dim=1 sigma=0.5
HybridConditional from DiscreteConditional:
Hybrid Conditional
P( d0 | d1 ):
Choice(d1)
0 Choice(d0)
0 0 Leaf 0.8
0 1 Leaf 0.2
1 Choice(d0)
1 0 Leaf 0.2
1 1 Leaf 0.8
HybridConditional from HybridGaussianConditional:
Hybrid Conditional
P( x2 | d2)
Discrete Keys = (d2, 2),
logNormalizationConstant: 0.467356
Choice(d2)
0 Leaf p(x2)
R = [ 1 ]
d = [ 0 ]
mean: 1 elements
x2: 0
logNormalizationConstant: -0.918939
Noise model: unit (1)
1 Leaf p(x2)
R = [ 2 ]
d = [ 10 ]
mean: 1 elements
x2: 5
logNormalizationConstant: 0.467356
isotropic dim=1 sigma=0.5
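A note on reading the printout above: a GaussianConditional stores R, d, and a noise sigma, and the implied distribution over the frontal variable follows from R * x = d + noise, so x ~ N(d/R, (sigma/R)^2). A quick plain-Python sanity check of the two modes (no GTSAM required):

```python
# Each mode is R * x2 = d + noise(sigma), so x2 ~ N(d / R, (sigma / R)^2).
# Mode 0: R = 1, d = 0, unit noise (sigma = 1)
mean0, std0 = 0.0 / 1.0, 1.0 / 1.0
# Mode 1: R = 2, d = 10, sigma = 0.5
mean1, std1 = 10.0 / 2.0, 0.5 / 2.0

print(f"Mode 0: mean={mean0}, sigma={std0}")  # mean=0.0, sigma=1.0
print(f"Mode 1: mean={mean1}, sigma={std1}")  # mean=5.0, sigma=0.25
```

The means 0 and 5 match the `x2:` lines in the printed modes.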
Accessing Information and Inner Type¶
You can access keys, frontals, and parents like any conditional. You can also check the underlying type and attempt to cast back to the concrete type.
print("\n--- Inspecting HybridConditional from Gaussian ---")
print(f"Keys: {hybrid_cond_g.keys()}")
print(f"Frontals: {hybrid_cond_g.nrFrontals()}")
print(f"Parents: {hybrid_cond_g.nrParents()}")
print(f"Is Continuous? {hybrid_cond_g.isContinuous()}") # True
print(f"Is Discrete? {hybrid_cond_g.isDiscrete()}") # False
print(f"Is Hybrid? {hybrid_cond_g.isHybrid()}") # False
# Try casting back
inner_gaussian = hybrid_cond_g.asGaussian()
if inner_gaussian:
    print("Successfully cast back to GaussianConditional:")
    inner_gaussian.print()
else:
    print("Failed to cast back to GaussianConditional.")
inner_discrete = hybrid_cond_g.asDiscrete()
print(f"Cast back to DiscreteConditional successful? {inner_discrete is not None}")
print("\n--- Inspecting HybridConditional from Hybrid ---")
print(f"Keys: {hybrid_cond_h.keys()}")
print(f"Frontals: {hybrid_cond_h.nrFrontals()}")
print(f"Parents: {hybrid_cond_h.nrParents()}") # Contains continuous AND discrete parents
print(f"Continuous Keys: {hybrid_cond_h.continuousKeys()}")
print(f"Discrete Keys: {hybrid_cond_h.discreteKeys()}")
print(f"Is Continuous? {hybrid_cond_h.isContinuous()}") # False
print(f"Is Discrete? {hybrid_cond_h.isDiscrete()}") # False
print(f"Is Hybrid? {hybrid_cond_h.isHybrid()}") # True
# Try casting back
inner_hybrid = hybrid_cond_h.asHybrid()
if inner_hybrid:
    print("Successfully cast back to HybridGaussianConditional.")
else:
    print("Failed to cast back to HybridGaussianConditional.")
--- Inspecting HybridConditional from Gaussian ---
Keys: [8646911284551352320, 8646911284551352321]
Frontals: 1
Parents: 1
Is Continuous? True
Is Discrete? False
Is Hybrid? False
Successfully cast back to GaussianConditional:
GaussianConditional p(x0 | x1)
R = [ 2 ]
S[x1] = [ 0.5 ]
d = [ 1 ]
logNormalizationConstant: 0.467356
isotropic dim=1 sigma=0.5
Cast back to DiscreteConditional successful? False
--- Inspecting HybridConditional from Hybrid ---
Keys: [8646911284551352322, 7205759403792793602]
Frontals: 1
Parents: 1
Continuous Keys: [8646911284551352322]
Discrete Keys:
d2 2
Is Continuous? False
Is Discrete? False
Is Hybrid? True
Successfully cast back to HybridGaussianConditional.
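The `Keys:` lists above are raw 64-bit GTSAM keys. With the standard gtsam.Symbol encoding, the symbol character occupies the top 8 bits and the index the low 56 bits, so the integers can be decoded by hand (`gtsam.Symbol(key)` does the same; this is just a plain-Python illustration of the packing):

```python
def decode_symbol(key):
    """Split a GTSAM Symbol key into (character, index),
    assuming the standard 8-bit-char / 56-bit-index packing."""
    return chr(key >> 56), key & ((1 << 56) - 1)

print(decode_symbol(8646911284551352320))  # ('x', 0)
print(decode_symbol(8646911284551352321))  # ('x', 1)
print(decode_symbol(7205759403792793602))  # ('d', 2)
```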
Evaluation (error, logProbability, evaluate)¶
These methods delegate to the underlying concrete conditional's implementation. They require a HybridValues object containing assignments for all involved variables (frontals and parents).
# --- Evaluate the Gaussian Conditional P(X0 | X1) ---
vals_g = gtsam.HybridValues()
vals_g.insert(X(0), np.array([2.0])) # Frontal
vals_g.insert(X(1), np.array([1.0])) # Parent
err_g = hybrid_cond_g.error(vals_g)
log_prob_g = hybrid_cond_g.logProbability(vals_g)
prob_g = hybrid_cond_g.evaluate(vals_g) # Equivalent to exp(logProbability)
print(f"\nGaussian HybridConditional P(X0=2|X1=1):")
print(f" Error: {err_g}")
print(f" LogProbability: {log_prob_g}")
print(f" Probability: {prob_g}")
# --- Evaluate the Discrete Conditional P(D0 | D1) ---
vals_d = gtsam.HybridValues()
vals_d.insert(D(0), 1) # Frontal = 1
vals_d.insert(D(1), 0) # Parent = 0
err_d = hybrid_cond_d.error(vals_d) # -log(P(D0=1|D1=0)) = -log(0.2)
log_prob_d = hybrid_cond_d.logProbability(vals_d) # log(0.2)
prob_d = hybrid_cond_d.evaluate(vals_d) # 0.2
print(f"\nDiscrete HybridConditional P(D0=1|D1=0):")
print(f" Error: {err_d}")
print(f" LogProbability: {log_prob_d}")
print(f" Probability: {prob_d}")
# --- Evaluate the Hybrid Gaussian Conditional P(X2 | D2) ---
vals_h = gtsam.HybridValues()
vals_h.insert(X(2), np.array([4.5])) # Frontal
vals_h.insert(D(2), 1) # Parent (selects mode 1: mean 5, sigma 0.25)
err_h = hybrid_cond_h.error(vals_h)
log_prob_h = hybrid_cond_h.logProbability(vals_h)
prob_h = hybrid_cond_h.evaluate(vals_h)
print(f"\nHybrid Gaussian HybridConditional P(X2=4.5|D2=1):")
print(f" Error: {err_h}")
print(f" LogProbability: {log_prob_h}")
print(f" Probability: {prob_h}")
Gaussian HybridConditional P(X0=2|X1=1):
Error: 24.5
LogProbability: -24.032644172084783
Probability: 3.6538881633458336e-11
Discrete HybridConditional P(D0=1|D1=0):
Error: 1.6094379124341003
LogProbability: -1.6094379124341003
Probability: 0.2
Hybrid Gaussian HybridConditional P(X2=4.5|D2=1):
Error: 2.0
LogProbability: -1.5326441720847823
Probability: 0.21596386605275217
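All of these numbers can be reproduced by hand from the parameters used above, assuming GTSAM's convention logProbability = logNormalizationConstant - error, with the 1-D Gaussian normalization constant log(R/sigma) - 0.5*log(2*pi). Both the first Gaussian and mode 1 of the hybrid happen to share R = 2, sigma = 0.5, hence the same printed constant 0.467356:

```python
import math

LOG_K = math.log(2.0 / 0.5) - 0.5 * math.log(2.0 * math.pi)  # ~0.467356

# Gaussian P(X0=2 | X1=1): whitened residual (R*x0 + S*x1 - d) / sigma
r = (2.0 * 2.0 + 0.5 * 1.0 - 1.0) / 0.5  # = 7.0
err_g = 0.5 * r**2                       # 24.5
log_prob_g = LOG_K - err_g               # -24.0326...

# Discrete P(D0=1 | D1=0) = 0.2: error is the negative log probability
err_d = -math.log(0.2)                   # 1.6094...

# Hybrid P(X2=4.5 | D2=1), mode 1: R=2, d=10, sigma=0.5
r_h = (2.0 * 4.5 - 10.0) / 0.5           # = -2.0
err_h = 0.5 * r_h**2                     # 2.0
prob_h = math.exp(LOG_K - err_h)         # 0.2159...

print(err_g, log_prob_g, err_d, err_h, prob_h)
```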
Restriction (restrict)¶
The restrict method allows fixing the discrete parent variables, potentially simplifying the conditional (e.g., a HybridGaussianConditional might become a GaussianConditional).
# Restrict the HybridGaussianConditional P(X2 | D2)
assignment = gtsam.DiscreteValues()
assignment[D(2)] = 1 # Fix D2 to mode 1
restricted_factor = hybrid_cond_h.restrict(assignment)
restricted_factor.print("\nRestricted HybridConditional (D2=1):")
Restricted HybridConditional (D2=1):p(x2)
R = [ 2 ]
d = [ 10 ]
mean: 1 elements
x2: 5
logNormalizationConstant: 0.467356
isotropic dim=1 sigma=0.5
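As a plain-Python sanity check on this restriction: the printed conditional with R = 2, d = 10, sigma = 0.5 is the density of x2 ~ N(mean=5, sigma=0.25), and evaluating that density at x2 = 4.5 reproduces the probability the un-restricted hybrid conditional gave in the evaluation section:

```python
import math

# Parameters of the restricted (mode 1) conditional: mean = d/R, sigma = noise_sigma/R
mean, sigma = 10.0 / 2.0, 0.5 / 2.0  # (5.0, 0.25)
p = math.exp(-0.5 * ((4.5 - mean) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
print(p)  # ~0.21596, matching hybrid_cond_h.evaluate for X2=4.5, D2=1
```

Restricting with assignment[D(2)] = 0 would instead return the unit-noise mode p(x2) = N(0, 1).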