Purpose and Audience¶
ExtendedPriorFactor is the generalized building block that underlies PriorFactor and other soft anchoring mechanisms. It lets you express an (optionally shifted) Gaussian (or robust) likelihood in the tangent space of an arbitrary manifold value type. This notebook targets advanced GTSAM users designing custom factors, experimenting with non-zero tangent-space means, or working with non-traditional manifold types.
Mathematical Definition¶
Given a value type T (Lie group or general manifold) with a local-coordinates operator Local(x, y) (mapping from the manifold to the tangent space at x), an origin value o, an optional tangent-space mean vector m (default 0), and a noise model with (robust) loss ρ, the factor defines:
Raw error:

$$ e(x) = -\mathrm{Local}(x, o) - m \qquad (\text{if } m \text{ is provided; else } e(x) = -\mathrm{Local}(x, o)) $$

Loss:

$$ \mathrm{loss}(x) = \rho\left( \lVert e(x) \rVert^2_{\Sigma} \right) $$

where $\Sigma$ is the covariance inside the noise model.

Likelihood:

$$ L(x) = \exp(-\mathrm{loss}(x)) $$
For Lie groups where $\mathrm{Local}(x, o) = \mathrm{Log}(x^{-1} o)$, this is (up to an additive constant in the exponent) the extended concentrated Gaussian:

$$ L(x) \propto \exp\left( - \lVert \mathrm{Log}(x^{-1} o) + m \rVert^2_{\Sigma} \right) $$

Choosing a non-zero mean m effectively shifts the maximum-likelihood point from o to `o.retract(m)`.
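To make the definitions concrete, here is a minimal hand computation for Pose2. This is a sketch, assuming Local(x, o) corresponds to `x.localCoordinates(o)` in the Python API and GTSAM's usual convention that the Gaussian loss carries a factor of ½:

```python
# Sketch: hand-computing e(x), loss, and likelihood for Pose2.
# Assumes Local(x, o) == x.localCoordinates(o) and loss = 0.5 * ||e||^2_Sigma.
import numpy as np
from gtsam import Pose2

origin = Pose2(1.0, 2.0, 0.3)
m = np.array([0.1, 0.2, 0.05])        # tangent-space mean
sigma = 0.5                           # isotropic standard deviation

x = origin.retract(m)                 # candidate value at the shifted mode
e = -x.localCoordinates(origin) - m   # raw error e(x)
loss = 0.5 * float(e @ e) / sigma**2  # ||e||^2_Sigma for an isotropic model
print(e, np.exp(-loss))               # near-zero residual, likelihood ~ 1
```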
Relation to PriorFactor¶
`PriorFactor<T>` is a specialization that sets `m = 0` and names the origin `prior`. It also implements the Jacobian as identity by using the `-Local(x, prior)` form, which avoids computing derivatives of `Local`.
Use PriorFactor unless you explicitly need a shifted tangent-space mean or want to treat the resulting distribution as a likelihood (e.g., for probabilistic weighting or annealing strategies).
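As a quick sanity check, the sketch below (assuming the `ExtendedPriorFactorPose2` wrapper name used later in this notebook) shows that a zero tangent-space mean reproduces PriorFactor's error:

```python
# Sketch: zero tangent-space mean reduces ExtendedPriorFactor to PriorFactor.
import numpy as np
import gtsam
from gtsam import Pose2, noiseModel

model = noiseModel.Isotropic.Sigma(3, 0.5)
origin = Pose2(1.0, 2.0, 0.3)
prior = gtsam.PriorFactorPose2(1, origin, model)
extended = gtsam.ExtendedPriorFactorPose2(1, origin, model, np.zeros(3))

values = gtsam.Values()
values.insert(1, Pose2(1.1, 2.1, 0.35))
print(prior.error(values), extended.error(values))  # should match
```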
When to Use ExtendedPriorFactor Directly¶
- You need a soft constraint centered at `origin.retract(mean)` but want to keep a fixed linearization frame at `origin` for numerical reasons.
- You are implementing a factor that conceptually models a latent density over a manifold variable (e.g., pose priors from learned models producing mean offset vectors).
- You want to evaluate normalized or relative likelihoods on candidate values (via the `likelihood(x)` convenience method).
- You work with a robust noise model to downweight outliers in the tangent residual of a pseudo-prior.
- You are prototyping new manifold types: not having to provide analytic Jacobians for local coordinates (beyond the identity) can simplify early experiments.
C++ API Recap¶
Constructor (templated on VALUE):
```cpp
ExtendedPriorFactor(Key key, const VALUE& origin,
                    const SharedNoiseModel& model,
                    const std::optional<Vector>& mean = {});
```

Key methods:

- `Vector evaluateError(const VALUE& x, OptionalMatrixType H)` returns the tangent residual (with identity Jacobian if `H` is requested).
- `double error(const VALUE& x)` returns the (robust) negative log-likelihood.
- `double likelihood(const VALUE& x)` returns `exp(-error(x))`.

Accessors: `origin()`, `mean()`.
Practical Example (Pose2)¶
We replicate the core logic of the unit tests (see nonlinear/tests/testExtendedPriorFactor.cpp) using Python bindings.
```python
# Example: Pose2 ExtendedPriorFactor usage
import math
import numpy as np
import gtsam
from gtsam import Pose2, noiseModel
key = 1
origin = Pose2(1.0, 2.0, 0.3)
model = noiseModel.Isotropic.Sigma(3, 0.5) # std dev 0.5 in all dims
mean = np.array([0.1, 0.2, 0.05]) # tangent space shift
factor = gtsam.ExtendedPriorFactorPose2(key, origin, model, mean)
# Evaluate at origin.retract(mean): should yield near-zero residual and likelihood ~ 1
x_mode = origin.retract(mean)
residual_at_mode = factor.evaluateError(x_mode)
likelihood_at_mode = factor.likelihood(x_mode)
residual_at_mode, likelihood_at_mode
```

```
(array([ 0.00987086, -0.00524786,  0.        ]), 0.9997500833133725)
```

Interpreting the Residual¶
Because the reported Jacobian is the identity, this factor is numerically stable and cheap: linearization contributes just an identity block. The residual is the signed difference in local coordinates, shifted by the mean.
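To see the identity-Jacobian claim concretely, the sketch below linearizes the factor at `origin` (reusing `factor`, `key`, `origin`, and the `gtsam` import from the example above) and inspects the whitened Jacobian, which should be the identity scaled by 1/σ. This assumes the linearized factor exposes `jacobian()` as in GTSAM's Python wrapper:

```python
# Sketch: inspect the linearization; reuses factor, key, origin from above.
values = gtsam.Values()
values.insert(key, origin)
linear = factor.linearize(values)
A, b = linear.jacobian()  # whitened Jacobian and right-hand side
print(A)                  # expect (1/sigma) * identity = 2 * I for sigma 0.5
```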
Adding to a Graph¶
You can add ExtendedPriorFactor directly to a NonlinearFactorGraph, but unlike PriorFactor, there is no `addPrior<T>()` sugar method; you construct and add the factor explicitly.
```python
from gtsam import NonlinearFactorGraph, Values, LevenbergMarquardtOptimizer
graph = NonlinearFactorGraph()
graph.push_back(factor) # our ExtendedPriorFactor
initial = Values()
# Provide an initial guess somewhat away from the mode
initial.insert(key, Pose2(1.2, 1.9, 0.25))
params = gtsam.LevenbergMarquardtParams()
optimizer = LevenbergMarquardtOptimizer(graph, initial, params)
result = optimizer.optimize()
final_pose = result.atPose2(key)
final_pose
```

```
(1.02536, 2.22216, 0.35)
```

Design Notes and Pitfalls¶
- The frame of the tangent space is fixed at `origin`. A very large mean might implicitly request motion far from the linearization region; consider re-centering or using a different modeling approach if optimization struggles.
- Robust models (e.g. `noiseModel.Robust.Create(mEstimator, baseModel)`) apply their loss to the squared Mahalanobis distance of the residual; this lets you treat the tangent residual as an outlier-prone quantity (see the sketch after this list).
- For custom manifold types, ensure `traits<T>::Local` and `retract` are implemented and consistent (`retract` is needed implicitly if you want to compute the mode value).
- Unlike some priors, this factor purposefully does not store or recompute a dynamic Jacobian; the identity assumption is a deliberate design choice.
- Serialization support relies on `NoiseModelFactor1` for backward compatibility; keep that in mind if extending the class.
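As a minimal sketch of the robust case (assuming the same wrapper name and the argument order from the C++ recap; the Huber threshold 1.345 is only an illustrative choice, not a value from the tests):

```python
# Sketch: robust variant of the pseudo-prior, reusing the Pose2 setup above.
import numpy as np
import gtsam
from gtsam import Pose2, noiseModel

base = noiseModel.Isotropic.Sigma(3, 0.5)        # Gaussian base model
huber = noiseModel.Robust.Create(
    noiseModel.mEstimator.Huber.Create(1.345),   # Huber threshold (illustrative)
    base)
# The robust loss rho is applied to the squared Mahalanobis distance of the
# tangent residual, downweighting outliers:
robust_factor = gtsam.ExtendedPriorFactorPose2(
    1, Pose2(1.0, 2.0, 0.3), huber, np.array([0.1, 0.2, 0.05]))
```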
When Not to Use¶
- If you simply need a zero-mean soft prior: prefer PriorFactor.
- If you require a hard equality constraint: consider `NonlinearEquality` (see the sketch below).
- If the residual should depend on another variable or measurement, design a custom factor instead of embedding complexity in `Local`.
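For completeness, a minimal sketch of the hard-constraint alternative, assuming the `NonlinearEqualityPose2` wrapper from the standard GTSAM Python bindings:

```python
# Sketch: a hard equality constraint as an alternative to a soft prior.
import gtsam
from gtsam import Pose2

graph = gtsam.NonlinearFactorGraph()
hard = gtsam.NonlinearEqualityPose2(1, Pose2(1.0, 2.0, 0.3))  # pins the variable
graph.push_back(hard)
```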
See Also¶
- Other manifold priors: `BetweenFactor`, `NonlinearEquality`, and robust variants
- Underlying noise models: `noiseModel::Isotropic`, `noiseModel::Diagonal`, robust creators