
Rotational and translational bias estimation based on depth and image measurements

Abstract: Constant biases associated with the measured linear and angular velocities of a moving object can be estimated from measurements of a static environment made by an embedded camera and depth sensor. We propose here a Lyapunov-based observer that exploits the SO(3)-invariance of the partial differential equations satisfied by the measured brightness and depth fields. The resulting observer is governed by a nonlinear integro/partial differential system whose inputs are the linear/angular velocities and the brightness/depth fields. Convergence is analyzed under C3 regularity assumptions on the object motion and its environment; technically, the analysis relies on the Ascoli-Arzelà theorem and on pre-compactness of the observer trajectories. It ensures asymptotic convergence of the estimated brightness and depth fields. Convergence of the estimated biases is characterized by constraints depending only on the environment. We conjecture that these constraints are automatically satisfied when the environment does not admit any rotational symmetry axis. Such asymptotic observers can be adapted to any realistic camera model. Preliminary simulations with synthetic image and depth data (corrupted by noise of around 10%) indicate that such Lyapunov-based observers converge under much weaker regularity assumptions.
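The mechanism behind such observers can be illustrated on a much simpler problem than the paper's SO(3) setting. The sketch below is a hypothetical one-dimensional analogue, not the authors' construction: brightness is transported on a periodic domain by a velocity measured up to a constant bias, and a Lyapunov-based observer estimates the bias from the brightness field and its spatial gradient. The transport model, gains, and fields are all assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical 1-D analogue (illustration only, not the paper's observer):
# the measured brightness y obeys the transport PDE  y_t + (v + b) y_x = 0
# on a periodic domain, where v is the measured velocity and b an unknown
# constant bias. Lyapunov-based observer:
#   yhat_t = -(v + bhat) yhat_x - k1 (yhat - y)
#   bhat'  = k2 * integral( (yhat - y) * y_x ) dx
# With V = 0.5 ||yhat - y||^2 + (bhat - b)^2 / (2 k2), these choices give
# V' = -k1 ||yhat - y||^2, so the field error decays and bhat drifts to b.

N = 64
dx = 2 * np.pi / N
x = np.arange(N) * dx
f = lambda s: np.sin(s) + 0.5 * np.cos(2 * s)  # static "environment"

b_true = 0.3        # unknown constant bias (to be estimated)
v_meas = 1.0        # biased velocity measurement; true speed is v_meas + b_true
k1, k2 = 2.0, 1.0   # observer gains (tuning assumptions)
dt, T = 1e-3, 30.0

yhat = np.zeros(N)  # observer field, arbitrary initial guess
bhat = 0.0          # bias estimate

def ddx(u):
    """Central finite difference on the periodic grid."""
    return (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)

t = 0.0
while t < T:
    y = f(x - (v_meas + b_true) * t)  # "measured" brightness field at time t
    e = yhat - y
    yhat += dt * (-(v_meas + bhat) * ddx(yhat) - k1 * e)
    bhat += dt * k2 * np.sum(e * ddx(y)) * dx
    t += dt

print(f"estimated bias: {bhat:.3f} (true: {b_true})")
```

Because the traveling brightness wave keeps the gradient term persistently exciting, the bias estimate converges; this mirrors, in miniature, the paper's condition that bias convergence depends only on properties of the environment.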
Document type :
Conference papers
Contributor: François Chaplais
Submitted on: Tuesday, February 12, 2013 - 11:12:53 PM
Last modification on: Wednesday, November 17, 2021 - 12:31:00 PM


  • HAL Id: hal-00787792, version 1


Nadège Zarrouati, Pierre Rouchon, Karine Beauchard. Rotational and translational bias estimation based on depth and image measurements. Conference on Decision and Control (2012), Dec 2012, France. pp. 6627-6634. ⟨hal-00787792⟩
