Combining visual and inertial measurements has become popular in mobile robotics, since the two sensing modalities offer complementary characteristics that make them the ideal choice for accurate visual–inertial odometry or simultaneous localization and mapping (SLAM). While historically the problem has been addressed with filtering, advances in visual estimation suggest that nonlinear optimization offers superior accuracy while remaining computationally tractable thanks to the sparsity of the underlying problem. Taking inspiration from these findings, we formulate a rigorously probabilistic cost function that combines reprojection errors of landmarks and inertial terms. The problem is kept tractable, and real-time operation is ensured, by limiting the optimization to a bounded window of keyframes through marginalization. Keyframes may be spaced in time by arbitrary intervals, while still being related by linearized inertial terms. We present evaluation results on complementary datasets recorded with our custom-built stereo visual–inertial hardware, which accurately synchronizes accelerometer and gyroscope measurements with imagery. We compare both a stereo and a monocular version of our algorithm, with and without online extrinsics estimation, against ground truth. Furthermore, we compare the performance to an implementation of a state-of-the-art stochastic cloning sliding-window filter; this competitive reference implementation performs tightly coupled filtering-based visual–inertial odometry. While our approach admittedly demands more computation, we show its superior accuracy.
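As an illustrative sketch of the kind of objective described above (the notation is assumed here for illustration and is not quoted from the paper), the cost over the keyframe window sums Mahalanobis-weighted landmark reprojection residuals over cameras i, keyframes k, and observed landmarks j, plus inertial residuals linking consecutive keyframe states:

J(\mathbf{x}) = \sum_{i} \sum_{k} \sum_{j \in \mathcal{J}(i,k)} \mathbf{e}_r^{i,j,k\,\top}\, \mathbf{W}_r^{i,j,k}\, \mathbf{e}_r^{i,j,k} \;+\; \sum_{k=1}^{K-1} \mathbf{e}_s^{k\,\top}\, \mathbf{W}_s^{k}\, \mathbf{e}_s^{k}

where e_r denotes a landmark reprojection error, e_s a linearized IMU error term between keyframes k and k+1, and W_r, W_s the corresponding information (inverse-covariance) matrices. Marginalizing the states that drop out of the bounded window converts their contribution into a prior on the remaining states, which is what keeps the optimization real-time capable.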
The International Journal of Robotics Research – SAGE
Published: Mar 1, 2015