I. ALGORITHM DESCRIPTION

In this report, we present our IMU-RGBD camera navigation algorithm employing the observability-constrained (OC)-EKF, which seeks to maintain the original system's observability properties in the linearized implementation (EKF). In particular, we first describe the implementation of the OC-EKF for processing point feature measurements. Then, we prove that once the OC-EKF is employed for point feature measurements, the observability constraint is automatically satisfied for plane feature measurements. A system's observability Gramian [1], \(\mathbf{M}\), is defined as

\[
\mathbf{M} = \begin{bmatrix} \mathbf{H}_1 \\ \mathbf{H}_2 \boldsymbol{\Phi}_{2,1} \\ \vdots \\ \mathbf{H}_k \boldsymbol{\Phi}_{k,1} \end{bmatrix}
\]

where \(\mathbf{H}_k\) denotes the measurement Jacobian at time step \(k\), and \(\boldsymbol{\Phi}_{k,1} = \boldsymbol{\Phi}_{k-1} \cdots \boldsymbol{\Phi}_1\) is the state transition matrix from time step 1 to time step \(k\).
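As a concrete illustration of these quantities, the following Python/NumPy sketch stacks the blocks \(\mathbf{H}_k \boldsymbol{\Phi}_{k,1}\) to form the observability Gramian and applies the generic observability-constrained projection that forces a Jacobian to annihilate a given set of unobservable directions. The function names, inputs, and the projection formula \(\mathbf{H}' = \mathbf{H} - \mathbf{H}\mathbf{N}(\mathbf{N}^{\top}\mathbf{N})^{-1}\mathbf{N}^{\top}\) are illustrative assumptions and are not taken from this report; they only sketch the general OC-EKF idea under those assumptions.

import numpy as np

def observability_matrix(H_list, Phi_list):
    # Stack the blocks H_k * Phi_{k,1} for k = 1..K, where
    # Phi_{k,1} = Phi_{k-1} * ... * Phi_1 and Phi_{1,1} = I.
    dim = H_list[0].shape[1]
    Phi_k1 = np.eye(dim)
    blocks = []
    for k, H in enumerate(H_list):
        blocks.append(H @ Phi_k1)            # H_k * Phi_{k,1}
        if k < len(Phi_list):
            Phi_k1 = Phi_list[k] @ Phi_k1    # advance to Phi_{k+1,1}
    return np.vstack(blocks)

def oc_project(H, N):
    # Observability-constrained modification of a Jacobian: the closest
    # (Frobenius-norm) H' to H satisfying H' @ N = 0, where the columns
    # of N span the unobservable directions:
    #   H' = H - H N (N^T N)^{-1} N^T
    return H - H @ N @ np.linalg.solve(N.T @ N, N.T)

In the OC-EKF framework, an analogous constraint is also imposed on the state transition matrices, so that the unobservable directions determined at the initial time step remain in the nullspace of the linearized system's observability Gramian throughout the estimation.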