Calibration of the Spatial Pose Between Inertial and Visual Sensors With an Inclinometer

Modeling and Calibration of Inertial and Vision Sensors

The International Journal of Robotics Research, 2010

This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation that does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used to solve, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.
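
The gray-box formulation is easiest to see in code: the relative pose is a small set of unknown physical parameters, estimated by minimizing prediction errors over many observations. The sketch below is a heavily simplified, hypothetical stand-in (synthetic data, a point-alignment residual in place of the paper's full sensor model), shown only to illustrate the estimation pattern, not the authors' estimator.

```python
# Minimal sketch of gray-box relative-pose estimation (illustrative only):
# the camera-IMU rotation and translation are treated as unknown model
# parameters and recovered by minimizing prediction errors over many poses.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)

# Hypothetical ground truth: relative rotation (rotation vector) and translation.
rotvec_true = np.array([0.02, -0.01, 0.03])   # rad
t_true = np.array([0.10, 0.02, -0.05])        # m

# Synthetic IMU-frame points observed over N poses, mapped into the camera frame.
N = 50
p_imu = rng.normal(size=(N, 3))
R_true = Rotation.from_rotvec(rotvec_true).as_matrix()
p_cam = p_imu @ R_true.T + t_true + 0.001 * rng.normal(size=(N, 3))

def residuals(theta):
    """Prediction error of the physical model for parameters theta = (rotvec, t)."""
    R = Rotation.from_rotvec(theta[:3]).as_matrix()
    t = theta[3:]
    return (p_imu @ R.T + t - p_cam).ravel()

sol = least_squares(residuals, x0=np.zeros(6))
print("estimated rotvec:", sol.x[:3])
print("estimated translation:", sol.x[3:])
```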

Relative Pose Calibration Between Visual and Inertial Sensors

The International Journal of Robotics Research, 2007

This paper proposes an approach to calibrating off-the-shelf cameras and inertial sensors so that the integrated system is usable in both static and dynamic situations. The rotation between the camera and the inertial sensor can be estimated during camera calibration by having both sensors observe the vertical direction, using a vertical chessboard target and gravity. The translation between the two can be estimated using a simple passive turntable and static images, provided that the system can be adjusted to turn about the inertial sensor null point in several poses. Simulation and real-data results are presented to show the validity and simple requirements of the proposed method.
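
To illustrate the rotation step, the sketch below aligns paired observations of the vertical direction (gravity measured by the accelerometer, the vertical of a chessboard seen by the camera) using the standard SVD-based (Kabsch) solution. The data are synthetic and the solver is a generic one, not the authors' implementation.

```python
# Sketch: recover the camera-IMU rotation from paired observations of the
# vertical direction (accelerometer gravity vs. chessboard vertical) using
# the standard SVD (Kabsch) alignment. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)

def random_rotation():
    A = rng.normal(size=(3, 3))
    Q, _ = np.linalg.qr(A)
    return Q * np.sign(np.linalg.det(Q))  # force det = +1 (3x3 case)

R_true = random_rotation()

# Unit vertical directions seen in the IMU frame at several poses...
v_imu = rng.normal(size=(20, 3))
v_imu /= np.linalg.norm(v_imu, axis=1, keepdims=True)
# ...and the same directions seen in the camera frame, with small noise.
v_cam = v_imu @ R_true.T + 0.005 * rng.normal(size=v_imu.shape)
v_cam /= np.linalg.norm(v_cam, axis=1, keepdims=True)

# Kabsch: R = V diag(1, 1, det(V U^T)) U^T minimizes sum ||v_cam - R v_imu||^2.
H = v_imu.T @ v_cam
U, _, Vt = np.linalg.svd(H)
D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
R_est = Vt.T @ D @ U.T

cos_ang = np.clip((np.trace(R_est @ R_true.T) - 1) / 2, -1.0, 1.0)
print("rotation error (deg):", np.degrees(np.arccos(cos_ang)))
```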

Joint calibration of an inertial measurement unit and coordinate transformation parameters using a monocular camera

2010 International Conference on Indoor Positioning and Indoor Navigation, 2010

An estimation procedure for calibrating a low-cost inertial measurement unit (IMU), using a rigidly mounted monocular camera, is presented. The parameters of a sensor model that captures misalignment, scale, and offset errors are estimated jointly with the IMU-camera coordinate transformation parameters using a recursive Sigma-Point Kalman Filter. The method requires only a simple visual calibration pattern. A simulation study indicates the filter's ability to reach sub-centimeter and sub-degree accuracy.
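
The recursive filter itself is not reproduced here, but its core primitive, the sigma-point (unscented) transform, is compact enough to sketch: a Gaussian is pushed through a nonlinear function via deterministically placed sample points. The function and numbers below are illustrative only, not the paper's sensor model.

```python
# Sketch of the sigma-point (unscented) transform at the core of a
# Sigma-Point Kalman Filter: propagate a Gaussian mean and covariance
# through a nonlinear function via 2n+1 deterministically chosen points.
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)

    # 2n + 1 sigma points: the mean plus symmetric spreads along each axis.
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)

    Y = np.array([f(p) for p in pts])          # propagate each point
    y_mean = wm @ Y
    d = Y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Example: push a 2D Gaussian through a mildly nonlinear polar-style map.
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
m, P = np.array([1.0, 0.5]), np.diag([0.01, 0.02])
print(unscented_transform(m, P, f))
```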

Camera-Inertial Sensor Modeling and Alignment for Visual Navigation

2003

This article presents a technique for modeling and calibrating a camera with integrated low-cost inertial sensors: three gyros and three accelerometers for full 3D sensing. Inertial sensors attached to a camera can provide valuable data about camera pose and movement. In biological vision systems, inertial cues provided by the vestibular system are fused with vision at an early processing stage. Vision systems in autonomous vehicles can also benefit from taking inertial cues into account. Camera calibration has been extensively studied, and standard techniques are established. Inertial navigation systems, relying on high-end sensors, also have established techniques. Nevertheless, in order to use off-the-shelf inertial sensors attached to a camera, appropriate modeling and calibration techniques are required. For inertial sensor alignment, a pendulum instrumented with an encoded shaft is used to estimate the bias and scale factor of the inertial measurements. For camera calibration, a standard and reliable technique based on images of a planar grid is used. Having both the camera and the inertial sensors calibrated and observing the vertical direction at different poses, the rigid rotation between the two frames of reference is estimated using a mathematical model based on unit quaternions. The alignment technique and the resulting performance with simulated and real data are presented at the end of this article.
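
The quaternion model mentioned above has a classical closed-form realization, Davenport's q-method: the rotation best aligning paired unit vectors is the dominant eigenvector of a 4x4 matrix built from the vector pairs. The sketch below demonstrates that construction on synthetic data; it is an illustration of the quaternion formulation, not the article's code.

```python
# Sketch: quaternion-based rotation alignment from paired unit vectors
# (Davenport's q-method, written for scipy's scalar-last [x, y, z, w]
# Hamilton convention). Synthetic data, for illustration only.
import numpy as np
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(2)
R_true = Rotation.random(random_state=2).as_matrix()

a = rng.normal(size=(30, 3))                  # directions in the IMU frame
a /= np.linalg.norm(a, axis=1, keepdims=True)
b = a @ R_true.T                              # same directions, camera frame

# Build the 4x4 K matrix whose dominant eigenvector is the optimal quaternion.
B = b.T @ a                                   # sum of outer products b a^T
S = B + B.T
z = np.array([B[2, 1] - B[1, 2], B[0, 2] - B[2, 0], B[1, 0] - B[0, 1]])
K = np.zeros((4, 4))
K[:3, :3] = S - np.trace(B) * np.eye(3)
K[:3, 3] = z
K[3, :3] = z
K[3, 3] = np.trace(B)

w, V = np.linalg.eigh(K)
q = V[:, np.argmax(w)]                        # scalar-last unit quaternion
R_est = Rotation.from_quat(q).as_matrix()
print("max abs error:", np.abs(R_est - R_true).max())
```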

A Review Of Visual Inertial Odometry For Object Tracking And Measurement

International Journal of Scientific & Technology Research, 2020

This paper explores the use of Visual Inertial Odometry (VIO) for tracking and measurement. The evolution of VIO is first discussed, followed by an overview of monocular Visual Odometry (VO) and the Inertial Measurement Unit (IMU). Next, related measurement approaches and the use of VIO for measurement are discussed. Visual Inertial Odometry incorporates an IMU into the VO system so that visual information and inertial measurements are combined to achieve accurate measurement. A VO system contains four components: a camera calibration algorithm, a feature tracker (usually the KLT algorithm), a rigid motion estimation algorithm, and an algorithm that matches feature descriptions and rejects outliers (typically RANSAC). The IMU combines accelerometers, gyroscopes, and a magnetometer to measure linear and angular motion. To fuse the visual and inertial measurement data, there are two different approaches, commonly categorized as loosely coupled and tightly coupled fusion.
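
The VO components listed above map directly onto common OpenCV calls; a minimal two-frame sketch is given below. The frame paths and camera intrinsics are placeholders, and the feature-matching/outlier-rejection step is realized here as a RANSAC essential-matrix fit over KLT tracks.

```python
# Sketch of the two-frame core of a monocular VO pipeline: KLT feature
# tracking followed by RANSAC essential-matrix estimation and relative
# pose recovery. Frame files and intrinsics K are placeholder assumptions.
import cv2
import numpy as np

def relative_pose(frame0, frame1, K):
    """Estimate relative rotation/translation between two grayscale frames."""
    # 1. Detect corners in the first frame.
    pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500,
                                   qualityLevel=0.01, minDistance=8)
    # 2. Track them into the second frame with pyramidal KLT.
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)
    ok = status.ravel() == 1
    p0, p1 = pts0[ok].reshape(-1, 2), pts1[ok].reshape(-1, 2)
    # 3. RANSAC essential-matrix fit rejects mismatched tracks.
    E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # 4. Decompose E into rotation R and unit-norm translation t.
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inliers)
    return R, t

if __name__ == "__main__":
    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])  # assumed
    f0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
    f1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
    R, t = relative_pose(f0, f1, K)
    print(R, t.ravel())
```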

Camera-inertial sensor modelling and alignment for visual navigation

2003

Inertial sensors attached to a camera can provide valuable data about camera pose and movement. In biological vision systems, inertial cues provided by the vestibular system are fused with vision at an early processing stage. Vision systems in autonomous vehicles can also benefit from taking inertial cues into account. In order to use off-the-shelf inertial sensors attached to a camera, appropriate modelling and calibration techniques are required. Camera calibration has been extensively studied, and standard techniques are established. Inertial navigation systems, relying on high-end sensors, also have established techniques. This paper presents a technique for modelling and calibrating a camera integrated with low-cost inertial sensors: three gyros and three accelerometers for full 3D sensing. Using a pendulum with an encoded shaft, the inertial sensor alignment, bias, and scale factor can be estimated. Having both the camera and the inertial sensors observe the vertical direction at different poses, the rigid rotation between the two frames of reference can be estimated. Preliminary simulation and real-data results are presented.
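
The pendulum step reduces to a linear least-squares fit: the encoder angle gives the true gravity component along a sensor axis, and fitting measured = scale * true + bias recovers both error parameters. The sketch below uses synthetic numbers purely for illustration.

```python
# Sketch of bias/scale estimation against a pendulum reference: the encoder
# angle gives the true gravity component along the accelerometer axis, and a
# linear least-squares fit of measured = scale * true + bias recovers both
# parameters. Numbers are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(3)
g = 9.81
scale_true, bias_true = 1.04, -0.15   # hypothetical sensor errors

# Encoder angles over a pendulum swing, and the implied true axis readings.
theta = np.linspace(-0.6, 0.6, 200)            # rad
a_true = g * np.sin(theta)                     # gravity along the sensed axis
a_meas = scale_true * a_true + bias_true + 0.02 * rng.normal(size=theta.size)

# Least squares on [a_true, 1] @ [scale, bias]^T = a_meas.
A = np.column_stack([a_true, np.ones_like(a_true)])
(scale_est, bias_est), *_ = np.linalg.lstsq(A, a_meas, rcond=None)
print(f"scale = {scale_est:.4f}, bias = {bias_est:.4f}")
```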

An enhanced multi-position calibration method for consumer-grade inertial measurement units applied and tested

Measurement Science and Technology, 2010

An accurate inertial measurement unit (IMU) is a necessity when considering an inertial navigation system capable of giving reliable position and velocity estimates, even for a short period of time. However, even a set of ideal gyroscopes and accelerometers does not imply an ideal IMU if its exact mechanical characteristics (i.e. the alignment and position of each sensor) are not known. In this paper, the standard multi-position calibration method for consumer-grade IMUs using a rate table is enhanced to also exploit the centripetal accelerations caused by the rotation of the table. The total number of measurements thus rises, making the method less sensitive to errors and allowing use of more accurate error models. As a result, the accuracy is significantly enhanced, while the required numerical methods remain simple and efficient. The proposed method is tested with several IMUs and compared to existing calibration methods.
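
The extra observable the enhancement exploits is easy to state: an accelerometer mounted at lever arm r from the table axis senses the centripetal term ω × (ω × r) on top of gravity, so every table rate contributes an additional known reference value for fitting the error model. The sketch below computes these reference values; the lever arm and rates are assumed numbers, and the subsequent model fit is omitted.

```python
# Sketch of the extra observable used by the enhanced multi-position method:
# an accelerometer at lever arm r from the rate-table axis senses the
# centripetal term omega x (omega x r) in addition to gravity, so each
# table rate adds an independent known reference value. Illustrative only.
import numpy as np

g = np.array([0.0, 0.0, -9.81])      # gravity in the table frame (z up)

def expected_specific_force(rate_z, r):
    """Ideal accelerometer output at lever arm r, table spinning about z."""
    omega = np.array([0.0, 0.0, rate_z])
    centripetal = np.cross(omega, np.cross(omega, r))
    # An accelerometer measures specific force: acceleration minus gravity.
    return centripetal - g

r = np.array([0.10, 0.0, 0.0])       # 10 cm lever arm (assumed)
for rate in [0.0, np.pi, 2 * np.pi]:     # 0, 0.5, and 1 rev/s
    f = expected_specific_force(rate, r)
    print(f"rate {rate:5.2f} rad/s -> specific force {np.round(f, 3)} m/s^2")
```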

Evaluation and calibration of an inertial measurement unit

8th BSME International Conference on Thermal Engineering

Micro-electro-mechanical systems (MEMS) have been adopted in applications such as navigation and positioning, consumer electronics, and medical electronics, providing low-cost, lightweight, and low-power chip-based accelerometers and gyroscopes. The signal quality of an inertial measurement unit (IMU) depends greatly on proper sensor installation; because of this, these low-cost sensors are in great need of a calibration procedure. This paper presents the evaluation and calibration of an inertial measurement unit, the Ultra-Miniature 6 (UM6). An experimental setup was developed for this purpose: the UM6 sensor was placed on the setup, and the sensor frame was rotated about the axes of the standard coordinate system. An electrical circuit was designed to connect the sensor to a personal computer, with a UART serial protocol used for communication. A data acquisition system based on Visual C++ archived the measurement data, and further processing was performed in Origin Pro 8. The raw experimental data were fitted to a mathematical model to produce calibration data, which were compared with the sensor-provided calibration data for evaluation. The algorithm is shown to give good results using data from the IMU. The maximum experimental errors were 10%, 16.67%, and 18.18% for gyro calibration, and 9.34%, 7.92%, and 13.07% for accelerometer calibration, for the x, y, and z axes respectively. The experimental results thus demonstrate the effectiveness of the proposed calibration method.
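
The paper's exact calibration equations are not given in the abstract; a common scheme consistent with rotating the sensor frame about the standard axes is the six-position method, in which each accelerometer axis is pointed up and then down, yielding per-axis bias and scale. The sketch below shows that arithmetic with hypothetical readings.

```python
# Standard six-position accelerometer calibration (an assumed scheme, not
# necessarily the paper's): with an axis pointed up (+1g) and down (-1g),
# bias = (up + down)/2 and scale = (up - down)/(2g). Readings are made up.
import numpy as np

g = 9.81

def six_position(up_reading, down_reading):
    """Per-axis bias and scale from +1g and -1g static readings."""
    bias = (up_reading + down_reading) / 2.0
    scale = (up_reading - down_reading) / (2.0 * g)
    return bias, scale

# Hypothetical averaged static readings for the x, y, z axes (m/s^2).
ups = np.array([9.93, 9.75, 10.02])
downs = np.array([-9.69, -9.88, -9.58])
for axis, (u, d) in zip("xyz", zip(ups, downs)):
    b, s = six_position(u, d)
    print(f"{axis}: bias = {b:+.3f} m/s^2, scale = {s:.4f}")
```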

Calibration of inertial and vision systems as a prelude to multi-sensor fusion

Transportation Research Part C: Emerging Technologies, 2008

Effective integration of vision and inertial sensing can overcome the classical issue of inertial divergence over time. For such integration, two types of calibration must be performed: (a) calibration to estimate the intrinsic properties of the vision cameras, such as focal length and lens distortion parameters, and (b) determination of the transformation between the camera and the inertial system. While there are a variety of approaches for calibrating a camera optically, only a few accomplishments have been reported on calibrating cameras with inertial systems, and even those were developed under indoor conditions. Calibrating a two-sensor system indoors does not provide an appropriate and practical transformation for outdoor maneuvers, due to invariable differences between outdoor and indoor conditions. Also, custom calibration objects are not feasible in outdoor operational conditions, because the larger field of view requires relatively large calibration objects. Calibration therefore becomes a critical issue, particularly if the integrated system is used in Intelligent Transportation Systems applications; in such cases it is essential that the two-sensor system be calibrated in an outdoor setting, in the final configuration of the vehicle. This paper introduces a calibration technique to determine the optimized transformation between the inertial and vision sensor systems in an outdoor setting. The derived intrinsic properties and the transformation between the individual sensors are verified during two separate test runs on a section of an actual roadway. Finally, on a third roadway section, the inertial and converted vision data are successfully compared with data from a manual survey.
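
Step (a), intrinsic calibration with a planar target, is standard; a minimal sketch using OpenCV's chessboard routines is shown below. The image folder, board geometry, and square size are placeholder assumptions, not the paper's outdoor setup, which is precisely what required moving beyond such targets for step (b).

```python
# Sketch of intrinsic calibration (focal length, lens distortion) from
# chessboard images with OpenCV. Paths, board size, and square size are
# placeholders, for illustration only.
import glob
import cv2
import numpy as np

board = (9, 6)                 # inner corners of an assumed chessboard
square = 0.025                 # square size in meters (assumed)

# 3D corner positions in the board plane (z = 0).
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.png"):          # placeholder image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size,
                                                 None, None)
print("RMS reprojection error:", rms)
print("K =\n", K, "\ndistortion =", dist.ravel())
```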

Fast Relative Pose Calibration for Visual and Inertial Sensors

Springer Tracts in Advanced Robotics, 2009

Accurate vision-aided inertial navigation depends on proper calibration of the relative pose of the camera and the inertial measurement unit (IMU). Calibration errors introduce bias in the overall motion estimate, degrading navigation performance, sometimes dramatically. However, existing camera-IMU calibration techniques are difficult, time-consuming, and often require additional complex apparatus. In this paper, we formulate the camera-IMU relative pose calibration problem in a filtering framework and propose a calibration algorithm that requires only a planar camera calibration target. The algorithm uses an unscented Kalman filter to estimate the pose of the IMU in a global reference frame and the 6-DoF transform between the camera and the IMU. Results from simulations and experiments with a low-cost solid-state IMU demonstrate the accuracy of the approach.
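
The filter's measurement model is the part easiest to sketch: given the IMU's global pose and the camera-IMU transform (both in the state), predict where known target corners project in the image. The pinhole model, state layout, and numbers below are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of the kind of measurement model such a filter uses: from the IMU
# pose in the world frame and the camera-IMU transform (both in the state),
# predict the pixel locations of known planar-target corners. Illustrative.
import numpy as np
from scipy.spatial.transform import Rotation

def predict_corners(state, target_pts, K):
    """state = [p_WI (3), rotvec_WI (3), p_IC (3), rotvec_IC (3)]."""
    p_WI, r_WI = state[0:3], state[3:6]      # IMU position/attitude in world
    p_IC, r_IC = state[6:9], state[9:12]     # camera pose in the IMU frame
    R_WI = Rotation.from_rotvec(r_WI).as_matrix()
    R_IC = Rotation.from_rotvec(r_IC).as_matrix()
    # World -> IMU -> camera: x_C = R_IC^T (R_WI^T (x_W - p_WI) - p_IC).
    x_I = (target_pts - p_WI) @ R_WI          # rows are points
    x_C = (x_I - p_IC) @ R_IC
    # Pinhole projection to pixel coordinates.
    uv = x_C @ K.T
    return uv[:, :2] / uv[:, 2:3]

# Hypothetical numbers: a 2x2 grid of target corners one meter ahead.
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])
target = np.array([[x, y, 1.0] for x in (-0.1, 0.1) for y in (-0.1, 0.1)])
state = np.zeros(12)
print(predict_corners(state, target, K))
```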