Inertial and 3D-odometry fusion in rough terrain - towards real 3D navigation
Related papers
Improved position estimation for mobile robots on rough terrain using attitude information
2001
Most mobile robots use a combination of absolute and relative sensing techniques for position estimation. Relative positioning techniques are generally known as dead-reckoning. Many systems use odometry as their only dead-reckoning means. However, in recent years fiber-optic gyroscopes have become more affordable and are being used on many platforms to supplement odometry, especially in indoor applications. Still, if the terrain is not level (i.e., rugged or rolling terrain), the tilt of the vehicle introduces errors into the conversion of gyro readings to vehicle heading. To overcome this problem, the vehicle tilt must be measured and factored into the heading computation. This technical report introduces a new fuzzy logic expert rule-based method for fusing data from multiple low- to medium-cost gyroscopes and accelerometers in order to accurately estimate the attitude (i.e., heading and tilt) of a mobile robot. The attitude information is then further fused with wheel encoder data to estimate the three-dimensional position of the mobile robot. Experimental results of mobile robot runs over rugged terrain are presented, showing the effectiveness of our fuzzy logic rule-based sensor fusion method.
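The tilt correction described above amounts to rotating the body-frame gyro rates into the navigation frame before integrating heading. A minimal sketch of that standard Euler-rate conversion (illustrative only; the paper's actual fusion adds fuzzy-logic expert rules on top of this):

```python
import math

def heading_rate(q, r, roll, pitch):
    """Convert body-frame gyro rates (rad/s) to heading (yaw) rate.

    On level ground (roll = pitch = 0) this reduces to the raw z-gyro
    reading r; on tilted terrain the y-axis gyro also contributes.
    The x-axis (roll) gyro does not enter the yaw-rate equation.
    """
    return (q * math.sin(roll) + r * math.cos(roll)) / math.cos(pitch)

# Example: integrate one 100 Hz sample taken on a 15-degree slope.
q, r = 0.02, math.radians(10.0)                      # gyro readings (rad/s)
roll, pitch = math.radians(5.0), math.radians(15.0)  # vehicle tilt
dt = 0.01                                            # sample period (s)
heading = 0.0
heading += heading_rate(q, r, roll, pitch) * dt
print(math.degrees(heading))
```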
3D-odometry for rough terrain - towards real 3D navigation
2003 IEEE International Conference on Robotics and Automation, 2003
Until recently, autonomous mobile robots were mostly designed to run in indoor, largely structured and flat environments. In rough terrain many problems arise and position tracking becomes more difficult: the robot has to deal with wheel slippage and large orientation changes. In this paper we first present recent developments on the off-road rover Shrimp. Then a new method, called 3D-Odometry, which extends standard 2D odometry to 3D space, is developed. Because it accounts for terrain transitions, 3D-Odometry provides better position estimates and is a step towards real 3D navigation for outdoor robots.
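The core idea of lifting planar odometry into 3D can be pictured as projecting each encoder-derived displacement along the rover's current pitch and heading. A hedged sketch of that generic projection (it does not reproduce Shrimp's bogie-specific transition handling):

```python
import math

def odometry_step_3d(x, y, z, yaw, pitch, d_left, d_right, track_width):
    """Propagate a 3D position from differential-drive encoder increments.

    d_left/d_right are wheel travel increments (m) since the last step;
    pitch (> 0 means nose-up) tilts the translation out of the plane.
    """
    d_center = 0.5 * (d_left + d_right)        # forward travel along body axis
    d_yaw = (d_right - d_left) / track_width   # heading change from wheel difference
    yaw_mid = yaw + 0.5 * d_yaw                # midpoint heading over the arc
    x += d_center * math.cos(pitch) * math.cos(yaw_mid)
    y += d_center * math.cos(pitch) * math.sin(yaw_mid)
    z += d_center * math.sin(pitch)            # climb/descent component
    return x, y, z, yaw + d_yaw

# One step while climbing a 10-degree slope with a slight left turn.
pose = odometry_step_3d(0, 0, 0, 0.0, math.radians(10.0), 0.10, 0.11, 0.4)
print(pose)
```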
Fusion of inertial and kinematic navigation systems for autonomous vehicles
Proceedings of VNIS '93 - Vehicle Navigation and Information Systems Conference, 1993
It is expected that an Inertial Navigation System (INS), when used on a land vehicle, would provide a superior position estimate to one determined using odometry. There are, however, a number of benefits to using odometry as opposed to an INS technique, including the high physical robustness and reliability of the encoders; low weight, power consumption and cost; and zero time-dependent drift. For these reasons alone, it seems logical to include some form of odometry on any autonomous surface vehicle. If odometry is present in any case and an INS technique is to be implemented, then it follows that an attempt should be made to improve the overall system performance by combining the INS and odometry signals into an optimal, or at least more robust, estimation system.
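Such a combination is typically realized as a Kalman-style blend in which the drifting INS estimate is corrected by the drift-free (but slip-prone) odometry. A minimal one-dimensional sketch, with noise values that are illustrative rather than taken from the paper:

```python
# Minimal 1D Kalman blend of an INS position prediction with an
# odometry position measurement.
def fuse(ins_pos, ins_var, odo_pos, odo_var):
    k = ins_var / (ins_var + odo_var)        # Kalman gain
    pos = ins_pos + k * (odo_pos - ins_pos)  # corrected estimate
    var = (1.0 - k) * ins_var                # reduced uncertainty
    return pos, var

pos, var = 0.0, 1e-4
for step in range(100):
    ins_pos = pos + 0.10          # INS predicts 0.1 m of travel per step
    ins_var = var + 0.02          # INS variance grows over time (drift)
    odo_pos = 0.1 * (step + 1)    # encoder-derived position (no drift)
    pos, var = fuse(ins_pos, ins_var, odo_pos, odo_var=0.01)
print(pos, var)
```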
Multi-information fusion based localization algorithm for Mars rover
Advances in Aircraft and Spacecraft Science, 2014
High-precision autonomous localization techniques are essential for future Mars rovers. This paper presents an integrated localization algorithm based on multi-information fusion. First, the output of an inertial measurement unit (IMU) is used to construct the two-dimensional (2D) dynamics equation of the Mars rover. Second, radio beacon measurements and terrain image matching are treated as external measurements and included in the navigation filter to correct the inertial bias and drift. An extended Kalman filter (EKF) is then designed to estimate the rover's position state and suppress measurement noise. Finally, the proposed localization algorithm is validated by computer simulation with different parameter sets.
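That filter structure, inertial propagation corrected by occasional external position fixes, can be sketched as a small EKF; the state vector, noise levels and measurement model below are assumptions chosen for illustration:

```python
import numpy as np

# State: [x, y, yaw]; the IMU supplies forward speed v and yaw rate w.
def predict(x, P, v, w, dt, Q):
    yaw = x[2]
    x = x + dt * np.array([v * np.cos(yaw), v * np.sin(yaw), w])
    F = np.array([[1, 0, -dt * v * np.sin(yaw)],
                  [0, 1,  dt * v * np.cos(yaw)],
                  [0, 0,  1]])                   # Jacobian of the motion model
    return x, F @ P @ F.T + Q

def update_position(x, P, z, R):
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])     # external fix measures (x, y)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)
    return x, (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3) * 0.1
Q, R = np.eye(3) * 1e-4, np.eye(2) * 0.05        # illustrative noise levels
x, P = predict(x, P, v=0.05, w=0.01, dt=1.0, Q=Q)           # inertial step
x, P = update_position(x, P, z=np.array([0.06, 0.0]), R=R)  # external fix
print(x)
```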
3D Position Tracking in Challenging Terrain
The International Journal of Robotics Research, 2007
The intent of this paper is to show how the accuracy of 3D position tracking can be improved by considering rover locomotion in rough terrain as a holistic problem. An appropriate locomotion concept, endowed with a controller that minimizes slip, improves the climbing performance, the accuracy of odometry and the signal-to-noise ratio of the onboard sensors. Sensor fusion involving an inertial measurement unit, 3D-Odometry and visual motion estimation is presented. The experimental results clearly show how each sensor contributes to increasing the accuracy of the 3D pose estimate in rough terrain.
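One simple way to picture a three-sensor fusion of this kind is inverse-variance weighting of each sensor's displacement estimate, so that the sensor with the best signal-to-noise ratio dominates. This is a generic illustration, not the paper's actual filter:

```python
import numpy as np

def fuse_displacements(estimates, variances):
    """Inverse-variance weighted fusion of redundant 3D displacement estimates.

    estimates: list of 3-vectors (one per sensor); variances: scalars.
    Sensors with smaller variance (better signal/noise) get larger weight.
    """
    w = np.array([1.0 / v for v in variances])
    w /= w.sum()
    return sum(wi * np.asarray(e) for wi, e in zip(w, estimates))

# Illustrative numbers: odometry over-reads due to slip, vision is best here.
imu    = [0.095, 0.002, 0.010]
odo    = [0.120, 0.000, 0.008]
visual = [0.100, 0.001, 0.011]
print(fuse_displacements([imu, odo, visual], [0.02, 0.05, 0.01]))
```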
Sensors, 2017
A single sensor cannot, on its own, give an accurate pose estimate for a device. This paper presents a fusion of a six-degree-of-freedom (6-DoF) inertial sensor, comprising a 3-axis accelerometer and a 3-axis gyroscope, with vision to obtain low-cost, accurate positioning for an autonomous mobile robot. For vision, a monocular object detection approach integrating the speeded-up robust features (SURF) and random sample consensus (RANSAC) algorithms was used to recognize a sample object across several captured images. Unlike conventional methods that depend on point tracking, RANSAC uses an iterative procedure to estimate the parameters of a mathematical model from a set of captured data containing outliers. SURF and RANSAC improve accuracy through their ability to find interest points (features) under different viewing conditions using the Hessian matrix. This approach is proposed because of its simple implementation, low cost and improved accuracy. Using an extended Kalman filter (EKF), data from the inertial sensors and a camera were fused to estimate the position and orientation of the mobile robot. All sensors were mounted on the mobile robot to obtain accurate localization. An indoor experiment was carried out to validate and evaluate the performance. Experimental results show that the proposed method is computationally fast, reliable and robust, and can be considered for practical applications. Performance was verified against ground-truth data using root mean square errors (RMSEs).
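The SURF-plus-RANSAC front end can be sketched with OpenCV (SURF lives in the non-free xfeatures2d module, so an opencv-contrib build with non-free support is required; the image file names are placeholders):

```python
import cv2
import numpy as np

# Placeholder images: a stored template of the object and the current frame.
img_ref = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)
img_cur = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # Hessian-based detector
kp1, des1 = surf.detectAndCompute(img_ref, None)
kp2, des2 = surf.detectAndCompute(img_cur, None)

# Match descriptors and keep only unambiguous matches (Lowe ratio test).
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]

if len(good) >= 4:  # a homography needs at least four correspondences
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC fits the homography while rejecting outlier correspondences.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print(f"{int(mask.sum())} inliers of {len(good)} matches")
```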
IFAC Proceedings Volumes, 2012
In this paper we describe a method for combining data from multiple onboard sensors to determine a mobile robot's pose. Error models for a gyroscope, a dual-axis accelerometer and wheel encoders are derived for estimating the pose. A tri-axial magnetometer measures the magnetic field strength, which is used as an acceptance criterion for electronic compass readings that correct the azimuth of the mobile robot's orientation. The errors in the sensors are estimated mutually rather than independently from each sensor's error model. The final estimated pose is used to navigate, guide and control a mobile robot in an indoor environment.
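The field-strength gate described above can be sketched as follows; the nominal field magnitude and tolerance are illustrative and would be calibrated per site:

```python
import math

EARTH_FIELD_UT = 50.0  # nominal local field magnitude (microtesla); site-specific
TOLERANCE_UT = 5.0     # illustrative acceptance band

def compass_heading_if_clean(mx, my, mz):
    """Return a magnetometer heading, or None when the field is disturbed.

    A field magnitude far from the local Earth value indicates nearby
    ferrous material or stray currents, so the reading is rejected
    rather than used to correct the gyro-integrated azimuth.
    """
    magnitude = math.sqrt(mx * mx + my * my + mz * mz)
    if abs(magnitude - EARTH_FIELD_UT) > TOLERANCE_UT:
        return None                 # reject: field is distorted
    return math.atan2(-my, mx)      # level-frame heading (tilt ignored here)

print(compass_heading_if_clean(30.0, -30.0, 25.0))
```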
Tracking a Mobile Robot Position Using Vision and Inertial Sensor
IFIP Advances in Information and Communication Technology, 2014
Wheeled mobile robots are still the first choice when it comes to industrial or domotic applications. The robot's navigation system aims to reliably determine the robot's position, velocity and orientation and provide them to the control and trajectory-guidance modules. The most frequently used sensors are inertial measurement units (IMUs) combined with an absolute position-sensing mechanism. The dead-reckoning approach using an IMU suffers from integration drift due to noise and bias. To overcome this limitation we propose using the inertial system in combination with mechanical odometers and a vision-based system. These two sensors complement each other: the vision sensor is accurate at low velocities but requires long computation times, while the inertial sensor can track fast movements but suffers from drift. The information from the sensors is integrated through a multi-rate fusion scheme. Each sensor system is assumed to have its own independent sampling rate, which may be time-varying. Data fusion is performed by a multi-rate Kalman filter. The paper describes the inertial and vision navigation systems and the data fusion algorithm. Simulation and experimental results are presented.
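The multi-rate idea, predicting at the fast IMU rate and correcting whenever a slower vision fix becomes available, can be sketched in one dimension; the rates and noise values below are assumptions:

```python
import numpy as np

# Multi-rate fusion sketch: 1D position/velocity state, predicted at the
# IMU rate and corrected whenever a (slower) vision fix arrives.
x = np.zeros(2)                  # [position, velocity]
P = np.eye(2)
Q = np.diag([1e-5, 1e-3])        # illustrative process noise
R_vis = np.array([[0.01]])       # illustrative vision measurement noise
H = np.array([[1.0, 0.0]])       # vision measures position only

IMU_DT, VISION_EVERY = 0.01, 20  # 100 Hz IMU, 5 Hz vision

for k in range(200):
    a = 0.1                                    # accelerometer reading (m/s^2)
    F = np.array([[1.0, IMU_DT], [0.0, 1.0]])
    x = F @ x + np.array([0.5 * IMU_DT**2, IMU_DT]) * a   # inertial prediction
    P = F @ P @ F.T + Q
    if k % VISION_EVERY == VISION_EVERY - 1:   # a vision sample is ready
        z = np.array([0.5 * a * (IMU_DT * (k + 1))**2])   # simulated fix
        S = H @ P @ H.T + R_vis
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
print(x)
```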
Experimental results from FLEXnav: An expert rule-based dead-reckoning system for Mars rovers
IEEE Aerospace Conference Proceedings, 2004
Research at the University of Michigan's Mobile Robotics Lab aims at the development of an accurate proprioceptive (i.e., without external references) position estimation (PPE) system for Mars rovers. Much like other PPE systems, ours uses an inertial measurement unit (IMU) comprising three-axis fiber-optic gyroscopes and a two-axis accelerometer, as well as odometry based on wheel encoders. Our PPE system, however, is unique in that it does not use the conventional Kalman filter approach for fusing data from the different sensor modalities. Rather, our system combines data based on expert rules that implement our in-depth understanding of each sensor modality's behavior under different driving and environmental conditions. Since our system also uses fuzzy logic operations in conjunction with the expert rules for finer gradation, we call it the Fuzzy Logic Expert navigation (FLEXnav) PPE system. The paper presents detailed experimental results obtained with our FLEXnav system integrated with our Mars rover clone Rocky-8B and operating in a Mars-like environment. The paper also introduces new methods for wheel slippage detection and correction, along with preliminary experimental results.
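Slippage detection of the kind mentioned can be sketched as a consistency check between redundant proprioceptive channels; the thresholds below are illustrative, not the paper's tuned expert rules:

```python
import math

def slippage_suspected(enc_yaw_rate, gyro_yaw_rate,
                       enc_accel, imu_accel,
                       yaw_tol=math.radians(3.0), accel_tol=0.3):
    """Flag wheel slippage when proprioceptive sensors disagree.

    Encoders and gyros should report consistent yaw rates, and the
    derivative of encoder speed should match the accelerometer; a
    large discrepancy suggests the wheels are spinning without
    traction. Thresholds here are illustrative placeholders.
    """
    yaw_mismatch = abs(enc_yaw_rate - gyro_yaw_rate) > yaw_tol
    accel_mismatch = abs(enc_accel - imu_accel) > accel_tol
    return yaw_mismatch or accel_mismatch

# Encoders claim a hard turn the gyro never saw: likely one-side slip.
print(slippage_suspected(0.20, 0.02, 0.05, 0.04))
```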