Hierarchical sensor fusion for building a probabilistic local map using active sensor modules | Robotica | Cambridge Core

Summary

An algorithm for three-level hierarchical sensor fusion has been proposed and applied to environment map building with enhanced accuracy and efficiency. The algorithm was realized through two new types of sensor modules, which are composed of a halogen lamp-based active vision sensor and a semicircular ultrasonic (US) and infrared (IR) sensor system. In the first-level fusion, the US and IR sensor information is utilized in terms of the geometric characteristics of the sensor locations. In the second-level fusion, the outputs from the US and IR sensors are combined with the sheet of halogen light through a proposed rule base. In the third-level fusion, the local maps from the first- and second-level fusion are updated in a probabilistic way to yield a highly accurate local environment map. A practical implementation has been carried out to demonstrate the efficiency and accuracy of the proposed hierarchical sensor fusion algorithm in environment map building.
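The third-level probabilistic update described above is in the spirit of the Bayesian occupancy-grid framework of Elfes and Moravec cited below. As an illustration only (the function names and sensor probabilities here are hypothetical, not the paper's actual parameters), a single grid cell can be fused across sensors with a log-odds update:

```python
import numpy as np

def logodds(p):
    """Convert an occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def update_cell(l_prior, p_meas):
    """Bayesian log-odds update of one cell with an inverse-sensor-model probability."""
    return l_prior + logodds(p_meas)

def prob(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + np.exp(l))

# Example: a cell starts unknown (p = 0.5), then two sensor readings arrive.
l = logodds(0.5)         # unknown prior -> log-odds 0
l = update_cell(l, 0.7)  # e.g. a US reading suggests "occupied"
l = update_cell(l, 0.8)  # e.g. an IR reading agrees
print(round(prob(l), 3))  # -> 0.903
```

Because the update is additive in log-odds, readings from the different sensor modules can be fused in any order, which is what makes the incremental map refinement across fusion levels tractable.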

References

1. Gonzalez, J., Ollero, A. and Reina, A., “Map Building for a Mobile Robot Equipped With a 2D Laser Rangefinder,” Proceedings of the IEEE International Conference on Robotics and Automation (1994) pp. 1904–1909.

2. Brown, C. D., Marvel, R. W., Arce, G. R., Ih, C. S. and Fertell, D. A., “Morphological 3-D Segmentation Using Laser Structured Light,” Proceedings of the IEEE International Symposium on Circuits and Systems (1988) pp. 2803–2806.

3. Arsenio, A. and Ribeiro, M. I., “Active Range Sensing for Mobile Robot Localization,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (1998) pp. 1066–1071.

4. Mertz, C., Kozar, J., Miller, J. R. and Thorpe, C., “Eye-Safe Laser Line Striper for Outside Use,” Proceedings of the IEEE International Symposium on Intelligent Vehicle (2002) pp. 507–512.

5. Clerentin, A., Pegard, C. and Drocourt, C., “Environment Exploration Using an Active Vision Sensor,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (1999) pp. 1525–1530.

6. Luo, R. C., Yih, C.-C. and Su, K. L., “Multisensor fusion and integration: approaches, applications, and future research directions,” IEEE Sens. J. 2, 107–119 (2002).

7. Durrant-Whyte, H., Integration, Coordination and Control of Multi-Sensor Robot Systems (Kluwer Academic, Norwell, MA, 1988).

8. Elfes, A., “Using occupancy grids for mobile robot perception and navigation,” IEEE Comput. 22, 46–57 (1989).

9. Moravec, H. P., “Sensor fusion in certainty grids for mobile robots,” AI Mag. 9 (2), 61–74 (1988).

10. HoseinNezhad, R., Moshiri, B. and Asharif, M. R., “Sensor Fusion for Ultrasonic and Laser Arrays in Mobile Robotics: A Comparative Study of Fuzzy, Dempster and Bayesian Approaches,” Proceedings of the 1st IEEE International Conference on Sensors (2002) vol. 2, pp. 1682–1689.

11. Pagac, D., Nebot, E. M. and Durrant-Whyte, H., “An evidential approach to map-building for autonomous vehicles,” IEEE Trans. Robot. Autom. 14, 623–629 (1998).

12. Murphy, R. R., “Dempster-Shafer theory for sensor fusion in autonomous mobile robots,” IEEE Trans. Robot. Autom. 14, 197–206 (1998).

13. Negishi, Y., Miura, J. and Shirai, Y., “Vision-Based Mobile Robot Speed Control Using a Probabilistic Occupancy Map,” Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (2003) pp. 64–69.

14. Kim, G. W., Kwak, N. and Lee, B. H., “Low Cost Active Range Sensing Using Halogen Sheet-of-Light for Occupancy Grid Map Building,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (2005) pp. 2288–2293.

15. Kim, G. W., Kwak, N. and Lee, B. H., “Active Range Sensing and Grid Map Building Using Halogen Sheet-of-Light,” Proceedings of the 2nd International Conference on Ubiquitous Robots and Ambient Intelligence (2005) pp. 223–227.

16. Fisher, R. B., Ashbrook, A. P., Robertson, C. and Werghi, N., “A Low-Cost Range Finder Using a Visually Located, Structured Light Source,” Proceedings of the International Conference on 3-D Digital Imaging and Modeling (1999) pp. 24–33.

17. Park, J., DeSouza, G. N. and Kak, A. C., “Dual-Beam Structured-Light Scanning for 3-D Object Modeling,” Proceedings of the International Conference on 3-D Digital Imaging and Modeling (2001) pp. 65–72.

18. Zhang, G. and Ma, L., “Modeling and calibration of grid structured light based 3-D vision inspection,” J. Manuf. Sci. Eng. 122, 734–738 (2000).

19. Zhang, Z., “Flexible Camera Calibration by Viewing a Plane From Unknown Orientations,” Proceedings of the IEEE International Conference on Computer Vision (1999) pp. 666–673.

20. Stepan, P., Kulich, M. and Preucil, L., “Robust data fusion with occupancy grid,” IEEE Trans. Syst. Man Cybern. 35, 106–115 (2005).

21. Moshiri, B., Asharif, M. R. and HoseinNezhad, R., “Pseudo information measure: a new concept for extension of Bayesian fusion in robotic map building,” Inf. Fusion 3 (1), 51–68 (2002).

22. Oriolo, G., Ulivi, G. and Vendittelli, M., “Fuzzy maps: a new tool for mobile robot perception and planning,” J. Robot. Syst. 14 (3), 179–197 (1997).