M.A. Lanthier, D. Nussbaum, and A. Sheng (Canada)
Keywords: Stereo Camera, Sonar, Infrared, Data Fusion, Mapping
Vision-based sensors such as stereo cameras are often used on mobile robots for mapping and navigation purposes. Cameras provide a rich set of data, making them useful for object recognition, localization, and detecting environmental structure. When obtaining range measurements, however, stereo camera vision systems do not perform well under some environmental conditions, such as regions that are uniform in appearance (e.g., plain walls), large metallic or glass surfaces (e.g., windows), and poor lighting conditions. This paper describes how range data obtained from a stereo camera vision system can be improved through the use of additional sonar and infrared proximity sensors. We provide experimental results showing that fusing the three types of sensor range data does indeed yield more accurate occupancy grid maps.
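The abstract does not specify the fusion rule the authors use, but a common way to combine independent range-sensor evidence into an occupancy grid is a per-cell log-odds Bayesian update. The sketch below is a minimal, hypothetical illustration of that general technique (the probability values and the `fuse_cell` helper are assumptions, not taken from the paper):

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def fuse_cell(readings, prior=0.5):
    """Fuse independent per-sensor occupancy probabilities for one
    grid cell using the standard log-odds Bayesian update.

    readings -- occupancy probabilities reported by each sensor
    prior    -- prior probability that the cell is occupied
    """
    l = logit(prior)
    for p in readings:
        # Each sensor contributes its evidence relative to the prior.
        l += logit(p) - logit(prior)
    # Convert accumulated log-odds back to a probability.
    return 1.0 / (1.0 + math.exp(-l))

# Hypothetical example: the stereo camera is uninformative on a plain
# wall (0.5), while sonar (0.8) and infrared (0.7) both suggest the
# cell is occupied; the fused estimate exceeds any single reading.
fused = fuse_cell([0.5, 0.8, 0.7])
```

Because each sensor's contribution is added in log-odds space, a sensor that fails under the current conditions (probability near 0.5) simply adds no evidence, while the remaining sensors still sharpen the estimate, which is the intuition behind combining stereo, sonar, and infrared ranges.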