Integrated Mobile Robot Navigation System for Truly Autonomous Mobile Robots
Introduction ::
If a mobile robot is to operate in a completely unknown environment, it should, like a human, be able to build a map of that environment by exploring it.
Like us, a robot can map only the part of the environment that is visible from its current position.
It should therefore be able to navigate into the unknown regions of the environment by predicting them
and planning a safe path to reach them,
while also keeping track of its own position.
These operations require SLAM (simultaneous localization and mapping), planning, and exploration modules.
Robots equipped only with sonar sensors are cheap and small.
However, sonar readings become highly unpredictable beyond a certain angle of incidence of the sonar beam on the obstacle.
We have developed an integrated navigation system for a mobile robot equipped only with sonar sensors, in which all of the above modules work together seamlessly.
Challenges with Sonar Sensors ::
Data from sonar sensors are highly unreliable compared to a laser range finder, but sonar remains an attractive alternative because of its much smaller size and much lower cost.
Inconsistency in readings: As shown in Figure 1, the world point Pw is hit by two sonar beams from robot positions Pt1 and Pt2 at successive time instants. One might expect to obtain the readings r1 and r2 shown by the solid lines, but in practice one instead gets unpredictable readings such as r1extd and r2extd.
Hence, in practice it is very rare to obtain a consistent pair of features or readings from two different sonar positions.
Figure 1: For the same world point Pw, instead of the readings shown by the solid lines, the sonar sensor at the two positions returns unpredictable readings shown by the extended dotted lines.
Figure 2: Concept of Region of Constant Depth (RCD): for all three sonar beams, shown in different colors, the sensor returns the perpendicular reading shown as the middle solid arrowed line.
Region of Constant Depth (RCD): Figure 2 shows how a sonar beam fired at three different angles gives the same reading, which is actually the perpendicular distance to the wall, thereby forming a region of constant depth (RCD).
However, due to a similar phenomenon, RCDs are also observed for corners, edges, rounded curves, etc. So an RCD does not always indicate the presence of a wall segment.
In fact, sonar readings are subject to a high degree of uncertainty, especially due to angular and radial errors along with specular reflection. Hence we need an intelligent approach that extracts as much reliable information from sonar as possible while also maximizing data usability, because collecting data is itself time-consuming.
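To make the RCD idea concrete, here is a minimal sketch (the tolerance, minimum beam count, and scan values are illustrative assumptions, not the system's calibrated parameters) that groups consecutive beams of one 360-degree scan whose ranges agree within a tolerance into candidate RCDs:

```python
def extract_rcds(ranges, tol=0.02, min_beams=3):
    """ranges: sonar readings (meters), one per beam angle, in firing order.
    Returns a list of (start_idx, end_idx, mean_range) for each candidate RCD."""
    rcds = []
    start = 0
    for i in range(1, len(ranges) + 1):
        # close the current run when the scan ends or the range jumps
        if i == len(ranges) or abs(ranges[i] - ranges[start]) > tol:
            if i - start >= min_beams:  # enough agreeing beams -> an RCD
                seg = ranges[start:i]
                rcds.append((start, i - 1, sum(seg) / len(seg)))
            start = i
    return rcds

# two flat surfaces at ~2.5 m and ~1.2 m, separated by unstable readings
scan = [2.50, 2.51, 2.50, 2.49, 3.10, 3.90, 1.20, 1.21, 1.19, 1.20]
print(extract_rcds(scan))
```

As the text notes, such a region may equally come from a corner or a rounded curve, so the extracted RCDs are only candidates for further interpretation, not confirmed wall segments.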
Approach and Results ::
Map Building with Sonar ::
Our system can build a safe and accurate map of the environment using sonar alone.
Occupancy grid (OG) maps provide a dense representation of the environment. However, the assumption of independence between cells during updates, together with the lack of a proper sonar model, leads to inconsistent maps.
Feature-based maps provide a more consistent representation by implicitly considering correlations between cells, but they are sparse because features are sparse in a typical environment.
We have developed a method that integrates feature-based representations within the standard Bayesian OG framework and provides a denser, more accurate and safer representation than standard OG methods for robots equipped only with sonar sensors.
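For context, the standard Bayesian OG update that this method builds upon can be sketched in log-odds form; the inverse sensor model probabilities below are illustrative assumptions, not the calibrated sonar model from the papers:

```python
import math

L_OCC = math.log(0.7 / 0.3)   # cell at the beam endpoint: likely occupied
L_FREE = math.log(0.3 / 0.7)  # cells before the endpoint: likely free

def update_cell(logodds, hit):
    """Fuse one sonar observation into a cell's log-odds occupancy value."""
    return logodds + (L_OCC if hit else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

l = 0.0                          # prior: p = 0.5 (unknown)
for hit in (True, True, False):  # two 'occupied' hits, one 'free'
    l = update_cell(l, hit)
print(round(probability(l), 3))  # -> 0.7
```

Each cell is updated independently here, which is exactly the independence assumption the text identifies as a source of inconsistency when raw sonar data is used.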
The left column of Figure 3 shows the different test environments. The middle column shows the maps built with the standard occupancy grid approach, which uses raw sonar data, and the right column shows the maps built with our approach. The improvement in map accuracy in the right column is evident.
Figure 3: Left column: real test environments; middle column: maps built using the conventional occupancy grid approach; right column: maps built using our approach.
SLAM with Sonar ::
SLAM stands for Simultaneous Localization And Mapping.
While moving and scanning the environment, the robot loses track of its pose (x, y, θ), and this pose error accumulates with motion despite highly robust hardware and low-level controls.
To build an accurate map of the environment, the robot needs to know its exact pose (x, y, θ).
But to know its exact pose, the robot needs to localize itself in the map, which in turn requires that the correct map of the environment already be known to the robot.
Hence simultaneous localization and mapping is challenging, and it is known as the chicken-and-egg problem of mobile robotics.
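A toy sketch (with an assumed constant heading bias, not measured odometry noise) illustrates how a small per-step pose error compounds into large positional drift:

```python
import math

def step(pose, dist, dtheta):
    """Compose one motion step onto a pose (x, y, theta)."""
    x, y, th = pose
    th += dtheta
    return (x + dist * math.cos(th), y + dist * math.sin(th), th)

true_pose = (0.0, 0.0, 0.0)
odom_pose = (0.0, 0.0, 0.0)
for _ in range(20):
    true_pose = step(true_pose, 0.5, 0.0)    # robot actually drives straight
    odom_pose = step(odom_pose, 0.5, 0.01)   # odometry adds 0.01 rad bias/step

drift = math.hypot(odom_pose[0] - true_pose[0], odom_pose[1] - true_pose[1])
print(round(drift, 3))
```

After only 10 meters of straight-line motion, a tiny uncorrected heading bias has already displaced the estimated pose by about a meter, which is why continuous localization is essential during mapping.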
As mentioned earlier, with sonar it is very rare to obtain a consistent pair of features or readings from two different positions. Hence SLAM becomes most challenging when sonar sensors are used.
We have developed a feature-chain based approach to SLAM using sonar data only. For localization, instead of relying entirely on feature-to-feature or point-to-point matching, our approach creates feature chains by finding possible associations between features. To achieve robustness, a link-graph based approach is introduced to remove false associations.
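The flavor of link-graph pruning can be sketched as follows; this is a generic pairwise-consistency sketch under assumed tolerances, not the papers' exact algorithm. Two candidate associations are "linked" only if they preserve the inter-feature distance across scans, and a greedy approximation of the largest mutually linked set is kept:

```python
from itertools import combinations

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def consistent_associations(prev, curr, pairs, tol=0.05):
    """pairs: candidate (i, j) meaning feature prev[i] matches curr[j].
    Returns the subset that is mutually distance-consistent."""
    linked = {p: set() for p in pairs}
    for p, q in combinations(pairs, 2):
        # a rigid motion preserves distances between feature pairs
        if abs(dist(prev[p[0]], prev[q[0]]) - dist(curr[p[1]], curr[q[1]])) < tol:
            linked[p].add(q)
            linked[q].add(p)
    best = max(pairs, key=lambda p: len(linked[p]))
    keep = {best}
    for p in sorted(linked[best], key=lambda p: -len(linked[p])):
        if all(p in linked[k] for k in keep):   # consistent with all kept
            keep.add(p)
    return keep

prev = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
curr = [(1.0, 0.0), (3.0, 0.0), (1.0, 2.0), (5.0, 5.0)]  # shifted by (1, 0) + clutter
pairs = [(0, 0), (1, 1), (2, 2), (2, 3)]                 # last pair is false
print(sorted(consistent_associations(prev, curr, pairs)))  # -> [(0, 0), (1, 1), (2, 2)]
```

The false association to the clutter point is dropped because it is inconsistent with every other candidate, which is the role the link graph plays in the approach described above.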
In Figure 4, the left column shows the different test environments. The middle column shows the maps built while continuously localizing the robot with our approach, and the right column shows the maps built without localizing the robot. The right-hand maps come out curved because the robot loses track of its position during map building. The performance gain of our system is obvious from these figures.
Figure 4: Left column: real test environments; middle column: maps built using our SLAM approach; right column: maps built without correcting the localization error.
As a proof of concept, we tested our SLAM approach in two environments that require the robot to close a loop. As shown in Figure 5, the robot is able to build maps involving loops, demonstrating the efficacy of the approach. The last row of Figure 5 shows a larger environment that requires the robot to leave one room, pass through a corridor and enter another room. The present approach enables the robot to build the map autonomously without any external path guidance.
Figure 5: Left column: real test environments, including loops, rooms and a corridor; right column: maps built by the robot autonomously without any external aid for the path.
Exploration and Planning Modules ::
Our system uses frontier-based exploration to decide where to move next in order to explore more of the unknown regions.
First it finds the set of boundaries between known and unknown regions, called frontiers, and decides to move to one of them based on various criteria.
To reach a frontier, it plans the shortest path inside the safe, known region, based on a cost grid.
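These two steps can be sketched on a small grid map; the cell coding (0 = free, 1 = occupied, -1 = unknown) and the use of plain BFS instead of the system's cost grid are simplifying assumptions:

```python
from collections import deque

grid = [
    [0, 0, 0, -1],
    [0, 1, 0, -1],
    [0, 0, 0, -1],
]

def neighbors(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
            yield nr, nc

def frontiers():
    """A frontier cell is a known-free cell adjacent to an unknown cell."""
    return {(r, c) for r in range(len(grid)) for c in range(len(grid[0]))
            if grid[r][c] == 0
            and any(grid[nr][nc] == -1 for nr, nc in neighbors(r, c))}

def path_to_nearest_frontier(start):
    """BFS through free cells; returns the shortest path to any frontier."""
    goal_set = frontiers()
    prev, queue = {start: None}, deque([start])
    while queue:
        cell = queue.popleft()
        if cell in goal_set:                 # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        for n in neighbors(*cell):
            if grid[n[0]][n[1]] == 0 and n not in prev:
                prev[n] = cell
                queue.append(n)
    return None

print(path_to_nearest_frontier((0, 0)))  # -> [(0, 0), (0, 1), (0, 2)]
```

In the real system the frontier is chosen among several candidates using additional criteria and the path is planned on a cost grid within the safe region, but the known/unknown boundary test and the shortest-path search are the same two ingredients.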
Figure 6(a) shows the partial map built after one 360-degree scan of the environment. Figure 6(b) shows the different frontiers detected by the robot: each dotted blue ellipse marks a frontier region, and the green arrow points to the location the robot has decided to visit to explore that region. The robot has decided to move to the bottom-right frontier, shown as a light-green boundary, and the dark-green curve inside the gray region of Figure 6(c) shows the planned shortest path to reach it. The gray shaded region is the region that is safe for the robot to move in.
Figure 6: (a) Partial map of the environment; (b) the different frontier regions and the corresponding frontier locations extracted by the robot as possible locations for further exploration; (c) the planned path to reach the winning frontier location, shown as green line segments inside the safe gray region.
Features of Our Work ::
The developed system works with sonar sensors, which are cheap and small but highly unreliable.
It is a fully autonomous mobile robot navigation system: no external guidance for path or exploration is provided.
Our system builds a safe map with sonar only, which is closer to the ground-truth environment. This reduces run-time obstacle avoidance along with the blind and random decision making that would otherwise be an overhead due to inaccuracy of the map.
It also reduces the danger of collision by minimizing the hazardous zone in the map, i.e., the region the robot believes to be empty but which actually contains obstacles.
It is an online SLAM process that does not delay localization, enabling the system to perform well in localization-hungry and localization-critical applications.
Conventional SLAM approaches require point features, but our approach also works efficiently with line-segment features. This matters because, in a real environment, the features that can be extracted with certainty from a single 360-degree sonar scan are line segments, not point features.
Conventional SLAM approaches require matching of readings or features from different scans, but our approach works in situations where exact matching of features is very difficult.
Unlike many SLAM approaches, ours restricts the forward propagation of any residual error of a localization step, confining the effect of that error to the local region where it occurred.
By maintaining various feature chains, the robot can use information from earlier features to localize from its current position, even if those features have not been detected in the present scan. Hence our approach significantly reduces the overhead of revisiting a feature for the sake of localization only.
Related Publications ::
[1]   Amit Kumar Pandey, K. Madhava Krishna, Mainak Nath, “Feature Based Occupancy Grid Maps for Sonar Based Safe Mapping”, International Joint Conference on Artificial Intelligence (IJCAI), 2007.
[2]   Amit Kumar Pandey, K. Madhava Krishna, Henry Hexmoor, “Feature Chain based Occupancy Grid SLAM for Robots Equipped with Sonar Sensors”, IEEE-International Conference on Integration of Knowledge Intensive Multi-Agent Systems (KIMAS), 2007.
[3]   Amit Kumar Pandey, K. Madhava Krishna, “Link Graph and Feature Chain based Robust Online SLAM for Fully Autonomous Mobile Robot Navigation System using Sonar Sensors”, International Conference on Advanced Robotics (ICAR), 2007.
[4]   Amit Kumar Pandey, K. Madhava Krishna, “Link Graph and Feature Chain based Robust Online SLAM for Fully Autonomous Mobile Robot Navigation System using Sonar Sensors”, Springer-Verlag Lecture Notes in Control and Information Sciences (LNCIS), 2007.
This is a reduced list of publications related to my previous research work on the Autonomous Mobile Robot Navigation System at the Robotics Research Lab, IIIT-Hyderabad.
Visit here for the updated list of publications.
Associated People ::
Amit Kumar Pandey (akpandey@research.iiit.ac.in), Homepage: http://homepages.laas.fr/akpandey
Dr. K. Madhava Krishna (mkrishna@iiit.ac.in), Homepage: http://faculty.iiit.ac.in/~mkrishna