|Title:||Investigation of simultaneous localization and mapping (SLAM) in dynamic settings|
|Authors:||Zhang, Xinzheng|
|Degree:||Ph.D.|
|Issue Date:||2009|
|Abstract:||A crucial characteristic of an indoor autonomous mobile robot is its ability to determine its whereabouts and make sense of its environment. Simultaneous localization and mapping (SLAM) is regarded as an essential capability for the realization of more advanced tasks such as exploration and autonomous navigation. SLAM in static environments, in which the mobile robot is the sole moving object, has been studied extensively over the last three decades. The real world, however, is generally dynamic, and the states of objects change over time. In this context, it is imperative to study SLAM in such environments. This thesis reports research on a robust mapping methodology grounded in robust statistics theory and develops a new set of SLAM strategies from a sensor-fusion viewpoint, drawing on distributed fusion technology, Bayesian inference, and information theory. The data association problem in SLAM is also considered. First, occupancy-grid and segment-based maps are studied with three perception systems (ultrasonic sonar only, laser rangefinder only, and sonar plus a monocular camera) and validated in both static and dynamic environments. The main purpose is to determine an effective map model and sensor configuration for the subsequent studies. These studies led to the adoption of a segment-based map built from the laser rangefinder and fused with vision and ultrasonic sensors; this arrangement serves as the experimental framework for the rest of the thesis. Estimation algorithms based on ordinary least squares fail to extract features in the presence of moving objects.

In this thesis, a robust regression model based on a robust estimate, the MM-estimate, is employed; it fits the contaminated data well and provides reasonable segment predictions. This robust regression model is embedded into Extended Kalman Filter (EKF) SLAM to remove the dynamic features corresponding to moving objects and sensor noise, enhancing the performance of the SLAM procedure. Within EKF-SLAM, the data association problem is revisited and an optimal approach based on graph theory is proposed: it is mathematically proved that optimally solving the minimum-weight bipartite graph matching problem is equivalent to optimally resolving the data association problem. In some special cases, however, the robust regression model does not function properly: when dynamic objects move slowly or pause momentarily, they are erroneously treated as line segments. Hence, an indirect sensor fusion strategy is presented, which consists of two parts. The first is a feature fusion based on Bayesian inference that combines line segments generated by the robust regression model from the laser rangefinder with static line features extracted from a monocular camera; this eliminates pseudo-segments that would otherwise appear in the laser data when dynamic objects pause momentarily. The second is a modified multi-sensor point-estimation fusion that amalgamates two individual EKF-SLAM algorithms, monocular and laser SLAM. It is mathematically proved that the covariance of the state variables in the fused SLAM is reduced compared with that of either individual SLAM, and the accuracy of localization is improved. In particular, for monocular SLAM the thesis proposes a further data association technique based on the homography transformation that reduces redundant computation. The indirect feature fusion procedure, however, only performs a hypothesis test for the removal of pseudo-segments.

To address this deficiency, a modified feature fusion process named direct fusion management is proposed, which directly combines the homogeneous parameters of segments extracted from sonar, laser rangefinder, and camera by means of information theory; the associated features from different sensors are determined by information entropy. The fusion algorithm is a simple and general framework that borrows the idea of entropy weighting from the decision analysis discipline. Furthermore, the entropy weight is introduced into the fusion of the parameter covariances: a weight is allocated to each covariance matrix, and the covariance intersection algorithm is applied to the weighted matrices to derive an amalgamated, reduced covariance matrix. The fused features contribute to EKF-SLAM and decrease the robot position error compared with the results obtained without fusion. In essence, the thesis presents novel methodologies for SLAM in dynamic settings; the proposed solutions are validated by extensive simulation and experimental studies.|
|Subjects:||Hong Kong Polytechnic University -- Dissertations.
Robots -- Motion -- Mathematical models.
Robot vision -- Mathematical models.
|Pages:||1 v. (various pagings) : ill. ; 30 cm.|
|Appears in Collections:||Thesis|
View full-text via https://theses.lib.polyu.edu.hk/handle/200/5172
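The abstract notes that ordinary least squares fails to extract line segments when scans contain moving objects, while a robust estimator still recovers them. A minimal sketch of that idea, using Huber-weighted iteratively reweighted least squares for a 2-D line fit (a simplified M-estimator, not the thesis's actual MM-estimate; the function name and parameters are illustrative):

```python
import numpy as np

def robust_fit_line(x, y, c=1.345, iters=30):
    """Fit y = a*x + b by iteratively reweighted least squares with
    Huber weights, so outlying range returns (e.g. hits on a passing
    person) are down-weighted instead of dragging the segment."""
    X = np.column_stack([x, np.ones_like(x)])
    w = np.ones(len(y))
    beta = np.zeros(2)
    for _ in range(iters):
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted LS step
        r = y - X @ beta                            # residuals
        s = 1.4826 * np.median(np.abs(r)) + 1e-12   # robust scale (MAD)
        u = np.abs(r) / (s * c)
        w = np.where(u <= 1.0, 1.0, 1.0 / u)        # Huber weight
    return beta  # (slope, intercept)
```

On clean points plus a few gross outliers, the fit stays close to the underlying line where an ordinary least-squares fit would be pulled away.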
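The abstract's reduction of data association to minimum-weight bipartite graph matching can be sketched with SciPy's Hungarian-algorithm solver. Here the cost is plain squared Euclidean distance between observed features and map landmarks, whereas an EKF implementation would use a Mahalanobis distance with the innovation covariance; the function name and gate threshold are illustrative:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(observations, landmarks, gate=1.0):
    """Match observed features to map landmarks as a minimum-weight
    bipartite matching, then gate out pairings that are too far apart
    (candidate new or dynamic features)."""
    diff = observations[:, None, :] - landmarks[None, :, :]
    cost = np.einsum('ijk,ijk->ij', diff, diff)     # squared distances
    rows, cols = linear_sum_assignment(cost)        # optimal matching
    return [(int(r), int(c)) for r, c in zip(rows, cols)
            if cost[r, c] <= gate]
```

Because the matching is solved globally rather than greedily per feature, no two observations can claim the same landmark.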
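The covariance fusion step described under direct fusion management can be sketched with the standard covariance intersection formula for two estimates with unknown cross-correlation. In the thesis the weight comes from an information-entropy computation; here it is simply an input, and this is a generic CI sketch rather than the thesis's exact algorithm:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, w):
    """Fuse two estimates (x1, P1) and (x2, P2) by covariance
    intersection: P^-1 = w*P1^-1 + (1-w)*P2^-1, with w in [0, 1]."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w * I1 + (1 - w) * I2)        # fused covariance
    x = P @ (w * I1 @ x1 + (1 - w) * I2 @ x2)       # fused estimate
    return x, P
```

Covariance intersection remains consistent without knowing the correlation between the two sources, which is why it suits fusing overlapping laser, sonar, and vision estimates.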
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.