Title: Fuzzy modeling and behaviour learning of autonomous mobile robots
Authors: Chow, Ka-ming Jimmy
Keywords: Mobile robots; Hong Kong Polytechnic University -- Dissertations
Issue Date: 2001
Publisher: The Hong Kong Polytechnic University
Abstract: In the past two decades there has been a surge of research on Autonomous Mobile Robots (AMRs), particularly within the soft-computing research community, which has found AMRs to be an ideal platform through which its ideas can be realized. Many researchers have advocated that mobile robots should be designed with characteristics that enable them to "understand" the outside world. Nevertheless, some researchers have cautioned that such "understanding" is very crude and should be interpreted only loosely, as designing robots that combine several primitive behaviours by themselves, e.g., navigating through an unstructured environment, avoiding obstacles, and generating dynamic maps for changing environments. In other words, robots should be designed to emulate "some" human functions. Fuzzy systems (FS) provide a mechanism for "machines" to copy human behaviour. In particular, fuzzy modeling has been widely used in the identification of unknown or partially known problems such as those encountered in AMR research. In this thesis, the challenging problems of modeling, behaviour learning, and map building for AMRs are addressed and some solutions are provided. The approach adopted in this work is a fusion of fuzzy reasoning and genetic algorithms, augmented by model-based methodologies such as least-squares and gradient-descent algorithms.
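As a rough illustration of the kind of fuzzy behaviour fusion the abstract alludes to (the thesis's actual rule bases are not reproduced here), the sketch below combines two obstacle-avoidance rules via Sugeno-style weighted-average defuzzification. All membership functions, thresholds, and rule consequents are invented for illustration:

```python
# Minimal fuzzy obstacle-avoidance sketch. Membership shapes and
# rule consequents are illustrative, not taken from the thesis.

def mu_near(d):
    """Membership of distance d (metres) in the fuzzy set 'near'."""
    if d <= 0.2:
        return 1.0
    if d >= 1.0:
        return 0.0
    return (1.0 - d) / 0.8

def steering(d_left, d_right):
    """Sugeno-style weighted average of two rules:
       IF left is near THEN turn right (+1);
       IF right is near THEN turn left (-1)."""
    w_r = mu_near(d_left)    # firing strength of 'turn right'
    w_l = mu_near(d_right)   # firing strength of 'turn left'
    if w_r + w_l == 0.0:
        return 0.0           # nothing near: go straight
    return (w_r * 1.0 + w_l * (-1.0)) / (w_r + w_l)
```

With an obstacle close on the left only, the output saturates at +1 (turn right); with symmetric readings the rules cancel and the robot goes straight.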
The studies undertaken and reported in this thesis are arranged into three phases. In the first phase, a new class of fuzzy model, labelled the Virtual Higher-Resolution Fuzzy Model (VHR-FM), is suggested and its properties are explored. Its characteristics, as well as a comparison with the standard fuzzy model, have also been studied. This fuzzy model forms the backbone of several fuzzy learning algorithms.
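The abstract does not reproduce the VHR-FM itself, but the "standard fuzzy model" it is compared against can be sketched as a zero-order Takagi-Sugeno model: each rule fires with a Gaussian membership strength, and the output is the firing-strength-weighted average of the rule consequents. The centres, widths, and consequents below are illustrative placeholders:

```python
import numpy as np

def gaussian_mf(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def ts_model(x, centres, widths, consequents):
    """Zero-order Takagi-Sugeno model: weighted average of rule
    consequents, weights given by Gaussian memberships of x."""
    w = gaussian_mf(x, centres, widths)           # rule firing strengths
    return float(np.dot(w, consequents) / np.sum(w))
```

For two rules with consequents 0 and 1 centred at x = 0 and x = 1, the output interpolates smoothly between the two consequents as x moves between the centres.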
In Phase II, the notion of the VHR-FM is used to formulate identification algorithms based on gradient-descent and recursive least-squares update laws. The VHR-FM is next combined with genetic algorithms to provide an on-line fuzzy identification algorithm. To alleviate the computational burden that is a by-product of on-line identification, i.e., the large window memory size, a novel Fuzzy Data Window Memory (FDWM) algorithm is also proposed. The FDWM reduces the size of the window memory significantly without deterioration in performance. Finally, the VHR Table Look-up (TL) scheme is proposed, which generates fuzzy rules by validating the data separately in the corresponding VHR fuzzy sets. The performance of these algorithms is assessed through extensive simulation and experimental studies.
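The recursive least-squares update law mentioned above can be sketched in its standard textbook form with a forgetting factor; the thesis's VHR-FM-specific regressor construction is not shown, so `phi` here is simply a generic regressor vector:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with forgetting factor lam:
    theta <- theta + K * (y - phi^T theta), standard textbook form."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + phi.T @ P @ phi)   # gain vector
    theta = theta + (K * (y - phi.T @ theta)).ravel()
    P = (P - K @ phi.T @ P) / lam           # covariance update
    return theta, P

# Identify y = 2x from streamed samples (lam=1.0: no forgetting).
theta, P = np.zeros(1), np.eye(1) * 1000.0
for x, y in [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]:
    theta, P = rls_update(theta, P, np.array([x]), y, lam=1.0)
```

Iterating this update over streamed input-output pairs recovers the parameters of any linear-in-parameters model; in a fuzzy identification setting, `phi` would typically collect the normalized rule firing strengths so that the consequent parameters are estimated recursively.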
Phase III of the study addresses the map-building problem. A Fuzzy-Tuned Grid-Based Map (FTGBM) is proposed. In this algorithm, the probability distribution function associated with measurements from sonar sensor data is tuned on-line via the information held in the occupancy grid map. The FTGBM is also combined with the behaviour-learning algorithms of Phase II. The sonar data are also validated against the information available in the FTGBM. Finally, the validated sonar data are used to select an appropriate model automatically for further learning. The experimental platform for this phase was an in-house designed and built mobile robot, the Explorer.
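Occupancy-grid map building of this kind is commonly implemented as a Bayesian log-odds update per cell. The sketch below shows that standard update only; the thesis's fuzzy on-line tuning of the sonar probability model is not reproduced, and the fixed `p_occ = 0.7` is an illustrative stand-in for a tuned sensor model:

```python
import math

def logodds_update(l_prev, p_occ):
    """Bayesian cell update: add the measurement's log-odds."""
    return l_prev + math.log(p_occ / (1.0 - p_occ))

def prob(l):
    """Convert a cell's log-odds back to occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Two 'occupied' sonar readings (p_occ = 0.7) on an initially
# unknown cell (log-odds 0, i.e. probability 0.5).
l = 0.0
for _ in range(2):
    l = logodds_update(l, 0.7)
```

Working in log-odds makes repeated sensor fusion a simple addition per cell, which is why grid-based sonar mapping usually stores log-odds rather than probabilities.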
Description: 1 v. (various pagings) : ill. ; 30 cm.
PolyU Library Call No.: [THS] LG51 .H577P EE 2001 Chow
URI: http://hdl.handle.net/10397/977
Rights: All rights reserved.
Appears in Collections: Thesis
Files in This Item:
b15995562_link.htm (For PolyU Users; 179 B; HTML)
b15995562_ir.pdf (For All Users, non-printable; 5.57 MB; Adobe PDF)
Citations as of Mar 11, 2018