Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/86980
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | -
dc.creator | Darwish, Walid Abdallah Aboumandour | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/9662 | -
dc.language.iso | English | -
dc.title | Precise reconstruction of indoor environments using RGB-depth sensors | -
dc.type | Thesis | -
dcterms.abstract | Commercial RGB-D cameras (e.g., Kinect) have been widely used in the gaming industry as non-touch remote controllers. RGB-D cameras are designed for applications within a range of about three metres, where geometric fidelity is not of utmost importance. Recently, the Structure Sensor was released as the first mobile RGB-D camera on the commercial market. As this promising camera has great potential for indoor navigation and 3D modelling, precise calibration of its depth information, working range, and geometric sensor parameters is required. In this study, we propose a novel calibration method for structured-light (SL) RGB-D cameras. The method uses a novel distortion model for the captured depth images; this depth distortion model absorbs the distortion effects of both IR sensors. The method also calibrates the geometric parameters of each RGB-D camera lens, and it extends to modelling the systematic depth bias resulting from the imaging conditions and the IR sensors' baseline. It can thoroughly calibrate the full range of SL RGB-D cameras independently of the IR sensors' baseline. The calibration procedure was standardized and designed to run automatically. The proposed method calibrates the sensor's full range and achieves a relative error of 0.8%, whereas ordinary calibration methods cover only 34% of the sensor's range and achieve a relative error of 4.0%. Owing to the scale of indoor environments, many RGB-D frames were collected and registered together to form a complete colored 3D model. The Simultaneous Localization And Mapping (SLAM) technique is used to track the RGB-D camera. Scene structure, depth range, and feature types are the dominant factors affecting registration accuracy and thus SLAM performance; they can easily force SLAM into severe drift or terminate tracking altogether (lost tracking). Current SLAM systems use visually matched point features to compute the camera pose and therefore suffer from lost tracking and inevitable drift. To reduce the probability of lost tracking and drift, strong features (lines and planes) were added to the SLAM tracking core. In this context, a new procedure to detect, extract, describe, and match these 3D features is proposed. Line features are extracted from the RGB and depth images, while plane features are extracted from the depth image alone. The procedure uses a novel descriptor that combines visual and depth information to describe the 3D features for subsequent matching. A new RGB-D SLAM system is proposed to exploit the matched 3D features. The Fully Constrained RGB-D SLAM (FC RGB-D SLAM) system minimizes the combined geometric distance of the 2D and 3D matched features to estimate the camera pose; to enhance 3D model quality, it then applies a global refinement stage that refines the estimated camera poses using indoor geometric constraints. The system also adopts graph-based optimization to correct the closure error whenever a loop closure is detected. The results show that, compared with visual RGB-D SLAM systems, FC RGB-D SLAM achieves significant improvements in 3D model accuracy both with and without loop-closure constraints. | -
dcterms.accessRights | open access | -
dcterms.educationLevel | Ph.D. | -
dcterms.extent | xix, 119 pages : color illustrations | -
dcterms.issued | 2018 | -
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | -
dcterms.LCSH | Computer vision | -
dcterms.LCSH | Depth perception | -
Appears in Collections: Thesis
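
The record above contains no code, so the sketches that follow are purely illustrative. First, the abstract describes modelling and removing a systematic depth bias over the sensor's full range. Below is a minimal sketch of how such a correction might be applied, assuming a simple polynomial bias model in depth; the cubic coefficients, function name, and frame size are hypothetical, not the thesis's actual model.

```python
import numpy as np

def correct_depth(depth_mm, coeffs):
    """Subtract a polynomial model of the systematic range bias from a raw
    structured-light depth image (values in millimetres).

    coeffs: polynomial coefficients from calibration, highest degree first
    (the layout numpy.polyval expects). The values used below are made up.
    """
    bias = np.polyval(coeffs, depth_mm)   # predicted bias at each raw depth
    corrected = depth_mm - bias
    corrected[depth_mm == 0] = 0          # keep invalid (zero-depth) pixels invalid
    return corrected

# Hypothetical cubic bias model and a synthetic 640x480 depth frame.
coeffs = np.array([1.2e-8, -3.5e-5, 2.0e-2, -1.0])
raw = np.random.uniform(500.0, 4000.0, size=(480, 640))
fixed = correct_depth(raw, coeffs)
```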
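The abstract states that plane features are extracted from the depth image alone. One common way to do this, sketched below under the assumption of a pinhole depth camera, is to back-project depth pixels into 3D and fit the dominant plane with RANSAC; the intrinsics, thresholds, and helper names are assumptions rather than the thesis's actual procedure.

```python
import numpy as np

def backproject(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an Nx3 point cloud,
    assuming a pinhole camera with the given intrinsics."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m.ravel()
    valid = z > 0                               # zero depth marks invalid pixels
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    return np.column_stack([x, y, z])[valid]

def ransac_plane(points, iters=200, tol=0.01, seed=0):
    """Fit the dominant plane n.p + d = 0 by RANSAC;
    tol is the inlier distance threshold in metres."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:
            continue                            # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        inliers = np.abs(points @ n - n @ p0) < tol
        if inliers.sum() > best.sum():
            best = inliers
    inl = points[best]                          # least-squares refit on inliers
    centroid = inl.mean(axis=0)
    n = np.linalg.svd(inl - centroid)[2][-1]    # normal = smallest singular vector
    return n, -n @ centroid, best

# Synthetic frame: a flat wall 2 m away, with 5 mm depth noise.
depth = 2.0 + np.random.normal(0.0, 0.005, (480, 640))
pts = backproject(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
n, d, inliers = ransac_plane(pts)
```

Repeating the fit on the remaining outliers would yield further planes (walls, floor, ceiling), which is part of what makes plane features attractive indoors.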
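FC RGB-D SLAM is described as minimizing the combined geometric distance of 2D and 3D matched features to estimate the camera pose. The sketch below illustrates the general idea by stacking pixel reprojection residuals for matched point features with point-to-plane residuals for a matched plane feature into one least-squares problem; the cost weighting and every name here are assumptions, not the thesis's formulation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pose_residuals(x, pts3d, obs2d, K, plane_pts, plane_n, plane_d):
    """Residuals for one camera pose x = (rotation vector, translation).

    Stacks 2D reprojection errors of matched point features (pts3d in the
    map frame, obs2d their pixel observations) with point-to-plane
    distances of a matched plane feature (plane_pts sampled on the map
    plane; plane_n, plane_d the observed plane in the camera frame)."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    cam = pts3d @ R.T + t                      # map frame -> camera frame
    proj = cam @ K.T
    r2d = (proj[:, :2] / proj[:, 2:3] - obs2d).ravel()
    cam_p = plane_pts @ R.T + t
    r3d = cam_p @ plane_n + plane_d            # signed point-to-plane distance
    return np.concatenate([r2d, 100.0 * r3d])  # hypothetical 3D-term weight

# Synthetic data: at the true pose (identity) all residuals vanish.
K = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
pts3d = np.random.rand(20, 3) + [0.0, 0.0, 2.0]
obs2d = (pts3d @ K.T)[:, :2] / pts3d[:, 2:3]
plane_pts = np.column_stack([np.random.rand(50, 2), np.full(50, 2.5)])
plane_n, plane_d = np.array([0.0, 0.0, 1.0]), -2.5
sol = least_squares(pose_residuals, np.zeros(6),
                    args=(pts3d, obs2d, K, plane_pts, plane_n, plane_d))
```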
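Finally, the abstract mentions graph-based optimization to correct the closure error whenever a loop closure is detected. Below is a toy planar pose-graph example, again only a sketch of the general technique and not the thesis's system: odometry edges drift around a square, and a single loop-closure edge pulls the trajectory back into shape.

```python
import numpy as np
from scipy.optimize import least_squares

def relative_pose(pa, pb):
    """Pose of b expressed in the frame of a, for planar poses (x, y, theta)."""
    dx, dy = pb[0] - pa[0], pb[1] - pa[1]
    c, s = np.cos(pa[2]), np.sin(pa[2])
    return np.array([c * dx + s * dy, -s * dx + c * dy, pb[2] - pa[2]])

def residuals(flat, edges, n_poses):
    poses = flat.reshape(n_poses, 3)
    res = [poses[0]]                           # gauge: pin the first pose at the origin
    for i, j, meas in edges:
        r = relative_pose(poses[i], poses[j]) - meas
        r[2] = np.arctan2(np.sin(r[2]), np.cos(r[2]))   # wrap the angle residual
        res.append(r)
    return np.concatenate(res)

# Four odometry edges around a 1 m square, plus one loop-closure edge
# (node 4 should coincide with node 0); the initial guesses include drift.
edges = [(0, 1, np.array([1.0, 0.0, np.pi / 2])),
         (1, 2, np.array([1.0, 0.0, np.pi / 2])),
         (2, 3, np.array([1.0, 0.0, np.pi / 2])),
         (3, 4, np.array([1.0, 0.0, np.pi / 2])),
         (4, 0, np.array([0.0, 0.0, 0.0]))]   # detected loop closure
init = np.array([[0.0, 0.0, 0.0], [1.1, 0.1, 1.5], [2.2, 1.0, 3.1],
                 [1.2, 2.1, 4.6], [0.2, 1.1, 6.1]])
sol = least_squares(residuals, init.ravel(), args=(edges, 5))
optimized = sol.x.reshape(5, 3)
```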
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.