Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/79583
Title: Precise reconstruction of indoor environments using RGB-depth sensors
Authors: Darwish, Walid Abdallah Aboumandour
Advisors: Chen, Wu (LSGI)
Wu, Bo (LSGI)
Keywords: Computer vision
Depth perception
Issue Date: 2018
Publisher: The Hong Kong Polytechnic University
Abstract: Commercial RGB-D cameras (e.g., Kinect) have been widely used in the gaming industry as touch-free remote controllers. These cameras are designed for applications within a range of about three meters, where geometric fidelity is not of utmost importance. Recently, the Structure Sensor was released as the first mobile RGB-D camera on the commercial market. As this promising camera has great potential for indoor navigation and 3D modelling, its depth information, working range, and geometric sensor parameters must be precisely calibrated. In this study, we propose a novel calibration method for structured-light (SL) RGB-D cameras. The method uses a novel distortion model for the captured depth images; this depth distortion model absorbs the distortion effects of both IR sensors. The method calibrates the geometric parameters of each RGB-D camera lens, and it further models the systematic depth bias arising from the imaging conditions and the IR sensors' baseline. The method can thoroughly calibrate the full range of an SL RGB-D camera independently of the IR sensors' baseline, and the calibration procedure is normalized and designed to be automatic. The proposed method calibrates the full range of the sensor with a relative error of 0.8%, whereas ordinary calibration methods can only calibrate up to 34% of the sensor's range and achieve a relative error of 4.0%. Because of the scale of indoor environments, many RGB-D frames must be collected and registered together to form a complete colored 3D model. The Simultaneous Localization And Mapping (SLAM) technique is used to track the RGB-D camera. The scene structure, the depth range, and the feature types are the dominant elements affecting registration accuracy and thus SLAM performance; these elements can easily force SLAM into severe drift or cause tracking to be lost.
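The systematic depth bias described above can be pictured as a correction applied to each measured range. The sketch below is purely illustrative: the polynomial form and the coefficients are assumptions for demonstration, not the thesis's calibrated distortion model.

```python
import numpy as np

def correct_depth(depth, coeffs):
    """Apply a hypothetical depth-bias correction.

    The thesis models systematic depth bias as a function of the
    imaging conditions; here that idea is sketched as a simple
    polynomial in the measured depth (an assumed form with
    illustrative coefficients).
    """
    d = np.asarray(depth, dtype=float)
    # Assumed bias model: bias(d) = c0 + c1*d + c2*d**2
    bias = coeffs[0] + coeffs[1] * d + coeffs[2] * d ** 2
    return d - bias

# Example: raw depths in meters, corrected with assumed coefficients
raw = np.array([1.0, 2.0, 3.0])
corrected = correct_depth(raw, coeffs=(0.01, 0.005, 0.002))
```

In a real calibration pipeline, the coefficients would be estimated per camera by comparing measured depths against ground-truth ranges across the sensor's full working range.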
Current SLAM systems use visually matched point features to compute the camera pose; consequently, they suffer from lost-tracking problems and inevitable drift. To minimize the probability of lost tracking and drift, strong features (lines and planes) were added to the SLAM tracking core. In this context, a new procedure to detect, extract, describe, and match these 3D features was proposed. Line features were extracted using both the RGB and depth images, while plane features were extracted from the depth image alone. The procedure uses a novel descriptor that combines visual and depth information to describe the 3D features for subsequent matching. A new RGB-D SLAM system is proposed to utilize these valuable 3D matched features. The Fully Constrained RGB-D SLAM (FC RGB-D SLAM) system minimizes the combined geometric distance of the 2D and 3D matched features to estimate the camera pose. To enhance the quality of the 3D model, the system then applies a global refinement stage that refines the estimated camera poses based on indoor geometric constraints. The system also adopts a graph-based optimization technique to correct the closure error whenever a loop closure is detected. The results show that, compared with visual RGB-D SLAM systems, FC RGB-D SLAM achieves significant improvements in 3D model accuracy both with and without loop closure constraints.
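At the core of pose estimation from matched features is solving for the rigid transform that aligns corresponding 3D points between frames. The sketch below shows only this 3D-point component, solved in closed form with the standard Kabsch/SVD method; it is an illustrative stand-in, not the thesis's full combined 2D/3D cost function with line and plane constraints.

```python
import numpy as np

def estimate_pose(src, dst):
    """Estimate the rigid transform (R, t) with dst ~ R @ src + t.

    Closed-form least-squares alignment of matched 3D points
    (Kabsch algorithm) -- a sketch of the point-feature part of
    RGB-D camera pose estimation.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

In a full SLAM front end, such a closed-form estimate would typically initialize an iterative optimization that also folds in the 2D reprojection terms and the line and plane constraints.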
Description: xix, 119 pages : color illustrations
PolyU Library Call No.: [THS] LG51 .H577P LSGI 2018 Darwish
URI: http://hdl.handle.net/10397/79583
Rights: All rights reserved.
Appears in Collections:Thesis

Files in This Item:
File                          Description                     Size     Format
991022165759203411_link.htm   For PolyU Users                 167 B    HTML
991022165759203411_pira.pdf   For All Users (Non-printable)   3.13 MB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.