Title: Suture thread detection and 3D model reconstruction for automated surgical knot tying with a vision-based robotic system
Authors: Lu, Bo
Degree: Ph.D.
Issue Date: 2019
Abstract: With the advancement of modern technology, robots have made significant progress over the last two decades. Owing to their high precision, dexterity, and sophistication, more and more traditional industries are deploying robots to reduce manual labor and to complete tasks in extreme conditions with higher accuracy. Nowadays, surgical robots can be found in clinics, and robot-assisted surgery (RAS) has become the cutting-edge technique in minimally invasive surgery (MIS), providing precise and flexible maneuverability for surgical tools. To further reduce the on-site burden on surgeons during long surgeries, one promising solution is to develop surgical robots and related algorithms that execute low-level operations automatically. Surgical knot tying is a fundamental yet important step. This thesis investigates the automation of suture thread grasping and the subsequent looping procedures. For a pierced surgical thread, estimating its three-dimensional (3D) position is essential for automated suture grasping. Nevertheless, accurate suture detection remains a challenging task because of the flexibility and indistinctive feature points of sutures. To resolve this problem, a novel model-free method is proposed to enable 3D reconstruction of a suture thread through a pre-calibrated stereo-camera system. To interact with the robotic system, surgeons only need to indicate the tip and grasping points in the left camera image. Our approach then refines these manual clicks and locates the accurate positions of the two points. Moreover, an iterative computation method is employed to segment points along the suture thread so that stereo pairs of key points are identified sequentially in both cameras. Consequently, the 3D coordinates of the suture thread can be reconstructed. Experiments were conducted using different backgrounds to examine the accuracy and robustness of the algorithm in detecting the image coordinates of the suture thread.
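The reconstruction step above can be illustrated with a minimal sketch. Assuming a rectified, pre-calibrated stereo pair with a standard pinhole model (the thesis's actual key-point pairing and click refinement are more involved; the parameter values below are illustrative only), each matched key point yields depth from its horizontal disparity:

```python
import numpy as np

def triangulate_points(pts_left, pts_right, f, baseline, cx, cy):
    """Triangulate matched key points from a rectified stereo pair.

    pts_left / pts_right: (N, 2) pixel coordinates of the same suture
    key points in the left and right images (rectified, so rows match).
    f: focal length in pixels; baseline: camera separation in metres;
    (cx, cy): principal point. Returns (N, 3) points in the left
    camera frame.
    """
    pts_left = np.asarray(pts_left, dtype=float)
    pts_right = np.asarray(pts_right, dtype=float)
    disparity = pts_left[:, 0] - pts_right[:, 0]   # horizontal shift in pixels
    z = f * baseline / disparity                   # depth from disparity
    x = (pts_left[:, 0] - cx) * z / f              # back-project to 3D
    y = (pts_left[:, 1] - cy) * z / f
    return np.column_stack([x, y, z])

# A key point 10 px of disparity away sits at depth f*B/d = 500*0.1/10 = 5 m
pts = triangulate_points([[370.0, 240.0]], [[360.0, 240.0]],
                         f=500.0, baseline=0.1, cx=320.0, cy=240.0)
```

Running the same routine over every stereo key-point pair along the suture yields the ordered 3D polyline that the grasp planner consumes.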
Afterward, suture threads were formed into various 3D shapes, and their computed outcomes were obtained for comparison. To further enhance the 3D computation of the suture thread, a deep-learning-driven surgical suture thread segmentation method was proposed. This model was trained on our own data set and can be applied to localize the suture's tip, eliminating the manual click operation. A Hessian-matrix-based filter was then built to remove environmental noise and highlight the curved suture thread. Owing to the directional ambiguity of Hessian eigenvectors in detecting curvilinear objects, we apply, for the first time, an accurate multistencils fast marching method (MFMM) to surgical suture thread detection, transforming the image into an arrival-time map of the propagation. Integrating these results, a vision-based stereo pairing algorithm, which calculates stereo key-point pairs from the suture thread's tip to its end, was proposed for 3D coordinate computation. With this 3D information, the suture thread can be picked up automatically by the robot. Experiments on the precision of the deep learning model, the robustness of the image filter, and the overall accuracy of the 3D coordinate computation were conducted using different backgrounds with various noises.
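The Hessian-based filtering idea can be sketched as follows. A thin bright curve such as a suture produces one strongly negative eigenvalue of the image Hessian across the curve; its magnitude serves as a ridge response. This is a simplified stand-in for the thesis's filter, using plain finite differences and omitting the suture-scale Gaussian smoothing a real pipeline would apply first:

```python
import numpy as np

def ridge_response(image):
    """Ridgeness map from the 2x2 image Hessian (finite differences).

    On a bright curvilinear structure, the Hessian's smaller eigenvalue
    is strongly negative across the curve; -lambda_min is returned as
    the response, clipped at zero so dark edges do not fire.
    """
    gy, gx = np.gradient(image.astype(float))   # first derivatives
    gyy, _ = np.gradient(gy)                    # second derivatives
    gxy, gxx = np.gradient(gx)
    # Eigenvalues of the symmetric Hessian [[gxx, gxy], [gxy, gyy]]
    half_trace = (gxx + gyy) / 2.0
    root = np.sqrt(((gxx - gyy) / 2.0) ** 2 + gxy ** 2)
    lam_min = half_trace - root                 # most negative on a bright ridge
    return np.maximum(-lam_min, 0.0)

# Synthetic frame: a one-pixel bright "suture" across a dark background
img = np.zeros((64, 64))
img[32, :] = 1.0
resp = ridge_response(img)   # peaks along row 32, near zero elsewhere
```

Because the eigenvalue magnitude, not the eigenvector direction, drives the response, the map still needs an ordering step; that is where the arrival-time map from the MFMM propagation comes in.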
To further automate the suture thread looping manipulation, a dynamic approach is presented to automate knot tying with an in-house robot-vision system. We propose a new robotic knot tying technique that reduces the workspace required to construct a surgical knot through simultaneous manipulation of, and coordination between, two grippers. A position offset along the center-of-axis direction is introduced between the two grippers, which facilitates efficient formation of the suture loop. Through proper path planning, potential issues such as suture slippage and collisions between instruments can be eliminated. Moreover, visual images are employed to monitor the motions of the two grippers in real time. Derivations of the transformation relationship between the image and robot coordinates are provided, and the positions of the two grippers are evaluated using transformation matrices obtained experimentally. In addition, a Linear Quadratic (LQ) control scheme is applied to optimize the tracking performance of the two grippers. During the dynamic looping, visual occlusions between the two grippers may occasionally occur; hence, a position estimation approach is incorporated into the control method to handle this condition, further enhancing the robustness of the entire procedure. Experiments on the accuracy of the visual system in evaluating the gripper positions were conducted. The proposed model for predicting the position when tracking is lost was also examined, and different parameters of the control scheme were tested by introducing external impulse disturbances during the knot tying process. Finally, the overall suture looping operation was successfully performed in all six trials, demonstrating a reliable and efficient automated approach to surgical suture looping.
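The LQ tracking scheme can be sketched for a single gripper axis. As an assumption for illustration (not the thesis's actual plant model or weights), the axis is modeled as a discrete-time double integrator, and the feedback gain is obtained from the standard backward Riccati recursion:

```python
import numpy as np

def lqr_gain(A, B, Q, R, iters=300):
    """Steady-state discrete LQR gain via backward Riccati recursion."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Hypothetical 1-DOF gripper axis: state = [position error, velocity]
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.diag([10.0, 1.0])      # penalize position error most
R = np.array([[0.1]])         # control-effort weight
K = lqr_gain(A, B, Q, R)

# Closed loop: drive a 1 m initial tracking offset toward zero
x = np.array([[1.0], [0.0]])
for _ in range(400):
    u = -K @ x                # LQ state feedback
    x = A @ x + B @ u
```

During an occlusion, the measured state in this loop would be replaced by a predicted one (e.g. propagating the last estimate through the same model A), which is the role of the position estimation approach described above.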
The entire method requires a manipulator with only 3 degrees of freedom (DOF) and a stereo camera, and this set of approaches can be implemented on surgical robots equipped with a high-quality stereo endoscope to carry out in-vivo manipulations. In this thesis, solutions to key challenges in surgical knot tying were comprehensively validated. The success of this study not only provides significant insight into achieving automated surgical knot tying, but also lays a basis for automating various surgical tasks in the near future.
Subjects: Hong Kong Polytechnic University -- Dissertations
Surgical robots
Robot vision
Pages: xxii, 205 pages : color illustrations
Appears in Collections: Thesis

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.