
Research On Augmented Reality Navigation Technology For Minimally Invasive Surgery Under Stereoscopic Endoscope

Posted on: 2022-08-21
Degree: Master
Type: Thesis
Country: China
Candidate: W P Cai
Full Text: PDF
GTID: 2504306539961229
Subject: Electronics and Communications Engineering

Abstract:
In recent years, the medical field has continuously adopted high-tech advances in computing, using robotics, computer graphics, and image-processing technology to assist treatment. This has driven a trend of discipline integration; in surgery in particular, computer-assisted minimally invasive surgery has become a hotspot and active research direction. One of the most prominent technologies is augmented reality, which fuses virtual content with real scenes and thereby enhances the surgeon's experience and perception of the surgical environment. In current visually guided minimally invasive surgery, the endoscope's limited field of view and lack of depth information make it difficult for surgeons to locate surgical instruments and specific lesions. This thesis therefore focuses on automatic detection and positioning of minimally invasive surgical instruments and on ultrasound-image-based augmented reality fusion, and applies them to surgical navigation. The resulting system not only obtains the position of surgical instruments accurately and in real time, but also superimposes intraoperative prior information, such as live ultrasound images, onto the surgical view, visually displaying the patient's lesion during the operation, accurately locating the instruments, and supporting the smooth progress of the operation.

First, this thesis studies a surgical instrument positioning algorithm for a minimally invasive surgery augmented reality navigation system, which automatically detects and positions surgical instruments and recovers their three-dimensional spatial pose. The algorithm first trains a U-Net-based segmentation model to segment surgical instruments in the complex minimally invasive surgical environment. It then uses the ORB algorithm to detect head features of the surgical tool in the binocular images of the stereo endoscope, combined with the optimized NN-GMS matching algorithm proposed in this thesis to eliminate mismatches. The three-dimensional coordinates of the matched feature points are computed from the binocular stereo endoscopic imaging model to generate a three-dimensional feature point cloud. Finally, taking minimization of the registration error function as the objective, the ST-ICP algorithm registers this feature point cloud against the surgical instrument's point-cloud model in stereo space, yielding the positioning information of the instrument under the binocular stereo endoscope.

Second, traditional augmented reality navigation for minimally invasive surgery reconstructs a three-dimensional model from preoperative prior information such as CT or MRI and overlays the reconstructed model onto the binocular stereo endoscopic video. Because such techniques locate the lesion mainly from preoperative data, they cannot account for the movement and deformation of soft tissue during the operation. This thesis therefore studies key technologies for real-time detection and fusion of an ultrasound probe, based on ultrasound images under stereo endoscopy, aiming at an augmented reality navigation effect that better matches the real intraoperative environment. First, the automatic instrument detection and positioning algorithm proposed above determines the three-dimensional coordinates of the ultrasound probe head in the binocular stereo endoscopic video. These coordinate points are then used to compute the probe's scanning plane. Finally, the ultrasound image acquired by the probe is fused and superimposed onto this scanning plane in real time, achieving the augmented reality effect. This technology realizes immediate superposition and fusion of the ultrasound image within the binocular stereo endoscopic video, and ensures that the position of the ultrasound image in stereo space is consistent with the position actually scanned by the ultrasound probe.
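The three-dimensional coordinates of the matched feature points come from the binocular stereo imaging model; for a rectified stereo pair this reduces to standard disparity-based triangulation. The sketch below illustrates that model only; the camera parameters (`f`, `cx`, `cy`, `baseline`) are illustrative assumptions, not the thesis' calibrated endoscope values:

```python
import numpy as np

def triangulate(uv_left, uv_right, f, cx, cy, baseline):
    """Recover 3-D points from matched pixels in a rectified stereo pair.

    uv_left, uv_right: (N, 2) arrays of matched pixel coordinates.
    f: focal length in pixels; (cx, cy): principal point; baseline: in metres.
    """
    uv_left = np.asarray(uv_left, dtype=float)
    uv_right = np.asarray(uv_right, dtype=float)
    disparity = uv_left[:, 0] - uv_right[:, 0]   # horizontal shift between the two views
    z = f * baseline / disparity                 # depth from the standard stereo model
    x = (uv_left[:, 0] - cx) * z / f
    y = (uv_left[:, 1] - cy) * z / f
    return np.stack([x, y, z], axis=1)

# One matched feature with 40 px disparity: f = 800 px, baseline = 25 mm -> 0.5 m deep
pts = triangulate([[420, 300]], [[380, 300]], f=800, cx=400, cy=300, baseline=0.025)
```

Applying this to every validated ORB match yields the three-dimensional feature point cloud used in the registration stage.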
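ST-ICP is the thesis' own registration variant, whose details are not given in this abstract. Its core step, shared with classical ICP, is a least-squares rigid alignment between corresponding point sets; a minimal Kabsch-style sketch of that shared step (plain ICP inner step, not the ST-ICP modification) looks like:

```python
import numpy as np

def rigid_align(src, dst):
    """One least-squares rigid alignment step (Kabsch), the inner step of ICP.

    src, dst: (N, 3) point sets with known correspondences.
    Returns R, t such that dst ~= src @ R.T + t.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Usage: recover a known rotation/translation applied to a random point cloud
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.01, -0.02, 0.005])
dst = src @ R_true.T + t_true
R, t = rigid_align(src, dst)
```

In a full ICP loop this step alternates with a nearest-neighbour correspondence search until the registration error converges; the thesis' ST-ICP instead registers the triangulated feature point cloud against the instrument's point-cloud model.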
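The abstract does not specify how the scanning plane is computed from the probe-head coordinates. One common approach, shown here purely as an assumption, is a least-squares plane fit over the tracked three-dimensional points via the SVD of their centered coordinates:

```python
import numpy as np

def fit_scan_plane(points):
    """Least-squares plane through 3-D probe-head points.

    points: (N, 3) array, N >= 3, not all collinear.
    Returns (centroid, unit_normal) defining the plane.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]            # direction of least variance = plane normal
    return centroid, normal

# Usage: points lying in the z = 0 plane give a normal of (0, 0, +/-1)
probe_pts = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [1.0, 1.0, 0.0],
                      [0.5, 0.5, 0.0]])
c, n = fit_scan_plane(probe_pts)
```

The live ultrasound frame can then be texture-mapped onto this plane in the stereo endoscopic view, keeping the overlay consistent with where the probe actually scanned.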
Keywords: Minimally invasive surgery, Augmented reality navigation, Detection and positioning, Ultrasound image fusion, Stereo vision