Surgery simulation has attracted great attention over the past twenty years. Existing simulators play active roles in a variety of applications, including surgical education, surgical planning, and surgical performance evaluation. Successful simulator products allow users to perform surgical procedures in a virtual training environment, thereby mimicking the effect of real surgical training. This dissertation covers three aspects of research related to surgery simulation: 1) physics-based deformation; 2) haptic feedback; and 3) visual rendering and display. A physics-based deformation algorithm is accelerated using the power of a graphics processing unit (GPU). By rearranging the memory access pattern on the graphics card, the computation time of each deformation frame is reduced by 50% to 65%. The dissertation also presents the derivation and implementation of a volume-based haptic feedback algorithm. Unlike previous haptic feedback algorithms, which rely only on penetration depth, this algorithm uses the graphics card to sample the penetration volume along three axes and form the feedback force. The proposed algorithm is further optimized with a volume reconstruction method, which greatly increases the haptic feedback frame rate. On the visualization side, existing rendering modules illustrate object geometry only through surface illumination, so it can be difficult to perceive variations in surface geometry under certain lighting conditions. To address this issue, the dissertation describes an innovative non-photorealistic rendering scheme that vividly illustrates very subtle curvature variations on the surface through simulated textures called pencil strokes. The interaction forces, as well as the elastic forces, are also displayed using color coding.
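The idea of deriving a feedback force from penetration volume rather than penetration depth can be illustrated with a minimal CPU sketch. This is not the dissertation's GPU implementation: it assumes the tool and tissue have already been voxelized into boolean occupancy grids, and it approximates the force along each axis as the negative gradient of the overlap volume, estimated by shifting the tool one voxel along that axis. The function name, the grids, and the stiffness constant are all illustrative.

```python
import numpy as np

def volume_feedback_force(tool, tissue, voxel_size, stiffness):
    """Sketch of a volume-based haptic force (illustrative, not the
    dissertation's GPU algorithm).

    tool, tissue : boolean 3-D occupancy grids of equal shape
    Returns a 3-vector F with F_i = -k * dV/dx_i, where V is the
    tool-tissue overlap volume and dV/dx_i is estimated by a one-voxel
    finite difference along axis i (one sample direction per axis).
    """
    tool = np.asarray(tool, dtype=bool)
    tissue = np.asarray(tissue, dtype=bool)
    voxel_vol = voxel_size ** 3

    # Current penetration volume.
    V = np.count_nonzero(tool & tissue) * voxel_vol

    force = np.zeros(3)
    for axis in range(3):
        # Overlap volume after moving the tool +1 voxel along this axis.
        shifted = np.roll(tool, 1, axis=axis)
        V_shift = np.count_nonzero(shifted & tissue) * voxel_vol
        # F_i = -k * dV/dx_i : the force pushes the tool in the
        # direction that reduces the penetration volume.
        force[axis] = -stiffness * (V_shift - V) / voxel_size
    return force
```

For example, if the tissue fills the half-space of small x and a cubic tool straddles its boundary, the computed force points along +x (out of the tissue) and vanishes along y and z, as expected for a restoring penalty force.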