
Sensor fusion and knowledge integration for medical image recognition

Posted on: 1992-11-01
Degree: Ph.D
Type: Dissertation
University: Northwestern University
Candidate: Chen, Shiuh-Yung
GTID: 1478390014499737
Subject: Computer Science
Abstract/Summary:
In this dissertation, we describe a medical image understanding system whose reasoning module employs the Dempster-Shafer theory of evidence. Given a set of three correlated images of a human brain acquired from x-ray CT, T1-weighted, and T2-weighted modalities at the same slicing level and angle, the proposed system is capable of mimicking the reasoning process of a human expert in recognizing the image set based on (1) knowledge about sensor characteristics, (2) knowledge about anatomical structures, and (3) knowledge about image processing and analysis tools. To implement these processes, a blackboard architecture composed of three major components, knowledge sources, a blackboard data structure, and control, is adopted. The proposed system consists of three phases. In phase one, entities in the form of regions and curves with associated features are extracted from the images. The second and third phases aim at recognizing the physically meaningful entities in the image set. In phase two, the system identifies the major anatomies and locates the slice in the anatomical model that is most similar to the image set under study. In phase three, the selected model slice is used to refine the formation of the identified anatomical structures and to extract gray and white matter.
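The core of the Dempster-Shafer approach referenced above is the rule of combination, which fuses basic probability assignments from independent sensors while discarding conflicting mass. The sketch below is a minimal illustration of that rule only; the tissue labels, sensor names, and mass values are hypothetical examples, not figures taken from the dissertation.

from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.

    m1, m2: dicts mapping frozenset hypotheses to mass values summing to 1.
    Returns the combined, renormalised assignment.
    """
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Intersecting hypotheses reinforce each other.
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            # Disjoint hypotheses contribute to the conflict mass.
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Hypothetical usage: fusing evidence about one image region's tissue label.
frame = frozenset({"ventricle", "gray_matter", "white_matter"})
m_ct = {frozenset({"ventricle"}): 0.6, frame: 0.4}                      # CT evidence
m_t1 = {frozenset({"ventricle", "gray_matter"}): 0.5, frame: 0.5}       # T1-weighted evidence
print(combine(m_ct, m_t1))

In this toy fusion, mass assigned to the full frame of discernment models sensor ignorance, which is what lets the reasoning module remain noncommittal when a single modality cannot distinguish tissues on its own.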
Keywords/Search Tags: Image, System