
Parallel Salient Region Detection Method On GPU

Posted on: 2015-02-14
Degree: Master
Type: Thesis
Country: China
Candidate: J P Deng
Full Text: PDF
GTID: 2298330467984612
Subject: Computer application technology
Abstract/Summary:
The salient region of an image is its most appealing part, the part that draws people's attention. Reliable estimation of visual saliency is important in many computer vision tasks, including face recognition, image compression, adaptive segmentation, object tracking, and image retrieval. Existing salient region detection algorithms compute a saliency value according to some definition and then segment the salient region with a fixed or adaptive threshold. In practice, these algorithms are usually quite time-consuming and cannot meet the requirements of real-time image processing. Moreover, most of them do not mark the features of the salient region, and their detection accuracy drops when the input image contains much noise. To address these problems, we propose a GPU-based, noise-resistant salient region detection and marking method that detects the salient region in a very short time and marks the region's features with its minimum bounding rectangle.

Our algorithm consists of four basic steps. First, the method shrinks the image to 1/N of its original size with a local-histogram-based image reduction algorithm and smooths the small image with a median filter. The filtered image is then magnified N times back to the original size with bilinear interpolation and processed with a bilateral filter. Subtracting this new image from the original image yields the saliency map. We then use a local clustering algorithm to improve saliency quality and a region growing algorithm to label the salient region. Finally, we obtain the region's features by computing its minimum bounding rectangle from the convex hull.

We propose a parallel salient region detection algorithm based on a state-of-the-art method. It is easy to implement and meets real-time processing requirements. In a performance comparison against several algorithms on a 300,000-pixel test image, our approach finishes salient region detection and marking in only 30 ms. It also achieves good precision and recall on an open test dataset and has been applied to automatic quality inspection of transistors on a production line. Our method exploits the GPU's texture memory and hardware-accelerated interpolation, and through reasonable thread mapping, memory access patterns, and efficient data structures it achieves a substantial efficiency improvement. These techniques can also benefit other parallel algorithm research.
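The following is a minimal CPU-side sketch, in Python with OpenCV, of the coarse-to-fine pipeline described above: shrink, smooth, upscale, smooth again, difference with the original, then mark regions with minimum bounding rectangles from their convex hulls. It is not the thesis' GPU implementation; the shrinking factor, filter sizes, plain area-interpolation reduction (standing in for the local-histogram reduction), Otsu thresholding (standing in for local clustering), and file names are illustrative assumptions.

```python
import cv2
import numpy as np

def saliency_map(gray, n=4):
    """Rough saliency estimate: original minus a smoothed, re-enlarged copy.
    n and the filter parameters are illustrative, not the thesis values."""
    h, w = gray.shape

    # Shrink to 1/n of the original size (area interpolation stands in
    # for the local-histogram-based reduction), then median-filter it.
    small = cv2.resize(gray, (w // n, h // n), interpolation=cv2.INTER_AREA)
    small = cv2.medianBlur(small, 3)

    # Magnify back to the original size with bilinear interpolation and
    # apply a bilateral filter to the enlarged "background" estimate.
    background = cv2.resize(small, (w, h), interpolation=cv2.INTER_LINEAR)
    background = cv2.bilateralFilter(background, d=9, sigmaColor=75, sigmaSpace=75)

    # The saliency map is the absolute difference with the original image.
    return cv2.absdiff(gray, background)

if __name__ == "__main__":
    img = cv2.imread("input.png")                    # hypothetical input file
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    sal = saliency_map(gray, n=4)

    # Otsu thresholding stands in here for the local clustering step.
    _, mask = cv2.threshold(sal, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Label connected salient regions and mark each with the minimum
    # bounding rectangle computed from the region's convex hull.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        hull = cv2.convexHull(c)
        box = cv2.boxPoints(cv2.minAreaRect(hull)).astype(np.int32)
        cv2.polylines(img, [box], isClosed=True, color=(0, 0, 255), thickness=2)
    cv2.imwrite("marked.png", img)
```

In the thesis, the resize and interpolation stages are the parts mapped to GPU texture memory and hardware interpolation; the sketch above only mirrors the data flow, not that acceleration.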
Keywords/Search Tags: Salient Region, Image Reduction, GPU Parallel Computing, Local Clustering