
Compression aided feature based steganalysis of perturbed quantization steganography in JPEG images

Posted on:2008-04-27Degree:M.SType:Thesis
University:University of DelawareCandidate:Thorpe, ChristopherFull Text:PDF
GTID:2448390005456630Subject:Engineering
Abstract/Summary:
Steganography is the process of embedding data within a larger cover object, usually a media file such as an image, audio, or video file. The goal of the process is to embed the data in such a way that it is impossible to differentiate an object carrying hidden data from an ordinary, unmodified object. With the proliferation of media on the Internet, steganography has become relatively easy for two reasons. First, passing media between Internet users directly or posting media on a website is common practice and arouses no suspicion. Second, broadband has made larger cover media practical, which makes the changes introduced by embedding harder to detect.

Steganalysis is the process of determining whether an object contains embedded data. There are two main methods used to detect steganography. The first, and most commonly used, exploits the statistics of the image. The second recompresses the media and compares the result to known performance ranges for the compression algorithm.

This thesis uses both compression and traditional statistical methods to analyze existing steganographic techniques. The focus is on newer steganographic techniques, which operate in the transform domain, but attention is also given to older spatial-domain techniques. Most new steganographic techniques embed messages into the cover media by degrading its quality in a controlled way. The resulting degraded cover media is perceptually identical to the original, but a potential adversary is deprived of access to an image of the same quality as the original cover image. The result is a more secure steganographic system that introduces less distortion than earlier methods.

A blind steganalysis algorithm is proposed that analyzes both the statistical properties of the embedding domain and the compressibility of the image.
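The intuition behind the compression component can be sketched as follows: embedding perturbs the quantized coefficients and adds entropy, so a stego image tends to be less compressible than a clean one. This is an illustrative proxy only, using a general-purpose compressor on synthetic coefficient data; the thesis's actual compression feature and thresholds are not specified here.

```python
import zlib

import numpy as np


def compressibility_score(coeffs: np.ndarray) -> float:
    """Ratio of compressed size to raw size for a block of quantized
    coefficients (lower = more compressible). A stand-in feature;
    the thesis's real compression measure may differ."""
    raw = coeffs.astype(np.int16).tobytes()
    return len(zlib.compress(raw)) / len(raw)


rng = np.random.default_rng(0)
# "Clean" coefficients: mostly small values, as in a quantized JPEG block.
clean = rng.integers(-1, 2, size=4096)
# "Stego" coefficients: embedding perturbations add entropy.
stego = clean + rng.integers(0, 2, size=4096)

# The stego block should compress less well than the clean block.
print(compressibility_score(clean), compressibility_score(stego))
```

A real detector would compare such a score against known performance ranges for the compression algorithm, as the abstract describes.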
A non-linear support vector machine performs the classification. The input feature vector is composed primarily of statistical measures with a small but very influential compression component. A portion of the statistical features is created by computing the first-, second-, third-, and fourth-order statistical moments of the transform-domain histograms. The remainder of the statistical portion is composed of correlation measures between carefully selected transform-domain coefficients. Higher-order moments have higher miss rates than lower-order moments because they are more influenced by noise; however, research has shown that, if appropriately treated, including higher-order moments increases classifier accuracy. Using additional higher-order moments as a basis for classification increases the size of the feature vector, making accurate training difficult. To address this, classification is performed using each statistical moment order separately, and a logical combination of the individual classifications produces a single classification for the presented image.

Previous work in steganalysis has shown that statistical analysis using support vector machines provides good classification accuracy. The classification algorithm proposed in this thesis improves on these results by including more features than previous methods and weighting their influence appropriately. In addition, the inclusion of a compression feature vector improves hit rates by two to three percent and reduces false-alarm rates similarly.
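The moment-based portion of the feature vector can be sketched as below: the first four statistical moments of a transform-domain coefficient histogram. The bin count and normalization are assumptions for illustration; the thesis's exact histogram construction and coefficient selection are not reproduced here.

```python
import numpy as np


def histogram_moments(coeffs: np.ndarray, bins: int = 101) -> np.ndarray:
    """First four moments (mean, variance, skewness, kurtosis) of a
    transform-domain coefficient histogram. Illustrative sketch; the
    thesis may use different bins, ranges, or moment definitions."""
    hist, edges = np.histogram(coeffs, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    p = hist / hist.sum()  # normalize to a probability mass function
    mean = np.sum(centers * p)
    var = np.sum((centers - mean) ** 2 * p)
    std = np.sqrt(var)
    skew = np.sum(((centers - mean) / std) ** 3 * p)
    kurt = np.sum(((centers - mean) / std) ** 4 * p)
    return np.array([mean, var, skew, kurt])


# DCT-like coefficients are roughly Laplacian-distributed.
coeffs = np.random.default_rng(1).laplace(0.0, 2.0, 10000)
feats = histogram_moments(coeffs)
```

Per the abstract, a separate classifier would be trained on each moment order, and the individual decisions combined logically into one stego/clean verdict, rather than feeding all moments into a single large feature vector.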
Keywords/Search Tags:Compression, Steganography, Feature, Media, Image, Cover, Steganalysis, Higher order moments