
Cell Recognition Based On Statistical Uncorrelated Linear Discriminant Analysis And Multifeature Fusion

Posted on: 2007-09-12    Degree: Master    Type: Thesis
Country: China    Candidate: F H Qu    Full Text: PDF
GTID: 2178360182996626    Subject: Computational Mathematics
Abstract/Summary:
Medical microscopic image processing is the result of the mutual promotion of many theories and technologies, such as computer science, image processing and pattern recognition. The automatic analysis of microscopic images is one of the most important research subjects in the medical image processing field. It not only saves diagnosis time but also improves diagnostic precision for the doctor.

Feature-level fusion plays a key role in information fusion. It has many advantages: on the one hand, it retains the discriminant information of multiple features; on the other hand, it reduces information redundancy to a great extent and achieves considerable information compression. This paper adopts the projections of the spectrum feature, the singular value feature and the gray-scale feature of the image onto the optimal discriminant vectors as the basis of recognition; this implements feature fusion and attains good recognition performance. The paper introduces the theory of linear discriminant analysis and proves it within a uniform mathematical framework. Based on the structure of the matrices involved, we propose an "increase method" to solve for the singular values of adjacent matrices.

Linear discriminant analysis and an optimal set of statistically uncorrelated discriminant vectors

Selecting the classifier is the most important step in pattern recognition. Linear discriminant analysis uses a linear function as the discriminant function for pattern classification; Fisher linear discriminant analysis is a classical example. The Fisher criterion function is

$$J_F(w)=\frac{w^{T}S_b w}{w^{T}S_\omega w},$$

where $w$ is the projection direction, $S_b$ is the between-class scatter matrix of the samples and $S_\omega$ is the total within-class scatter matrix. The Fisher criterion function expresses the ratio between the between-class distance and the within-class distance of the samples: the bigger the value of $J_F(w)$, the better the classification effect.

Let $R(w)=J_F(w)=\frac{w^{T}S_b w}{w^{T}S_\omega w}$. Arrange the generalized eigenvalues of the matrix $S_b$ relative to the matrix $S_\omega$ in increasing order,

$$\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n,$$

and let $p_1, p_2, \ldots, p_n$ be the corresponding $S_\omega$-orthonormal eigenvectors. Since $R(p_n)=\max_{w\ne 0}R(w)$, the vector $w^{*}=p_n$ is the optimal transform vector we look for. This paper adopts $\{p_n, p_{n-1}, \ldots, p_m\}$ with $m \ge 1$ as the optimal set of discriminant vectors. It is easy to prove that the projection coefficients onto this set of discriminant vectors are statistically uncorrelated.

Feature fusion

In cell recognition, many features are combined into a new feature for recognition. By projecting the features onto an optimal set of statistically uncorrelated discriminant vectors, the projection coefficient vector has the best classification effect under our criterion, so we adopt the projection coefficient vectors as the new features for recognition. This is done as follows.

Let $\Psi_1, \Psi_2, \Psi_3$ be the characteristic vectors extracted from a sub-image of the cell microscopic image. Because the different vectors have different meanings and differ greatly in their ranges of variation and value selection, the characteristic vectors must be normalized. Let

$$\Psi_i'=\frac{\Psi_i}{\lVert \bar{\Psi}_i \rVert_2},\qquad i=1,2,3,$$

where $\bar{\Psi}_1, \bar{\Psi}_2, \bar{\Psi}_3$ are the mean vectors over the samples, and write

$$\Psi_i'=\bigl(x_1^{(i)},\ldots,x_{n_i}^{(i)}\bigr)^{T},\qquad i=1,2,3.$$

Then the combined feature vector of the sub-image is

$$X=\bigl(x_1^{(1)},\ldots,x_{n_1}^{(1)},x_1^{(2)},\ldots,x_{n_2}^{(2)},x_1^{(3)},\ldots,x_{n_3}^{(3)}\bigr)^{T}.$$

Projecting onto the optimal set of discriminant vectors $\Phi=(\varphi_1,\varphi_2,\varphi_3)$ gives the projection coefficient vector $Y=\Phi^{T}X$, and $Y$ is the new characteristic vector obtained by fusion.
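To make the two steps above concrete (solving the generalized eigenvalue problem of $S_b$ relative to $S_\omega$ for the discriminant vectors, then normalizing, concatenating and projecting the feature blocks), the following Python/NumPy sketch shows one possible implementation. It is not the thesis's original MATLAB code: the function names scatter_matrices, discriminant_vectors and fuse_features, the class-size weighting of $S_b$, and the use of scipy.linalg.eigh are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh  # generalized symmetric eigensolver


def scatter_matrices(X, labels):
    """Between-class (S_b) and within-class (S_w) scatter of the row-sample matrix X."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all)[:, None]
        S_b += Xc.shape[0] * diff @ diff.T          # weight each class by its sample count
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
    return S_b, S_w


def discriminant_vectors(S_b, S_w, m):
    """Solve S_b p = lambda S_w p (S_w assumed nonsingular) and keep the m largest-lambda vectors."""
    eigvals, eigvecs = eigh(S_b, S_w)               # ascending eigenvalues, S_w-orthonormal vectors
    return eigvecs[:, -m:]                          # columns p_{n-m+1}, ..., p_n


def fuse_features(psi_list, psi_means, Phi):
    """Normalize each feature block by the norm of its sample mean, concatenate, project: Y = Phi^T X."""
    blocks = [psi / np.linalg.norm(mean) for psi, mean in zip(psi_list, psi_means)]
    X = np.concatenate(blocks)
    return Phi.T @ X
```

A typical call, with spectrum, singular value and gray-scale blocks extracted elsewhere, would be Phi = discriminant_vectors(S_b, S_w, m) followed by Y = fuse_features([psi1, psi2, psi3], [mean1, mean2, mean3], Phi).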
The iteration method for solving the singular values

We need to compute the singular values of the gray-level matrix of each sub-image. To improve the computation speed, and based on the fact that two adjacent matrices contain identical sub-blocks, this paper presents a method that reuses the iterative process of the previous matrix when solving for the singular values of its adjacent matrix.

Suppose $A=[a_{ij}]_{n\times n}$ is an $n$-order symmetric matrix and $T_{pq}$ is the rotation matrix in the $(p,q)$ plane,

$$T_{pq}=\begin{pmatrix}1&&&&&\\&\ddots&&&&\\&&c&\cdots&s&\\&&\vdots&\ddots&\vdots&\\&&-s&\cdots&c&\\&&&&&\ddots\end{pmatrix},\qquad(1)$$

where $c=\cos\theta$, $s=\sin\theta$, and the entries $c$, $s$, $-s$, $c$ occupy rows and columns $p$ and $q$. Applying the transform $T_{pq}$ to $A$ gives $B=T_{pq}AT_{pq}^{T}$, and it is easy to see that

$$b_{kl}=a_{kl},\qquad k\ne p,q,\ \ l\ne p,q,\qquad(2)$$

$$\begin{aligned}
b_{pp}&=c^{2}a_{pp}+2cs\,a_{pq}+s^{2}a_{qq},\\
b_{qq}&=s^{2}a_{pp}-2cs\,a_{pq}+c^{2}a_{qq},\\
b_{pq}&=b_{qp}=(c^{2}-s^{2})a_{pq}+cs\,(a_{qq}-a_{pp}),\\
b_{pk}&=b_{kp}=c\,a_{kp}+s\,a_{kq},\qquad k\ne p,q,\\
b_{qk}&=b_{kq}=-s\,a_{kp}+c\,a_{kq},\qquad k\ne p,q.
\end{aligned}\qquad(3)$$

If we select

$$c=\cos\theta=\sqrt{\tfrac{1}{2}\Bigl(1+\frac{y}{\sqrt{x^{2}+y^{2}}}\Bigr)},\qquad s=\sin\theta=\frac{x}{2c\sqrt{x^{2}+y^{2}}},\qquad(4)$$

with

$$x=2a_{pq}\,\operatorname{sign}(a_{pp}-a_{qq}),\qquad y=\lvert a_{pp}-a_{qq}\rvert,\qquad(5)$$

then $b_{pq}=b_{qp}=0$. By this similarity transformation with the rotation matrix we annihilate the off-diagonal elements $a_{pq}$ and $a_{qp}$, which occupy the two symmetric positions of the matrix $A$.

In the Jacobi method the position $(p,q)$ is selected by a certain rule and off-diagonal elements are annihilated continuously, so the matrix tends to a diagonal matrix. Because the transformations are orthogonal similarity transformations, the elements on the diagonal approximate the eigenvalues of the original matrix.

From (3), (4) and (5) we find that if $p,q\ge k$ (with $k\in\mathbb{Z}^{+}$, $k\le n$), then the values $b_{ij}$ ($i\ge k$, $j\ge k$) depend only on $a_{ij}$ ($i\ge k$, $j\ge k$); when $p,q\le k$ the situation is similar, so the foregoing iterative process can be reused. In this paper, based on the structure of the matrices to be computed, we compute the singular values of the next adjacent matrix by reusing the foregoing iterative process. Experiments show that this approach attains a higher computation speed. The algorithm has a considerable limitation, however: the degree of speed improvement depends to a great extent on the distribution characteristics of the original matrix, and it should be improved further. (A minimal sketch of one Jacobi sweep is given after the recognition results below.)

Cell recognition and classification

During training we extract samples of red blood cells, white blood cells, spherical crystals and other similar impurities, and obtain an optimal set of statistically uncorrelated discriminant vectors $\Phi=(\varphi_1,\varphi_2,\varphi_3)$ as well as the orthogonal vector system $U=\{u_1,\ldots,u_m\}$ of the K-L transform. We also compute the projection coefficients of every sample's characteristic vector on $\Phi=(\varphi_1,\varphi_2,\varphi_3)$; these projection coefficient vectors serve as the reference targets in the recognition stage.

During recognition, we first find the possible positions of cell centres using the gray-level and texture characteristics of the image to be recognized, mark them and store them in an array. Second, we calculate the characteristic vector of the $16\times16$ sub-image centred at each marked position and take its projection coefficient vector as the new characteristic vector. Finally, we classify it with the designed classifier.

We tested 100 cell microscopic images with MATLAB 7.0. The results show:
The recognition rate of red blood cells > 91%.
The recognition rate of white blood cells > 87%.
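As an illustration of the rotation step in equations (1)-(5), the following Python/NumPy sketch performs cyclic Jacobi sweeps on a symmetric matrix; the diagonal then approximates the eigenvalues, whose absolute values are the singular values of a symmetric matrix. This is only the classical Jacobi iteration, not the thesis's "increase method" of reusing the previous matrix's iterative process, and the function names jacobi_rotate and jacobi_sweep are illustrative.

```python
import numpy as np


def jacobi_rotate(A, p, q):
    """Annihilate A[p, q] and A[q, p] with the rotation of (4)-(5): B = T A T^T."""
    if A[p, q] == 0.0:
        return A
    diff = A[p, p] - A[q, q]
    x = 2.0 * A[p, q] * (np.sign(diff) if diff != 0.0 else 1.0)   # (5)
    y = abs(diff)                                                 # (5)
    r = np.hypot(x, y)
    c = np.sqrt(0.5 * (1.0 + y / r))                              # (4)
    s = x / (2.0 * c * r)                                         # (4)
    T = np.eye(A.shape[0])                                        # rotation matrix of (1)
    T[p, p] = T[q, q] = c
    T[p, q] = s
    T[q, p] = -s
    return T @ A @ T.T


def jacobi_sweep(A, sweeps=10, tol=1e-12):
    """Cyclic Jacobi sweeps; for symmetric A the diagonal converges to the eigenvalues."""
    B = np.array(A, dtype=float)
    n = B.shape[0]
    for _ in range(sweeps):
        off = np.linalg.norm(B - np.diag(np.diag(B)))   # size of the off-diagonal part
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                B = jacobi_rotate(B, p, q)
    # Singular values of a symmetric matrix are the absolute values of its eigenvalues.
    return np.sort(np.abs(np.diag(B)))[::-1]
```

For clarity the sketch forms the full matrix $T_{pq}$ and multiplies it out; a practical implementation would update only rows and columns $p$ and $q$ directly with the formulas in (3), which is also where the reuse of a previous matrix's iteration would enter.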
Keywords/Search Tags: Uncorrelated