With the rapid development of technology, biometrics has become a reliable way of identifying humans. Gait recognition is an emerging behavioral biometric that identifies an individual by exploiting his or her distinctive way of walking, without interfering with the subject's activity. Compared with physiological biometrics, gait recognition has advantages such as being non-intrusive, contactless, easy to collect, and hard to hide or camouflage, which gives it many potential applications in visual surveillance, access control, biometric authentication, criminology, and medical diagnosis. As a result, gait recognition has attracted increasing attention from researchers. A gait recognition system consists of three parts: moving target detection, feature extraction, and classification. In this thesis, a novel gait recognition method that selects the optimal characteristics adaptively is proposed, in order to address the fact that different parts of the human body contribute differently to identifying an individual. The main work of this thesis includes:

The first stage is gait image preprocessing. A complete binary gait image sequence is obtained through moving target detection, morphological processing, and connectivity analysis. Edge detection, gait cycle detection, and standardization are then applied to the binary images to prepare them for the subsequent stages.

The second stage is gait feature extraction and processing. The width of the human body, the velocity of the silhouette's outer contour, and the vertical coordinate of the centroid are chosen as candidate feature sets, computed from the processed gait images; these features are then analyzed with the wavelet transform to mitigate errors introduced during preprocessing.

The third stage is classification. Each candidate feature is first evaluated individually as a gait feature for recognition using DAG SVMs, and the features with the best performance are selected as the recognition characteristics. Gait recognition is then accomplished by decision fusion based on a weighted voting principle.
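To illustrate the final stage, the sketch below shows one way the per-feature classification and weighted-voting fusion could be organized, assuming feature vectors have already been extracted for each gait sequence. scikit-learn's one-vs-one SVC is used here as a stand-in for the DAG SVMs, and the function names and voting weights are illustrative rather than taken from the thesis.

```python
# Minimal sketch: one SVM per candidate gait feature, fused by weighted voting.
# Assumes each feature set is already a fixed-length vector per gait sequence
# (e.g. body width, contour velocity, centroid height after wavelet analysis).
import numpy as np
from sklearn.svm import SVC

def train_per_feature_svms(feature_sets, labels):
    """Train one SVM per candidate feature set.

    feature_sets: dict mapping feature name -> (n_sequences, n_dims) array.
    labels:       (n_sequences,) array of subject identities.
    """
    return {name: SVC(kernel="rbf").fit(X, labels)
            for name, X in feature_sets.items()}

def fuse_by_weighted_voting(classifiers, feature_sets, weights, classes):
    """Combine per-feature predictions by weighted voting.

    weights: dict mapping feature name -> voting weight, e.g. proportional to
             each feature's stand-alone recognition rate on a validation set.
    """
    n = next(iter(feature_sets.values())).shape[0]
    votes = np.zeros((n, len(classes)))
    class_index = {c: i for i, c in enumerate(classes)}
    for name, clf in classifiers.items():
        pred = clf.predict(feature_sets[name])
        for row, p in enumerate(pred):
            votes[row, class_index[p]] += weights[name]
    # Each sequence is assigned the identity with the largest weighted vote.
    return np.array(classes)[votes.argmax(axis=1)]
```

In this sketch, assigning each feature a weight proportional to its stand-alone recognition rate is one simple way to let the better-performing features dominate the fused decision, which matches the idea of selecting the optimal characteristics adaptively.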