
Theory And Algorithms For Distance Metric Learning: Kernel Regression, Large Margin Nearest Neighbor And Fisher Linear Discriminant

Posted on: 2011-10-30
Degree: Master
Type: Thesis
Country: China
Candidate: Q N Chen
Full Text: PDF
GTID: 2178360302964335
Subject: Computer software and theory
Abstract/Summary:
Distance metric learning has demonstrated great power to improve performance in regression, clustering, and classification tasks. In this thesis, we study the theory and algorithms of distance metric learning for kernel regression, large margin nearest neighbor classification, and Fisher linear discriminant classification.

Traditional kernel regression is based on the Euclidean distance, but the Euclidean distance treats all features equally and cannot reveal the internal structure of the data. In contrast, the Mahalanobis distance is independent of the scale of the features and can reflect this internal structure. We combine the Mahalanobis distance with traditional kernel regression and apply the resulting method to short-term traffic flow forecasting. Experiments on real data of urban vehicular traffic flows show that our method is more effective than traditional kernel regression for short-term traffic forecasting.

The recently proposed large margin nearest neighbor classification improves the performance of K-nearest neighbor classification through a learned global distance metric. However, it does not consider the locality of data distributions. We propose a novel local distance metric learning method, called hierarchical distance metric learning, and combine it with large margin nearest neighbor classification. Experiments on many artificial and real-world data sets, with comparisons against traditional K-nearest neighbor classification and the state-of-the-art large margin nearest neighbor classification, show the effectiveness of the proposed hierarchical distance metric learning for large margin nearest neighbor classification.

The goal of distance metric learning is to draw samples of the same class closer together and to separate samples of different classes. The main idea of Fisher discriminant analysis is consistent with this goal.
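As an illustrative sketch of the kernel-regression idea above (the function name, Gaussian kernel choice, and bandwidth parameter are our own assumptions, not details taken from the thesis), a Nadaraya-Watson estimator weighted by a learned Mahalanobis metric M might look like:

```python
import numpy as np

def mahalanobis_kernel_regression(X_train, y_train, x_query, M, bandwidth=1.0):
    """Nadaraya-Watson estimate with a Gaussian kernel over the
    Mahalanobis distance d(x, x')^2 = (x - x')^T M (x - x')."""
    diffs = X_train - x_query                        # (n, d) differences
    d2 = np.einsum('ni,ij,nj->n', diffs, M, diffs)   # squared Mahalanobis distances
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))         # Gaussian kernel weights
    return float(w @ y_train / w.sum())              # weighted average of targets
```

Setting M to the identity matrix recovers ordinary Euclidean kernel regression; a learned M rescales and correlates features according to the data's internal structure.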
In real-world domains, many objects have numerous natural features, and these features can often be divided into several groups. Traditional Fisher discriminant analysis addresses problems with a single view. We propose multi-view Fisher discriminant analysis, which combines traditional Fisher discriminant analysis with multi-view learning. Traditional Fisher discriminant analysis measures the between-class divergence by the distance between the sample mean of each class and the total mean; it does not consider the locality of data distributions. We therefore further combine hierarchical metric learning with multi-view Fisher discriminant analysis. Experiments on many artificial and real-world data sets show the effectiveness of the proposed method.
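To make the Fisher-discriminant building block concrete, here is a minimal two-class sketch (our own illustration, not code from the thesis): the discriminant direction is w ∝ S_W⁻¹(m₁ − m₀), where S_W is the within-class scatter matrix and m₀, m₁ are the class means.

```python
import numpy as np

def fisher_direction(X, y):
    """Two-class Fisher linear discriminant: w proportional to
    S_W^{-1} (m1 - m0), where S_W is the within-class scatter matrix."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of outer products of centered samples.
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)
```

Projecting samples onto w draws samples of the same class together and separates the classes, which is exactly the metric-learning goal stated above; the multi-view variant proposed in the thesis would build such discriminants from several feature groups jointly.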
Keywords/Search Tags:Distance metric learning, kernel regression, Fisher discriminant analysis, multi-view learning