Global and local metric learning via eigenvectors
Abstract
Distance metrics play a significant role in machine learning methods (classification, clustering, etc.), especially in k-nearest neighbor (kNN) classification, where Euclidean distances are computed to decide the labels of unknown points. However, the Euclidean distance ignores the statistical structure of the data, which could help measure the similarity of different inputs more accurately. In this paper, we construct a unified framework, comprising two eigenvalue-based methods, for learning a data-dependent metric. Both methods aim to maximize the difference between inter-class and intra-class distances, but the optimization is considered from a global view and a local view, respectively. Unlike previous work in metric learning, our methods directly seek an equilibrium between inter-class and intra-class distances, and the linear transformation decomposed from the metric is optimized directly instead of the metric itself. We can therefore effectively adjust the data distribution in the transformed space and construct regions favorable for kNN classification. The resulting problems can be solved simply by eigenvalue decomposition, which is much faster than semi-definite programming. By selecting the top eigenvalues, the original data can be projected into a low-dimensional space, so that insignificant information is mitigated or eliminated and classification becomes more efficient. Our methods can thus perform metric learning and dimensionality reduction simultaneously. Numerical experiments from different points of view verify that our methods improve the accuracy of kNN classification and achieve dimensionality reduction with competitive performance.
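The abstract does not give the exact objective or notation, so the following is only a minimal sketch of the general idea it describes: learn a linear transformation by eigendecomposing a difference of class scatter matrices (the scatter matrices S_w, S_b and the objective S_b - S_w are assumptions, not the paper's formulation), keep the top eigenvectors to combine metric learning with dimensionality reduction, and run kNN in the transformed space.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def scatter_matrices(X, y):
    """Within-class (S_w) and between-class (S_b) scatter matrices."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += len(Xc) * diff @ diff.T
    return S_w, S_b

def learn_transform(X, y, n_components):
    """Keep the top eigenvectors of S_b - S_w as the rows of the
    linear transformation (assumed surrogate objective)."""
    S_w, S_b = scatter_matrices(X, y)
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)   # symmetric matrix -> eigh
    order = np.argsort(eigvals)[::-1]              # sort eigenvalues descending
    return eigvecs[:, order[:n_components]].T      # shape (n_components, d)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
L = learn_transform(X_tr, y_tr, n_components=2)    # metric learning + dimension reduction
knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr @ L.T, y_tr)
print("kNN accuracy in transformed space:", knn.score(X_te @ L.T, y_te))
```

As in the abstract, the transformation is obtained from a single eigenvalue decomposition rather than semi-definite programming, and truncating to the leading eigenvectors performs the dimensionality reduction at the same time.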
