# Metric Learning

Differential Training, Similarity Learning

## Discussion

The goal of metric learning is to ensure that, after training, the distance between vectors of the same class remains small, while distances between vectors of different classes remain large.

## References

http://www.cs.toronto.edu/~rsalakhu/papers/oneshot1.pdf Siamese Neural Networks for One-shot Image Recognition

http://yann.lecun.com/exdb/publis/pdf/chopra-05.pdf Learning a Similarity Metric Discriminatively, with Application to Face Verification

The learning process minimizes a discriminative loss function that drives the similarity metric to be small for pairs of faces from the same person, and large for pairs from different persons.
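The discriminative loss described above can be sketched as a contrastive loss over pairs of embeddings. This is a minimal numpy sketch in the spirit of Chopra et al.; the exact functional form and margin vary across papers.

```python
import numpy as np

def contrastive_loss(x1, x2, same, margin=1.0):
    """Contrastive loss sketch: pull same-class pairs together, push
    different-class pairs at least `margin` apart. The exact form
    (squared vs. plain distance, margin value) varies across papers."""
    d = np.linalg.norm(x1 - x2)        # Euclidean distance between embeddings
    if same:
        return d ** 2                  # penalize any distance for a genuine pair
    return max(0.0, margin - d) ** 2   # penalize impostor pairs closer than the margin
```

A genuine pair contributes its squared distance, so identical embeddings incur zero loss; an impostor pair already farther apart than the margin also incurs zero loss.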

https://arxiv.org/abs/1412.6622 Deep metric learning using Triplet network
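Triplet-based training compares an anchor against one same-class (positive) and one different-class (negative) sample. Below is a sketch of the common margin-based triplet loss; note that Hoffer & Ailon's Triplet network actually applies a softmax over the two distances, so this is the more widespread variant rather than the paper's exact objective.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Margin-based triplet loss sketch: the anchor should be closer to
    the positive than to the negative by at least `margin`. (Hoffer &
    Ailon instead use a softmax over the two distances.)"""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to same-class sample
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to different-class sample
    return max(0.0, d_pos - d_neg + margin)   # zero once the margin is satisfied
```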

http://web.cse.ohio-state.edu/~kulis/pubs/ftml_metric_learning.pdf Metric Learning: A Survey

http://arxiv.org/pdf/1509.05360v1.pdf Geometry-aware Deep Transform

Deep networks are often optimized for a classification objective, where class-labeled samples are the training input, or for a metric learning objective, where training data are input as positive and negative pairs.

In this section, we first propose a novel deep learning objective that unifies the classification and metric learning criteria. We then introduce a geometry-aware deep transform, and optimize it through standard back-propagation.

We denote this formulation the Geometry-aware Deep Transform (GDT). The GDT objective is a weighted combination of the two formulations; it can be understood as regularizing the metric learning formulation with the classification one.
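The weighted combination can be sketched as a convex mix of a classification term and a pairwise metric term. The weighting scheme, the choice of losses, and the parameter `lam` below are assumptions for illustration; the GDT paper's exact objective may differ.

```python
import numpy as np

def cross_entropy(logits, label):
    """Softmax cross-entropy for a single sample (classification term)."""
    z = logits - logits.max()                 # shift for numerical stability
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def combined_objective(logits, label, x1, x2, same, lam=0.5, margin=1.0):
    """Hypothetical GDT-style objective: weight a classification loss
    against a contrastive metric loss. `lam` (assumed parameter) trades
    off the two; the paper's exact losses and weighting may differ."""
    d = np.linalg.norm(x1 - x2)
    metric = d ** 2 if same else max(0.0, margin - d) ** 2
    return lam * cross_entropy(logits, label) + (1.0 - lam) * metric
```

Setting `lam=1` recovers a pure classification objective and `lam=0` a pure metric learning objective, which makes the regularization reading explicit.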