Abstract
Metric learning seeks a transformation of the feature space that enhances prediction quality for a given task. In this work we provide PAC-style sample complexity rates for supervised metric learning. We give matching lower and upper bounds showing that the sample complexity scales with the representation dimension when no assumptions are made about the underlying data distribution. In addition, by leveraging the structure of the data distribution, we provide rates fine-tuned to a specific notion of the intrinsic complexity of a given dataset, allowing us to relax the dependence on representation dimension. We show both theoretically and empirically that augmenting the metric learning optimization criterion with a simple norm-based regularization is important and can help adapt to a dataset's intrinsic complexity, yielding better generalization, thus partly explaining the empirical success of similar regularizations reported in previous works.
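
As a rough illustration of the kind of objective the abstract refers to, the sketch below shows one common form of norm-regularized Mahalanobis metric learning over labeled pairs. The particular loss \(\ell\), constraint set, and choice of norm here are illustrative assumptions and may differ from the exact formulation analyzed in the paper.

```latex
% Sketch: norm-regularized Mahalanobis metric learning.
% Learn a PSD matrix M defining the squared distance
%   d_M(x, x')^2 = (x - x')^T M (x - x'),
% trading off empirical loss on n labeled pairs (x_i, x_i', y_i)
% against a norm penalty on M with weight \lambda.
\min_{M \succeq 0} \;
  \frac{1}{n} \sum_{i=1}^{n}
    \ell\!\big( y_i,\; (x_i - x_i')^{\top} M \,(x_i - x_i') \big)
  \;+\; \lambda \,\lVert M \rVert
```

In this kind of formulation, the norm penalty on \(M\) is what lets the learner adapt to low intrinsic complexity of the data rather than paying for the full representation dimension, which is the effect the abstract highlights.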