Deep randomized embedding
yifan sun

  1. ECCV 2018, state of the art (SOTA) on CARS196, 9 pages
  2. The number of embeddings: L. The dimension of each embedding: D. D equals the number of meta-classes. A Proxy NCA loss is used for training each embedding (a minimal sketch of both ingredients follows after this list).
  3. The dimension of each embedding: D; the number of embeddings: L.
  4. The dimension of each embedding: D; the number of embeddings: L. Higher than our SSR with a 48-model ensemble.
  5. Discussion 1: Do we really need attributes to enhance feature learning? Samples within a meta-class can be viewed as sharing a latent attribute, so meta-classes correspond to randomized attributes.
  6. Discussion 2: In hidden layers, we may expect some clusters within the dataset. A cluster may be viewed as a meta-class; does employing meta-classes amount to enforcing diversity of clustering? Discussion 3: Encode the original one-hot label into a sequential label (see the second sketch after this list); would learning the embedding with an L2 loss (or KL-divergence loss, etc.) bring about a similar improvement?
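
Slide 2 names two ingredients: a random partition of the original classes into D meta-classes, drawn independently L times, and a Proxy NCA loss over the resulting meta-class proxies. Below is a minimal sketch of both, assuming PyTorch; `make_meta_labels` and `ProxyNCALoss` are my own names, and the loss uses the common softmax-over-negative-distances simplification rather than any particular reference implementation.

```python
import torch
import torch.nn.functional as F


def make_meta_labels(num_classes: int, D: int, L: int, seed: int = 0):
    """Return an (L, num_classes) long tensor: row l maps each original
    class id to one of D meta-classes via an independent, balanced
    random partition."""
    g = torch.Generator().manual_seed(seed)
    rows = []
    for _ in range(L):
        perm = torch.randperm(num_classes, generator=g)
        meta = torch.empty(num_classes, dtype=torch.long)
        meta[perm] = torch.arange(num_classes) % D  # class perm[i] -> meta i % D
        rows.append(meta)
    return torch.stack(rows)


class ProxyNCALoss(torch.nn.Module):
    """Proxy NCA with one learnable proxy per meta-class; the embedding
    dimension equals D here only because slide 2 sets dim = #meta-classes."""

    def __init__(self, num_proxies: int, dim: int):
        super().__init__()
        self.proxies = torch.nn.Parameter(torch.randn(num_proxies, dim))

    def forward(self, emb: torch.Tensor, meta_y: torch.Tensor):
        emb = F.normalize(emb, dim=1)
        prox = F.normalize(self.proxies, dim=1)
        sq_dist = torch.cdist(emb, prox).pow(2)  # (batch, D) squared distances
        # Softmax over negative distances: the target proxy should be
        # closest. (Strict Proxy NCA drops the positive term from the
        # denominator; this simplification is standard in practice.)
        return F.cross_entropy(-sq_dist, meta_y)
```

At retrieval time the L embeddings (one network per row of `make_meta_labels`) would be concatenated into a single descriptor.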
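
Discussion 3 proposes encoding the one-hot label as a sequential label. One concrete reading, sketched below under the same assumptions (PyTorch, `meta_labels` from `make_meta_labels` above, hypothetical helper names): concatenate the L one-hot meta-class codes of a class into a single target vector and regress onto it with an L2 loss.

```python
def sequential_targets(meta_labels: torch.Tensor, y: torch.Tensor, D: int):
    """meta_labels: (L, num_classes); y: (batch,) original class ids.
    Returns a (batch, L*D) target: the L one-hot meta-class codes of
    each class, concatenated into one 'sequential' label."""
    codes = meta_labels[:, y]                 # (L, batch) meta-class ids
    onehot = F.one_hot(codes, num_classes=D)  # (L, batch, D)
    return onehot.permute(1, 0, 2).reshape(y.size(0), -1).float()


def l2_sequential_loss(pred, meta_labels, y, D):
    """L2 regression of the network output (batch, L*D) onto the code;
    swapping in F.kl_div on softmaxed D-sized blocks would give the
    KL-divergence variant mentioned in Discussion 3."""
    return F.mse_loss(pred, sequential_targets(meta_labels, y, D))
```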
