Local and global structure in neural network representations

December 15, 2023

Speakers

About the presentation

Empirically and sometimes even theoretically, neural network training objectives lead similar data points to cluster near each other in learned representations. However, the global structure of these representations, i.e., the locations of clusters of similar points, is typically less constrained. In this talk, I'll first present results from our recent work demonstrating that this global structure is important for downstream tasks that require learning from few examples, and it can be substantially improved with six orders of magnitude fewer data points than were used for pretraining. I'll then discuss implications of this local/global structure dichotomy for measurement of similarity between neural network representations.
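A widely used way to measure similarity between neural network representations, relevant to the local/global distinction above, is linear centered kernel alignment (CKA). The sketch below is illustrative only and is not necessarily the measure used in the talk; the matrix shapes and the orthogonal-invariance check are assumptions for demonstration.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear centered kernel alignment between two representation
    matrices X (n x d1) and Y (n x d2), rows = the same n inputs."""
    X = X - X.mean(axis=0)  # center each feature
    Y = Y - Y.mean(axis=0)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    num = np.linalg.norm(Y.T @ X, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 32))
# CKA is invariant to orthogonal transforms of the representation,
# so rotating X should leave the score at 1.0.
Q, _ = np.linalg.qr(rng.standard_normal((32, 32)))
print(linear_cka(X, X @ Q))
```

Because CKA depends only on inner products between examples, it is sensitive to which points lie near which others (local structure) while being blind to rotations of the whole representation, which is one way the local/global dichotomy shows up in similarity measurement.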




Presented at NeurIPS 2023.