Local and global structure in neural network representations

Dec 15, 2023

Empirically, and sometimes even theoretically, neural network training objectives lead similar data points to cluster near each other in learned representations. However, the global structure of these representations, i.e., the relative locations of the clusters of similar points, is typically far less constrained. In this talk, I'll first present results from our recent work demonstrating that this global structure matters for downstream tasks that require learning from few examples, and that it can be substantially improved using six orders of magnitude fewer data points than were used for pretraining. I'll then discuss implications of this local/global structure dichotomy for measuring similarity between neural network representations.

Presented at NeurIPS 2023.