Can the Brain Mimic Weight-Sharing by using Local Objective Functions?

Jul 28, 2023

About

Many different forms of weight-sharing have proved important in artificial neural networks. Backpropagation shares weights between the forward and backward passes. Convolutional nets share weights between spatial filters at different locations. Recurrent nets share weights over time. Self-supervised contrastive learning shares weights between different image patches. Transformers share weights between word fragments at different positions within a document. Most significantly, multiple copies of a model running on different hardware share weights to allow very large datasets to be processed in parallel. I will discuss attempts to achieve the effect of weight-sharing in biologically plausible models. These attempts typically try to achieve it by re-using the same weights at different times, and I will conclude that the main feature that makes digital intelligence superior to biological intelligence is the ability to run many copies of exactly the same model on different hardware -- something biology cannot do.
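
To make the notion of weight-sharing concrete, here is a minimal NumPy sketch (my own illustration, not code from the talk): a 1-D convolution re-uses one kernel at every spatial position, and a vanilla recurrent net re-uses one transition matrix at every timestep. The names conv1d and rnn are just illustrative.

```python
import numpy as np

# Sketch: the same 1-D convolution kernel w is applied at every
# spatial position, so all positions share one weight vector.
def conv1d(x, w):
    k = len(w)
    # each output position applies the identical weights w
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

# The same idea over time: a vanilla RNN applies one input matrix W_x
# and one transition matrix W_h at every timestep, sharing weights
# across time rather than across space.
def rnn(xs, W_x, W_h):
    h = np.zeros(W_h.shape[0])
    for x_t in xs:                      # same W_x, W_h re-used each step
        h = np.tanh(W_x @ x_t + W_h @ h)
    return h

rng = np.random.default_rng(0)
x = rng.normal(size=10)
w = rng.normal(size=3)
print(conv1d(x, w))                     # 8 outputs from one shared kernel

xs = rng.normal(size=(5, 4))            # 5 timesteps, 4 features each
W_x = rng.normal(size=(6, 4))
W_h = rng.normal(size=(6, 6))
print(rnn(xs, W_x, W_h))                # one shared transition matrix
```

In both cases the sharing is trivial in software because the same stored weights can simply be read again; the talk's question is how a brain, which cannot copy weights between neurons, could approximate this.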
