BI 120 James Fitzgerald, Andrew Saxe, Weinan Sun: Optimizing Memories
Manage episode 307675538 series 2422585
Support the show to get full episodes, full archive, and join the Discord community.
James, Andrew, and Weinan discuss their recent theory about how the brain might use complementary learning systems to optimize our memories. The idea is that the hippocampus creates our episodic memories of individual events, full of particular details, and that a complementary process slowly consolidates those memories in the neocortex through mechanisms like hippocampal replay. The new idea in their work suggests a way for the consolidated cortical memory to become optimized for generalization, something humans are known to be capable of but deep learning has yet to achieve. We discuss what their theory predicts about how the "correct" process depends on how much noise and variability there is in the learning environment, how their model handles this, and how it relates to our brain and behavior.
- James' Janelia page.
- Weinan's Janelia page.
- Andrew's website.
- Twitter:
- Paper we discuss:
- Andrew's previous episode: BI 052 Andrew Saxe: Deep Learning Theory
0:00 - Intro
3:57 - Guest Intros
15:04 - Organizing memories for generalization
26:48 - Teacher, student, and notebook models
30:51 - Shallow linear networks
33:17 - How to optimize generalization
47:05 - Replay as a generalization regulator
54:57 - Whole greater than sum of its parts
1:05:37 - Unpredictability
1:10:41 - Heuristics
1:13:52 - Theoretical neuroscience for AI
1:29:42 - Current personal thinking