
Popular Mechanistic Interpretability: Goodfire Lights the Way to AI Safety

1:55:33
 
Content provided by Turpentine, Erik Torenberg, and Nathan Labenz. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Turpentine, Erik Torenberg, and Nathan Labenz or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.

Nathan explores the cutting-edge field of mechanistic interpretability with Dan Balsam and Tom McGrath, co-founders of Goodfire. In this episode of The Cognitive Revolution, we delve into the science of understanding AI models' inner workings, recent breakthroughs, and the potential impact on AI safety and control. Join us for an insightful discussion on sparse autoencoders, polysemanticity, and the future of interpretable AI.
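
For listeners new to the area, the core technique named above can be sketched compactly. The following is a minimal illustrative sparse autoencoder, not Goodfire's actual implementation; the dimensions, L1 coefficient, and random stand-in activations are assumptions chosen purely for the sketch:

```python
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    """Toy sparse autoencoder of the kind used in interpretability work
    to decompose a model's activations into candidate features."""

    def __init__(self, d_model: int, d_features: int):
        super().__init__()
        # Overcomplete dictionary: many more features than activation dims
        self.encoder = nn.Linear(d_model, d_features)
        self.decoder = nn.Linear(d_features, d_model)

    def forward(self, x: torch.Tensor):
        # ReLU keeps feature activations non-negative, so the L1 penalty
        # below can drive most of them to exactly zero
        f = torch.relu(self.encoder(x))
        x_hat = self.decoder(f)
        return x_hat, f

def sae_loss(x, x_hat, f, l1_coeff=1e-3):
    # Reconstruction error plus a sparsity penalty on feature activations
    recon = (x - x_hat).pow(2).mean()
    sparsity = f.abs().sum(dim=-1).mean()
    return recon + l1_coeff * sparsity

# Hypothetical usage: 512-dim activations decomposed into 4096 features
sae = SparseAutoencoder(d_model=512, d_features=4096)
x = torch.randn(8, 512)          # stand-in for captured model activations
x_hat, f = sae(x)
loss = sae_loss(x, x_hat, f)
loss.backward()
```

The L1 term pushes most feature activations to zero, so each input is explained by a small set of active features; that sparsity is what makes the learned dictionary a source of candidate human-interpretable concepts, and it is one response to the polysemanticity problem discussed in the episode.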

Papers

Apply to join over 400 founders and execs in the Turpentine Network: https://hmplogxqz0y.typeform.com/to/JCkphVqj

SPONSORS:

Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive

The Brave Search API can be used to assemble a data set to train your AI models and to help with retrieval augmentation at inference time, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave Search API free for up to 2,000 queries per month at https://bit.ly/BraveTCR

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/

Squad: access global engineering without the headache and at a fraction of the cost. Head to https://choosesquad.com/ and mention "Turpentine" to skip the waitlist.

CHAPTERS:

(00:00:00) About the Show

(00:00:22) About the Episode

(00:03:52) Introduction and Background

(00:08:43) State of Interpretability Research

(00:12:06) Key Insights in Interpretability

(00:16:53) Polysemanticity and Model Compression (Part 1)

(00:17:00) Sponsors: Oracle | Brave

(00:19:04) Polysemanticity and Model Compression (Part 2)

(00:22:50) Sparse Autoencoders Explained

(00:27:19) Challenges in Interpretability Research (Part 1)

(00:30:54) Sponsors: Omneky | Squad

(00:32:41) Challenges in Interpretability Research (Part 2)

(00:33:51) Goodfire's Vision and Mission

(00:37:08) Interpretability and Scientific Models

(00:43:48) Architecture and Interpretability Techniques

(00:50:08) Quantization and Model Representation

(00:54:07) Future of Interpretability Research

(01:01:38) Skepticism and Challenges in Interpretability

(01:07:51) Alternative Architectures and Universality

(01:13:39) Goodfire's Business Model and Funding

(01:18:47) Building the Team and Future Plans

(01:31:03) Hiring and Getting Involved in Interpretability

(01:51:28) Closing Remarks

(01:51:38) Outro

197 episodes
