Ep 12 - Education & advocacy for AI safety w/ Rob Miles (YouTube host)

1:21:26
 
Content provided by Soroush Pour. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Soroush Pour or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.

We speak with Rob Miles. Rob is the host of the “Robert Miles AI Safety” channel on YouTube, the single most popular AI alignment video series out there: he has 145,000 subscribers and his top video has ~600,000 views. He goes much deeper on alignment than most educational resources, covering important technical topics like the orthogonality thesis, inner misalignment, and instrumental convergence.
Through his work, Robert has educated thousands on AI safety, including many now working on advocacy, policy, and technical research. His work has been invaluable for teaching and inspiring the next generation of AI safety experts and deepening public support for the cause.
Prior to his AI safety education work, Robert studied Computer Science at the University of Nottingham.
We talk to Rob about:
* What got him into AI safety
* How he started making educational videos for AI safety
* What he's working on now
* His top advice for people who also want to do education & advocacy work, really in any field, but especially for AI safety
* How he thinks AI safety is currently going as a field of work
* What he wishes more people were working on within AI safety
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Rob --
* Rob Miles AI Safety channel - https://www.youtube.com/@RobertMilesAI
* Twitter - https://twitter.com/robertskmiles
-- Further resources --
* Channel where Rob first started making videos: https://www.youtube.com/@Computerphile
* Podcast ep w/ Eliezer Yudkowsky, who first convinced Rob to take AI safety seriously through reading Yudkowsky's writings: https://lexfridman.com/eliezer-yudkowsky/
Recording date: Nov 21, 2023
