
Content provided by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.

When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer

49:10
 

Content Warning: This episode contains references to suicide, self-harm, and sexual abuse.

Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she’s suing the company that made those chatbots. On today’s episode of Your Undivided Attention, Aza sits down with journalist Laurie Segall, who's been following this case for months. Plus, Laurie’s full interview with Megan on her new show, Dear Tomorrow.

Aza and Laurie discuss the profound implications of Sewell’s story for the rollout of AI. Social media began the race to the bottom of the brain stem and left our society addicted, distracted, and polarized. Generative AI is set to supercharge that race, exploiting the human need for intimacy and connection amid a widespread loneliness epidemic. Unless we put guardrails on this technology now, Sewell’s story may be a tragic sign of things to come. But it also presents an opportunity to prevent further harm moving forward.

If you or someone you know is struggling with mental health, you can reach out to the 988 Suicide and Crisis Lifeline by calling or texting 988; this connects you to trained crisis counselors 24/7 who can provide support and referrals to further assistance.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA

The CHT Framework for Incentivizing Responsible AI Development

Further reading on Sewell’s case

Character.ai’s “About Us” page

Further reading on the addictive properties of AI
RECOMMENDED YUA EPISODES

AI Is Moving Fast. We Need Laws that Will Too.

This Moment in AI: How We Got Here and Where We’re Going

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

The AI Dilemma


123 episodes
