
Content provided by Plutopia News Network. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Plutopia News Network or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.

Sophie Nightingale: Our Minds on Digital Technology

1:02:07
 
 

Manage episode 517601026 series 2292604

The Plutopia podcast hosts Dr. Sophie Nightingale, a psychologist at Lancaster University, to discuss how digital technology — especially social media, generative AI, and the constant flow of online information — shapes human memory, judgment, and vulnerability to deception. She explains that people struggle to evaluate critically the sheer volume of information they encounter, so they’re more likely to accept content that aligns with their preexisting beliefs, and this helps misinformation spread. Nightingale traces her research from early work on how taking photos can impair memory to current studies showing that most people can spot fake or AI-generated images only slightly better than chance, and even training improves performance only modestly. She and the hosts dig into the limits of AI “guardrails,” the uneven global landscape of AI regulation, the rise of misogynistic online spaces, and the troubling growth of AI-enabled nonconsensual intimate imagery, arguing that legal reform, platform accountability, and public education are all needed to reduce harm.

One of the things that tends to make people quite susceptible is just information overload, purely that we live in an age where we are accessing so much information all the time we can’t possibly interpret, or critically think about, everything. So we might well just accept things that we wouldn’t otherwise. There’s quite a lot of evidence showing that’s especially the case, if that information coincides with your pre-existing beliefs. So for example, if I happen to be a huge fan of Donald Trump, let’s say, and I saw some misinformation around Donald Trump that was positive about him, then I would probably be more likely to believe that than somebody who was not a fan of Donald Trump already, if you see what I mean. So those biases definitely exist. There’s a lot of evidence showing that. And then I think, you know, it kind of comes back as well to — if you want to believe something, you will.

The post Sophie Nightingale: Our Minds on Digital Technology first appeared on Plutopia News Network.


274 episodes

