An Investigation into Self-Generated Child Sexual Abuse Material Networks on Social Media
Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory (SIO) Research Manager Renée DiResta and Chief Technologist David Thiel to discuss a new report on a months-long investigation into the distribution of illicit sexual content by minors online.
Large Networks of Minors Appear to be Selling Illicit Sexual Content Online
The Stanford Internet Observatory (SIO) published a report last week with findings from a months-long investigation into the distribution of illicit sexual content by minors online. The SIO research team identified a large network of accounts claiming to be minors, likely teenagers, who are producing, marketing and selling their own explicit content on social media.
The investigation was informed by a tip from The Wall Street Journal, which included a list of common terms and hashtags indicating the sale of “self-generated child sexual abuse material” (SG-CSAM). SIO identified a network of more than 500 accounts advertising SG-CSAM, with tens of thousands of likely buyers.
Using only public data, this research uncovered and helped resolve basic safety failings in Instagram’s reporting system for accounts suspected of child exploitation, and in Twitter’s system for automatically detecting and removing known CSAM.
Most of the work to address CSAM has focused on adult offenders, who create the majority of such content. These findings highlight the need for new countermeasures from industry, law enforcement and policymakers to address sextortion and the sale of illicit content that minors create themselves.
Front-Page Wall Street Journal Coverage
- A Wall Street Journal article first covered Twitter’s lapse in safety measures to prevent known CSAM from appearing on the site, as well as the importance of researcher access to public social media data for identifying and helping to address such issues. - Alexa Corse / The Wall Street Journal
- Instagram was the focus of a larger Wall Street Journal investigation, based in part on SIO’s research findings. The app is currently the most significant platform for these CSAM networks, connecting young sellers with buyers through recommendation features, hashtag searches and direct messaging. - Jeff Horwitz, Katherine Blunt / The Wall Street Journal
Bipartisan Concern and Calls for Social Media Regulation
The investigation sparked outrage across the aisle in the U.S. and grabbed the attention of the European Commission as the European Union prepares to enforce the Digital Services Act for the largest online platforms later this summer.
- Thierry Breton, the EU commissioner for the internal market, announced that he will meet with Meta CEO Mark Zuckerberg later this month at the company’s Menlo Park headquarters to discuss the report and demand that the company take action.
In Congress, House Energy and Commerce Democrats and GOP Senators were most outspoken about taking action to address the concerning findings.
- Senate Judiciary Ranking Member Lindsey Graham (R-SC) suggested a hearing on the findings during a Senate Judiciary markup session.
- Sen. Tom Cotton (R-AR) @SenTomCotton: “Social media isn’t safe for kids. At a minimum, we should require age verification and parental consent.”
- Sen. Rick Scott (@SenRickScott): “Every parent should read this story. Social media is NOT SAFE for our kids. What is described here is disgusting and needs to be shut down now!”
- House Energy and Commerce Committee Democrats released statements that they were “appalled” and “disgusted” by the role Instagram plays in connecting minors with buyers for abuse content. - Office of Congressman Frank Pallone, Office of Congresswoman Jan Schakowsky
- Rep. Ken Buck (@RepKenBuck): “How do we expect Big Tech companies like @Meta to regulate themselves when they allow vast networks of pedophiles to operate freely? #pedogram”
- Rep. Anna Paulina Luna (@RepLuna): “Instead of meddling in elections, it would be cool if Mark Zuckerburg spent a few Zuckerbucks on cleaning up the Pedogram network.”
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!