
Content provided by the Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by the Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined at https://ko.player.fm/legal.

Natalie Evans Harris | Creating A Shared Code Of Ethics To Guide Ethical and Responsible Use of Data

30:31
 

Manage episode 264308732 series 2706384

During her career at the National Security Agency, on Capitol Hill, and at the White House, Natalie Evans Harris saw that while we collected troves of data, we didn't have strong frameworks and governance in place to protect people in a data-driven world. "Data has been used to intrude in our lives. Things are happening based upon data that nobody communicated to the public was actually happening," she explained during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast.

Data ethics and responsible use of data are essentially about building trust. There's a gap in understanding what sharing data means. Two things have to happen if we're going to build a relationship where people allow their data to be used by a company. Individuals have to trust that what the company is doing with that data is something they're okay with. And the company has to be able to prove that it's being responsible with the use of the data. A company could have the best products out there, but if people don't trust it or understand what it's doing with their data, they won't allow it to use that data. And then innovation stops.

She believes the biggest problem is that we do not have a shared vision of what ethical practices mean. We don't want to put broad-impact laws in place to govern responsible use of data while we're still trying to define that vision. To change business practices, we have to change company expectations so that companies are not only incentivized to be ethical and responsible in their business models but also penalized when they violate those expectations.

Harris has been advocating for a data science "code of ethics" to create a shared vision that guides behavior and around which best practices can develop. Companies are now taking this code of ethics and personalizing it to their businesses around principles like informed consent, transparency, fairness, and diversity. Companies then publicize the practices they're putting in place to align with those principles. That's how you start to create that shared vision.

She sees a transformation happening in the relationship between technology and people. For so long, technology has been a passive presence in our lives, and now, with AI, machine learning, and all of these uses of data, there's tension around what technology can do and what humans should do. Until people know and understand what is happening with their data, and until companies can thoughtfully express what they're doing with the data in a transparent fashion, we will continue to have this tension. She hopes this code of ethics can start to ease it.

RELATED LINKS
Connect with Natalie Evans Harris on Twitter (@QuietStormnat) and LinkedIn
Find out more about Natalie on her personal website
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website
Read more about BrightHive and Beeck Center

Listen and Subscribe to the WiDS Podcast on Apple Podcasts, Google Podcasts, Spotify, Stitcher

