
Content provided by SWI swissinfo.ch. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by SWI swissinfo.ch or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ko.player.fm/legal.

New wars, new weapons and the Geneva Conventions

24:54
 


Send us a Text Message.

In the wars in Ukraine and in the Middle East, new, autonomous weapons are being used. Our Inside Geneva podcast asks whether we’re losing the race to control them – and the artificial intelligence systems that run them.

“Autonomous weapons systems raise significant moral, ethical, and legal problems challenging human control over the use of force and handing over life-and-death decision-making to machines,” says Sai Bourothu, specialist in automated decision research with the Campaign to Stop Killer Robots.

How can we be sure an autonomous weapon will do what we humans originally intended? Who’s in control?

Jean-Marc Rickli from the Geneva Centre for Security Policy adds: “AI and machine learning basically lead to a situation where the machine is able to learn. And so now, if you talk to specialists, to scientists, they will tell you that it's a black box, we don't understand, it's very difficult to backtrack.”

Our listeners asked: could an autonomous weapon show empathy? Could it differentiate between a fighter and a child? Last year, an experiment asked patients to rate chatbot doctors against human doctors.

“Medical chatbots ranked much better in the quality. But they also asked them to rank empathy. And on the empathy dimension they also ranked better. If that is the case, then you opened up a Pandora’s box that will be completely transformative for disinformation,” explains Rickli.

Are we going to lose our humanity because we think machines are not only more reliable, but also kinder?

“I think it's going to be an incredibly immense task to code something such as empathy. I think almost as close to the question of whether machines can love,” says Bourothu.

Join host Imogen Foulkes on the Inside Geneva podcast to learn more about this topic.

Please listen and subscribe to our science podcast -- the Swiss Connection.

Get in touch!

Thank you for listening! If you like what we do, please leave a review or subscribe to our newsletter.
For more stories on international Geneva, please visit www.swissinfo.ch/
Host: Imogen Foulkes
Production assistant: Claire-Marie Germain
Distribution: Sara Pasino
Marketing: Xin Zhang


Chapters

1. The Ethics of Autonomous Weapons (00:00:07)

2. The Rise of Empathetic Machines (00:15:49)

120 episodes

