Killer Robots in War and Civil Society - Noel Sharkey #TOA15

00:16:05
https://www.youtube.com/watch?v=mL_3CxmmfHY

Summary

TL;DR: The speaker outlines the concerns surrounding the development and use of autonomous weapon systems (AWS), also known as killer robots, which operate without direct human control. He explains the technology behind these systems, the ways they could violate international humanitarian law, and the risk of an arms race. The Campaign to Stop Killer Robots is introduced as an advocacy effort seeking a legally binding prohibition on AWS. The talk urges societal awareness of these developments and of the moral implications of giving machines lethal power, emphasizing the need for political and public action against AWS proliferation.

Takeaways

  • 🤖 Autonomous weapons function without human control.
  • ⚖️ AWS cannot comply with international humanitarian law.
  • 🚨 An arms race in autonomous systems is likely.
  • 🌍 The Campaign to Stop Killer Robots aims for regulation.
  • ✈️ Examples include the X-47B and the Israeli Guardium.
  • ⚠️ AWS may be misused in civilian contexts.
  • 📉 The need for human oversight in military decisions is critical.
  • 🤔 Morality of machines making kill decisions is questioned.

Timeline

  • 00:00:00 - 00:05:00

    In the history of warfare, human control has been pivotal, but modern weapons increasingly include computer chips that mediate between the weapon and the operator. This video discusses autonomous weapon systems, often called killer robots, their implications, and ongoing efforts at the UN to prohibit them. These systems can operate without human intervention to identify and engage targets, raising serious concerns about compliance with international humanitarian law, such as the Geneva Conventions, which require discrimination between combatants and civilians.

  • 00:05:00 - 00:10:00

    The presenter highlights examples of autonomous weapons, including drones and combat aircraft, emphasizing the arms race and international destabilization they could foster. He argues that militaries may falsely believe these weapons will confer lasting superiority, risking proliferation as more countries develop similar technologies. The increasing speed of decision-making in combat is also discussed, raising concerns that automated warfare could become unpredictable and lead to irrevocable consequences during conflicts.

  • 00:10:00 - 00:16:05

    Finally, the speaker reflects on the international campaign to regulate these weapons, which advocates for a legally binding prohibition on machines making lethal decisions. Concerns are also raised about civilian uses of autonomous technologies for policing and the suppression of protests. The overall message stresses the importance of public awareness and discourse on the ethical implications of allowing machines to take human lives, urging collective action against unchecked advances in autonomous weaponry.

Video Q&A

  • What are autonomous weapon systems?

    Autonomous weapon systems (AWS), also known as killer robots, are weapons that can operate and make decisions without human intervention.

  • Why are AWS considered dangerous?

    AWS cannot comply with international humanitarian laws, making it difficult to ensure they discriminate between combatants and civilians.

  • What efforts are being made to regulate AWS?

    The Campaign to Stop Killer Robots aims to establish a legally binding prohibition on AWS at the United Nations.

  • How quickly is autonomous weapon technology developing?

    There is rapid development in AWS technology, raising concerns of an arms race among high-tech nations.

  • What is the principle of proportionality in warfare?

    The principle of proportionality allows certain civilian harm if it is proportional to the military advantage gained, but AWS struggle to assess military advantage accurately.

  • What are some examples of AWS in development?

    Examples include the US X-47B drone, the Israeli Guardium, and autonomous combat aircraft from various countries.

  • What concerns arise regarding civilian use of AWS?

    AWS may be used for policing, border control, and suppressing protests, which poses risks to civilian safety.

Subtitles (en)
  • 00:00:08
    up until now you've known in the history
  • 00:00:11
    of weapons there's always been a human
  • 00:00:13
    in direct control so they would point
  • 00:00:15
    pull a trigger fire a bow or whatever
  • 00:00:18
    but in most modern weapons there's high
  • 00:00:21
    tech not the obviously not not some
  • 00:00:24
    rifles but but there in most weapons now
  • 00:00:27
    there's a computer chip between the
  • 00:00:30
    weapon and the person so the person
  • 00:00:33
    pulls the trigger and there's a computer
  • 00:00:35
    chip that does other stuff like
  • 00:00:36
    targeting so what I want to talk to you
  • 00:00:38
    about today is killer robots or more
  • 00:00:42
    academically autonomous weapon systems
  • 00:00:44
    or lethal autonomous weapon systems as
  • 00:00:47
    they call them at the UN now what I'm
  • 00:00:49
    going to do is want to tell you about
  • 00:00:50
    these I'm going to tell you what's wrong
  • 00:00:52
    with them I'm going to show you some of
  • 00:00:54
    them and then I'm going to tell you but
  • 00:00:55
    some work we're doing at the United
  • 00:00:57
    Nations to have them prohibited and then
  • 00:00:59
    I'll go on to tell you why they're a
  • 00:01:01
    danger to you in your everyday lives so
  • 00:01:05
    this is an autonomous weapon system and
  • 00:01:07
    what I mean here is that you've got a
  • 00:01:09
    Nordic guy here and as soon as the lever
  • 00:01:12
    is pulled as soon as this weapon is
  • 00:01:14
    activated as soon as this robot is
  • 00:01:16
    activated it could be in the air under
  • 00:01:18
    the sea on the sea or on land as soon as
  • 00:01:21
    it's activated oops
  • 00:01:26
    a human isn't in control completely a computer
  • 00:01:29
    is in control now quick primer on
  • 00:01:31
    robotics that will take me about two
  • 00:01:33
    minutes and is everything you need to
  • 00:01:34
    know about robotics for this to get to
  • 00:01:38
    use this weapon you have to have sensors
  • 00:01:40
    that input into your computer so the
  • 00:01:42
    computer is detecting the world and what
  • 00:01:45
    it's got to do it's got you've got
  • 00:01:46
    lasers you've got sonar you've got audio
  • 00:01:48
    you've got cameras you've got all sorts
  • 00:01:50
    of feeds okay that inputs into the
  • 00:01:53
    computer that information is then
  • 00:01:55
    processed and that's a very difficult
  • 00:01:57
    part it's still very difficult for
  • 00:01:59
    robots to be able to process we can't do
  • 00:02:01
    visual recognition too well facial
  • 00:02:04
    recognition yes but not objects so then
  • 00:02:08
    once that's been processed the
  • 00:02:10
    information is sent to a motor simple as
  • 00:02:13
    that and then the robot will move around
  • 00:02:15
    according to the information so it's
  • 00:02:18
    processed and a control signal is sent
  • 00:02:20
    to a motor and that's all fine
  • 00:02:22
    not against that because I've been
  • 00:02:23
    working in robotics for about 40 years
  • 00:02:25
    and I like I like autonomous robots but
  • 00:02:28
    it's this bit I don't like which is it's
  • 00:02:30
    processed and automatically fires a
  • 00:02:33
    weapon so these weapons will go out find
  • 00:02:36
    their own targets without any human
  • 00:02:38
    involvement and kill them right
  • 00:02:40
    again without human involvement so we'll
  • 00:02:42
    seek out someone or seek out a group of
  • 00:02:44
    people or a target and destroy it
  • 00:02:46
    without any human involvement now I could
  • 00:02:48
    show you one of these I can actually
  • 00:02:51
    build one of these next week and bring
  • 00:02:53
    it in here and kill all of you without
  • 00:02:55
    damaging the furniture much and how you
  • 00:02:58
    do that is you can use simple household
  • 00:03:00
    alarms like burglar alarms that detect
  • 00:03:03
    heat moving heat so this little robot
  • 00:03:06
    I've got on here has got two heat
  • 00:03:08
    sensors that's a robot that's the wheels
  • 00:03:10
    it's a silly one the program says if heat's
  • 00:03:13
    detected turn the robot until
  • 00:03:16
    both sensors are detecting it and then
  • 00:03:18
    fire the weapon here we go okay simple
  • 00:03:27
    as that and what's wrong with that well
  • 00:03:29
    a lot of things are wrong with it the
  • 00:03:31
    main thing that's wrong with it is this
  • 00:03:33
    type of weapon cannot comply with
  • 00:03:35
    international humanitarian law the laws
  • 00:03:37
    of war you might know them as the Geneva
  • 00:03:39
    Conventions they require that you've got
  • 00:03:42
    to any weapon with any weapon you've got
  • 00:03:43
    to be able to discriminate between a
  • 00:03:45
    combatant and a civilian between a
  • 00:03:47
    target that's military and a civilian
  • 00:03:50
    target you've got to be able to do that
  • 00:03:52
    with any weapon system okay
  • 00:03:54
    these can't do that at all yet I mean
  • 00:03:56
    and maybe in some distant future but
  • 00:03:58
    it's not just a visual classification
  • 00:04:00
    you think of these things working in the
  • 00:04:02
    fog of war and trying to discriminate
  • 00:04:04
    between you've got to also be able to
  • 00:04:06
    tell the difference between a combatant
  • 00:04:07
    and a surrendering soldier surrendering
  • 00:04:10
    soldier doesn't have to put their hands
  • 00:04:11
    up they can just be shot and go oh
  • 00:04:14
    whatever and you've got to be able to
  • 00:04:16
    not shoot mentally ill soldiers as well
  • 00:04:19
    so that that's that that's one of the
  • 00:04:22
    problems the other problem is under the
  • 00:04:24
    laws of war and I don't like this much
  • 00:04:26
    but it's it's it's legal to kill
  • 00:04:28
    civilians under certain circumstances
  • 00:04:31
    and damage civilian property providing
  • 00:04:33
    it's proportional to direct military
  • 00:04:35
    advantage it's called the principle of
  • 00:04:37
    proportionality now the problem with
  • 00:04:40
    that is that nobody knows what military
  • 00:04:43
    advantage is it's something an
  • 00:04:44
    experienced commander on the ground
  • 00:04:46
    thinks about and knows about and can
  • 00:04:48
    talk about but not everybody knows this
  • 00:04:50
    a machine certainly not an algorithmic
  • 00:04:53
    thing and not yet anyway maybe in
  • 00:04:55
    hundred years or so I don't know but I
  • 00:04:57
    don't think so
  • 00:04:58
    so those are two of the problems the
  • 00:05:00
    other one is you must always check the
  • 00:05:02
    legitimacy of the target my point is
  • 00:05:04
    that there's no reliable guaranteed way
  • 00:05:07
    that these machines can do that ok
  • 00:05:10
    that's just some of the problems they
  • 00:05:11
    let me show you some of the weapons this
  • 00:05:16
    is the x-47b from the United States if
  • 00:05:19
    this clicker ever works thank you
  • 00:05:23
    meet the x-47b a brilliant technological
  • 00:05:27
    dream machine that is the future of US
  • 00:05:30
    Navy unmanned aviation the x-47b has
  • 00:05:35
    been designed for use on Nimitz-class
  • 00:05:37
    aircraft carriers
  • 00:05:38
    its tailless batwing shape will make it
  • 00:05:41
    the stealthiest unmanned system ever to
  • 00:05:43
    take to the skies so you can see it's a
  • 00:05:48
    quite advanced stage now and it's
  • 00:05:51
    designed to be used in the Pacific where
  • 00:05:53
    the Chinese now have aircraft carrier
  • 00:05:55
    busting missiles so you can set the
  • 00:05:57
    aircraft carriers back they can go ten
  • 00:05:58
    times further than the normal F-35
  • 00:06:01
    fighter jet still in prototype the UK
  • 00:06:04
    have one as well this is not working and
  • 00:06:11
    that's called the Taranis and then
  • 00:06:12
    there's a bunch of others I'm going to
  • 00:06:14
    show you this is just a sample there's
  • 00:06:15
    lots of them so that's the Israeli
  • 00:06:20
    Guardium that's an autonomous
  • 00:06:23
    submarine-hunting ship from DARPA
  • 00:06:25
    the research wing of
  • 00:06:28
    the Pentagon that's also from DARPA
  • 00:06:30
    which is called the Crusher and this is the
  • 00:06:33
    Chinese air-to-air combat aircraft fully
  • 00:06:36
    autonomous ok nope
  • 00:06:39
    after that I have to check where I'm
  • 00:06:41
    going here sorry
  • 00:06:43
    oh yes but it's not just the problems I
  • 00:06:47
    told you what what people are talking
  • 00:06:49
    about particularly in the United States
  • 00:06:51
    and that if you read a lot of the plans
  • 00:06:52
    and the think-tank information in United
  • 00:06:54
    States what they're talking about
  • 00:06:56
    is they've lost military domination in
  • 00:06:59
    the world other countries have caught up
  • 00:07:00
    and gone past them there's a lot of
  • 00:07:02
    worry about China and the Pacific and so
  • 00:07:05
    the idea really is to think we'll get
  • 00:07:07
    domination with these autonomous weapons
  • 00:07:10
    and that is an extremely blinkered
  • 00:07:12
    approach and that's why we need to reach
  • 00:07:14
    the international community people are
  • 00:07:16
    thinking this will give us domination
  • 00:07:18
    and when what they're not thinking about
  • 00:07:20
    is that domination is very temporary
  • 00:07:22
    because everybody will have them
  • 00:07:24
    everybody will get them maybe not
  • 00:07:26
    everybody but certainly every high-tech
  • 00:07:28
    nation will get them there'll be a new arms
  • 00:07:30
    race and there will be proliferation all
  • 00:07:34
    over the place we've already seen this
  • 00:07:35
    with drones drone strikes happen not
  • 00:07:38
    very long ago 2001 and now we have 87
  • 00:07:43
    countries with them and 30 countries
  • 00:07:45
    have armed drones so that proliferated
  • 00:07:47
    really quickly and this is the next step
  • 00:07:49
    where there's no operator so it's all
  • 00:07:51
    bells and whistles the other thing is
  • 00:07:53
    that people keep saying when I talk to
  • 00:07:55
    military advisers at the UN what the big
  • 00:07:59
    thing is they say the pace of battle is
  • 00:08:01
    getting so fast that humans aren't
  • 00:08:04
    quick enough to make decisions so you
  • 00:08:06
    have things like the Iron Dome in
  • 00:08:07
    Israel which shoots down missiles okay
  • 00:08:09
    that's fine but they say that the pace
  • 00:08:11
    of battle was increasing all the time the US
  • 00:08:14
    have developed an unmanned aircraft
  • 00:08:16
    called the Falcon that travels at 22,000
  • 00:08:19
    kilometers an hour and the idea is to
  • 00:08:22
    get anywhere on the planet within the
  • 00:08:23
    window of one hour so you've got this
  • 00:08:25
    speed-up that humans can't keep up with
  • 00:08:27
    my answer to that is slow down there's
  • 00:08:30
    no rush to kill more people really and
  • 00:08:32
    we I don't know what to do but the pace
  • 00:08:34
    of battle is not much we can do but if
  • 00:08:36
    you combine that if there are computer
  • 00:08:37
    scientists here you'll know what I'm
  • 00:08:39
    talking about
  • 00:08:39
    if you have algorithms on these on these
  • 00:08:42
    things and they're talking about swarms
  • 00:08:44
    of them that's that's can you press that
  • 00:08:47
    so the big thing is to swarm that's what
  • 00:08:50
    they're trying to do so those X-47Bs
  • 00:08:52
    would work as a swarm you imagine one
  • 00:08:54
    swarm approaching
  • 00:08:55
    another swarm nobody knows what the other
  • 00:08:57
    swarms algorithms are you'd have to be
  • 00:09:00
    insane to tell them what your algorithm
  • 00:09:02
    is otherwise they would be able to
  • 00:09:04
    defeat it so you've got unknown
  • 00:09:05
    algorithms against each other and what's
  • 00:09:07
    gonna happen then I don't know uh nor
  • 00:09:10
    does anyone else it's totally
  • 00:09:11
    unpredictable and not good for civilian
  • 00:09:13
    populations that's for sure there's also of
  • 00:09:16
    course the problem of hacking as the US
  • 00:09:19
    have talked about quite a lot spoofing
  • 00:09:21
    and trickery by the enemy so all these
  • 00:09:24
    things when you're using completely
  • 00:09:25
    automated weapons and one of my
  • 00:09:28
    biggest concerns is once this starts you
  • 00:09:30
    imagine there's an accidental conflict
  • 00:09:33
    where one set of weapons approaches
  • 00:09:35
    another country they accidentally get
  • 00:09:37
    into a conflict now you can't stop it
  • 00:09:39
    because the first person to stop will
  • 00:09:41
    lose and that's one of the big problems
  • 00:09:43
    so we're looking here at the first step
  • 00:09:45
    towards the full automation of warfare
  • 00:09:47
    essentially and that's what I'm scared
  • 00:09:49
    about I'm not talking about a
  • 00:09:50
    technological singularity I'm talking
  • 00:09:53
    about dumb automation of warfare not
  • 00:09:55
    great so what can we do about this what
  • 00:09:58
    can we do to stop it well in 2009 after
  • 00:10:00
    I'd worked on it for about three years I
  • 00:10:03
    co-founded the International Committee
  • 00:10:05
    for robot arms control and we wrote lots
  • 00:10:07
    of articles for newspapers
  • 00:10:09
    we went and gave lots of talks to
  • 00:10:11
    militaries I talked about militaries
  • 00:10:13
    from about 30 countries then but we
  • 00:10:15
    couldn't get nations to speak to each
  • 00:10:17
    other
  • 00:10:18
    at all and that went on for three years
  • 00:10:20
    really trying hard but talking to
  • 00:10:23
    individual nations but not getting
  • 00:10:24
    nations to speak to each other then in
  • 00:10:27
    2012 November 2012 in New York I went
  • 00:10:31
    and talked to a group of activists who
  • 00:10:33
    managed to get landmines banned cluster
  • 00:10:35
    munitions banned and blinding lasers
  • 00:10:37
    banned and you can see us here that's me
  • 00:10:40
    there and on my other side is Jody
  • 00:10:42
    Williams Nobel Peace Prize winner and
  • 00:10:44
    then on that side is another Nobel Peace
  • 00:10:46
    Prize winner for Pugwash
  • 00:10:48
    and so we got a bunch of Nobel Peace
  • 00:10:51
    Prize winners together and a large
  • 00:10:52
    number of NGOs including Human Rights
  • 00:10:54
    Watch and Amnesty International and we
  • 00:10:57
    launched a campaign called the campaign
  • 00:10:59
    to stop killer robots in 2013 from the
  • 00:11:02
    UK Parliament that wasn't my idea for a
  • 00:11:05
    title because I'm an academic but it
  • 00:11:07
    turns out it's a good title it's catchy
  • 00:11:09
    so it's you know that sort of thing it
  • 00:11:11
    works all right I would call them autonomous
  • 00:11:13
    weapon systems but that's not so
  • 00:11:15
    good the campaign to stop autonomous
  • 00:11:17
    weapons systems isn't as good
  • 00:11:19
    I admit so can I have the next slide so
  • 00:11:21
    what what are we trying to do in this
  • 00:11:22
    campaign well after six months which is
  • 00:11:25
    rapid we managed to talk to the
  • 00:11:27
    ambassadors particularly the US
  • 00:11:29
    delegation who were actually very
  • 00:11:30
    helpful and the French delegation and
  • 00:11:33
    the German delegation all of whom helped
  • 00:11:35
    us and we went that's the wrong slide
  • 00:11:38
    yeah thank you again okay that's good so
  • 00:11:44
    we went to we went to these ambassadors
  • 00:11:46
    I mean there's a committee called the
  • 00:11:48
    CCW at the UN and that's the committee
  • 00:11:51
    that bans weapons and there are a
  • 00:11:53
    hundred and twenty nations there and we
  • 00:11:55
    had to convince all 120 that this was a
  • 00:11:58
    problem worth looking at and that was
  • 00:12:00
    November 2013 and we managed to do that
  • 00:12:03
    any one of them could have vetoed it but
  • 00:12:06
    they didn't so in 2014 they the UN
  • 00:12:09
    convened a large expert meeting for four
  • 00:12:12
    days and we talked at that and then they
  • 00:12:15
    convened another one this year for five
  • 00:12:17
    days and now 80 countries have spoken at
  • 00:12:19
    the UN about this issue and it's now the
  • 00:12:22
    hottest topic in UN disarmament circles
  • 00:12:24
    so we were very successful there but we
  • 00:12:26
    still don't have it there are five protocols
  • 00:12:29
    and we're after protocol number six for
  • 00:12:31
    a complete legally binding prohibition
  • 00:12:33
    on these on the kill decision being
  • 00:12:36
    taken by weapons that's what we're after
  • 00:12:38
    and some people the big question is what
  • 00:12:41
    sort of control do we want to cede to
  • 00:12:43
    machines really morally do we really
  • 00:12:45
    want machines to kill people on their
  • 00:12:47
    own that's a big decision we have to
  • 00:12:49
    make as a planet really I think and
  • 00:12:51
    maybe some people like that idea maybe
  • 00:12:54
    some people don't I certainly don't and
  • 00:12:56
    and I knew that Germany doesn't because
  • 00:12:58
    the first principle of the german
  • 00:13:00
    constitution is dignity human
  • 00:13:04
    dignity and Germany decided the Supreme
  • 00:13:07
    Court decided that even if there was a
  • 00:13:09
    plane loaded with terrorists and
  • 00:13:11
    passengers about to fly into the center
  • 00:13:14
    of Berlin and kill a lot of people they
  • 00:13:16
    would not shoot down that plane over a
  • 00:13:18
    deserted area because of the dignity of
  • 00:13:21
    the passengers
  • 00:13:22
    whether you think that's right or not I
  • 00:13:24
    don't know just telling you the facts
  • 00:13:25
    okay so lastly and I am just coming to
  • 00:13:28
    the end the problem is that at the CCW
  • 00:13:31
    we can get this changed for the laws of
  • 00:13:33
    war but they could still be used in the
  • 00:13:36
    civilian world and we're beginning to
  • 00:13:37
    see it happening already
  • 00:13:39
    and that's one of my worries for border
  • 00:13:41
    control for policing for the suppression
  • 00:13:44
    of strikes the suppression of
  • 00:13:46
    populations okay can I have the next one
  • 00:13:48
    so this thing here for this is one of
  • 00:13:50
    the first ones I've seen now this was
  • 00:13:52
    last year a company called
  • 00:13:54
    Desert Storm or desert wolf I'm not sure
  • 00:13:56
    Desert Wolf developed this and sold them
  • 00:13:59
    to 25 mining companies the quadcopter
  • 00:14:02
    well they're more than quadcopters
  • 00:14:04
    they're drones and they deliver pepper
  • 00:14:06
    spray and fire plastic balls and they're
  • 00:14:09
    specially designed to break up strikes
  • 00:14:11
    from miners okay
  • 00:14:13
    they also said they've sold them to law
  • 00:14:15
    enforcement agencies from a number of
  • 00:14:17
    countries but they wouldn't say who the
  • 00:14:19
    countries where now there's a fully
  • 00:14:21
    autonomous one in Texas and only in
  • 00:14:24
    Texas would you find this yet can have
  • 00:14:25
    that can I have that okay and you can
  • 00:14:30
    see here it says we deliver eighty
  • 00:14:32
    thousand volts of Awesomeness and you're
  • 00:14:34
    going to see a two second video clip
  • 00:14:36
    here of this device it's called
  • 00:14:38
    Cupid because it fires a dart but it
  • 00:14:41
    happens to be a taser dart so can we
  • 00:14:44
    just
  • 00:14:49
    he's just been shot by Cupid by the dart
  • 00:14:52
    now the idea of this device is it hovers
  • 00:14:55
    above your property if someone intrudes
  • 00:14:57
    on your property it says leave this
  • 00:14:59
    property and if you don't go
  • 00:15:01
    it fires a taser dart at you and keeps
  • 00:15:03
    you under electric shock until the
  • 00:15:05
    authorities arrive or somebody arrives
  • 00:15:07
    to arrest you okay so so this is coming
  • 00:15:10
    to the civil world and I'm telling you
  • 00:15:11
    this because you need to be watching out
  • 00:15:13
    for it
  • 00:15:13
    the reports tend to be in technology
  • 00:15:16
    pages and other places okay so finally
  • 00:15:19
    so what I'm saying here is we mustn't
  • 00:15:21
    sleepwalk into this world we mustn't
  • 00:15:24
    just walk into it without thinking we
  • 00:15:26
    really need to think about what we're
  • 00:15:28
    doing here we really need to be getting
  • 00:15:30
    together about it we're working very
  • 00:15:32
    hard political campaigning but other
  • 00:15:34
    people need to take action and an
  • 00:15:36
    action you can take is just to tell your
  • 00:15:37
    friends about it and discuss it and
  • 00:15:40
    thank you very much for listening
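The sense-process-act loop Sharkey describes, and the two-heat-sensor demo robot he shows (turn the robot until both sensors detect the heat source, then fire), can be sketched as a few lines of code. This is my own illustration of the control logic he narrates, not code from the talk; all names are hypothetical.

```python
# Minimal sketch of a sense -> process -> act control step, assuming two
# boolean heat sensors (like the burglar-alarm detectors in the talk).
# The "act" output is a command string a motor/trigger layer would consume.

def control_step(left_hot: bool, right_hot: bool) -> str:
    """Map one sensor reading to one actuator command."""
    if left_hot and right_hot:
        return "fire"        # both sensors detect the target: trigger
    if left_hot:
        return "turn_left"   # heat is off to the left: rotate toward it
    if right_hot:
        return "turn_right"  # heat is off to the right: rotate toward it
    return "search"          # nothing detected: keep scanning
```

The point of the sketch is how little is needed: the objectionable step is only the final `"fire"` branch, where processing feeds a weapon instead of a motor, with no human in the loop.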
Tags
  • Autonomous Weapons
  • Killer Robots
  • International Humanitarian Law
  • Military Technology
  • Campaign to Stop Killer Robots
  • Arms Race
  • Civilian Safety
  • Robotics
  • Artificial Intelligence
  • Discrimination in Targeting