
Bull Session

Policing with Robots

July 14, 2016          

Episode Summary

On this episode of The Digital Life we discuss the consequences of the Dallas police using a robot to kill a gunman who had shot and killed five officers, wounding many others.

For some observers, the fact that the robot delivered the explosive that ultimately killed the sniper has been cause for alarm; this is the first time that police have used a robot like this in a deliberately lethal manner. However, unlike the famous dystopian sci-fi movies of our popular culture, such as The Terminator, this robot was not autonomous; it was remote controlled. In fact, the robot model is currently used by police and the military to dispose of bombs. It clearly wasn't designed to be a weapons system, and is not part of a greater strategy for police use, at least for now.

Unsurprisingly, this incident contributes to the "killer robot" debate, held at the UN and elsewhere, where policy makers struggle to determine the ethics of battlefield robots. Does this event in Dallas become a precedent, prototyping future use? Robots are particularly good at repetitive, dirty, dangerous jobs. It remains to be seen whether police robots, coupled with ad hoc, tactical, creative problem solving in emergencies, become further involved in such lethal scenarios.

 
Resources:
Police used a robot to kill — The key questions
Scientists Debate Killer Robots at U.N. Conference

Jon:
Welcome to Episode 164 of The Digital Life, a show about our insights into the future of design and technology. I am your host, Jon Follett, and with me is founder and co-host Dirk Knemeyer.

Dirk:
Hello Jon.

Jon:
Hello Dirk. For our podcast this week, we are gonna talk a little bit about the police in Dallas using a robot to kill the sniper, in the wake of the tragic fatal shootings of five police officers last week. As you pointed out off the air, Twitter and the social media are all in a kerfuffle about this use of the robot for a couple of different reasons. Of course, let's set up the scenario: this robot, which was a bomb disposal robot, was actually used to deliver an explosive.

The police used it to deliver the explosive and kill the sniper. This bomb disposal robot is used by both police and military, and this is really the first time that domestic police have used a robot in this deliberately lethal way. In that sense, it does play a bit into the debate about so-called "killer robots," right? Which is being discussed right now, as are the autonomous, lethal robot war scenarios being debated in the UN and elsewhere, because of course we need rules for battlefield robots. But unlike those debates, just to be very clear, this robot was not autonomous.

It was a remote-controlled bomb disposal unit, and it wasn't designed to have a weapons system as part of it. The bomb was attached to it. Also, importantly, this is not part of a larger tactical use of robots in this way. This is not like a new approach to policing. This was sort of in-the-moment, ad hoc problem solving by police who were in a combat situation. In this particular case, the robot was basically used to protect human life, insofar as a policeman or a police sniper didn't have to put themselves at risk to get rid of this bad actor.

There are lots of intertwining pieces that we can discuss. First I just wanted to get an idea of your general sense of this debate and of robots being used in this way.

Dirk:
Sure. There's lots of interesting things to talk about here, but let me preface with something else. This obviously isn't a political show, so I don't want to overly politicize this episode, but I do want to say just a few things, because the context of this happening is meaningful to us here in the United States. First, I want to say that I'm saddened by the continued senseless deaths of people of color in their interactions with the law enforcement system, and I'm similarly saddened by the recent spate of basically assassinations of law enforcement individuals by disaffected people.

Second, I'd like to say I'm deeply disturbed, and really in alignment and unison with the idea that people of color today cannot live and work and function in this country in the same way that I can as a white person. I'm very aware of that difference and how that difference is manifesting in death in some of these situations that we have here, and without getting into the specific political movements, I'm very much aligned with the people who are outraged by this and aren't willing to tolerate it anymore. The third thing I want to say, and what I'm going to say here in no way excuses or condones the deaths that have happened, is something that's not talked about as much but is an important nuance to the conversation.

I think it's really fucking hard to be a police officer, to have a job where at any time a call could lead to your death or your being harmed in some really significant way. That doesn't excuse the things that have happened, which in many cases are clearly criminal, but I think that this big social thing we are grappling with, like people just pound their fists on racism as this oblique thing. I think it is much more complicated in the context of policing, in the context of authority, in the context of peacekeeping, and then how the society manifests those things and imposes them upon the people who are less privileged, let's say.

Again this isn’t a political show, and I’m sorry for straying off this way a little bit but …

Jon:
Yeah, that's a pretty thoughtful, balanced take, and hopefully our listeners think so too, because I certainly do. Yeah, let's turn our focus then to the use of robotics here, which is kind of our focus for today. What was your impression on that, Dirk?

Dirk:
Let me start with the snarky and then we can get into some of the real topics. Snarky was like, what is everybody so upset about? We have been using drones, which are robots, to kill people all around the world.

Jon:
That’s right.

Dirk:
For years, and not only just to kill criminals. The use of our drones results in the death of civilians. These are civilians who are in other sovereign nations, and we just kind of shrug our shoulders as we obliterate them with these robotic weapons. I'm sort of bemused by all of the conversation about the robot, the first robot killing, I mean, come on.

Jon:
The domestic, so you are talking about international … Whether it is legal or not is a question, but …

Dirk:
It is not. It is not legal.

Jon:
It is international battlefield use of drones versus domestic use of, whether it be a robot, a drone, whatever you call it. The domestic use I think was what got people's attention.

Dirk:
We can get to that, but I want to be clear. You are putting it in a bucket where you called it battlefield. You used the word twice. It is not battlefields, right? The way that these terror organizations work is that they are woven into the fabric of their society and their neighborhood and their environment. We are not going to a giant military base with a drone. That happens as well, but we are going into a neighborhood, by a school and by a church. We have these so-called smart strikes. They are a little bit stupid, right?

Civilians die. Children die. Doctors die. People who are horrified knowing that this terrorist is in their neighborhood but can't do anything about it, die. Yes, the distinction of US soil is important, and I think it is where the conversation is going to pivot to, but I want us, and by us I'm not meaning you and me, but sort of all of us in the United States/First World privileged technology culture, to be clear that this isn't such a big deal in the context of humanity, of people. It is only a big deal because now it is actually close to home.

We are like, oh no, it is not some faceless person with a hijab on who is getting killed, I don't care about that. Oh, now it is somebody in Dallas, Texas. I was in Dallas, Texas. There are people like me in Dallas, Texas. Now I'm horrified. Now I'm upset. Now it matters to me. So we are gonna talk about that context, but I want everyone to be really, really sanguine and not ignorant of our, I don't know … Yeah, anyway.

Jon:
I mean, the counterpoints that I think of while you are mentioning that: it is very true those are places where people live. They are cities. They are by schools and churches, etc., but the modern battlefield and non-state actors have certainly opened up those areas, to make them, let's not say that they are … There are some tactical choices being made there, both by the bad actors, the terrorists, to be in those spaces and create that sort of tension, and by the US military, who in pursuing them go into those areas because that's where those bad actors are.

Dirk:
Let's explore those. Would it be fair to say that there are members of the US military who have done things on the battlefield in Afghanistan or Iraq or one of these nations that people could legitimately consider war crimes? Is that a fair thing to say or not?

Jon:
Yeah sure.

Dirk:
Okay, given that. Say those people had moved back into society and weren't still part of the military-industrial complex, and the people in, let's call it, Afghanistan … Let's say this hypothetical US military person was committing war crimes in Afghanistan. If the people of Afghanistan had access to drone technology, and if they flew their little drone to, let's say, Boston, and they did a smart strike and killed this person and this person's family and some people in the surrounding neighborhood, how would the United States government react to that? How would you and I feel about that?

Jon:
Yeah I mean there’s an awful lot of grey area I think that’s created by this non-state actor phenomenon.

Dirk:
We have a double standard here, right? We are okay. We are the big bad mighty United States, and we are okay drone striking those bad evil terrorists in places that we can't see and killing civilians that we don't identify with and also can't see, but trust me, if some other country was dropping drone strikes on us, we'd be getting out the heavy equipment. We would be inflicting massive pain and destruction on those nations. They aren't doing it to us because they can't.

Jon:
Well, I think they are doing it to us. That's the asymmetric warfare, right? I would say terrorist strikes in the United States are exactly what you are talking about, but before we get too far afield …

Dirk:
Yeah let’s get back to robots.

Jon:
Let’s get back to robots.

Dirk:
Yeah, yeah.

Jon:
We were talking about the use of robots on American soil and that being a significant factor. I think a point worth raising is that since this robot was not designed for this task, the creativity that was displayed by the police in using the robot in this way is very much in line with what robots are meant to do, which is do very dangerous things. Do jobs that are perhaps dangerous or dirty or would really put a human being in peril. We often think of that job as being on the factory floor, handling something that a human couldn't handle, but if you recall, about a year ago DARPA, the Defense Advanced Research Projects Agency, had a contest or a competition where robots were essentially given a set of very difficult rescue tasks.

This competition got a lot of international news media attention. It was actually inspired by the events at the Fukushima plant meltdown, where human beings were unable to get inside the plant to stop it from melting down, and robots would have been really handy to have to do things like shut off valves. The robots that were created for this competition were able to do many different kinds of things and were meant to be used in ad hoc emergency situations to, in this case, do something like rescue people or shut down a plant that's melting down or what have you, but they are not specific-purpose robots, insofar as they are meant to be used in dangerous situations but are meant to do a lot of things.

One thing that I thought was very interesting, we talk about the delivery of the bomb and the woefulness of this particular action that the robot took but in a lot of ways …

Dirk:
That the robot took. It is very interesting the language here.

Jon:
Yeah, right, or that the remote user of the robot took, but what I'm getting at is that in dangerous scenarios, this is very much what robots are meant to do, and because it wasn't an autonomous thing, I guess I sort of fall along the same line as you do. I don't see as much of the excitement about it, because it was actually a human being using the robot to protect other human beings.

Dirk:
Yeah.

Jon:
You seem to have some response, but I kept plowing through, so I want to give you a moment to …

Dirk:
No, no, it is okay. I was really struck, I mean, aside from the fact that the drones happened far away and this bombing happened in Dallas, another facet of it other than time and place I'm struck by is the anthropomorphic nature. The robot bomb delivery device is a little bit anthropomorphic. It is not like a great human design, but it has in its basic structure some of the … It looks like it has arms. It looks like it has a head, eyes. There is a nod to human structure with that, whereas the drones look like airplanes. They don't look like humans.

I think we are gonna start getting into really tough and interesting questions about life, frankly, when we think about these things. As I was thinking about the outrage, and it wasn't an outrage, but certainly the questions, also the confusion and curiosity that people had about the situation in Dallas, it clearly meant something to people. It was clearly seen as something different from a technology perspective, not just from a social perspective. I think that it is going to take us back to questions of how do we define life. What is …

Not just what is life, but how should we treat different things. If we take robots out of it for a second, there's a continuum, and it is probably even broader than this, but as a simple and crude continuum, you have on one hand people who are basically saying all animal life should be preserved. All animal life is valuable. We shouldn't be killing animals, whether it be for eating, and certainly not for sport. Animal life is valuable. Then we have at the opposite extreme people for whom only human life is valuable. There are even psychopaths and other individuals for whom human life is not valuable, but let's take them off the table for now.

At the other extreme, you have people who are basically saying, look, human life is valuable, none of the rest of it matters. Then you have a lot of people who are in the middle, which is where I would put myself, which is I actually think animal life is valuable, but I'm still having barbecue lunch with you today. I'm not going so far as translating the value I see, and really what my inherent belief system would be, into a change in my behavior. I'm still eating the beef, even though on a moral level I think it is not a good thing, but I'm eating the beef. Okay, so we've got this sort of existing social continuum over how people value different types of life.

Now we are bringing robots into the equation, which is a very different vector, because horses and cows, I guess we are not eating horses, but cows and pigs, the animals that we eat, these are the product of the same biological processes that we ourselves have as humans. They conform to a certain sense of biological life that we can understand and identify with, and that makes sense to us. Now in the context of robots, this is life that we are creating, or at a small level this is already happening, but as the robots get smarter, eventually they will be self-replicating and, at sophisticated levels, recreating themselves.

What does that look like? What I'm trying to tease out of all this is, what's the value system? Where does this lie? Is it okay to, you know, we make this robot, which is just a machine like any other machine. We are happy to bulldoze a factory with a lot of machines in it. For 99.99% of people, they have no value as life. It is just, okay, the machine was created. We kerchunked widgets for a while. Widgets aren't relevant anymore. Bulldoze the machines and move on to the next one. Now we are moving into this more grey area with anthropomorphic machines and artificially intelligent machines, and the question is, when does their destruction matter? This little robot, I'm even sort of imbuing emotion onto it.

This little robot was wheeled into this room to blow itself up along with this perpetrator. Is that okay? I think generally we collectively think that it is now, but is it really, and should it be? I don't know. I think it is a complicated question, but I think we need answers for it. I don't think it is okay. So much in the world and politics is treated as a matter of opinion, and that might've been fine in the old days when we were ignorant. We had to claim the gods or the supernatural were making things happen, but now that we understand the world and understand increasingly how we function, why we do the things that we do, I think that there are … I hate the word truths, but things that are more correct out there, let's say, such that clearly in a question of A and B, one is correct and one is not.

I think we need to figure out where those lines lie around life, because we have really compelling questions to answer about that in the context of robots and artificial intelligence. I think in the context of animals and other creatures, we are already acting in ways that probably are morally consistent with where we should be as a species. Even more chillingly, as we are exploring, and someday possibly expanding, into the stars, if we encounter extra-terrestrial life, that is a whole different vector on the conversation, one that we need to be ready to have.

I think all of these things fit together. I typically like being a problem solver. I'll tend to solve things just whole cloth, but after this thing happened in Dallas and people were obviously so moved by it, I was just sort of shrugging. I thought figuring out life, a definition of life, is probably an important thing. Instead of solving the problem myself, I did some research as to what the leading definitions of life are. Where it took me is that the smartest people, the people who you and I would respect, Jon, the leading scientists, the people at NASA, don't think a definition of life matters. What they think is that there needs to be a life system that we understand and agree on.

There's been some really good work done, interestingly, largely back in the '70s, that has sort of propagated forward. There were two fellows who did a lot of that early work. I don't remember their names off the top of my head, although it would be a good pointer for our listeners, but the idea in science is we've got to figure out what the system looks like and how we define it. Once we understand that, then we can make some decisions around how we should be behaving, whereas right now I'm like, what the hell are we doing?

Jon:
Yeah, a lot of interesting stuff there, Dirk. I think that's probably fodder for a bunch of future episodes if ever there was. I want to leave us with one thought, which is that however we think the cultural intake of this debate will go, it will probably happen faster than we think, and we are gonna be completely unprepared for it, because if we think that the mobile phone or the internet has upended our lives in different ways, and very quickly, we can only imagine what robots, synthetic biology, and artificial intelligence are capable of doing.

I think it is several degrees higher than what we've seen so far.

Listeners, remember that while you're listening to the show, you can follow along with the things that we're mentioning here in real time. Just head over to thedigitalife.com, that's just one "L" in The Digital Life, and go to the page for this episode. We've included links to pretty much everything mentioned by everybody. It's a rich information resource to take advantage of while you're listening, or afterward if you're trying to remember something that you liked. You can find The Digital Life on iTunes, SoundCloud, Stitcher, Player FM, and Google Play. If you want to follow us outside of the show, you can follow me on Twitter at @JonFollett. That's J-O-N F-O-L-L-E-T-T. Of course, the whole show is brought to you by Involution Studios, which you can check out at goinvo.com. That's G-O-I-N-V-O dot com. Dirk?

Dirk:
You can follow me on Twitter at @DKnemeyer. That’s @D-K-N-E-M-E-Y-E-R. Thanks so much for listening.

Jon:
That's it for Episode 164 of The Digital Life. For Dirk Knemeyer, I'm Jon Follett, and we'll see you next time.
