Welcome to Episode 265 of The Digital Life, a show about our insights into the future of design and technology. I’m your host, Jon Follett, and with me is founder and cohost, Dirk Knemeyer.
For our podcast this week, we’re going to take a look at facial recognition software and digital disguises. Facial recognition really is everywhere, whether we’re talking about these AI-driven systems that run face-scanning technologies for law enforcement and government or the same types of systems that are working in our social media to help determine who is in the photos that we upload on a day-to-day basis. You can learn an awful lot about people from facial recognition.
There are a number of concerns which we’re going to get into, but just as a starting point, you can learn so much about me. You could see my ethnic background. You could see my age. You could see my current appearance, and how my appearance changes over time if you’re looking at a lot of different photos. So this type of ongoing surveillance tool can be used for the public good, or it can be used to harm people, specifically by identifying them at times when they may not wish to be identified and violating their privacy rights; all sorts of issues come up. Let’s start by talking a little bit about personal privacy expectations and the loss of privacy in the digital age, and then let’s get into the research being done at the University of Toronto on, basically, disguising yourself digitally.
In terms of facial recognition software specifically, there are three primary buckets of use cases. The first is personal online security and identity. One of the problems with all of the historical methods of online privacy and security protection is lack of usability. That’s best captured by the password approach: today you need a safe password, safe passwords are hard to remember, you’re supposed to have different passwords everywhere, and the rules for passwords differ from place to place. It’s a total fiasco to try to manage your online security and identity. Facial recognition is, or was, a technology that promised usability in our personal security and identity. We use our face, we look, and we’re in, which, from a strict usability perspective, is exactly how it should be. Now, given the research that’s being done and some things that are out there, we’ll see why, sadly, this may not be the thing we had hoped it would be.
The second use case is protecting the public good: organizations like the police using the technology to better track us and better identify us in pursuit of their job, which is a civic one, intended to be in our best interest as a society.
The third big use case is corporations using our identity to enrich themselves in various ways: using facial recognition technology to find out information about us, to identify us, to place us, and to leverage that, however they’re using it, to monetize us. At a high level, those are the three contexts to which this facial recognition software pertains.
I think just shifting gears slightly now to talk about some of the concerns that each of those scenarios raises. One, of course, is the idea that in our day-to-day lives we have elements that are private and elements that are public. In addition to those specific elements, there are even moments within our public and private lives which may overlap. For instance, let’s say you’re protesting something. That is part of your public life. At the same time, you’re part of a collective that’s doing something either in support of or against an issue, and that’s not necessarily … It could be something that you’re advertising and want everybody to know about, or it could be something that you want to do anonymously. Maybe you feel like your company, your boss, doesn’t need to know about your protest.
Public protesting almost by definition is not anonymous so that’s … I don’t know. That person is a little deluded in that case.
Well, let me give you an example. Let’s say that in every public protest you are now identified as having gone to that protest because they take a picture of the crowd. So now you can say in this crowd of-
Hundreds or thousands of people [crosstalk 00:06:11].
… thousands, yeah, you specifically were at that protest and you were at another protest. Now your behavior is something that people can track. Now, from knowing people who have protested various issues, I’m familiar with their willingness to be public about their views. At the same time, that does feel like it could be an intimidation factor. If everything you do that may not be in line with how some powerful, call them, authorities think can be tracked and held as a chit against you, that feels like something that could have a chilling effect. It’s the idea that public and private, at least up until this point, have certain overlapping areas where we don’t expect our activities to necessarily be tracked, remembered, or emblazoned on us.
I mean, maybe I’m the one who’s being ignorant or naïve here, but I do. For me, this goes back to the popularity of cell phone video. Now, if someone is pulled over and something happens, somebody has a video of it. If there’s something happening at a protest or an event, everybody’s phones are up, recording it, and have video of it. I may be speaking only for myself here, but I already assume that I’m captured, that I’m documented. Me not being a person of interest, hopefully it’s not being found and leveraged and used and what have you. But I take for granted it’s there. For me, facial recognition technology doesn’t introduce a new issue; it’s just an additional technology that exacerbates an existing one. But that might be unrealistic. I don’t know.
It’s certainly something that bears consideration. I think, Dirk, you are maybe more attuned to the level of technology being inserted into our lives than someone who isn’t in the tech industry and doesn’t have those same expectations. I know that you’re pretty familiar with the way these things work, and that sets a baseline for you.
We’re in the bubble, so maybe sometimes we’re not seeing it the way the average person outside the bubble would see it.
Assuming that there are some things we would like to keep private, or that we don’t necessarily want to be identified in every photo of us, there’s some interesting research at the University of Toronto, where researchers have used artificial intelligence to create an algorithm, a filter that alters a few pixels here and there in a photo, which confuses the AI that would be processing that photo and performing facial recognition on it. You have your photo of, say, the two of us doing this podcast, the filter does its magic, and all of a sudden Facebook’s AI would have a lot of difficulty recognizing that there are two people, Jon and Dirk, conversing on a podcast.
It’d say that there’s two giant melons suspended in air.
Yes, if we’re lucky. The results, which are being presented, I believe, at an IEEE conference, are that they’ve reduced the success rate of AI facial recognition down to 0.5%, so not 5% but 0.5%, which is pretty significant and pretty impressive. The thing that really interested me about this research was that they actually have two competing AIs, one trying to identify the photo and one trying to actively deceive the other AI and force it to misidentify photos. That’s how they came upon their filtering technology.
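The core idea behind this kind of disguise filter, small pixel changes chosen to flip a recognizer's decision, can be illustrated with a toy gradient-sign attack. To be clear, this is a hedged sketch, not the Toronto researchers' actual method or model: the "recognizer" here is a made-up logistic classifier over a flattened image, and all numbers are illustrative assumptions.

```python
# Toy sketch of an adversarial perturbation: nudge each "pixel" by a tiny
# amount in the direction that lowers the recognizer's score, until the
# recognizer no longer identifies the image. The model is a fabricated
# linear classifier, standing in for a real face-recognition network.
import numpy as np

rng = np.random.default_rng(0)

# Fake "face recognizer": logistic-style linear model over a flat 8x8 image.
w = rng.normal(size=64)
b = 0.0

def recognizes(x):
    """Return True if the toy model 'identifies' the face in image x."""
    return (x @ w + b) > 0

# Start from an image the model recognizes.
x = rng.normal(size=64)
if not recognizes(x):
    x = -x  # flip the sign so the starting image is recognized

# Gradient-sign attack (in the spirit of FGSM): for this linear model the
# gradient of the score with respect to x is just w, so step against it.
eps = 0.01
x_adv = x.copy()
for _ in range(1000):
    if not recognizes(x_adv):
        break
    x_adv -= eps * np.sign(w)

print(recognizes(x), recognizes(x_adv))  # True False
```

In a real system the gradient would come from backpropagating through a neural network, and the second, "attacking" AI the hosts describe would learn to produce these perturbations automatically rather than iterating by hand, but the measure-countermeasure shape is the same.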
This idea of the digital disguise, I think this is the start of it. Now, certainly there have been other experiments. I think there have been glasses and hats and masks, all for the same purpose, which is confusing AI facial recognition. But this feels to me like an interesting step because it’s introducing software to combat other software, which shouldn’t be a shock to me, because we have anti-malware and antivirus software to combat malware and viruses, of course, so why wouldn’t we have anti-facial-recognition software? For every software you now apparently have the anti-version. Dirk, how do you see this arms race, because that’s really what it is, proceeding?
Look, it’s measures and countermeasures. It’s as old as any arms race: one side develops something, it’s overcome by the other side, which is then overcome in turn, and so on. It’s an interesting story. It’s interesting in that they are producing the countermeasure for facial recognition almost preemptively, before it’s a huge issue. Here in the United States, for the most part, we don’t have people being thrown into dark rooms and paddy wagons for their participation in protests, to go back to your earlier example. It’s almost being done ahead of the curve. It’s being done before we [inaudible 00:12:58] having to deal with some of these things. Beyond that, it’s what you would expect: here’s a technology; here’s the counter. Then there will be a counter to the counter, and back and forth and back and forth.
Eventually, look, they’re going to be able to recognize us by our faces unless we are wearing masks or something like that. If we start wearing masks, I’m sure they’ll develop technology to use our bodies or some body part or whatever. At some point, an authority will be smart enough not to tell us what they’ve done, not to give the secret away: we know that’s Jon because of his shoulders. Shoulders certainly won’t be it, but whatever it might be. I don’t know. A long time ago, I threw up my hands, stopped worrying, and accepted that I’m being tracked, I’m being followed. All of the stuff that I do online is cataloged somewhere. If it came out, I’d look really silly and I’d have to own it, and I’m ready to own it. If I have to do that, I’m confident that the hidden, stupid stuff of the rest of you would be just as ridiculous as my own. If I have to eat it, I’ll eat it and you’ll eat it too, and hopefully we’ll like it together.
Yeah, that’s it. I don’t know what you call that philosophy, Dirk; the Zen of The Digital Life, maybe. This story is interesting to me because it mirrors the encryption question, voice encryption in particular. You see, mobile communications are very easily intercepted, and there are a number of services, special phone companies, and apps for disguising your signal and making it difficult to listen in on your calls. This feels in some way like facial encryption, the idea that we are camouflaging ourselves. I know there are plenty of sci-fi imaginings of how that might be done further. We can dig into those another time. I’ve certainly watched way too much science fiction, so that would be a lot of fun.
But this idea of our digital selves and our physical selves, this interrelationship, raises sensitive issues around privacy, our ability to function in society, our ability to get jobs, and our ability to communicate with each other. Our digital and physical selves are both separate and tightly intertwined, and I find this evolution maybe a little scary because it feels like I don’t have any kind of control over how these relationships are established and formed. When my digital self is analyzed or spotted, or facial recognition is used on me on Facebook, it feels a little funky. Maybe it’s because I’m not quite a digital native; I’m pre-digital-native, Gen X, so this ability to manipulate and analyze my digital self still feels a little icky to me. When I see University of Toronto researchers doing that, I’m like, “Oh, that’s kind of cool. Maybe I’d like to try that out.”
It might save you, Jon. It might save you. The challenge is that the world moves faster today than our concept of ourselves moves. What I mean by that is we are rooted in the time when we came of age, the way the world was, how we were, our perception of things. We, people in general, don’t change as quickly as the world changes around us, and so every day, as we get a day older, we’re that much further removed from the present. We are rooted in the past in ways that are out of step with the reality of today and with what young people take as correct and of the moment. We have a warped version based on our past as well. So the challenge is to pull yourself forward, not just you, Jon, but all of us, and move at a faster pace so that we can naturally be in step and not need to seek that safety when it comes, but just understand where the world is and where it keeps going.
Yeah, that’s a great conclusion. Listeners, remember that while you’re listening to the show, you can follow along with the things that we’re mentioning here in real time. Just head over to thedigitalife.com, that’s just one L in thedigitalife, and go to the page for this episode. We include links to pretty much everything mentioned by everyone, so it’s a rich information resource to take advantage of while you’re listening, or afterward if you’re trying to remember something that you liked. You can find The Digital Life on iTunes, SoundCloud, Stitcher, Player FM, and Google Play. If you’d like to follow us outside of the show, you can follow me on Twitter @jonfollett. That’s J-O-N-F-O-L-L-E-T-T. Of course the whole show is brought to you by GoInvo, a studio designing the future of health care and emerging technologies, which you can check out at goinvo.com. That’s G-O-I-N-V-O dot com. Dirk?
You can follow me on Twitter @dknemeyer. That’s @ D-K-N-E-M-E-Y-E-R. Thanks so much for listening.
That’s it for Episode 265 of The Digital Life. For Dirk Knemeyer, I’m Jon Follett. And we’ll see you next time.