Welcome to episode 155 of the Digital Life, a show about our adventures in the world of design and technology. I am your host Jon Follett, and with me is founder and co-host, Dirk Knemeyer.
Hey, Dirk. For our podcast today, we’re going to talk a little bit about the future of designing embeddables, which, of course, are those things we might be putting into our bodies. Some of those technologies are certainly already here when you’re talking about things like replacing joints, but in the future there’s going to be a lot more silicon going in there as well, as we start putting computer chips and other things into our bodies, which we’re not doing too much of just yet.
You said we may be putting them. We will be putting them in. We can get into this more later, but the whole wearables thing is just a transitional phase, I mean, embeddables are going to be where it’s at. Wearables are going to be clumsy clunky junk.
Yeah, there’s an interesting trend with wearables. They’re getting obviously smaller and closer to your body. You’ve got wearables that are now sort of part of your clothing and of course-
The washing machines are going to love that.
It’s all this continuum approaching the point at which we’re going to start sticking-
Just embed it. Stick it in there, Jon, stick it in there.
In that spirit then, our friends at Google, also known as Alphabet, have started laying the groundwork for their embeddables. In particular, they got a U.S. patent for technology that would let them implant an electronic lens directly into your eye. This follows a trend that we’ve been seeing from Google, which is basically focusing on the eye and enhancing it, right. A few years ago, we had the Google Glass, of course-
Which was wearable technology.
There’s your wearable version. Then, recently, I believe Google was partnering with Novartis to work on a contact lens, which in that case contains some very tiny chip-set for detecting diseases like diabetes. That has yet to come to market, but you can sort of follow their progress. A little farther down the road and what this patent speaks to is actually injecting this technology into the eye and making it part of you, which, for my money, makes you a cyborg of some kind, right. We’ve discussed this before.
You can imagine a host of reasons you might want to do this; if you’re older and your eyesight is failing, maybe this improves it; if you wear glasses or contact lenses, maybe you never have to wear glasses or contact lenses again. These are kind of the everyday problems that this might fix. Of course, you know that Google/Alphabet is not going to stop there. There’s the whole idea of captioning photos or captioning video, or Lord knows what else, maybe seeing very far away, now you’ve got super human abilities, right, hey, I can see dirt coming from a mile away, or having the ability to zoom in really close to things, right, so, hey, I don’t have a microscope anymore, because I can see I just shook hands with you and my hands are covered with bacteria, lovely.
People who have seen HD porn know more resolution is not always a better thing.
I would guess not. I will leave that to the listeners who know what you’re talking about. This trend of technology sort of adhering to us is interesting, but it raises all kinds of issues, right. The same issues that everybody had with Google Glass and the famous moniker: “well, now you’re a glass-hole, because you’re basically recording our conversations, perhaps surreptitiously, you’re snapping photos of people when they’re unaware, you’re invading people’s privacy”-
What’s privacy, that’s not something I’m familiar with?
What is privacy, right. So now you’ve got this embeddable that can do all of those things, and the only way you’d know it is you wouldn’t, unless maybe there’s a glowing light in the center of the guy’s forehead. Then, add to that the fact that you’re going to have some kind of wireless transmission, so, from the perspective of the owner of this technology, the cyborg, all of a sudden they’re a heck of a lot easier for law enforcement to track, because they have this presumably unique signature that they can’t get rid of now.
Additionally, employers could be tracking this, making sure you’re focused on your work and not watching anything you shouldn’t be. To make matters worse, maybe there are hackers who are interested in interrupting your video feed or messing with you in some way. We saw how hackers could get into automated cars, right. I don’t know if hackers would want to get into my eyeballs, but who knows. There are all of these social acceptability questions and all of these other issues that come along with making technology part of us. Dirk, what’s your take on this? I know it’s a big area to weigh in on.
It is big. When you let your mind sort of go crazy and explore, it seems like dystopia all over the place, but, I don’t think the technologies will manifest that way. The technologies can’t manifest that way, and here’s why. To take your example of employers, employers being up in your shit, every damn thing you do at work, it’s not feasible, and the reason it’s not feasible is we as humans are not robots. We are going to rest, we are going to take moments where we are not linearly kerchunking away like John Henry on the railroad on the exact thing the employer wants us to do right in front of ourselves. If that level of monitoring existed, it would spoil the relationship between virtually every employee and every employer everywhere in the world, and that’s not going to happen, so, yes, there are a lot of interesting questions about where this could go, what could happen, how it impacts us, but a lot of them wouldn’t even be manifest because they would undermine the very fabric of reality.
We’re a long way away … I shouldn’t say that, because neuroscience is moving very quickly, but we certainly don’t have a coherent sense yet, and of course it would have to be different for each person because we’re all wired so differently, but we don’t yet have a coherent sense of whether the optimal way to work is four-hour shifts with two and a half hours of kerchunking, half an hour of daydreaming, and fifteen minutes for a power nap. At some point, that kind of stuff will be figured out, but I think we’re a long way away from that, and it would only be in the context of that deeper understanding of how the human animal optimally functions that that sort of analysis of how people are spending their time at work, and what they’re doing, really has any value. Until then, it’s just voodoo.
I’m picking on that one example to sort of push back against the whole waterfall of interesting thought examples you had of these crazy ways it could go. A lot of them aren’t going to go that way because it would be completely undermining to the basic systems and functions we have in place. The ones that we should probably be more concerned about are the ones that would be more at the level of the government, Big Brother, tracking. Right now with our cell phones, we can be tracked in pretty granular ways, probably more so than we realize, and maybe it’s even happening in ways beyond what my naïve little brain would allow for.
I don’t know that embeddables change the game that much. Where I’m interested with embeddables is that, at the end of the day, our eyes, our hands, our mouth, and other parts of our body are part of a UI. They’re part of a user interface between ourselves and the outside world, and we’re going to get to a point where those user interfaces are less important, possibly to the point of obsolescence, because everything can go straight into our brain, into our central nervous system, into the neurological and endocrinological and psychological aspects of who and what we are. So we’ll have direct mind-to-mind communication, be able to picture each other in fulsome ways from across the country or from across the world, and be able to download, not in the literal sense of how we think of download per se, but to download huge chunks of data and thought.
That’s not coming soon, it’s not super close, but we’re on that path. That opens up a lot of really interesting questions, because then the frontier becomes the brain, the frontier becomes the self. Right now, cyber terrorists or hackers are trying to crack our thumbprint. Right now, our thumbprint gets us into our phone. We’re also moving towards ocular technology, right. The high-resolution technologies, which you talked about before, will make it trivial for someone to copy my eye-print. Somebody who is way off, that I don’t even realize is there, could get a picture of my eye at a resolution high enough to reproduce it, and make ocular authentication completely irrelevant. That’s all becoming trivial pretty much as fast as ocular recognition technology itself arrives.
To me, where it gets more freaky and more interesting is when the brain becomes the final battlefield, when we move beyond the eye, the thumb, the lettered passwords, to where the brain is the true essential self that is somehow unlocking systems and communicating externally. Our self-representation in the world is largely from our mind and spirit, whatever that is or isn’t, and that is going to be the frontier of hacking and that is going to be the frontier of terrorism. I think that’s where really interesting stuff starts to come, and now I’m going pretty far down the road.
Yeah, the brain as a hacking surface isn’t one I really want to contemplate. I have enough trouble with my brain as it is. I want to dial this back, maybe, to some of the considerations that we’ve discussed about the intersection of design and, really, science and health, all of which are represented in that Google patent in some way, because coming together now we’ve got all of these different disciplines that are only just starting to work together. When you consider what it will take for this to potentially become a commercialized product, it’s got the digital component, which we’ve discussed at length, and then it also has a surgical component, it has health ramifications, it involves a variety of sciences. Up until now, I don’t think we’ve quite seen this cross-pollination of disciplines.
From a design perspective, where Google is taking some of this technology is absolutely fascinating to me, because it says to me that the interesting design disciplines that will sprout up around products like the Google embeddable lens, or whatever this thing is going to be called, are sort of the next frontier of design, at least from my perspective, because with embeddables, your skin is the interface, your senses are the interface. I think that opens up a wide variety of areas to explore. We’re starting right now with the screen paradigm, which is slowly giving way in some ways to audio and voice and things like that. As you pointed out-
Yeah, there’s this whole next frontier of the human body where design and science are going to interact in ways that we’re not quite sure of yet, but when you think about the evolution of interaction design, how it’s moved forward through digital space, I really can’t wait to see what these design disciplines are in the future, because frankly, I’d love to be a part of that. At the same time, it’s a little bit frightening, because, of course, the level of knowledge required just to have table stakes here means there’s a lot of learning to do.
There is a lot of learning to do, and we mysticize and privilege humanity, but we really need to step back and deconstruct it and recognize that we are just an IO device. Our bodies are our user interfaces, and the fact that when the wind blows, it blows on my skin, which makes me feel something, which makes things happen in my brain, those are all things that science can get to the point of understanding directly, from the wind hitting all the way through the totality of things you think and feel in a certain way. The next step is to replicate those things, whether it be wind on the skin, or the things you’re hearing in your ears or seeing with your eyes.
At the end of the day, that can be chunked down into IO stuff, into data in, data interpreted, and data making systems fire within our own system, and science is well down the path of figuring those things out. Once it’s done, the sky is the limit. It’s been applied technology that has really driven the digital revolution. The next revolution to come is one that is going to be driven by the science.
I fully agree with that. It’s worth mentioning that because this is a patent, this is definitely Google putting a stake in the ground. They’ve clearly thought about it, and they’ve sort of laid some of the groundwork. This is by no means ready to ship, right, I’m sure there’s going to be a fair amount of time for them to develop this, so who knows if it will even come to market. What’s fascinating about watching these patents get awarded is that it gives you a sense of where the company is and where they think the next frontier is. As a predictive measure of where they’re headed, I think it’s a pretty good one.
Yeah, I mean, speaking of predictive, I’ve been saying for years, including on this show, that Apple is not the interesting technology company to watch. I was years ahead on this, but the really interesting companies to watch are Google, particularly, and Amazon as well, and more and more, I’m almost embarrassed to admit it, but I think Facebook might be rising up into that same consideration set. I always feel very vindicated as Google continues to do more and more innovative things, moving the bar out, while Apple releases watches and different sizes of their phone, because maybe Apple is … That’s a whole different show.
We’ll dig into that in a future show for sure. Listeners, remember that while you’re listening to the show, you can follow along with the things that we’re mentioning here in real time. Just head over to thedigitalife.com, that’s just one “L” in The Digital Life, and go to the page for this episode. We’ve included links to pretty much everything mentioned by everybody, so it’s a rich information resource to take advantage of while you’re listening, or afterward if you’re trying to remember something that you liked. If you want to follow us outside of this show, you can follow me on Twitter @JonFollett. That’s J-O-N-F-O-L-L-E-T-T, and of course, the whole show is brought to you by Involution Studios, which you can check out at goinvo.com. That’s G-O-I-N-V-O.com. Dirk?
You can follow me on Twitter @dknemeyer or email me, firstname.lastname@example.org.
That’s it for episode 155 of the Digital Life. For Dirk Knemeyer, I’m Jon Follett, and we’ll see you next time.