
Bull Session

Smartware: Design and Function

September 28, 2017          

Episode Summary

On the podcast this week, we conclude our multi-episode discussion about the evolution of software and the future of computing, looking at how a handful of advances will come together to transform software and hardware into something new, which we’re calling “Smartware”. Smartware are computing systems that require little active user input, integrate the digital and physical worlds, and are continually learning on their own.

This week we’ll look at five ways in which Smartware will manifest in the design and functionality of future computing: Machines will do more of the “mechanical” work, interfaces will become invisible, environments will become customized to the individual user, physical presence will be optional, and apps, while fewer in number, will create a greater, networked ecosystem.

Jon:
Welcome to episode 226 of The Digital Life, a show about our insights into the future of design and technology. I’m your host Jon Follett, and with me is founder and co-host, Dirk Knemeyer.

Dirk:
Greetings, listeners.

Jon:
For our podcast topic this week, we’ll continue with part three of our multi-episode discussion about the evolution of software and the future of computing, looking at how a handful of advances, like AI, the internet of things, neuroscience, and additive fabrication will come together to transform software and hardware into something new, which we’re calling “smartware.” Smartware are computing systems that require little active user input, integrate the digital and physical worlds, and are continually learning on their own. In addition to our podcast discussion, we’re happy to announce a six-part monthly series on smartware, in partnership with respected user experience publication UX Matters.

This week, we’re going to look at five ways in which we think smartware will manifest in the design and functionality of future computing. Let’s get started with that. The first way we think it’s going to manifest is that machines will be doing more of the, quote, “mechanical” work for us. If we think about the evolution of software, this has been the direction it has been going for some time. Dirk, you know in the design industry, we’ve had increasingly automated practices. Back when I started in graphic design, I had an internship where I did very old-school layout using hot wax. I had to run columns of type through this hot wax machine, and I would put those down onto a larger layout in a suitable design, and those would go and get photographed, right?

Dirk:
Wow. Hot wax and photographs. That sounds more like a swingers’ party than design.

Jon:
If only. Unfortunately, that was a summer internship that paid like- I don’t know- seven bucks an hour. But the point is that QuarkXPress, if you remember QuarkXPress-

Dirk:
They’re still in business. They’re still chugging along.

Jon:
-was the miracle software, because it automated the hot wax and photography layout, right? No longer did you have to take this stack of layouts to the photography room and get the negatives made so it could go off to press. In this way, the actual mechanical work was being automated. On this same trajectory, we’re seeing artificial intelligence and machine learning enable various kinds of software to take on more of that type of mechanical work, and in so doing, eliminate the need for it and allow human beings to focus on more strategic kinds of work. Ultimately, I think that jobs, whether you’re talking about design, or architecture, or science, or whatever, are going to change as this mechanical work gets automated.

Dirk, what’s your take on that? How do you see this coming to be with smartware?

Dirk:
Yeah. It’s only going to increase, and design in particular is one of those areas where machine learning is going to have a particularly big impact. I mean, you talked about the transition from the real old-school days of hot wax and photographs to QuarkXPress, which was just huge, of course. But since then, and now shifting over to the Adobe suite, which has become the de facto market leader at this point, there’s been ongoing incremental progress toward continually removing the scutwork, toward removing the things that are tedious, that are procedural, that take time and are not design in the big-D sense of design as problem solving. They’re more about just getting things to work in the environment that we’re in.

Adobe software is incredibly powerful. Not only does it free you from a lot of physical tasks, but even within the software, a simple feature that’s been in for a long time, to use a quick example, is setting up grids and letting the things you place snap to the grid. That means you don’t have to get the layout pixel-perfect, at least at a prototype level, and you still end up with a layout that follows the rules of good layout.

One tedious thing that people still spend a lot of time doing is cutting out outlines. If you want to use Photoshop for a lot of its more powerful features, you have to be able to specify, within the file, “This part of the file is object A, this other part is object B, and this part is object C.” Then you can start to make really interesting things happen. But until recently, and maybe they’ve already solved this, I know they’re on their way to solving it, you had to hand-cut those outlines. If there’s a person you want to stylize, or keep from being stylized as you stylize the rest of the image, you have to zoom in really precisely and cut it out point by point by point and say, “This is another thing.” Maybe they’ve already achieved it, because I don’t use the software the way I used to, although I’ve been reading about how they’re evolving it. They’ve taught it to identify different things within an image and, from that identification, cut them out on its own accord, as opposed to having someone go around and snip and cut them out by hand.

Snapping to grid and cutting out outlines are just two easy examples, but the point is that these mechanical tasks, the hot wax and photographs, the lining things up precisely, the cutting out of part of an image in order to process that part or the rest of the image in some special and powerful way, are all being wiped out. What that leaves, for the time being at least, is the design, the problem solving, and the aesthetic aspects: being able to make decisions around “What am I trying to communicate here? What are the best ways to communicate it and solve those problems on a conceptual level?” And then have the implementation of the solution be far closer to pushbutton.

It certainly isn’t pushbutton yet. The designer still needs to be someone who’s schooled in these tools, which remain professional-grade tools, not consumer-grade tools. However, those professionals are able to put more and more of their time into thinking, as opposed to fabrication. Over time, I think we’ll first get to the point where the tools are consumer grade, so being a designer no longer depends on arcane knowledge of hard-to-use software; that prerequisite will, at some level, be erased. Eventually, of course, automation will get into the problem solving itself, but I think that’s a whole different conversation.

Jon:
Sure. The second element that we have identified as being part of this transformation to smartware is that interfaces are going to, in some ways, vanish, become invisible. That’s something we’re beginning to see already, whether you’re talking about the voice user interface of something like Amazon Echo, or Google Home, or Siri, or whether you’re talking about tracking eye movements or gestures. Already, Facebook and Elon Musk’s company are looking at mind-to-machine interfaces, right? There are a host of ways in which the interface can disappear and more deeply track what our motivations or desires are via another type of input.

Also, as our environments become more sensor-enabled, there will be lots of other ways to pick this up, whether it’s embedded sensors, or cameras, or what have you. Dirk, when you think of the invisible interface, what do you see?

Dirk:
Yeah. I mean, you touched on some of the key points. You mentioned voice user interfaces, and voice and other methods of interface different from the traditional click, swipe, tap that we’re accustomed to are going to matter and be important. Right now, voice interface is largely rubbish. I know there are people who use it, and people who find utility in it, in relatively narrow use cases, I would say, but it’s just totally not there. Within the next decade, though, the voice user interface will be robust, and accurate, and interesting, working not just in a few clunky ways but in a more holistic way. Voice, along with other interface methods, will replace what we’re accustomed to, and the result will be much more interfaceless.

But then the other part is what I call the identity graph: the machines are going to have a much more robust profile of us, and that will come from a couple of different directions. One is machine learning, where the machines are passively collecting data and information and putting together an identity graph, based both on the things they’re collecting from us and on the things they put those in juxtaposition with, in order to stereotype us, in order to more quickly identify how they should interact with us and what they should provide.

The other part of that, of course, gets into the hard sciences. We’ve talked about the future and the still-unclear impacts of these incredible advances in neuroscience and genomics in particular, along with other fields of science. That’s going to contribute to the identity graph too, because unlike the more traditional, clumsy ways of acting on a person, we’re increasingly understanding how the chemicals in our bodies work, how our brains work, how to motivate us, how to get us, as unique individuals, into environments and ways of performing where we’re at our best, right? From a business perspective, from a worker perspective, and also from a self perspective, a humanistic perspective: what does life look like when we need to recharge, when we need to fulfill ourselves?

The machines are going to have an understanding of those things, from a scientific perspective, coded into them, and that will be woven into our unique identity graphs as individuals, but also into the larger tableau of software, which will increasingly become invisible because these things will just happen. There doesn’t need to be me interacting with a machine and saying, “Give me this. Give me that. Do this. Do that.” It’s going to be much more just in time, with the machine understanding and proactively shaping the living environments I’m participating in, environments that already encompass computing with mobile, and will even more so in the future.

Jon:
Right. Yeah. That’s really our third element: the environments that we work in, that we play in, that we eat and sleep in, will become customized to our individual preferences, right? We can already see around the edges how this might work in a working environment, or in a communication environment, with the pieces of software that we need appearing just in time, or anticipating our needs based on previous history, as you outlined in your discussion of the identity graph.

I think that melding of behavioral data, which is driven by the sensors in our environment, with biological data, which you also mentioned, whether that’s genomic data or data derived from other biomarkers, would make it possible to have this unique footprint, this unique identity, that can be used to hopefully make our lives a little bit smoother, and potentially almost seem magical. Whether that happens in the execution, I think, remains to be seen.

Dirk:
Yeah, and you mentioned that we treat those as two separate things because they will develop separately, which is to say we’ll have invisible interfaces that are less related to the identity graph at the same time as the identity graph becomes a thing and starts to really matter. I just get too excited, and what I’m really excited about is even farther down the road, when all of that stuff comes together into a really wonderful environment.

Jon:
The next element that we want to talk about in terms of smartware is eliminating, or greatly reducing, the need for physical presence, to the point where it might even be optional for a variety of activities. Dirk, I know you’ve thought a bit about how this might work for virtual reality and also for 3D printing. Do you want to tell us about that?

Dirk:
Yeah. We’ve already seen with Amazon, and just with the rise of the internet in its simplest form, a transition from physical shopping to virtual shopping, but with virtual reality and additive fabrication, that can become a much more total thing. Right now, there’s some class of products that most of us want to go to a store to see, or touch, or open, or give a test ride to. The last example would of course be something like a car or a motorcycle, but also large appliances, maybe a mattress. It’s different for each of us, but for all of us, there’s some group of things that we want to try before we buy, in some way, shape, or form.

With virtual reality- now, again, the technology isn’t there yet- we’re starting down a path toward a virtual and augmented reality tech stack that lets us explore and experience those things to the degree that we wouldn’t need to go into the store anymore, where we’d be comfortable enough saying, “I don’t need to sit in that car, because of what I was able to experience and explore in this virtual environment.” It might not be something literally like sitting in the car. You can imagine, as a more intermediate step before something even more fulsome, being able to see 12 front-seat layouts all next to each other in one field of view, with a real sense of how much space is there, and the inclines, and declines, and all the rest. Regardless, it’s giving you enough that you’re saying, “You know what? I’m going to push the ‘buy’ button. I’m confident enough to put down my $30,000 for a car, or $2,000 for a washer and dryer, or whatever those numbers look like, have it come in, and accept it. There’s not going to be a buyer’s remorse thing going on here.”

The trick is … Or “trick” isn’t even the right word. The interesting thing about that at scale is that we no longer need the retail organizations we have today. Take a car dealer: car dealers generally take up a lot of space because they need a lot of inventory. Why do they need a lot of inventory? Because people need to come in and try a lot of things, look at a lot of things. Without that, the inventory is completely unnecessary. You have a person who, at home, is specifying and buying their car, and what matters to them is that it’s exactly the way they want it. They don’t want just the one that happens to be on the lot. “Oh, we’ve got all the options, but it’s in brown.” You know? You don’t want the brown one because it happens to be there. You want the thing that you want.

We’re going to move to where the notion of the car dealership as we’re accustomed to it completely disappears. I mean, there will still be some footprint. Used cars are certainly a somewhat different use case, and there are contexts, probably more for affluent buyers, frankly, where having some kind of a physical footprint makes sense. But the traditional notion of going in, the hard negotiation with the salesman who goes and talks to their boss and tries to sell you the undercoating, that’s all going to completely go away, replaced by buying something that is customized just to you and delivered very quickly.

That’s where additive fabrication comes in. The path to additive fabrication in cars is a lot farther out, but additive fabrication in appliances, going back to washers and dryers, is not years away, but it’s also not decades. It’s more in the intermediate term. You’ll have a quote-unquote “appliance source,” not really even an appliance store, just a place of fabrication, so that as I’m choosing a washer and dryer online, again to the exact specs that I want, it’s being produced locally and shipped to me the same day. Instead of a giant appliance warehouse with all of these models, where you need a bajillion models to show people one of each, and then, at least for the popular models, a lot of quantity so that when you do your big weekend sale you can sell one after another, after another, that all goes away. In a world of additive fabrication, there’s just a place nearby that can make it, that can print it, and then get it to you quickly.

That was maybe a long-winded way of trying to give the full round trip of how I think this will work.

Jon:
No, that was good. Yeah. I think that’s a pretty exciting vision, and it’s probably what gets me excited about the potential for smartware as well. The last element we’ll talk about today is the general idea that our software is widely splintered into many different apps doing different things, spread across our mobile devices, our phones, our tablets, our laptops, even our televisions, and seeping into our cars, right? This diverse ecosystem is going to have to come together in a more networked and seamless way in order for there to be experiences like these customized environments, or a customized identity graph that moves along with you no matter where you might go.

The current paradigm of having a wide array of apps for just about any purpose is going to become a thing of the past, if only because we’re not going to have any reason to be managing, updating, and searching through app stores to find the specific element that we want. Rather, for these types of services to really take off, they’re going to need to operate slightly under the surface. We need not be aware of every single service that we’re using, although we probably want to be able to audit that, to take a look at it if we wanted to; but in regular day-to-day use, I doubt we’re going to be interested in knowing every single system element that we’re activating.

Dirk, what’s your take on this need for the seamless ecosystem to happen?

Dirk:
Yeah. I mean, to make the most of the identity graph, and to make the computing experience one that is just integrated into our lives, one that doesn’t require holding a device, or wearing a device on our wrist, or whatever the physical manifestation looks like, our interaction with the computing environment has to be more integrated. It requires integration among apps, integration among software. Right now, there’s almost no integration at all. With the big companies, of course, you could go to Apple and have integration between a lot of things, but just off the top of my head, and I actually don’t use many apps, Slack is not integrated. Skype, which we’re talking on, is not integrated. Games aren’t integrated. They’re not part of the Apple ecosystem, so it all becomes splintered into this world of separate apps, and that level of fragmentation is not going to fly in a future where the most is really being made of smartware.

That will have a lot of impacts, of course. One is that ever since the rise of mobile, there’s been an explosion in app creators. I think the number is over half a million different people, companies, whatever the atomic unit was for the entities in this thing I read. There are over half a million creators. That’s a lot of app creators, and most apps are commercial failures: they’re made, very few people get them, and very few people buy into the commerce model in a way that can be considered successful and fund the financially responsible creation of further apps. That’s part of it.

Then, on the whole other side of it, we see the advances in machine learning. If you’re a normal person on the street and somebody’s talking about artificial intelligence, you’re thinking about IBM, I think, because the high-profile stories are about Watson, right? Watson is the iconic AI brain in our culture at this point. But more than just being a marketing or propaganda thing, IBM has built lines of business around Watson. One that we’re familiar with, of course, is IBM Watson Health, and that is a real business where Watson is being used by IBM employees, in conjunction with major health care organizations, to create AI-infused diagnosis, AI-infused prognosis, AI-infused recommendations for treatment, and other things.

What’s happening here is that IBM, and IBM’s not the only company, right? But I’m going to stick with IBM as an icon of big business, which once upon a time it very much was, even more so than it is today. IBM is gaining all of this knowledge, all of this experience, all of this capability around machine learning. It is not going to be easy, or even possible, for the half a million random app creators to make things with machine learning in a way that can compete with IBM, or Google, or a lot of other organizations. It’s not clear how this will shake out from a money perspective. Currently, IBM is making Watson available for people to pay to play, to use Watson almost like buying bandwidth or buying cloud storage.

You can use Watson in a similar business model now, but is it going to be a case where these giant corporations provide the AI and machine learning, and you build on top of it? And then you fit into their ecosystem, so that you’re not this random guy or gal who’s done something outside the loop; you’re in the loop because you’ve bought in that way? Or will it go back to much older business models, where IBM totally walls the garden? I don’t think this will happen, but to give you an extreme example of where it could go: if you’re going with IBM, you’re going to have to play the games that IBM produces, right? Is it all IBM at that point? Is it the Oracle model, but for everything, as opposed to just for your ERP systems?

I don’t know. It’s a little unknown now, but it’s really interesting. All of those things coming together are just going to winnow down the number of independent app creators and the number of different apps. We’re going to have a contraction, a shrinking. The absolute numbers may continue to go up as there are more applications, but in terms of total creators, there’s going to be a contraction, and in terms of total different apps, different apps meaning that separate button you push on your phone just for one specific niche thing, all of that is going to contract. All of that is going to go down, and we’re going to be in a much more focused, much more integrated ecosystem that will, as you can tell based on the things I’m talking about, have huge downstream implications for businesses, and creators, and consumers as well.

Jon:
Well, we hope you’ve enjoyed our three-part podcast series on smartware. If you have, please head on over to UX Matters and check out the article series that we have going there as well. Listeners, remember that while you’re listening to the show, you can follow along with the things that we’re mentioning here in real time. Just head over to TheDigitaLife.com- that’s just one L in TheDigitaLife- and go to the page for this episode. We’ve included links to pretty much everything mentioned by everybody, so it’s a rich information resource to take advantage of while you’re listening, or afterward if you’re trying to remember something that you liked.

You can find The Digital Life on iTunes, SoundCloud, Stitcher, PlayerFM, and Google Play, and if you want to follow us outside of the show, you can follow me on Twitter @JonFollett. That’s J-O-N F-O-L-L-E-T-T. Of course, the whole show is brought to you by Involution Studios, which you can check out at GoInvo.com. That’s G-O-I-N-V-O dot com. Dirk?

Dirk:
You can follow me on Twitter @DKnemeyer. That’s @ D-K-N-E-M-E-Y-E-R. Thanks so much for listening.

Jon:
That’s it for episode 226 of The Digital Life. For Dirk Knemeyer, I’m Jon Follett, and we’ll see you next time.


Jon Follett
@jonfollett

Jon is Principal of Involution Studios and an internationally published author on the topics of user experience and information design. His most recent book, Designing for Emerging Technologies: UX for Genomics, Robotics and the Internet of Things, was published by O’Reilly Media.

Dirk Knemeyer
@dknemeyer

Dirk is a social futurist and a founder of Involution Studios. He envisions new systems for organizational, social, and personal change, helping leaders to make radical transformations. Dirk is a frequent speaker who has shared his ideas at TEDx, Transhumanism+, and SXSW, along with keynotes in Europe and the US. He has been published in Business Week and has served on 15 boards spanning industries like healthcare, publishing, and education.

Credits

Co-Host & Producer

Jonathan Follett @jonfollett

Co-Host & Founder

Dirk Knemeyer @dknemeyer

Minister of Agit-Prop

Juhan Sonin @jsonin

Audio Engineer

Dave Nelson

Technical Support

Eric Benoit @ebenoit

Brian Liston @lliissttoonn

Opening Theme

Aiva.ai @aivatechnology

Closing Theme

Ian Dorsch @iandorsch

Bull Session

Smartware: Transformative Technologies

September 21, 2017          

Episode Summary

On the podcast this week, we continue our multi-episode discussion about the evolution of software and the future of computing, looking at how a handful of advances will come together to transform software and hardware into something new, which we’re calling “Smartware”. Smartware are computing systems that require little active user input, integrate the digital and physical worlds, and are continually learning on their own. Join us as we discuss the major advances in science and technology that are driving Smartware — from artificial intelligence (AI), neuroscience, and genomics to the Internet of Things (IoT) and additive fabrication / 3D printing.

Resources:
Smartware: A Tribute to Dead Machines

Bull Session

Smartware: A Tribute to Dead Machines

September 14, 2017          

Episode Summary

On the podcast this week, we begin a multi-episode discussion about the evolution of software and the future of computing, looking at how a handful of advances — such as AI, the IoT, neuroscience, and additive fabrication — will come together to transform software and hardware into something new, which we’re calling “Smartware”. Smartware are computing systems that require little active user input, integrate the digital and physical worlds, and are continually learning on their own.

We’ll start our discussion with “a tribute to dead machines”. Technology and humanity are inseparable: technology is present in every facet of our civilization. We’ll take a look at the history of technology, from the era of big machines to personal computing to mobile. And we’ll discuss some early examples of Smartware, including self-driving cars like Tesla’s automobiles and the AI-driven voice user interface of Amazon’s Echo.

Resources:
Tesla
Amazon Echo

Bull Session

The Trials and Tribulations of the Early Adopter

August 10, 2017          

Episode Summary

On The Digital Life this week, we discuss the difficulties that early adopters can encounter when using new consumer technology. In many instances, the first version of a tech product is no better than a beta release. Initial consumer iterations are often test cases for unproven inventions that can barely survive QA. Today, with so many tech products being released on a regular basis, the role of the early adopter is akin to that of an innovation guinea pig. So, why be an early adopter?

Resources:
The Trials and Tribulations of the Early Adopter

Bull Session

Amazon Eats Whole Foods

June 22, 2017          

Episode Summary

On The Digital Life this week, we explore Amazon’s recent purchase of high-end grocery chain Whole Foods and how this transaction will impact the future of retail. For its $14 billion investment, Amazon gets, among other things, a strong real estate portfolio in areas of the US with wealthy, desirable demographics; sophisticated food industry logistics and warehousing; a host of purchasing relationships and agreements; and some potentially rich customer data.

It’s a little ironic that an e-commerce giant such as Amazon now has a unique opportunity to redefine brick-and-mortar retail as well. But the company has been experimenting in this space for a few years. Its Amazon Go offering, for instance, is an IoT-enabled grocery store that enables customers to forgo the checkout line. People can walk in, tap their mobile phones on a turnstile, grab what they like from the shelves, and just walk out again, no waiting in line required. We can imagine that Amazon’s retail technology might soon make an impact on its newly purchased grocery stores. Join us as we discuss the evolution of retail and the consequences of Amazon’s acquisition of Whole Foods.

Resources

Why Amazon Bought Whole Foods