With the advent of new technology interfaces, the way we interact with our computing experiences is changing, and changing in a multitude of ways.
I have previously written a few pieces on the concept of ubiquitous computing and how we will start to live in a world where we don't just interact with devices, software, and platforms through touch and input, but one where we interact using voice and motion commands. Yes, start thinking 'Minority Report' and you will be on the right track.
So how do we, as creative people, designers, UI and UX thinkers, strategy leaders, and so on, consider the future? We need to start learning a non-linear, non-predictable workflow. We need to be able to think on the fly, as humans do: consider all the options and variants, then adjust as we go.
Yes, we will need a selection of different skill sets and tools, but where are these going to come from, and who is going to use them? I am fascinated, because the implications reach far beyond the UI of the future. What is the job role? Where do I learn the skills required to design for this world? Is it even done by humans?
Will there actually come a day when our devices understand us better than we understand ourselves? Will they be able to predict our future before we have even considered it? All of these questions will need answering one day, but first we need to learn to use the tools to design the thinking behind these smarts.
If you've ever used an Amazon Echo, changed a channel by waving at a Microsoft Kinect, or set up a Nest thermostat, you've already used a device that could be considered part of Goodman's Zero UI thinking. It's all about getting away from the touchscreen and interfacing with the devices around us in more natural ways: haptics, computer vision, voice control, and artificial intelligence. Zero UI is the design component of all these technologies as they pertain to what we call the internet of things. Over time, these methods have become less complex: the punch card gave way to machine code, machine code to the command line, the command line to the GUI. The next step is for machines to finally understand us on our own terms, in our own natural words, behaviors, and gestures. That's what Zero UI is all about.