Back to the start
When I was two, in 1973, Xerox PARC invented the graphical user interface (GUI). You know, the thing Steve Jobs saw on a tour of PARC, stole and resold in the Lisa and the Macintosh.
It was designed to replace command-line and punch-card interaction with a computer, at a time when computers were largely used for mathematical computation and writing. If we could hop back to 1973 and show the Xerox dudes what we’re using computers for now, their minds would be blown.
From GUI to IUI
We’re now using computers in ways they never could have imagined. But, and it’s a big but, we’re still using the same GUI. A lot of folk reckon it’s time for an update.
This year, we’re due to see the appearance of the intelligent user interface (IUI) with perceptual computing – a platform that will bring the tech out of the computer and into our world. It uses a mix of AI, machine learning, sensors and robotics to let these technologies perceive and navigate the real world and act intelligently on our behalf. Minority Report on steroids.
Why now? It’s down to the shift from hand-coded, algorithm-based systems to machine learning using layered neural networks that learn from examples. Because of this, machine learning is already considerably better than humans at tasks with a specific frame of reference – when it’s taught the rules of chess or Go, for example.
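To make “learning from examples” concrete, here’s a minimal sketch of a layered neural network picking up a rule nobody hand-coded. Everything in it – the XOR task, the two-layer network, the learning rate – is an illustrative assumption, not something from this article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Four examples: two input bits, and the XOR of those bits as the target.
# No rule for XOR is written anywhere below; the network infers it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of four units, one output unit: a "layered" network.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

losses = []
for _ in range(5000):
    # Forward pass: map the examples to predictions.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))

    # Backward pass: nudge every weight to reduce the error.
    dp = (p - y) * p * (1 - p)
    dh = (dp @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ dp
    b2 -= 0.5 * dp.sum(axis=0)
    W1 -= 0.5 * X.T @ dh
    b1 -= 0.5 * dh.sum(axis=0)

print(f"loss went from {losses[0]:.3f} to {losses[-1]:.3f}")
```

Scale this idea up by a few million parameters and you get the systems beating humans at those well-framed tasks.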
The new augmented reality (AR) stuff in high-end phones from Apple, Samsung and Google isn’t just about letting my three-year-old daughter cackle with glee when she sees a Very Hungry Caterpillar on the carpet in her room. It’s opening the door for AR to do much, much more. AR is the user I/O layer of perceptual computing, and it lets us point it directly at actual real-world issues that need a solution. Consider Blippar. We can point our phone camera at an actual thing in our actual world and get information about it. That’s the intelligent UI (IUI) at work.
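The Blippar-style loop – camera frame in, recognised object out, information overlaid – can be caricatured in a few lines. The classifier below is a hypothetical stub standing in for a real vision model, and the “facts” are made up for illustration:

```python
# Toy knowledge base: object label -> something useful to overlay.
FACTS = {
    "coffee cup": "Holds about 350 ml. Refill recommended.",
    "houseplant": "Monstera deliciosa. Water weekly.",
}

def classify(frame: str) -> str:
    """Stand-in for a neural-network image classifier.

    A real pipeline would take pixels and return a label;
    here we pretend the frame already is its label.
    """
    return frame

def annotate(frame: str) -> str:
    """The whole IUI loop: recognise the thing, look it up, overlay."""
    label = classify(frame)
    info = FACTS.get(label, "No information available.")
    return f"{label}: {info}"

print(annotate("coffee cup"))
```

The hard part is obviously `classify` – that’s where the layered neural networks live – but the shape of the pipeline really is this simple.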
AR changing the world
It’s not going to be long before technology can understand the world around us faster than we can. It will anticipate our intent from what it’s learned and take the appropriate actions. The more AR gets used, the better IUI will get.
In 2018 there are going to be around one billion AR devices online, all coupled with handy personal AI assistants that are more than happy to adapt to the habits of the person using them.
This is going to be an age of very fast learning. Yes, there are going to be mistakes along the way, but it’s not going to be long before IUI replaces the WIMP (windows, icons, menus, pointer) systems we’ve all been using since I was two. And I’m 46 now.