As technology advances, it changes in ways that are sometimes unpredictable. In other ways, though, the progression is predictable, and certain patterns have clearly emerged. One example is the way that technology, and specifically the web, is being ever more seamlessly integrated into our everyday lives. Where once we used the internet by sitting down at a big machine on a desk, we now use it by pulling a small device out of our pocket. Even the names of these devices – the 'Android' and the 'iPhone' – suggest how the gap between man and machine is narrowing and how much more integrated an experience using the web has become. QR codes could even be seen as real-world hyperlinks, connecting physical objects with websites and digital information.
And true to form, it seems the future holds more such changes that will make the web even more integrated into our everyday lives. One obvious example is 'Google Glass', effectively a pair of glasses that lets you see the web at all times while you go about your usual business. Then there's the rumoured 'iWatch', which would give you permanent access to your e-mails right on your wrist.
What is SixthSense?
SixthSense is another interesting project that evolves this concept. It's an open-source project from the MIT Media Lab that essentially aims to integrate digital information with the real world in order to give us an effective 'sixth sense' (an 'information sense', if you will). It has been demonstrated at TED, and the code and instructions for building the basic hardware are available from the project's site.
The idea is that you connect a computing device, such as a small Windows-based tablet (the Winpad, for instance), to a projector and a camera that hang around your neck. The camera uses Kinect-like video analysis to interpret your gestures and movements, and the projector (with a small mirror attachment to redirect it) projects information based on those gestures. Worn around your neck as a pendant, the camera and projector interact with you throughout the day, picking up your movements and projecting onto nearby surfaces.
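In practice, the published SixthSense prototype tracked coloured marker caps worn on the fingertips rather than doing full depth analysis: each video frame is scanned for pixels close to a marker's colour, and the blob's centroid gives the fingertip position. The following is a minimal, illustrative sketch of that first step in pure Python; the frame format (a grid of RGB tuples) is a hypothetical stand-in for a real camera frame.

```python
def matches(pixel, target, tol):
    """True if every RGB channel is within `tol` of the target colour."""
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def marker_centroid(frame, target, tol=30):
    """Return the (x, y) centroid of pixels close to `target`, or None.

    `frame` is a list of rows, each row a list of (r, g, b) tuples --
    a toy stand-in for a real camera frame.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if matches(pixel, target, tol):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny 3x3 "frame" with one red marker pixel at (x=2, y=1).
RED = (255, 0, 0)
BLACK = (0, 0, 0)
frame = [
    [BLACK, BLACK, BLACK],
    [BLACK, BLACK, RED],
    [BLACK, BLACK, BLACK],
]
print(marker_centroid(frame, RED))  # -> (2.0, 1.0)
```

A real implementation would do this on live video with a library such as OpenCV, and turn the sequence of centroids over time into gestures.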
If you want to check the time, for instance, you simply lift your wrist as though looking at a watch, and a watch face is projected onto it. Likewise, to make a phone call you might raise your hand, and a number pad appears on your palm; you then dial on your palm as though it were a touchpad, and the call begins. Other demos showed a map projected onto a wall, and a keyboard brought up on a table that you could use to type.
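Conceptually, the mapping described above is a dispatch from recognised gestures to projection actions. This sketch uses hypothetical gesture names and action strings purely for illustration; they are not part of the actual SixthSense codebase.

```python
# Hypothetical gesture names mapped to the projections the demos showed.
PROJECTIONS = {
    "raise_wrist": "project clock onto wrist",
    "raise_palm": "project dial pad onto palm",
    "frame_wall": "project map onto wall",
    "tap_table": "project keyboard onto table",
}

def handle_gesture(gesture):
    """Look up the projection for a recognised gesture.

    Unrecognised movements return None and are ignored -- a wearable
    system sees far more motion than deliberate commands.
    """
    return PROJECTIONS.get(gesture)

print(handle_gesture("raise_wrist"))  # -> project clock onto wrist
```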
With projectors already being integrated into devices like the Galaxy Beam, and given the open-source nature of this project, chances are it's only a matter of time before this kind of interaction catches on in a mainstream way.
So the question you need to ask yourself as a web designer is: are you ready? And what value could you provide in a world of holographic web widgets?
Mark Wellington is a software executive at a web CRM software company.