The technology world is obsessed with screens, from the ones you wear on your wrist to the ones you hang on the wall. But the bigger picture is tactile computing, where you can touch and lift objects thousands of miles away.
In a sense, tactile technology is already coming into our lives via the Apple Watch, which lets you send a tap to another wearer just by touching the screen.
But imagine lifting an object, or shaking hands with a person on the other side of the planet.
Well, the boffins at MIT are hard at work bringing tactile computing to real life. Two years ago, MIT Media Lab's Tangible Media Group created the inFORM, a scrying pool for imagining the interfaces of tomorrow.
Almost like a table of living clay, the inFORM has a surface that changes shape in three dimensions, letting users not only interact with digital content but also hold hands with a person thousands of miles away.
Created by Daniel Leithinger and Sean Follmer and overseen by Professor Hiroshi Ishii, the technology behind the inFORM is essentially a motorized pinscreen. Each pin is connected to a motor controlled by a nearby laptop, which can raise and lower the pins to render digital content physically. The system can also register real-world objects and hands using the depth sensors of a hacked Microsoft Kinect controller.
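To make the idea concrete, here is a rough sketch (not the team's actual code) of the core mapping such a system performs: each cell of a Kinect-style depth image is converted into a pin height, so that a hand held close to the sensor raises the corresponding pins on the remote table. All names, grid sizes, and depth ranges below are illustrative assumptions.

```python
# Hypothetical sketch: map a depth image (mm per cell) to pin heights.
# Grid size and depth band are assumptions, not inFORM's real parameters
# (the real inFORM reportedly drives a 30 x 30 grid of motorized pins).

MAX_HEIGHT_MM = 100  # assumed maximum pin travel

def depth_to_heights(depth, near=500, far=1500):
    """Convert a depth map into pin heights.

    Closer objects (smaller depth) raise pins higher; depths outside
    the [near, far] band are clamped, so distant background stays flat.
    """
    heights = []
    for row in depth:
        out_row = []
        for d in row:
            d = min(max(d, near), far)
            # Invert: near -> tall pin, far -> flush with the table.
            out_row.append((far - d) * MAX_HEIGHT_MM / (far - near))
        heights.append(out_row)
    return heights

# A hand hovering at ~600 mm over a background at ~1400 mm:
depth = [[1400, 1400, 600],
         [1400,  600, 600]]
print(depth_to_heights(depth))
```

In the real system this loop would run continuously, streaming the height grid to the motor controllers so the surface tracks the remote user's movements in real time.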