For a glimpse of the everyday devices we could be using a decade from now, there are worse places to look than inside the Future Interfaces Group (FIG) lab at Carnegie Mellon University.

During a recent visit to Pittsburgh by Engadget, PhD student Gierad Laput put on a smartwatch and touched a MacBook Pro, then an electric drill, then a doorknob. The moment his skin pressed against each, the name of the object popped up on an adjacent computer screen. Each item emitted a unique electromagnetic signal that flowed through Laput's body and was picked up by the sensor on his watch.

The software essentially knew what Laput was doing in dumb meatspace, without a pricey sensor needing to be embedded in (and its batteries recharged on) every object he touched.
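In broad strokes, a system like this treats the body-coupled voltage signal as a fingerprint and matches it against signatures recorded from known objects. The following is a minimal sketch of that idea, assuming spectral features and a nearest-neighbor classifier with made-up training data; it is an illustration of the general approach, not the lab's actual pipeline.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def em_features(window: np.ndarray) -> np.ndarray:
    """Reduce a raw voltage window to a normalized magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

# Hypothetical training data: labeled signal windows captured while
# touching known objects (here just random noise standing in for real captures).
rng = np.random.default_rng(0)
labels = ["MacBook Pro", "electric drill", "doorknob"]
train_windows = [rng.normal(size=4096) for _ in range(30)]
train_labels = [labels[i % 3] for i in range(30)]

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit([em_features(w) for w in train_windows], train_labels)

# At runtime, each new window from the wrist sensor is classified and the
# matching object name is reported.
new_window = rng.normal(size=4096)
print(clf.predict([em_features(new_window)])[0])
```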

