MIT Media Lab - Scan & Visualize Environment Beyond Line-Of-Sight

This proof-of-concept uses scattered laser light to see around corners. It could lead to commodity devices for real-time environment mapping and object tracking unconstrained by line of sight.
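The research works by timing individual photons: a pulsed laser fires at a visible wall, light scatters to the hidden object and back, and an ultrafast camera records arrival times, from which geometry is reconstructed. As a rough illustration of the underlying time-of-flight arithmetic (the numbers below are illustrative, not taken from the paper):

```python
# Sketch of the time-of-flight ranging principle behind
# around-the-corner imaging. Values are illustrative only.

C = 299_792_458.0  # speed of light in vacuum, m/s

def path_length(seconds: float) -> float:
    """Total distance light travels in the given time."""
    return C * seconds

def direct_range(seconds: float) -> float:
    """Range to a directly visible target: out-and-back, so halve."""
    return path_length(seconds) / 2.0

def bounce_path(*segments: float) -> float:
    """Total path for a multi-bounce route, e.g. laser -> wall ->
    hidden object -> wall -> camera. Reconstruction inverts this
    by intersecting many such measured path lengths."""
    return sum(segments)

# Picosecond timing resolves sub-millimetre path differences:
dt = 2e-12  # a 2-picosecond time bin
print(f"{path_length(dt) * 1000:.3f} mm of path per 2 ps bin")
```

The key point: with picosecond-scale timing, each recorded photon constrains the hidden scene to an ellipsoidal shell, and accumulating many such constraints recovers 3D shape without ever seeing the object directly.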

With sufficient fidelity, this could provide room-wide (or wider) gesture tracking, enabling ambient natural user interfaces for everyone within the coverage area. (Related: see the linked post and its comment thread for more about NUI.)

And from my exchange with +Fred Steube (see the linked thread):

I foresee "ambient cameras" having a huge role in next generation AR and VR. I haven't written much about these concepts yet, but ubiquitous 360-degree camera/sensor coverage will be required to enable the full potential of natural user interfaces, and more significantly, to create public shared spaces that are hybrids between the real and the virtual, with seamless crossover from one to the other.

If this is too abstract, imagine real places as Second Life locations, with realistic avatars indistinguishable from "real" (physically present) people. The technology for the latter is a few years away, but initial steps, with non-realistic avatars and glitchy attempts at real-time rendering, will appear in the earliest experiments combining AR glasses with locations outfitted with the additional sensors, hardware, bandwidth, and software required to enable the broader concept.

There are some phenomenal opportunities for PR stunts. As one trivial example, consider a network of DIY physical locations participating in a NYE party -- the biggest party in history, with millions of attendees -- that takes place entirely in a location that is physically vacant, but rigged for telepresence. Perhaps...the moon.

Original research: "Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging"

#3D #imaging #gesture #nui #augmentedreality