PRISM, schmism. Big Brother’s gotta be salivating over new hardware like Leap Motion and the Kinect for Xbox One, which enable machines to track our finger positions and facial expressions, and that’s not all.
Bruce Sterling sees the new Kinect as the ultimate surveillance machine.
Make that the new, improved, and *much more invasive* Kinect, because the thing’s supposed to be always-on and permanently scanning your living room like the infrared Eye of Sauron.
The Kinect knows whether you’re smiling or frowning. Soon Santa and the NSA could be swapping notes on who’s been naughty or nice.
“Kinect has always been able to tell that you’re moving. But now it can tell if you’re moving your thumb, and which way your thumb is facing. It can tell which muscles you’re engaging at any given time, and how much — it knows the difference between a jab and an uppercut, and registers them differently. If you’re playing with a friend, it can tell when the two of you switch places, or even when the two of you switch controllers. Kinect knows if you’re smiling or frowning, or if you’re talking or not. It knows if you’re looking at the screen or not, and will only register your commands if you’re looking. It knows, by either remarkable science or sorcery, your heart rate just by looking at your face.
“We spent a few minutes in a crowded room using a prototype of the new Kinect, and we left reeling. There’s almost no latency, things are astonishingly accurate — the muscle sensors knew even the slightest shift in my posture, and try as I might I couldn’t make it think I was smiling when I wasn’t. We heard it discern commands from a noise-filled room, and track our movements in the dark when we couldn’t see them ourselves.”
Motion detection just got a lot more sensitive. You can hide, but can you still your beating heart?
“Scientists at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT first presented “Eulerian video magnification” last year at SIGGRAPH, a computer graphics conference. Originally, the system — which lets you identify from afar if a person is breathing, how fast their heart is beating and where blood is traveling in their body — was designed to monitor the vital signs of neonatal infants without having to touch them.
“It measures the color intensity of pixels and then amplifies any changes in that intensity, registering the slight reddening of your face in conjunction with your pulse, for example…”
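The trick the quote describes boils down to a temporal band-pass filter: isolate the frequencies in each pixel’s intensity that match a plausible resting heart rate, scale them up, and add them back. Here is a minimal sketch of that idea (the function name, the `alpha` gain, and the 0.8–2.0 Hz band are illustrative assumptions, not the researchers’ actual code):

```python
import numpy as np

def magnify_pulse(frames, fps, alpha=10.0, low_hz=0.8, high_hz=2.0):
    """Amplify subtle periodic intensity changes (e.g. a pulse) in a video.

    frames: array of shape (T, H, W) -- per-pixel intensity over time.
    A temporal band-pass filter (via FFT) keeps only frequencies in the
    heart-rate range; the filtered signal is scaled by `alpha` and added
    back, exaggerating the otherwise invisible flushing of the skin.
    """
    T = frames.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)        # frequency of each FFT bin
    spectrum = np.fft.rfft(frames, axis=0)         # per-pixel temporal spectrum
    in_band = (freqs >= low_hz) & (freqs <= high_hz)
    spectrum[~in_band] = 0                         # zero everything outside the band
    bandpassed = np.fft.irfft(spectrum, n=T, axis=0)
    return frames + alpha * bandpassed             # amplified changes added back
```

Feeding it a synthetic clip whose pixels pulse faintly at 1.2 Hz (72 beats per minute) returns frames in which that pulsation is roughly `alpha` times stronger, while static regions are left untouched.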
Plus you don’t even need special gear: Wi-Fi signals can now recognize your gestures anywhere in your home.
Of course, there are still plenty of creative things you can do with AR.
“SketchSynth lets anyone create their own control panels with just a marker and a piece of paper. Once drawn, the controller sends Open Sound Control (OSC) messages to anything that can receive them; in this case, a simple synthesizer running in Pure Data. It’s a fun toy that also demonstrates the possibilities of adding digital interaction to sketched or otherwise non-digital interfaces.”
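OSC, the protocol SketchSynth speaks, has a refreshingly simple wire format: a null-padded address string, a null-padded type-tag string, then big-endian arguments. A minimal sketch of encoding one float-carrying message (the `/slider/1` address is a hypothetical example, not SketchSynth’s actual address scheme):

```python
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded out to a multiple of 4 bytes
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message carrying 32-bit float arguments."""
    type_tags = "," + "f" * len(args)              # e.g. ",f" for one float
    msg = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for value in args:
        msg += struct.pack(">f", value)            # big-endian 32-bit float
    return msg
```

A real controller would then push these bytes over UDP (e.g. `socket.sendto`) to whatever synth is listening; libraries wrap all of this, but the wire format really is just padded strings and packed values.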
Turn your Kinect into a potter’s wheel:
We present a novel interaction system, “Shape-It-Up”, for creative expression of 3D shapes through the naturalistic integration of human hand gestures with a modeling scheme dubbed intelligent generalized cylinders (IGC). To achieve this naturalistic integration, we propose a novel paradigm of shape-gesture-context interplay (SGCI) wherein the interpretation of gestures in the spatial context of a 3D shape directly deduces the designer’s intent and the subsequent modeling operations.
And for that global perspective, spin the earth with your little finger.
To celebrate Earth Day, Google Earth today announced it now supports Leap Motion as part of its desktop application (which works on PC, Mac and Linux).
Finally, a history lesson from Wired, reminding us that these technologies have been a long time coming (and that Ivan Sutherland basically invented the “virtual”).
“To “scan” a 3D object, they needed two key things: An object and a scanner. The object was Sutherland’s wife Marsha’s 1967 VW Beetle, and the “scanner” was Sutherland’s students, armed with paint and yardsticks.
“They mapped out polygons right on the Beetle itself, and measured every line in what must have been that uniquely scientific mix of tedious and exciting. The resulting dataset was entered into Sutherland’s programs, and produced this first 3D wireframe model of a car. Actually, the first 3D model of any physical-world thing, ever.”
Got an idea for using AR for good rather than evil? Get it funded by Intel Capital’s new $100 million Perceptual Computing fund.