Since the news broke that Apple had bought PrimeSense, the developer of the motion-tracking tech inside the original Kinect sensor for the Xbox 360, there has been a whole host of rumours - AKA speculation, AKA hogwash - about what Apple is going to do with it. Some people have got very excited about the idea that Apple will bring gestural controls to the Mac, or to the still-speculation-generating 'Apple TV set', or even to the iPad or iPhone. I'm not convinced, but there are some more interesting things Apple could do with PrimeSense.
If the idea of controlling your iPhone by wiggling your fingers at it seems ludicrous, you're correct. It offers no advantage over the touchscreen - in fact, it'd probably be less usable, being more prone to errors, as anyone who's used a Kinect can attest - and it makes you look like an idiot while you're doing it.
Add in the fact that PrimeSense's sensors aren't much smaller than a whole iPhone, and you have to be at least 30cm from even the 'close contact' model, and it's clear this is a non-starter. And even if PrimeSense did have some secret R&D project that made it stamp-sized and able to register your fingers 5-10cm away, it still wouldn't get round the fact that this is a pointless way to interact with a phone or tablet (except for some niche applications like virtual clay modelling or something).
PrimeSense's tech could easily be integrated into the iMac or MacBook Air/Pro to provide something similar to the Leap Motion. We've seen some really interesting apps that use the Leap Motion: from the virtual painting tool Corel Painter FreeStyle to 3D models of the human body - or even the whole planet, through Google Earth - that you can move around in. But these appeal to a small number of users who want such things; they're not of interest to the wider populace at a level that would sway anyone into buying a Mac to get them.
A knowing gesture
Much of the speculation seems based on the idea that we'll be controlling our Macs - rather than specific targeted applications - using gestures, Minority Report-style. The problem is that few people really want this. Gestural control is most useful where it's skeuomorphic, replicating a real-world experience and so opening it up to those who haven't spent the last 15 years mastering a joypad - whether dancing drunkenly to pop hits, shooting invading aliens or getting a real workout. Waving your arms to move files around or skip tracks in iTunes doesn't have the same appeal.
So why would Apple pay a reported £210 million for the company? It can't just be on the off-chance the Kinect One and Leap Motion take off massively and it turns out we do want Minority Report experiences from our computers (can it?).
More likely, Apple wants to use the tech in more subtle ways. If it can shrink the sensor to fit inside the iPhone or iPad, the company could use it alongside the camera in some interesting way. Rather than pointing the sensor at the user alongside the front-facing camera, Apple could build it into the back alongside the higher-resolution main camera.
Tracking motion could be used for anything from taking better photos of moving objects, to shooting smoother video footage, to scanning a scene in 3D (so you could, for example, photograph only the objects within 10 feet of you, against a blank background). These might not sound as futuristically exciting as gestural control - but they'd be a damn sight more useful on a regular basis.
Or maybe the company's thought of a use we haven’t. That would be very Apple.