iPhones of the future could have an interactive 3D user interface that utilises motion sensors and facial recognition, enabling you to 'reach into' your device.
A patent application assigned to Apple Inc. of Cupertino, California, suggests that the company is working on a three-dimensional user interface for handheld devices such as the iPhone and iPad.
"With various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame-of-reference," the patent's abstract reads.
"Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame-of-reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user."
Effectively, this means the device will be able to work out how to project a three-dimensional image by determining where the user's eyes are in relation to the screen, sending one image to the left eye and a different image to the right eye to give the impression of depth. The Nintendo 3DS uses a similar system.
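The geometry behind this idea can be sketched briefly. The following is a hypothetical illustration, not Apple's implementation: it assumes the screen is the plane z = 0, the viewer's eyes sit at z > 0 in front of it, and virtual objects sit at z < 0 'inside' the device. Each eye then sees an object point at a slightly different position on the screen plane, and that disparity is what creates the impression of depth.

```python
def project_to_screen(eye, point):
    """Project a 3D point onto the screen plane (z = 0) as seen from `eye`.

    All coordinates are in metres, in a frame fixed to the device.
    """
    ex, ey, ez = eye
    px, py, pz = point
    t = ez / (ez - pz)  # parameter where the eye->point ray crosses z = 0
    return (ex + t * (px - ex), ey + t * (py - ey))


def stereo_pair(eye_midpoint, point, ipd=0.063):
    """Return the left/right-eye screen positions for one object point.

    Each eye is offset horizontally by half the interpupillary
    distance `ipd` (an assumed average of 6.3 cm).
    """
    mx, my, mz = eye_midpoint
    left = project_to_screen((mx - ipd / 2, my, mz), point)
    right = project_to_screen((mx + ipd / 2, my, mz), point)
    return left, right


# An object 5 cm behind the screen, viewed head-on from 30 cm away:
left, right = stereo_pair((0.0, 0.0, 0.30), (0.0, 0.0, -0.05))
```

In this sketch the two eyes see the point shifted symmetrically to either side of centre; as the tracked eye position moves, the projected positions shift with it, which is the parallax effect the patent describes.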
However, the system Apple proposes is rather more sophisticated than Nintendo's: motion sensors such as the accelerometer and gyrometer, combined with the front-facing camera, could track the movement of your eyes and hands, expanding parts of the interface, such as icons, when you look at them.
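The gaze-driven expansion mentioned above could work along these lines. This is a speculative sketch, not anything described in the patent: it assumes an estimated on-screen gaze point and grows icons near it with a smooth falloff, so icons swell gently as your gaze approaches rather than popping abruptly.

```python
import math


def icon_scale(icon_pos, gaze_pos, max_boost=0.5, falloff=80.0):
    """Return a scale factor >= 1.0 for an icon; icons near the
    estimated gaze point grow larger.

    Positions are in screen points; `falloff` (an assumed tuning
    constant) controls how quickly the effect fades with distance.
    """
    dx = icon_pos[0] - gaze_pos[0]
    dy = icon_pos[1] - gaze_pos[1]
    dist_sq = dx * dx + dy * dy
    return 1.0 + max_boost * math.exp(-dist_sq / (falloff * falloff))


# An icon directly under the gaze grows by 50%; a distant one stays put.
scale_near = icon_scale((100, 100), (100, 100))  # 1.5
scale_far = icon_scale((1000, 1000), (0, 0))     # ~1.0
```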
The application for the patent was filed by Mark Zimmer, Geoff Stahl, David Hayward and Frank Doepke in April 2010, and the patent was granted last week.
Though the most obvious application of the technology is likely to be in handheld devices, the patent also mentions portable music devices, televisions, gaming devices, laptops and desktop computers.