It was Global Accessibility Awareness Day on Thursday 20 May and Apple celebrated by adding a number of new accessibility features to its devices. Among these are some updates for the Apple Watch, iPhone and iPad that feel like they are straight out of a sci-fi movie.
Probably the most spectacular and futuristic of these new features is AssistiveTouch for Apple Watch, which allows the Apple Watch to be operated with just one hand: the watch detects the smallest movements of the hand's muscles and tendons via the accelerometer, gyroscope and pulse sensor, so it can be controlled with small gestures made by the watch-wearing hand.
Various combinations of finger movements, such as pinching or clenching, allow the user to control the watch. The Apple Watch also recognises tilts of the wrist, letting the user navigate a cursor around the Watch's screen this way.
Apple is promising AssistiveTouch on the Apple Watch "later this year," so we expect the option to arrive with watchOS 8 in the autumn.
There are also some innovations on the iPhone: VoiceOver will be able to recognise details in images and describe them. The iPhone will also recognise text in photos and read it out. The artificial intelligence behind this can even recognise and describe a person's position in relation to their surroundings.
As for the iPad, there will be eye-tracking support, which will follow where a person is looking on the screen and move the cursor according to their gaze. Staring at a spot on the screen can then trigger an action such as a tap.
The engineers in Cupertino have also come up with something to combat noise pollution: the iPhone will play background noise to mask other ambient noises and thus help the user to concentrate, calm down or relax. If desired, the background noise can be integrated into the films or songs being played.
Those features are coming later this year, but some accessibility features are available now in the UK, USA and France. These include appointments with Apple Support in sign language: SignTime, probably based on FaceTime, allows the customer to communicate directly in sign language during their AppleCare appointment. When visiting an Apple Store, a user can connect to a sign language interpreter in the browser and communicate with employees that way. Other languages and regions are set to be added in the future.
Another accessibility function arriving later this year is the ability to replace button presses with sounds. So, for example, the user can make a noise like clicking the tongue instead of pressing a button.
Apple has clearly invested considerable development time and creative thought in its accessibility features, some of which, such as one-handed operation of the Apple Watch, will certainly find uses outside the disabled community as well.
This article originally appeared on Macwelt. Translation by Karen Haslam.