Apple's mobile products are opening up new opportunities for people who are customarily denied at least some of the chances many of us take for granted. These two stories show just some of the potential these devices have for improving people's lives. One story relates to research taking place in New Zealand in which scientists are attempting to use Apple's devices to help autistic children speak; the second is a real-life account of how an iPhone running a color-recognition app gave a partially-sighted person a way to augment their sight.
Victoria University: According to a report, researchers there are looking into how iPads and other devices can help autistic kids communicate.
Autism affects a significant number of children -- roughly 0.75 percent of all children born are autistic. The condition affects their ability to communicate; many never talk at all.
Researchers are using Apple's mobile products to run electronic speech-generation apps, testing these alongside sign language and the use of images to communicate.
The aim of the project is to create objective data concerning how children react to the three different communication methods.
Professor Sigafoos -- a globally-recognized expert in augmented communication and its use to enhance the lives of people with disabilities -- says,
"Evidence has been accumulating since the 1970s that autistic children who fail to develop speech are more likely to experience things like aggression, extreme tantrums, and self harming behaviours. Frustration at being unable to communicate is regarded a prime cause."
Writing in a Victoria University press release, he continues, "identifying and using their preferred tools and techniques, we may be able to help autistic children become better all round communicators."
Seeing from behind the curtain
Austin Serphim is partially sighted -- he's almost blind, but this hasn't stopped him finding self-expression with his own blog. He recently wrote about his experience using an iPhone equipped with the Color Identifier app, which uses the iPhone's camera and tells the user the name of whatever color it is looking at.
Serphim says use of the app along with his iPhone and its camera helped deepen his appreciation of the world around him. In his own words:
"I then roamed my yard, and saw a blue flower. I then found the brown shed, and returned to the gray house. My mind felt blown. I watched the sun set, listening to the colors change as the sky darkened. The next night, I had a conversation with Mom about how the sky looked bluer tonight. Since I can see some light and color, I think hearing the color names can help nudge my perception, and enhance my visual experience."
It wasn't so long ago that visually-impaired people were completely locked out of the technology revolution; now, armed with a smartphone, they have many more options than before.
In an interview, Paul Schroeder, vice president of programs and policy at the American Foundation for the Blind, called the iPhone "The undisputed phone of choice for the visually impaired."
Visually impaired users like that iPhone icons are large and that screen-reading is built in. They don't like iTunes -- Apple needs to make it much more accessible. (How about making it work like iOS, with large icons to direct you to different media libraries and translucent app-like icons floating above the main window? Perhaps with text-to-speech support?)
Interestingly, an NPR report observes that Apple's iPhone beats Android for accessibility, hinting at how small a commitment to accessibility Google has made at this point in the evolution of its Android OS.
Apple has been working at accessibility for years. When VoiceOver was introduced, it offered features Windows users had to spend significant sums to get. This commitment is laudable, but there's still a way to go.
But what Apple has done for accessibility pleases some users. In a more recent blog entry, Serphim notes,
"Most screen readers have a word exceptions dictionary. This lets the user modify the pronunciation of individual words. Screen readers also usually ship with a default set of pronunciations. For example, 'Qty' becomes 'Quantity.' The brilliant geniuses who program VoiceOver simply added the following definitions to the default word exceptions dictionary. The iPhone even has these built in, though the user cannot edit the dictionary."
He's impressed by this. It means he can now understand when someone sends him an emoticon within a message.
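To make the idea concrete, here's a minimal sketch of how a word exceptions dictionary like the one Serphim describes might work: a lookup table of written tokens mapped to the text a speech synthesizer should actually say. The entries, names, and substitution logic here are illustrative assumptions, not VoiceOver's actual implementation.

```python
import re

# Illustrative word exceptions dictionary: maps written tokens to the
# text the speech synthesizer should pronounce instead. These example
# entries are assumptions, not VoiceOver's real dictionary contents.
EXCEPTIONS = {
    "Qty": "Quantity",
    ":)": "smiley face",
    ":(": "frowny face",
}

def expand_for_speech(text: str) -> str:
    """Replace every token found in the exceptions dictionary before
    the text is handed off to the speech synthesizer."""
    # Match longer keys first so multi-character tokens win over
    # any shorter ones that share a prefix.
    pattern = re.compile(
        "|".join(re.escape(k) for k in sorted(EXCEPTIONS, key=len, reverse=True))
    )
    return pattern.sub(lambda m: EXCEPTIONS[m.group(0)], text)

print(expand_for_speech("Qty: 3 :)"))  # -> Quantity: 3 smiley face
```

A pass like this is why an emoticon arrives at a blind user's ear as "smiley face" rather than as a baffling run of punctuation.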
Think about the significance of that. Many of us are so used to the use of emoticons that we don't think they really matter, but imagine how much they do matter to a blind or partially-sighted user who now knows that someone just shared some humor, a little emotion, some sarcasm -- they become peers in the communication game.
"For years, the blind have gotten nothing from corporations. Accessibility means as much as its market share, in other words not much. Big companies usually do not have an incentive to care about a very small base of users, or so they think. In truth, the blind represent a tightly knit community who tend to follow products loyally and passionately. If something works, word travels quickly, and everyone adopts the thing in question. If something doesn't work, word travels even more quickly, dooming the product to failure. Apple's devices have withstood the test," writes Serphim.
That's all I wanted to share today -- just to note that the transformative effects of technology aren't confined to the brand new, and that there's often a great deal to be said for incremental improvements that can illuminate a person's life.
There's lots more information on Apple's accessibility technologies on its accessibility pages.
These include (but are not confined to):
iPad, iPhone, iPod touch
- Screen readers, full-screen magnification, and more
Mac OS X Snow Leopard
- VoiceOver 3 adds gesture support.
- Support for braille displays.
Perhaps it's things like this that inspired the National Federation of the Blind to give Apple an award in July 2010 for its extensive work in making its solutions accessible.