Apple bets big on accessibility with new features that give users a voice


This year, Apple is introducing a fifth accessibility pillar dedicated to speech (Picture: Apple)

Apple’s new suite of accessibility features, announced at this year’s WWDC, underscores its ongoing commitment to inclusive design.

Metro.co.uk sat down with Sarah Herrlinger, Apple’s Senior Director of Global Accessibility Policy and Initiatives, to talk about the tech giant’s approach to accessibility and the new features coming this fall.

Until now, Apple’s accessibility efforts have revolved around four main pillars: vision, hearing, motor skills and cognition. This year, it is introducing a fifth pillar dedicated to speech.

One of the notable features launching this fall is “Personal Voice”, a voice banking tool developed in conjunction with leading organizations working with patients with ALS (amyotrophic lateral sclerosis).

ALS, also known as Lou Gehrig’s disease, is a progressive disease that causes the nerve cells controlling voluntary muscles to deteriorate. Loss of speech is a common symptom, affecting around one in three people diagnosed with ALS.

The new feature will allow users at risk of losing their ability to speak to create a personalized voice on their own device. Setting it up takes about 15 minutes of reading text prompts aloud, after which the device can speak typed phrases in the user’s own voice.


Live Speech on iPhone, iPad and Mac gives users the ability to type what they want to say and have it spoken out loud during phone and FaceTime calls, as well as in-person conversations (Picture: Apple)

The upcoming release also includes “Live Speech”, which lets people who have lost their ability to speak communicate via text-to-speech.

This integrates seamlessly with Personal Voice, so users can speak in their own voice on iPhone, iPad and Mac, typing what they want to say and having it spoken aloud during phone calls, FaceTime calls and in-person conversations.

Users can also save commonly used phrases to chime in quickly during conversations.

When it comes to accessibility and privacy, Apple assures users that it doesn’t “choose one over the other.” Personal Voice is created entirely on the device and nothing is saved in the cloud, and Live Speech is only available on devices protected by a passcode, ensuring that the user remains in control of their voice.
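For developers, that user-in-control stance is baked into the programming interface: third-party apps only see a personal voice after the user explicitly grants access. As a minimal sketch of how this might look, assuming the speech synthesis additions Apple shipped publicly in iOS 17 (`PhraseSpeaker` here is a hypothetical wrapper, not part of Apple’s API):

```swift
import AVFoundation

// Hypothetical helper that speaks a saved phrase in the user's
// Personal Voice, if the user has authorized the app to use it.
final class PhraseSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ phrase: String) {
        // The user must explicitly approve personal-voice access per app.
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            guard status == .authorized else { return } // declined or unsupported device

            // Find the first installed voice flagged as a personal voice.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: phrase)
            utterance.voice = personalVoice // falls back to the default voice if nil
            self.synthesizer.speak(utterance)
        }
    }
}
```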

Apple said it doesn’t factor adoption numbers into decisions about features like accessibility.

“We do it because we believe it’s the right thing to do and if we do it for the communities that need it, that’s powerful,” Herrlinger said.


Personal Voice allows users at risk of losing their ability to speak to create a voice that sounds like them (Picture: Apple)

“If it positively affects one person’s life, it will probably positively affect enough people’s lives to make it worthwhile.”

But the tech giant can rest assured that extending accessibility features across its ecosystem of devices is an attractive draw for adoption.

For example, the Apple Watch has seen significant improvements to meet the needs of people with various disabilities. The addition of AssistiveTouch, which controls the watch through gestures based on muscle and tendon movement, makes the device easier to use for people with reduced mobility.

Another powerful feature introduced by Apple is “Point and Speak”, which makes it easier for visually impaired users to read text in their environment – for example, text on a microwave.

Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be combined with other Magnifier features such as People Detection, Door Detection and Image Descriptions to help users navigate their physical environment.


With all these new features, Apple clearly wants to be the benchmark for accessibility needs (Picture: Apple)

However, the detection mode features rely on the LiDAR scanner that is only built into Apple’s Pro-model iPhones and iPads, which come at a premium.

The company’s accessibility features will also extend to Apple’s hottest announcement of the year, the Vision Pro.

Working with VoiceOver, Voice Control and braille input, it is designed to be “fluid in its ability to switch between modalities”.

Apple Vision Pro should be great for accessibility because it introduces an all-new input system controlled by a person’s eyes, hands, and voice.

The company is also “very encouraging” of developers looking to build accessibility apps for the new visionOS.
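In practice, that encouragement largely means adopting the accessibility APIs that already exist across Apple’s other platforms. A minimal sketch, assuming the standard SwiftUI accessibility modifiers carry over to visionOS as they do elsewhere (`PlayButton` is a hypothetical view, not from any Apple sample):

```swift
import SwiftUI

// Hypothetical view: labelling a control so VoiceOver users on
// Vision Pro hear a meaningful description instead of just "button".
struct PlayButton: View {
    var body: some View {
        Button(action: { /* start playback */ }) {
            Image(systemName: "play.fill")
        }
        .accessibilityLabel("Play video")
        .accessibilityHint("Starts playback of the selected video")
    }
}
```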

There is more to accessibility features than simply shipping them, as evidenced by online communities and message boards filled with individuals and families seeking advice on how to use them. For them, the accessibility workshops at Apple’s retail stores or the company’s Accessibility Helpline should be a good place to start.

With all these new features, Apple clearly wants to be the benchmark for accessibility needs.

“I think in some ways, with the work we’ve done, we already are,” Herrlinger said. “If you look at public statistics, they tell us that 72% of the blind community using mobile devices uses an iOS device.”

She also highlighted the importance of involving people with disabilities in the development process: “Our commitment to accessibility is deeply rooted in the mantra of ‘nothing about us without us’.

“We employ people with disabilities who use the assistive technologies we create every day to ensure their voice is heard throughout the process.”

MORE: Apple’s $3,499 Vision Pro headset could ‘read your mind’

MORE: Apple unveils its first AR/VR headset, the Vision Pro
