Big accessibility improvements coming with iOS 15

New accessibility features in iOS 15

Apple announced new accessibility features coming to its various products and services later this year with iOS 15. These features are designed for people with mobility, vision, hearing, and cognitive disabilities.

Sarah Herrlinger, Apple's Senior Director of Global Accessibility Policy and Initiatives, had this to say about it: “At Apple, we've long felt that the best technology in the world should answer the needs of everyone, and our teams work tirelessly to build accessibility into everything we do. With these new features, we're pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people, and we can't wait to share them with our users.”

Here are the new accessibility features coming with iOS 15:

Apple announces many more in its statement, which is linked at the end of this article. Below we highlight the most interesting ones:

AssistiveTouch for Apple Watch:

For users with limited mobility, AssistiveTouch will make it possible to use Apple Watch without touching the screen or controls. Built-in motion sensors, the optical heart rate sensor, and on-device machine learning will let Apple Watch detect subtle differences in muscle movement and tendon activity, so an on-screen cursor can be controlled through hand gestures such as a pinch or a clench.
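For developers curious about the kind of sensor data this builds on, here is a minimal Swift sketch using the standard CoreMotion APIs on watchOS. It only reads raw wrist-motion samples of the sort a gesture classifier might consume; it is not Apple's AssistiveTouch implementation.

```swift
import CoreMotion

// Minimal sketch: stream raw wrist-motion data with CoreMotion. A gesture
// recognizer (pinch, clench, etc.) would be built on top of features like these.
final class WristMotionReader {
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz sampling
        motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion, error == nil else { return }
            // User acceleration and rotation rate are typical classifier inputs.
            let accel = motion.userAcceleration
            let rotation = motion.rotationRate
            print("accel: \(accel.x), \(accel.y), \(accel.z) | rot: \(rotation.x)")
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```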

iPad Eye Tracking:

Later this year, iPadOS will support third-party eye-tracking devices to let people control iPad with their eyes.

Background sounds:

The sounds around us can be distracting and cause discomfort or unease. In a show of support for neurodiversity, Apple is adding new background sounds that minimize distractions and help users focus, relax, or unwind. Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds, can play continuously in the background to mask unwanted environmental noise. Plus, all of this is mixed and integrated with other audio and system sounds.
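As a rough illustration of what "mixed with other audio" means in practice, here is a minimal Swift sketch that loops an ambient track with the standard AVFoundation APIs. The file name rain.m4a is a hypothetical asset, and this is only a sketch, not Apple's Background Sounds feature itself.

```swift
import AVFoundation

// Minimal sketch: loop an ambient track that mixes with other audio instead of
// interrupting it. "rain.m4a" is a hypothetical bundled asset.
func playAmbientLoop() throws -> AVAudioPlayer? {
    guard let url = Bundle.main.url(forResource: "rain", withExtension: "m4a") else {
        return nil
    }
    // The .ambient category is mixed with audio from other apps by default.
    try AVAudioSession.sharedInstance().setCategory(.ambient)
    try AVAudioSession.sharedInstance().setActive(true)

    let player = try AVAudioPlayer(contentsOf: url)
    player.numberOfLoops = -1   // loop indefinitely
    player.volume = 0.3         // keep it quietly in the background
    player.play()
    return player               // caller must retain the player to keep playback alive
}
```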

SignTime. Communicate with Apple using sign language:

SignTime will allow customers to communicate with AppleCare and retail customer care using sign language. It's a feature that launched on May 20 in the US, UK, and France, and will be coming to more countries in the future.

VoiceOver Improvements:

Recent updates to VoiceOver allow users to explore more details about people, text, table data, and other objects within images. VoiceOver can describe a person's position along with other objects in images, and with Markup, users can add image descriptions to personalize their photos.
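On the developer side, the standard way to make an image speak under VoiceOver is to give it a textual description. The sketch below uses the ordinary UIKit accessibility API (the asset name is hypothetical) and is separate from the user-facing Markup flow described above.

```swift
import UIKit

// Minimal sketch: an image description that VoiceOver reads aloud.
// "team-photo" is a hypothetical asset name.
let imageView = UIImageView(image: UIImage(named: "team-photo"))
imageView.isAccessibilityElement = true
imageView.accessibilityLabel = "Three colleagues standing on a beach at sunset"
```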

MFi (Made for iPhone) Hearing Aid Improvements:

Apple is introducing support for new bi-directional hearing aids, enabling hands-free phone and FaceTime conversations. Next-generation MFi models will arrive later this year.

Headphone audiograms:

Headphone Accommodations will gain audiogram support, allowing users to personalize their audio by importing the results of their latest hearing test.
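Hearing test results can already be stored in HealthKit as audiogram samples; the sketch below simply reads the most recent one with the standard HealthKit APIs. That the new import feature draws on this data is our assumption, and this is not the Headphone Accommodations flow itself.

```swift
import HealthKit

// Minimal sketch: read the newest audiogram sample (a stored hearing test result).
let healthStore = HKHealthStore()
let audiogramType = HKObjectType.audiogramSampleType()

healthStore.requestAuthorization(toShare: nil, read: [audiogramType]) { granted, _ in
    guard granted else { return }
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
    let query = HKSampleQuery(sampleType: audiogramType,
                              predicate: nil,
                              limit: 1,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        if let audiogram = samples?.first as? HKAudiogramSample {
            // Each sensitivity point holds a frequency and per-ear hearing levels.
            print("Audiogram with \(audiogram.sensitivityPoints.count) data points")
        }
    }
    healthStore.execute(query)
}
```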

In addition, other features are on the way; we summarize them briefly below:

  • Sound Actions will replace physical buttons and switches with mouth sounds, such as a click or an "ee" sound, for non-speaking users with limited mobility.
  • Display and Text Size settings can be customized in each supported app, making it easier for users with colorblindness or other vision difficulties to see the screen (a sketch of the app-side adoption follows this list).
  • New Memoji customization options will offer greater representation, allowing users to add oxygen tubes, cochlear implants, and a soft protective helmet.
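On the app side, per-app text-size settings only take effect in text that adopts Dynamic Type. Here is a minimal UIKit sketch of that adoption; it shows the standard API, not the new Settings feature itself.

```swift
import UIKit

// Minimal sketch: text that uses Dynamic Type styles follows the user's
// preferred (and now per-app) text size automatically.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true  // track size changes live
label.numberOfLines = 0
label.text = "This text scales with the user's preferred reading size."
```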

Many of these new features are scheduled for release later this year, suggesting they'll be included in iOS 15 or one of its updates.

Greetings.

Source: Apple.com