
Apple expands accessibility features including Magnifier, Live Captions and Sound Recognition
The announcement comes ahead of Global Accessibility Awareness Day on May 15, with the features slated for release later this year. Apple Intelligence is also expected to be updated, particularly as companies like Samsung and Google continue to add AI features to their smartphones; those AI-powered features have also boosted accessibility across devices like the iPhone and Pixel. Apple CEO Tim Cook stated in a press release that accessibility is part of Apple's DNA. "Making tech accessible to everyone is our priority, and we are proud of the innovations that we will be sharing this year. This includes tools that help people to access vital information, explore their world, and do the things they love." Here's what will soon be available across these devices.
Accessibility Nutrition Labels
Accessibility Nutrition Labels will show which App Store apps and games support the accessibility features you need.
Apple
In the App Store, a new section on the product pages of apps and games will highlight accessibility features, so you'll know right away whether the capabilities you need are included before downloading.
Accessibility Nutrition Labels will be available worldwide on the App Store. Developers will be able to access guidance on the criteria that apps must meet in order to display accessibility information on product pages.
Magnifier for Mac
Magnifier, which allows users who are blind or have low vision to read text, zoom in and detect objects around them, will soon be available on the Mac. Magnifier for Mac connects to a camera so you can zoom in on what's in front of you, such as a whiteboard or a screen. You can connect your iPhone to your Mac using Continuity Camera, or attach a camera over USB. The feature also lets you read documents using Desk View, and you can adjust the brightness, contrast and color filters on your screen to make text and images easier to see.
Accessibility Reader
This new reading mode on iPhone, iPad, Mac and Vision Pro is geared toward making text easier to read for people with a range of disabilities, including dyslexia and low vision. You can customize the text to focus on what you're reading: Accessibility Reader lets you adjust font, color and spacing. It also supports Spoken Content, which lets your device read aloud what's on the screen.
Accessibility Reader can be used within any app, and is built into Magnifier on iOS, iPadOS and macOS, so you can launch the feature to interact with real-world text, such as in menus and books.
Braille Access
Braille Access essentially turns the iPhone, iPad, Mac or Vision Pro into a braille notetaker. You can use Braille Screen Input to launch an app, then take notes or do calculations in Nemeth Braille. You can also open Braille Ready Format documents within Braille Access, which lets you view books and files created on a braille note-taking device.
Live Captions on Apple Watch
With Live Listen and Live Captions, your Apple Watch will show real-time text and let you remotely control a Live Listen session on your iPhone.
Apple
Live Listen is a feature that takes audio captured by an iPhone and beams it to your AirPods, Beats or compatible hearing aids, essentially turning your phone into a remote microphone. Apple Watch will soon get Live Captions, which display real-time text of what the iPhone is hearing, so you can listen to audio while reading captions on your wrist. You can also use your Apple Watch to start or stop a Live Listen session, or jump back if you missed something, without having to leave a class or meeting to grab your iPhone.
Vision Accessibility on the Apple Vision Pro
The Apple Vision Pro is adding a few features to help people with low or no vision. Zoom, an update that uses the Vision Pro's main camera, will let you magnify objects in your environment. And using Live Recognition, VoiceOver will read documents, identify objects and describe your surroundings.
Other accessibility updates
Apple announced a few other updates to its accessibility features. One of them is Vehicle Motion Cues, which helps reduce the motion sickness that can come from looking at a screen in a moving vehicle. On iPhone, iPad and Mac, you can customize the feature's animated onscreen dots.
Name Recognition will alert you if your name is being called.
Apple
Similar to Eye Tracking, which lets you control your iPhone and iPad using just your eyes, Head Tracking will let you navigate and control your device with head movements.
You can now customize Music Haptics on iPhone, which plays a series of taps, textures and vibrations along to audio in Apple Music. You can adjust the intensity of the haptics and choose whether to use them for the entire song or only the vocals.
Sound Recognition, which alerts deaf and hard-of-hearing people to sounds such as sirens, car horns or doorbells, is adding Name Recognition, so they'll know when their name is being called. And Live Captions now supports more languages, including English (Australia, UK and Singapore), Mandarin Chinese, Cantonese, Spanish (Spain and Latin America), Japanese, German, Korean and French.