
Apple Unveils Accessibility Features Allowing Users to Control iPad and iPhone with Eye Tracking and Feel Music with Haptics

Apple is set to revolutionize accessibility with a suite of new features coming later this year. These enhancements include Eye Tracking for iPad and iPhone, allowing users with physical disabilities to navigate their devices with just their eyes. Music Haptics will enable deaf or hard-of-hearing users to experience music through refined vibrations from the iPhone’s Taptic Engine. Vocal Shortcuts introduce custom sounds for task automation, while Listen for Atypical Speech enhances speech recognition for users with speech-related conditions. Vehicle Motion Cues aim to reduce motion sickness by using animated dots to minimize sensory conflicts for passengers in moving vehicles. CarPlay will see updates such as Voice Control, Color Filters, and Sound Recognition to improve accessibility. Additionally, visionOS will introduce systemwide Live Captions, support for hearing devices, and features for low vision users, furthering Apple’s commitment to inclusive design.

Designer: Apple

Eye Tracking uses the front-facing camera and on-device machine learning to let users control their devices with their eyes, giving people with physical disabilities an easy, intuitive way to navigate without extra hardware. After a quick setup and calibration with the front-facing camera, users can move through apps simply by looking at elements; holding their gaze triggers actions such as pressing buttons, swiping, and other gestures. All data is processed and stored on the device, preserving user privacy, and the feature adapts to each user’s individual patterns over time through machine learning.
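The core interaction here is often called dwell selection: a gaze held on an element past a time threshold counts as a press. The sketch below is a generic illustration of that idea in Python (the threshold, sample format, and function names are my own assumptions, not Apple's implementation):

```python
from dataclasses import dataclass

DWELL_THRESHOLD = 1.0  # seconds of sustained gaze needed to "press" (illustrative value)

@dataclass
class GazeSample:
    element: str      # UI element currently under the user's gaze
    timestamp: float  # seconds since tracking started

def dwell_select(samples, threshold=DWELL_THRESHOLD):
    """Return the element 'pressed' once gaze stays on it past the threshold."""
    current, start = None, None
    for s in samples:
        if s.element != current:
            current, start = s.element, s.timestamp  # gaze moved: restart the timer
        elif s.timestamp - start >= threshold:
            return current  # sustained gaze counts as a press
    return None

samples = [
    GazeSample("Back", 0.0),
    GazeSample("Play", 0.2),  # gaze shifts to Play
    GazeSample("Play", 0.8),
    GazeSample("Play", 1.4),  # held 1.2 s on Play -> selected
]
print(dwell_select(samples))  # Play
```

A real implementation would also smooth noisy gaze coordinates and give visual feedback as the dwell timer fills, but the selection logic reduces to this timer reset.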

Music Haptics provides a new way for deaf or hard-of-hearing users to experience music through vibrations. Using the Taptic Engine, the iPhone creates tactile feedback that syncs with the music’s rhythm, melody, and intensity. This tactile feedback allows users to feel the music, making it more accessible and enjoyable. Music Haptics works across millions of songs in the Apple Music catalog and is available as an API for developers to integrate into their apps, further expanding its accessibility. The feature is designed to be easy to use, with simple settings to turn it on and off.
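Conceptually, syncing vibration with music means mapping the track's loudness envelope to haptic intensity: louder frames vibrate harder. This is a minimal sketch of that mapping in Python, my own illustration rather than Apple's Taptic Engine pipeline or the actual Music Haptics API:

```python
import math

def loudness_envelope(samples, frame_size):
    """RMS loudness per audio frame, normalized to 0..1 haptic intensities."""
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    rms = [math.sqrt(sum(x * x for x in f) / len(f)) for f in frames]
    peak = max(rms) or 1.0
    return [r / peak for r in rms]  # scale so the loudest frame vibrates strongest

# A quiet passage followed by a loud one (toy waveform values)
audio = [0.1, -0.1, 0.1, -0.1, 0.8, -0.8, 0.8, -0.8]
intensities = loudness_envelope(audio, frame_size=4)
print(intensities)  # quiet frame -> weak vibration, loud frame -> full strength
```

Each resulting value would drive one haptic pulse; the real feature also tracks rhythm and melody, not just overall loudness.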

Vehicle Motion Cues help reduce motion sickness by displaying animated dots at the screen’s edges, aligning visual input with the vehicle’s motion. This feature addresses the sensory conflict that often causes motion sickness, where what a person sees doesn’t match what they feel. Using built-in sensors, this feature detects when a user is in a moving vehicle and activates automatically or can be toggled in the Control Center. By providing a visual representation of vehicle motion, Vehicle Motion Cues make it easier for users to read, watch, or interact with content on their devices without experiencing discomfort.
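The principle is that the on-screen dots move with the vehicle, so the eyes see motion that agrees with what the inner ear feels. A toy sketch of that mapping (the gain and coordinate convention are illustrative assumptions, not Apple's algorithm):

```python
def cue_dot_positions(base_positions, accel_x, gain=20.0):
    """Shift edge dots horizontally in the direction of lateral acceleration,
    so visual motion matches the motion the passenger feels (illustrative gain)."""
    return [(x + gain * accel_x, y) for (x, y) in base_positions]

dots = [(0, 10), (0, 50), (0, 90)]  # dots along the left screen edge
print(cue_dot_positions(dots, accel_x=0.5))  # dots drift under rightward acceleration
```

In practice the accelerometer signal would be filtered and the dots animated continuously, but the cue itself is just this screen-space translation.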

CarPlay will also see significant improvements, enhancing accessibility for users with various needs. Voice Control will allow users to navigate and control CarPlay apps using their voice, providing a hands-free experience. Color Filters will make the interface more accessible for colorblind users by adjusting the display to distinguish between different colors. Sound Recognition will notify users of important sounds like car horns and sirens, ensuring that drivers and passengers who are deaf or hard of hearing remain aware of their surroundings.

visionOS will introduce systemwide Live Captions, further supporting users who are deaf or hard of hearing by providing real-time captions for spoken dialogue in live conversations and audio from apps. The update will also expand support for Made for iPhone hearing devices and cochlear hearing processors, ensuring seamless integration with Apple Vision Pro. For users with low vision, new features such as Reduce Transparency, Smart Invert, and Dim Flashing Lights will make the interface more comfortable and easier to navigate.

These advancements highlight Apple’s dedication to inclusive design, pushing technology’s boundaries to create the best experience for all users. To celebrate Global Accessibility Awareness Day, Apple will host curated collections and sessions at select Apple Store locations, allowing users to explore and learn about these new accessibility features. By constantly innovating and improving accessibility, Apple ensures its devices remain accessible to everyone, empowering all users to enjoy and benefit from the latest technological advancements.

The post Apple Unveils Accessibility Features Allowing Users to Control iPad and iPhone with Eye Tracking and Feel Music with Haptics first appeared on Yanko Design.
