Apple Introduces Accessibility Features Including Eye Control and Music Haptics



While Google I/O was underway, Apple managed to steal some of the spotlight by announcing several accessibility features that will let people with disabilities use their devices more comfortably. The feature that caught our attention the most is device control using eye movements.



With the Eye Tracking feature, the iPhone and iPad can be controlled through eye movements tracked by the device's front camera. According to Apple, on-device artificial intelligence follows the user's eyes after a calibration process that takes only a few seconds. Eye control is already the core input method on the Vision Pro, and it can now finally also be used on iOS phones and iPadOS tablets, although not on macOS computers.
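Apple has not published implementation details or a developer API for Eye Tracking. Purely as an illustration of how front-camera gaze estimation can work, here is a minimal, hypothetical Swift sketch that uses the Vision framework to locate the user's pupils in a camera frame; the mapping from pupil position to a gaze point is placeholder logic, not Apple's method.

```swift
import Vision
import CoreVideo
import CoreGraphics

/// Hypothetical sketch: estimate a rough, normalized gaze point from a single
/// front-camera frame by locating the pupils with the Vision framework.
/// This is NOT Apple's Eye Tracking implementation, only an illustration.
func estimateGazePoint(in pixelBuffer: CVPixelBuffer,
                       completion: @escaping (CGPoint?) -> Void) {
    let request = VNDetectFaceLandmarksRequest { request, error in
        guard error == nil,
              let face = (request.results as? [VNFaceObservation])?.first,
              let leftPupil = face.landmarks?.leftPupil,
              let rightPupil = face.landmarks?.rightPupil,
              let left = leftPupil.normalizedPoints.first,
              let right = rightPupil.normalizedPoints.first else {
            completion(nil)
            return
        }
        // Average the two pupil positions (in the face's normalized space).
        // A real system would calibrate this against known on-screen targets.
        let gaze = CGPoint(x: (left.x + right.x) / 2,
                           y: (left.y + right.y) / 2)
        completion(gaze)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```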



Next is Music Haptics, which lets users who are deaf or hard of hearing experience music. Some may call it a strange feature, but deaf people also want to enjoy music in their own way. The Taptic Engine in the iPhone plays vibrations that follow the music being played through Apple Music. For now, the feature is only available on the iPhone.
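Apple has said Music Haptics will also be offered as an API so developers can make music in their own apps more accessible. The snippet below is not that API; it is only a rough Core Haptics sketch of the idea, playing transient taps at hypothetical beat timestamps while a track plays.

```swift
import CoreHaptics

/// Rough sketch of the Music Haptics idea using Core Haptics:
/// play a sharp transient "tap" at each beat timestamp of a track.
/// The beat times are hypothetical; Apple's feature derives its haptics
/// from the actual audio played through Apple Music.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient event per beat.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}

// Example: taps at 120 BPM for the first four beats.
// try playBeatHaptics(beatTimes: [0.0, 0.5, 1.0, 1.5])
```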



Meanwhile, the Vocal Shortcuts feature teaches the device to understand speech that has changed due to a stroke, ALS, or other conditions. The device learns how its owner pronounces certain phrases so they can be used to quickly launch apps and actions. The feature is similar to Project Relate (which grew out of Project Euphonia), introduced by Google in 2019.
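Apple has not described how Vocal Shortcuts is implemented. For illustration only, the following sketch uses the Speech framework to listen for a user-chosen trigger phrase and run an arbitrary action when it is heard; the phrase and action are hypothetical, and real atypical-speech support involves per-user training that this simple transcript match does not capture.

```swift
import Speech
import AVFoundation

/// Illustration only: listen for a trigger phrase with the Speech framework
/// and run an action when it is recognized. Requires microphone and speech
/// recognition permission (SFSpeechRecognizer.requestAuthorization).
final class PhraseTrigger {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(triggerPhrase: String, action: @escaping () -> Void) throws {
        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let transcript = result?.bestTranscription.formattedString else { return }
            // Naive check: fire the action when the phrase appears in the transcript.
            if transcript.localizedCaseInsensitiveContains(triggerPhrase) {
                action()
            }
        }
    }
}

// Hypothetical usage: launch a note-taking flow when the user says "quick note".
// try PhraseTrigger().start(triggerPhrase: "quick note") { print("Launching notes") }
```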


Personal Voice now supports Mandarin Chinese for the first time; previously only English was supported. With this feature, people who are at risk of losing the ability to speak can record their voice, which can later be used to speak on their behalf.


In addition, several other accessibility features have also been added to Apple CarPlay and the Vision Pro.
