Apple announces new features on IOS 18: Eye Tracking, Music Haptics, and Vocal Shortcuts


Apple has announced new features in iOS 18, such as Eye Tracking, which allows people with physical disabilities to operate their iPad or iPhone using only their eyes. In addition, Music Haptics gives users who are deaf or hard of hearing a new way to experience music on their iPhone through the Taptic Engine; Vocal Shortcuts let users complete tasks by creating personalized sounds; Vehicle Motion Cues can reduce motion sickness while using an iPhone or iPad in a moving car; and additional accessibility features are coming to visionOS. These features combine Apple’s hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s longstanding commitment to making products accessible to everyone.

Are you wondering how to use Eye Tracking, Music Haptics, and Vocal Shortcuts on iOS 18? Don’t worry, we will explain them in detail.

Apple announces new features in iOS 18

Some of the new features Apple announced in iOS 18 are given below:

Eye Tracking Comes to iPad and iPhone:

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on the device, and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

Here are the steps for using Eye Tracking on iOS 18:

  • First, ensure that your iPhone or iPad is running iOS 18 or iPadOS 18.
  • Next, open the Settings app.
  • Tap Accessibility.
  • Scroll down and select Eye Tracking.
  • Turn on the Eye Tracking toggle.
  • To calibrate the feature, follow the dot on the screen with your eyes for a brief period.
  • Once calibration succeeds, Eye Tracking will be operational.

The update also offers a range of customization options, such as the Smoothing slider, which lets you adjust how closely the pointer follows your eye movements. You can also enable Snap to Item, which automatically moves the cursor to nearby items, and the Dwell Control toggle, which performs the selected action when you hold your gaze on an item on the screen.

Music Haptics enhances the accessibility of songs:

Music Haptics, another new feature in iOS 18, allows users who are deaf or hard of hearing to feel music on their iPhone in a new way. When the feature is enabled, the iPhone’s Taptic Engine plays taps, textures, and subtle vibrations synchronized with the music. Music Haptics works with countless songs in Apple Music and will also be released as an API so developers can make music more accessible in their own apps.

New Features for a Wide Range of Speech

Vocal Shortcuts, another new feature in iOS 18, enables iPhone and iPad users to create custom utterances that Siri can recognize to launch shortcuts and complete tasks. A further addition, Listen for Atypical Speech, improves speech recognition for a wider range of speech: it uses on-device machine learning to recognize a user’s speech patterns. Building on capabilities introduced in iOS 17 for people who cannot speak or are at risk of losing their ability to speak, these features are designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, ALS, or stroke, and they offer a new level of personalization and control.

Apple has also introduced many updates beyond these flagship features, such as:

VoiceOver

Apple is launching several improvements for users who rely on VoiceOver screen narration. Mac users can now use more lively voices, a personalized voice rotor, and customizable VoiceOver keyboard shortcuts.

Vehicle Motion Cues

This new feature helps combat motion sickness by making small changes to the display based on the vehicle’s motion. It could transform the experience for people who suffer from motion sickness on car rides or boat trips.

CarPlay Gets Voice Control, More Accessibility Updates

CarPlay will have additional accessibility options such as Voice Control, Color Filters, and Sound Recognition. By utilizing Voice Control, individuals can maneuver CarPlay and manage applications solely through vocal commands. By utilizing Sound Recognition, individuals who are deaf or have difficulty hearing can enable notifications to be alerted of car horns and sirens while driving or riding in a vehicle. Color Filters in CarPlay make the interface more visually user-friendly for colorblind users, offering additional visual accessibility options like Bold Text.



Conclusion

Apple’s accessibility announcement is a significant milestone in the history of inclusive technology design. Apple is not only breaking new ground with Eye Tracking and Music Haptics but also creating new opportunities for users with diverse needs to interact with the world. Combined with the CarPlay and visionOS updates, we could see a future where technology empowers everyone, resulting in a more inclusive and connected society.
