Apple is adding eye-tracking functionality to recent iPhone and iPad devices this year as one of several accessibility updates aimed at users with physical disabilities.
The US tech company said its eye-tracking is "powered by artificial intelligence" and the device's camera, without the need for extra hardware or accessories.
To use their device hands-free, people can simply look at their screen to move through apps and menus, and linger on items to select them instead of physically tapping them.
Pausing your gaze on an item to select it is a feature Apple calls Dwell Control, which it already offers on Mac devices.
Eye-tracking is also already used in Apple's Vision Pro mixed-reality headset, which was released in the US earlier this year.
No New Zealand release date for the Vision Pro has yet been announced, but a report this week claimed it would be released in the coming months in Australia, Japan, Singapore, China, South Korea, France and Germany.
Other accessibility features announced by Apple on Thursday (NZ time) include Music Haptics, designed to allow users who are deaf or hard of hearing to physically feel music; Vocal Shortcuts, which allows users to perform tasks by making a custom sound; and Vehicle Motion Cues, designed to help reduce motion sickness when using iPhone or iPad in a moving vehicle.
CarPlay is also undergoing accessibility updates, including Sound Recognition, which will alert people who are deaf or hard of hearing to car horns and sirens.
Apple did not say when these features will be released. The company is holding its annual WWDC software event in early June, where more is expected to be announced about the highly anticipated iOS 18.