Uzone.id – Apple has introduced a new feature for iOS 18 and iPadOS 18 users called Eye Tracking. As the name suggests, this feature allows users to control their iPhones or iPads simply by using their eye movements.
While this feature might seem familiar to Android users, as similar functionality has been available for some time on certain Android smartphones, such as Air Gesture on Oppo and Realme devices, Apple’s implementation is not merely a gimmick. Apple states that Eye Tracking is designed to assist users with physical disabilities, enabling them to control their iPads or iPhones with their eyes.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, added, “Each year, we break new ground when it comes to accessibility. These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”
Eye Tracking in iOS 18 and iPadOS 18 is powered by artificial intelligence (AI) to give users a built-in option to navigate their iPhones and iPads using only their eye movements. The feature uses the front-facing selfie camera to detect eye movements, calibrates in seconds, and continually improves through on-device machine learning.
Apple also ensures that all user settings are securely stored on the device and not shared with Apple or any third party. This is why users must recalibrate their gaze in Eye Tracking each time the feature is turned off and on.
With Eye Tracking, users can press buttons or activate certain features via Dwell Control. This feature automatically activates a button or control when the user’s gaze rests on a particular app or part of the screen for a certain amount of time.
Interestingly, the feature works across all apps on iPadOS and iOS without requiring any additional hardware or accessories. With Eye Tracking, users can navigate through app elements and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures, simply with their eyes.
However, while this feature comes with both iOS 18 and iPadOS 18, not all devices support Eye Tracking. Here is a list of devices that support Eye Tracking:
- iPhone SE 3
- iPhone 12
- iPhone 12 Mini
- iPhone 12 Pro
- iPhone 12 Pro Max
- iPhone 13
- iPhone 13 Mini
- iPhone 13 Pro
- iPhone 13 Pro Max
- iPhone 14
- iPhone 14 Plus
- iPhone 14 Pro
- iPhone 14 Pro Max
- iPhone 15
- iPhone 15 Plus
- iPhone 15 Pro
- iPhone 15 Pro Max
- iPhone 16
- iPhone 16 Plus
- iPhone 16 Pro
- iPhone 16 Pro Max
- iPad Mini (6th generation)
- iPad (10th generation)
- iPad Air (4th generation and later)
- iPad Air M2
- iPad Pro 11-inch (3rd generation and later)
- iPad Pro 11-inch (M4)
- iPad Pro 12.9-inch (3rd generation and later)
How to activate Eye Tracking on iPhone:
- Go to Settings, then Accessibility.
- Select Eye Tracking and toggle it on.
- Complete the eye calibration process by following the colored dots that appear on the screen.
- Once calibration is complete, you can use Eye Tracking to control your iPhone.
When you look at an item on the screen, an outline will appear around it. To press a button or confirm a selection, hold your gaze on the item for a few moments until a pointer or gray dot appears.
Hope it is useful!