When big tech companies announce their next big thing, they usually go all in on selling us how it will instantly change our lives, routinely overpromising and underdelivering. At MWC 2024, Honor decided to mix things up with a delightfully dumb but fun demo showcasing the new eye-tracking technology on its Magic 6 Pro flagship. It posed the most hilarious scenario: what if you could control a car by looking at boxes on your phone?
For the demo, Honor brought me and a few other journalists out to a hidden backyard in the center of Barcelona, far away from the busy show floors of Mobile World Congress. Behind a big garage door, we found a huge hall and a car wrapped in mule-style camouflage foil, its license plate prominently spelling HONOR in big letters. Naturally, we were all immediately drawn to the single biggest object in the room. Did Honor plan to release a car, like its competitor Xiaomi? The Honor representatives accompanying us were quick to reel in our questions and speculation. “The car doesn’t matter, ignore the car,” they told us.
The car doesn’t matter, ignore the car!
What Honor actually wanted to show us is how its eye-tracking technology works on the new Magic 6 Pro, where it is currently exclusive to the Chinese version’s software. In the production software, eye tracking can be used to interact with pop-up notifications and the Magic Capsule, letting you expand them into the full app just by holding your gaze on them for a moment. That’s impressive in and of itself, but let’s be real: it’s not something that would make headlines or feel particularly groundbreaking.
Honor has bigger plans for eye tracking, though, and the demo we were invited to was meant to show just how big the company dreams. After a quick safety briefing (along the lines of “The car is programmed to stop after a few meters, but please still don’t walk in front of it while it’s being remote-controlled”), each of us calibrated our eyes on the provided Magic 6 Pro to get started. A Honor representative told me that right now, the algorithm is trained mostly on Asian eyes, but as Honor looks to bring the feature to more markets, it will use a more diverse data set. It seemed to work well for all kinds of eyes regardless.
With the safety briefing out of the way and my eyes set up, Honor booted up its special remote control app. It’s as simple as it gets, really. There are only four targets for you to look at, letting you turn the engine on or off and move the car forward or backward.
It’s really hard to focus on a single target for more than a second
From the moment I tried the demo, I immediately felt what a lot of people testing the Apple Vision Pro and its eye-tracking-based system navigation described: it’s really hard to focus on a single target for more than a few seconds. And that’s exactly what you need to do for a button to trigger, indicated by a helpful loading animation. Your eyes are constantly making micro-movements, and when you’re not used to using them to control things, you’re more easily distracted than you would think.
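This dwell-to-trigger pattern, where a target only fires after your gaze rests on it long enough to fill a loading animation, is a common one in gaze interfaces. Honor hasn’t published its implementation, so the following is just a minimal sketch of the idea, with every name and threshold a hypothetical placeholder:

```kotlin
// Minimal dwell-to-trigger sketch. Honor has not published its API;
// the class name, thresholds, and timings here are hypothetical.
class DwellDetector(
    private val dwellMillis: Long = 1500,   // how long the gaze must rest on the target
    private val toleranceMillis: Long = 200 // forgiveness window for natural micro-movements
) {
    private var dwellStart = -1L // when the current dwell began; -1 if none in progress
    private var lastHit = -1L    // last time the gaze was on the target

    /** Returns progress in [0, 1] for the loading animation; 1.0 means the target fires. */
    fun update(gazeOnTarget: Boolean, nowMillis: Long): Float {
        if (gazeOnTarget) {
            if (dwellStart < 0) dwellStart = nowMillis
            lastHit = nowMillis
        } else if (dwellStart >= 0 && nowMillis - lastHit > toleranceMillis) {
            dwellStart = -1 // gaze wandered off for too long: reset the animation
        }
        if (dwellStart < 0) return 0f
        return ((nowMillis - dwellStart).toFloat() / dwellMillis).coerceAtMost(1f)
    }
}
```

The tolerance window is the interesting design choice: without it, the micro-movements described above would reset the timer constantly, and the buttons would be nearly impossible to hit.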
I still got the hang of it quickly enough after first inadvertently jumping between targets. I managed to turn on the engine with flying colors, then moved the car back and forth a few times. It’s quite an eerie experience, moving this heavy, gasoline-powered machine by sheer force of vision… oh, right, the car doesn’t matter, ignore the car.
It goes without saying that we’ll likely never see an application like this in a real-world scenario. But it still serves as a conversation starter for Honor’s plans with this technology. Honor wants to open up eye tracking to third-party developers, allowing them to integrate it into their apps to enable new features, and it envisions a future where this type of interaction becomes more common.
The company first introduced this form of eye tracking during chipmaker Qualcomm’s Snapdragon Summit in 2023, with the underlying AI algorithms running locally on the Snapdragon 8 Gen 3. The phone’s front-mounted ToF depth sensor is used to understand exactly where users are looking.
The end of the calibration process
Of course, only time will tell if developers will jump on board. Honor is just one of many Android makers, so developers have to think hard about whether to build features that only a portion of the Android user base can use. That’s particularly true for the global market, where the company is growing but still holds only a small slice of the pie, with its phones available in just some non-US markets.
Google also tried a similar touchless interaction model with the Pixel 4 and its Soli radar, which could react to air gestures performed in front of the device. In Google’s case, the company quickly gave up on the technology after just one phone generation, with little to no uptake from third-party developers. If even the company behind Android itself couldn’t make such a feature stick or convince other manufacturers to jump on board, it’s fair to ask how well a lone phone maker will fare.
In the meantime, I’m definitely not mad that I got to control a car with my eyes.