Researchers Developed Eye-Tracking Glasses That Steer Drones With Just a Look

The University of Pennsylvania, NYU, and the U.S. Army Research Laboratory have been working on this tech.

By Marco Margaritoff

Researchers from the University of Pennsylvania, New York University, and the U.S. Army Research Laboratory have developed a gaze-driven method of controlling unmanned aerial vehicles.

According to IEEE Spectrum, the system is actually quite simple to use, even if the programming behind it is fairly sophisticated. A pair of lightweight glasses with an embedded camera tracks the movement of your eyes, and a connected computing unit translates that gaze into navigation commands for the paired drone. In simpler terms: the UAV will fly wherever you look.

Let’s have a closer look at this joint Army and university project, shall we?

When it comes to hands-free piloting of unmanned aerial vehicles, your options are essentially limited to the standard “follow me” modes that trail behind you, or the imperfect body- and facial-expression-controlled projects and experiments in development at institutions around the world. In the past few years, we’ve covered everything from mind-controlled drones and voice-commanded UAVs to eye- and hand-gesture alternatives.

This particular endeavor differentiates itself by being fully self-contained. Instead of relying on external sensors to feed the system the location and orientation of the subject (you, the drone operator) and the object (the drone), the glasses are all you need to pilot a UAV with your eyes. The system doesn’t even require GPS data to function.

The Tobii Pro Glasses 2 used here include an inertial measurement unit (IMU) and a high-definition camera that tracks the position of your eyes. Since the glasses themselves don’t have enough processing power to handle all that data, the project currently relies on a portable Nvidia Jetson TX2 computing module. With this system in place, the glasses detect the drone in front of you, track your eyes, combine the location and orientation data of those two elements, and estimate the distance between you and the UAV from the drone’s apparent size.
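To give a sense of that last step, here is a minimal sketch of how distance can be recovered from apparent size, assuming a simple pinhole camera model. The focal length and drone width below are illustrative values, not figures from the paper.

```python
def estimate_drone_distance(focal_length_px: float,
                            drone_width_m: float,
                            bbox_width_px: float) -> float:
    """Estimate the distance to the drone from its apparent size.

    Under a pinhole camera model, an object of real width W appears
    w pixels wide at distance d, where w = f * W / d. Solving for d
    gives the estimate returned here.
    """
    return focal_length_px * drone_width_m / bbox_width_px


# Hypothetical example: a 0.3-meter-wide quadrotor detected as a
# 60-pixel-wide bounding box by a camera with a 900-pixel focal length.
print(estimate_drone_distance(900.0, 0.3, 60.0))  # -> 4.5 meters
```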

The most complex part is taking the camera’s 2D image of the drone ahead of you and turning it into usable 3D data. In other words, measuring distance and depth well enough to navigate the UAV around a room is difficult when all you have is the 2D feed from a high-definition camera. For now, the eye-tracking glasses detect the user’s pupils well enough to gauge roughly how near or far the user is looking, though, as the researchers note below, that depth estimate is still too noisy to rely on.

“To compute the 3D navigation waypoint, we use the 2D gaze coordinate provided from the glasses to compute a pointing vector from the glasses, and then randomly select the waypoint depth within a predefined safety zone,” the research paper explains. “Ideally, the 3D navigation waypoint would come directly from the eye tracking glasses, but we found in our experiments that the depth component reported by the glasses was too noisy to use effectively. In the future, we hope to further investigate this issue in order to give the user more control over depth.”
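That strategy can be sketched in a few lines of Python. The code below is an illustration under assumed values, not the team’s implementation: the focal length, image resolution, and safety-zone depth range are all hypothetical.

```python
import random
import numpy as np


def gaze_to_waypoint(gaze_px, image_size, focal_length_px,
                     depth_range=(1.0, 3.0)):
    """Turn a 2D gaze coordinate into a 3D navigation waypoint.

    Follows the approach quoted above: back-project the gaze pixel
    into a pointing ray from the glasses, then pick the waypoint
    depth at random within a predefined safety zone (depth_range,
    in meters), since the depth reported by the glasses is too
    noisy to use directly.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # Back-project the gaze pixel into a direction vector in the
    # glasses' camera frame (pinhole model), then normalize it.
    ray = np.array([(gaze_px[0] - cx) / focal_length_px,
                    (gaze_px[1] - cy) / focal_length_px,
                    1.0])
    ray /= np.linalg.norm(ray)
    depth = random.uniform(*depth_range)
    return ray * depth  # waypoint in the glasses' reference frame


# Hypothetical example: gaze lands at pixel (800, 500) in a
# 1920x1080 image seen through a 1000-pixel focal length.
print(gaze_to_waypoint((800, 500), (1920, 1080), 1000.0))
```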

Ultimately, tracking a user’s eyes and pupils with a simple pair of smart glasses, and piloting a drone that way, is extremely impressive. That the researchers have managed to extract enough usable spatial data from so few components is even more remarkable. Hopefully, this kind of research and experimentation will soon find its way to stores, so we, too, can get our hands (and eyes) on it.
