Window to the soul? Maybe, but the eyes are also a flashing neon sign for a new artificial intelligence-based system that can read them to predict what you’ll do next.
A University of Maryland researcher and two colleagues have used eye-tracking technology and a new deep-learning AI algorithm to predict study participants’ choices while they viewed a comparison website with rows and columns of products and their features.
The algorithm, known as RETINA (Raw Eye Tracking and Image eNcoder Architecture), could accurately zero in on selections before people had even made their decisions.
“This is something AI technology is very good at—using data to make predictions,” said Michel Wedel, a Distinguished University Professor and PepsiCo Chair in Consumer Science in the Robert H. Smith School of Business. He worked with Moshe Unger of Tel Aviv University and Alexander Tuzhilin of New York University to develop RETINA. Their research is published in the journal Data Mining and Knowledge Discovery.
Researchers working with eye-movement data typically condense it into aggregated summaries, which can discard information and overlook certain types of eye movements. With their machine-learning method, Wedel and his colleagues could use the full scope of the raw eye-tracking data rather than the snippets that current methods retain.
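As a rough illustration only (not the authors' actual pipeline), the sketch below contrasts the two approaches: collapsing a raw gaze stream into a handful of aggregated area-of-interest features versus keeping the full time-stamped trajectory for a model to consume. The sampling rate, screen resolution and grid of areas of interest are hypothetical placeholders.

```python
import numpy as np

# Hypothetical raw gaze stream: one row per sample from the tracker,
# recorded at, say, 500 Hz, so hundreds of thousands of rows per session.
# Columns: timestamp (s), x and y screen coordinates (pixels).
rng = np.random.default_rng(0)
n_samples = 300_000
raw_gaze = np.column_stack([
    np.arange(n_samples) / 500.0,      # timestamps
    rng.uniform(0, 1920, n_samples),   # x coordinate
    rng.uniform(0, 1080, n_samples),   # y coordinate
])

# Conventional approach: collapse the stream into a few summary statistics
# per area of interest (AOI), losing the fine-grained dynamics.
def aggregate_by_aoi(gaze, n_cols=4, n_rows=3, width=1920, height=1080):
    """Count gaze samples falling into each cell of a hypothetical AOI grid."""
    col = np.clip((gaze[:, 1] / width * n_cols).astype(int), 0, n_cols - 1)
    row = np.clip((gaze[:, 2] / height * n_rows).astype(int), 0, n_rows - 1)
    counts = np.zeros((n_rows, n_cols), dtype=int)
    np.add.at(counts, (row, col), 1)
    return counts  # 12 numbers instead of 300,000 samples

print("Aggregated features:", aggregate_by_aoi(raw_gaze).ravel())
print("Raw stream kept by a RETINA-style model:", raw_gaze.shape)
```

The contrast is the point: a dozen dwell counts on one side, hundreds of thousands of time-stamped samples on the other.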
Unusually, the algorithm is able to incorporate raw eye movement data from each eye, Wedel said.
“It’s a lot of data—several hundreds of thousands of data points, with millions of parameters—and we use it for both eyes separately,” he said.
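The actual architecture is specified in the paper; purely as a hedged sketch, the snippet below shows one generic way a deep network could route the left-eye and right-eye streams through separate encoder branches before a shared choice head. The input channels (x, y, pupil size), layer sizes, sequence length and number of on-screen products are invented placeholders, not the RETINA specification.

```python
import torch
import torch.nn as nn

class EyeStreamEncoder(nn.Module):
    """Encodes one eye's raw gaze sequence (x, y, pupil size) into a vector."""
    def __init__(self, in_channels=3, hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=9, stride=4, padding=4),
            nn.ReLU(),
            nn.Conv1d(64, hidden, kernel_size=9, stride=4, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )

    def forward(self, x):                 # x: (batch, channels, time)
        return self.conv(x).squeeze(-1)   # (batch, hidden)

class TwoEyeChoiceModel(nn.Module):
    """Separate encoders for the left and right eye, fused for choice prediction."""
    def __init__(self, n_options=8, hidden=128):
        super().__init__()
        self.left = EyeStreamEncoder(hidden=hidden)
        self.right = EyeStreamEncoder(hidden=hidden)
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, 256),
            nn.ReLU(),
            nn.Linear(256, n_options),    # one logit per product on screen
        )

    def forward(self, left_gaze, right_gaze):
        fused = torch.cat([self.left(left_gaze), self.right(right_gaze)], dim=-1)
        return self.head(fused)

model = TwoEyeChoiceModel(n_options=8)
print(sum(p.numel() for p in model.parameters()), "parameters")

# Early prediction: feed only the samples recorded so far, e.g. the first
# 20,000 per eye, captured before the participant clicks a product.
left = torch.randn(1, 3, 20_000)
right = torch.randn(1, 3, 20_000)
choice_probs = torch.softmax(model(left, right), dim=-1)
print(choice_probs)
```

Because the time axis is pooled inside each branch, the same model can be run on a partial stream recorded before the participant commits, which is the "prediction before the choice is made" idea described below.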
The algorithm could be applied in many settings by all types of companies. For example, a retailer like Walmart could use it to enhance the virtual shopping experiences it is developing in the metaverse, a shared virtual online world. Many of the VR devices people will use to explore the metaverse will have built-in eye tracking to help render the virtual environment more effectively. With this algorithm, Walmart could tailor the mix of products on display in its virtual store to what a person is likely to choose, based on their initial eye movements.
“Even before people have made a choice, based on their eye movement, we can say it’s very likely that they’ll choose a certain product,” Wedel said. “With that knowledge, marketers could reinforce that choice or try to push another product instead.”
RETINA also has applications outside of marketing as eye tracking becomes more widespread in other fields, including medicine, psychology and psychiatry, usability and design, the arts, reading, finance and accounting: anywhere people make decisions based on some kind of visual assessment.
The biggest players in tech, including Meta and Google, have recently acquired eye-tracking companies and are considering a range of applications. With front-facing cameras, it is now possible to track people’s eye movements from any personal smartphone, tablet or computer. Such consumer-device-based approaches can’t yet match the accuracy of the advanced eye-tracking hardware that researchers currently use, said Wedel, and privacy remains a major concern: companies need to ask users for permission.
The researchers are already working to commercialize the algorithm and extend their research to optimize decision-making.
“We think eye tracking will become available at very large scales,” said Wedel. “The processing of the eye movement data typically has been very laborious. With this algorithm, we sidestep a lot of that, so there may be many applications that we haven’t even thought about.”
More information: Moshe Unger et al, Predicting consumer choice from raw eye-movement data using the RETINA deep learning architecture, Data Mining and Knowledge Discovery (2023). DOI: 10.1007/s10618-023-00989-7
Provided by University of Maryland