AI recreates videos people are watching by reading their minds
By Matthew North
Artificial intelligence is getting better at reading your mind. An AI has been able to guess which videos people were watching purely from their brainwaves.
Grigory Rashkov at Russian research firm Neurobotics and his colleagues trained an AI using video clips of different objects and brainwave recordings of people watching them. The recordings were made using an electroencephalogram (EEG) cap and the video clips included nature scenes, people on jet skis and human expressions.
The AI then tried to categorise and recreate the video clips from the EEG data alone. In 210 of 234 attempts, or about 90 per cent of the time, it categorised the video correctly, assigning tags such as waterfalls, extreme sports or human faces.
Visually, the AI had most success at recreating the broad features of the images, such as dominant colours and large shapes. More nuanced details, such as those found on human faces, were harder to recreate, with most appearing distorted beyond recognition.
Mind-reading AIs are still only looking at the surface of human thought, says Victor Sharmas at the University of Arizona. “What we are currently seeing is a caricature of human experience, but nothing remotely resembling an accurate recreation,” he says.