Scientists’ ‘Mind Reading Device’ Matches YouTube Videos to MRI Scans
By Anne Witter | September 23, 2011 4:20 PM EST
A true mind-reading machine remains a work in progress, but the technology has taken another step forward: researchers have matched YouTube videos to blood-flow patterns recorded by MRI scanners, The Telegraph (UK) reports.
Researchers were able to recreate a moving picture similar to the real footage being watched by their study's participants.
The study raises the possibility that the technology could one day translate people's subconscious imagery, such as dreams and memories, onto a screen, although it cannot yet read actual thoughts. If the technology is developed further, experts said, it could even be used to explore the minds of stroke patients.
Researchers from the University of California, Berkeley, used MRI scanners to monitor the blood flow in people's brains as they watched films, such as Madagascar 2, Pink Panther 2 and Star Trek. Then scientists created a computer program which could accurately guess what the monitored person was looking at.
In another round of testing, participants watched Hollywood film trailers, and the computer produced an approximate version of what they were watching: it scanned a library of random YouTube videos, found the clips most similar to the brain-scan interpretations, and blended them together.
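The matching-and-blending idea the article describes can be sketched in code. The snippet below is purely illustrative: the data is synthetic, and the shapes, variable names, and similarity metric (correlation between predicted and measured responses) are assumptions, not the researchers' actual pipeline.

```python
# Illustrative sketch only: synthetic data; names, shapes, and the
# similarity metric are assumptions, not the study's actual method.
import numpy as np

rng = np.random.default_rng(0)

n_voxels = 50          # simulated fMRI voxels
n_library = 200        # clips in a (here, synthetic) video library
frame_shape = (8, 8)   # tiny stand-in for real video frames

# Predicted brain response for each library clip (as an encoding model might give)
library_responses = rng.standard_normal((n_library, n_voxels))
# One grayscale frame per library clip (synthetic)
library_frames = rng.random((n_library, *frame_shape))

# Measured response while a subject watched an unknown clip
# (here, a noisy copy of library clip 17's predicted response)
measured = library_responses[17] + 0.3 * rng.standard_normal(n_voxels)

def reconstruct(measured, responses, frames, k=10):
    """Blend the frames of the k clips whose predicted responses
    best correlate with the measured brain response."""
    sims = np.array([np.corrcoef(measured, r)[0, 1] for r in responses])
    top = np.argsort(sims)[-k:]              # indices of best-matching clips
    weights = sims[top] / sims[top].sum()    # weight matches by similarity
    return np.tensordot(weights, frames[top], axes=1)

blurry_frame = reconstruct(measured, library_responses, library_frames)
print(blurry_frame.shape)  # (8, 8): a blurred average of the best matches
```

Averaging many roughly similar clips is also why such reconstructions come out blurry: the blend preserves shared structure but washes out detail.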
Prof. Jack Gallant, one of the study's authors, said: "We're trying to reconstruct the movie that was seen by searching through a large library of completely different, random movies... This is a major leap toward reconstructing internal imagery. We are opening a window into the movies in our minds."
The resulting video is blurry because the program's database holds just 18 million seconds of footage, but the researchers said expanding the program's video library could sharpen the reconstructions.
The study, published in the journal Current Biology, is said to be the first to successfully interpret brain signals evoked by moving pictures.