In the 1983 science-fiction movie classic Brainstorm, a team of scientists invents a helmet that allows sensations and emotions to be recorded from a person's brain and converted to tape so that others can experience them.
While this seemed quite unbelievable thirty years ago, it now appears that scientists at the University of California-Berkeley are bringing these futuristic ideas a little closer to reality!
As far-fetched as it might sound, the university team in Professor Jack Gallant's laboratory has developed a system that uses functional magnetic resonance imaging (fMRI) and computational algorithms to "decode" and then "reconstruct" visual experiences such as watching movies.
UC-Berkeley's Dr. Shinji Nishimoto and two other research team members served as guinea pigs to test out the system, which required them to remain still inside an MRI scanner for hours at a time.
While they were in the scanner, they watched two separate sets of movie trailers as the fMRI system measured the blood flow in their occipitotemporal visual cortices. The brain images recorded by the scanner were divided into small three-dimensional sections, and a computer program then learned which visual patterns in the movies corresponded to particular patterns of brain activity.
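The training step described above can be illustrated with a toy sketch: learn a linear mapping from movie features to each brain section's measured response, so that activity can later be predicted for new clips. All names and data below are synthetic stand-ins, not the Gallant lab's actual features or code.

```python
import numpy as np

# Hypothetical stand-in for the training step: fit a linear mapping from
# movie features (e.g. outputs of visual filters) to each voxel's
# measured fMRI response. All data here is randomly generated.
rng = np.random.default_rng(1)
n_timepoints, n_features, n_voxels = 200, 30, 50

features = rng.standard_normal((n_timepoints, n_features))   # movie features over time
true_w = rng.standard_normal((n_features, n_voxels))         # unknown "ground truth"
responses = features @ true_w + 0.1 * rng.standard_normal((n_timepoints, n_voxels))

# Least-squares fit of feature weights for every voxel at once
weights, *_ = np.linalg.lstsq(features, responses, rcond=None)

# The fitted model can now predict brain activity for any new clip's features
predicted = features @ weights
print(predicted.shape)  # (200, 50)
```

In the real study the mapping was far more sophisticated, but the idea is the same: once the model can predict activity from video, it can be run in reverse to guess video from activity.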
Brain activity evoked by the second set of clips was then used to test a movie reconstruction algorithm developed by the researchers. They fed random YouTube videos into the computer program, which predicted the brain activity each one would evoke. The 100 clips whose predicted activity most closely matched the subject's measured activity were then merged to produce a continuous reconstruction of the original clips.
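The merging step can be sketched in a few lines: rank candidate clips by how well their predicted brain activity correlates with the measured activity, then average the frames of the top 100. This is a minimal illustrative sketch with synthetic data, not the researchers' actual pipeline.

```python
import numpy as np

def reconstruct(measured, predicted_per_clip, clip_frames, k=100):
    """Rank candidate clips by correlation between their predicted brain
    activity and the measured activity, then average the top-k clips' frames.
    All inputs are hypothetical stand-ins for the study's real data."""
    scores = [np.corrcoef(measured, p)[0, 1] for p in predicted_per_clip]
    top = np.argsort(scores)[::-1][:k]          # indices of the k best matches
    # Averaging many near-miss clips yields the characteristic blurry image
    return np.mean([clip_frames[i] for i in top], axis=0)

# Toy demo: 50 voxels, 500 candidate clips, 8x8 grayscale "frames"
rng = np.random.default_rng(0)
measured = rng.standard_normal(50)
predicted = rng.standard_normal((500, 50))      # predicted activity per clip
frames = rng.random((500, 8, 8))                # one frame per candidate clip
blurry = reconstruct(measured, predicted, frames, k=100)
print(blurry.shape)  # (8, 8)
```

Averaging is also why the published reconstructions look so smeared: no single YouTube clip matches the original, so the result is a blend of the closest approximations.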
The researchers' ideas might one day lead to the development of a system that could produce moving images that represent dreams and memories, too. If they do achieve that goal, however, I can only hope that the images are just as blurry as the ones that they have produced already. Anything sharper might be a little embarrassing!