Big Brother is watching you. As cameras and information tracking become more ubiquitous in society, our definitions and expectations of privacy must necessarily adjust. I mean, you could just drop off the grid, but then you wouldn’t get Facebook ads for that callus-removal device you so desperately need. How did they figure that out, anyway? It may seem eerie and borderline telepathic, but the intrusive suggestions would probably stop if you could keep from Googling “ugly foot skin” every day (it’s okay; no judgment here). But hell, then maybe they’ll just read your mind for real! The possible application of functional magnetic resonance imaging (fMRI) to consumer research has been investigated since at least 2007, and a flurry of recent studies on the emerging method have shown its powerful and perhaps frightening ability to peer into our minds and deduce our thoughts. Has technology breached the final Orwell firewall, or can we tune out the transcranial surveillance?
Let’s first figure out what fMRI is and what it actually does. Magnetic resonance imaging, a technique made possible by the magnetic spin of the protons in our bodies’ hydrogen atoms, is typically used to identify abnormal tissue (such as tumors) and holds great advantages over X-rays and CT scans in both resolution and safety. While traditional MRI can be focused on any particular part of the body, the “functional” version is trained solely on the brain, and is so named because it uses observations of blood flow therein to determine which parts are active at a given time, although that connection isn’t exactly straightforward. Still, there are some startling, tangible test successes that are hard to argue away.
In 2008, Jack Gallant and his team from the University of California at Berkeley mapped the brains of subjects while they viewed random images from a set database. Armed with this blueprint, computer programs were then tasked with matching the picture to the neuronal activity when the subjects were shown the photos again. The program didn’t always pick right, but it could get at least the basic structure down and choose similar photos.
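The core trick here is pattern matching: record what each image does to the brain, then later ask which stored pattern best fits a fresh scan. The sketch below is a deliberately toy illustration of that idea using synthetic random vectors; the names, the noise level, and the correlation-based matching are all my assumptions, not the Gallant lab's actual model, which predicts voxel responses from visual features of the images.

```python
# Toy sketch of pattern-matching decoding (all data synthetic;
# the real study models voxel responses to image features).
import numpy as np

rng = np.random.default_rng(0)
n_images, n_voxels = 5, 100

# "Training": store an average brain-activity pattern per image.
patterns = rng.normal(size=(n_images, n_voxels))

# "Test": the subject views image 3 again; the new scan is that
# stored pattern plus measurement noise.
viewed = 3
scan = patterns[viewed] + 0.5 * rng.normal(size=n_voxels)

# Decode: pick the stored pattern most correlated with the scan.
correlations = [np.corrcoef(scan, p)[0, 1] for p in patterns]
guess = int(np.argmax(correlations))
print(guess)
```

With noise this mild the decoder recovers the viewed image; crank up the noise (or shrink the voxel count) and it starts choosing look-alike patterns instead, which is roughly why the real program "didn't always pick right" but landed on similar photos.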
A similar study by the same group in 2011, this time with video, took it a step further. After building a brain “dictionary” with the help of hours of clips, the computer model was told to reconstruct from scratch what the subjects saw as they were shown random, never-before-used YouTube videos. Check out the results for yourself and see how well they did. If that’s not invasive enough for you, an experiment from late last year even applied the same kind of technique to reading people’s dreams.
If advertisers try to utilize fMRI, can the long arm of the law be far behind? In a paper published this month, Anthony Wagner and his team describe how they used digital cameras to take 45,000 pictures of a person’s life over a several-week period. When participants viewed their own photos and those of the other participants under the watchful eye of the fMRI, researchers were able to tell whether the person remembered the image 91% of the time, effectively confirming what they had done in the past. Could such a procedure be used in criminal cases to place someone at the scene of a crime, or to show a suspect has knowledge only the particular perpetrator could have?
WHAT DOES THIS MEAN?
Before you break out the tinfoil hat and line your living room with lead, look at what all those “mind-reading” examples have in common. The people who had their thoughts pinched had to first submit to having their brains observed during a certain activity so that the computer programs would know what to look for in the future. You can’t zap the memories out of someone’s head without them first lying still in a big metal tube for long periods of time. That’s a little easier to avoid than Facebook.
But what if the court orders it! Present-day polygraphs are inadmissible evidence in the U.S., and the forecast for fMRI lie-detection doesn’t seem any better. Anthony Wagner’s further studies show that if a person tries to fool the system by intentionally thinking a familiar scene is foreign (and vice versa), an examiner is no more likely than chance to discern the truth. The Fifth Amendment really is inalienable!
While we continue to see that fMRI technology can be used to guess general conditions, as with a new study that allows observers to know when a person is in pain, your particular thoughts and memories remain off limits for the time being. Unless, of course, you just can’t resist sticking your head in that tube.