Mind Reading

  February 2nd, 2008  |  Published in Brain & Psychology, Technology, Weird Science



As pollsters have so well demonstrated this presidential primary season, reading minds, whether of voters or of the person next to you, is close to impossible. However, as this ScienCentral News video explains, scientists are one step closer to reading our thoughts.
Interviewees: Marcel Just and Tom Mitchell, Carnegie Mellon University
Length: 1 min 35 sec
Produced by Joyce Gramza
Edited by Brad Kloza
Copyright © ScienCentral, Inc.

Scanning Thoughts

Researchers using functional MRI have gone from studying people’s brains to identifying specific thoughts, allowing them to tell which of 10 similar objects a person is viewing or thinking about.

Functional MRI has shown a lot about how our brains respond to pleasure and rewards, and has revealed brain processes and areas involved in deception.

And neuroscientists have been wiring the brain’s motor neurons to enable paralyzed patients to control prosthetics, computers and robots.

But the new research is aimed at the biology underlying thoughts– or, as scientists call them, “cognitive processes.”

Carnegie Mellon cognitive psychologist Marcel Just teamed up with machine learning expert Tom Mitchell to conduct the research. They scanned the brains of people who looked at sets of images of similar objects– like 10 types of tools, or 10 types of homes.

The researchers excluded the vision area of the brain from the scans “because it’s almost too easy a target,” explains Just. “The visual cortex really contains a very faithful, accurate representation of a shape that you’re looking at– whatever is on your retina gets translated to your visual cortex at the back of your brain. And if you look for that pattern, that’s a lot easier, so we can be very accurate there.”

“But we wanted not the representation of the visual picture the person was looking at, we wanted the thought,” he says. “Not just the shape of a hammer, but what you use it for, and that’s not in the visual cortex. That’s why we wanted to see if we could find the patterns in the rest of the brain.”

In previous research, they’d been able to see differences in the scans when people were thinking about a hammer versus a house. But to distinguish scans of similar objects– like a hammer versus a screwdriver– they needed to write computer programs that could learn to recognize subtle patterns in three dimensions.

“We’re used to looking at two-dimensional photographs– as a person, it’s kind of difficult to even look at a three-dimensional picture, and that’s what our brains are, of course,” says Mitchell. “And so, when you look at these patterns as a person, first of all there are about 15 or 20,000 pixels in each one of these images, so it’s a lot of data to look at as a person to try to find these rather subtle patterns that distinguish hammer from screwdriver. And that’s one reason why computers are so critical to this task.”
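
For readers who want to see what that kind of data looks like in practice, here is a minimal sketch, not the researchers’ code, of how one three-dimensional scan becomes the long list of voxel values that pattern-finding software works with. The grid size and brain mask below are invented purely for illustration.

```python
# Minimal sketch: turning a 3D activation volume into a flat voxel pattern.
# Dimensions and the mask are illustrative assumptions, not the study's values.
import numpy as np

rng = np.random.default_rng(0)

# A pretend 3D scan: one activation value per voxel on a 50 x 60 x 40 grid.
volume = rng.normal(size=(50, 60, 40))

# A pretend brain mask keeping roughly 15,000 to 20,000 in-brain voxels,
# the "15 or 20,000 pixels" Mitchell describes.
mask = rng.random(size=volume.shape) < 0.15

# One flat feature vector per scan; classifiers compare these across images.
pattern = volume[mask]
print(pattern.shape)   # about (18000,)
```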

They used machine learning techniques similar to those used in other types of pattern recognition, including when you swipe your credit card to make a purchase, Mitchell says. “When you swipe your credit card through a transaction and they approve it, that approval is done by computers based on having analyzed many historical transactions and determining what’s common to the fraudulent ones versus the normal ones and using that pattern. We do the same thing with brain images.”
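
As a hedged illustration of that analogy (the article does not say which learning algorithm the team used), the sketch below trains an off-the-shelf scikit-learn classifier on synthetic, labeled voxel patterns and then asks it to label a new one.

```python
# Sketch of the "learn from labeled examples" idea, using synthetic data.
# GaussianNB is a stand-in classifier, not necessarily what the researchers used.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n_scans, n_voxels = 120, 500                 # toy sizes, far smaller than real fMRI
X = rng.normal(size=(n_scans, n_voxels))     # each row: one scan's voxel pattern
y = rng.integers(0, 10, size=n_scans)        # label: which of 10 objects was viewed

clf = GaussianNB().fit(X, y)                 # learn what is common to each object
new_scan = rng.normal(size=(1, n_voxels))
print(clf.predict(new_scan))                 # best guess among the 10 objects
```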

They reported in the journal PLoS One that once the program learned these subtle differences from a group of people’s brains, it could accurately tell which of ten similar objects a new volunteer was thinking of.
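
One way to picture that cross-person test is a leave-one-subject-out loop: train on everyone except one volunteer, then try to classify that held-out volunteer’s scans. The sketch below uses random stand-in data and an assumed classifier, so it only shows the bookkeeping, not the accuracy the team reported.

```python
# Leave-one-subject-out sketch with synthetic data; sizes and classifier are
# illustrative assumptions, not details taken from the PLoS One paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)
n_subjects, scans_per_subject, n_voxels = 12, 60, 500
X = rng.normal(size=(n_subjects * scans_per_subject, n_voxels))
y = rng.integers(0, 10, size=len(X))                     # 10 object classes
subject = np.repeat(np.arange(n_subjects), scans_per_subject)

accuracies = []
for held_out in range(n_subjects):
    train, test = subject != held_out, subject == held_out
    clf = GaussianNB().fit(X[train], y[train])           # train on the other people
    accuracies.append(clf.score(X[test], y[test]))       # test on the held-out person

print(np.mean(accuracies))   # chance is 0.10 on random data like this
```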

“The biggest news here is in the field of cognitive neuroscience,” Mitchell says. “Whereas people had previously known you could use MRI to distinguish the patterns of brain activity for broad categories of objects within a single person– so within a single person you could determine if somebody was looking at some kind of tool versus some kind of building– the news here is that we can go to a much finer grain distinction– we can distinguish hammer from screwdriver– and furthermore, we can do this across people.”

“Our brains have a similar enough organization that we can use the patterns of activation from my brain to try to decode yours,” says Mitchell.

“I think it’s the first time anybody’s really been able to read the content of the brain activity in a person’s brain, find out what the thought is that the activation signals,” adds Just. “The brain is doing something systematic– it’s coding thoughts. And this research for the first time breaks a little bit of that code. It can tell how the pattern corresponds to the thought of an object.”

Does that mean scientists may eventually be able to read our minds? In theory, eventually, yes, say the researchers. “I think our work presents a noninvasive way of reading people’s minds as they’re thinking of common objects,” Just says.

“We’re able to have the computer answer a 10-wise multiple choice test,” says Mitchell. “That’s still a far cry from being able to have the computer determine whatever you’re thinking.”

“I think we’re doing fabulously well for common objects,” agrees Just. “I really think there’s no common object we couldn’t eventually be able to find a representation of.

“We’re also looking forward to working on how ideas combine in the brain– so not just hammer, but hammering a nail, buying a hammer, and so on. So, a lot of the complexity of human thought comes about when you combine ideas. It’s not just one element, it’s putting elements together into propositions, sentences, structures,” Just adds. “We build our ideas out of simple elements, and we haven’t yet studied the building.”

“But then, beyond that,” says Mitchell, “there are words like love and hate, democracy and totalitarianism; those probably have a very different representation in the brain than these concrete objects that we manipulate as people. We don’t know how well we’ll do at that. We intend to find out– we’re now in the process of setting up some new studies where we will be collecting data on those kinds of words. And then, beyond nouns, there are questions of, well, what about the more complex compositions that we think about as people? You know, ‘the monkey flew over the lettuce.’ Given a sentence like that, what happens in your brain? Well, we don’t know either, but we intend to find out.”

Mitchell says that while it’s too expensive right now to “read” people’s thoughts in real time, speed is not an issue. “The speed of the processing for the computer is probably not going to prevent us from doing this in real time in the long term. Once the patterns are learned, it’s actually a fairly efficient operation. The computer can classify an image, given that image, in less than a second.”
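
To see why the prediction step itself is quick, here is a small timing sketch under the same assumptions as the earlier examples (synthetic data, a stand-in classifier, roughly 20,000 voxels per scan): the up-front learning takes time, but labeling one new scan does not.

```python
# Timing sketch: once a model is trained, classifying one scan is fast.
# Data, classifier, and sizes are illustrative assumptions.
import time
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)
X = rng.normal(size=(600, 20000))            # 600 training scans, ~20,000 voxels each
y = rng.integers(0, 10, size=600)
clf = GaussianNB().fit(X, y)                 # the slow, up-front learning step

start = time.perf_counter()
clf.predict(rng.normal(size=(1, 20000)))     # classifying one new scan
print(f"{time.perf_counter() - start:.4f} seconds")   # typically well under a second
```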

So, could these techniques lead to new ways to invade our privacy?

Just points out that while MRI scanning is noninvasive– no wires or implants required– “It’s, you know, I don’t know, like a three-ton machine,” he laughs. “And we’re not going to be doing this in shopping malls tomorrow.”

Still, they say it’s not too soon to start talking about the ethics of these technologies.

In the meantime, they point out, their aims are beneficial– decoding how our brains represent thoughts could also help study mental disorders like autism and schizophrenia.

“A person with paranoid schizophrenia– why are they frightened?” asks Just. “What are they concerned about? We might be able to tell what it is that concerns them. When a person with autism doesn’t develop sort of conventional friendships or relationships, we can ask how they view the people they interact with, we can see what the representation is like. Is there some element missing? Is there some new element present that isn’t normally there?

“Once you get to the point of analyzing how people think, what they’re thinking about, it gives us a tremendous advantage in many kinds of medical issues having to do with brain function.”

This research was published online in PLoS One, January 2008, and funded by the W.M. Keck Foundation and the National Science Foundation.




Responses

  1. Xylos says:

    November 24th, 2009 at 12:19 am (#)

    My brother and I can read each other’s thoughts. We are three years apart and have been separated for years at a time. It is pretty cool sometimes but other times it is annoying. If my brother gets a song stuck in his head or is listening to his mp3 player I can hear what song he is listening to even when I am 10 miles away.

    Some friends decided to test this out while on vacation. My brother was in Florida and I in Baltimore. He would start to listen to a song and I would immediately know what it was.
