Imagine you are looking for your wallet on a cluttered desk. As you scan the area, you hold in your mind a mental picture of what your wallet looks like. MIT neuroscientists have now identified a brain region that stores this type of visual representation during a search. The researchers also found that this region sends signals to the parts of the brain that control eye movements, telling individuals where to look next.
This region, known as the ventral pre-arcuate (VPA), is critical for what the researchers call “feature attention,” which allows the brain to seek objects based on their specific properties. Most previous studies of how the brain pays attention have investigated a different type of attention known as spatial attention — that is, what happens when the brain focuses on a certain location.
“The way that people go about their lives most of the time, they don’t know where things are in advance. They’re paying attention to things based on their features,” says Robert Desimone, director of MIT’s McGovern Institute for Brain Research. “In the morning you’re trying to find your car keys so you can go to work. How do you do that? You don’t look at every pixel in your house. You have to use your knowledge of what your car keys look like.”
Desimone, also the Doris and Don Berkey Professor in MIT’s Department of Brain and Cognitive Sciences, is the senior author of a paper describing the findings in the Oct. 29 online edition of Neuron. The paper’s lead author is Narcisse Bichot, a research scientist at the McGovern Institute. Other authors are Matthew Heard, a former research technician, and Ellen DeGennaro, a graduate student in the Harvard-MIT Division of Health Sciences and Technology.