Thriving on teamwork: New research shows how brain cells filter information in groups

When we perceive the world around us, some objects stand out more than others, depending on how we are viewing the scene. For example, when we view a forest-covered mountain from a distance, the forest looks like a large green carpet. But as we get closer, we begin to notice the individual trees, and the forest as a whole fades into the background. What happens in the brain as our experience changes so drastically?

For decades, scientists studying the visual system thought that individual brain cells, called neurons, operate as filters. Some neurons would prefer coarse details of the visual scene and ignore fine details, while others would do the opposite. Every neuron was thought to do its own filtering.
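To make the filter idea concrete, here is a minimal toy sketch in Python illustrating the textbook "one neuron, one filter" view. It is not code or data from the study; the tuning-curve shape and preferred frequencies are assumptions chosen only for illustration. Each model neuron responds most strongly to one spatial frequency, with low frequencies corresponding to coarse detail and high frequencies to fine detail.

```python
# Toy illustration (not from the study): each "neuron" responds most strongly
# to one spatial frequency -- low frequencies correspond to coarse detail,
# high frequencies to fine detail.
import numpy as np

def tuning_curve(spatial_freq, preferred_freq, bandwidth=1.0):
    """Response of a model neuron to a pattern of a given spatial frequency.
    Modeled as a Gaussian bump on a log-frequency axis (a common textbook choice)."""
    return np.exp(-(np.log2(spatial_freq / preferred_freq) ** 2) / (2 * bandwidth ** 2))

freqs = np.logspace(-1, 1, 9)                               # coarse (0.1) to fine (10)
coarse_neuron = tuning_curve(freqs, preferred_freq=0.5)     # prefers coarse detail
fine_neuron = tuning_curve(freqs, preferred_freq=4.0)       # prefers fine detail

for f, c, fi in zip(freqs, coarse_neuron, fine_neuron):
    print(f"freq {f:5.2f}  coarse-preferring {c:.2f}  fine-preferring {fi:.2f}")
```

In this classical picture, each neuron's preference is fixed; the new study asks whether that assumption holds.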

A new study led by Salk Institute researchers challenges this view. The study revealed that the same neurons that prefer coarse details could change to prefer finer details under different conditions. The work, which appeared in the journal Neuron on December 31, 2018, could lead to a better understanding of the neural mechanisms that shape our perception of the world.

“We were trying to look beneath the hood and figure out how these filters work,” says Professor Thomas Albright, director of Salk’s Center for Neurobiology of Vision and a senior author of the study.

“The selectivity of neurons was thought to be stable, but our work has shown that the filtering properties of neurons are much more flexible than was previously thought,” adds study first author Ambarish Pawar, a postdoctoral researcher at Salk.

The team focused on neurons in the visual cortex in an animal model. Animals were shown optical patterns in which the researchers varied the contrast between dark and light areas and measured the neurons’ preferences for coarse and fine details. The goal was to see how neurons process these patterns, specifically in the brain’s middle temporal area within the visual cortex. The scientists expected to find that the neurons were strictly “tuned” to perceive either coarse or fine details, but not both. What they found instead was that an individual neuron could filter both fine and coarse detail, depending on the contrast of the pattern.

By measuring the firing rates of multiple neurons activated by the optical stimuli, the researchers showed that such flexibility was more likely if entire networks of neurons acted as filters rather than individual neurons.

“Our results suggest that the previously common description of individual neurons as filters was incorrect,” says Sergei Gepshtein, a scientist with the Center for Neurobiology of Vision at Salk and co-author of the new study.

“The preference of neurons may shift due to a change in the balance of positive (excitatory) signals and negative (inhibitory) signals by which neurons communicate in the network,” adds Pawar.
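One rough way to picture this idea is a toy model in which the network’s output is pooled excitation minus pooled inhibition, and the strength of inhibition changes with the contrast of the stimulus. The sketch below is only an illustration of that general principle: the curve shapes, weights, and the direction of the shift are assumptions, not the study’s actual model or measurements.

```python
# Toy network model (illustrative assumptions, not the study's actual model):
# the network's response is pooled excitation minus pooled inhibition.
# If inhibition is biased toward coarser detail and strengthens with contrast,
# the spatial frequency that drives the network best shifts as contrast changes.
import numpy as np

def gauss_log(freq, peak, bw):
    """Gaussian tuning on a log-frequency axis."""
    return np.exp(-(np.log2(freq / peak) ** 2) / (2 * bw ** 2))

def network_response(freq, contrast):
    excitation = gauss_log(freq, peak=1.0, bw=1.5)   # broad excitatory pool
    inhibition = gauss_log(freq, peak=0.5, bw=1.0)   # inhibition biased to coarse detail
    inh_weight = 0.8 * contrast                      # inhibition grows with contrast (assumption)
    return np.clip(excitation - inh_weight * inhibition, 0, None)

freqs = np.logspace(-1, 1, 201)
for contrast in (0.1, 0.5, 1.0):
    responses = network_response(freqs, contrast)
    best = freqs[np.argmax(responses)]
    print(f"contrast {contrast:.1f} -> preferred spatial frequency {best:.2f} (higher = finer)")
```

Running this sketch shows the pooled response peaking at a different spatial frequency at each contrast level, even though no individual tuning curve in the model changes shape; the shift comes entirely from the changing balance of excitation and inhibition.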

The researchers showed that teaming up endows networks of neurons with a high degree of flexibility in their preferences, allowing them to easily adapt and tune the brain to changing conditions, just as you might tune a radio to get good reception as you drive.

“We’ve uncovered a new dimension of adaptability of cortical networks,” says Gepshtein. “Our results made it clear that to understand that adaptability we have to rethink what the computing units of the brain are. It is the team of connected neurons — the malleable neural network — that is more suited as such a unit rather than an individual neuron.”

“This unexpected finding could help us shed light on the neural mechanisms that underlie the brain’s enormous adaptability to a continuously changing environment,” says Pawar.

Albright adds that, “even though the study centered on the visual system, this same flexible quality of neural networks is likely to hold true for other parts of the brain.”

Now that they’ve seen the adaptable neuronal networks in action, the researchers next plan to study how changes in these networks affect behavior.

The work was funded by the National Institutes of Health’s National Eye Institute (NEI; R01 EY018613), an NEI Core Grant for Vision Research (P30 EY019005), the GemCon Family Foundation and Conrad T. Prebys.
