

Image caption: Colored stimuli in yellow (top) and blue (bottom). Light-luminance versions are on the left; dark versions on the right. Volunteers used a variety of names for the upper stimuli, such as “yellow” for the left and “brown” for the right, but consistently used “blue” for both the lower stimuli.

Researchers at the National Eye Institute (NEI) have decoded brain maps of human color perception. The findings, published today in Current Biology, open a window into how color processing is organized in the brain, and how the brain recognizes and groups colors in the environment. The study may have implications for the development of machine-brain interfaces for visual prosthetics. NEI is part of the National Institutes of Health.

“This is one of the first studies to determine what color a person is seeing based on direct measurements of brain activity,” said Bevil Conway, Ph.D., chief of NEI’s Unit on Sensation, Cognition and Action, who led the study. “The approach lets us get at fundamental questions of how we perceive, categorize, and understand color.”

The brain uses light signals detected by the retina’s cone photoreceptors as the building blocks for color perception. Three types of cone photoreceptors detect light over a range of wavelengths. The brain mixes and categorizes these signals to perceive color, in a process that is not well understood.
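
One classic account of that mixing is that downstream neurons compare cone signals in opponent pairs rather than reading any single cone type alone. The sketch below makes the “building blocks” idea concrete under illustrative assumptions only: the Gaussian sensitivity curves, peak wavelengths, and opponent weights are rough stand-ins, not the measured human cone fundamentals.

```python
import numpy as np

# Wavelength axis in nanometers, spanning the visible range.
wavelengths = np.arange(400, 701, dtype=float)

def toy_sensitivity(peak_nm, width_nm=40.0):
    """Toy cone sensitivity curve. Real cone fundamentals are measured
    empirically (e.g., Stockman & Sharpe) and are not Gaussian."""
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Approximate peaks of the short-, medium-, and long-wavelength cones.
cones = {"S": toy_sensitivity(420.0),
         "M": toy_sensitivity(530.0),
         "L": toy_sensitivity(560.0)}

def cone_responses(spectrum):
    """Each cone's response: the light spectrum weighted by that cone's
    sensitivity and summed across wavelengths."""
    return {name: float(np.sum(sens * spectrum))
            for name, sens in cones.items()}

# Example: a narrowband light centered at 470 nm (perceived as blue).
light = np.exp(-0.5 * ((wavelengths - 470.0) / 10.0) ** 2)
resp = cone_responses(light)

# Opponent mixing: compare cone signals instead of reading them singly.
red_green = resp["L"] - resp["M"]
blue_yellow = resp["S"] - 0.5 * (resp["L"] + resp["M"])
print(f"L={resp['L']:.1f} M={resp['M']:.1f} S={resp['S']:.1f}")
print(f"L-M={red_green:.1f}  S-(L+M)/2={blue_yellow:.1f}")
```

For the 470 nm light, the S cone responds most and the blue–yellow opponent channel comes out positive: it is this kind of combined signal, rather than any single cone’s output, that the brain’s color machinery works from.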

To examine this process, Isabelle Rosenthal, Katherine Hermann, and Shridhar Singh, post-baccalaureate fellows in Conway’s lab and co-first authors on the study, used magnetoencephalography, or “MEG,” a 50-year-old technology that noninvasively records the tiny magnetic fields that accompany brain activity. The technique provides a direct measurement of brain cell activity using an array of sensors around the head. It reveals the millisecond-by-millisecond changes that happen in the brain to enable vision.
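
The study’s own analysis pipeline is not described in this article, so the sketch below only illustrates the general logic of decoding color from such recordings: at each moment after a stimulus appears, a cross-validated classifier asks whether the spatial pattern across MEG sensors predicts which color was shown. The data here are synthetic, and the array shapes, labels, and scikit-learn classifier are illustrative assumptions, not the authors’ method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for MEG recordings: trials x sensors x time samples.
# (Real recordings would come from an MEG system, e.g., via MNE-Python.)
n_trials, n_sensors, n_times = 200, 100, 50
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = rng.integers(0, 2, n_trials)  # stimulus label per trial, e.g., 0=blue, 1=yellow

# Inject a weak label-dependent pattern in a late time window so the
# decoder has something to find (purely illustrative).
X[y == 1, :10, 30:] += 0.3

# Time-resolved decoding: at each time sample, measure how well the
# spatial pattern across sensors predicts which stimulus was shown.
accuracy = np.empty(n_times)
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print("peak decoding accuracy:", round(float(accuracy.max()), 2),
      "at sample", int(accuracy.argmax()))
```

In a scheme like this, above-chance accuracy at a given time point is the evidence that the brain’s activity at that moment carries information about the perceived color, millisecond by millisecond.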
