
Study reveals how the brain can recognize objects in images without color

The research suggests that early developmental limitations in color perception play a crucial role in this ability.

Scientist Pawan Sinha. Image: news.mit.edu

Indian-origin researcher Pawan Sinha and his team at the Massachusetts Institute of Technology (MIT) have revealed how the human brain can effectively recognize objects in black-and-white images, despite its sophisticated machinery for processing color. 

The research, published in Science, suggests that early developmental limitations in color perception play a crucial role in this ability. It combined experimental data and computational modeling to explore how the brain learns to identify objects.

The researchers discovered that early in life, when newborns have limited color information, the brain is forced to rely on luminance, or the intensity of light reflected or emitted by objects, rather than color. This developmental stage, characterized by poor visual acuity and poor color vision due to underdeveloped retinal cone cells, aids the brain in learning to recognize objects based on their luminance.
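The article does not detail the team's computational model, but the idea of stripping color and keeping only luminance can be illustrated with a minimal sketch. The snippet below converts an RGB image to a single luminance channel using the common Rec. 601 weights; the weights, function name, and random test image are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Rec. 601 weights, a common convention for converting RGB to luminance.
# This is an illustrative choice, not a detail from the MIT study.
REC601_WEIGHTS = np.array([0.299, 0.587, 0.114])

def rgb_to_luminance(image: np.ndarray) -> np.ndarray:
    """Collapse an (H, W, 3) RGB image into an (H, W) luminance map.

    All color information is discarded; only the intensity of light at
    each pixel remains, roughly the kind of signal available before
    color vision matures.
    """
    return image @ REC601_WEIGHTS

# Illustrative usage with a random 4x4 "image".
rgb = np.random.rand(4, 4, 3)
luma = rgb_to_luminance(rgb)
print(luma.shape)  # (4, 4): one brightness value per pixel, no color
```

In the spirit of the study's modeling, such luminance-only images would stand in for the visual input available early in development, before color vision matures.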

The study involved presenting children with both color and black-and-white images. Children with normal vision had no difficulty recognizing objects in grayscale, whereas children who had undergone cataract-removal surgery showed a significant drop in performance on black-and-white images. This suggests that early exposure to limited color information helps the brain become resilient to the loss of color, aiding object recognition.

“These findings highlight the importance of early perceptual limitations in shaping the brain’s ability to recognize objects, transcending beyond just color vision to other sensory systems,” said Sinha, professor of Brain and Cognitive Sciences at MIT.

The findings also indicate that early sensory input limitations might benefit other aspects of vision and the auditory system. For example, Sinha’s lab has previously shown that early exposure to low-frequency sounds can enhance auditory tasks requiring long-term sound analysis, such as recognizing emotions.

The study's significance lies in advancing the understanding of visual perception and brain development. It was funded by the National Eye Institute of the National Institutes of Health (NIH) and the Intelligence Advanced Research Projects Activity.


 
