Turning conventional neuroscience on its head, new research suggests the human visual system processes sound and helps us see.
Here are the basics of what used to be Neuroscience 101: The auditory system records sound, while the visual system focuses, well, on the visuals, and never the twain shall meet. Instead, a "higher cognitive" producer, such as the brain's superior colliculus, uses these separate inputs to create our cinematic experiences.
The textbook rewrite: The brain can, if it must, directly use sound to see and light to hear.
The study was published last week in the journal BMC Neuroscience.
Monkey hear, monkey see
Researchers trained monkeys to locate a light flashed on a screen. When the light was very bright, the monkeys found it easily; when it was dim, it took them a long time. But if the dim light was accompanied by a brief sound, the monkeys found it in no time; too quickly, in fact, to be explained by the old theories.
Recording from 49 neurons responsible for the earliest stages of visual processing, the researchers found activation that mirrored the behavior. That is, when the sound was played, the neurons reacted as if there had been a stronger light, at a speed that can only be explained by a direct connection between the ear and eye regions of the brain, said researcher Ye Wang of the University of Texas in Houston.
The study presents the first evidence that a sensory cell can process an alternative sensation, said head researcher Pascal Barone of the Université Paul Sabatier in Toulouse, France, who discovered a contender for the anatomical connection in 2002.
The discovery likely explains the tremendously quick reactions of most animals, including humans, to stimuli that cue multiple senses, such as a rustling tiger or a honking bus.
Especially in the corners of the visual field, where eyesight is poor, the ears take up the slack and stimulate the visual system, Barone said.
An extra benefit, Wang explained, is the early visual system’s spatial precision, something higher brain regions fudge in favor of prioritizing our central gaze. By sending sound inputs directly to our image processor, the auditory system can avoid playing telephone with time-sensitive information.
The discovery is likely unrelated to the rare experience of synesthesia, a bizarre condition in which a few people can feel, hear and taste colors. In synesthesia, more complicated sensations combine at later stages of brain processing, so that the mere mention of a color, a letter or a shape can automatically trigger the perception of, say, a certain note.
What most excites Barone about the new findings is the potential for "cortical plasticity" in sensory areas.
For example, the blind, by definition, do not use the visual system to see. But they can, this research suggests, use it to hear. This may explain how blind people develop such advanced hearing skills and, similarly, why the deaf often possess superior sight, said Barone.
The primary visual system is also directly activated by touch, perhaps helping us slap that mosquito before it stings.
Robin Nixon is a former staff writer for Live Science. Robin graduated from Columbia University with a BA in Neuroscience and Behavior and pursued a PhD in Neural Science at New York University before shifting gears to travel and write. She worked in Indonesia, Cambodia, Jordan, Iraq and Sudan for companies doing development work before returning to the U.S. and taking journalism classes at Harvard. She worked as a health and science journalist covering breakthroughs in neuroscience, medicine and psychology for the lay public, and is the author of "Allergy-Free Kids: The Science-Based Approach to Preventing Food Allergies" (HarperCollins, 2017). She will attend the Yale Writer's Workshop in summer 2023.