Because sound travels much more slowly than light, we often see distant events before we hear them. That is why counting the seconds between a lightning flash and its thunder gives a rough estimate of how far away the strike is.
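That flash-to-bang arithmetic is simple: sound covers roughly 343 meters per second in air, so every three seconds of delay means about another kilometer (every five seconds, about another mile). Here is a minimal sketch of the calculation, assuming that speed of sound; the function name and the example delay are illustrative, not from the study:

```python
# Estimate the distance to a lightning strike from the flash-to-bang delay.
# Assumes sound travels ~343 m/s in air at 20 °C (an illustrative value).
SPEED_OF_SOUND_M_PER_S = 343.0

def strike_distance_m(delay_seconds: float) -> float:
    """Distance the thunder traveled during the delay, in meters."""
    return delay_seconds * SPEED_OF_SOUND_M_PER_S

# A 3-second delay puts the strike roughly 1 km away (~0.6 miles).
print(f"{strike_distance_m(3.0):.0f} m")  # -> 1029 m
```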

But new research from the University of Rochester reveals that our brains can also detect and process sound delays that are too short to notice consciously. The researchers found that we use even that unconscious information to fine-tune what our eyes see when estimating the distance to nearby events.

“Much of the world around us is audiovisual,” said Duje Tadin, associate professor of brain and cognitive sciences at the University of Rochester and senior author of the study. “Although humans are primarily visual creatures, our research shows that estimating relative distance is more precise when visual cues are supported with corresponding auditory signals. Our brains recognize those signals even when they are separated from visual cues by a time that is too brief to consciously notice.”

Tadin and his colleagues discovered that humans can unconsciously detect and make use of sound delays as short as 40 milliseconds (ms).
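For scale, 40 ms is the extra time sound needs to cross about 14 meters of air. Here is a minimal sketch of that conversion, again assuming 343 m/s; note that in the experiments the delays were imposed artificially rather than produced by physically distant sound sources:

```python
# Convert a sound delay into the equivalent extra travel distance.
# Assumes ~343 m/s for sound in air (illustrative; the study's delays
# were generated artificially, not by distant sound sources).
SPEED_OF_SOUND_M_PER_S = 343.0

def delay_to_distance_m(delay_ms: float) -> float:
    """Distance sound covers during a delay given in milliseconds."""
    return (delay_ms / 1000.0) * SPEED_OF_SOUND_M_PER_S

print(f"{delay_to_distance_m(40):.1f} m")  # -> 13.7 m
```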

“Our brains are very good at recognizing patterns that can help us,” said Phil Jaekl, who conducted the research while a postdoctoral researcher in Tadin’s lab. “Now we also know that humans can unconsciously recognize the link between sound delays and visual distance, and then combine that information in a useful way.”

For the study, published in PLOS ONE, the researchers used projected three-dimensional (3D) images to test the human brain’s ability to use sound delays to estimate the relative distance of objects.

In the first experiment, participants viewed two identical shapes through 3D glasses and adjusted their relative depth until both appeared to be the same distance away. Each shape was paired with an audible “click” that sounded either slightly before the shape appeared or an equally brief moment after it.

Participants consistently perceived the shape that was paired with the delayed click as being more distant. “This surprised us,” said Jaekl. “When the 3D shapes were the same distance, participants were consistently biased by the sound delay to judge the shape paired with the delayed click as being further away—even though it wasn’t.”

In the second experiment, participants were shown 3D shapes that were quickly repositioned either toward or away from them. When a shape was paired with a sound delayed by 42 ms, participants were more likely to perceive it as more distant, even when the object had actually shifted toward them. Most importantly, when an object that shifted away was paired with the sound delay (a pairing consistent with the natural world), participants judged relative distance with greater precision.

“It’s striking that this bias is unconscious—participants were unable to consciously detect when sound delays were present, yet the delays had a great influence over their perception of distance,” said Jaekl, who is currently conducting research at the University of Rochester Medical Center.

The study was supported by the National Eye Institute, part of the National Institutes of Health.