Ear sounds reveal eye movements

Summary: Researchers have discovered that the ear makes subtle sounds in response to eye movements, allowing them to determine where a person is looking.

The study shows that these ear sounds, potentially caused by muscle contractions or hair cell activation, can reveal eye positions.

This discovery challenges existing beliefs about the function of the ear, suggesting that ear sounds may help synchronize the perception of sight and sound. The team’s innovative approach could lead to new clinical hearing tests and a deeper understanding of sensory integration.

Key facts:

  1. The research revealed that subtle ear sounds correspond to eye movements, providing insight into where a person is looking.
  2. This phenomenon is likely the result of the brain coordinating eye movements with ear muscle contractions or hair cell activation.
  3. The findings open possibilities for new clinical tests and a better understanding of how the brain integrates visual and auditory information.

Source: Duke University

Scientists can now determine where someone’s eyes are looking just by listening to their ears.

“You can actually estimate the movement of the eyes, and the position of the target that the eyes will look at, just from recordings made with a microphone in the ear canal,” said Jennifer Groh, Ph.D., senior author of the new report and a professor in the departments of psychology & neuroscience and neurobiology at Duke University.

One group of projects focuses on how eye and ear movement sounds differ in people with hearing or vision loss. Credit: Neuroscience News

In 2018, Groh’s team discovered that the ears emit a subtle, imperceptible noise when the eyes move. In a new report appearing the week of November 20 in the journal Proceedings of the National Academy of Sciences, the Duke team shows that these sounds can reveal where your eyes are looking.

It also works in reverse: just by knowing where someone was looking, Groh and her team were able to predict the waveform of the subtle ear sound.

Groh believes these sounds may occur when eye movements stimulate the brain to contract either the middle ear muscles, which normally help attenuate loud sounds, or the hair cells, which help amplify quiet sounds.

The exact purpose of this ear squeak is unclear, but her initial hunch is that it might help sharpen people’s perception.

“We think this is part of a system that allows the brain to tune in to where sights and sounds are, even though our eyes can move when our heads and ears don’t,” Groh said.

Understanding the relationship between subtle ear sounds and vision may lead to the development of new clinical tests of hearing.

“If each part of the ear contributes individual rules to the eardrum signal, it could be used as a kind of clinical tool to assess which part of the ear’s anatomy is malfunctioning,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology & neuroscience at Duke University.

Just as the pupil of the eye contracts or dilates like the aperture of a camera to adjust the amount of light that enters, the ears also have their own way of regulating hearing. Scientists have long believed that these sound-regulating mechanisms only help amplify quiet sounds or dampen loud sounds.

But in 2018, Groh and her team discovered that the same sound regulation mechanisms are also activated by eye movements, suggesting that the brain tells the ears about eye movements.

In their latest study, the research team followed up on their initial discovery and investigated whether faint auditory signals contain detailed information about eye movements.

To decode people’s ear sounds, Groh’s team at Duke and Professor Christopher Shera, Ph.D., of the University of Southern California recruited 16 adults with intact vision and hearing to Groh’s lab in Durham for a fairly simple eye test.

Participants looked at a stationary green dot on a computer screen and then, without moving their heads, tracked the dot with their eyes as it disappeared and reappeared either up, down, left, right, or diagonally from the starting point. This gave Groh’s team a wide range of auditory signals generated as the eyes moved horizontally, vertically, or diagonally.

An eye tracker recorded where the participants’ pupils were pointing for comparison with the ear sounds, which were captured using a pair of earbuds with built-in microphones.

The research team analyzed the ear sounds and found unique signatures for different directions of movement. This enabled them to decode the ear sounds and calculate where people were looking just by examining the sound waves.

“Since a diagonal eye movement is just a horizontal component and a vertical component, my labmate and co-author David Murphy realized that you could take those two components and guess what they would look like if you put them together,” Lovich said.

“Then you can go in the opposite direction and look at an ear sound to predict that someone was looking 30 degrees to the left.”
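Lovich’s description amounts to treating an ear sound as a weighted sum of a horizontal and a vertical component. The Python sketch below is purely illustrative, not the study’s actual analysis: the sinusoidal templates, the 30 Hz frequency, and the least-squares inversion are all assumptions made for demonstration. It superposes two hypothetical component waveforms to predict the ear sound for a diagonal saccade, then inverts that model to recover a gaze angle from a noisy recording.

```python
import numpy as np

# Hypothetical templates: ear-canal waveforms for a pure horizontal
# and a pure vertical saccade of a reference amplitude (synthetic here).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.05, 500)            # 50 ms recording window
horizontal = np.sin(2 * np.pi * 30 * t)    # stand-in horizontal template
vertical = np.cos(2 * np.pi * 30 * t)      # stand-in vertical template

def predict_waveform(angle_deg: float) -> np.ndarray:
    """Forward model: superpose the templates, weighted by gaze components."""
    theta = np.radians(angle_deg)
    return np.cos(theta) * horizontal + np.sin(theta) * vertical

def estimate_angle(waveform: np.ndarray) -> float:
    """Inverse model: least-squares fit of the two component weights."""
    basis = np.column_stack([horizontal, vertical])
    (wx, wy), *_ = np.linalg.lstsq(basis, waveform, rcond=None)
    return np.degrees(np.arctan2(wy, wx))

# Predict the ear sound for a 30-degree diagonal saccade, add noise to
# mimic a real recording, then recover the gaze angle from the waveform.
recorded = predict_waveform(30.0) + 0.05 * rng.standard_normal(t.size)
print(f"estimated gaze angle: {estimate_angle(recorded):.1f} degrees")
```

In practice the component templates would have to come from averaged recordings of purely horizontal and purely vertical eye movements; the synthetic sinusoids here simply make the superposition and its inversion easy to see.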

Groh’s group is now setting out to examine whether these ear sounds play a role in perception.

One group of projects focuses on how eye and ear movement sounds differ in people with hearing or vision loss.

Groh is also testing whether people without hearing or vision loss generate ear signals that can predict how well they will do at a sound-localization task, such as locating an ambulance while driving, which relies on mapping auditory information onto a visual scene.

“Some people have a signal that is really repeatable day after day, and you can measure it quickly,” Groh said. “You would expect these people to be really good at visual and auditory tasks compared to other people, where it’s more varied.”

Funding: Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).

About this visual and auditory neuroscience research news

Author: Dan Vahaba
Source: Duke University
Contact: Dan Vahaba – Duke University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Parametric information about eye movements is sent to the ears” by Jennifer Groh et al. PNAS


Abstract

Parametric information about eye movements is sent to the ears

When the eyes move, the alignment between visual and auditory scenes changes. We are not aware of these transformations, which suggests that the brain must integrate precise information about eye movements into auditory and visual processing.

Here we show that small sounds generated by the brain within the ear contain precise information about contemporaneous eye movements in the spatial domain: the direction and amplitude of eye movements can be inferred from these small sounds.

The underlying mechanism(s) likely involve the ear’s various motor structures and could facilitate the translation of incoming auditory signals into a frame of reference anchored to the direction of the eyes and thus the visual scene.
