Posted on 03/17/2025 5:43:20 AM PDT by Red Badger
A new study suggests that the brain helps regulate the ear’s sensitivity to sound by sending signals to the cochlea, potentially leading to treatments for conditions like tinnitus and hyperacusis. Using advanced imaging technology, researchers observed real-time cochlear activity in awake mice, revealing how the brain compensates for hearing loss. Researchers from the Keck School of Medicine of USC, in partnership with Baylor College of Medicine in Houston, Texas, utilized a cutting-edge imaging tool to study the mouse inner ear, leading to a discovery that may aid in the treatment of hearing disorders.
A recent study published in the Journal of Neuroscience suggests that the brain may help regulate the ear’s sensitivity to sound and compensate for hearing loss by sending signals to the cochlea, a structure in the inner ear. This discovery could pave the way for new treatments for challenging hearing disorders such as hyperacusis, where everyday sounds become uncomfortably loud, and tinnitus, a condition characterized by ringing, buzzing, or other phantom sounds in the absence of an external source.
The study was made possible by a groundbreaking imaging tool that enabled researchers to capture real-time images of the cochlea in awake animals for the first time.
The cochlea uses sensory hair cells to detect sound waves in the air, then converts them into electrical signals that the brain can process. Most cochlear nerve fibers carry information from the cochlea to the brain, but about 5% send signals in the opposite direction: from the brain to the cochlea. The exact role of those fibers has been a mystery, because researchers have struggled to measure cochlear activity in humans or animals while they are awake.
To change that, researchers from the Keck School of Medicine of USC, in collaboration with Baylor College of Medicine in Houston, Texas, have developed a new way to look at activity in the inner ear by adapting an imaging technique called optical coherence tomography (OCT), which is widely used in ophthalmology offices to scan the retina for conditions like glaucoma and macular degeneration. OCT uses light waves to scan tissue and create a 3D image, similar to the way ultrasound creates images from sound waves. Using this approach, the researchers captured real-time images of the cochlea in action.
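For readers curious about the signal processing behind OCT, here is a purely illustrative sketch of the core idea in a spectral/Fourier-domain design: the interference spectrum recorded across wavenumbers is Fourier-transformed, and each reflector in the tissue shows up as a peak whose position encodes its depth. This is not the authors' pipeline, and every parameter value below is hypothetical.

import numpy as np

# Illustrative spectral-domain OCT reconstruction (hypothetical numbers, not the
# authors' pipeline). A reflector at depth z adds a fringe cos(2*k*z) to the
# interference spectrum sampled over wavenumber k; a Fourier transform over k
# turns each fringe into a peak whose position encodes the reflector's depth.
n_samples = 2048
k = np.linspace(7.5e6, 8.5e6, n_samples)      # wavenumber sweep (rad/m), hypothetical
depths = np.array([150e-6, 320e-6])           # two hypothetical reflectors (meters)
reflectivities = np.array([1.0, 0.4])

spectrum = np.zeros(n_samples)
for z, r in zip(depths, reflectivities):
    spectrum += r * np.cos(2 * k * z)         # interference fringes

# Window and transform over wavenumber to get the depth profile (A-scan)
a_scan = np.abs(np.fft.rfft(spectrum * np.hanning(n_samples)))

dz = np.pi / (k[-1] - k[0])                   # approximate depth per FFT bin
depth_axis = np.arange(a_scan.size) * dz

# Report local maxima above 10% of the strongest peak
inner = a_scan[1:-1]
is_peak = (inner > a_scan[:-2]) & (inner > a_scan[2:]) & (inner > 0.1 * a_scan.max())
for idx in np.where(is_peak)[0] + 1:
    print(f"reflector peak near {depth_axis[idx] * 1e6:.0f} micrometers")

Running this recovers peaks near the two simulated depths, which is the same principle that lets an OCT scan build a 3D picture of tissue layers, whether in the retina or, here, the cochlea.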
“OCT lets us look down the ear canal, through the eardrum and bone into the cochlea, and measure how it’s working—noninvasively and without pain,” said John Oghalai, MD, professor and chair of otolaryngology-head and neck surgery and the Leon J. Tiber and David S. Alpert Chair in Medicine at the Keck School of Medicine. “What’s exciting about this is it lets us study how the brain is controlling the cochlea in real time.”
Using this tool, Oghalai and his team found that in healthy mice, cochlear activity does not change over the short term. But in mice with genetic hearing loss, cochlear function increased, indicating that the brain was enhancing the cochlea’s sensitivity in response to long-term hearing loss. The study was co-led by Patricia Quiñones, a research associate in Oghalai’s lab; Brian E. Applegate, professor of otolaryngology-head and neck surgery at the Keck School of Medicine; and Matthew J. McGinley, assistant professor at Baylor College of Medicine.
Measuring cochlear function
A leading theory about the nerves that send signals from the brain to the cochlea (known as “efferent” fibers) is that they control the cochlea’s response to sound on a short-term basis, similar to the way our pupils work. Bright light makes the pupils constrict, while stress causes them to dilate. Could the cochlea be acting in a similar way?
To explore whether the cochlea responds to short-term stimuli, the researchers measured cochlear activity in mice using OCT. At the same time, they tracked the shifting brain states of the mice by measuring changes in pupil size. As brain states changed, cochlear activity stayed the same, suggesting that the inner ear does not modulate hearing on a short-term basis.
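As an illustration of the kind of comparison described, here is a minimal sketch that correlates a pupil-diameter trace (the brain-state proxy) with a cochlear vibration-amplitude trace. It is not the authors' analysis code; the traces, sampling rate, and variable names are all made up for illustration.

import numpy as np

# Hypothetical, time-aligned traces standing in for the measurements described:
# pupil diameter (a proxy for brain state) and OCT-measured cochlear vibration
# amplitude, sampled at matching timestamps. Values are invented for illustration.
rng = np.random.default_rng(0)
n = 600                                                   # e.g., 10 minutes at 1 Hz
pupil_diameter = (2.0 + 0.5 * np.sin(np.linspace(0.0, 20.0, n))
                  + 0.1 * rng.standard_normal(n))         # fluctuating brain state
cochlear_amplitude = 1.0 + 0.05 * rng.standard_normal(n)  # flat, per the finding

# Pearson correlation: a value near zero is consistent with the report that
# cochlear activity did not track short-term brain-state changes.
r = np.corrcoef(pupil_diameter, cochlear_amplitude)[0, 1]
print(f"pupil vs. cochlear amplitude correlation: r = {r:.3f}")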
Next, the researchers genetically altered mice to disable the nerves carrying information from the inner ear to the brain (“afferent” fibers), causing hearing loss. Using OCT, they found that the cochlea was working overtime to compensate.
“As humans age and our hair cells die off, we start to lose our hearing. These findings suggest that the brain can send signals to the remaining hair cells, essentially telling them to turn up the volume,” said Oghalai, who is also a professor of biomedical engineering at the USC Viterbi School of Engineering.
The next step is a clinical trial to test drugs that block efferent fibers, which could lower the volume for patients with hyperacusis and may also help address tinnitus.
Improving diagnosis
OCT also holds promise for improving the diagnosis and treatment of hearing disorders. Now that Oghalai’s team has adapted OCT for cochlear imaging in awake mice, they are testing a version of the tool for patients in a new NIH-funded study.
The technology could ultimately allow providers to diagnose hearing problems based on physiology, not just performance on a hearing exam, and to tailor treatments to individual needs.
“This is the first step toward a tool that lets us look into a patient’s ear, find out what the problem is and treat it,” Oghalai said.
Reference:
“The medial olivocochlear efferent pathway potentiates cochlear amplification in response to hearing loss”
by Patricia M. Quiñones, Michelle Pei, Hemant Srivastava, Ariadna Cobo-Cuan, Marcela A. Morán, Bong Jik Kim, Clayton B. Walker, Michael J. Serafino, Frank Macias-Escriva, Juemei Wang, James B. Dewey, Brian E. Applegate, Matthew J. McGinley and John S. Oghalai, 20 February 2025, Journal of Neuroscience.
DOI: 10.1523/JNEUROSCI.2103-24.2025
This work was supported by the National Institute on Deafness and Other Communication Disorders [R01 DC014450, R01 DC013774, R01 DC017741, R25 DC019700, R21 DC019209, R01 DC017797]; the National Institute of Biomedical Imaging and Bioengineering [R01 EB027113]; and the Keck School of Medicine Dean’s Research Scholar Program.
Disclosure: John Oghalai and Brian Applegate are founders of AO technologies, with the goal of translating inner ear imaging technologies for clinical purposes.
TINNITUS RING LIST!.....................
Eh?
It would be great not to hear this crap.
I have an Osia Baha magnetic hearing aid on the back right side of my head. My inner ear has been roto-rootered over the years, but the nerve is in good shape. Helps a lot, and it’s Bluetooth enabled, so I have a microphone tied to our TV and I can stream movies on my phone without anyone else hearing it.
WHAT???
“It would be great not to hear this crap.”
Are you referring to the MSM?
“...It would be great not to hear this crap....”
Fully agree.
I’m 74 and have had tinnitus in both ears since I was a teenager.
I cannot even begin to imagine what it’d be like with it gone.
My theory is that our brains inject biasing frequencies into hearing nerves to make electrons start moving .....alternating currents. Tape recorders work this way. Our brains place arbitrary hearing perception thresholds just above the biasing volume. When our attempts to hear sounds are forced down into the biasing current, the biasing frequency fills our hearing. Hearing aids help because they coerce our brains to raise our perception threshold to a level well above the biasing current.
There is a great need in DC for NEW BRAINS!
The only sounds I hear clearly come from the ringing in my ears. It won’t go away, so I try to ignore it.
Just sent the link to a relative who has been suffering from it since the Covid vax...has not found any help whatsoever.
FTA:
“Oghalai’s team has adapted OCT for cochlear imaging in awake mice...”
They have “woke” mice?
Yes, they are the leftovers from the Trans-Mice experiments....................
Same here, but it gets to very loud levels in the evening.
One of my ears has had little sound for many decades. I’ve been asking doctors to fix it since the day my ex left.
“Are you referring to the MSM?”
No tinnitus. On the other hand...........
” inject biasing frequencies into hearing nerves to make electrons start moving .....alternating currents. Tape recorders work this way. “
A man with knowledge you are. Still got a Revox A77 here, got rid of huge Ampex 440C units 20 years ago. You increase the bias until you see a perceivable hi-freq peak and the drop starting? How did you do it?