
Tinnitus help in Devon and Somerset


Tinnitus & Earwax

Devon and Somerset earwax removal services

Researchers using functional MRI (fMRI) have found that neurofeedback training has the potential to reduce the severity of tinnitus, or even eliminate it, according to a study presented at the annual meeting of the Radiological Society of North America (RSNA) and announced on the society’s website.


The standard approach to fMRI neurofeedback.

Tinnitus is the perception of noise, often ringing, in the ear. The condition is very common, affecting approximately one in five people. As sufferers start to focus on it more, they become more frustrated and anxious, which in turn makes the noise seem worse. The primary auditory cortex, the part of the brain where auditory input is processed, has been implicated in tinnitus-related distress.

For the study, researchers looked at a novel potential way to treat tinnitus by having people use neurofeedback training to turn their focus away from the sounds in their ears. Neurofeedback is a way of training the brain by allowing an individual to view some type of external indicator of brain activity and attempt to exert control over it.

“The idea is that in people with tinnitus there is an over-attention drawn to the auditory cortex, making it more active than in a healthy person,” said Matthew S. Sherwood, PhD, research engineer and adjunct faculty in the Department of Biomedical, Industrial, and Human Factors Engineering at Wright State University in Fairborn, Ohio. “Our hope is that tinnitus sufferers could use neurofeedback to divert attention away from their tinnitus and possibly make it go away.”

Matthew S. Sherwood, PhD

To determine the potential efficacy of this approach, the researchers had 18 healthy volunteers with normal hearing undergo five fMRI-neurofeedback training sessions. Study participants were given earplugs through which white noise could be introduced for periods of time. The earplugs also served to block out the scanner noise.


Overview of the experimental design. Each participant completed 5 sessions.

To obtain fMRI results, the researchers used single-shot echo planar imaging, an MRI technique that is sensitive to blood oxygen levels, providing an indirect measure of brain activity.

“We started with alternating periods of sound and no sound in order to create a map of the brain and find areas that produced the highest activity during the sound phase,” Sherwood said. “Then we selected the voxels that were heavily activated when sound was being played.”
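
To make the voxel-selection step concrete, here is a minimal Python sketch of how a sound-versus-silence localizer contrast might be computed. All array sizes, signals, and the 100-voxel cutoff are invented for illustration; the article does not describe the study's actual analysis pipeline.

```python
import numpy as np

# Hypothetical localizer data: BOLD time series (volumes x voxels) and a
# block regressor that is 1 during "sound" volumes and 0 during silence.
rng = np.random.default_rng(0)
n_volumes, n_voxels = 200, 5000
bold = rng.standard_normal((n_volumes, n_voxels))
sound_on = np.tile(np.repeat([0.0, 1.0], 10), 10)  # alternating 10-volume blocks

# Correlate each voxel's time series with the sound regressor; strong
# positive correlation marks voxels driven by auditory stimulation.
centered = bold - bold.mean(axis=0)
regressor = sound_on - sound_on.mean()
r = (regressor @ centered) / (np.linalg.norm(regressor) * np.linalg.norm(centered, axis=0))

# Keep the most strongly activated voxels as the feedback region,
# analogous to selecting the "heavily activated" voxels described above.
roi_voxels = np.argsort(r)[-100:]
print(f"selected {roi_voxels.size} voxels, max r = {r.max():.2f}")
```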

The volunteers then participated in the fMRI-neurofeedback training phase while inside the MRI scanner. They received white noise through their earplugs and were able to view the activity in their primary auditory cortex as a bar on a screen. Each fMRI-neurofeedback training run contained eight blocks, each consisting of a 30-second “relax” period followed by a 30-second “lower” period. Participants were instructed to watch the bar during the relax period and to actively lower it by decreasing primary auditory cortex activity during the lower period.
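
The run structure described above translates naturally into a block schedule. The sketch below assumes a 2-second repetition time (TR) and a simple percent-signal-change rule for the on-screen bar; both are illustrative assumptions, not details reported by the researchers.

```python
# Eight blocks, each a 30 s "relax" period followed by a 30 s "lower" period.
TR = 2.0  # seconds per fMRI volume (assumed)
volumes_per_period = int(30 / TR)

schedule = []
for _ in range(8):
    schedule += ["relax"] * volumes_per_period + ["lower"] * volumes_per_period

def feedback_bar(roi_signal: float, baseline: float) -> float:
    """Map ROI activity to a bar height as percent signal change vs baseline."""
    return 100.0 * (roi_signal - baseline) / baseline

# Participants watch the bar during "relax" volumes and try to drive it
# down during "lower" volumes by reducing auditory cortex activity.
print(len(schedule), "volumes per run:", schedule[:2], "...", schedule[-2:])
```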

Neurofeedback training paradigm.

The researchers gave the participants techniques to help them do this, such as trying to divert attention from sound to other sensations like touch and sight.

“Many focused on breathing because it gave them a feeling of control,” Sherwood said. “By diverting their attention away from sound, the participants’ auditory cortex activity went down, and the signal we were measuring also went down.”

A control group of nine individuals was given sham neurofeedback: they performed the same tasks as the experimental group, but the feedback they viewed came from a random participant rather than from their own brain activity. By running the exact same procedures with both groups, using either real or sham neurofeedback, the researchers were able to isolate the effect of real neurofeedback on control of the primary auditory cortex.
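
The group comparison implied here boils down to averaging each participant's control score across sessions and testing the group difference. A minimal sketch with synthetic numbers (18 real-feedback and 9 sham participants, matching the group sizes above; the scores themselves are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical "A1 control" scores, one row per participant, one column
# per training session. Values are synthetic, for illustration only.
real = rng.normal(loc=1.0, scale=0.5, size=(18, 5))  # real neurofeedback group
sham = rng.normal(loc=0.2, scale=0.5, size=(9, 5))   # sham feedback group

# Average across the five sessions, then compare groups, mirroring the
# "averaged across training" comparison reported in the figure caption.
t, p = stats.ttest_ind(real.mean(axis=1), sham.mean(axis=1), equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```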

Control over the primary auditory cortex (A1 control) separated by group and session. The experimental group was found to have significantly higher control, averaged across training, than the control group.

Whole brain effects of neurofeedback training.

Effect of emotion on attention. Emotional distractors resulted in a significantly larger change in response latency in the experimental group when compared to the control group. However, the impact of emotion on attention was not found to change significantly between the groups across training.

Activation of the primary auditory cortex in response to binaural stimulation. Activation significantly decreased from session 1 to session 5.

Improvements in control over the primary auditory cortex were found to be significantly related to decreases in the effect of emotion on attention.

The study reportedly represents the first time fMRI-neurofeedback training has been applied to demonstrate that there is a significant relationship between control of the primary auditory cortex and attentional processes. This is important to therapeutic development, Sherwood said, as the neural mechanisms of tinnitus are unknown but likely related to attention.

The results represent a promising avenue of research that could lead to improvements in other areas of health like pain management, according to Sherwood.

“Ultimately, we’d like to take what we learned from MRI and develop a neurofeedback program that doesn’t require MRI to use, such as an app or home-based therapy that could apply to tinnitus and other conditions,” he said.

Co-authors are Emily E. Diller, MS; Subhashini Ganapathy, PhD; Jeremy Nelson, PhD; and Jason G. Parker, PhD. This material is based on research sponsored by the US Air Force under agreement number FA8650-16-2-6702. The views expressed are those of the authors and do not reflect the official views or policy of the Department of Defense and its Components. The US Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. The voluntary, fully informed consent of the subjects used in this research was obtained as required by 32 CFR 219 and DODI 3216.02_AFI 40-402.

Source: RSNA

Images: RSNA


Earwax removal in Somerset & Devon and hearing health


Researchers Find Increased Risk of Hearing Loss Among Smokers

By Somerset and Devon earwax removal centres


New research published in Nicotine & Tobacco Research has shown evidence that smoking is associated with hearing loss, according to a news release from the journal’s publisher, Oxford University Press.

The study—which included 50,000 participants over an 8-year period—looked at data from annual health checkups and examined the effects of smoking status, number of cigarettes smoked per day, and duration of smoking cessation on hearing loss, according to the release. Researchers calculated a 1.2- to 1.6-fold increased risk of hearing loss among smokers compared with those who had never smoked.
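
To make the reported range concrete, relative risk is simply the incidence of hearing loss among smokers divided by the incidence among never smokers. The counts below are invented purely to illustrate the arithmetic:

```python
# Relative risk = incidence among the exposed / incidence among the unexposed.
smoker_cases, smoker_total = 420, 10_000   # invented counts
never_cases, never_total = 300, 10_000     # invented counts

risk_smokers = smoker_cases / smoker_total
risk_never = never_cases / never_total
relative_risk = risk_smokers / risk_never
print(f"relative risk = {relative_risk:.2f}")  # 1.40, within the reported 1.2-1.6 range
```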

The risk of hearing loss decreased five years after smoking cessation.

For additional information, view the release on Science Daily’s website.

Original Paper: Hu H, Sasaki N, Ogasawara T, et al. Smoking, smoking cessation, and the risk of hearing loss: Japan epidemiology collaboration on occupational health study. Nicotine & Tobacco Research. March 14, 2018.

Source: Science Daily, Nicotine & Tobacco Research, Oxford University Press


Sonic Enchant Line Adds SoundClip-A to Stream Sounds in Stereo from Numerous Devices


Sonic SoundClip-A.

Sonic, Somerset, NJ, has expanded the wireless connectivity options of its Enchant hearing aids with the introduction of SoundClip-A. The easy-to-use, lightweight SoundClip-A allows users to stream stereo sound hands-free to both hearing aids from all Bluetooth® 2.1 smartphones and devices. Sonic Enchant already offers a range of premium features including natural sound, the ease and convenience of rechargeable batteries, and direct wireless streaming from an iPhone®.

Now, the small, ergonomically designed clip-on device delivers added benefit as a wireless remote/partner microphone for easier listening when the speaker is at a distance or in noisy environments where listening is difficult. SoundClip-A also enables remote volume control, program changes and call pick-up with just the press of a button.

Joseph A. Lugara

“SoundClip-A’s wireless transmission of stereo sound from all Bluetooth 2.1 smartphones and devices adds the ‘wow’ of even more wireless convenience to the many ways Enchant makes everyday sounds better,” said Sonic President & COO Joseph A. Lugara in a press statement. “With Enchant, wireless connectivity is simple and stress-free thanks to Enchant’s Dual-Radio System that delivers fast ear-to-ear connection and employs 2.4 GHz technology.”

Simply Streaming. SoundClip-A allows patients to use Enchant hearing aids as a headset for mobile calls. Users stream stereo-quality sound to both ears through their Enchant hearing aids from any Bluetooth 2.1 compatible device—including mobile phones, tablets, MP3 players, and more. The built-in microphones pick up the wearer’s voice, while sound from the call is streamed wirelessly to both ears for convenient, hands-free conversations.

When SoundClip-A is used as a wireless remote/partner microphone, the speaker simply clips on the lightweight device or keeps it nearby. The speaker’s voice can be heard more easily through the user’s Enchant hearing aids at a distance of up to 65 feet, according to the company. SoundClip-A also helps users enjoy video calls, webinars, and other audio sources for easy wireless listening in both ears.

For more information on SoundClip-A and the entire Enchant family, including Enchant100, Enchant80, and Enchant60, and popular styles including the miniRITE with ZPower, miniRITE T (with telecoil), and BTE 105, visit www.sonici.com.


Visual Cues May Help Amplify Sound, University College London Researchers Find


Looking at someone’s lips is good for listening in noisy environments because it helps our brains amplify the sounds we’re hearing in time with what we’re seeing, finds a new University College London (UCL)-led study, the school announced on its website.

The researchers say their findings, published in Neuron, could be relevant to people with hearing aids or cochlear implants, as they tend to struggle hearing conversations in noisy places like a pub or restaurant.

The researchers found that visual information is integrated with auditory information at an earlier, more basic level than previously believed, independent of any conscious or attention-driven processes. When information from the eyes and ears is temporally coherent, the auditory cortex—the part of the brain responsible for interpreting what we hear—boosts the relevant sounds that tie in with what we’re looking at.

“While the auditory cortex is focused on processing sounds, roughly a quarter of its neurons respond to light—we helped discover that a decade ago, and we’ve been trying to figure out why that’s the case ever since,” said the study’s lead author, Dr Jennifer Bizley, UCL Ear Institute.

In a 2015 study, she and her team found that people can pick apart two different sounds more easily if the one they’re trying to focus on happens in time with a visual cue. For this latest study, the researchers presented the same auditory and visual stimuli to ferrets while recording their neural activity. When one of the auditory streams changed in amplitude in conjunction with changes in luminance of the visual stimulus, more of the neurons in the auditory cortex reacted to that sound.
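
A temporally coherent audiovisual stimulus of the kind described can be sketched in a few lines: one slow envelope drives both the amplitude of an auditory noise stream and the luminance of a visual stimulus. The modulation rate, sample rates, and duration below are assumptions for illustration, not the study's parameters.

```python
import numpy as np

fs = 44_100        # audio sample rate in Hz (assumed)
duration = 5.0     # seconds (assumed)
t = np.arange(int(fs * duration)) / fs

rng = np.random.default_rng(2)
# A slow 3 Hz sinusoidal envelope with random phase.
envelope = 0.5 + 0.5 * np.sin(2 * np.pi * 3.0 * t + rng.uniform(0, 2 * np.pi))
audio = envelope * rng.standard_normal(t.size)   # amplitude-modulated noise

# Sample the same envelope at a 60 Hz frame rate for luminance, so loudness
# and brightness rise and fall together (the "temporally coherent" case).
frame_times = np.arange(0, duration, 1 / 60)
luminance = np.interp(frame_times, t, envelope)
print(audio.shape, luminance.shape)
```

An incoherent control condition would drive the luminance from an independent envelope, so only the coherent sound shares its timing with the visual stimulus.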

“Looking at someone when they’re speaking doesn’t just help us hear because of our ability to recognize lip movements—we’ve shown it’s beneficial at a lower level than that, as the timing of the movements aligned with the timing of the sounds tells our auditory neurons which sounds to represent more strongly. If you’re trying to pick someone’s voice out of background noise, that could be really helpful,” said Bizley.

The researchers say their findings could help develop training strategies for people with hearing loss, as they have had early success in helping people tap into their brain’s ability to link up sound and sight. The findings could also help hearing aid and cochlear implant manufacturers develop smarter ways to amplify sound by linking it to the person’s gaze direction.

The paper adds to evidence that people who are having trouble hearing should get their eyes tested as well.

The study was led by Bizley and PhD student Huriye Atilgan, UCL Ear Institute, alongside researchers from UCL, the University of Rochester, and the University of Washington, and was funded by Wellcome; the Royal Society; the Biotechnology and Biological Sciences Research Council (BBSRC); Action on Hearing Loss; the National Institutes of Health (NIH); and the Hearing Health Foundation.

Original Paper: Atilgan H, Town SM, Wood KC, et al. Integration of visual information in auditory cortex promotes auditory scene analysis through multisensory binding. Neuron. 2018;97(3)[February]:640–655.e4. doi.org/10.1016/j.neuron.2017.12.03

Source: University College London, Neuron

Hearing Protection for people in Somerset & Devon


GN Store Nord Develops Device to Protect Soldiers’ Hearing



GN Store Nord has announced a “first of its kind, fully fledged hearing protection solution, enabling defense and security forces to hear more, do more, and be more.” With this advanced tactical hearing-protection solution, GN reports that it is leveraging unique leading competencies within intelligent audio solutions in both hearing aids and headsets to create an unparalleled noise management solution. The product will be manufactured at its Bloomington, Minn, facility where ReSound is also located.

The global market for military communication systems is estimated to be about $630 million, and features competitors such as Peltor (3M), INVISIO, Silynx, Racal Acoustics, and MSA Sordin, according to long-time hearing industry analyst Niels Granholm-Leth of Carnegie Investment Bank in Copenhagen. GN has embarked on several projects in its GN Stratcom organization, which is currently part of GN Hearing, although the company could eventually establish it as a stand-alone division alongside its Hearing (ReSound, Beltone, and Interton) and Headset (Jabra) divisions.

The new patented hearing protection solution is designed specifically for defense and security forces. GN says the solution offers the user a communication headset that is comfortable, highly durable, and protective against high-volume noise. At the same time, by leveraging GN’s expertise in situational awareness, the solution allows users to clearly identify important sounds in 360°.

Anders Hedegaard

“The GN Group encompasses consumer, professional, and medical grade hearing technology under the same roof,” says CEO of GN Hearing, Anders Hedegaard. “This unique platform makes it possible to expand GN’s business into adjacent opportunities within the sound space. With our user-centric approach we aim to be the leader in intelligent audio solutions to transform lives through the power of sound.”

GN is building a small, agile team around this new business opportunity. This year, GN will participate in military tenders in the United States and other NATO countries. The new product line, under the name GN FalCom, will include:

  • Comfort. Designed for optimal physical comfort, allowing for multiple hours of use in extreme combat situations;
  • Clarity. Enables users to localize sounds all around them without the need to remove the earpiece. To maintain high-quality communications at all times, GN FalCom will integrate seamlessly with military radio technology; and
  • Protection. Allows users to stay connected while benefitting from noise protection. For example, users will experience the highest level of safety without blocking out wanted sounds.

The hearing protection solution builds on GN’s expertise in sound processing from both GN Hearing and GN Audio—and across R&D teams in the United States and Denmark. It is the result of corporate-level investments, made through initiatives guided by GN’s Strategy Committee, to explore opportunities outside of, but related to, GN’s existing business areas. According to the company, the hearing protection solution will be manufactured at GN’s existing production facilities in Bloomington, Minn, and will not impact GN’s financial guidance for 2018.

Duke Researchers Find Eye Movement Triggers Eardrum Movement


Simply moving the eyes triggers the eardrums to move too, says a new study by Duke University neuroscientists, according to an article in Duke Today, a news hub for Duke University.

The researchers found that keeping the head still but shifting the eyes to one side or the other sparks vibrations in the eardrums, even in the absence of any sounds.

Surprisingly, these eardrum vibrations start slightly before the eyes move, indicating that motion in the ears and the eyes are controlled by the same motor commands deep within the brain.

“It’s like the brain is saying, ‘I’m going to move the eyes, I better tell the eardrums, too,’” said Jennifer Groh, a professor in the departments of neurobiology and psychology and neuroscience at Duke.

The findings, which were replicated in both humans and rhesus monkeys, provide new insight into how the brain coordinates what we see and what we hear. It may also lead to new understanding of hearing disorders, such as difficulty following a conversation in a crowded room.

The paper appeared January 23 in Proceedings of the National Academy of Sciences.

It’s no secret that the eyes and ears work together to make sense of the sights and sounds around us. Most people find it easier to understand somebody if they are looking at them and watching their lips move. And in a famous illusion called the McGurk Effect, videos of lip cues dubbed with mismatched audio cause people to hear the wrong sound.

But researchers are still puzzling over where and how the brain combines these two very different types of sensory information.

“Our brains would like to match up what we see and what we hear according to where these stimuli are coming from, but the visual system and the auditory system figure out where stimuli are located in two completely different ways,” Groh said. “The eyes are giving you a camera-like snapshot of the visual scene, whereas for sounds, you have to calculate where they are coming from based on differences in timing and loudness across the two ears.”
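
The timing cue Groh mentions can be made concrete with a small sketch: cross-correlate the two ear signals to estimate the interaural time difference (ITD), then convert it to an approximate direction. The ear spacing, the simple sine model, and the synthetic signals below are textbook-style assumptions, not anything from the study.

```python
import numpy as np

fs = 44_100
rng = np.random.default_rng(3)
source = rng.standard_normal(fs)                 # 1 s of broadband noise
delay = 10                                       # true right-ear lag in samples (assumed)
left = source
right = np.concatenate([np.zeros(delay), source[:-delay]])

# Find the lag that maximizes the cross-correlation between the ears.
lags = np.arange(-50, 51)
corr = [np.dot(left[50:-50], right[50 + k : right.size - 50 + k]) for k in lags]
itd = lags[int(np.argmax(corr))] / fs            # seconds; positive = right ear lags

ear_distance = 0.2                               # meters between the ears (assumed)
speed_of_sound = 343.0                           # m/s
azimuth = np.degrees(np.arcsin(np.clip(itd * speed_of_sound / ear_distance, -1.0, 1.0)))
print(f"ITD = {itd * 1e6:.0f} microseconds, azimuth ~ {azimuth:.1f} degrees")
```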

Because the eyes are usually darting about within the head, the visual and auditory worlds are constantly in flux with respect to one another, Groh added.

In an experiment designed by Kurtis Gruters, a former doctoral student in Groh’s lab and co-first author on the paper, 16 participants were asked to sit in a dark room and follow shifting LED lights with their eyes. Each participant also wore small microphones in their ear canals that were sensitive enough to pick up the slight vibrations created when the eardrum sways back and forth.

Though eardrums vibrate primarily in response to outside sounds, the brain can also control their movements using small bones in the middle ear and hair cells in the cochlea. These mechanisms help modulate the volume of sounds that ultimately reach the inner ear and brain, and produce small sounds known as otoacoustic emissions.

Gruters found that when the eyes moved, both eardrums moved in sync with one another, one side bulging inward at the same time the other side bulged outward. They continued to vibrate back and forth together until shortly after the eyes stopped moving. Eye movements in opposite directions produced opposite patterns of vibrations.

Larger eye movements also triggered bigger vibrations than smaller eye movements, the team found.
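
That amplitude relationship is the kind of thing a simple linear fit captures. The sketch below regresses the size of the ear-canal microphone oscillation on saccade amplitude using synthetic data; the slope, noise level, and units are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
saccade_deg = rng.uniform(2, 18, size=200)                  # saccade amplitudes
mic_amp = 3.0 * saccade_deg + rng.normal(0, 5, size=200)    # assumed linear relation

# Fit a line: larger eye movements should yield larger eardrum oscillations.
slope, intercept = np.polyfit(saccade_deg, mic_amp, 1)
print(f"oscillation grows ~{slope:.1f} units per degree of eye movement")
```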

“The fact that these eardrum movements are encoding spatial information about eye movements means that they may be useful for helping our brains merge visual and auditory space,” said David Murphy, a doctoral student in Groh’s lab and co-first author on the paper. “It could also signify a marker of a healthy interaction between the auditory and visual systems.”

The team, which included Christopher Shera at the University of Southern California and David W. Smith of the University of Florida, is still investigating how these eardrum vibrations impact what we hear, and what role they may play in hearing disorders. In future experiments, they will look at whether up and down eye movements also cause unique signatures in eardrum vibrations.

“The eardrum movements literally contain information about what the eyes are doing,” Groh said. “This demonstrates that these two sensory pathways are coupled, and they are coupled at the earliest points.”

Cole Jenson, an undergraduate neuroscience major at Duke, also coauthored the new study.

Original Paper: Gruters KG, Murphy DLK, Jenson CD, Smith DW, Shera CA, Groh JM. The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. Proceedings of the National Academy of Sciences, Early Edition. 2018. doi: 10.1073/pnas.1717948115

Source: Duke Today, Proceedings of the National Academy of Sciences, Early Edition

Starkey Hearing Technologies Launches iQ Product Lines


iQ Product Family

Starkey Hearing Technologies—an Eden Prairie, Minn-based hearing technology company—is launching a new line of hearing aids based on the company’s research in virtual reality, advanced neuroscience, and audiology and signal processing, Starkey announced. Designed to create an immersive hearing experience for patients, the iQ product lines include: Muse iQ, a complete line of wireless hearing aids with 900sync™ technology; SoundLens Synergy iQ, a new invisible-in-the-canal hearing aid; Halo iQ, smartphone-compatible hearing aids; and the brand-new TruLink Remote.


“As part of our commitment to lead the world in hearing innovation, we are excited to share the results of our collaboration with the world’s top researchers in today’s most advanced technologies,” Starkey Hearing Technologies President Brandon Sawalich said. “By working closely with leading researchers in the fields of neuroscience, virtual reality, and audiology and signal processing to integrate advancements into our award-winning products, we can now provide patients with new levels of presence, clarity, personalization, and other benefits previously unattainable with traditional hearing devices.”


New Features with Acuity OS 2, Inspired by Virtual Reality Research

Built with Starkey Hearing Technologies’ proven Synergy platform and Acuity™ OS 2 operating system, the iQ technologies include a suite of new features that deliver the presence, clarity, and personalization patients have previously missed during the moments that matter most, the company said. Four of the most notable new features include:

  • Acuity Immersion – Designed to leverage microphone placement to preserve high-frequency information for improved sound quality and a sense of spatial awareness. This technology takes the key natural cues needed for spatial awareness and shifts them to provide the wearer with clear speech, a sense of presence, and connection to their environment; by giving the wearer’s brain access to these cues, iQ hearing aids can help wearers’ brains relearn them and thereby reassert spatial perception.
  • Acuity Immersion Directionality – Designed to restore front-to-back cues for a more natural, safer listening experience.
  • Speech Indicators for memory – Provide descriptive names for memory environments rather than numeric indicators.
  • Smart VC – Allows for an increase in gain in all channels not already at maximum, giving wearers a desired increase in loudness when needed (see the sketch after this list).
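
A minimal sketch of the Smart VC behavior as described: raise gain in every compression channel that has headroom, and leave channels already at maximum untouched. The channel count, step size, and gain ceiling are assumptions, not Starkey specifications.

```python
MAX_GAIN_DB = 40.0  # assumed per-channel gain ceiling

def smart_volume_up(channel_gains_db: list[float], step_db: float = 2.0) -> list[float]:
    """Increase gain in all channels not already at maximum."""
    return [min(g + step_db, MAX_GAIN_DB) for g in channel_gains_db]

gains = [34.0, 40.0, 38.5, 40.0]
print(smart_volume_up(gains))  # [36.0, 40.0, 40.0, 40.0]; maxed channels unchanged
```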


“The iQ line represents a brand-new dimension in hearing technology research and innovation,” Starkey Hearing Technologies Chief Technology Officer and Executive Vice President of Engineering Achin Bhowmik said. “We anticipate that our new products will have a dramatic impact on our patients’ lives—and change the game in the global hearing aid industry.”


Muse iQ, Muse iQ CROS and SoundLens Synergy iQ

Designed to provide high-quality, natural sound in even the most challenging environments, Muse iQ and SoundLens Synergy iQ devices work with SurfLink wireless accessories to provide ear-to-ear streaming of calls, music, and media, remote hearing aid control, and a personalized hearing experience, said Starkey.


Muse iQ hearing aids are available in both custom and standard styles, and the Muse iQ micro RIC 312t is also available in a rechargeable option. Finally, Muse iQ CROS and BiCROS systems offer audibility and streaming for individuals with single-sided hearing loss.

SoundLens Synergy iQ hearing aids offer wearers an invisible, custom fit hearing solution featuring Starkey Hearing Technologies’ advanced technology and sound quality.

Halo iQ and the New TruLink Remote

Powered by Starkey Hearing Technologies’ TruLink 2.4 GHz wireless hearing technology, Halo iQ smartphone-compatible hearing aids enable connectivity with iPhone, iPad®, iPod touch®, Apple Watch®, and select Android™ devices.


A brand-new wireless accessory, the TruLink Remote is compatible with Apple® or Android and works without a smartphone.

Source: Starkey Hearing Technologies

Images: Starkey Hearing Technologies