A Hearing Aid That Reads Brainwaves Isn't So Far Off
Professor Lucas Parra (GC/CCNY; Psychology/Biomedical Engineering) has an idea for hearing aid fittings, the process that tunes the aid to the patient’s specific hearing: Why not let the hearing aid adjust itself? In a recent study, Parra and Ph.D. student Ivan Iotzov (Psychology) demonstrated how electroencephalography, or EEG, can map brain responses to determine how well a person understands human speech. Parra envisions that someday hearing aids could do these measurements themselves and adjust accordingly.
Parra discussed his research and findings with The Graduate Center.
Graduate Center: What problem is this research trying to solve?
Parra: Hearing aids these days have a big problem. They cost a lot of money, but people wear them a few times and then put them away. Part of the problem is that they have to be set to the specific hearing of the client, which is often done at a doctor’s office. But then the client goes home or out onto the street, where the sound is totally different, and they aren’t satisfied. The basic premise of this work is that we want to tune the hearing aid automatically. The idea is to have the hearing aid record brain responses, then tune itself.
GC: What did you find in this study?
Parra: This study shows that we can do this in principle. We can look at people’s brainwaves and look at the sound they are hearing, and correlate the two. When they correlate very well, that means the people are understanding the speech. When they don’t correlate, the people aren’t understanding the speech. The correlation is a good metric for speech comprehension.
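The core idea Parra describes, correlating the recorded brain signal with the sound being heard, can be sketched in a few lines. This is an illustrative toy example only, not the lab's actual analysis pipeline: the function name, the simulated "EEG" traces, and the sine-based stand-in for a speech envelope are all hypothetical, and real EEG work would involve filtering, epoching, and multichannel methods.

```python
import numpy as np

def envelope_eeg_correlation(speech_envelope, eeg_signal):
    """Pearson correlation between a speech envelope and one EEG channel.

    In the scheme described above, a higher correlation is read as a
    proxy for better speech comprehension.
    """
    s = speech_envelope - speech_envelope.mean()
    e = eeg_signal - eeg_signal.mean()
    return float(np.dot(s, e) / (np.linalg.norm(s) * np.linalg.norm(e)))

# Toy demo with simulated data (not real EEG):
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
envelope = np.abs(np.sin(2 * np.pi * 0.5 * t))           # stand-in speech envelope
eeg_good = envelope + 0.3 * rng.standard_normal(t.size)  # brain response tracking the speech
eeg_bad = rng.standard_normal(t.size)                    # unrelated brain activity

print(envelope_eeg_correlation(envelope, eeg_good))  # clearly positive
print(envelope_eeg_correlation(envelope, eeg_bad))   # near zero
```

In this sketch, the "attentive" trace correlates strongly with the envelope while the unrelated trace does not, which is the contrast the study uses as its comprehension metric.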
GC: How does this project fit in with other research in your lab?
Parra: We’re interested in how the brain responds to natural stimuli, such as video, music, and educational content, or in this case, human speech. Our work stands in contrast to much neuroscience research, which looks at how the brain responds to stimuli like beeps and flashes. These experiments try to reduce stimuli to their most elemental form to get a fundamental understanding of how each beep or flash affects the brain.
In a video, on the other hand, there is a lot more going on. It's hard to know whether a brain response is an effect of the action, or the camera angle, or the story. I personally believe that while it is more uncontrolled, it's also more realistic.
GC: What prompted this study, and what are the next steps?
Parra: We’ve been studying how people react to natural stimuli for a long time. I had this gut feeling that we could do the same thing with speech instead of video, and that there could be an application in hearing aids. This was a “proof-of-concept” study. The next step is to show that we can do it with a real hearing aid, and the third step will be to tweak the hearing aid based on this correlation. Right now we are conducting a study with hearing aids, adjusting the parameters to see if we can track what conditions are easier or harder for participants to understand.
Submitted on: APR 2, 2019