Gated auditory speech perception in elderly hearing aid users and elderly normal-hearing individuals: effects of hearing impairment and cognitive capacity
Moradi, Shahram. Linköping University, Department of Behavioural Sciences and Learning, Disability Research. Linköping University, Faculty of Arts and Sciences. Linköping University, The Swedish Institute for Disability Research. (Linnaeus Centre HEAD)
Lidestam, Björn. Linköping University, Department of Behavioural Sciences and Learning, Psychology. Linköping University, Faculty of Arts and Sciences.
Hällgren, Mathias. Linköping University, Department of Clinical and Experimental Medicine, Division of Neuroscience. Linköping University, Faculty of Health Sciences. Linköping University, The Swedish Institute for Disability Research. Östergötlands Läns Landsting, Anaesthetics, Operations and Specialty Surgery Center, Department of Otorhinolaryngology in Linköping. (Linnaeus Centre HEAD)
Rönnberg, Jerker. Linköping University, Department of Behavioural Sciences and Learning, Disability Research. Linköping University, Faculty of Arts and Sciences. Linköping University, The Swedish Institute for Disability Research. (Linnaeus Centre HEAD)
2014 (English). In: Trends in Hearing, ISSN 2331-2165, Vol. 18. Article in journal (Refereed). Published.
Abstract [en]

This study compared elderly hearing aid (EHA) users and elderly normal-hearing (ENH) individuals on identification of auditory speech stimuli (consonants, words, and final word in sentences) that differed in their linguistic properties. We measured the accuracy with which the target speech stimuli were identified, as well as the isolation points (IPs: the shortest duration, from onset, required to correctly identify the speech target). The relationships between working memory capacity, the IPs, and speech accuracy were also measured. Twenty-four EHA users (with mild to moderate hearing impairment) and 24 ENH individuals participated in the present study. Despite the use of their regular hearing aids, the EHA users had delayed IPs and were less accurate in identifying consonants and words compared with the ENH individuals. The EHA users also had delayed IPs for final word identification in sentences with lower predictability; however, no significant between-group difference in accuracy was observed. Finally, there were no significant between-group differences in terms of IPs or accuracy for final word identification in highly predictable sentences. Our results also showed that, among EHA users, greater working memory capacity was associated with earlier IPs and improved accuracy in consonant and word identification. Together, our findings demonstrate that the gated speech perception ability of EHA users was not at the level of ENH individuals, in terms of IPs and accuracy. In addition, gated speech perception was more cognitively demanding for EHA users than for ENH individuals in the absence of semantic context.
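The isolation point is a simple operational measure, and a short sketch can make it concrete. The Python snippet below is an illustrative reconstruction only, not the authors' analysis code: the data format (per-gate duration/response pairs), the function name, and the example word are assumptions. It takes the IP as the shortest gate duration from which the listener's response matches the target and remains correct at every longer gate.

from typing import List, Optional, Tuple

def isolation_point(gates: List[Tuple[int, str]], target: str) -> Optional[int]:
    # 'gates' lists (gate_duration_ms, listener_response) pairs in ascending gate duration.
    # Returns the shortest duration from which the response equals the target and
    # stays correct for all longer gates; None if the target is never stably identified.
    ip: Optional[int] = None
    for duration_ms, response in gates:
        if response == target:
            if ip is None:          # start of a (potentially final) correct run
                ip = duration_ms
        else:
            ip = None               # an error at a longer gate resets the run
    return ip

# Hypothetical example: a word gated in 40-ms steps, identified correctly
# from 120 ms onward -> IP = 120 ms.
print(isolation_point([(40, "bow"), (80, "bone"), (120, "boat"),
                       (160, "boat"), (200, "boat")], "boat"))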

Place, publisher, year, edition, pages
Sage Publications, 2014. Vol. 18
Keyword [en]
hearing aid users, gating paradigm, speech perception, cognition
National Category
Otorhinolaryngology
Identifiers
URN: urn:nbn:se:liu:diva-109067; DOI: 10.1177/2331216514545406; ISI: 000343753700007; PubMedID: 25085610; OAI: oai:DiVA.org:liu-109067; DiVA: diva2:736085
Funder
Swedish Research Council, 349-2007-8654
Available from: 2014-08-04. Created: 2014-08-04. Last updated: 2015-08-12. Bibliographically approved.
In thesis
1. Time is of the essence in speech perception!: Get it fast, or think about it
2014 (English). Doctoral thesis, comprehensive summary (Other academic).
Alternative title [sv]
Lyssna nu! : Hör rätt direkt, eller klura på det!
Abstract [en]

The present thesis examined the extent to which background noise influences the isolation point (IP, the shortest time from the onset of a speech stimulus required for its correct identification) and accuracy in the identification of different types of speech stimuli (consonants, words, and final words in high-predictability [HP] and low-predictability [LP] sentences). These speech stimuli were presented in different modalities (auditory, visual, and audiovisual) to young normal-hearing listeners (Papers 1, 2, and 5). In addition, the present thesis studied under what conditions cognitive resources were explicitly demanded in the identification of different types of speech stimuli (Papers 1 and 2). Further, elderly hearing aid (EHA) users and elderly normal-hearing (ENH) listeners were compared with regard to IPs, accuracy, and the conditions under which explicit cognitive resources were demanded in the identification of auditory speech stimuli in silence (Paper 3). The results showed that background noise delayed the IPs and reduced accuracy in the identification of different types of speech stimuli in both modalities of speech presentation. Explicit cognitive resources were demanded in the identification of speech stimuli in the auditory-only modality, under the noisy condition, and in the absence of a prior semantic context. In addition, audiovisual presentation of speech stimuli resulted in earlier IPs and more accurate identification than auditory-only presentation. Furthermore, pre-exposure to audiovisual speech stimuli resulted in better subsequent auditory speech-in-noise identification than pre-exposure to auditory-only speech stimuli (Papers 2 and 4). When comparing EHA users and ENH individuals, the EHA users had later IPs in the identification of consonants, words, and final words in LP sentences, and were less accurate only in the identification of consonants and words. Only the identification of consonants and words demanded explicit cognitive resources in the EHA users. Theoretical predictions and clinical implications are discussed.

Abstract [sv] (English translation)

This thesis examined how much background noise affects the isolation point (IP, the earliest point at which a spoken stimulus can be correctly identified) and accuracy in the identification of different types of speech stimuli (consonants, words, and final words in high-predictability [HP] and low-predictability [LP] sentences). These speech stimuli were presented in different modalities (auditory, visual, and audiovisual) to young normal-hearing participants (Papers 1, 2, and 5). In addition, the conditions under which explicit cognitive resources were required for the identification of different types of speech stimuli were compared (Papers 1 and 2). Further, elderly hearing aid (EHA) users and elderly normal-hearing (ENH) individuals were compared with regard to IPs, identification accuracy, and the conditions under which explicit cognitive resources were required for auditory identification in silence (i.e., without background noise) (Paper 3). The results showed that background noise led to later IPs and lowered identification accuracy for different types of speech stimuli in both presentation modalities. Explicit cognitive resources were required for the identification of speech stimuli under purely auditory presentation with background noise, and when no prior semantic information was provided. Moreover, audiovisual presentation resulted in earlier IPs and more accurate identification of speech stimuli compared with purely auditory presentation. A further finding was that pre-exposure to audiovisual speech stimuli resulted in better identification of speech in background noise than pre-exposure to auditory-only speech stimuli (Papers 2 and 4). When comparing EHA users and ENH individuals, the EHA users had later IPs in the identification of consonants, words, and final words in LP sentences, and less accurate identification of consonants and words. Only the identification of consonants and words required explicit cognitive resources in EHA users. Theoretical predictions and clinical implications are discussed.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2014. 56 p.
Series
Linköping Studies in Arts and Science, ISSN 0282-9800 ; 635
Studies from the Swedish Institute for Disability Research, ISSN 1650-1128 ; 68
Keyword [en]
Noise, auditory speech perception, audiovisual speech perception, hearing aids
Keyword [sv]
Buller, hörsel, auditiv talperception, audiovisuell talperception, hörhjälpmedel
National Category
Other Medical Sciences
Public Health, Global Health, Social Medicine and Epidemiology
Identifiers
urn:nbn:se:liu:diva-111723 (URN); 10.3384/diss.diva-111723 (DOI); 978-91-7519-188-1 (ISBN)
Public defence
2014-11-28, I:101, Hus I, Campus Valla, Linköpings universitet, Linköping, 14:00 (Swedish)
Supervisors
Available from: 2014-10-29. Created: 2014-10-29. Last updated: 2014-10-31. Bibliographically approved.

Open Access in DiVA

fulltext (214 kB), 197 downloads
File information
File name: FULLTEXT01.pdf. File size: 214 kB. Checksum: SHA-512
2153e54f1343f3b65113a8aae9fc18f14d5c25b557a8e2df6c9ef13c3354e0b64cfa15c39ad016abc907fd314790f8b4662167b5815645405cbfe3a23224f615
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text; PubMed
