• Title/Abstract/Keyword: Auditory-visual mismatch

Effect of Speech Degradation and Listening Effort in Reverberating and Noisy Environments Given N400 Responses

  • Kyong, Jeong-Sug;Kwak, Chanbeom;Han, Woojae;Suh, Myung-Whan;Kim, Jinsook
    • Journal of Audiology & Otology / Vol.24 No.3 / pp.119-126 / 2020
  • Background and Objectives: In distracting listening conditions, individuals must pay extra attention to listen selectively to target sounds. To investigate the amount of listening effort required in reverberant and noisy backgrounds, a semantic-mismatch task was examined. Subjects and Methods: Electroencephalography was recorded from 18 healthy volunteers using a 64-channel system to obtain N400 latencies. Participants listened to sounds and viewed letters in a 2 (reverberation) × 2 (noise) paradigm (i.e., Q-0 ms, Q-2000 ms, 3 dB-0 ms, and 3 dB-2000 ms). For each auditory-visual pairing, participants reported whether the auditory prime and letter target matched. Results: Q-0 ms yielded the shortest N400 latency, whereas latency was significantly increased at 3 dB-2000 ms. Further, Q-2000 ms showed an approximately 47 ms longer latency than 3 dB-0 ms. Notably, the presence of reverberation significantly increased N400 latencies, and under the distracting conditions both noise and reverberation were associated with stronger frontal activation. Conclusions: These distracting listening conditions can interfere with semantic-mismatch processing in the brain. The presence of reverberation, specifically a 2000 ms delay, demands additional mental effort, as evidenced by the delayed N400 latency and the involvement of frontal sources. (A minimal sketch of the latency analysis follows this entry.)
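
The latency measure above invites a brief illustration. The following is a minimal sketch, not the authors' analysis pipeline, of how per-condition N400 peak latencies could be extracted from single-channel epoched EEG in the 2 × 2 design; the sampling rate, epoch timing, 300-600 ms search window, and synthetic data are all illustrative assumptions.

```python
# Minimal sketch: N400 peak-latency extraction per condition in a
# 2 (reverberation) x 2 (noise) design. Everything here (sampling rate,
# windows, synthetic data) is an assumption for illustration only.
import numpy as np

FS = 500                      # sampling rate in Hz (assumed)
T0 = -0.2                     # epoch start relative to target onset, s (assumed)
N400_WINDOW = (0.300, 0.600)  # typical N400 search window, s (assumed)

def n400_peak_latency(epochs: np.ndarray) -> float:
    """Latency (ms) of the most negative deflection in the N400 window
    of the trial-averaged ERP. epochs: (n_trials, n_samples), one channel."""
    erp = epochs.mean(axis=0)                     # average over trials
    times = T0 + np.arange(erp.size) / FS         # sample times in seconds
    mask = (times >= N400_WINDOW[0]) & (times <= N400_WINDOW[1])
    peak_idx = np.argmin(erp[mask])               # most negative sample
    return float(times[mask][peak_idx] * 1000.0)  # convert to ms

# The four conditions of the paradigm; placeholder random data stands in
# for preprocessed epochs (40 trials x 1 s at FS samples/s).
rng = np.random.default_rng(0)
for cond in ["Q-0ms", "Q-2000ms", "3dB-0ms", "3dB-2000ms"]:
    fake_epochs = rng.normal(size=(40, int(1.0 * FS)))
    print(cond, f"N400 latency = {n400_peak_latency(fake_epochs):.0f} ms")
```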

Automatic Cognitive Processing of Korean Written Language as Indexed by Visual MMN (vMMN)

  • 이성은
    • Proceedings of the KIISE Special Interest Group on Language Engineering (Hangul and Korean Information Processing): The 21st Annual Conference on Hangul and Korean Information Processing / pp.67-72 / 2009
  • MMN (mismatch negativity), a component of the ERP, has long been used as a tool for probing the central auditory processing of language. Recent studies, however, have shown that an MMN can be elicited not only by auditory stimuli but also by visual stimuli. Using this visual MMN, the present study attempted to characterize the automatic (pre-attentive) processing of written Korean in the brains of native Korean speakers. Korean written minimal pairs '므'/'모' and '므'/'무', together with the corresponding non-linguistic stimuli '+ㅡ'/'+ㅗ' and '+ㅡ'/'+ㅜ' (artificial characters formed by attaching a vowel below a '+' sign; see Figure 1), were presented in a passive oddball paradigm, and the EEG responses to the linguistic stimuli were measured and analyzed against those to the non-linguistic stimuli. A visual MMN was detected for both stimulus types, but the visual MMN for the linguistic stimuli was larger than that for the non-linguistic stimuli. This indicates that during automatic cognitive processing, native Korean speakers process not only the physical visual features of the stimuli but also the linguistic information carried by Korean letters. These results illuminate the automatic cognitive processing of written Korean and point to the significance of the Korean writing system for cognitive science. (A minimal difference-wave sketch follows this entry.)
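
As background to the oddball design just described, here is a minimal sketch, using synthetic placeholder data rather than the study's recordings, of how a visual MMN is commonly quantified: average the ERPs to standards and deviants, form the deviant-minus-standard difference wave, and take its mean amplitude in an assumed latency window. The sampling rate, 150-300 ms window, and deviant proportion are illustrative assumptions, not values from the study.

```python
# Minimal sketch: quantifying a visual MMN from a passive oddball design.
# All parameters and the random data are assumptions for illustration.
import numpy as np

FS = 250                      # sampling rate in Hz (assumed)
T0 = -0.1                     # epoch start in s (assumed)
VMMN_WINDOW = (0.150, 0.300)  # assumed vMMN latency range in s

def vmmn_amplitude(standard_epochs: np.ndarray,
                   deviant_epochs: np.ndarray) -> float:
    """Mean amplitude of the deviant-minus-standard difference wave inside
    the vMMN window; a more negative value indicates a larger vMMN.
    Both inputs have shape (n_trials, n_samples) for one channel."""
    diff = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)
    times = T0 + np.arange(diff.size) / FS       # sample times in seconds
    mask = (times >= VMMN_WINDOW[0]) & (times <= VMMN_WINDOW[1])
    return float(diff[mask].mean())

# Placeholder epochs for the two contrasts in the study: linguistic
# ('므'/'모', '므'/'무') vs. non-linguistic ('+ㅡ'/'+ㅗ', '+ㅡ'/'+ㅜ').
rng = np.random.default_rng(1)
n_samples = int(0.6 * FS)
for label in ["linguistic", "non-linguistic"]:
    standards = rng.normal(size=(400, n_samples))  # frequent standards
    deviants = rng.normal(size=(80, n_samples))    # rare deviants (~17%)
    amp = vmmn_amplitude(standards, deviants)
    print(label, f"vMMN mean amplitude = {amp:.3f}")
```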
