Consistency between Individuals of Affective Responses for Multiple Modalities based on Behavioral and Physiological Data

  • Received : 2022.05.12
  • Accepted : 2022.09.20
  • Published : 2023.03.31

Abstract

In this study, we assessed how participants represent various sensory stimulus experiences through behavioral ratings and physiological measurements. Using intersubject correlation (ISC) analysis, we evaluated whether individuals' affective responses along the dominance, arousal, and valence dimensions differed when stimuli were presented in three modality conditions (auditory, visual, and haptic). ISC analysis measures the similarity between one participant's responses and those of all the others. To compute the intersubject correlation, we split the entire dataset into one participant's data and the remaining participants' data, and then correlated the two for all possible stimulus pair combinations. The results revealed that for dominance, ISCs in the visual modality condition were greater than in the auditory modality condition, whereas for arousal, ISCs in the auditory condition were greater than in the visual condition. Finally, negative valence conditions showed greater response consistency across participants than positive conditions in each sensory modality. When comparing modalities, greater ISCs were observed in the haptic modality condition than in the visual and auditory modality conditions, regardless of affective category. We discuss these three core affective representations across multiple modalities and propose ISC analysis as a tool for examining differences in individuals' affective representations.

This study used intersubject correlation (ISC) to examine how the consistency between one participant's responses to affect-eliciting stimuli and the responses of the remaining participants differed across affective representation categories (dominance, arousal, and valence) and sensory modalities (auditory, visual, and haptic). The data used to compute the intersubject correlations consisted of participants' physiological measures and affective ratings for auditory, visual, and haptic stimuli; each ISC was computed by separating one participant's dataset from the average of the remaining participants' datasets and then correlating the two across all possible stimulus pairs. The results showed that for datasets reordered by dominance, participants' response consistency was higher in the visual modality condition than in the auditory modality condition. For datasets reordered by arousal, the visual and auditory modalities again differed, but in the direction opposite to the dominance-ordered datasets. Finally, for datasets reordered by valence, negative datasets showed higher response consistency than positive datasets in all sensory modalities. Across all datasets, the haptic modality yielded high ISC values regardless of the level of the affective representation category. These results offer an interpretation of what response consistency across sensory modalities and affective representations means, and suggest that ISC analysis may be a useful tool for measuring patterns of differences in participants' responses.
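
For illustration, the leave-one-out ISC computation described in the abstracts can be sketched in a few lines of Python with NumPy. This is a minimal sketch of the standard leave-one-out form of ISC; the matrix shape, variable names, and example data are assumptions for the illustration, not the authors' actual pipeline:

    import numpy as np

    def leave_one_out_isc(responses):
        """Leave-one-out intersubject correlation (ISC).

        responses: (n_participants, n_stimuli) array of behavioral ratings
        or physiological measures, one row per participant.
        Returns one ISC per participant: the Pearson correlation between
        that participant's responses and the mean response of all
        remaining participants, computed across stimuli.
        """
        n = responses.shape[0]
        iscs = np.empty(n)
        for s in range(n):
            # Average the responses of everyone except participant s
            others_mean = np.delete(responses, s, axis=0).mean(axis=0)
            # Correlate the held-out participant with the group average
            iscs[s] = np.corrcoef(responses[s], others_mean)[0, 1]
        return iscs

    # Hypothetical example: 20 participants responding to 30 stimuli
    rng = np.random.default_rng(0)
    ratings = rng.normal(size=(20, 30))
    print(leave_one_out_isc(ratings))

Group-level consistency for a given modality or affect category would then typically be summarized by averaging these per-participant ISCs (often after a Fisher z-transformation).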

Acknowledgement

This work was supported by the National Research Foundation of Korea's BK21 FOUR program (Department of Psychology, Jeonbuk National University) (No. 4199990714213).
