Acknowledgement
This research was conducted as part of the Research Operating Expense Support Program (Institutional Program) of the Electronics and Telecommunications Research Institute (ETRI) [24ZC1200, Development of Core Source Technologies for User-Centric (Egocentric) Remote Empathetic Interaction].
References
- MIT Technology Review, "Meta is desperately trying to make the metaverse happen," Oct. 11, 2022.
- A. Winkler, J. Won, and Y. Ye, "QuestSim: Human motion tracking from sparse sensors with simulated avatars," in Proc. SA, (Daegu, Rep. of Korea), Nov. 2022, pp. 1-8.
- S. Lee et al., "QuestEnvSim: Environment-aware simulated motion tracking from sparse sensors," in Proc. SIGGRAPH, (Los Angeles, CA, USA), Jul. 2023, pp. 1-9.
- https://www.noitom.com/
- https://www.movella.com/products/motion-capture
- A. Toshev et al., "DeepPose: Human pose estimation via deep neural networks," in Proc. CVPR, (Columbus, OH, USA), June 2014.
- G. Pavlakos et al., "Coarse-to-fine volumetric prediction for single-image 3D human pose," arXiv preprint, CoRR, 2017, arXiv: 1611.07828.
- S. Wei et al., "Convolutional pose machines," in Proc. CVPR, (Las Vegas, NV, USA), June 2016.
- K. He et al., "Mask R-CNN," in Proc. ICCV, (Venice, Italy), Oct. 2017.
- K. Sun et al., "Deep high-resolution representation learning for human pose estimation," in Proc. CVPR, (Long Beach, CA, USA), June 2019.
- Z. Cao et al., "Realtime multi-person 2D pose estimation using part affinity fields," in Proc. CVPR, (Honolulu, HI, USA), Jul. 2017.
- A. Newell, Z. Huang, and J. Deng, "Associative embedding: End-to-end learning for joint detection and grouping," arXiv preprint, CoRR, 2017, arXiv: 1611.05424.
- V. Belagiannis et al., "3D pictorial structures revisited: Multiple human pose estimation," IEEE Trans. Pattern Anal. Mach. Intell., vol. 38, no. 10, 2016, pp. 1929-1942. https://doi.org/10.1109/TPAMI.2015.2509986
- V. Belagiannis et al., "Multiple human pose estimation with temporally consistent 3D pictorial structures," in Computer Vision-ECCV 2014 Workshops, vol. 8925, Springer, 2014, pp. 742-754.
- T. Cootes et al., "Active shape models-their training and application," Comput. Vis. Image Underst., vol. 61, no. 1, 1995, pp. 38-59. https://doi.org/10.1006/cviu.1995.1004
- S.X. Ju et al., "Cardboard people: A parameterized model of articulated image motion," in Proc. Int. Conf. Automatic Face Gesture Recognition, (Killington, VT, USA), Oct. 1996, pp. 38-44.
- M. Loper et al., "SMPL: A skinned multi-person linear model," ACM Trans. Graph., vol. 34, no. 6, 2015, pp. 1-16. https://doi.org/10.1145/2816795.2818013
- D. Anguelov et al., "SCAPE: Shape completion and animation of people," ACM Trans. Graph., vol. 24, no. 3, 2005, pp. 408-416. https://doi.org/10.1145/1073204.1073207
- H. Joo, T. Simon, and Y. Sheikh, "Total capture: A 3D deformation model for tracking faces, hands, and bodies," in Proc. CVPR, (Salt Lake City, UT, USA), June 2018, pp. 8320-8329.
- Innopolis Foundation, "Promising Market Issue Report: Haptic Technology," July 2021.
- S. Jeong et al., "Pattern design of a liquid metal based wearable heater for constant heat generation under biaxial strain," iScience, vol. 26, no. 7, 2023.
- H. Kim and J. Bae, "Analysis of electrical resistance changes in liquid metal printed wires under strain for stretchable electronics," Smart Mater. Struct., vol. 30, no. 9, 2021, article no. 095004.
- https://www.bhaptics.com/
- https://haptx.com/
- M. Kim et al., "SpinOcchio: Understanding haptic-visual congruency of skin-slip in VR with a dynamic grip controller," in Proc. ACM CHI, (New Orleans, LA, USA), Apr. 2022, pp. 1-14.
- R. Kovacs et al., "Haptic PIVOT: On-demand handhelds in VR," in Proc. UIST, (Minneapolis, MN, USA), Oct. 2020.
- https://www.playstation.com/
- http://first-vr.com/
- S. Oh et al., "Easy-to-wear auxetic SMA knot-architecture for spatiotemporal and multimodal haptic feedbacks," Adv. Mater., vol. 35, no. 47, 2023.
- https://owogame.com/
- https://www.actronika.com/
- https://teslasuit.io/
- T. Yang et al., "Recent advances and opportunities of active materials for haptic technologies in virtual and augmented reality," Adv. Funct. Mater., vol. 31, no. 39, 2021, pp. 1-30. https://doi.org/10.1002/adfm.202008831
- A. Singhal and L.A. Jones, "Perceptual interactions in thermo-tactile displays," in Proc. IEEE WHC, (Munich, Germany), Jul. 2017.
- Y. Zhang et al., "Force-aware interface via electro-myography for natural VR/AR interaction," ACM Trans. Graph., vol. 41, no. 6, 2022, pp. 1-18. https://doi.org/10.1145/3550454.3555461
- https://hpi.de/
- P. Ekman, "Basic emotions," in Handbook of Cognition and Emotion, Wiley, 1999, pp. 45-60.
- J.A. Russell, "A circumplex model of affect," J. Pers. Soc. Psychol., vol. 39, no. 6, 1980.
- K. Olszewski et al., "High-fidelity facial and speech animation for VR HMDs," ACM Trans. Graph., vol. 35, no. 6, 2016, pp. 1-14. https://doi.org/10.1145/2980179.2980252
- Road to VR, "HTC Announces Face-Tracker for Vive Pro and Vive Tracker 3.0," Mar. 10, 2021, https://www.roadtovr.com/htc-vive-facial-tracker-3-0-announcement-release-date-price/
- MIXED Reality News, "Meta Quest Pro: What the new eye and face tracking can do," Oct. 25, 2022, https://mixed-news.com/en/meta-quest-pro-what-the-new-eye-and-face-tracking-can-do/
- UploadVR, "Quest Pro Now Has Tongue Tracking," Dec. 18, 2023, https://www.uploadvr.com/quest-pro-tongue-tracking/
- Meta, "Pixel Codec Avatars," Jun. 19, 2021, https://research.facebook.com/publications/pixel-codec-avatars/
- M. Horvat et al., "Assessing emotional responses induced in virtual reality using a consumer EEG headset: A preliminary report," in Proc. MIPRO, (Opatija, Croatia), May 2018, pp. 1006-1010.
- N.S. Suhaimi, J. Mountstephens, and J. Teo, "A dataset for emotion recognition using virtual reality and EEG (DER-VREEG): Emotional state classification using low-cost wearable VR-EEG headsets," Big Data Cognit. Comput., vol. 6, no. 1, 2022.
- M. Yu et al., "EEG-based emotion recognition in an immersive virtual reality environment: From local activity to brain network features," Biomed. Signal Process. Control, vol. 72, 2022, article no. 103349.
- K. Gupta et al., "AffectivelyVR: Towards VR personalized emotion recognition," in Proc. ACM VRST, (Virtual), Nov. 2020, pp. 1-3.
- Q. Wu et al., "Emotion classification on eye tracking and electroencephalograph fused signals employing deep gradient neural networks," Appl. Soft Comput., vol. 110, 2021, article no. 107752.
- J.Z. Lim et al., "Emotion recognition using eye-tracking: Taxonomy, review and current challenges," Sensors, vol. 20, no. 8, 2020.
- "VREED: Virtual reality emotion recognition dataset using eye tracking & physiological measures."
- N. Hube, K. Vidackovic, and M. Sedlmair, "Using expressive avatars to increase emotion recognition: A pilot study," in Proc. CHI EA, (New Orleans, LA, USA), Apr. 2022, pp. 1-7.
- A. Valente et al., "Empathic AuRea: Exploring the effects of an augmented reality cue for emotional sharing across three face-to-face tasks," in Proc. VR, (Christchurch, New Zealand), Mar. 2022, pp. 158-166.
- M. Salminen et al., "Evoking physiological synchrony and empathy using social VR with biofeedback," IEEE Trans. Affect. Comput., vol. 13, no. 2, 2019, pp. 746-755. https://doi.org/10.1109/TAFFC.2019.2958657
- S. Lee et al., "Understanding and designing avatar biosignal visualizations for social virtual reality entertainment," in Proc. CHI, (New Orleans, LA, USA), Apr. 2022, pp. 1-15.
- K. Gupta et al., "VRdoGraphy: An empathic VR photography experience," in Proc. VRW, (Shanghai, China), Mar. 2023, pp. 1013-1014.
- Y.S. Pai et al., "The empathic metaverse: An assistive bioresponsive platform for emotional experience sharing," arXiv preprint, CoRR, 2023, arXiv: 2311.16610.
- T. Rinnert et al., "How can one share a user's activity during VR synchronous augmentative cooperation?," Multimodal Technologies and Interaction, vol. 7, no. 2, 2023.
- T.M. Michaels et al., "Cognitive empathy contributes to poor social functioning in schizophrenia: Evidence from a new self-report measure of cognitive and affective empathy," Psychiatry Res., vol. 220, no. 3, 2014, pp. 803-810. https://doi.org/10.1016/j.psychres.2014.08.054
- S.H. Konrath et al., "Changes in dispositional empathy in American college students over time: A meta-analysis," Pers. Soc. Psychol. Rev., vol. 15, no. 2, 2011, pp. 180-198. https://doi.org/10.1177/1088868310377395
- C. Milk, "How virtual reality can create the ultimate empathy machine," TED Talk, 2015.
- K.E. Stavroulia et al., "The role of perspective-taking on empowering the empathetic behavior of educators in VR-based training sessions: An experimental evaluation," Comput. Edu., vol. 197, 2023, article no. 104739.
- M. Tassinari et al., "Investigating the influence of intergroup contact in virtual reality on empathy: An exploratory study using AltspaceVR," Front. Psychol., vol. 12, 2022, article no. 815497.
- M.H. Davis, "Measuring individual differences in empathy: Evidence for a multidimensional approach," J. Pers. Soc. Psychol., vol. 44, no. 1, 1983.
- E. Parra Vargas et al., "Virtual reality stimulation and organizational neuroscience for the assessment of empathy," Front. Psychol., vol. 13, 2022, article no. 993162.
- Apple Newsroom, "Apple unveils Apple Vision Pro, its first spatial computer," Jun. 5, 2023.
- W. Xu et al., "Mo2Cap2: Real-time mobile 3D motion capture with a cap-mounted fisheye camera," arXiv preprint, CoRR, 2019, arXiv: 1803.05959.
- J. Wang et al., "Estimating egocentric 3D human pose in global space," in Proc. IEEE/CVF ICCV, (Virtual), Oct. 2021, pp. 11500-11509.
- J. Wang et al., "Scene-aware egocentric 3D human pose estimation," arXiv preprint, CoRR, 2023, arXiv: 2212.11684.
- V. Mollyn et al., "IMUPoser: Full-body pose estimation using IMUs in phones, watches, and earbuds," in Proc. CHI, (Hamburg, Germany), Apr. 2023, pp. 1-12.