References
- Alenljung, B., Andreasson, R., Billing, E. A., Lindblom, J., & Lowe, R. (2017). User Experience of Conveying Emotions by Touch. In Proceedings of the 26th IEEE International Symposium on Robot and Human Interactive Communication (pp. 1240-1247). IEEE.
- Andreasson, R., Alenljung, B., Billing, E., & Lowe, R. (2018). Affective Touch in Human-Robot Interaction: Conveying Emotion to the NAO Robot. International Journal of Social Robotics, 10(4), 473-491. https://doi.org/10.1007/s12369-017-0446-3
- Atkinson, D., Hancock, P., Hoffman, R. R., Lee, J. D., Rovira, E., Stokes, C., & Wagner, A. R. (2012). Trust in computers and robots: The uses and boundaries of the analogy to interpersonal trust. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 56(1), 303-307. Sage.
- Baker, A. L., Phillips, E. K., Ullman, D., & Keebler, J. R. (2018). Toward an understanding of trust repair in human-robot interaction. ACM Transactions on Interactive Intelligent Systems, 8(4), 1-30.
- Bansal, G., & Zahedi, F. M. (2015). Trust violation and repair: The information privacy perspective. Decision Support Systems, 71, 62-77. https://doi.org/10.1016/j.dss.2015.01.009
- Bartneck, C. (2003). Interacting with an Embodied Emotional Character. In Proceedings of the 2003 International Conference on Designing pleasurable products and interfaces (pp. 55-60).
- Beck, A., Stevens, B., Bard, K. A., & Canamero, L. (2012). Emotional Body Language Displayed by Artificial Agents. ACM Transactions on Interactive Intelligent Systems, 2(1), 1-29.
- Bies, R. J., & Tripp, T. (1996). Beyond distrust: 'Getting even' and the need for revenge. In R. Kramer & T. Tyler (Eds.), Trust in organizations (pp. 246-260). Sage.
- Breazeal, C. (2004). Social Interactions in HRI: the robot view. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 34(2), 181-186. https://doi.org/10.1109/TSMCC.2004.826268
- Broadbent, E., Kumar, V., Li, X., Sollers, J. J., III, Stafford, R. Q., MacDonald, B. A., & Wegner, D. M. (2013). Robots with Display Screens: A Robot with a More Humanlike Face Display Is Perceived to Have More Mind and a Better Personality. PLoS ONE, 8(8), e72589. https://doi.org/10.1371/journal.pone.0072589
- Butler, J. K., Jr., & Cantrell, R. S. (1984). A behavioral decision theory approach to modeling dyadic trust in superiors and subordinates. Psychological Reports, 55, 19-28. https://doi.org/10.2466/pr0.1984.55.1.19
- Carli, L. L., LaFleur, S. J., & Loeber, C. C. (1995). Nonverbal Behavior, Gender, and Influence. Journal of Personality and Social Psychology, 68(6), 1030-1041. https://doi.org/10.1037/0022-3514.68.6.1030
- Carney, D. R., Hall, J. A., & Smith-LeBeau, L. (2005). Beliefs about the nonverbal expression of social power. Journal of Nonverbal Behavior, 29(2), 105-123. https://doi.org/10.1007/s10919-005-2743-z
- Cassell, J., & Bickmore, T. (2003). Negotiated collusion: Modeling social language and its relationship effects in intelligent agents. User modeling and user-adapted interaction, 13(1-2), 89-132. https://doi.org/10.1023/A:1024026532471
- Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2007). The BIAS map: behaviors from intergroup affect and stereotypes. Journal of Personality and Social Psychology, 92(4), 631-648. https://doi.org/10.1037/0022-3514.92.4.631
- Desai, M., Kaniarasu, P., Medvedev, M., Steinfeld, A., & Yanco, H. (2013). Impact of robot failures and feedback on real-time trust. In 8th ACM/IEEE International Conference on Human-Robot Interaction, 251-258.
- DeSteno, D., Breazeal, C., Frank, R. H., Pizarro, D., Baumann, J., Dickens, L., & Lee, J. J. (2012). Detecting the Trustworthiness of Novel Partners in Economic Exchange. Psychological Science, 23(12), 1549-1556. https://doi.org/10.1177/0956797612448793
- De Visser, E. J., Pak, R., & Shaw, T. H. (2018). From "automation" to "autonomy": The importance of trust repair in human-machine interaction. Ergonomics, 61(10), 1409-1427. https://doi.org/10.1080/00140139.2018.1457725
- Dirks, K. T., Lewicki, R. J., & Zaheer, A. (2009). Repairing relationships within and between organizations: Building a conceptual foundation. Academy of Management Review, 34(1), 68-84. https://doi.org/10.5465/AMR.2009.35713285
- Dweck, C. S., Chiu, C. Y., & Hong, Y. Y. (1995). Implicit theories and their role in judgments and reactions: A word from two perspectives. Psychological Inquiry, 6(4), 267-285. https://doi.org/10.1207/s15327965pli0604_1
- Engelhardt, S., Hansson, E., & Leite, I. (2017, August). Better Faulty than Sorry: Investigating Social Recovery Strategies to Minimize the Impact of Failure in Human-Robot Interaction. In WCIHAI@IVA, 19-27.
- Ferrin, D. L., Kim, P. H., Cooper, C. D., & Dirks, K. T. (2007). Silence speaks volumes: The effectiveness of reticence in comparison to apology and denial for responding to integrity- and competence-based trust violations. Journal of Applied Psychology, 92(4), 893-908. https://doi.org/10.1037/0021-9010.92.4.893
- Fratczak, P., Goh, Y. M., Kinnell, P., Justham, L., & Soltoggio, A. (2021). Robot apology as a post-accident trust-recovery control strategy in industrial human-robot interaction. International Journal of Industrial Ergonomics, 82.
- Fuoli, M., Van de Weijer, J., & Paradis, C. (2017). Denial outperforms apology in repairing organizational trust despite strong evidence of guilt. Public Relations Review, 43(4), 645-660. https://doi.org/10.1016/j.pubrev.2017.07.007
- Hald, K., Weitz, K., André, E., & Rehm, M. (2021, November). "An Error Occurred!" - Trust Repair With Virtual Robot Using Levels of Mistake Explanation. In Proceedings of the 9th International Conference on Human-Agent Interaction, 218-226.
- Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J. Y. C., De Visser, E. J., & Parasuraman, R. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517-527. https://doi.org/10.1177/0018720811417254
- Haring, K. S., Matsumoto, Y., & Watanabe, K. (2013). How do people perceive and trust a lifelike robot. Lecture Notes in Engineering and Computer Science, 1, 425-430.
- Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors, 57(3), 407-434. https://doi.org/10.1177/0018720814547570
- Hur, Y., & Han, J. (2009). Analysis on Children's Tolerance to Weak Recognition of Storytelling Robots. Journal of Convergence Information Technology, 4(3), 103-109. https://doi.org/10.4156/jcit.vol4.issue3.15
- Jessup, S. A., Schneider, T. R., Alarcon, G. M., Ryan, T. J., & Capiola, A. (2019). The measurement of the propensity to trust automation. In International Conference on Human-Computer Interaction, 476-489.
- Jian, J., Bisantz, A., & Drury, C. (2000). Foundations for an empirically determined scale of trust in automated systems. International Journal of Cognitive Ergonomics, 4(1), 53-71. https://doi.org/10.1207/S15327566IJCE0401_04
- Johnson, D. O., & Cuijpers, R. H. (2019). Investigating the Effect of a Humanoid Robot's Head Position on Imitating Human Emotions. International Journal of Social Robotics, 11(1), 65-74. https://doi.org/10.1007/s12369-018-0477-4
- Kahkonen, T., Blomqvist, K., Gillespie, N., & Vanhala, M. (2021). Employee trust repair: A systematic review of 20 years of empirical research and future research directions. Journal of Business Research, 130, 98-109. https://doi.org/10.1016/j.jbusres.2021.03.019
- Kim, P. H., Ferrin, D. L., Cooper, C. D., & Dirks, K. T. (2004). Removing the shadow of suspicion: The effects of apology versus denial for repairing competence-versus integrity- based trust violations. Journal of Applied Psychology, 89(1), 104-118. https://doi.org/10.1037/0021-9010.89.1.104
- Kim, P. H., Dirks, K. T., Cooper, C. D., & Ferrin, D. L. (2006). When more blame is better than less: The implications of internal vs. external attributions for the repair of trust after a competence-vs. integrity-based trust violation. Organizational behavior and human decision processes, 99(1), 49-65. https://doi.org/10.1016/j.obhdp.2005.07.002
- Kim, P. H., Dirks, K. T., & Cooper, C. D. (2009). The repair of trust: A dynamic bilateral perspective and multilevel conceptualization. Academy of Management Review, 34(3), 401-422. https://doi.org/10.5465/AMR.2009.40631887
- Kim, P. H., Cooper, C. D., Dirks, K. T., & Ferrin, D. L. (2013). Repairing trust with individuals vs. groups. Organizational Behavior and Human Decision Processes, 120(1), 1-14. https://doi.org/10.1016/j.obhdp.2012.08.004
- Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50-80. https://doi.org/10.1518/hfes.46.1.50.30392
- Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40(1), 153-184. https://doi.org/10.1006/ijhc.1994.1007
- Lee, K. M., Peng, W., Jin, S. A., & Yan, C. (2006). Can Robots Manifest Personality?: An Empirical Test of Personality Recognition, Social Responses, and Social Presence in Human-Robot Interaction. Journal of Communication, 56(4), 754-772. https://doi.org/10.1111/j.1460-2466.2006.00318.x
- Leung, S. O. (2011). A comparison of psychometric properties and normality in 4-, 5-, 6-, and 11-point Likert scales. Journal of social service research, 37(4), 412-421. https://doi.org/10.1080/01488376.2011.580697
- Lewicki, R. J., & Brinsfield, C. (2017). Trust repair. Annual Review of Organizational Psychology and Organizational Behavior, 4, 287-313. https://doi.org/10.1146/annurev-orgpsych-032516-113147
- Li, J., Cuadra, A., Mok, B., Reeves, B., Kaye, J., & Ju, W. (2019). Communicating Dominance in a Nonanthropomorphic Robot Using Locomotion. ACM Transactions on Human-Robot Interaction, 8(1), 1-14.
- Maddux, W. W., Kim, P. H., Okumura, T., & Brett, J. M. (2011). Cultural differences in the function and meaning of apologies. International negotiation, 16(3), 405-425. https://doi.org/10.1163/157180611X592932
- Madhavan, P., & Wiegmann, D. A. (2005). Effects of information source, pedigree, and reliability on operators' utilization of diagnostic advice. Human Factors and Ergonomics Society Annual Meeting Proceedings, 49(3), 487-491. https://doi.org/10.1177/154193120504900358
- Mayer, R. C., & Davis, J. H. (1999). The effect of the performance appraisal system on trust for management: A field quasi-experiment. Journal of Applied Psychology, 84(1), 123-136. https://doi.org/10.1037/0021-9010.84.1.123
- Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of management review, 20(3), 709-734. https://doi.org/10.5465/AMR.1995.9508080335
- McCall, D., & Kölling, M. (2014). Meaningful categorization of novice programmer errors. In Proceedings of the 2014 IEEE Frontiers in Education Conference, 1-8.
- McColl, D., & Nejat, G. (2014). Recognizing Emotional Body Language Displayed by a Human-like Social Robot. International Journal of Social Robotics, 6(2), 261-280. https://doi.org/10.1007/s12369-013-0226-7
- Mende, M., Scott, M. L., van Doorn, J., Grewal, D., & Shanks, I. (2019). Service Robots Rising: How Humanoid Robots Influence Service Experiences and Elicit Compensatory Consumer Responses. Journal of Marketing Research, 56(4), 535-556. https://doi.org/10.1177/0022243718822827
- Merritt, S. M., & Ilgen, D. R. (2008). Not all trust is created equal: Dispositional and history-based trust in human-automation interactions. Human Factors, 50(2), 194-210. https://doi.org/10.1518/001872008X288574
- Mori, M., MacDorman, K. F., & Kageki, N. (2012). The Uncanny Valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98-100.
- Muir, B. M., & Moray, N. (1996). Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39, 429-460. https://doi.org/10.1080/00140139608964474
- Nass, C. I., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81-103. https://doi.org/10.1111/0022-4537.00153
- Nass, C., Steuer, J., & Tauber, E. R. (1994). Computers are Social Actors. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 72-78).
- Purington, A., Taft, J. G., Sannon, S., Bazarova, N. N., & Taylor, S. H. (2017). "Alexa is my new BFF": Social Roles, User Satisfaction, and Personification of the Amazon Echo. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 2853-2859).
- Robinette, P., Howard, A. M., & Wagner, A. R. (2017). Effect of robot performance on human-robot trust in time-critical situations. IEEE Transactions on Human-Machine Systems, 47(4), 425-436. https://doi.org/10.1109/THMS.2017.2648849
- Salem, M., Eyssel, F., Rohlfing, K., Kopp, S., & Joublin, F. (2013). To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability. International Journal of Social Robotics, 5(3), 313-323. https://doi.org/10.1007/s12369-013-0196-9
- Salem, M., Lakatos, G., Amirabdollahian, F., & Dautenhahn, K. (2015, March). Would you trust a (faulty) robot? Effects of error, task type and personality on human-robot cooperation and trust. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 1-8). IEEE.
- Sanders, T., Kaplan, A., Koch, R., Schwartz, M., & Hancock, P. A. (2019). The relationship between trust and use choice in human-robot interaction. Human Factors: The Journal of Human Factors and Ergonomics Society, 61(4), 614-626. https://doi.org/10.1177/0018720818816838
- Schaefer, K. E. (2016). Measuring Trust in Human Robot Interactions: Development of the "Trust Perception Scale-HRI". In Robust Intelligence and Trust in Autonomous Systems (pp. 191-218). Springer, Boston, MA.
- Sebo, S. S., Krishnamurthi, P., & Scassellati, B. (2019). "I Don't Believe You": Investigating the Effects of Robot Trust Violation and Repair. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (pp. 57-65). IEEE.
- Sebo, S. S., Traeger, M., Jung, M., & Scassellati, B. (2018). The ripple effects of vulnerability: The effects of a robot's vulnerable behavior on trust in human-robot teams. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 178-186.
- Strait, M. K., Floerke, V. A., Ju, W., Maddox, K., Remedios, J. D., Jung, M. F., & Urry, H. L. (2017). Understanding the Uncanny: Both Atypical Features and Category Ambiguity Provoke Aversion toward Humanlike Robots. Frontiers in Psychology, 8, 1366. https://doi.org/10.3389/fpsyg.2017.01366
- Tomlinson, E. C., Dineen, B. R., & Lewicki, R. J. (2004). The road to reconciliation: Antecedents of victim willingness to reconcile following a broken promise. Journal of Management, 30(2), 165-187. https://doi.org/10.1016/j.jm.2003.01.003
- Torre, I., Goslin, J., White, L., & Zanatto, D. (2018). Trust in artificial voices: A "congruency effect" of first impressions and behavioral experience. In Proceedings of the Technology, Mind, and Society, 1-6.
- Traeger, M. L., Sebo, S. S., Jung, M., Scassellati, B., & Christakis, N. A. (2020). Vulnerable Robots Positively Shape Human Conversational Dynamics in a Human-Robot Team. Proceedings of the National Academy of Sciences, 117(12), 6370-6375. https://doi.org/10.1073/pnas.1910402117
- Utz, S., Matzat, U., & Snijders, C. (2009). On-line reputation systems: The effects of feedback comments and reactions on building and rebuilding trust in on-line auctions. International Journal of Electronic Commerce, 13(3), 95-118. https://doi.org/10.2753/JEC1086-4415130304
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. In Advances in Neural Information Processing Systems, 5998-6008.
- Wang, L., Jamieson, G. A., & Hollands, J. G. (2009). Trust and reliance on an automated combat identification system. Human Factors, 51(3), 281-291. https://doi.org/10.1177/0018720809338842
- Weun, S., Beatty, S. E., & Jones, M. A. (2004). The impact of service failure severity on service recovery evaluations and post-recovery relationships. The Journal of Services Marketing, 18(2), 133-146. https://doi.org/10.1108/08876040410528737
- Xu, J., Broekens, J., Hindriks, K., & Neerincx, M. A. (2014). Robot Mood is Contagious: Effects of Robot Body Language in the Imitation Game. In Proceedings of the 2014 International Conference on Autonomous Agents and Multi-agent Systems, 973-980.