Let Me Explain: Impact of Personal and Impersonal Explanations on Trust in Recommender Systems

Kunkel, J., Donkers, T., Michael, L., Barbu, C.-M., & Ziegler, J. (2019). Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), 487:1–487:12.

Thesis by Lisa Michael

Abstract

Trust in a Recommender System (RS) is crucial for its overall success. However, it remains underexplored whether users trust personal recommendation sources (i.e., other humans) more than impersonal sources (i.e., conventional RS), and, if they do, whether the perceived quality of the explanations provided accounts for the difference. We conducted an empirical study in which we compared these two sources of recommendations and explanations. Human advisors were asked to explain the movies they recommended in short texts, while the RS generated explanations based on item similarity. Our experiment comprised two rounds of recommending. In both rounds, the quality of the explanations provided by users was rated higher than the quality of the system’s explanations. Moreover, explanation quality significantly influenced both perceived recommendation quality and trust in the recommendation source. Consequently, we suggest that RS should provide richer explanations in order to increase their perceived recommendation quality and trustworthiness.

Resources

Related publications

Trust-Related Effects of Expertise and Similarity Cues in Human-Generated Recommendations

The Influence of Trust Cues on the Trustworthiness of Online Reviews for Recommendations
