Explaining recommendations in an interactive hybrid social recommender

Chun-Hua Tsai, Peter Brusilovsky

Research output: Contribution to conference > Paper


Abstract

Hybrid social recommender systems use social relevance from multiple sources to recommend relevant items or people to users. To make hybrid recommendations more transparent and controllable, several researchers have explored interactive hybrid recommender interfaces, which allow for a user-driven fusion of recommendation sources. In this line of work, the intelligent user interface has been investigated as an approach to increasing transparency and improving the user experience. In this paper, we attempt to further promote the transparency of recommendations by augmenting an interactive hybrid recommender interface with several types of explanations. We evaluate user behavior patterns and subjective feedback through a within-subject study (N=33). Results from the evaluation show the effectiveness of the proposed explanation models. The results of the post-treatment survey indicate a significant improvement in the perception of explainability, but this improvement comes with a lower degree of perceived controllability.

Original language: English (US)
Pages: 391-396
Number of pages: 6
DOIs: 10.1145/3301275.3302318
State: Published - Jan 1 2019
Event: 24th ACM International Conference on Intelligent User Interfaces, IUI 2019 - Marina del Rey, United States
Duration: Mar 17 2019 - Mar 20 2019

Conference

Conference: 24th ACM International Conference on Intelligent User Interfaces, IUI 2019
Country: United States
City: Marina del Rey
Period: 3/17/19 - 3/20/19


All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction

Cite this

Tsai, C.-H., & Brusilovsky, P. (2019). Explaining recommendations in an interactive hybrid social recommender. 391-396. Paper presented at the 24th ACM International Conference on Intelligent User Interfaces, IUI 2019, Marina del Rey, United States. https://doi.org/10.1145/3301275.3302318