ECAI 2008 Workshop on Recommender Systems
The user is the addressee of the explanation, which is presented by the explainer; the explainer chooses the form of the explanation and organises a dialogue if needed. In order to provide good explanations, originator and explainer need to be tightly integrated. Not all of the knowledge needed for explaining is available from the originator; additional knowledge needs to be acquired for the explainer.

Explanations are strongly associated with trust and transparency. One trusts a knowledge-based system much more if it is able to explain what it is doing and can thus "prove" its trustworthiness to its user. Any information system, and even more so any recommender system, should be able to explain at any point in time why it prefers solution A over solution B. Furthermore, it should also clarify the meaning of the concepts it uses and where an information item originally came from ("knowledge provenance"). Explanations are part of human understanding processes and of most dialogues, and therefore need to be incorporated into system interactions in order to improve decision-making processes.

Case-Based Reasoning (CBR) systems are well suited for building content-based recommender systems, as demonstrated by their use in e-commerce scenarios. The case base is often filled from product catalogues. Modelling and editing the similarity measures for products is a complex task for the knowledge engineer, who can be supported by explanations about the structure and content of the case base. Such support features have been implemented in the open source CBR tool myCBR 2 [4]. I will illustrate myCBR's explanation capabilities [1, 2] using an online shop scenario.
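To make the similarity-measure modelling concrete, the following is a minimal sketch of the usual CBR scheme: attribute-local similarity functions amalgamated into a weighted global score. All names here (the attributes, weights, and shop data) are illustrative assumptions, not myCBR's actual API; the per-attribute contributions it returns are exactly the raw material an explainer could use to justify why product A outranks product B.

```python
def numeric_sim(query, case, max_diff):
    """Local similarity for a numeric attribute: 1.0 at equality,
    falling linearly to 0.0 at a difference of max_diff."""
    return max(0.0, 1.0 - abs(query - case) / max_diff)

def symbolic_sim(query, case, table):
    """Local similarity for a symbolic attribute via a lookup table
    of (query_value, case_value) -> similarity; 1.0 on exact match."""
    return 1.0 if query == case else table.get((query, case), 0.0)

def global_sim(query, case, weights):
    """Weighted-average amalgamation of the local similarities.
    Returns the global score plus the per-attribute contributions,
    which an explanation component can show to the user or engineer."""
    contribs = {
        "price": numeric_sim(query["price"], case["price"], max_diff=500),
        "brand": symbolic_sim(query["brand"], case["brand"],
                              {("Acme", "AcmeLite"): 0.8}),  # hypothetical brands
    }
    total_weight = sum(weights.values())
    score = sum(weights[a] * s for a, s in contribs.items()) / total_weight
    return score, contribs

# Illustrative query and two candidate products from a shop catalogue.
query  = {"price": 300, "brand": "Acme"}
case_a = {"price": 350, "brand": "Acme"}      # close price, same brand
case_b = {"price": 300, "brand": "AcmeLite"}  # exact price, similar brand
weights = {"price": 1.0, "brand": 2.0}        # brand matters more here

score_a, why_a = global_sim(query, case_a, weights)
score_b, why_b = global_sim(query, case_b, weights)
```

With these assumed weights, case A scores higher than case B, and the `why_a`/`why_b` dictionaries let the system explain the preference attribute by attribute (A wins on the heavily weighted brand match despite its worse price).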