
Questioning the ability of feature-based explanations to empower non-experts in robo-advised financial decision-making

  • Additional Information
    • Contributors:
      Design, Interaction, Visualization & Applications (DIVA); Laboratoire Traitement et Communication de l'Information (LTCI); Institut Mines-Télécom Paris (IMT)-Télécom Paris-Institut Mines-Télécom Paris (IMT)-Télécom Paris; Département Informatique et Réseaux (INFRES); Télécom ParisTech; Numérique, Organisation et Société (NOS); Institut interdisciplinaire de l’innovation de Telecom Paris (I3 SES); Télécom Paris-Institut interdisciplinaire de l’innovation (I3); École polytechnique (X); Institut Polytechnique de Paris (IP Paris)-Institut Polytechnique de Paris (IP Paris)-Mines Paris - PSL (École nationale supérieure des mines de Paris); Université Paris Sciences et Lettres (PSL)-Université Paris Sciences et Lettres (PSL)-Centre National de la Recherche Scientifique (CNRS)-Télécom Paris-École polytechnique (X); Université Paris Sciences et Lettres (PSL)-Université Paris Sciences et Lettres (PSL)-Centre National de la Recherche Scientifique (CNRS)-Télécom Paris-Institut interdisciplinaire de l’innovation (I3); Université Paris Sciences et Lettres (PSL)-Université Paris Sciences et Lettres (PSL)-Centre National de la Recherche Scientifique (CNRS); Département Sciences Economiques et Sociales (SES); Télécom Paris
    • Publication Data:
      HAL CCSD
      ACM
    • Publication Year:
      2023
    • Collection:
      École Polytechnique, Université Paris-Saclay: HAL
    • Abstract:
      Robo-advisors are democratizing access to life insurance by enabling fully online underwriting. In Europe, financial legislation requires that the reasons for recommending a life insurance plan be explained according to the characteristics of the client, in order to empower the client to make a "fully informed decision". In this study conducted in France, we seek to understand whether legal requirements for feature-based explanations actually help users in their decision-making. We conduct a qualitative study to characterize the explainability needs formulated by non-expert users and by regulators with expertise in customer protection. We then run a large-scale quantitative study using Robex, a simplified robo-advisor built using ecological interface design that delivers recommendations with explanations in different hybrid textual and visual formats: either "dialogic" (more textual) or "graphical" (more visual). We find that providing feature-based explanations does not improve appropriate reliance or understanding compared to not providing any explanation. In addition, dialogic explanations increase users' trust in the recommendations of the robo-advisor, sometimes to the users' detriment. This real-world scenario illustrates how XAI can address information asymmetry in complex areas such as finance. This work has implications for other critical, AI-based recommender systems, where the General Data Protection Regulation (GDPR) may require similar provisions for feature-based explanations. CCS CONCEPTS: • Human-centered computing → Empirical studies in HCI.
    • Relation:
      hal-04125939; https://hal.science/hal-04125939; https://hal.science/hal-04125939/document; https://hal.science/hal-04125939/file/facct23-83%20%281%29.pdf
    • Identifier (DOI):
      10.1145/3593013.3594053
    • Rights:
      info:eu-repo/semantics/OpenAccess
    • Identifier:
      edsbas.3D9BEEFD