Reviewing the Need for Explainable Artificial Intelligence (xAI)
dc.contributor.author Gerlings, Julie
dc.contributor.author Shollo, Arisa
dc.contributor.author Constantiou, Ioanna
dc.date.accessioned 2020-12-24T19:14:32Z
dc.date.available 2020-12-24T19:14:32Z
dc.date.issued 2021-01-05
dc.description.abstract The diffusion of artificial intelligence (AI) applications in organizations and society has fueled research on explaining AI decisions. The explainable AI (xAI) field is rapidly expanding, with numerous ways of extracting information from and visualizing the output of AI technologies (e.g., deep neural networks). Yet, we have a limited understanding of how xAI research addresses the need for explainable AI. We conduct a systematic review of the xAI literature on the topic and identify four thematic debates central to how xAI addresses the black-box problem. Based on this critical analysis of the xAI scholarship, we synthesize the findings into a future research agenda to further develop the xAI body of knowledge.
dc.format.extent 10 pages
dc.identifier.doi 10.24251/HICSS.2021.156
dc.identifier.isbn 978-0-9981331-4-0
dc.language.iso English
dc.relation.ispartof Proceedings of the 54th Hawaii International Conference on System Sciences
dc.rights Attribution-NonCommercial-NoDerivatives 4.0 International
dc.subject Explainable Artificial Intelligence (XAI)
dc.subject explainable ai
dc.subject machine learning
dc.subject socio-technical
dc.subject stakeholders
dc.subject xai
dc.title Reviewing the Need for Explainable Artificial Intelligence (xAI)
prism.startingpage 1284