
Justice and the normative standards of explainability in healthcare

  • Providing healthcare services frequently involves cognitively demanding tasks, including diagnoses and analyses as well as complex decisions about treatments and therapy. From a global perspective, ethically significant inequalities exist between regions where the expert knowledge required for these tasks is scarce and regions where it is abundant. One possible strategy to diminish such inequalities and increase healthcare opportunities in expert-scarce settings is to provide healthcare solutions involving digital technologies that do not necessarily require the presence of a human expert, e.g., in the form of artificially intelligent decision-support systems (AI-DSS). Such algorithmic decision-making, however, is mostly developed in resource- and expert-abundant settings to support healthcare experts in their work. As a practical consequence, the normative standards and requirements for such algorithmic decision-making in healthcare require the technology to be at least as explainable as the decisions made by the experts themselves. The goal of providing healthcare in settings where resources and expertise are scarce might come with a normative pull to lower the standards of using digital technologies in order to provide at least some healthcare in the first place. We scrutinize this tendency to lower standards in particular settings from a normative perspective, distinguish between different types of absolute and relative, local and global standards of explainability, and conclude by defending an ambitious and practicable standard of local relative explainability.

Metadata
Author: Hendrik Kempt (ORCiD), Nils Freyer, Saskia K. Nagel
DOI:https://doi.org/10.1007/s13347-022-00598-0
Parent Title (English):Philosophy & Technology
Publisher:Springer Nature
Place of publication:Berlin
Document Type:Article
Language:English
Year of Completion:2022
Date of the Publication (Server):2023/08/03
Tag:Clinical decision support systems; Explainability; Justice; Medical AI; Normative standards
Volume:35
Issue:Article number: 100
First Page:1
Last Page:19
Link:https://doi.org/10.1007/s13347-022-00598-0
Access: worldwide
Institutes:FH Aachen / Fachbereich Wirtschaftswissenschaften
Collections: Publisher / Springer Nature
Open Access / Hybrid
Licence: Creative Commons - Attribution (Namensnennung)