TY - JOUR
A1 - Kempt, Hendrik
A1 - Freyer, Nils
A1 - Nagel, Saskia K.
T1 - Justice and the normative standards of explainability in healthcare
JF - Philosophy & Technology
N2 - Providing healthcare services frequently involves cognitively demanding tasks, including diagnoses and analyses as well as complex decisions about treatments and therapy. From a global perspective, ethically significant inequalities exist between regions where the expert knowledge required for these tasks is scarce or abundant. One possible strategy to diminish such inequalities and increase healthcare opportunities in expert-scarce settings is to provide healthcare solutions involving digital technologies that do not necessarily require the presence of a human expert, e.g., in the form of artificially intelligent decision-support systems (AI-DSS). Such algorithmic decision-making, however, is mostly developed in resource- and expert-abundant settings to support healthcare experts in their work. As a practical consequence, the normative standards and requirements for such algorithmic decision-making in healthcare require the technology to be at least as explainable as the decisions made by the experts themselves. The goal of providing healthcare in settings where resources and expertise are scarce might come with a normative pull to lower the standards for using digital technologies in order to provide at least some healthcare in the first place. We scrutinize this tendency to lower standards in particular settings from a normative perspective, distinguish between absolute and relative as well as local and global standards of explainability, and conclude by defending an ambitious and practicable standard of local relative explainability.
KW - Clinical decision support systems
KW - Justice
KW - Medical AI
KW - Explainability
KW - Normative standards
Y1 - 2022
U6 - http://dx.doi.org/10.1007/s13347-022-00598-0
VL - 35
IS - Article number: 100
SP - 1
EP - 19
PB - Springer Nature
CY - Berlin
ER -