Challenges in Explaining Brain Tumor Detection

Benedicte Legastelois, Amy Rafferty, Paul Brennan, Hana Chockler, Ajitha Rajan, Vaishak Belle

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Explanations for AI are a crucial part of autonomous systems: they increase users' confidence, provide an interpretation of an otherwise black-box system, and can serve as an interface between the user and the AI system. Explanations are set to become mandatory for all AI systems influencing people (see, for example, the upcoming EU AI Act). While explanations of image classifiers have so far focused on images of objects, such as those in ImageNet, there is an important area of application for them, namely, healthcare. In this paper we focus on a particular area of healthcare: the use of CNN machine-learning models for cancer detection in MRI brain images. We compare a number of explanation techniques and analyse whether they provide helpful and adequate explanations. We argue that the requirements for explanations in healthcare are different from those for generic images, and that existing explanation techniques fall short in the healthcare domain.

Original language: English
Title of host publication: TAS 2023 - Proceedings of the 1st International Symposium on Trustworthy Autonomous Systems
Publisher: ACM (Association for Computing Machinery)
Number of pages: 8
ISBN (Electronic): 9798400707346
Publication status: Published - 11 Jul 2023
Event: 1st International Symposium on Trustworthy Autonomous Systems, TAS 2023 - Edinburgh, United Kingdom
Duration: 11 Jul 2023 - 12 Jul 2023

Publication series

Name: ACM International Conference Proceeding Series


Conference: 1st International Symposium on Trustworthy Autonomous Systems, TAS 2023
Country/Territory: United Kingdom

Keywords / Materials (for Non-textual outputs)

  • MRI
  • Image classification
  • Explanations
