Enhancing Understandability of Deep Learning Models for Colorectal Cancer Diagnosis using Explainable AI
DOI: https://doi.org/10.24135/rangahau-aranga.v4i1.255

Abstract
In recent decades, cancer has emerged as one of the primary causes of death globally. Colorectal Cancer (CRC) is a serious form of cancer with high incidence and mortality rates in developed nations. Deep Neural Networks (DNNs) have shown remarkable performance in the classification of CRC polyps from endoscopy images. However, some clinicians have reservations about automated cancer diagnosis because the decision-making process is not easily understandable. Owing to the black-box nature of DNNs, it is difficult to determine which features contribute to the model's predictions. Therefore, this paper aims to illustrate how deep learning models arrive at a prediction and to build practitioners' trust by offering insight into the inner workings of the model.
This study used endoscopy datasets from various public and private sources. Synthetic endoscopic images were produced using a Conditional Deep Convolutional Generative Adversarial Network (CDCGAN), and Grad-CAM (Gradient-weighted Class Activation Mapping) was implemented to visualize the image regions contributing to the final decision of the DNN.
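To make the Grad-CAM step concrete, the sketch below shows how a class-activation heatmap can be computed for a single endoscopy frame in PyTorch. It is a minimal illustration only: the abstract does not specify the backbone, target layer, input size, or preprocessing, so the ResNet-50 backbone, the layer4[-1] target layer, and the 224x224 placeholder input used here are assumptions for demonstration, not the study's implementation.

```python
# Minimal Grad-CAM sketch (PyTorch). The backbone, target layer, and input size
# are illustrative assumptions; the study's implementation details are not published here.
import torch
import torch.nn.functional as F
from torchvision import models

def grad_cam(model, image, target_layer, class_idx=None):
    """Return a Grad-CAM heatmap (H x W, values in [0, 1]) for a single image tensor."""
    activations, gradients = [], []

    def fwd_hook(module, inputs, output):
        # Keep the feature maps and attach a hook to capture their gradient during backward.
        activations.append(output)
        output.register_hook(lambda grad: gradients.append(grad))

    handle = target_layer.register_forward_hook(fwd_hook)
    model.eval()

    logits = model(image)                                # image: (1, 3, H, W)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()          # explain the predicted class
    model.zero_grad()
    logits[0, class_idx].backward()                      # gradient of the class score
    handle.remove()

    acts, grads = activations[0], gradients[0]           # both (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)       # global-average-pool the gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))  # weighted sum of feature maps
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalise to [0, 1]
    return cam.squeeze().detach()

# Illustrative usage on a generic pretrained backbone (a stand-in, not the study's model).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
frame = torch.rand(1, 3, 224, 224)    # placeholder for a preprocessed endoscopy frame
heatmap = grad_cam(model, frame, target_layer=model.layer4[-1])
print(heatmap.shape)                  # torch.Size([224, 224])
```

The resulting heatmap can be overlaid on the original frame so that the regions driving the classification are visible to the clinician.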
The high-quality, diverse images generated by the CDCGAN were found to make the model's decisions more understandable. Moreover, the proposed model distinguished between polyp types despite their complex structural differences. The presented model maintained high accuracy while improving clinicians' trust in computer-aided diagnosis by offering insight into the decision-making process.
This presentation will introduce the proposed model, which improves the explainability of DNN models by providing visual explanations that highlight the areas of CRC polyp images contributing to the decision-making process. It will be demonstrated that this approach can enhance the use of synthetic images for better visualization, and that clinical adoption of automated tools could strengthen the collaboration of humans and AI in virtual biopsies.