1 MS in Computer Science and Engineering, East West University, Dhaka, Bangladesh.
2 BSc in CSE, Dhaka International University.
3 CSE, Daffodil International University.
4 Reproductive and Child Health (RCH), Bangladesh University of Health Science.
5 BTech in Computer Science and Technology, Harbin Institute of Technology.
International Journal of Science and Research Archive, 2025, 17(01), 1093-1108
Article DOI: 10.30574/ijsra.2025.17.1.2930
Received on 21 September 2025; revised on 25 October 2025; accepted on 27 October 2025
This paper explores the current landscape of Explainable Artificial Intelligence (XAI) in healthcare diagnostics, emphasizing its critical role in enhancing transparency, trust, and interpretability of AI-driven medical decision-making. It investigates various XAI methodologies, highlights the challenges faced in clinical integration, and discusses future directions to bridge the gap between AI model complexity and clinician usability. By synthesizing recent advances and practical applications, this study aims to contribute valuable insights for researchers and practitioners striving to foster responsible and effective AI adoption in healthcare diagnostics.
Explainable Artificial Intelligence; Healthcare Diagnostics; Interpretability; Trustworthiness; Machine Learning; Clinical Integration
Efaz Kabir, Md Nyem Hasan Bhuiyan, Mohammad Quayes Bin Habib, Sanjoy Modak and Abrar Shahriar Mahtab. Explainable Artificial Intelligence (XAI) for Healthcare Diagnostics: Current Landscape, Methodologies, Challenges, and Future Directions. International Journal of Science and Research Archive, 2025, 17(01), 1093-1108. Article DOI: https://doi.org/10.30574/ijsra.2025.17.1.2930.
Copyright © 2025. The author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.