Deep Learning 2025-03-29
Explainable AI in Radiology
Research into interpretable AI methods for radiology, focusing on attention visualization, saliency mapping, and confidence calibration — techniques that enable radiologists to understand and trust AI-generated findings in clinical workflows.
5C Network Research Team · arXiv · DOI: 10.48550/arXiv.2503.14536
Key Findings
- Attention visualization reveals which image regions drive AI diagnostic predictions, building radiologist trust
- Vision-language model outputs provide natural language explanations alongside quantitative confidence scores
- Multimodal reasoning over imaging and clinical history produces more interpretable diagnostic rationale
- A framework applied to chronic tuberculosis diagnostics demonstrates how explainability improves clinical adoption
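The saliency-mapping idea above can be illustrated with a toy differentiable model. This is a minimal NumPy sketch, not the paper's method: for a logistic classifier over pixel intensities, the saliency of each pixel is the magnitude of the gradient of the predicted probability with respect to that pixel — the model weights and "image" here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy flattened 4x4 "image" and a hypothetical trained linear classifier
# (weights are illustrative, not from a real radiology model).
x = rng.random(16)          # pixel intensities in [0, 1)
w = rng.normal(size=16)     # illustrative classifier weights
b = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Predicted probability of the positive finding.
p = sigmoid(w @ x + b)

# Gradient-based saliency: |dp/dx_i| = |w_i| * p * (1 - p).
# Larger values mark pixels whose perturbation most changes the prediction.
saliency = np.abs(w) * p * (1.0 - p)
saliency_map = (saliency / saliency.max()).reshape(4, 4)  # normalized for display
```

Overlaying such a normalized map on the input image is what lets a radiologist see which regions drove the prediction; deep models replace the closed-form gradient with backpropagation, but the principle is the same.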
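Confidence calibration of the kind the findings describe is commonly done post hoc with temperature scaling: divide the logits by a scalar T fitted on held-out data to minimize negative log-likelihood. A sketch under the assumption of a softmax classifier — the logits below are synthetic and deliberately overconfident, not taken from the paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    """Average negative log-likelihood at temperature T."""
    p = softmax(logits, T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

rng = np.random.default_rng(0)
n, k = 200, 3
labels = rng.integers(0, k, size=n)

# Synthetic overconfident model: correct class boosted, logits then scaled up,
# so raw softmax probabilities are sharper than the true accuracy warrants.
logits = rng.normal(size=(n, k))
logits[np.arange(n), labels] += 1.5
logits *= 3.0

# Fit T on held-out data with a simple 1-D grid search over the NLL.
grid = np.linspace(0.5, 10.0, 200)
T_star = grid[np.argmin([nll(logits, labels, T) for T in grid])]
```

Because temperature scaling rescales all logits by one scalar, it leaves the predicted class unchanged and only softens (or sharpens) the reported confidence — which is why it pairs naturally with showing quantitative confidence scores alongside explanations.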