
Explainable Knowledge Distillation for Efficient Medical Image Classification

2508.15251v1

Title#

Explainable Knowledge Distillation for Efficient Medical Image Classification

Abstract#

This study comprehensively explores knowledge distillation frameworks for COVID-19 and lung cancer classification using chest X-ray (CXR) images. We employ high-capacity teacher models, including VGG19 and lightweight Vision Transformers (Visformer-S and AutoFormer-V2-T), to guide the training of a compact, hardware-aware student model derived from the OFA-595 supernet. Our approach leverages hybrid supervision, combining ground-truth labels with teacher models' soft targets to balance accuracy and computational efficiency. We validate our models on two benchmark datasets: COVID-QU-Ex and LCS25000, covering multiple classes, including COVID-19, healthy, non-COVID pneumonia, lung, and colon cancer. To interpret the spatial focus of the models, we employ Score-CAM-based visualizations, which provide insight into the reasoning process of both teacher and student networks. The results demonstrate that the distilled student model maintains high classification performance with significantly reduced parameters and inference time, making it an optimal choice in resource-constrained clinical environments. Our work underscores the importance of combining model efficiency with explainability for practical, trustworthy medical AI solutions.
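The hybrid supervision described in the abstract (ground-truth labels combined with the teacher's soft targets) is the standard distillation recipe: a weighted sum of hard-label cross-entropy and a KL-divergence term on temperature-softened logits. The paper's exact loss weights and temperature are not stated here, so the `alpha` and `temperature` values below are illustrative assumptions; this is a minimal PyTorch sketch of that recipe, not the authors' implementation.

```python
import torch.nn.functional as F

def hybrid_kd_loss(student_logits, teacher_logits, targets, alpha=0.5, temperature=4.0):
    """Hybrid KD objective: hard-label cross-entropy plus soft-target distillation.
    alpha and temperature are illustrative defaults, not values from the paper."""
    # Hard-label term: ordinary cross-entropy against the ground-truth classes.
    ce = F.cross_entropy(student_logits, targets)
    # Soft-target term: match the teacher's temperature-softened class distribution.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    kld = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * ce + (1.0 - alpha) * kld
```

During training, the teacher (e.g., VGG19 or Visformer-S) would run in eval mode under `torch.no_grad()` to produce `teacher_logits`, while only the OFA-derived student receives gradients.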
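Score-CAM, used here to visualize where the teacher and student attend, is a gradient-free CAM variant: each activation channel of a chosen layer is upsampled, normalized, used to mask the input, and then weighted by the class score the model assigns to that masked input. The sketch below is a simplified version under assumed interfaces (a `model` that returns logits and a hand-picked `feature_layer`); the original method additionally normalizes the channel weights with a softmax.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def score_cam(model, feature_layer, image, class_idx):
    """Simplified Score-CAM for a single image tensor of shape (1, C, H, W).
    `model` is assumed to be in eval mode and to return class logits."""
    activations = {}
    # Capture the activation maps of the chosen layer on the clean input.
    handle = feature_layer.register_forward_hook(
        lambda module, inputs, output: activations.update(maps=output.detach())
    )
    model(image)
    handle.remove()
    maps = activations["maps"][0]          # (channels, h, w)

    weights = []
    for fmap in maps:
        # Upsample one channel to input resolution and normalize it to [0, 1].
        sal = F.interpolate(fmap[None, None], size=image.shape[-2:],
                            mode="bilinear", align_corners=False)
        sal = (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)
        # The class probability on the masked input becomes the channel weight.
        score = torch.softmax(model(image * sal), dim=1)[0, class_idx]
        weights.append(score)

    # Weighted sum of the maps, ReLU, and upsample for overlay on the input.
    cam = torch.relu((torch.stack(weights)[:, None, None] * maps).sum(dim=0))
    cam = F.interpolate(cam[None, None], size=image.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
```

For a CNN teacher such as VGG19, the last convolutional block is the usual choice of `feature_layer`; transformer models would need their patch tokens reshaped into a spatial grid before a sketch like this applies.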

Article Page#

Explainable Knowledge Distillation for Efficient Medical Image Classification

PDF Access#

View the Chinese PDF - 2508.15251v1
