Paper Title


Domain aware medical image classifier interpretation by counterfactual impact analysis

Authors

Dimitrios Lenis, David Major, Maria Wimmer, Astrid Berg, Gert Sluiter, Katja Bühler

Abstract


The success of machine learning methods for computer vision tasks has driven a surge in computer assisted prediction for medicine and biology. Based on a data-driven relationship between input image and pathological classification, these predictors deliver unprecedented accuracy. Yet, the numerous approaches trying to explain the causality of this learned relationship have fallen short: time constraints, coarse, diffuse and at times misleading results, caused by the employment of heuristic techniques like Gaussian noise and blurring, have hindered their clinical adoption. In this work, we discuss and overcome these obstacles by introducing a neural-network based attribution method, applicable to any trained predictor. Our solution identifies salient regions of an input image in a single forward-pass by measuring the effect of local image-perturbations on a predictor's score. We replace heuristic techniques with a strong neighborhood conditioned inpainting approach, avoiding anatomically implausible, hence adversarial artifacts. We evaluate on public mammography data and compare against existing state-of-the-art methods. Furthermore, we exemplify the approach's generalizability by demonstrating results on chest X-rays. Our solution shows, both quantitatively and qualitatively, a significant reduction of localization ambiguity and clearer conveying results, without sacrificing time efficiency.
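The counterfactual principle the abstract describes — scoring a region by how much the predictor's output drops when that region is replaced with a plausible inpainted version — can be sketched as follows. Note this is a naive sliding-window illustration of the underlying idea, not the paper's method (which produces the attribution map in a single forward pass of a trained network); `predict` and `inpaint` are hypothetical placeholders for a trained classifier and a neighborhood-conditioned inpainter.

```python
import numpy as np

def attribution_map(image, predict, inpaint, patch=8):
    """Counterfactual impact sketch: replace each local patch with an
    inpainted counterfactual and record the resulting drop in the
    predictor's score. `predict` and `inpaint` are placeholders."""
    base = predict(image)                     # score on the unperturbed image
    h, w = image.shape
    attr = np.zeros_like(image, dtype=float)
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            perturbed = image.copy()
            # substitute an anatomically plausible fill for this region
            perturbed[y:y + patch, x:x + patch] = inpaint(image, (y, x, patch))
            # large score drop => region was salient for the prediction
            attr[y:y + patch, x:x + patch] = base - predict(perturbed)
    return attr
```

Replacing the heuristic fill (Gaussian noise, blurring) with a conditioned inpainting model is what keeps the perturbed images on the data manifold and avoids the adversarial artifacts the abstract mentions.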
