Please use this identifier to cite or link to this item:
http://conacyt.repositorioinstitucional.mx/jspui/handle/1000/4484
Estimating Uncertainty and Interpretability in Deep Learning for Coronavirus (COVID-19) Detection
Biraja Ghoshal; Allan Tucker
Open Access
Attribution-NonCommercial-NoDerivatives
https://arxiv.org/pdf/2003.10769v2.pdf
Deep Learning has achieved state-of-the-art performance in medical imaging. However, these methods for disease detection focus exclusively on improving the accuracy of classification or prediction without quantifying the uncertainty in a decision. Knowing how much confidence there is in a computer-based medical diagnosis is essential for gaining clinicians' trust in the technology and, therefore, for improving treatment. Today, 2019 Coronavirus (SARS-CoV-2) infections are a major healthcare challenge around the world. Detecting COVID-19 in X-ray images is crucial for diagnosis, assessment and treatment. However, reporting diagnostic uncertainty is a challenging yet inevitable task for radiologists. In this paper, we investigate how drop-weights based Bayesian Convolutional Neural Networks (BCNN) can estimate uncertainty in a Deep Learning solution to improve the diagnostic performance of the human-machine team, using a publicly available COVID-19 chest X-ray dataset, and show that the uncertainty in a prediction is highly correlated with the accuracy of that prediction. We believe that the availability of uncertainty-aware deep learning solutions will enable wider adoption of Artificial Intelligence (AI) in clinical settings.
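As a rough illustration of the idea summarized in the abstract (not the authors' code), the sketch below uses Monte Carlo dropout, a common approximation closely related to the drop-weights BCNN approach: dropout layers are kept active at test time, several stochastic forward passes are averaged, and the predictive entropy is used as an uncertainty score. The architecture, layer sizes, class names and number of samples are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class SmallBayesianCNN(nn.Module):
    """Toy CNN whose dropout layers remain active at inference time."""
    def __init__(self, num_classes: int = 2, p_drop: float = 0.3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Dropout2d(p_drop),            # sampled on every forward pass
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Dropout(p_drop), nn.Linear(32, num_classes)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def predict_with_uncertainty(model, x, n_samples: int = 30):
    """Average several stochastic forward passes; return mean softmax
    probabilities and the predictive entropy as an uncertainty score."""
    model.train()  # keeps dropout sampling enabled (no batch norm in this toy model)
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=1) for _ in range(n_samples)]
        )
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * torch.log(mean_probs + 1e-12)).sum(dim=1)
    return mean_probs, entropy

# Usage on a dummy batch of single-channel chest X-ray sized inputs:
model = SmallBayesianCNN()
x = torch.randn(4, 1, 224, 224)
mean_probs, uncertainty = predict_with_uncertainty(model, x)
```

Higher entropy values flag predictions the model is unsure about, which is the behaviour the paper reports as correlating with prediction accuracy.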
arxiv.org | |
2020 | |
Article
https://arxiv.org/pdf/2003.10769v2.pdf | |
English
RESPIRATORY VIRUSES
Appears in collections: Scientific articles
Files in this item:

File | Size | Format
---|---|---
1106647.pdf | 1.81 MB | Adobe PDF