Document Type
Article
Publication Date
12-20-2024
Publication Title
The Journal of Supercomputing
Volume
81
Issue
1
Publisher Name
Springer
Abstract
Deep learning (DL) has become a cornerstone of advances in computer vision, yielding models capable of remarkable performance on complex tasks. Despite these achievements, DL models often exhibit undue confidence; for example, when encountering out-of-distribution (OOD) inputs during inference, they may misclassify with high confidence. In many DL applications such errors are critical, making accurate uncertainty estimates necessary. Our research focuses on implementing and evaluating different uncertainty assessment techniques for DL models. Our findings show each method's computational advantages and challenges, providing researchers with practical insight. Furthermore, we present several real-world use cases of uncertainty estimation, such as image classification, scientific visualization (SciVis), detection of adversarial attacks on classification, and performance improvement of active learning classifiers. These tests used traditional High-Performance Computing (HPC) platforms alongside cutting-edge AI accelerators. With their unique architectures, these platforms exhibited varying efficiency when applying uncertainty estimation.
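The abstract's core observation, that a classifier can be confidently wrong on OOD inputs while repeated stochastic forward passes expose that uncertainty, can be sketched with a toy Monte Carlo Dropout example. This is one common uncertainty-estimation technique; the abstract does not name the specific methods evaluated, so the tiny linear "model" and all names below are purely illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def stochastic_forward(x, w, drop_p=0.5):
    # Dropout left active at inference: each call randomly masks weights,
    # so every pass yields a slightly different prediction (MC Dropout).
    mask = rng.random(w.shape) > drop_p
    return softmax(x @ (w * mask) / (1.0 - drop_p))

# Hypothetical toy "model": one linear layer, 4 features -> 3 classes.
w = rng.normal(size=(4, 3))
x = rng.normal(size=(1, 4))

# T stochastic passes; the spread across passes is the uncertainty signal.
T = 100
probs = np.stack([stochastic_forward(x, w) for _ in range(T)])  # (T, 1, 3)
mean_p = probs.mean(axis=0)

# Predictive entropy of the averaged prediction: near log(3) for highly
# uncertain (e.g. OOD-like) inputs, near 0 for confident ones.
entropy = -(mean_p * np.log(mean_p + 1e-12)).sum()
print("mean prediction:", mean_p, "predictive entropy:", entropy)
```

Because the `T` forward passes are independent, this kind of estimator parallelizes naturally, which is what makes it a candidate for the HPC platforms and AI accelerators compared in the paper.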
Identifier
10.1007/s11227-024-06818-y
Recommended Citation
Guerrero-Pantoja, David, Eric Pautsch, Clara Almeida, Silvio Rizzi, George K. Thiruvathukal, and Maria Pantoja. "Accelerating Uncertainty Methods for Distributed Deep Learning on Novel Architectures." The Journal of Supercomputing, vol. 81, no. 1, 2024, p. 315, https://doi.org/10.1007/s11227-024-06818-y.
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.