varMax: Uncertainty and Novelty Management in Deep Neural Networks
Date
2025-01-07
Starting Page
7512
Abstract
Traditional deep neural networks often struggle with new or unfamiliar data patterns because they operate under a closed-set assumption. This challenge stems from inherent limitations of the model architecture, such as the softmax function commonly used for classification, which tends to produce overconfident and inaccurate predictions on novel inputs. Prior studies have highlighted the need for open-set recognition (OSR) techniques to differentiate between known and unknown data points, but existing approaches often exhibit a bias toward flagging inputs as unknown. To address this issue, we introduce VarMax, a novel OSR technique designed to maintain a balanced approach. VarMax leverages the variance in the model's predictions to discern between known and unknown inputs: ambiguous samples are classified by their prediction variance, enabling detection of out-of-distribution samples and enhancing classification accuracy and reliability. Our experiments demonstrate that VarMax matches and exceeds the performance of existing methods in identifying unknown data points, while also improving the model's confidence and robustness in distinguishing between known and unknown inputs.
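The variance-based decision rule summarized in the abstract can be sketched as follows. The record does not give VarMax's exact scoring function, so this is a minimal illustration under stated assumptions: the score is taken as the variance of the absolute logit vector, and the threshold `tau` is a hypothetical hyperparameter tuned on validation data.

```python
import numpy as np

def varmax_score(logits):
    """Variance of the absolute logit values. A flat logit vector
    (low variance) suggests the model cannot commit to any known
    class. Illustrative assumption: the paper's exact score may differ."""
    return np.var(np.abs(logits))

def classify_open_set(logits, tau=1.0):
    """Return the predicted known-class index, or -1 for 'unknown'.
    `tau` is a hypothetical variance threshold (an assumption here)."""
    if varmax_score(logits) < tau:
        return -1  # low spread across logits -> treat as out-of-distribution
    return int(np.argmax(logits))

# A confident prediction has one dominant logit (high variance);
# an ambiguous input yields a nearly uniform logit vector (low variance).
confident = np.array([8.0, 0.5, 0.3, 0.2])
ambiguous = np.array([1.1, 1.0, 0.9, 1.0])
```

Unlike max-softmax thresholding, which is biased toward overconfidence on novel inputs, a variance-based score reads the spread of the whole output vector rather than a single peak.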
Keywords
Trustworthy Artificial Intelligence and Machine Learning, deep neural networks, open-set recognition, uncertainty management
Extent
10
Related To
Proceedings of the 58th Hawaii International Conference on System Sciences
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International