Document detail
ID

oai:arXiv.org:2401.08909

Topic
Computer Science - Machine Learning
Author
Xie, Renchunzi; Odonnat, Ambroise; Feofanov, Vasilii; Redko, Ievgen; Zhang, Jianfeng; An, Bo
Category

Computer Science

Year

2024

Listing date

3/6/2024

Keywords
ground-truth test distribution
Metrics

Abstract

Estimating test accuracy without access to the ground-truth test labels under varying test environments is a challenging, yet extremely important problem in the safe deployment of machine learning algorithms.

Existing works rely on the information from either the outputs or the extracted features of neural networks to formulate an estimation score correlating with the ground-truth test accuracy.

In this paper, we investigate, both empirically and theoretically, how the information provided by the gradients can be predictive of the ground-truth test accuracy even under a distribution shift.

Specifically, we use the norm of classification-layer gradients, backpropagated from the cross-entropy loss after only one gradient step over test data.

Our key idea is that the model should be adjusted with a higher magnitude of gradients when it does not generalize to the test dataset with a distribution shift.
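The computation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name gradient_norm_score, the use of the model's own argmax predictions as cross-entropy targets, the choice of the L2 norm, and the averaging over batches are assumptions made only to keep the example self-contained; the one point taken directly from the abstract is that only the classification-layer gradients from a single backward pass over test data are inspected.

```python
import torch
import torch.nn.functional as F

def gradient_norm_score(model, classifier_head, test_loader, device="cpu"):
    """Average classification-layer gradient norm over unlabeled test batches (sketch)."""
    model.to(device).eval()
    norms = []
    for inputs, _ in test_loader:  # assumes (inputs, labels) batches; labels are never used
        inputs = inputs.to(device)
        model.zero_grad()
        logits = model(inputs)
        # Hypothetical target choice for the cross-entropy loss:
        # the model's own hard predictions (pseudo-labels).
        targets = logits.argmax(dim=1).detach()
        loss = F.cross_entropy(logits, targets)
        loss.backward()  # one backward pass per test batch
        # Inspect gradients of the classification (last linear) layer only.
        grads = torch.cat([p.grad.detach().flatten()
                           for p in classifier_head.parameters()])
        norms.append(grads.norm().item())
    # Intuition from the abstract: a larger norm suggests the model would need a
    # bigger adjustment to fit the shifted test data, i.e. lower test accuracy.
    return sum(norms) / len(norms)
```

In practice the resulting score would be correlated against accuracy measured on held-out labeled shifts; how that correlation is calibrated, and which norm and target construction the paper actually uses, are details not specified in this abstract.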

We provide theoretical insights highlighting the main ingredients of such an approach that ensure its empirical success.

Extensive experiments conducted on diverse distribution shifts and model structures demonstrate that our method significantly outperforms state-of-the-art algorithms.

Xie, Renchunzi; Odonnat, Ambroise; Feofanov, Vasilii; Redko, Ievgen; Zhang, Jianfeng; An, Bo, 2024, Leveraging Gradients for Unsupervised Accuracy Estimation under Distribution Shift
