Document detail
ID

oai:arXiv.org:2404.14716

Topic
Computer Science - Computation and...; Computer Science - Artificial Inte...; Computer Science - Computer Vision...; Computer Science - Sound; Electrical Engineering and Systems...
Author
Wang, Siyin; Yang, Chao-Han Huck; Wu, Ji; Zhang, Chao
Category

Computer Science

Year

2024

Listing date

6/19/2024

Keywords
based, speech, probability, ICL, ByCS, inverse, selection, example, examples, in-context, inference, computer science
Abstract

Large language models (LLMs) can adapt to new tasks through in-context learning (ICL) based on a few examples presented in dialogue history without any model parameter update.

Despite such convenience, the performance of ICL heavily depends on the quality of the in-context examples presented, which makes the in-context example selection approach a critical choice.

This paper proposes a novel Bayesian in-Context example Selection method (ByCS) for ICL.

Extending the inference probability conditioned on in-context examples based on Bayes' theorem, ByCS focuses on the inverse inference conditioned on test input.

Following the assumption that accurate inverse inference probability (likelihood) will result in accurate inference probability (posterior), in-context examples are selected based on their inverse inference results.
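The inverse-inference selection step described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the `log_likelihood` scorer stands in for an LLM's conditional log-probability of a continuation given a prompt, and is stubbed here with a toy token-overlap measure purely so the snippet runs.

```python
def log_likelihood(prompt: str, continuation: str) -> float:
    # Toy stand-in for an LLM scorer: fraction of continuation tokens
    # that also appear in the prompt. A real system would use the model's
    # conditional log-probability instead.
    prompt_tokens = set(prompt.split())
    cont_tokens = continuation.split()
    if not cont_tokens:
        return float("-inf")
    overlap = sum(t in prompt_tokens for t in cont_tokens)
    return overlap / len(cont_tokens)


def select_examples(test_input: str, candidates: list[tuple[str, str]], k: int):
    """Rank candidate (input, label) pairs by an inverse-inference score:
    how well the test input is predicted when conditioned on the candidate
    example, then keep the top-k as in-context examples."""
    scored = []
    for cand_input, cand_label in candidates:
        # Inverse inference: condition on the candidate example and
        # score the test input as the continuation.
        score = log_likelihood(f"{cand_input}\n{cand_label}", test_input)
        scored.append((score, cand_input, cand_label))
    scored.sort(key=lambda t: t[0], reverse=True)
    return [(ci, cl) for _, ci, cl in scored[:k]]
```

Under this sketch, candidates whose content makes the test input more predictable are ranked first, matching the assumption that a high inverse inference likelihood indicates a useful in-context example.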

Diverse and extensive cross-tasking and cross-modality experiments are performed with speech, text, and image examples.

Experimental results show the efficacy and robustness of our ByCS method on various models, tasks and modalities.

Comment: 17 pages, 6 figures

Wang, Siyin; Yang, Chao-Han Huck; Wu, Ji; Zhang, Chao, 2024, Bayesian Example Selection Improves In-Context Learning for Speech, Text, and Visual Modalities

