BERTs are Generative In-Context Learners

Abstract: While in-context learning is commonly associated with causal language models, such as GPT, we demonstrate that this capability also 'emerges' in masked language models. Through an embarrassingly simple inference technique, we enable an existing masked model, DeBERTa, to perform generative tasks without additional training or architectural changes. Our evaluation reveals that the masked and causal language models behave very differently, as they clearly outperform each other on different categories of tasks.
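The abstract only gestures at the inference trick, but one natural reading of using a masked model generatively is to drive it pseudo-autoregressively: append a [MASK] token to the prompt, let the model fill it in, commit the prediction, and repeat. The sketch below illustrates that idea with Hugging Face Transformers; the checkpoint name, the single-mask setup, and greedy decoding are assumptions for illustration, not necessarily the paper's exact recipe.

    # Minimal sketch: pseudo-autoregressive generation with a masked LM.
    # Assumptions: microsoft/deberta-v2-xxlarge checkpoint, one [MASK]
    # per step, greedy decoding. The paper's actual technique may differ.
    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    model_name = "microsoft/deberta-v2-xxlarge"  # assumed checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForMaskedLM.from_pretrained(model_name).eval()

    @torch.no_grad()
    def generate(prompt: str, max_new_tokens: int = 20) -> str:
        # Tokenize and drop the trailing [SEP]; we re-append it each step.
        ids = tokenizer(prompt, return_tensors="pt").input_ids[0, :-1]
        tail = torch.tensor([tokenizer.mask_token_id, tokenizer.sep_token_id])
        for _ in range(max_new_tokens):
            # Predict the token at the [MASK] slot, commit it, repeat.
            logits = model(torch.cat([ids, tail]).unsqueeze(0)).logits[0]
            next_id = logits[len(ids)].argmax()
            ids = torch.cat([ids, next_id.unsqueeze(0)])
        return tokenizer.decode(ids, skip_special_tokens=True)

    print(generate("The capital of France is"))

Each step costs a full forward pass over the growing sequence, which matches the trade-off the abstract describes: generation with no additional training or architectural changes, just less efficiently than a causal model with a key-value cache.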

Read more at arxiv.org
