Distributionally Robust Recurrent Decoders with Random Network Distillation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural machine learning models can successfully model language that is similar to their training distribution, but they are highly susceptible to degradation under distribution shift, which occurs in many practical applications when processing out-of-domain (OOD) text. This has been attributed to "shortcut learning": relying on weak correlations over arbitrarily large contexts. We propose a method based on OOD detection with Random Network Distillation to allow an autoregressive language model to automatically disregard OOD context during inference, smoothly transitioning towards a less expressive but more robust model as the data becomes more OOD, while retaining its full context capability when operating in-distribution. We apply our method to a GRU architecture, demonstrating improvements on multiple language modeling (LM) datasets.
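The sketch below is a minimal illustration of the idea described in the abstract, not the authors' implementation: a Random Network Distillation scorer (a frozen, randomly initialised target network plus a predictor trained on in-distribution data) yields a per-step novelty score, which gates a GRU decoder's hidden state between a full-context update and a context-free update. The class names (RNDScorer, GatedGRUDecoder), the mapping from score to gate, and all layer sizes are assumptions chosen for illustration.

import torch
import torch.nn as nn

class RNDScorer(nn.Module):
    # Random Network Distillation: a frozen random target network and a predictor
    # trained (on in-distribution data) to match it. At inference, the prediction
    # error is small in-distribution and grows on OOD inputs.
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.target = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        self.predictor = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        for p in self.target.parameters():
            p.requires_grad_(False)  # the target stays fixed

    def forward(self, x):
        # Per-example squared prediction error, used as a novelty score.
        return ((self.predictor(x) - self.target(x).detach()) ** 2).mean(dim=-1)

class GatedGRUDecoder(nn.Module):
    # GRU language model whose hidden state is interpolated towards a
    # context-free state when the RND score flags the context as OOD.
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.cell = nn.GRUCell(emb_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.scorer = RNDScorer(hid_dim)

    def forward(self, tokens):  # tokens: (batch, time) token ids
        batch, time = tokens.shape
        h = torch.zeros(batch, self.cell.hidden_size, device=tokens.device)
        logits = []
        for t in range(time):
            x = self.embed(tokens[:, t])
            h_full = self.cell(x, h)                     # full-context update
            h_reset = self.cell(x, torch.zeros_like(h))  # context-free update
            score = self.scorer(h_full)                  # novelty score, >= 0
            # Assumed squashing: gate ~ 0 in-distribution, -> 1 as the score grows,
            # so full context is retained in-distribution and discarded when OOD.
            gate = (1.0 - torch.exp(-score)).unsqueeze(-1)
            h = (1.0 - gate) * h_full + gate * h_reset
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)  # (batch, time, vocab)

Usage example (shapes only): model = GatedGRUDecoder(vocab_size=10000); logits = model(torch.randint(0, 10000, (4, 16))) gives logits of shape (4, 16, 10000).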
Original language: English
Title of host publication: Proceedings of the 7th Workshop on Representation Learning for NLP
Editors: Spandana Gella, He He, Bodhisattwa Prasad Majumdar, Burcu Can, Eleonora Giunchiglia, Samuel Cahyawijaya, Sewon Min, Maximillian Mozes, Xiang Lorraine Li, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Laura Rimell, Chris Dyer
Place of Publication: Dublin, Ireland
Publisher: Association for Computational Linguistics
Pages: 1-8
Number of pages: 8
ISBN (Electronic): 978-1-955917-48-3
DOIs
Publication status: Published - 3 Jun 2022
Event: The 7th Workshop on Representation Learning for NLP - Dublin, Ireland
Duration: 26 May 2022 - 26 May 2022
Conference number: 7
https://sites.google.com/view/repl4nlp2022/home

Workshop

Workshop: The 7th Workshop on Representation Learning for NLP
Abbreviated title: Repl4NLP 2022
Country/Territory: Ireland
City: Dublin
Period: 26/05/22 - 26/05/22
Internet address: https://sites.google.com/view/repl4nlp2022/home
