Understanding Domain Learning in Language Models Through Subpopulation Analysis

Zheng Zhao, Yftah Ziser, Shay Cohen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We investigate how different domains are encoded in modern neural network architectures. We analyze the relationship between natural language domains, model size, and the amount of training data used. The primary analysis tool we develop is based on subpopulation analysis with Singular Vector Canonical Correlation Analysis (SVCCA), which we apply to Transformer-based language models (LMs). We compare the latent representations at different layers of a pair of such models: a model trained on multiple domains (an experimental model) and a model trained on a single domain (a control model). Through our method, we find that increasing the model capacity impacts how domain information is stored in upper and lower layers differently. In addition, we show that larger experimental models simultaneously embed domain-specific information as if they were conjoined control models. These findings are confirmed qualitatively, demonstrating the validity of our method.
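
As a rough illustration of the comparison described in the abstract (a minimal sketch, not the authors' released code), the snippet below computes an SVCCA similarity score between activation matrices taken from the same layer of an experimental model and a control model. The activation arrays, their shapes, and the 99% variance threshold are illustrative assumptions.

import numpy as np

def svcca_similarity(acts_a, acts_b, var_threshold=0.99):
    """Mean canonical correlation between two (n_examples, n_features) activation matrices."""
    def top_directions(acts):
        acts = acts - acts.mean(axis=0, keepdims=True)      # center each feature
        u, s, _ = np.linalg.svd(acts, full_matrices=False)  # SVD step of SVCCA
        explained = np.cumsum(s ** 2) / np.sum(s ** 2)
        keep = int(np.searchsorted(explained, var_threshold)) + 1
        return u[:, :keep] * s[:keep]                        # data in top singular directions

    a, b = top_directions(acts_a), top_directions(acts_b)
    qa, _ = np.linalg.qr(a)                                  # orthonormal basis of each subspace
    qb, _ = np.linalg.qr(b)
    corrs = np.linalg.svd(qa.T @ qb, compute_uv=False)       # canonical correlations (CCA step)
    return float(np.clip(corrs, 0.0, 1.0).mean())

# Hypothetical usage: hidden states collected from one Transformer layer of each
# model on the same evaluation sentences (random placeholders here).
acts_experimental = np.random.randn(500, 768)
acts_control = np.random.randn(500, 768)
print(svcca_similarity(acts_experimental, acts_control))

In such a setup, a higher score at a given layer would suggest that the experimental model's representation of those inputs is closer to that of the single-domain control model at that layer.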
Original language: English
Title of host publication: Proceedings of the Fifth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP
Place of publication: Abu Dhabi, United Arab Emirates (Hybrid)
Publisher: Association for Computational Linguistics
Pages: 192-209
Number of pages: 18
ISBN (Electronic): 9781959429050
Publication status: Published - 8 Dec 2022
Event: BlackboxNLP 2022: Analyzing and Interpreting Neural Networks for NLP - Abu Dhabi, United Arab Emirates
Duration: 8 Dec 2022 → …
Conference number: 5
https://blackboxnlp.github.io/2022/

Workshop

Workshop: BlackboxNLP 2022
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 8/12/22 → …
Internet address: https://blackboxnlp.github.io/2022/
