Weak semantic context helps phonetic learning in a model of infant language acquisition

Stella Frank, Naomi Feldman, Sharon Goldwater

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Learning phonetic categories is one of the first steps to learning a language, yet is hard to do using only distributional phonetic information. Semantics could potentially be useful, since words with different meanings have distinct phonetics, but it is unclear how many word meanings are known to infants learning phonetic categories. We show that attending to a weaker source of semantics, in the form of a distribution over topics in the current context, can lead to improvements in phonetic category learning. In our model, an extension of a previous model of joint word-form and phonetic category inference, the probability of word-forms is topic-dependent, enabling the model to find significantly better phonetic vowel categories and word-forms than a model with no semantic knowledge.
Original language: English
Title of host publication: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Publisher: Association for Computational Linguistics
Number of pages: 11
ISBN (Print): 978-1-937284-72-5
Publication status: Published - 2014


