Emily M. Bender is a professor of linguistics at the University of Washington and director of its Computational Linguistics Laboratory. Her work spans grammar engineering for diverse languages, the development of linguistic annotation standards, and, increasingly, rigorous critical analysis of the claims, methods, and societal impacts of large language models and NLP systems.
Early Life and Education
Born in 1973, Bender studied linguistics at the University of California, Berkeley, and Stanford University, earning her PhD from Stanford in 2000 with a dissertation on syntactic variation, focusing on copula absence in African American Vernacular English. She went on to co-develop the LinGO Grammar Matrix, a framework for building precision grammars in the HPSG formalism for typologically diverse languages.
Born in the United States
Completed PhD at Stanford University
Co-developed the LinGO Grammar Matrix for multilingual grammar engineering
Published Linguistic Fundamentals for Natural Language Processing
Co-authored "Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data"
Co-authored "On the Dangers of Stochastic Parrots"
Key Contributions
The LinGO Grammar Matrix, which Bender co-developed, supports rapid development of precision HPSG grammars for typologically diverse languages. It pairs a shared core of cross-linguistic analyses with a customisation system: linguists specify a language's typological properties, such as basic word order and case marking, and the system generates a starter grammar incorporating the corresponding analyses. This lets researchers build computationally implementable grammars for under-resourced languages far more efficiently, supporting the goal of truly multilingual NLP.
Her influential paper "Climbing towards NLU" (2020, with Alexander Koller) argued that language models trained on linguistic form alone cannot learn meaning in the way humans do, because meaning connects form to communicative intent and real-world reference. Her 2021 paper "On the Dangers of Stochastic Parrots" (with Timnit Gebru, Angelina McMillan-Major, and Margaret Mitchell) examined the environmental, social, and ethical risks of ever-larger language models. She has also championed the Bender Rule, the principle that NLP papers should always name the specific language(s) they study rather than implicitly treating English as the default.
"Text generated by a language model is not grounded in communicative intent, and we should not mistake fluency for understanding." — Emily Bender
Legacy
Bender's work has been instrumental in bringing linguistic rigour and ethical reflection to computational linguistics. Her critique of the conflation of language model fluency with language understanding has shaped how the field thinks about the capabilities and limitations of large language models. Her advocacy for multilingual NLP, the Bender Rule, and responsible AI development continues to influence both research practice and policy discussions.