
Emily Bender

Emily Bender (b. 1973) is a computational linguist at the University of Washington who has made foundational contributions to multilingual grammar engineering, linguistic annotation, and the critical study of ethical and methodological issues in NLP.

Bender Rule: Always name the language(s) studied in NLP papers

Emily M. Bender is a professor of linguistics at the University of Washington and director of its Computational Linguistics Laboratory. Her work spans grammar engineering for diverse languages, the development of linguistic annotation standards, and, increasingly, rigorous critical analysis of the claims, methods, and societal impacts of large language models and NLP systems.

Early Life and Education

Born in 1973, Bender studied linguistics at the University of California, Berkeley, and Stanford University, earning her PhD from Stanford in 2000 with a dissertation on syntactic variation. Her subsequent work in grammar engineering, including a broad-coverage HPSG grammar of Japanese, fed into the LinGO Grammar Matrix, a framework for building precision grammars in the HPSG formalism for typologically diverse languages.

1973: Born in the United States
2000: Completed PhD at Stanford University
2002: Co-developed the LinGO Grammar Matrix for multilingual grammar engineering
2013: Published Linguistic Fundamentals for Natural Language Processing
2020: Co-authored "Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data"
2021: Co-authored "On the Dangers of Stochastic Parrots"

Key Contributions

Bender's Grammar Matrix is a framework for rapid development of precision HPSG grammars for typologically diverse languages. By providing a shared core of linguistic analyses and a customisation system, it enables linguists to build computationally implementable grammars for under-resourced languages more efficiently, supporting the goal of truly multilingual NLP.

Her influential paper "Climbing towards NLU" (2020, with Alexander Koller) argued that language models trained solely on linguistic form (text) cannot learn meaning in the way humans do, because meaning is grounded in communicative intent and real-world reference. Her 2021 paper "On the Dangers of Stochastic Parrots" (with Timnit Gebru, Angelina McMillan-Major, and Margaret Mitchell) examined the environmental, social, and ethical risks of ever-larger language models. She has also championed the Bender Rule — the principle that NLP papers should always name the specific language(s) they study, rather than implicitly treating English as the default.

"Text generated by a language model is not grounded in communicative intent, and we should not mistake fluency for understanding." — Emily Bender

Legacy

Bender's work has been instrumental in bringing linguistic rigour and ethical reflection to computational linguistics. Her critique of the conflation of language model fluency with language understanding has shaped how the field thinks about the capabilities and limitations of large language models. Her advocacy for multilingual NLP, the Bender Rule, and responsible AI development continues to influence both research practice and policy discussions.

Interactive Calculator

Enter a CSV of publications, one record per line in the form year,title,citations_count. The calculator computes total citations, the h-index (the largest h such that h publications each have at least h citations), the peak citation year, and a per-decade breakdown of scholarly output.

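The statistics above are straightforward to compute directly. Below is a minimal Python sketch of the same calculation; the function name publication_stats is illustrative, and it assumes titles contain no commas:

```python
from collections import Counter

def publication_stats(csv_text):
    """Compute citation statistics from 'year,title,citations_count' rows."""
    rows = []
    for line in csv_text.strip().splitlines():
        year_s, title, cites_s = line.split(",")  # assumes no commas in titles
        rows.append((int(year_s), title, int(cites_s)))

    total = sum(c for _, _, c in rows)

    # h-index: largest h such that h papers each have at least h citations.
    # With counts sorted descending, h is the last rank i where count >= i.
    counts = sorted((c for _, _, c in rows), reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break

    # Peak year: the year whose publications accumulate the most citations.
    by_year = Counter()
    for year, _, c in rows:
        by_year[year] += c
    peak_year = max(by_year, key=by_year.get) if by_year else None

    # Per-decade breakdown: number of publications per decade.
    by_decade = Counter(10 * (year // 10) for year, _, _ in rows)

    return {"total_citations": total, "h_index": h,
            "peak_year": peak_year, "by_decade": dict(by_decade)}
```

The h-index loop depends on the descending sort: the first rank where the citation count falls below the rank bounds h from above, so the loop can stop there.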


References

  1. Bender, E. M., & Koller, A. (2020). Climbing towards NLU: On meaning, form, and understanding in the age of data. Proceedings of the 58th Annual Meeting of the ACL, 5185–5198. doi:10.18653/v1/2020.acl-main.463
  2. Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? Proceedings of FAccT, 610–623. doi:10.1145/3442188.3445922
  3. Bender, E. M., Flickinger, D., & Oepen, S. (2002). The grammar matrix: An open-source starter-kit for the rapid development of cross-linguistically consistent broad-coverage precision grammars. Proceedings of the COLING Workshop on Grammar Engineering, 8–14.
  4. Bender, E. M. (2013). Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax. Morgan & Claypool.
