Formal semantics is the study of meaning in natural language using the tools of mathematical logic, model theory, and lambda calculus. Originating in the work of logicians such as Frege, Tarski, and Carnap, formal semantics was brought into linguistics primarily by Richard Montague in the early 1970s. The central commitment of formal semantics is that the meaning of a sentence can be identified with its truth conditions: to know the meaning of a sentence is to know under what circumstances it would be true.
Truth-Conditional Semantics
A model M = ⟨D, I⟩ provides a domain D of individuals and an interpretation function I. Relative to a model, quantified sentences receive set-theoretic truth conditions:

⟦every P Q⟧ = 1 iff {x : P(x)} ⊆ {x : Q(x)}
⟦some P Q⟧ = 1 iff {x : P(x)} ∩ {x : Q(x)} ≠ ∅
In a truth-conditional framework, the meaning of a declarative sentence is the set of possible worlds in which it is true, or equivalently, a function from possible worlds to truth values. Compositional rules specify how the meanings of complex expressions are built from the meanings of their parts. Quantifiers like "every" and "some" receive precise set-theoretic interpretations: "every student runs" is true iff the set of students is a subset of the set of runners.
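The set-theoretic clauses above can be executed directly. The following is a minimal sketch against a toy model M = ⟨D, I⟩; the domain, predicate extensions, and names are illustrative assumptions, not part of any standard library.

```python
# Domain D and interpretation function I of a toy model M = <D, I>.
D = {"ann", "bob", "carol", "dave"}
I = {
    "student": {"ann", "bob", "carol"},
    "runs":    {"ann", "bob", "carol", "dave"},
    "smokes":  {"bob"},
}

def every(P, Q):
    """[[every P Q]] = 1 iff {x : P(x)} is a subset of {x : Q(x)}."""
    return P <= Q

def some(P, Q):
    """[[some P Q]] = 1 iff the intersection of P and Q is non-empty."""
    return len(P & Q) > 0

print(every(I["student"], I["runs"]))    # "every student runs" -> True
print(every(I["student"], I["smokes"]))  # "every student smokes" -> False
print(some(I["student"], I["smokes"]))   # "some student smokes" -> True
```

Treating determiners as relations between sets in this way is the core of generalized quantifier theory.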
Compositionality and Logical Form
A guiding principle of formal semantics is the Principle of Compositionality, attributed to Frege: the meaning of a complex expression is determined by the meanings of its parts and the way they are syntactically combined. This requires a systematic mapping from syntactic structures to semantic representations, typically mediated by logical forms expressed in a typed lambda calculus. The syntax-semantics interface is where formal semantics meets computational linguistics most directly.
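Compositionality via typed lambda terms can be sketched with ordinary closures: nouns and verbs are type ⟨e,t⟩ functions, a determiner is type ⟨⟨e,t⟩,⟨⟨e,t⟩,t⟩⟩, and sentence meaning falls out of function application. The lexicon and extensions below are illustrative assumptions.

```python
D = {"ann", "bob", "carol"}          # domain of individuals
STUDENT = {"ann", "bob"}             # extension of "student"
RUNS = {"ann", "bob", "carol"}       # extension of "runs"

# Lexical entries as lambda terms.
student = lambda x: x in STUDENT                          # type <e,t>
runs = lambda x: x in RUNS                                # type <e,t>
every = lambda P: lambda Q: all(Q(x) for x in D if P(x))  # type <<e,t>,<<e,t>,t>>

# Meaning composed in lockstep with the syntax:
# [every student] applies the determiner to the noun, and the
# resulting generalized quantifier applies to the VP meaning.
sentence = every(student)(runs)
print(sentence)  # "every student runs" -> True
```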
In computational linguistics, formal semantics provides the theoretical backbone for semantic parsing systems that map natural language to executable logical forms. Systems like those built on Combinatory Categorial Grammar (CCG) use lambda calculus to compose meanings in lockstep with syntactic derivation. More recently, neural semantic parsers learn to map sentences to formal meaning representations, combining the precision of formal semantics with the robustness of statistical methods.
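A toy sketch of how a lexicalized grammar can pair each word with a syntactic category and a lambda term, composing both in a single derivation step. This is in the spirit of CCG-based semantic parsing, but the lexicon, category notation, and `forward_apply` helper are illustrative assumptions, not a real CCG engine.

```python
DOMAIN = {"ann", "bob"}
STUDENT = {"ann", "bob"}
RUNS = {"ann"}

# Lexicon: word -> (syntactic category, semantic lambda term).
LEXICON = {
    "some":    ("NP/N",  lambda P: lambda Q: any(Q(x) for x in DOMAIN if P(x))),
    "every":   ("NP/N",  lambda P: lambda Q: all(Q(x) for x in DOMAIN if P(x))),
    "student": ("N",     lambda x: x in STUDENT),
    "runs":    ("S\\NP", lambda x: x in RUNS),
}

def forward_apply(fn_entry, arg_entry):
    """Forward application: X/Y applied to Y yields X; the semantic
    side is plain function application on the lambda terms."""
    (fcat, fsem), (acat, asem) = fn_entry, arg_entry
    assert fcat.endswith("/" + acat), "categories do not combine"
    return (fcat[: -len("/" + acat)], fsem(asem))

# Derivation for "some student runs":
np_cat, np_sem = forward_apply(LEXICON["some"], LEXICON["student"])  # NP
# Semantically, the quantified subject takes the VP meaning as its
# argument (in CCG this corresponds to type-raising the subject).
vp_cat, vp_sem = LEXICON["runs"]
print(np_cat, np_sem(vp_sem))  # NP True
```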
Extensions and Phenomena
Formal semantics has been extended to handle a wide range of linguistic phenomena beyond simple declaratives. Intensional semantics, building on Montague's work, models propositional attitudes (belief, knowledge) and modality (possibility, necessity) using possible worlds. Dynamic semantics, developed by Kamp, by Heim, and by Groenendijk and Stokhof, treats meaning as context change potential rather than static truth conditions, handling anaphora and presupposition. Event semantics, following Davidson, introduces events as first-class objects to handle adverbial modification and thematic roles.
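The Davidsonian move can be made concrete: a sentence like "Jones buttered the toast slowly" is analyzed as ∃e. butter(e) ∧ agent(e, jones) ∧ theme(e, toast) ∧ slow(e), so adverbial modification is just another conjunct over the event variable. The event records and predicates below are an illustrative toy representation.

```python
# Toy event database standing in for the model's domain of events.
EVENTS = [
    {"pred": "butter", "agent": "jones", "theme": "toast", "manner": "slowly"},
    {"pred": "run", "agent": "ann"},
]

def holds(*conditions):
    """Existential closure: True iff some event satisfies every condition."""
    return any(all(c(e) for c in conditions) for e in EVENTS)

butter = lambda e: e["pred"] == "butter"
agent_jones = lambda e: e.get("agent") == "jones"
slowly = lambda e: e.get("manner") == "slowly"

# Adverbial modification is conjunction of event predicates, so the
# modified sentence entails the unmodified one (drop a conjunct):
print(holds(butter, agent_jones, slowly))  # "...buttered...slowly" -> True
print(holds(butter, agent_jones))          # "...buttered the toast" -> True
```

Because dropping a conjunct can only weaken the claim, the entailment from "buttered the toast slowly" to "buttered the toast" comes for free, which was Davidson's original motivation.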
The formal semantic tradition continues to be influential in computational approaches to meaning. Discourse Representation Theory (DRT) and its extensions provide compositional treatments of cross-sentential anaphora and temporal reasoning. Type-logical grammars unify syntax and semantics in a single calculus. These frameworks inform the design of meaning representations used in semantic parsing, question answering, and natural language inference systems.
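The DRT treatment of cross-sentential anaphora can be sketched as incremental context update: an indefinite introduces a discourse referent into the representation, and a later pronoun resolves to an accessible referent. The dictionary-based DRS and the most-recent-referent resolution strategy below are illustrative assumptions, not full DRT.

```python
# A discourse representation structure (DRS) as a running context.
drs = {"referents": [], "conditions": []}

def indefinite(noun, verb, drs):
    """An indefinite NP introduces a fresh discourse referent."""
    x = f"x{len(drs['referents'])}"
    drs["referents"].append((x, noun))
    drs["conditions"].append((verb, x))
    return x

def pronoun(verb, drs):
    """A pronoun resolves to the most recent accessible referent."""
    x, _ = drs["referents"][-1]
    drs["conditions"].append((verb, x))
    return x

indefinite("man", "walks", drs)   # "A man walks."
pronoun("whistles", drs)          # "He whistles."
print(drs["conditions"])  # [('walks', 'x0'), ('whistles', 'x0')]
```

The pronoun ends up predicated of the same referent the indefinite introduced, which is exactly the binding that a static, sentence-by-sentence truth-conditional analysis cannot capture.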