Linglib.Phenomena.Morphology.Studies.HalleMarantz1993

@cite{halle-marantz-1993}: Distributed Morphology and the Pieces of Inflection #

This study file formalizes the core architecture and predictions of Distributed Morphology as presented in @cite{halle-marantz-1993}, Chapter 3 of The View from Building 20 (Hale & Keyser, eds.).

Architecture #

@cite{halle-marantz-1993} adopt a Y-model (DS → SS → {LF, MS → PF}). Terminal nodes bear morphosyntactic features but no phonological content until Vocabulary Insertion at MS — this is Late Insertion. The VI mechanism (subsetPrinciple) IS Late Insertion in action: it maps feature bundles to exponents, making DM realizational by construction.

Impoverishment and Fission — also introduced in this paper — are formalized as general mechanisms in Theories/Morphology/DM/Impoverishment.lean and Theories/Morphology/DM/Fission.lean. This study file instantiates impoverishment on the English paradigm to derive syncretism (§4).

The Subset Principle and English Tns/Agr #

@cite{halle-marantz-1993} give the Vocabulary Items for English verbal inflection. The terminal node for Tns+Agr (after fusion — see §3) bears features drawn from {[+past], [+participle], [3sg]}. The Subset Principle / Elsewhere Condition selects the most specific matching VI entry: the entry whose feature specification is the largest subset of the terminal's features.

Context-free VI entries:

| Features             | Exponent | Example           |
|----------------------|----------|-------------------|
| [+past, +participle] | -n       | taken, eaten      |
| [+past]              | -d       | walked, played    |
| [+participle]        | -ing     | walking, playing  |
| [3sg]                | -z       | walks, plays      |
| ∅ (elsewhere)        | ∅        | walk, play        |

Features on the English Tns+Agr terminal after fusion. @cite{halle-marantz-1993}.
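The competition just described can be sketched in a few lines of self-contained Lean. This is an illustrative stand-in under simplified assumptions, not the study file's actual `FeatureVI`/`subsetPrinciple` definitions; the names `Feat`, `VI`, `viMatches`, and `insertVI` below are hypothetical.

```lean
-- Illustrative sketch only: a self-contained stand-in for the study's
-- Subset Principle machinery (all names hypothetical, not the Linglib API).
inductive Feat
  | past
  | part
  | sg3
  deriving DecidableEq

structure VI where
  feats    : List Feat
  exponent : String

def entries : List VI :=
  [⟨[.past, .part], "-n"⟩, ⟨[.past], "-d"⟩,
   ⟨[.part], "-ing"⟩, ⟨[.sg3], "-z"⟩, ⟨[], ""⟩]

/-- An entry matches iff its feature set is a subset of the terminal's. -/
def viMatches (vi : VI) (terminal : List Feat) : Bool :=
  vi.feats.all terminal.contains

/-- Subset Principle: among matching entries, pick the most specific one. -/
def insertVI (terminal : List Feat) : Option VI :=
  (entries.filter (viMatches · terminal)).foldl
    (fun best vi =>
      match best with
      | none   => some vi
      | some b => if vi.feats.length > b.feats.length then some vi else best)
    none

#eval (insertVI [.past, .part]).map VI.exponent  -- most specific entry wins: -n
```

For a [+past, +participle] terminal, the entries for -n, -d, -ing, and ∅ all match, and the fold keeps -n because its two features form the largest subset of the target.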


Context-free VI entries for English verbal inflection. @cite{halle-marantz-1993}.


Past participle: [+past, +participle] → -n. -n beats -d and -ing because its feature set {past, participle} is the largest subset of the target. @cite{halle-marantz-1993}.

Past finite: [+past] → -d. -n does not match because [+participle] ⊄ {past}. @cite{halle-marantz-1993}.

Non-finite participle: [+participle] → -ing. @cite{halle-marantz-1993}.

Third singular present: [3sg] → -z. @cite{halle-marantz-1993}.

Elsewhere (bare stem): [] → ∅. When no features are present, the elsewhere entry wins. @cite{halle-marantz-1993}.

The Subset Principle resolves the -n vs. -d competition: for a [+past, +participle] target, -n (2 features) beats -d (1 feature).

The paradigm is total: every possible feature combination receives an exponent (thanks to the elsewhere entry).

Root-Specific Past Tense Entries #

The default past tense entry -d coexists with root-conditioned variants (@cite{halle-marantz-1993}).

These variants share the same morphosyntactic context ([+past]) but differ in their root restrictions. The Paninian principle (the Elsewhere Condition applied to root-conditioned entries) selects the most specific matching entry: a root-restricted rule overrides the unrestricted default when the root matches.

This section uses VocabItem (which supports root restrictions via rootMatch) rather than FeatureVI (which is context-free).
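The override logic can be sketched as follows. This is a hypothetical simplification, not the actual `VocabItem`/`rootMatch` API, and the verb lists are merely illustrative English examples (bend→bent, leave→left; hit/put take zero past marking).

```lean
-- Hypothetical sketch of root-conditioned competition (simplified; not the
-- actual `VocabItem`/`rootMatch` API). A root-restricted [+past] entry
-- overrides the unrestricted default when its restriction matches.
structure PastVI where
  restrictedTo : Option (List String)  -- `none` = unrestricted default
  exponent     : String

/-- Most specific entries first; the elsewhere entry is listed last. -/
def pastEntries : List PastVI :=
  [⟨some ["bend", "leave"], "-t"⟩,  -- bent, left
   ⟨some ["hit", "put"],    ""⟩,    -- zero past marking
   ⟨none,                   "-d"⟩]  -- elsewhere

/-- Paninian resolution: first (most specific) entry whose restriction holds. -/
def spellPast (root : String) : Option String :=
  (pastEntries.find? fun vi => vi.restrictedTo.all (·.contains root)).map
    PastVI.exponent

#eval spellPast "bend"  -- the restricted -t entry wins
#eval spellPast "walk"  -- falls through to the default -d
```

Ordering the list from most to least specific lets a plain `find?` implement the Elsewhere Condition: the default only fires when no restricted entry matches.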

Sample verb roots for demonstrating conditioned allomorphy.


Root-conditioned past tense VI rules.

All three entries share the [+past] context (modeled as Bool); they differ in root restriction and specificity. @cite{halle-marantz-1993}.


Regular verbs get -d: no root-specific entry matches.

Verbs in the -t class: the root restriction overrides the default.

Verbs in the ∅ class: no overt past tense marking.

Non-past context: no entry matches (all require [+past]).

Fusion of Tns and Agr at MS #

@cite{halle-marantz-1993} argue that English Tns and Agr are separate syntactic heads. At MS, they undergo Fusion: the two adjacent terminals merge into a single terminal bearing the union of both feature bundles. A single VI entry then spells out the fused node.

This explains why English has a single affix (not two stacked affixes) for Tns+Agr: walk-s realizes both non-past Tns and 3sg Agr in one exponent, because Fusion merges the two terminals before VI applies.

We model fusion using FusionRule from the DM theory layer and show that the fused features feed directly into the VI paradigm from §1.
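A minimal sketch of the fusion step, with features modeled as strings; `fuse` here is an assumed stand-in for `FusionRule.apply` with list concatenation as the union operation, not the actual Linglib definition.

```lean
-- A minimal sketch of Fusion (features as strings; `fuse` is a stand-in for
-- `FusionRule.apply` with list concatenation as the union operation).
def tnsFeats (past part : Bool) : List String :=
  (if past then ["past"] else []) ++ (if part then ["participle"] else [])

def agrFeats (sg3 : Bool) : List String :=
  if sg3 then ["sg3"] else []

/-- Two adjacent terminals merge into one bearing the union of their features. -/
def fuse (past part sg3 : Bool) : List String :=
  tnsFeats past part ++ agrFeats sg3

#eval fuse false false true  -- ["sg3"]: one fused node, spelled out as -z
#eval fuse true true false   -- ["past", "participle"]: spelled out as -n
```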

The two inflectional heads that fuse in English.

• tns (past participle : Bool) : InflHead

  Tns head: bears tense/aspect features.

• agr (sg3 : Bool) : InflHead

  Agr head: bears agreement features.

def HalleMarantz1993.fusedFeatures (tPast tPart aSg3 : Bool) :

Compute the fused feature bundle for a Tns+Agr combination. Uses FusionRule.apply with list concatenation as the union operation.


3sg present: Tns[−past,−part] fused with Agr[3sg] → [sg3] → -z. One exponent realizes both heads.

Past finite: Tns[+past] fused with Agr[−3sg] → [past] → -d.

Past participle: Tns[+past,+part] fused with Agr[−3sg] → [past, participle] → -n.

Present participle: Tns[−past,+part] fused with Agr[−3sg] → [participle] → -ing.

Elsewhere: Tns[−past,−part] fused with Agr[−3sg] → [] → ∅.

theorem HalleMarantz1993.fusion_always_spellable (tPast tPart aSg3 : Bool) :

Every Tns+Agr fusion produces a feature bundle that the VI paradigm can spell out — the paradigm is complete.

Impoverishment Derives Syncretism #

@cite{halle-marantz-1993} introduce Impoverishment: deletion of features from a terminal node before Vocabulary Insertion. When a distinguishing feature is deleted, two formerly distinct contexts fall together at VI, producing syncretism — distinct morphosyntactic specifications receiving the same exponent.

The general Impoverishment mechanism is formalized in Theories/Morphology/DM/Impoverishment.lean. Here we instantiate it on the English Tns/Agr paradigm to demonstrate the derivation of syncretism: deleting [+participle] from [+past, +participle] causes the past participle to receive the same exponent as simple past.

This models the fact that regular English verbs have identical simple past and past participle forms (walked does both): the [+participle] feature is not visible at VI, leaving only [+past] to trigger -d.

Delete occurrences of a feature from a bundle.

This mirrors deleteFeature in Theories/Morphology/DM/Impoverishment.lean, instantiated for EngInflFeature. The structural parallel is exact: both filter a list, removing elements that match the target.
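The filter can be written out directly. This is an illustrative version over strings, not the study's `EngInflFeature`-typed definition.

```lean
-- Illustrative `deleteFeature` as a plain list filter (features as strings;
-- the study's version is over `EngInflFeature`).
def deleteFeature (bundle : List String) (target : String) : List String :=
  bundle.filter (· ≠ target)

#eval deleteFeature ["past", "participle"] "participle"
-- ["past"]: the impoverished bundle now matches the -d entry, not -n

#eval deleteFeature (deleteFeature ["past", "participle"] "participle") "participle"
-- ["past"] again: deleting twice equals deleting once (idempotence)
```

Idempotence is immediate because filtering with a fixed predicate twice removes nothing new on the second pass.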

theorem HalleMarantz1993.deleteFeature_idempotent (bundle : List EngInflFeature) (target : EngInflFeature) :
deleteFeature (deleteFeature bundle target) target = deleteFeature bundle target

Impoverishment is idempotent: deleting a feature twice is the same as deleting it once.

This mirrors deleteFeature_idempotent in Impoverishment.lean — filter is idempotent when the predicate is stable.

Impoverishing [+participle] from [+past, +participle] produces syncretism: the impoverished bundle receives the same exponent as simple [+past].

Without impoverishment, [+past, +participle] gets -n (taken, eaten).

With impoverishment of [+participle], the same context gets -d (walked as both simple past and past participle).

Full DM pipeline: Fusion → Impoverishment → VI.

Past participle with impoverishment of [+participle]:

1. Fusion: Tns[+past,+part] + Agr[−3sg] → [past, participle]
2. Impoverishment: delete [+participle] → [past]
3. VI: [past] → -d

Without impoverishment, step 3 would give -n (via fusion_past_participle). This full pipeline connects §1 (VI), §3 (fusion), and §4 (impoverishment).
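The three steps above can be composed in a self-contained sketch. All names are hypothetical simplifications (features as strings), and the hand-ordered `if` chain in `vi` stands in for the Subset Principle's specificity ordering.

```lean
-- End-to-end sketch of Fusion → Impoverishment → VI for a regular verb
-- (all names hypothetical; the `if` chain encodes specificity by hand).
def fuse (tns agr : List String) : List String := tns ++ agr

def impoverish (bundle : List String) : List String :=
  bundle.filter (· ≠ "participle")

def vi (bundle : List String) : String :=
  if bundle.contains "past" && bundle.contains "participle" then "-n"
  else if bundle.contains "past" then "-d"
  else if bundle.contains "participle" then "-ing"
  else if bundle.contains "sg3" then "-z"
  else ""

#eval vi (impoverish (fuse ["past", "participle"] []))  -- "-d": walked
#eval vi (fuse ["past", "participle"] [])               -- "-n": taken
```

The contrast between the two evaluations is exactly the syncretism claim: with impoverishment the participle context surfaces as -d, without it as -n.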

Connecting to the Mirror Principle and Bybee's Hierarchy #

@cite{halle-marantz-1993} discuss how DM's post-syntactic architecture derives @cite{baker-1985}'s Mirror Principle: GF-rules (passive, causative, applicative, reflexive/reciprocal) are syntactic head movements, and Morphological Structure preserves the derivation order. Affix layering necessarily mirrors syntactic structure because MS is derived from syntax.

We formalize two connections:

1. English verb inflection is concatenative, placing it within @cite{baker-1985}'s scope.

2. All English Tns/Agr features map to @cite{bybee-1985} categories that are OUTSIDE GF-rule categories in the relevance hierarchy. This is consistent with DM's clause structure: Tns and Agr sit structurally above GF-rule projections, so their exponents are outermost after head movement.

All English Tns/Agr features map to categories that are OUTSIDE GF-rule categories in @cite{bybee-1985}'s relevance hierarchy.

GF-rules (passive → voice rank 3, causative/applicative/reciprocal → valence rank 2) are always closer to the stem than English Tns/Agr features (aspect rank 4, tense rank 5, agreement rank 8).

This is consistent with DM's clause structure: Tns and Agr are structurally above GF-rule projections (PassP, CausP, ApplP), so after head movement and fusion, the Tns+Agr exponent sits outermost. The relevance hierarchy and the syntactic hierarchy converge — connecting @cite{baker-1985}, @cite{bybee-1985}, and @cite{halle-marantz-1993}.
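The rank comparison just stated is small enough to check exhaustively; the ranks below are taken from the text (valence 2, voice 3, aspect 4, tense 5, agreement 8), and the names are illustrative.

```lean
-- Rank check for the claim above (ranks as given in the text).
def gfRanks     : List Nat := [2, 3]     -- valence, voice
def tnsAgrRanks : List Nat := [4, 5, 8]  -- aspect, tense, agreement

-- Every Tns/Agr category outranks every GF-rule category:
#eval tnsAgrRanks.all fun t => gfRanks.all fun g => g < t  -- true
```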

English verb inflection is concatenative: affixes are linearly concatenated to the stem. This places it within the scope of @cite{baker-1985}'s Mirror Principle (@cite{baker-1985} restricts the principle to concatenative morphology, excluding clitics and nonconcatenative processes).