
Linglib.Theories.Syntax.Mueller2013

@cite{mueller-2013}: Unifying Everything #

Cross-theory comparison formalizing Müller's central thesis: Minimalism, HPSG, CCG, Construction Grammar, and Dependency Grammar converge on three universal combination schemata (Head-Complement, Head-Specifier, Head-Filler).

Structure #

§1. Classification Functions #

Lightweight mappers from each theory's combination operations to the theory-neutral CombinationKind.

CCG classification #

Classify a CCG derivation step as one of the three schemata.

  • Forward/backward application → Head-Complement (functor selects argument)
  • Forward/backward composition → Head-Filler (enables extraction; this is an approximation — composition also serves non-extraction functions like heavy NP shift and right-node raising)
  • Type-raising → none (unary operation, not a binary combination)
  • Coordination → none (symmetric, not one of the three headed schemata)
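The mapping above can be sketched as a small total function. This is an illustrative stand-in: the type and constructor names (`CombinationKind`, `CCGRule`, `classifyCCG`) approximate but need not match the library's actual identifiers.

```lean
-- Self-contained sketch; names are illustrative, not the library's.
inductive CombinationKind
  | headComplement | headSpecifier | headFiller

inductive CCGRule
  | forwardApp | backwardApp
  | forwardComp | backwardComp
  | typeRaise | coordination

/-- Map a CCG rule to a universal schema, if one applies. -/
def classifyCCG : CCGRule → Option CombinationKind
  | .forwardApp  | .backwardApp  => some .headComplement
  | .forwardComp | .backwardComp => some .headFiller  -- approximation
  | .typeRaise                   => none              -- unary
  | .coordination                => none              -- symmetric
```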
Equations
Instances For

    HPSG classification #

    Classify an HPSG schema application as one of the three schemata.

    @cite{mueller-2013}'s three universal schemata are Head-Complement, Head-Subject, and Head-Filler. HPSG's fourth schema, Head-Modifier (adjunction), falls outside this classification — Müller does not include adjunction in the convergence claim.
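A minimal sketch of this classifier, again with hypothetical names (`HPSGSchema`, `classifyHPSG`) standing in for the library's definitions:

```lean
-- Self-contained sketch; names are illustrative.
inductive CombinationKind
  | headComplement | headSpecifier | headFiller

inductive HPSGSchema
  | headComp | headSubj | headFiller | headMod

/-- Head-Modifier (adjunction) falls outside the convergence claim. -/
def classifyHPSG : HPSGSchema → Option CombinationKind
  | .headComp   => some .headComplement
  | .headSubj   => some .headSpecifier
  | .headFiller => some .headFiller
  | .headMod    => none
```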

    Equations
    Instances For

      Dependency Grammar classification #

      Classify a UD dependency relation as one of the three schemata.

      Subject dependencies are Head-Specifier; all other core dependencies are Head-Complement. Non-projective dependencies (handled separately) correspond to Head-Filler.
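A sketch of the classifier, folding the separately handled non-projective case into a Boolean parameter for illustration; the `UDRel` fragment and function name are hypothetical:

```lean
-- Self-contained sketch; a tiny fragment of UD relations.
inductive CombinationKind
  | headComplement | headSpecifier | headFiller

inductive UDRel
  | nsubj | obj | iobj | det | obl

def classifyDep (nonProjective : Bool) (r : UDRel) : CombinationKind :=
  if nonProjective then
    .headFiller                  -- crossing dependency ≈ displacement
  else
    match r with
    | .nsubj => .headSpecifier   -- subjects are specifiers
    | _      => .headComplement  -- other core dependencies
```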

      Equations
      Instances For

        CxG classification #

        Classify whether a CxG construction is fully compositional.

        Fully abstract constructions without pragmatic function decompose into sequences of Head-Complement and Head-Specifier steps. Other constructions are irreducible phrasal patterns.
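The criterion can be sketched as a predicate over a hypothetical `Construction` record with two Boolean fields (the real representation in the library is richer):

```lean
-- Self-contained sketch; fields are illustrative.
structure Construction where
  fullyAbstract     : Bool  -- no fixed lexical material
  pragmaticFunction : Bool  -- carries a special pragmatic function

/-- A construction decomposes into Head-Complement/Head-Specifier
steps iff it is fully abstract and pragmatically neutral. -/
def fullyCompositional (c : Construction) : Bool :=
  c.fullyAbstract && !c.pragmaticFunction
```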

        Equations
        Instances For

          §2. Labeling Convergence (Müller §2.1) #

          The head determines the category of the result. Each theory states this principle in its own terms:

          CCG forward application preserves the functor's result category.

          When X/Y combines with Y via forward application, the result is X — the left part of the functor's slash category.

          CCG backward application preserves the functor's result category.
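A simplified stand-in for `CCG.Cat` and `CCG.forwardApp` (the library's definitions may differ) makes the point concrete: the result of application is exactly the functor's result category.

```lean
-- Self-contained sketch of slash categories and forward application.
inductive Cat
  | NP | S
  | rslash (res arg : Cat)  -- X/Y : result X, looks right for Y
  | lslash (res arg : Cat)  -- X\Y : result X, looks left for Y
  deriving DecidableEq

/-- Forward application: X/Y combined with Y yields X. -/
def forwardApp : Cat → Cat → Option Cat
  | .rslash x y, z => if y = z then some x else none
  | _,           _ => none

-- The result is the functor's result category:
example : forwardApp (.rslash .S .NP) .NP = some .S := rfl
```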

          A head function is selection-respecting at (a, b) iff when a selects b, the head function picks a's head leaf as the head of .node a b (i.e., a projects). This is the explicit MCB-aligned encoding of "selector projects": the legacy assumption is no longer baked into HeadFunction itself, but reappears as a property a particular h may or may not satisfy.

          Restated against the Phase 2 externalize encoding (MCB §1.12.1): "marking places the projecting head on the left daughter" becomes "the head of the merged node equals the head of a".

          Equations
          Instances For

            Minimalist labeling (MCB §1.13.6 / §1.15): under a selection-respecting head function h, when α selects β, the head of {α, β} agrees with the head of α. The legacy "selector projects" claim, restated MCB-natively: it is no longer a property of labeling but a property of a chosen head function, namely those satisfying IsSelectionRespectingAt.

            The substantive refutation theorem that MCB's framework now makes statable: selection and projection are independent properties. There exist a HeadFunction and a configuration where one side selects but the marking puts the head on the other side, formalising MCB's "selection ≠ projection" position (the selector-projects assumption is a property of some head functions, not a structural fact about Merge).

            Witness construction (Phase 2-aware): with the externalize-based HeadFunction, witnesses require a custom h.section_.σ that places b's head as the leftmost-leaf of .node a b's planar representative (so headAt h (.node a b) = headAt h b ≠ headAt h a), violating IsSelectionRespectingAt. The leftSpine/rightSpine defaults both use Quot.out and so cannot serve as distinct witnesses post-Phase-2; a constructive witness needs a per-test externalize choice. The statement remains TODO until Tier B+ provides parameterized selection apparatus (checkedSelWith?).

            Why this matters cross-framework: HPSG's Head Feature Principle and CCG's slash-functor invariance both bake "the selector projects" into the type of their structures (HeadCompRule's designated head field; CCG's slash directions). MCB's nonplanar Merge does NOT bake this in — it's a downstream choice of head function. This theorem witnesses the separation in Lean.

            theorem Comparisons.Mueller2013.labeling_convergence :
            (∀ (h : Minimalist.HeadFunction) (a b : Minimalist.SyntacticObject), h.Dom (a.node b) → Minimalist.Labeling.labelRoot h (a.node b) = some (h.head (a.node b)).item.outerCat) ∧ (∀ (x y : CCG.Cat), CCG.forwardApp (x.rslash y) y = some x) ∧ ∀ (x y : CCG.Cat), CCG.backwardApp y (x.lslash y) = some x

            Labeling convergence across theories (MCB-aligned formulation).

            Four independent formulations of "the head determines the result's category" all hold simultaneously. The Minimalist clause is parametric over a head function h: whichever head function the analyst chooses, the head's category projects (this is the definition of MCBLabeling.labelRoot).

            §3. External Merge ↔ Head-Complement ↔ Application (§2.1–2.2) #

            All theories implement the head-complement combination:

            §4. Internal Merge ↔ Head-Filler ↔ Composition (§2.3) #

            All theories handle long-distance dependencies via the third schema:

            Non-projective dependencies in DG correspond to Head-Filler.

            A non-projective (crossing) dependency represents displacement — the DG analogue of Internal Merge and the Head-Filler Schema.
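One standard way to detect non-projectivity is to test whether two arcs cross. A self-contained sketch, with hypothetical names (`Dep`, `between`, `crosses`):

```lean
/-- A dependency arc between word positions (sketch). -/
structure Dep where
  head : Nat
  dep  : Nat

def between (a b x : Nat) : Bool :=
  min a b < x && x < max a b

/-- Two arcs cross iff exactly one endpoint of `e` lies strictly
inside the span of `d`. A crossing arc is non-projective. -/
def crosses (d e : Dep) : Bool :=
  between d.head d.dep e.head != between d.head d.dep e.dep

-- Arcs 1–3 and 2–4 cross: 2 is inside the span 1–3, 4 is not.
example : crosses ⟨1, 3⟩ ⟨2, 4⟩ = true := rfl
```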

            §5. Coordination Diagnostic (§2.2) #

            Coordination requires matching categories across all theories. This is a diagnostic for whether two expressions have the same category.

            theorem Comparisons.Mueller2013.ccg_coordination_same_cat (c1 c2 : CCG.Cat) :
            CCG.coordinate c1 c2 ≠ none → c1 = c2

            CCG coordination requires matching categories.

            theorem Comparisons.Mueller2013.ccg_coordination_mismatch (c1 c2 : CCG.Cat) (h : c1 ≠ c2) :
            CCG.coordinate c1 c2 = none

            CCG coordination of mismatched categories fails.

            HPSG lexical rules preserve head features, enabling coordination.

            When two signs undergo the same lexical rule, they retain the same category — which is why passivized verbs can coordinate with each other.

            §6. Directional MG ≈ CCG (§2.3) #

            Stabler's directional Minimalist Grammar uses features =x (looking right) and x= (looking left), which correspond directly to CCG's X/Y and X\Y.
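Though not yet in the codebase, the intended correspondence could be sketched as a translation from directional selector features to slash categories; all names here (`Cat`, `DirFeature`, `toCCG`) are hypothetical:

```lean
-- Self-contained sketch of the directional MG ≈ CCG correspondence.
inductive Cat
  | NP | S
  | rslash (res arg : Cat)  -- X/Y
  | lslash (res arg : Cat)  -- X\Y

/-- Stabler-style directional selector features:
`=x` looks right, `x=` looks left. -/
inductive DirFeature
  | selRight (arg : Cat)  -- =x
  | selLeft  (arg : Cat)  -- x=

/-- An =y feature on a head projecting x behaves like x/y;
a y= feature behaves like x\y. -/
def toCCG (result : Cat) : DirFeature → Cat
  | .selRight y => .rslash result y
  | .selLeft  y => .lslash result y
```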

            This correspondence is not yet formalized, because directional MG is not in the codebase.

            §7. Both Directions Right (§3) #

            Müller's conclusion: the three universal schemata handle fully abstract constructions, but Construction Grammar's phrasal constructions are irreducible — they cannot be decomposed into the three schemata.

            "Both directions right": we need BOTH Merge/schemata AND constructions.

            Both directions right: the three schemata AND phrasal constructions are needed.

            1. Fully abstract constructions without pragmatic functions are fully compositional — decomposable into Head-Complement and Head-Specifier steps.
            2. There exist constructions that are not fully compositional — they cannot be captured by the three schemata alone, requiring CxG's phrasal patterns.
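The two clauses can be given a combined statement shape along the following lines; the `Construction` record and `BothDirectionsRight` name are illustrative, not the library's:

```lean
-- Self-contained sketch of the two-clause statement.
structure Construction where
  fullyAbstract     : Bool
  pragmaticFunction : Bool
  compositional     : Bool  -- decomposes into the three schemata

/-- Clause 1: abstract, pragmatics-free constructions are compositional.
Clause 2: some construction is not (an irreducible phrasal pattern). -/
def BothDirectionsRight (Cs : List Construction) : Prop :=
  (∀ c ∈ Cs, c.fullyAbstract → ¬c.pragmaticFunction → c.compositional) ∧
  (∃ c ∈ Cs, ¬c.compositional)
```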

            §8. Concrete Cross-Theory Examples #

            Verify that specific combinations classify identically across theories.

            The three primary HPSG schemata map to the three universal schemata; Head-Modifier (adjunction) falls outside the classification.

            Head-Modifier falls outside Müller's three schemata. Adjunction is HPSG-specific and not part of the universal convergence claim.

            §9. Labeling Failures (§2.1) #

            Müller shows that Chomsky's labeling algorithm fails in two ways:

            1. Free relatives: rules 14a and 14b give contradictory labels (D vs V)
            2. Coordination of phrases: neither rule applies (neither daughter is an LI, neither selects the other)

            Note: The free relative SO freeRelSO models the surface structure {what, [wrote ___]} without explicitly modeling Internal Merge — "what" and the gap have different token IDs rather than being literal copies. The labeling conflict holds regardless of how the gap is represented.

            Theorems free_relative_labeling_conflict and coordination_labeling_failure were removed: their components in Theories/Syntax/Minimalist/Labeling.lean (specifically freeRel_labelCat_gives_V, coord_neither_selects) were also removed because their decide-based proofs failed in the kernel (recursive label does not reduce). The substantive Müller arguments remain documented in Labeling.lean's docstrings and in §10 below.

            §10. Monovalent Verb Serialization Problem (§2.3) #

            Merge classifies a monovalent verb's sole argument as a complement, yielding wrong linearization ("*Sleeps Max" instead of "Max sleeps").
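The linearization failure can be sketched with a toy head-initial serializer (names hypothetical): a sole argument classified as complement is serialized after the head, producing the ungrammatical order.

```lean
-- Self-contained sketch of the serialization problem.
inductive CombinationKind
  | headComplement | headSpecifier | headFiller

/-- Head-initial linearization: complements follow the head,
specifiers precede it. -/
def linearize (head arg : String) : CombinationKind → String
  | .headComplement => head ++ " " ++ arg
  | _               => arg ++ " " ++ head

-- A monovalent verb's sole argument, classified as a complement,
-- comes out head-first:
example : linearize "sleeps" "Max" .headComplement = "sleeps Max" := rfl
```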

            Theorem monovalent_verb_serialization_problem removed for the same reason — its monovalent_classified_as_complement component in CombinationSchemata.lean required decide to reduce classifyExternalMerge, which doesn't happen in the kernel. The linearization claim (monovalent_wrong_linearization) and Müller's argument both stand in their original locations.

            §11. Iterable Valence Operations (§1) #

            Lexical rules compose freely, capturing double passivization (Turkish, Lithuanian) without phrasal machinery.
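Since lexical rules are plain functions on signs, iteration is ordinary function composition. A sketch under simplified assumptions (a `Sign` with bare valence lists; the real feature structures are richer):

```lean
-- Self-contained sketch; valence lists simplified to strings.
structure Sign where
  subj  : List String  -- SUBJ valence
  comps : List String  -- COMPS valence

/-- Passive lexical rule (sketch): suppress the subject and
promote the first complement. -/
def passive (s : Sign) : Sign :=
  { subj := s.comps.take 1, comps := s.comps.drop 1 }

-- Rules compose freely: double passivization is just `passive ∘ passive`.
def doublePassive : Sign → Sign := passive ∘ passive
```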

            Summary Table #

            | Claim                    | Min             | HPSG           | CCG            | DG          | CxG                | Status            |
            |--------------------------|-----------------|----------------|----------------|-------------|--------------------|-------------------|
            | Head-Complement          | Ext Merge + sel | HeadComp       | fapp/bapp      | obj/det/... | slot decomp        | Proved            |
            | Head-Specifier           | Ext Merge − sel | HeadSubj       | (= bapp)       | subj        | slot decomp        | Proved            |
            | Head-Filler              | Int Merge       | HeadFiller     | fcomp/bcomp    | nonproj     | irreducible        | Proved            |
            | Head-Modifier            |                 | HeadMod        |                |             |                    | Not in 3 schemata |
            | Labeling                 | selector proj   | HFP            | functor result | head        |                    | Proved            |
            | Coordination             | same cat        | same cat       | same cat       |             |                    | Proved            |
            | Labeling failure (FR)    | 14a≠14b         |                |                |             |                    | Proved            |
            | Labeling failure (coord) | no rule applies |                |                |             |                    | Proved            |
            | Monovalent verb          | *Sleeps Max     |                |                |             |                    | Proved            |
            | Valence iteration        |                 | double passive |                |             |                    | Proved            |
            | Directional MG ≈ CCG     | =x ≈ X/Y        |                |                |             |                    | Sorry             |
            | Both directions right    |                 |                |                |             | abstract ∨ phrasal | Proved            |

            Note: CCG has no separate Head-Specifier mechanism. Subject combination uses backward application (the verb S\NP is the functor), which is Head-Complement in the classification. The subject/complement distinction is syntactic, not combinatory, in CCG.