@cite{koev-2017}: Bulgarian Evidentials and Spatiotemporal Distance #
The Bulgarian evidential (the -l participle) is felicitous when the speaker's evidence acquisition is spatiotemporally distant from the described event: either temporally non-overlapping (standard indirect evidence) or spatially distant (same time, different place). Direct witnessing (same time, same place) is infelicitous. In addition, the evidential contribution projects past negation and modals (it is not at issue), and the speaker remains fully committed to the proposition (a non-modal analysis, contra @cite{izvorski-1997}).
Main definitions #
- `temporallyDisjoint` / `spatiotemporallyDistant` — Koev's △ predicate (Def. 24)
- `EvidentialDatum` + `coreData` — felicity data table for the indirect, direct-witness, and smoke-from-chimney scenarios
- `trianglePredicts` — felicity prediction by △
- `triangle_predicts_all` — verification that △ matches the felicity data
TODO #
- Location field on `Event Time`: △ is parameterized over an external `loc : Event Time → L` because `Event Time` lacks a built-in location. Extending the core event type would affect ~20 files. Until then, spatial-distance reasoning takes `loc` as a parameter at use sites.
- Learning event `e_l` ontology: Koev's deepest contribution (74b) is the existential introduction of a learning event with `learn_{cs(k)}` subscripted on the context set. Not yet formalized; the substrate bridge to @cite{cumming-2026}'s downstream T ≤ A constraint is partial.
References #
- @cite{koev-2017} (primary)
- @cite{cumming-2026} (downstream consumer of △ → T ≤ A bridge)
- @cite{izvorski-1997} (contra: modal analysis Koev rejects)
Spatiotemporal Overlap Types #
Whether the described event and the learning event overlap in time.
- overlapping : TemporalOverlap
- nonoverlapping : TemporalOverlap
Instances For
Equations
- Koev2017.instDecidableEqTemporalOverlap x✝ y✝ = if h : x✝.ctorIdx = y✝.ctorIdx then isTrue ⋯ else isFalse ⋯
Equations
- One or more equations did not get rendered due to their size.
Instances For
Equations
- Koev2017.instReprTemporalOverlap = { reprPrec := Koev2017.instReprTemporalOverlap.repr }
Whether the described event and the learning event share a location.
- samePlace : SpatialRelation
- differentPlace : SpatialRelation
Instances For
Equations
- Koev2017.instDecidableEqSpatialRelation x✝ y✝ = if h : x✝.ctorIdx = y✝.ctorIdx then isTrue ⋯ else isFalse ⋯
Equations
- One or more equations did not get rendered due to their size.
Instances For
Equations
- Koev2017.instReprSpatialRelation = { reprPrec := Koev2017.instReprSpatialRelation.repr }
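The derived-instance equations above correspond to enum declarations along the following lines. This is a sketch, not the file's verbatim source; the `deriving` clause is inferred from the `DecidableEq`/`Repr` instances shown.

```lean
namespace Koev2017

/-- Whether the described event and the learning event overlap in time. -/
inductive TemporalOverlap where
  | overlapping
  | nonoverlapping
  deriving DecidableEq, Repr

/-- Whether the described event and the learning event share a location. -/
inductive SpatialRelation where
  | samePlace
  | differentPlace
  deriving DecidableEq, Repr

end Koev2017
```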
Evidential Datum Structure #
An evidential felicity datum from @cite{koev-2017}. Each records the spatiotemporal configuration of the described event and the learning event, and whether the evidential is felicitous.
- temporal : TemporalOverlap
Temporal overlap between described and learning events
- spatial : SpatialRelation
Spatial relation between described and learning events
- evidentialFelicitous : Bool
Whether the Bulgarian evidential is felicitous in this configuration
- exampleNum : String
Example number in @cite{koev-2017}
Instances For
Equations
- Koev2017.instReprEvidentialDatum = { reprPrec := Koev2017.instReprEvidentialDatum.repr }
Equations
- One or more equations did not get rendered due to their size.
Instances For
Equations
- One or more equations did not get rendered due to their size.
- Koev2017.instBEqEvidentialDatum.beq x✝¹ x✝ = false
Instances For
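Put together, the fields listed above suggest a structure declaration roughly like the following. This is a sketch; the `deriving` set is inferred from the `Repr`/`BEq` instances shown, not read from the source.

```lean
/-- An evidential felicity datum from Koev 2017. -/
structure EvidentialDatum where
  /-- Temporal overlap between described and learning events -/
  temporal : TemporalOverlap
  /-- Spatial relation between described and learning events -/
  spatial : SpatialRelation
  /-- Whether the Bulgarian evidential is felicitous in this configuration -/
  evidentialFelicitous : Bool
  /-- Example number in Koev 2017 -/
  exampleNum : String
  deriving Repr, BEq
```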
Core △ Data (@cite{koev-2017}, §4) #
(3)/(25a): Standard indirect evidence — speaker was not present when the event occurred. Non-overlapping in time, same place. Felicitous.
Equations
- Koev2017.indirectEvidence = { temporal := Koev2017.TemporalOverlap.nonoverlapping, spatial := Koev2017.SpatialRelation.samePlace, evidentialFelicitous := true, exampleNum := "25a" }
Instances For
Direct witness — speaker perceived the event as it happened. Overlapping in time, same place. Infelicitous.
Equations
- Koev2017.directWitness = { temporal := Koev2017.TemporalOverlap.overlapping, spatial := Koev2017.SpatialRelation.samePlace, evidentialFelicitous := false, exampleNum := "25a-control" }
Instances For
(25b): Smoke from chimney — speaker perceives evidence of the event from a different location, at the same time. Overlapping in time, different place. Felicitous — spatial distance suffices.
Equations
- Koev2017.smokeFromChimney = { temporal := Koev2017.TemporalOverlap.overlapping, spatial := Koev2017.SpatialRelation.differentPlace, evidentialFelicitous := true, exampleNum := "25b" }
Instances For
Commitment and Projection Data #
The evidential does not weaken commitment: "EV(p) and I know because I was there" is not contradictory (unlike a modal which would predict contradiction). @cite{koev-2017}, §3.
Equations
- Koev2017.commitmentDatum = true
Instances For
The evidential contribution projects past negation: "It is not the case that Ivan EV-came" presupposes indirect evidence while negating the proposition. @cite{koev-2017}, §5.
Equations
- Koev2017.projectionDatum = true
Instances For
△ Felicity Generalization #
△ predicts felicity: the evidential is felicitous iff the described event and the learning event are spatiotemporally distant (temporally non-overlapping or spatially distant).
Equations
- One or more equations did not get rendered due to their size.
Instances For
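The equation for `trianglePredicts` did not render above. One plausible reconstruction reads △ directly off the datum's two fields; the exact shape is an assumption.

```lean
/-- △ predicts felicity: temporally non-overlapping, or spatially distant.
    (Sketch — the file's actual definition was not rendered.) -/
def trianglePredicts (d : EvidentialDatum) : Bool :=
  d.temporal == .nonoverlapping || d.spatial == .differentPlace
```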
All core data points.
Equations
Instances For
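The unrendered `coreData` equation presumably just collects the three data points defined above; a sketch:

```lean
/-- All core data points. (Sketch — ordering is an assumption.) -/
def coreData : List EvidentialDatum :=
  [indirectEvidence, directWitness, smokeFromChimney]
```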
Data Verification #
△ correctly predicts felicity for all core data points.
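Since every piece is computable, the verification can be stated over the finite data list and discharged by `decide`. A sketch; the theorem statement in the file may differ in form.

```lean
/-- △ correctly predicts felicity for all core data points.
    (Sketch — statement form is an assumption.) -/
theorem triangle_predicts_all :
    coreData.all (fun d => trianglePredicts d == d.evidentialFelicitous) = true := by
  decide
```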
Bridge: Connecting to Linglib Infrastructure #
Bridge theorems connecting @cite{koev-2017}'s spatiotemporal distance analysis to existing linglib infrastructure, organized around the paper's four properties (property 6):
- (i) Spatiotemporal meaning — △(e, e_l): §§3–4 below
- (ii) Speaker commitment — assertion = p, non-modal: §5
- (iii)–(iv) Not at issue + Projection — presup projects past negation: §6
Plus structural bridges:
- △ vs. temporal ordering — these are independent constraints: §4
- Bridge to @cite{cumming-2026} — △ → T ≤ A (downstream evidence): §7
- Bridge to nfutL — existing fragment connection: §8
Central Claim: Learning Events #
The paper's deepest contribution is ontological: evidentials introduce a learning event e_l — the event through which the speaker acquired the reported information. The formal representation (74b):
∃e_l ∧ learn_{cs(k)}(e_l, sp(k), p) ∧ τ(e_l) ≤ time(k) ∧ e △ e_l
The learn predicate is subscripted with cs(k) (context set), not with p (scope proposition). This is the formal mechanism for not-at-issue status: the evidential restricts the context set directly (≈ presupposition), while the assertion commits the speaker to p via DECL (72).
Spatiotemporal distance △ substrate (inlined) #
@cite{koev-2017} Definition 24: two events satisfy △ when either
their temporal traces don't overlap (standard indirect evidence) or
they occur at different locations (smoke-from-chimney scenario).
Inlined from former Theories/Semantics/Events/SpatiotemporalDistance.lean
— single-consumer (this file) substrate, paper-anchored entirely on
Koev 2017. Architectural note: Event Time lacks a built-in location
field, so △ is parameterized by an external loc : Event Time → L.
Two events are temporally disjoint when their temporal traces do not overlap (Koev 2017, first disjunct of Def. 24).
Equations
- Koev2017.temporallyDisjoint e₁ e₂ = ¬e₁.τ.overlaps e₂.τ
Instances For
Spatiotemporal distance △ (Koev 2017, Def. 24). Two events are spatiotemporally distant if either their temporal traces don't overlap or they occur at different locations.
Equations
- Koev2017.spatiotemporallyDistant loc e₁ e₂ = (Koev2017.temporallyDisjoint e₁ e₂ ∨ loc e₁ ≠ loc e₂)
Instances For
If e₁ temporally precedes e₂, they are temporally disjoint (standard indirect evidence: described event finished before learning event started).
If two events are temporally disjoint and the first starts no later than the second, then the first is temporally before the second: bridges Koev's event-based △ to Cumming's point-based T ≤ A.
Overlapping runtimes are incompatible with temporal distance.
Spatial distance alone suffices for △ (Koev 2017, ex. 25b).
Temporal disjointness alone suffices for △.
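Since the equation for `spatiotemporallyDistant` shown above is a disjunction, the two sufficiency lemmas are just the disjunction-introduction rules. A sketch, with assumed theorem names and ambient `Time`/`L` variables (both guesses, not the file's verbatim source):

```lean
variable {Time L : Type} (loc : Semantics.Events.Event Time → L)

/-- Spatial distance alone suffices for △ (Koev 2017, ex. 25b).
    (Sketch — theorem name is assumed.) -/
theorem spatial_suffices (e₁ e₂ : Semantics.Events.Event Time)
    (h : loc e₁ ≠ loc e₂) : spatiotemporallyDistant loc e₁ e₂ :=
  Or.inr h

/-- Temporal disjointness alone suffices for △.
    (Sketch — theorem name is assumed.) -/
theorem temporal_suffices (e₁ e₂ : Semantics.Events.Event Time)
    (h : temporallyDisjoint e₁ e₂) : spatiotemporallyDistant loc e₁ e₂ :=
  Or.inl h
```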
Learning Scenarios (@cite{koev-2017}, §4) #
A learning scenario: the evidential introduces a learning event e_l — the event through which the speaker acquired the reported information — paired with the described event e.
The paper's representation (74b): ∃e_l ∧ learn_{cs(k)}(e_l, sp(k), p) ∧ τ(e_l) ≤ time(k) ∧ e △ e_l
The cs(k) Subscript #
The learn predicate is subscripted with cs(k) (the context set at
discourse move k), not with the scope proposition p. This is the formal
mechanism for not-at-issue status: the evidential contribution restricts
the context set directly (≈ presupposition in PrProp.presup), while
the assertion commits the speaker to p via DECL (72), which maps to
PrProp.assertion.
The mapping is:
- learn_{cs(k)}(e_l, sp(k), p) → `PrProp.presup` (restricts cs)
- DECL (72): dc^sp(c) ⊆ p → `PrProp.assertion` (commits to p)
This explains why the evidential projects past negation (property 6iv):
PrProp.neg preserves presup while negating assertion.
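On the assumption that `PrProp.neg` negates only the assertion while copying the presupposition (standard presuppositional negation), projection is definitional; a sketch:

```lean
/-- Presuppositional negation projects the presupposition (property 6iv).
    (Sketch — assumes `PrProp.neg` copies `presup` unchanged.) -/
theorem neg_preserves_presup {W : Type} (q : PrProp W) :
    (PrProp.neg q).presup = q.presup :=
  rfl
```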
What's Captured #
- The event pair (e, e_l) — the described event and the learning event
- △(e, e_l) — spatiotemporal distance, via `isTemporallyDisjoint` / `isSpatiotemporallyDistant`, and the bridge to `PrProp` via `toEvidentialProp`
- The presup/assertion split — cs(k) subscript → presup, DECL → assertion
What's Not Captured (Future Work) #
- The learn predicate itself: we don't model the knowledge-change semantics of `learn(e_l, sp(k), p)`. This would require time-indexed epistemic states: K_sp(p, t) ∧ ¬K_sp(p, t') for t' < τ(e_l).
- Propositional content p: the structure pairs events but doesn't carry the proposition learned. Adding `p : W → Bool` would require a world type parameter constraining downstream usage.
- Speech time constraint: τ(e_l) ≤ time(k) ensures the learning event is past. This interacts with tense morphology (the L-participle is morphologically past) but is not modeled here.
- Evidence source typology: the paper distinguishes reportative, inferential, and assumptive evidence (§5) via different learn predicates. We collapse these into a single △ constraint.
- described : Semantics.Events.Event Time
The described event (what happened: e.g., Ivan kissing Maria)
- learning : Semantics.Events.Event Time
The learning event (how the speaker found out: e.g., hearing a report)
Instances For
△ holds for this scenario (temporal component): the described and learning events have non-overlapping temporal traces.
Equations
Instances For
△ holds for this scenario (full spatiotemporal version): temporal disjointness OR spatial distance.
Equations
Instances For
Computable temporal △ for ℤ events: ¬(τ(e) overlaps τ(e_l)). Since integer comparison is decidable, we can evaluate △ from the event structure directly.
Equations
Instances For
triangleTemporalB agrees with the propositional isTemporallyDisjoint:
the Bool computation and the Prop predicate coincide for ℤ events.
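A plausible form of the agreement theorem, assuming the scenario structure is named `LearningScenario` and the definitions unfold by `simp`; both the names and the proof script are indicative only.

```lean
/-- `triangleTemporalB` agrees with the propositional `isTemporallyDisjoint`
    for ℤ events. (Sketch — names and tactic script are assumptions.) -/
theorem triangleTemporalB_agrees (s : LearningScenario ℤ) :
    s.triangleTemporalB = true ↔ s.isTemporallyDisjoint := by
  simp [triangleTemporalB, isTemporallyDisjoint]
```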
Construct a PrProp from a learning scenario, making the cs(k) → presup mapping constructive.
The presupposition is derived from the event structure (△ holds or not), and the assertion is the scope proposition p (committed via DECL). This is the concrete realization of Koev's (74b):
- `presup` := △(described, learning) — the evidential's cs(k) contribution
- `assertion` := p — the scope proposition
Equations
- s.toEvidentialProp p = { presup := fun (x : W) => s.triangleTemporalB = true, assertion := p }
Instances For
Concrete Scenarios #
Described event: interval [0, 5].
Equations
- Koev2017.describedEvent = { runtime := { start := 0, finish := 5, valid := Koev2017.describedEvent._proof_2 }, sort := Semantics.Events.EventSort.action }
Instances For
Learning event (indirect): interval [10, 15] — strictly later.
Equations
- Koev2017.learningEventIndirect = { runtime := { start := 10, finish := 15, valid := Koev2017.learningEventIndirect._proof_2 }, sort := Semantics.Events.EventSort.state }
Instances For
Learning event (direct witness): interval [2, 4] — overlaps described.
Equations
- Koev2017.learningEventDirect = { runtime := { start := 2, finish := 4, valid := Koev2017.learningEventDirect._proof_2 }, sort := Semantics.Events.EventSort.state }
Instances For
Learning event (spatial distance): interval [0, 5] — same time, different place (smoke from chimney).
Equations
- Koev2017.learningEventSpatial = { runtime := { start := 0, finish := 5, valid := Koev2017.describedEvent._proof_2 }, sort := Semantics.Events.EventSort.state }
Instances For
Indirect evidence scenario: described event [0,5], learning event [10,15].
Equations
- Koev2017.indirectScenario = { described := Koev2017.describedEvent, learning := Koev2017.learningEventIndirect }
Instances For
Smoke-from-chimney scenario: described event [0,5], learning event [0,5] at a different location.
Equations
- Koev2017.smokeScenario = { described := Koev2017.describedEvent, learning := Koev2017.learningEventSpatial }
Instances For
Property (i): Spatiotemporal Meaning — △ #
Indirect evidence: described and learning events are temporally disjoint — described event [0,5] finished before learning event [10,15] started. △ satisfied via temporal disjointness.
Direct witness: described event [0,5] and learning event [2,4] overlap. They are NOT temporally disjoint — △ fails (when also co-located).
The smoke scenario events temporally overlap — temporal disjointness alone does NOT yield △ here.
Despite temporal overlap, any location function assigning different locations to the described and learning events yields △. This captures the smoke-from-chimney scenario (§4): spatial distance suffices.
△ vs. Temporal Ordering (Independent Constraints) #
The paper separates two constraints in (74b):
- e △ e_l : spatiotemporal distance (the evidential's contribution)
- τ(e) < τ(e_l) : temporal ordering (the past tense's contribution)
These are independent: △ can hold via spatial distance alone (smoke
scenario has △ without temporal ordering), and temporal ordering is
imposed by tense morphology, not the evidential.
Temporal ordering: the described event PRECEDES the learning event. This is the past tense's contribution, NOT the evidential's. Paper (74b): τ(e) < τ(e_l).
The smoke scenario has NO temporal ordering (events are simultaneous), yet △ holds via spatial distance. This demonstrates that △ and temporal ordering are independent constraints.
The Four Properties (Derived from toEvidentialProp) #
All four properties (property 6) follow from toEvidentialProp:
- **(i) Spatiotemporal meaning**: presup = △(described, learning),
derived from event structure via `triangleTemporalB`
- **(ii) Speaker commitment**: assertion = p (non-modal, full commitment)
- **(iii) Not at issue**: △ is in presup (cs restriction), not assertion
- **(iv) Projection**: PrProp.neg preserves presup → △ projects past ¬
Property (6i): the presupposition of the constructed PrProp IS the △ condition, derived from the event structure. When △ holds (indirect evidence), the presupposition is satisfied at every world.
When △ fails (direct witness), the presupposition fails — the evidential sentence is undefined (infelicitous).
Equations
- Koev2017.directScenario = { described := Koev2017.describedEvent, learning := Koev2017.learningEventDirect }
Instances For
Property (6ii): the assertion of a scenario's PrProp IS the scope
proposition. The speaker commits to p, not to a modalized version.
This holds by construction: DECL (72) maps to PrProp.assertion.
A modal evidential would assert □_e(p) — "p must be true given evidence e" — a DIFFERENT proposition from p.
This is a simplified stub; the full Kratzer-grounded version is `Izvorski1997.Bridge.izvorskiEv`, which uses `necessity f g p` as the assertion and `!(accessibleWorlds f w).isEmpty` as the presup.
Equations
- Koev2017.modalEvidential evidence must_p = { presup := fun (x : W) => evidence = true, assertion := must_p }
Instances For
The modal analysis CAN weaken the assertion: there exist instantiations where the modal's assertion diverges from the scope proposition, while Koev's assertion is always p by construction.
Property (6iv): the evidential presupposition projects past negation. Negating the evidential negates the assertion (p → ¬p) but preserves the presupposition (△). This follows from PrProp's general negation rule and captures the paper's formalization (78).
Bridge to @cite{cumming-2026}: △ → T ≤ A #
For the indirect evidence case, temporal disjointness + ordering gives isBefore: τ(e).finish ≤ τ(e_l).start.
Construct Cumming's EvidentialFrame from the learning scenario: T = τ(e).finish, A = τ(e_l).start. This bridges Koev's event-based analysis to Cumming's point-based (S, A, T) frame.
Equations
- One or more equations did not get rendered due to their size.
Instances For
Cumming's downstream evidence (T ≤ A) holds for the indirect frame — the temporal special case of Koev's △.
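With the concrete intervals above (described [0,5], learning [10,15]), T = 5 and A = 10, so the downstream constraint is a decidable integer comparison. A sketch, assuming the constructed frame is named `indirectFrame` with fields `T` and `A` (both names are assumptions):

```lean
/-- Cumming's downstream evidence (T ≤ A) holds for the indirect frame.
    (Sketch — `indirectFrame` and field names `T`/`A` are assumed.) -/
theorem indirectFrame_downstream : indirectFrame.T ≤ indirectFrame.A := by
  decide
```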
Bridge to Existing Fragment #
The existing Bulgarian nfutL entry has EP = downstream (T ≤ A), which is the temporal special case of Koev's △: when spatial distance is not at play, △ reduces to temporal disjointness, and temporal disjointness + described-before-learning gives T ≤ A.