hex-canon discipline: canonical-name + #hex at every trait-color reference

Establish the hex-canon discipline in style/trait-palette.md (v0.1 -> v0.2):
every trait-color reference pairs canonical-name (semantic anchor for prose,
training-corpus extraction, LLM context) with hex (precise value for shaders,
renderers, machine-checkable canon enforcement). Mirrors the existing
color+motion-signature two-channel discipline that already secures color-blind
accessibility -- same architectural move, different audience pair.

Trait-palette.md gains:
- "Canonical name" + "Hex" columns in The full table (Eros-red #ee1b24,
  Philotes-orange #e28a46, ..., Mnemosyne-dusky-rose #cf3b74)
- New section: The hex-canon discipline -- format conventions, scope, the
  one-line grep that surfaces canon-violations
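
A minimal machine-readable mirror of what the paired columns encode
(hypothetical snippet, not a file in the repo; only the pairs quoted in this
commit message, with §The full table remaining the authoritative source):

```python
# Hypothetical mirror of the canonical-name + hex pairing; only the pairs
# quoted in this commit message are listed. The authoritative source is
# style/trait-palette.md §The full table.
TRAIT_HEX = {
    "Eros-red": "#ee1b24",
    "Philotes-orange": "#e28a46",
    "Sophrosyne-green": "#2cad52",
    "Mnemosyne-dusky-rose": "#cf3b74",
    # ... remaining traits elided; the canonical Aletheia (#fcf001) and
    # Dikaiosyne (#3f47cd) values also appear in the drift-fix below.
}
```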

Sweep applies the discipline to existing trait-color references:
- identity-and-personhood/architecture.md (trait-to-body-part bridge table)
- topology-and-rendering/architecture.md (faction-color-politics table,
  imperial-net distortion descriptors, achromatic-exception statement,
  mind-pool color-inheritance narrative)
- runtime-engine/architecture.md (cosmetic-shader feedback prose)
- identity-and-personhood/bodies.md (Moira-violet pill in pending-design notes)
- political-register/economics.md (catalogue-slug example now uses the
  canonical token)

Drift-fix: schemas/findings.md trait_colors seed-data carried 8 divergent hex
"approximations" -- Eros #D03030 vs canonical #ee1b24, Aletheia #E5C520 vs
canonical #fcf001 (losing the brightness-zenith register), Dikaiosyne #2860B0
vs canonical #3f47cd, etc. Replaced with exact propagation from the canonical
palette; the rationalizing comment ("approximate hue-family targets") now reads
"exact propagation per the hex-canon discipline"; the HSV-hue column is
recomputed from the canonical hex via standard RGB->HSV conversion,
integer-rounded.
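
The recomputation is mechanical. A minimal sketch (standard-library colorsys;
hex values are only the canonical ones quoted above, not the full palette):

```python
# Regenerate the HSV-hue column from canonical hex, integer-rounded.
# Only pairs quoted in this commit message; the full set lives in
# style/trait-palette.md §The full table.
import colorsys

CANONICAL_HEX = {
    "Eros": "#ee1b24",
    "Aletheia": "#fcf001",
    "Dikaiosyne": "#3f47cd",
}

def hue_degrees(hex_code: str) -> int:
    """Integer-rounded HSV hue (0-359) from a '#rrggbb' string."""
    r, g, b = (int(hex_code[i:i + 2], 16) / 255 for i in (1, 3, 5))
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
    return round(h * 360) % 360

for trait, hex_code in CANONICAL_HEX.items():
    print(f"{trait:10} {hex_code}  hue {hue_degrees(hex_code)}")
# e.g. Eros #ee1b24 -> hue 357
```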

Non-trait colors stay untouched per recursive-as-we-touch-it scope: machine-
aesthetic (gold rim-light, commercial-coral, fluorescent-pallor, lavender-
decor, obsidian-black, cyan, matrix-green), historical-sumptuary (Tyrian
purple), cinematic (Matrix red-pill), medical (red-green color blindness).
Each non-trait domain receives canonical-name + hex pairings when that domain
comes into architectural focus.

The discipline is mechanically checkable via one grep against the
architecture-papers, excluding the canonical source. Zero canon-violations
remain after this sweep.
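
The canonical check is the one-line grep documented in §The hex-canon
discipline; as a rough stand-in (assuming a reference keeps name and hex on
the same line, as in the hunk below, and using only the pairs quoted in this
commit message), a Python sweep might look like:

```python
# Hypothetical canon-check sketch -- not the canonical grep. Flags any line
# in the architecture-papers where a canonical trait-name or its hex appears
# without the other half of the pair.
import pathlib

CANON = {
    "Eros-red": "#ee1b24",
    "Philotes-orange": "#e28a46",
    "Sophrosyne-green": "#2cad52",
    "Mnemosyne-dusky-rose": "#cf3b74",
}

def violations(root: str = ".") -> list[str]:
    out = []
    for path in pathlib.Path(root).rglob("*.md"):
        if path.name == "trait-palette.md":  # exclude the canonical source
            continue
        for lineno, line in enumerate(path.read_text().splitlines(), 1):
            for name, hex_code in CANON.items():
                if (name in line) != (hex_code in line.lower()):
                    out.append(f"{path}:{lineno}: {name} / {hex_code} unpaired")
    return out

if __name__ == "__main__":
    print("\n".join(violations()) or "zero canon-violations")
```
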
chrysalis
2026-04-28 03:32:02 +02:00
parent ecb5080379
commit 7f0abcb839
7 changed files with 84 additions and 51 deletions

@@ -98,7 +98,7 @@ The player-gesture overlay (above) is the broadcast-channel by which player gest
 **Two latencies, one architecture:**
-- *Cosmetic visual feedback runs continuously.* NPC body-shader pulses warm-orange when the player gestures Philotes, green when they gesture Sophrosyne (per [`../style/trait-palette.md`](../style/trait-palette.md) §The full table) — purely visual, doesn't touch the system bus. The player gets moment-by-moment "I see you" reassurance.
+- *Cosmetic visual feedback runs continuously.* NPC body-shader pulses with the gesture's canonical trait-hue — Philotes-orange (`#e28a46`) on a Philotes-gesture, Sophrosyne-green (`#2cad52`) on a Sophrosyne-gesture (per [`../style/trait-palette.md`](../style/trait-palette.md) §The full table and §The hex-canon discipline) — purely visual, doesn't touch the system bus. The player gets moment-by-moment "I see you" reassurance.
 - *Systemic alignment-update runs at axis-cadence.* The trait-vector summary is computed at the crossing and fed into the next turn's LLM context. The player is **one turn behind** — their gestures during turn N inform the LLM's generation for turn N+1.
 **One-turn-behind matches real conversation cadence.** Humans accumulate impressions during someone's speech and respond to *what was just said*; we don't react gesture-by-gesture. The lemniscate-bound integration is the *right* cadence for affective state-changes, not a latency-cost.
@@ -453,4 +453,4 @@ This converts emergent-zones into a **competitive attention economy**. Multiple
 ---
-**Version:** 0.7.0 | **Created:** 2026-04-26 | **Updated:** 2026-04-27 | **Origin:** Split from architecture-index.md v0.7 (2026-04-26)
+**Version:** 0.7.0 | **Created:** 2026-04-26 | **Updated:** 2026-04-28 | **Origin:** Split from architecture-index.md v0.7 (2026-04-26)