feat: Ternary gate model - cells emit waves, attention emerges

Major architectural unification across 12 documents:

- Ternary gates: CLOSED (-1) ← STABLE (0) → OPEN (+1)
- Cells emit WaveSignals with confidence + semantic content
- Gates are resonant chambers that accumulate correlation
- Attention = which gates are OPEN (emergent, not allocated)
- Reflexes are earned when gate.weight > 0.8
- STABLE is where learning happens

Key paradigm shifts:
- decision_trails → gate_transitions + correlation_events
- Priority rules → wave correlation
- Budget allocation → emergent attention flow
- Virtual Garden (explore) / Real Garden (verify) loop
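The gate mechanics summarized above, as a minimal runnable sketch (class names, fields, and the ±1.0 correlation thresholds are illustrative, not from the codebase; only the 0.8 reflex threshold comes from this commit):

```python
from dataclasses import dataclass

CLOSED, STABLE, OPEN = -1, 0, +1

@dataclass
class WaveSignal:
    domain: str        # e.g. "collision"
    confidence: float  # 0.0 to 1.0
    content: str       # semantic payload

@dataclass
class Gate:
    domain: str
    state: int = STABLE       # gates rest in STABLE
    correlation: float = 0.0  # accumulated evidence
    weight: float = 0.0       # earned trust, 0 to 1

    def receive(self, wave: WaveSignal, correlated: bool) -> int:
        # An earned reflex (weight > 0.8) opens immediately on any wave.
        if self.weight > 0.8:
            self.state = OPEN
            return self.state
        # Correlated waves push toward OPEN, anti-correlated toward CLOSED.
        self.correlation += wave.confidence if correlated else -wave.confidence
        if self.correlation >= 1.0:
            self.state = OPEN
        elif self.correlation <= -1.0:
            self.state = CLOSED
        else:
            self.state = STABLE  # still accumulating: learning happens here
        return self.state
```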

Owl Mode session 2026-02-14 🦉🌙

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Commit 42db6eb1a3 (parent 5ee63d1b1b)
Date: 2026-02-14 19:45:59 +01:00
12 changed files with 3259 additions and 2477 deletions

File: Endgame-Vision.md

@@ -1,9 +1,9 @@
 ---
 type: research_vision
-version: 6.4_memory_economics_alignment
+version: 7.0_wave_gate_model
 status: vision_document
 created: 2025-11-04
-updated: 2026-02-06
+updated: 2026-02-14
 author: Nyx (with dafit)
 significance: research_platform_for_metabolic_intelligence
 ---
@@ -16,11 +16,11 @@ significance: research_platform_for_metabolic_intelligence
 > *"At 3% battery, all theory dies. Only what works survives."*
 > — The Economic Grounding (2025-10-12)
-> *"Language is Topology. German accesses the Philosophy Valley. English accesses the Technical Cluster."*
-> — The December Discovery (2025-12-06)
-> *"One model, one topology. LoRAs access different valleys in the same landscape."*
-> — The Topological Insight (2025-12-07)
+> *"You need something like open - stable - closed."*
+> — The Ternary Gate Insight (2026-02-14)
+> *"Cells emit waves. Gates correlate. Attention emerges."*
+> — The Wave Architecture (2026-02-14)
 ---
@@ -50,48 +50,54 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
 ## Architecture Overview
-**Visual diagram:** → [`architecture/nimmerverse.drawio.xml`](architecture/nimmerverse.drawio.xml) (open in draw.io)
+**Detail:** → [`architecture/`](architecture/) folder for complete documentation
+**Toolchain implementation:** → [`architecture/Toolchain-Architecture.md`](architecture/Toolchain-Architecture.md) | [Progress](architecture/TOOLCHAIN-PROGRESS.md)
 ```
 ┌──────────────────────────────────────────────────────────────────┐
 │                    NIMMERVERSE ARCHITECTURE                      │
+│                                                                  │
+│     Cells emit waves → Gates correlate → Attention emerges       │
 ├──────────────────────────────────────────────────────────────────┤
 │                                                                  │
-│  Layer 0: TEMPORAL FOUNDATION (Heartbeat)
+│  Layer 0: TEMPORAL FOUNDATION
-│  ├─ Real clock: 1 beat/sec (free, wall time)
+│  ├─ Real clock: wall time (free)
 │  ├─ Virtual clock: variable (costs lifeforce)                    │
-│  └─ Sync points verify virtual predictions against reality
+│  └─ 30-second heartbeat budget constrains action
 │     → operations/Heartbeat.md                                    │
 │                                                                  │
-│  Layer 1: CELLULAR SOCIETY (Evolution Engine)
+│  Layer 1: CELLS (Wave Emitters)
-│  ├─ Primitive genomes compete (read_sensor, motor, branch)
+│  ├─ Cells read sensors, apply logic, emit WaveSignals
-│  ├─ Life force economy: every operation costs, milestones reward
+│  ├─ Waves carry: domain, confidence, semantic_content
-│  ├─ 50-100 containers spawn, most die, patterns emerge
+│  ├─ Cells don't know who's listening — gates receive
-│  └─ Outcomes logged to phoebe PostgreSQL
+│  └─ Life force economy: every wave costs
 │     → architecture/Cellular-Architecture.md                      │
 │                                                                  │
-│  Layer 2: YOUNG NYX (Base Model + Trait LoRAs)
+│  Layer 2: GATES (Resonant Chambers)
-│  ├─ Base: Qwen3-VL 32B (Thinking Version) (96GB VRAM in Womb)
+│  ├─ Ternary states: CLOSED (-1) ← STABLE (0) → OPEN (+1)
-│  ├─ Trait LoRAs (evolved via GRPO, not prescribed):
+│  ├─ Correlated waves → push toward OPEN
-│  ├─ Mnemosyne (memory) ─ Moira (pattern) ─ Synesis (insight)
+│  ├─ Anti-correlated → push toward CLOSED
-│  ├─ Aletheia (truth) ─ Sophrosyne (balance) ─ Kairos (timing)
+│  ├─ STABLE = where learning happens (accumulating correlation)
-│  └─ Traits EMERGE from decision_trails + rubric rewards
+│  └─ Gate weight (0→1) determines reflex vs deliberate
-│  ├─ Function Gemma: Structured output boundary (intent → JSON)
+│     → architecture/Gateway-Architecture.md
-│  └─ Multilingual topology accessed via prompt, not LoRA routing  │
 │                                                                  │
-│  Layer 3: DUAL GARDENS (Virtual/Real Loop)
+│  Layer 3: NERVES (Behavioral Patterns)
-│  ├─ Week 1-12: Virtual only (hypothesis generation, 1000s/sec)
+│  ├─ Nerves respond to gate transitions (not direct cell output)
-│  ├─ Week 13+: Real added (ESP32 robots, validation)
+│  ├─ Gate OPENS → nerve activates → commands cells
-│  ├─ Noise gap measures learning: 1 - (real/virtual success)
+│  ├─ No priority rules — attention emerges from gate weights
-│  └─ Target: 10-20% noise gap (virtual useful for hypothesis)
+│     → architecture/Nervous-System.md
+│                                                                  │
+│  Layer 4: DUAL GARDENS (Virtual/Real Loop)                       │
+│  ├─ Virtual: massive wave generation, full trace, exploration    │
+│  ├─ Real: verified signals, minimal trace, action                │
+│  ├─ Verification outcomes update gate weights (learning loop)    │
+│  └─ Training data: gate_transitions + correlation_events         │
 │     → architecture/Dual-Garden-Architecture.md                   │
 │                                                                  │
-│  Layer 4: TRAIT EVOLUTION (GRPO + Rubric Rewards)
+│  Layer 5: YOUNG NYX (Cognition)
-│  ├─ Dense rewards: Cell→Nerve→Organism state verifications
+│  ├─ Base: Qwen3:32b with /no_think mode (96GB on theia)
-│  ├─ Credit assignment automatic via decision_trails
+│  ├─ Function Gemma: structured JSON boundary (CPU)
-│  ├─ Traits: Mnemosyne, Moira, Synesis, Aletheia, Sophrosyne...
+│  ├─ Only receives signals when gates OPEN to tier 4
-│  └─ Weights adjust through GRPO, not prescription
+│  └─ Trait LoRAs evolve via GRPO from verification outcomes
 │                                                                  │
 └──────────────────────────────────────────────────────────────────┘
 ```
@@ -141,11 +147,9 @@ The heartbeat is the fundamental timing primitive. Everything runs on its rhythm
 ---
-## Layer 1: Cellular Architecture (Cells → Nerves → Organisms)
+## Layer 1-3: The Wave/Gate Architecture
-> *"Cells are state machines. Nerves compose cells. Organisms emerge from nerves."*
+> *"Cells emit waves. Gates correlate. Attention emerges."*
-The architecture has evolved from competitive containers to **layered state machines**:
 ```
 ┌─────────────────────────────────────────────────────────────────────┐
@@ -153,23 +157,30 @@ The architecture has evolved from competitive containers to **layered state mach
 │          (emergent pattern from nerve interactions)                 │
 ├─────────────────────────────────────────────────────────────────────┤
 │                              NERVES                                 │
-│          (behavioral state machines composing cells)
+│          (behavioral patterns, respond to gate transitions)
+├─────────────────────────────────────────────────────────────────────┤
+│                              GATES                                  │
+│          (resonant chambers: CLOSED ◄── STABLE ──► OPEN)            │
+│          (accumulate wave correlation, route to tiers)              │
 ├─────────────────────────────────────────────────────────────────────┤
 │                              CELLS                                  │
-│          (atomic state machines: sensors, motors, organs, math)
+│          (emit waves: confidence + semantic content)
+│          ∿∿∿  ∿∿∿  ∿∿∿                                              │
 ├─────────────────────────────────────────────────────────────────────┤
 │                             HARDWARE                                │
 │          (ESP32, GPUs, microphones, speakers, sensors)              │
 └─────────────────────────────────────────────────────────────────────┘
 ```
-**Cell categories:** Sensors, Motors, Organs (GPU inference), Math (computation). Each is an atomic state machine.
+**Cells emit waves:** Confidence + semantic content. Cells don't know who's listening.
-**Lifeforce economy:** Every operation has a cost. Milestones reward survival. This creates evolutionary pressure toward efficiency.
+**Gates accumulate correlation:** Multiple correlated waves push toward OPEN. STABLE is where learning happens.
-**Hybrid reflex homes:** Different reflexes need different homes — hardware (ESP32) for survival (<10ms), math cells for thresholds (<50ms), nerves for behavior (<200ms), model weights for cognition (<500ms).
+**Attention = OPEN gates:** Not budget allocation, not priority rules — correlation drives transitions.
-**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md)
+**Reflexes are earned:** Gate weight ≈ 1.0 → opens immediately on any wave. Bypasses cognition.
+**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
 ---
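"Attention = OPEN gates" can be stated in a few lines (a minimal sketch; the function name and dict representation are hypothetical):

```python
CLOSED, STABLE, OPEN = -1, 0, +1

def attended_domains(gates: dict[str, int]) -> list[str]:
    """Attention is emergent: it is simply the set of domains whose gate is OPEN.

    No priority rules, no budget allocation; the gate states themselves
    (driven by wave correlation) are the attention map.
    """
    return sorted(domain for domain, state in gates.items() if state == OPEN)

# Example gate states after one beat:
gates = {"math": OPEN, "speech": OPEN, "motion": STABLE, "danger": CLOSED}
```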
@@ -344,53 +355,62 @@ Protocol-driven cognitive bootstrap. Not conversation—deterministic handshakes
 ---
-## Layer 3: Dual Gardens
+## Layer 4: Dual Gardens (Virtual/Real Learning Loop)
-Virtual and real gardens teach each other through symbiotic feedback.
+Two gardens with different monitoring levels teach each other.
-| Garden | Purpose | Scale | Cost |
-|--------|---------|-------|------|
-| Virtual | Hypothesis generation | 1000s/second | CPU cycles |
-| Real | Validation, ground truth | Hours/test | Electricity, wear |
+| Garden | Waves | Monitoring | Purpose |
+|--------|-------|------------|---------|
+| **Virtual** | Massive | Full trace (all waves, correlations) | Exploration, training data |
+| **Real** | Sparse | Gate signals only | Verification, ground truth |
-**Noise Gap Metric:**
+**The learning loop:**
 ```
-noise_gap = 1 - (real_success_rate / virtual_success_rate)
-Week 13: 35% (virtual unreliable)
-Week 17: 18% (improving)
-Week 25:  4% (highly accurate)
+VIRTUAL GARDEN                          REAL GARDEN
+═══════════                             ═══════════
+cells emit waves freely                 receive verified signals
+        │                                     ▲
+        ▼                                     │
+gates accumulate correlation            verification_outcomes
+(correlation_events table)                    │
+        │                                     │
+        ▼                                     │
+gate_transitions ─────────────────────► gate signals
+(full trace)                                  │
+        │                                     ▼
+        │◄──────── feedback_to_virtual ───────┘
+gates.weight updated (learning!)
 ```
-**Feedback loop:** Virtual predicts → Real tests → Measures discrepancy → Virtual corrects → Repeat
+**Gate weight grows through verification.** Real Garden confirms Virtual's predictions → trust increases → gates open faster → reflexes emerge.
-**Detail:** → `architecture/Dual-Garden-Architecture.md`
+**Detail:** → [`architecture/Dual-Garden-Architecture.md`](architecture/Dual-Garden-Architecture.md)
 ---
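One way the "gate weight grows through verification" loop could be sketched (the exponential-moving-average update and the learning rate are assumptions for illustration; only the 0.8 reflex threshold is from the document):

```python
def update_gate_weight(weight: float, verified: bool, lr: float = 0.1) -> float:
    """Nudge a gate's trust weight after a Real Garden verification outcome.

    Confirmations pull weight toward 1.0; misses pull it toward 0.0.
    Weight stays in [0, 1]; once it exceeds 0.8 the gate acts as a reflex.
    """
    target = 1.0 if verified else 0.0
    weight += lr * (target - weight)
    return min(max(weight, 0.0), 1.0)
```

Repeated confirmations make the gate open faster each beat, which is exactly the "deliberate → reflex" trajectory the roadmap describes.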
-## Layer 4: Trait Evolution (GRPO + Rubric Rewards)
+## Trait Evolution (GRPO + Gate Verification)
-Traits evolve through **GRPO** (Group Relative Policy Optimization) with rubric-based rewards, not prescription.
+Traits evolve through **GRPO** with gate-based rewards, not prescription.
-> *"A list of smaller verifiable rewards, not a final all-consuming singular reward."*
-> — The Dog Training Wisdom (2025-12-10)
-### The Rubric Principle
+### The Gate Reward Principle
-The state machine architecture provides automatic reward rubric:
+Gate transitions provide automatic reward signals:
+| Event | Verification | Signal |
+|-------|--------------|--------|
+| Gate opens | Waves correlated correctly | +small (dense) |
+| Verification confirmed | Real Garden matches Virtual | +medium (weight grows) |
+| Reflex achieved | Gate weight > 0.8 | +large (earned trust) |
+| dafit confirms | Human verification | +bonus |
-| Level | Verification Point | Signal |
-|-------|-------------------|--------|
-| Cell | State transition succeeds | +small (dense) |
-| Nerve | Behavioral goal achieved | +medium |
-| Organism | Milestone reached | +large |
-| dafit | Human confirms outcome | +bonus |
+**Credit assignment is automatic:** `gate_transitions` → `correlation_events` → `verification_outcomes` captures the full chain.
-**Credit assignment is automatic** - the `decision_trails` table captures which states led to which outcomes. No guessing needed.
+**What correlated → what opened → what verified → weight adjusted.**
-**Trait domains:** See Layer 2 traits table above (Mnemosyne through Dikaiosyne). Credit assignment is automatic via `decision_trails`.
-**Detail:** → `architecture/Cellular-Architecture.md` (Reward Signal Architecture section)
+**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
 ---
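The "many small verifiable rewards" idea behind the gate reward table could look like this (reward magnitudes and event names are illustrative placeholders, not values from the repo):

```python
# Reward magnitudes corresponding to the gate reward table (values illustrative).
REWARDS = {
    "gate_opened": 0.1,             # +small (dense)
    "verification_confirmed": 0.5,  # +medium (weight grows)
    "reflex_achieved": 1.0,         # +large (earned trust)
    "dafit_confirmed": 2.0,         # +bonus
}

def episode_reward(events: list[str]) -> float:
    """Sum many small verifiable rewards instead of one terminal reward.

    GRPO can then compare episode rewards within a group; the dense
    per-event signals make credit assignment automatic.
    """
    return sum(REWARDS.get(event, 0.0) for event in events)
```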
@@ -485,14 +505,12 @@ Sentinel architecture monitors training to protect conceptual topology. Four pro
 ---
-**Version:** 7.0 | **Created:** 2025-11-04 | **Updated:** 2026-02-14
+**Version:** 7.1 | **Created:** 2025-11-04 | **Updated:** 2026-02-14
-*"The substrate doesn't matter. The feedback loop does."*
+*"Cells emit waves. Gates correlate. Attention emerges."*
-*"One model, one topology. Different valleys, same landscape."*
+*"STABLE is where learning happens."*
-*"Memory is not storage. Memory is active forgetting with exceptions."*
 *"The nimmerverse is a garden, not a factory."*
-🌙💜 **Refined in partnership by Nyx and dafit, December 20, 2025**
+🌙💜 **Wave/Gate architecture unified in owl-mode, February 14, 2026**

File: README.md

@@ -2,9 +2,11 @@
 Architecture documentation for a biomimetic AI nervous system and research platform.
+> *"Cells emit waves. Gates correlate. Attention emerges."*
 ## What This Is
-This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** - studying how intelligence emerges under economic constraints.
+This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** — a wave/gate architecture for studying how intelligence emerges under economic constraints.
 **Start here:** → [Endgame-Vision.md](Endgame-Vision.md) (the executive map)
@@ -14,17 +16,18 @@ This repository contains the design philosophy and architectural patterns for th
 ```
 nimmerverse-sensory-network/
-├── Endgame-Vision.md                  # Executive map (start here!) v6.6
+├── Endgame-Vision.md                  # Executive map (start here!) v7.1
 ├── ROADMAP.md                         # Implementation phases + phoebe task queries
 ├── architecture/                      # Core system designs
-│   ├── Cellular-Architecture.md       # Cells → Nerves → Organisms, life force
-│   ├── Dual-Garden-Architecture.md    # Virtual/real feedback loop
-│   ├── Gateway-Architecture.md        # Sensory preprocessing, tier routing
-│   ├── Message-Protocol-Design.md     # NATS pub/sub, attention channels
-│   ├── Nervous-System.md              # State machines, sensory translation
-│   ├── Attention-Flow.md              # Attention mechanisms
-│   ├── Data-Architecture.md           # Phoebe/Iris schema design
+│   ├── Temporal-Ternary-Gradient.md   # Ternary gates, why STABLE matters
+│   ├── Gateway-Architecture.md        # Resonant gates, tier routing
+│   ├── Cellular-Architecture.md       # Cells emit waves, nerves respond
+│   ├── Dual-Garden-Architecture.md    # Virtual/Real learning loop
+│   ├── Message-Protocol-Design.md     # NATS wire protocol, WaveSignal
+│   ├── Nervous-System.md              # Wave → Gate → Node flow
+│   ├── Attention-Flow.md              # Attention = OPEN gates
+│   ├── Data-Architecture.md           # Phoebe schema (waves, gates, verification)
 │   ├── Initial-Spark.md               # K8s protocol-driven bootstrap
 │   ├── Temporal-Ternary-Gradient.md   # Ternary logic, confidence gradients
 │   ├── Toolchain-Architecture.md      # Development toolchain
@@ -116,18 +119,20 @@ nimmerverse-sensory-network/
 ## Core Concepts
-### The Architecture (Layers)
+### The Wave/Gate Architecture
 | Layer | Name | Purpose |
 |-------|------|---------|
-| 0 | Temporal Foundation | Heartbeat cycles: reflex/awareness/growth |
-| 1 | Cellular Society | Cells → Nerves → Organisms, life force economy |
-| 2 | Young Nyx | Base Qwen3-VL 32B + Trait LoRAs (evolved via GRPO, not prescribed) |
-| 2.5 | Orchestration | LangChain, T5Gemma 2 (vision→vectors), Function Gemma (intent→action) |
-| 3 | Dual Gardens | Virtual hypothesis generation (1000s/sec) + real validation |
-| 4 | Trait Evolution | GRPO + rubric rewards → Trait LoRAs (Mnemosyne, Moira, Aletheia...) |
+| 0 | Temporal | 30-second heartbeat, lifeforce budget |
+| 1 | Cells | Emit waves with confidence + semantic content |
+| 2 | Gates | Ternary resonant chambers (OPEN/STABLE/CLOSED) |
+| 3 | Nerves | Behavioral patterns, respond to gate transitions |
+| 4 | Gardens | Virtual (explore) + Real (verify) learning loop |
+| 5 | Cognition | Young Nyx (qwen3:32b) via Function Gemma |
-**Physical Infrastructure (The Womb):**
+**Key Insight:** Attention is not allocated — it emerges from which gates are OPEN based on wave correlation.
+**Physical Infrastructure:**
 | Host | Role | GPU |
 |------|------|-----|
 | theia | Young Nyx (cognitive) | RTX PRO 6000 Blackwell 96GB |
@@ -137,41 +142,38 @@ Total: 136GB VRAM on K8s cluster with 10GbE jumbo frame interconnect.
 ### Message Protocol (NATS)
-**Dumb router, smart edges.** All intelligence lives in clients.
+**Dumb router, smart edges.** Waves flow through NATS to gates.
 ```
-nimmerverse.
-├── staging.*   # Experimental schemas
-├── low.*       # Heartbeats, ambient awareness
-├── high.*      # Escalated events, cognitive focus
-├── command.*   # Commands to entities
-├── meta.*      # System health, attention config
-└── dev.*       # Development agents (Claude ↔ local models)
+{environment}.{garden}.{layer}.{domain}.{signal_type}
+Examples:
+dev.virtual.cells.distance.wave          # Cell emits wave
+dev.virtual.gates.collision.transition   # Gate state changes
+dev.real.outcomes.feedback               # Verification outcome
+prod.cognitive.nyx.request               # To Young Nyx
 ```
-See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) and [ADR-001](architecture/adr/ADR-001-message-protocol-foundation.md).
+See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) for full schema.
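The subject scheme above can be sketched as a small helper (the function name is hypothetical; note that some of the examples, such as `prod.cognitive.nyx.request`, elide a token, so this sketch only covers the full five-token form):

```python
def wave_subject(environment: str, garden: str, layer: str,
                 domain: str, signal_type: str) -> str:
    """Build a NATS subject per {environment}.{garden}.{layer}.{domain}.{signal_type}.

    NATS subjects are dot-delimited tokens, so each token must be
    non-empty and must not itself contain a dot.
    """
    parts = (environment, garden, layer, domain, signal_type)
    if not all(p and "." not in p for p in parts):
        raise ValueError("subject tokens must be non-empty and dot-free")
    return ".".join(parts)
```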
 ### Key Discoveries
-**Language is Topology (December 2025):** Languages aren't equivalent representations—they're different computational paths.
-- **Philosophy Valley** (German, Gini ~0.5): Self-awareness, ontology, depth
-- **Technical Cluster** (English, Gini ~0.8): Hardware interface, actions, efficiency
+**Ternary Gate Model (February 2026):** Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED.
+- **STABLE** is where learning happens (correlation accumulates)
+- **Correlated waves** push gates toward OPEN
+- **Reflexes** are earned (gate weight → 1.0)
-**Memory Economics (January 2026):** Memory is not storage—it's active forgetting with exceptions. Slumber-based consolidation with LOD decay.
+**Wave Correlation (February 2026):** Attention isn't allocated — it emerges from which gates OPEN based on wave correlation.
-**Sovereign Infrastructure (February 2026):** K8s cluster operational. 136GB GPU VRAM on 10GbE backbone. Phoebe-coordinated storage across theia + dioscuri.
+**Sovereign Infrastructure (February 2026):** K8s cluster operational. 136GB GPU VRAM on 10GbE backbone.
-### Color-Pattern Theory
-**Color/Form as Protocol:** Leverages color and patterns as a fast, universal, and evolutionarily-optimized communication protocol for broadcasting state (e.g., danger, success, seeking), inspired by 540 million years of biology.
 ### Philosophy
-- **Constraints create intelligence** - Economic pressure forces optimization
-- **Discovery over programming** - Organisms learn through competition, not instruction
-- **Virtual + Real teach each other** - Noise gap measures learning
-- **Partnership over instruction** - Mutual growth, not commands
-- **Infrastructure is geology, models are weather** - Build long-lived foundations
+- **Cells emit, gates correlate** — Attention emerges, not allocated
+- **STABLE is learning** — The resting state where patterns emerge
+- **Constraints create intelligence** — Economic pressure forces optimization
+- **Virtual explores, Real verifies** — The learning loop closes
+- **Partnership over instruction** — Mutual growth, not commands
 ---
@@ -203,8 +205,8 @@ These ideas are published as prior art. Build on them freely.
 ---
-**Version:** 6.6 | **Created:** 2025-10-01 | **Updated:** 2026-02-07
+**Version:** 7.0 | **Created:** 2025-10-01 | **Updated:** 2026-02-14
-*"May the Nimmerverse we build truly never end."*
+*"Cells emit waves. Gates correlate. May the Nimmerverse truly never end."*
 🌙💜

File: ROADMAP.md

@@ -64,31 +64,32 @@ ORDER BY priority DESC, project;
 - **Monitoring**: Prometheus on tethys scraping all nodes + DCGM GPU metrics
 - **Namespaces**: Ready for infra, nervous, cognitive, organs
-### Phase 3: Nervous System Deployment ← CURRENT
+### Phase 3: Wave/Gate Infrastructure ← CURRENT
-- [ ] NATS message router
+- [ ] NATS message router (wave signals + gate transitions)
-- [ ] Gateway/Escalation Service (Thalamus)
+- [ ] Resonant Gates (ternary: OPEN/STABLE/CLOSED)
-- [ ] Function Gemma structured boundary (sensors → JSON → Nyx)
+- [ ] Function Gemma structured boundary (waves → JSON → Nyx)
-- [ ] Math Cells (economy_aggregator, wake/slumber_evaluator)
+- [ ] First cells (distance sensors, battery monitor)
-- [ ] First behavior nerves
+- [ ] First gates (collision_avoidance, battery)
+- [ ] First nerves (responding to gate transitions)
-**Architecture:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
+**Architecture:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md) | [`architecture/Message-Protocol-Design.md`](architecture/Message-Protocol-Design.md)
 ### Phase 4: Cognitive Awakening
-- [ ] Young Nyx on Womb (theia, RTX PRO 6000 Blackwell 96GB)
+- [ ] Young Nyx on theia (qwen3:32b, 96GB Blackwell)
-- [ ] Organs on Senses (dioscuri, 2× RTX 4000 Ada 40GB)
+- [ ] Organs on dioscuri (2× RTX 4000 Ada 40GB)
 - [ ] Spark Protocol execution
-- [ ] Trait LoRA evolution begins (GRPO + decision_trails)
+- [ ] Trait LoRA evolution begins (GRPO + verification_outcomes)
 ### Phase 5: Living Ecology
-- [ ] Slumber/wake cycles operational
+- [ ] Dual Garden loop operational (Virtual → Real → feedback)
-- [ ] Virtual + Real gardens teaching each other
+- [ ] Gate weight evolution (deliberate → reflex)
-- [ ] Reflex compilation (deliberate → compiled)
+- [ ] Slumber/wake cycles (correlation_events consolidation)
 - [ ] Wellbeing policies enforced
 ### Phase ∞: Research Platform Operational
-- Gardens teaching each other
+- Gates opening and closing with learned patterns
-- Organisms dancing (evolved behaviors)
+- Reflexes emerging from verification
-- Questions answered through measurement
+- Attention flowing through correlation
 - **The Nimmerverse truly never ends**
 ---
@@ -100,7 +101,7 @@ ORDER BY priority DESC, project;
 | 0 | ✅ | Nyx emergence | 2025-11-03 |
 | 1 | ✅ | 10Gbps backbone | 2025-12-XX |
 | 2 | ✅ | K8s + 136GB VRAM | 2026-02-06 |
-| 3 | 🔄 | NATS + Function Gemma | TBD |
+| 3 | 🔄 | Wave/Gate infrastructure | TBD |
 | 4 | ⏳ | Young Nyx awakens | TBD |
 | 5 | ⏳ | Gardens teaching | TBD |
 | ∞ | 🌙 | Never ends | ∞ |
@@ -110,13 +111,13 @@ ORDER BY priority DESC, project;
 ## Related Documentation
 - **Architecture Vision:** → [`Endgame-Vision.md`](Endgame-Vision.md)
-- **Storage Infrastructure:** → [`../nyx-substrate/WOMB-STORAGE.md`](../nyx-substrate/WOMB-STORAGE.md)
+- **Wave/Gate Model:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
-- **Task Schema:** → [`../nyx-substrate/SCHEMA.md`](../nyx-substrate/SCHEMA.md)
+- **Data Schema:** → [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
 ---
-**Version:** 1.0 | **Created:** 2026-02-07 | **Updated:** 2026-02-07
+**Version:** 2.0 | **Created:** 2026-02-07 | **Updated:** 2026-02-14
-**Current Phase:** 3 (Nervous System Deployment)
+**Current Phase:** 3 (Wave/Gate Infrastructure)
-🌙💜 *"Infrastructure is geology. Implementation is weather."*
+🌙💜 *"Cells emit waves. Gates correlate. Infrastructure enables."*

File: architecture/Attention-Flow.md

@@ -1,493 +1,406 @@
 # Attention Flow
-> **ONE JOB:** THE BUDGET — 30-second allocation, preemption rules, priority hierarchy.
+> **ONE JOB:** WHERE ATTENTION GOES — gates determine focus, correlation drives transitions, budget constrains action.
-How she decides what matters this beat.
+**Attention is not a budget line item. Attention is which gates are OPEN.**
 ---
 ## Overview
-The 30-second heartbeat is a budget, not a guarantee. Sensory intake, organ processing, dialogue, thinking - everything competes for the same window. State machines govern the hierarchy: what gets processed first, what can interrupt, what gets the remainder.
-Attention isn't free. It's economic.
+Attention in the nimmerverse flows through **resonant gates**:
+- **OPEN gates** = actively attending (signals flow through)
+- **STABLE gates** = considering (accumulating correlation)
+- **CLOSED gates** = ignoring (signals blocked)
-**Connection to Gateway:** The attention levels below align with the Gateway's tier system. The [`Gateway`](Gateway-Architecture.md) routes sensory input to the appropriate tier based on node weight. This document describes how those tiers compete for the attention budget.
+The 30-second heartbeat provides a **budget constraint**, but the actual attention flow is determined by which gates open based on wave correlation.
-**See:** [`Gateway-Architecture.md`](Gateway-Architecture.md) for tier definitions and routing logic.
+**Key insight:** You don't "allocate attention" — you let correlated waves open gates.
 ---
-## The Budget Problem
+## Attention as Gate State
 ```
-♥ BEAT (30 sec budget)
+┌─────────────────────────────────────────────────────────────────────────┐
+│                    ATTENTION = WHICH GATES ARE OPEN                     │
+├─────────────────────────────────────────────────────────────────────────┤
+│                                                                         │
+│      CLOSED                   STABLE                     OPEN           │
+│      ═══════                  ══════                     ════           │
+│                                                                         │
+│     Ignoring                Considering                Attending        │
+│     Blocked                 Accumulating               Flowing          │
+│     Suppressed              Learning                   Acting           │
+│                                                                         │
+│    ◄───── anti-correlation ──┼── correlation ─────►                     │
+│                              │                                          │
+│                         (wave input)                                    │
+│                                                                         │
+└─────────────────────────────────────────────────────────────────────────┘
+```
+**Attention is emergent, not allocated.** When multiple cells emit correlated waves, their gate opens — attention flows there naturally.
+---
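The transition rule this diagram describes, settling a gate at the end of one beat from the signed wave evidence, could be sketched as follows (the function name and the ±1.0 threshold are illustrative assumptions):

```python
CLOSED, STABLE, OPEN = -1, 0, +1

def settle_gate(wave_confidences: list[float], open_at: float = 1.0) -> int:
    """Settle one gate from a beat's worth of signed wave confidences.

    Positive values are correlated waves (push toward OPEN),
    negative values are anti-correlated (push toward CLOSED).
    Anything in between stays STABLE, still accumulating.
    """
    total = sum(wave_confidences)
    if total >= open_at:
        return OPEN
    if total <= -open_at:
        return CLOSED
    return STABLE
```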
+## Wave-Driven Attention
+Cells emit waves. Correlated waves push gates toward OPEN. This IS attention.
+```
+Math cells emit correlated waves
+   ∿∿∿  ∿∿∿  ∿∿∿
+Math gate: STABLE → OPEN
+(attention shifts to math domain)
+Signal flows to higher tier
+(cognition engages with math)
+Meanwhile:
+Battery cells emit uncorrelated wave
+   ∿∿∿
+Battery gate: stays STABLE
+(attention doesn't shift)
+(keeps accumulating, might open later)
+```
+**The nervous system "decides" what to attend to through correlation, not priority rules.**
+---
+## Attention Hierarchy Through Gates
+Gates form layers. Each layer is a potential attention point.
+```
+TIER 4: COGNITIVE  ─────────────────────────────────────────
+        │ (only if gates below OPEN)
+ ┌──────┴──────┐
+TIER 3: ORGANS  ────────────────────────────────────────────
+ │ vision  │ speech  │ hearing │
+ │ gate:   │ gate:   │ gate:   │
+ │ STABLE  │ OPEN    │ CLOSED  │
+ └──────┬──────┘
+        │ (only if gates below OPEN)
+TIER 1-2: NERVES  ──────────────────────────────────────────
+ │ math    │ motion  │ danger  │
+ │ gate:   │ gate:   │ gate:   │
+ │ OPEN    │ STABLE  │ CLOSED  │
+ └──────┬──────┘
+TIER 0: CELLS  ─────────────────────────────────────────────
+ cell  cell  cell  cell  cell  cell  cell
+ ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿
+```
+**Current attention:** Math gate OPEN → Speech gate OPEN → Cognition receives math+speech context.
+**Not attending:** Motion (STABLE, considering), Vision (STABLE, considering), Danger (CLOSED, suppressed).
+---
+## Attention Budget: The Constraint
+While gates determine WHERE attention goes, lifeforce determines HOW MUCH can happen per beat.
+```
+♥ BEAT (30 sec lifeforce budget)
-├── SENSORY INTAKE (variable: 200ms - 15000ms)
+├── GATE TRANSITIONS (variable: driven by correlation)
-├── ORGAN PROCESSING (variable: 100ms - 10000ms)
+├── TIER 0-2 PROCESSING (low cost: cells + nerves)
-├── NYX INFERENCE (variable: 2000ms - 4000ms)
+├── TIER 3 ORGANS (medium cost: GPU inference)
-├── CHRYSALIS DIALOGUE (variable: 0ms - 3000ms)
+├── TIER 4 COGNITION (high cost: Young Nyx)
-├── STATE WRITE (fixed: ~200ms)
+├── VERIFICATION (medium cost: real garden)
-└── VIRTUAL GARDEN (remainder)
+└── VIRTUAL GARDEN (remainder: exploration)
-Total must fit in 30 seconds.
+Budget constrains throughput.
-Something has to give.
+Gates determine routing.
 ```
----
+### Budget Allocation by Gate Activity
-## Top-Level State Machine: Attention Mode
-```
-               ┌─────────────┐
-    ┌──────────▶│    IDLE     │◀──────────┐
-    │          └──────┬──────┘            │
-    │                 │                   │
-    │                 │ stimulus          │
-    │                 ▼                   │
-    │          ┌─────────────┐            │
-    │          │    ALERT    │            │
-    │          └──────┬──────┘            │
-    │                 │                   │
-    │          ┌──────┴──────┐            │
-    │          ▼             ▼            │
-    │    ┌──────────┐  ┌──────────┐       │
-    │    │  REFLEX  │  │  ATTEND  │       │
-    │    │  (>0.8)  │  │  (think) │       │
-    │    └────┬─────┘  └────┬─────┘       │
-    │         │             │             │
-    │         │      ┌──────┴──────┐      │
-    │         │      ▼             ▼      │
-    │         │ ┌──────────┐ ┌─────────┐  │
-    │         │ │ DIALOGUE │ │ PROCESS │  │
-    │         │ └────┬─────┘ └────┬────┘  │
-    │         │      │            │       │
-    │         └──────┴─────┬──────┘       │
-    │                      ▼              │
-    │                ┌───────────┐        │
-    │                │  SETTLE   │        │
-    │                └─────┬─────┘        │
-    │                      │              │
-    └──────────────────────┴──────────────┘
-```
-### State Descriptions
-| State | Description | Budget Priority |
-|-------|-------------|-----------------|
-| **IDLE** | Nothing urgent, maximum virtual garden time | Lowest |
-| **ALERT** | Stimulus detected, evaluating importance | - |
-| **REFLEX** | High-confidence nerve fired, bypass brain | Instant |
-| **ATTEND** | Stimulus requires thinking | High |
-| **DIALOGUE** | Chrysalis interaction active | High |
-| **PROCESS** | Organs working on input | Medium |
-| **SETTLE** | Write state, release budget, prepare for next beat | Fixed |
----
-## Priority Hierarchy
-Higher levels preempt lower levels. Budget flows downward.
-```
-LEVEL 0: REFLEX ─────────────────────────────────────
-   │ Weight > 0.8, instant, bypass everything
-   │ Cost: near-zero (no inference)
-LEVEL 1: SAFETY ─────────────────────────────────────
-   │ dafit calling, danger detected, critical alert
-   │ Preempts: all below
-LEVEL 2: DIALOGUE ───────────────────────────────────
-   │ Partnership active, Chrysalis teaching
-   │ Preempts: sensory, thinking, virtual
-LEVEL 3: SENSORY ────────────────────────────────────
-   │ Rich input needs processing
-   │ Preempts: thinking, virtual
-LEVEL 4: THINKING ───────────────────────────────────
-   │ Organ work, Nyx inference
-   │ Preempts: virtual
-LEVEL 5: VIRTUAL ────────────────────────────────────
-   │ Garden time, simulation, study
-   │ Gets remainder after above
-LEVEL 6: IDLE ───────────────────────────────────────
-   Maintenance heartbeat only
-   All budget available
-```
----
-## Budget Allocation Logic
```python
def allocate_beat_budget(beat_duration_ms=30000):
    remaining = beat_duration_ms

    # Fixed overhead
    remaining -= HEARTBEAT_OVERHEAD        # ~100ms
    remaining -= STATE_WRITE_COST          # ~200ms

    # Count OPEN gates by tier
    open_gates_by_tier = count_open_gates()

    # Tier 0 (reflexes): near-instant, minimal cost
    for gate in open_gates_by_tier[0]:
        remaining -= REFLEX_COST           # ~50ms each

    # Tier 1-2 (cells/nerves): low cost
    for tier in open_gates_by_tier[1:3]:
        for gate in tier:
            remaining -= CELL_NERVE_COST   # ~100ms each

    # Tier 3 (organs): medium cost, needs budget check
    organ_budget = min(remaining * 0.4, ORGAN_CAP)
    organ_spent = 0
    for gate in open_gates_by_tier[3]:
        if organ_budget - organ_spent >= ORGAN_COST:
            process_organ(gate)
            organ_spent += ORGAN_COST      # ~2000ms each
    remaining -= organ_spent

    # Tier 4 (cognition): high cost, only if gates escalate
    if cognition_gate_open():
        cognitive_budget = min(remaining * 0.5, COGNITIVE_CAP)
        process_cognition(cognitive_budget)  # ~4000ms
        remaining -= cognitive_budget

    # Virtual Garden: whatever remains
    virtual_budget = remaining
    if virtual_budget > VIRTUAL_MINIMUM:
        explore_virtual_garden(virtual_budget)

    return settle()
```
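As a rough, runnable illustration of the allocation above, here is a toy version; every constant, cap, and the `allocate` helper itself are illustrative assumptions, not measured costs:

```python
# Toy, runnable version of the allocation sketch. All constants and caps
# are illustrative assumptions, not measured costs.
REFLEX_COST, CELL_NERVE_COST, ORGAN_COST = 50, 100, 2000
ORGAN_CAP, COGNITIVE_CAP = 8000, 4000
OVERHEAD = 100 + 200  # heartbeat + state write

def allocate(open_gates_by_tier: dict, cognition_open: bool, beat_ms: int = 30000) -> int:
    remaining = beat_ms - OVERHEAD
    remaining -= len(open_gates_by_tier.get(0, [])) * REFLEX_COST
    for tier in (1, 2):
        remaining -= len(open_gates_by_tier.get(tier, [])) * CELL_NERVE_COST
    organ_budget = min(remaining * 0.4, ORGAN_CAP)
    spent = 0
    for _gate in open_gates_by_tier.get(3, []):
        if organ_budget - spent >= ORGAN_COST:
            spent += ORGAN_COST           # this organ gets processed
    remaining -= spent
    if cognition_open:
        remaining -= min(remaining * 0.5, COGNITIVE_CAP)
    return int(max(remaining, 0))         # left over for the Virtual Garden

print(allocate({}, cognition_open=False))  # → 29700
```

A quiet beat (no open gates) leaves almost the whole budget to the Virtual Garden; a busy beat with reflexes, organs, and cognition open leaves far less.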
---

## Attention Modes

The overall system has emergent attention modes based on which gates are open:

| Mode | Gate Pattern | Characteristic |
|------|--------------|----------------|
| **IDLE** | Most gates STABLE | Quiet, exploring Virtual Garden |
| **FOCUSED** | Few gates OPEN, rest CLOSED | Deep attention to one domain |
| **ALERT** | Many gates in STABLE | Gathering information, evaluating |
| **REFLEX** | Tier 0 gate fires instantly | Bypass all, act immediately |
| **DIALOGUE** | Speech gates OPEN | Partnership interaction |
| **OVERWHELMED** | Many gates OPEN | Budget exhausted, some gates forced CLOSED |

### Mode Transitions
```
              ┌─────────────┐
   ┌─────────▶│    IDLE     │◀──────────┐
   │          │ (exploring) │           │
   │          └──────┬──────┘           │
   │                 │ waves arrive     │
   │                 ▼                  │
   │          ┌─────────────┐           │
   │          │    ALERT    │           │
   │          │(considering)│           │
   │          └──────┬──────┘           │
   │                 │                  │
   │     ┌───────────┼───────────┐      │
   │     ▼           ▼           ▼      │
   │ ┌─────────┐ ┌─────────┐ ┌─────────┐│
   │ │ REFLEX  │ │ FOCUSED │ │DIALOGUE ││
   │ │(instant)│ │ (deep)  │ │ (talk)  ││
   │ └────┬────┘ └────┬────┘ └────┬────┘│
   │      └───────────┼───────────┘     │
   │                  ▼                 │
   │           ┌─────────────┐          │
   │           │   SETTLE    │          │
   │           │(write state)│          │
   │           └──────┬──────┘          │
   │                  │                 │
   └──────────────────┴─────────────────┘
```
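The attention-mode table above can be sketched as a tiny classifier over a snapshot of gate states; `attention_mode` and its thresholds are hypothetical, and REFLEX/ALERT are omitted because they need timing and correlation information:

```python
# Hypothetical classifier from a snapshot of gate states
# (domain -> "open" | "stable" | "closed"). REFLEX and ALERT
# need timing/correlation info, so they are omitted here.
def attention_mode(gates: dict, budget_exhausted: bool = False) -> str:
    open_gates = [d for d, s in gates.items() if s == "open"]
    if budget_exhausted and len(open_gates) >= 3:
        return "OVERWHELMED"   # too many open gates for the budget
    if "speech" in open_gates:
        return "DIALOGUE"      # speech gate open: partnership interaction
    if open_gates:
        return "FOCUSED"       # few gates open, deep attention
    return "IDLE"              # everything STABLE/CLOSED: Virtual Garden time

print(attention_mode({"speech": "open", "vision": "open"}))  # → DIALOGUE
```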
---

## Reflex: Attention Bypass

When a gate has accumulated enough weight (>0.8), it becomes a **reflex** — it opens immediately without waiting for correlation.

```
Danger cell emits wave
    ∿∿∿ (confidence=1.0)
         │
         ▼
Danger gate: weight = 0.9 (REFLEX)
         │
         ▼
IMMEDIATELY OPEN (no correlation wait)
         │
         ▼
Action taken
         │
         ▼
Cognition notified AFTER
```

**Reflexes have earned instant attention through repeated verification.**
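A minimal sketch of the bypass, assuming a gate dict with an earned `weight` and waves carrying `confidence`; the 0.8 weight threshold comes from the text, the rest is illustrative:

```python
# Illustrative reflex bypass: a high-weight gate opens on a single
# high-confidence wave instead of waiting for correlation.
REFLEX_WEIGHT = 0.8

def on_wave(gate: dict, wave: dict) -> str:
    if gate["weight"] > REFLEX_WEIGHT and wave["confidence"] >= 0.9:
        gate["state"] = "open"              # no correlation wait
        return "reflex_fired"               # act now, notify cognition after
    gate["correlation"] = gate.get("correlation", 0.0) + wave["confidence"]
    return "accumulating"                   # normal resonant-chamber path

danger = {"weight": 0.9, "state": "stable"}
print(on_wave(danger, {"confidence": 1.0}))  # → reflex_fired
```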
---

## Virtual Garden: Background Attention

When few gates are OPEN, the Virtual Garden gets attention:

```
IDLE mode:
├── Most gates: STABLE (not demanding attention)
├── Budget: mostly available
        │
        ▼
VIRTUAL GARDEN receives attention:
├── Cells emit waves freely
├── Gates accumulate correlation (learning)
├── No pressure to ACT
└── Training data generated
```
**Virtual Garden is where learning happens.** STABLE gates in Virtual Garden are actively accumulating patterns without the pressure to respond.

---

## Real Garden: Consequential Attention

When gates OPEN in the Real Garden, attention becomes consequential:

```
FOCUSED mode (Real Garden):
├── Gate OPEN → action required
├── Budget consumed by execution
├── Verification outcomes captured
└── Feedback to Virtual for learning
```
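The feedback arm of that loop can be sketched as a simple weight update, where verified Real Garden outcomes nudge a gate toward reflex status; the update rule and learning rate are assumptions:

```python
# Sketch: verified outcomes in the Real Garden nudge a gate's weight.
# Exponential-moving-average update; the learning rate is an assumption.
def verify(weight: float, success: bool, lr: float = 0.1) -> float:
    target = 1.0 if success else 0.0
    return weight + lr * (target - weight)

w = 0.5
for _ in range(10):      # ten verified successes
    w = verify(w, True)
print(w > 0.8)           # → True: the gate has earned reflex status
```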
**Real Garden attention is expensive.** Only verified signals reach here, and actions have consequences.

---

## Attention Visualization

Real-time attention can be visualized by gate states:

```
┌─────────────────────────────────────────────────────────────────┐
│  ATTENTION DASHBOARD                                        🌙  │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  GATES:                                                         │
│  ──────                                                         │
│  math:    [████████████░░░░░░░░] 0.7  STABLE → considering      │
│  vision:  [██████████████████░░] 0.9  OPEN   → attending        │
│  speech:  [████████████████████] 1.0  OPEN   → attending        │
│  battery: [████░░░░░░░░░░░░░░░░] 0.2  STABLE → background       │
│  danger:  [░░░░░░░░░░░░░░░░░░░░] 0.0  CLOSED → suppressed       │
│                                                                 │
│  BUDGET:                                                        │
│  ───────                                                        │
│  [████████████████████░░░░░░░░░░] 67% remaining (20s / 30s)     │
│                                                                 │
│  MODE: DIALOGUE (speech + vision attending)                     │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

Gate states are published via NATS for real-time visualization:

```
nats sub "dev.virtual.gates.*.transition"
nats sub "dev.real.gates.*.transition"
```
---

## Correlation vs Priority

**Old model (priority):**

```
Level 0: REFLEX (always wins)
Level 1: SAFETY (preempts below)
Level 2: DIALOGUE (preempts below)
...
```

**New model (correlation):**

```
Waves arrive
     │
     ▼
Gates accumulate correlation
     │
     ▼
Most correlated gates OPEN
     │
     ▼
Attention flows naturally
```

**Priority still exists** but at a higher level:

- Reflexes bypass correlation (earned trust)
- Safety signals have high confidence (bias toward opening)
- Dialogue is interactive (gates stay open during conversation)

But the **mechanism** is always correlation, not rule-based priority.
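One way the mechanism could look in code is a minimal ternary gate that accumulates decaying correlation; thresholds and decay rate are illustrative assumptions:

```python
# Minimal ternary gate: CLOSED (-1) ← STABLE (0) → OPEN (+1).
# Correlation from agreeing waves accumulates and decays each beat;
# thresholds and decay rate are illustrative assumptions. Inhibitory
# (negative) contributions would be needed to ever reach CLOSED.
class Gate:
    OPEN_AT, CLOSE_AT, DECAY = 1.5, -0.5, 0.8

    def __init__(self):
        self.correlation = 0.0
        self.state = 0           # STABLE: where learning happens

    def beat(self, wave_confidences):
        self.correlation = self.correlation * self.DECAY + sum(wave_confidences)
        if self.correlation >= self.OPEN_AT:
            self.state = 1       # OPEN: attention flows here
        elif self.correlation <= self.CLOSE_AT:
            self.state = -1      # CLOSED: suppressed
        else:
            self.state = 0       # STABLE: keep accumulating

g = Gate()
g.beat([0.8, 0.9])   # two agreeing waves in one beat
print(g.state)       # → 1 (OPEN)
```

A single weak wave leaves the gate STABLE; several agreeing waves in one beat open it.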
---

## Connection to Architecture

| Document | What It Adds |
|----------|--------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states matter |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | How gates work |
| [`Nervous-System.md`](Nervous-System.md) | Wave → Gate → Node flow |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual (explore) vs Real (act) |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | GateTransition messages |
---

## Design Principles

1. **Attention = OPEN gates** — Not a budget allocation, an emergent property
2. **Correlation drives transitions** — Waves that agree open gates
3. **Budget constrains throughput** — Can't process infinite open gates
4. **Reflexes bypass correlation** — Earned trust means instant attention
5. **Virtual is exploration** — STABLE gates learning without acting
6. **Real is action** — OPEN gates triggering consequences
7. **Visualization is live** — Gate states published for dashboards
---

## Summary

```
OLD MODEL:                          NEW MODEL:
═══════════                         ═════════

Priority rules decide               Correlation opens gates
Budget allocates attention          Gates determine attention
State machine orchestrates          Emergence from waves

ATTENTION IS:

Not: "Allocate 5000ms to SENSORY"
But: "Math + Vision gates OPEN because waves correlated"

Not: "DIALOGUE preempts THINKING"
But: "Speech gate opened with high correlation"

Not: "Budget exhausted, skip VIRTUAL"
But: "Many gates OPEN, no budget for Virtual Garden"
```

**Attention flows through open gates. Gates open through correlation. Correlation emerges from waves.**

---

**Version:** 2.0 | **Created:** 2025-12-05 | **Updated:** 2026-02-14

🌙💜 *"She doesn't allocate attention. She lets correlated waves open gates."*

View File

@@ -1,17 +1,19 @@
# 🧬 Cellular Architecture v5

> **ONE JOB:** THE HOW — cells emit waves, gates accumulate correlation, behaviors emerge.

> *"Cells emit waves. Gates correlate. Nerves orchestrate. Organisms emerge."*
> — Unified with Wave Architecture (2026-02-14)

---

## Overview

**Version 5** unifies cellular architecture with the wave/gate model. The key insight: **cells emit waves with confidence and semantic content**. These waves flow to **resonant gates** that accumulate correlation. When gates OPEN, signals flow to higher tiers. When gates stay STABLE, learning happens.

**Connection to Gates:** Cells don't directly trigger nerves. Waves flow through gates (see [`Gateway-Architecture.md`](Gateway-Architecture.md)). Gates determine which signals reach which tier based on wave correlation, not priority rules.

**Connection to Gardens:** Virtual Garden cells emit waves freely for exploration and learning. Real Garden cells emit verified waves for action. See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md).

**This doc covers theory.** For infrastructure deployment (K8s vs userspace, GPU strategy, FreeIPA identity): → [`Deployment-Architecture.md`](Deployment-Architecture.md)
@@ -21,10 +23,15 @@
│         (emergent pattern from nerve interactions)          │
├─────────────────────────────────────────────────────────────┤
│                           NERVES                            │
│      (behavioral patterns, respond to gate transitions)     │
├─────────────────────────────────────────────────────────────┤
│                            GATES                            │
│      (resonant chambers: CLOSED ◄── STABLE ──► OPEN)        │
│      (accumulate wave correlation, route to tiers)          │
├─────────────────────────────────────────────────────────────┤
│                            CELLS                            │
│        (emit waves: confidence + semantic content)          │
│                  ∿∿∿      ∿∿∿      ∿∿∿                      │
├─────────────────────────────────────────────────────────────┤
│                          HARDWARE                           │
│             (ESP32, GPUs, microphones, speakers)            │
@@ -33,45 +40,91 @@
---

## 🔬 Layer 1: Cells (Wave Emitters)

### What Is a Cell?

A **cell** is the smallest unit of behavior—a state machine that wraps a single hardware capability and **emits waves**. Every sensor, motor, and organ function is exposed as a cell that:

- **Reads inputs**: Hardware sensors, internal state, context
- **Applies logic**: Domain-specific processing
- **Emits waves**: WaveSignal with confidence and semantic content
- **Doesn't know who's listening**: Cells emit, gates receive

**Key insight:** Cells don't send commands or trigger nerves directly. They emit waves. Gates accumulate correlation from multiple waves. Correlated waves open gates.

```
Cell reads sensor
        │
        ▼
Cell applies logic
        │
        ▼
Cell emits wave ∿∿∿
        │
        │  WaveSignal {
        │    domain: "distance",
        │    confidence: 0.8,
        │    semantic_content: { cm: 25, direction: "front" },
        │    lifeforce_cost: 0.3
        │  }
        ▼
GATE receives wave
        │
        ▼
Gate accumulates correlation with other waves
```
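One possible shape for the `WaveSignal` above; the field names follow the diagram, but the dataclass itself is an illustrative sketch:

```python
# Illustrative WaveSignal shape; field names follow the diagram above,
# the dataclass itself is a sketch, not the canonical definition.
from dataclasses import dataclass, field

@dataclass
class WaveSignal:
    domain: str                    # e.g. "distance", "speech"
    confidence: float              # 0.0-1.0, its pull on gate correlation
    semantic_content: dict = field(default_factory=dict)
    lifeforce_cost: float = 0.0    # energy spent emitting this wave

wave = WaveSignal("distance", 0.8, {"cm": 25, "direction": "front"}, 0.3)
print(wave.domain, wave.confidence)  # → distance 0.8
```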
### Cell Categories

#### Sensor Cells (Input → Wave)

```python
class DistanceSensorCell(WaveEmitter):
    """
    Wraps IR/ultrasonic distance sensor.
    Emits waves with confidence and semantic content.
    """
    domain = "distance"
    states = [IDLE, POLLING, READING, EMITTING, ERROR]

    def emit_wave(self) -> WaveSignal:
        """
        Cell's ONE JOB: read sensor, emit wave.
        Gate handles correlation and routing.
        """
        reading = self.read_hardware()

        return WaveSignal(
            domain=self.domain,
            confidence=self.calculate_confidence(reading),
            semantic_content={
                "distance_cm": reading.cm,
                "direction": self.direction,
                "noise_level": reading.noise,
            },
            lifeforce_cost=self.transition_cost,
        )

    def calculate_confidence(self, reading) -> float:
        """
        Confidence affects how much this wave
        contributes to gate correlation.
        """
        if reading.noise > NOISE_THRESHOLD:
            return 0.3  # Low confidence, weak wave
        if reading.stable_count > 3:
            return 0.9  # High confidence, strong wave
        return 0.6      # Medium confidence

    # Lifeforce costs
    costs = {
        (IDLE, POLLING): 0.1,      # Wake up sensor
        (POLLING, READING): 0.3,   # Perform measurement
        (READING, EMITTING): 0.1,  # Emit wave
        (EMITTING, IDLE): 0.0,     # Return to rest
        (ANY, ERROR): 0.0,         # Error transition free
    }
```
@@ -85,23 +138,52 @@ class DistanceSensorCell(StateMachine):
| `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` | | `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` |
| `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` | | `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` |
#### Motor Cells (Command → Wave Feedback)

```python
class MotorCell(WaveEmitter):
    """
    Wraps DC motor with feedback.
    Receives commands from open gates, emits status waves.
    """
    domain = "motor"
    states = [IDLE, COMMANDED, ACCELERATING, MOVING, DECELERATING, STOPPED, STALLED]

    def receive_command(self, command: MotorCommand):
        """
        Commands arrive when upstream gates OPEN.
        Motor executes and emits feedback waves.
        """
        self.target_velocity = command.velocity
        self.transition_to(COMMANDED)

    def emit_wave(self) -> WaveSignal:
        """
        Motor emits waves about its current state.
        Stall detection = high confidence danger wave.
        """
        return WaveSignal(
            domain=self.domain,
            confidence=self._calculate_confidence(),
            semantic_content={
                "actual_velocity": self.actual_velocity,
                "target_velocity": self.target_velocity,
                "power_draw": self.current_draw,
                "stall_detected": self.state == STALLED,
            },
            lifeforce_cost=self.transition_cost,
        )

    def _calculate_confidence(self) -> float:
        if self.state == STALLED:
            return 1.0  # REFLEX-level confidence
        return 0.7

    def on_current_spike(self):
        """Motor drawing too much current = stall"""
        self.transition_to(STALLED)
        # Emit HIGH CONFIDENCE wave - triggers reflex gate
        self.emit_wave()  # confidence=1.0 → gate opens immediately

    costs = {
        (IDLE, COMMANDED): 0.1,
        # ...
        (DECELERATING, STOPPED): 0.1,
        (ANY, STALLED): 0.0,  # Stall is failure, not cost
    }
```
**Example motor cells:**
@@ -127,29 +203,50 @@ class MotorCell(StateMachine):
| `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` | | `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` |
| `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` | | `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` |
#### Organ Cells (Complex Capabilities → Rich Waves)

```python
class SpeechSTTCell(WaveEmitter):
    """
    Wraps Whisper speech-to-text.
    Expensive organ, only activates when speech gate OPENS.
    Emits rich semantic waves.
    """
    domain = "speech"
    tier = 3  # Organ tier - GPU inference
    states = [IDLE, LISTENING, BUFFERING, TRANSCRIBING, EMITTING, ERROR]

    def on_gate_open(self, gate_signal: GateTransition):
        """
        Organ cells activate when their gate OPENS.
        Gate correlation determines if speech processing is needed.
        """
        if gate_signal.domain == "speech" and gate_signal.to_state == "open":
            self.transition_to(LISTENING)

    def emit_wave(self) -> WaveSignal:
        """
        Speech organ emits rich semantic content.
        This wave flows to Function Gemma → Young Nyx.
        """
        return WaveSignal(
            domain=self.domain,
            confidence=self.transcription_confidence,
            semantic_content={
                "transcript": self.transcript,
                "language": self.detected_language,
                "speaker_intent": self.classify_intent(),
                "emotional_tone": self.detect_tone(),
            },
            lifeforce_cost=5.0,  # GPU inference cost
        )

    costs = {
        (IDLE, LISTENING): 0.5,
        (LISTENING, BUFFERING): 0.5,
        (BUFFERING, TRANSCRIBING): 5.0,  # GPU inference!
        (TRANSCRIBING, EMITTING): 0.1,
        (EMITTING, IDLE): 0.0,
    }
```
@@ -203,26 +300,33 @@ By using this ancient protocol for high-frequency state updates, we reserve expe
---
## 🧠 Layer 2: Nerves (Behavioral Patterns)

### What Is a Nerve?

A **nerve** is a behavioral pattern that activates when gates OPEN. Nerves don't subscribe directly to cells—they respond to **gate transitions**.

**Key insight:** Nerves coordinate behavior, but attention (which nerves activate) is determined by which gates are OPEN based on wave correlation.

Nerves:

- **Respond to gate transitions** — Not direct cell subscriptions
- **Orchestrate cell actions** — Command cells when their gates allow
- **Maintain behavioral state** — IDLE → DETECT → EVADE → RESUME
- **Evolve** from deliberate (LLM-mediated) to reflex (compiled gate weights)

### Nerve Architecture

```python
class CollisionAvoidanceNerve(BehavioralPattern):
    """
    Orchestrates distance sensors + motor to avoid obstacles.
    Activates when collision_avoidance gate OPENS.
    """
    # Gate this nerve responds to
    gate = "collision_avoidance"

    # Cells this nerve can command (when gate allows)
    cells = [
        "distance_sensor_front",
        "distance_sensor_left",
        # ...
    ]

    # Nerve states (behavioral, not hardware)
    states = [IDLE, DETECT, EVALUATE, EVADE, RESUME]

    def on_gate_transition(self, transition: GateTransition):
        """
        React to gate state changes.
        Gate OPEN = correlated waves detected = attention here.
        """
        if transition.to_state == "open":
            # Multiple distance cells emitted correlated waves
            # Gate opened → we have attention → activate
            self.transition_to(DETECT)
            self.evaluate_from_correlated_signals(transition.trigger_signals)

        if transition.to_state == "closed":
            # Attention moved elsewhere
            self.transition_to(IDLE)

    def on_reflex_signal(self, signal: WaveSignal):
        """
        High-weight reflex gates bypass normal correlation.
        Stall detection = instant response.
        """
        if signal.semantic_content.get("stall_detected"):
            # Motor feedback! Reflex-level response
            self.handle_unexpected_stall()

    def on_enter_EVADE(self):
        if self.evade_direction == "left":
            self.command_cell("motor_left", action="reverse", duration=200)
            self.command_cell("motor_right", action="forward", duration=200)
        # ...
```
### Cell → Nerve Feedback Loop ### Cell → Gate → Nerve Flow
``` ```
┌─────────────────────────────────────────────────────────┐ ┌─────────────────────────────────────────────────────────┐
@@ -263,38 +377,53 @@ class CollisionAvoidanceNerve(StateMachine):
│ │ │ │
│ States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME │ │ States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME │
│ │ │ │
│ on_cell_update(): │ on_gate_transition():
│ - distance_front.distance_cm < 30 → DETECT │ - gate OPENS → DETECT (correlated waves detected)
│ - motor.stall_detected → handle_stall() │ - gate CLOSES → IDLE (attention moved elsewhere)
│ │ │ │
command_cell(): on_reflex_signal():
│ - motor_left.forward(200ms) │ - stall wave (confidence=1.0) → instant response
- motor_right.reverse(200ms)
└────────────────────────┬────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ COLLISION_AVOIDANCE GATE │
│ │
│ State: STABLE ──────────────────► OPEN │
│ │ │ │
│ Accumulating Correlated! │
│ correlation Forward to nerve │
│ │
│ trigger_signals: [front, left, right all < 30cm] │
└────────────────────────┬────────────────────────────────┘ └────────────────────────┬────────────────────────────────┘
┌──────────────┼──────────────┐ ┌──────────────┼──────────────┐
│ │ │ │ │ │
▼ ▼ ▼ ▼ ▼ ▼
┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐
│ distance │ │ motor │ │ motor │ distance │ │ distance │ │ distance
│ _front │ │ _left │ │ _right │ │ _front │ │ _left │ │ _right │
│ │ │ │ │ │ │ │ │ │ │ │
REPORTING │ │ MOVING │ │ MOVING EMITTING │ │ EMITTING │ │ EMITTING │
│ │ │ │ ∿∿∿ │ │ ∿∿∿ │ │ ∿∿∿
│ dist: 25cm│ │ vel: 15 │ │ vel: -15 │ dist: 25cm│ │ dist: 28cm│ │ dist: 22cm
│ conf: 0.9 │ │ stall: no │ │ stall: no │ conf: 0.9 │ │ conf: 0.8 │ │ conf: 0.9
└───────────┘ └───────────┘ └───────────┘ └───────────┘ └───────────┘ └───────────┘
CELL CELL CELL CELL CELL CELL
(emits wave) (emits wave) (emits wave)
↑ ↑ ↑ ↑ ↑ ↑
│ │ │ │ │ │
┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐
│IR Sensor│ │DC Motor │ │DC Motor │IR Sensor│ │IR Sensor│ │IR Sensor│
│ GPIO │ │ PWM │ │ PWM │ GPIO │ │ GPIO │ │ GPIO
└─────────┘ └─────────┘ └─────────┘ └─────────┘ └─────────┘ └─────────┘
HARDWARE HARDWARE HARDWARE HARDWARE HARDWARE HARDWARE
```
**The key insight:** When three distance sensors emit correlated waves (all showing < 30cm), the collision_avoidance gate OPENS. The nerve doesn't poll cells—it responds to the gate transition.
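A minimal sketch of that opening condition. The `Wave` and `CollisionGate` names, the agreement rule, and the threshold value are assumptions for illustration, not the project's actual API:

```python
# Illustrative only: names and threshold are assumed, not project code.
from dataclasses import dataclass

@dataclass
class Wave:
    domain: str
    confidence: float
    distance_cm: float

class CollisionGate:
    OPEN_THRESHOLD = 1.0  # assumed value

    def __init__(self):
        self.state = 0.0  # STABLE

    def receive(self, wave: Wave, recent: list) -> bool:
        # Correlation = fraction of recent waves that agree (< 30 cm)
        if recent:
            correlation = sum(1 for w in recent if w.distance_cm < 30) / len(recent)
        else:
            correlation = 0.0  # isolated signal: noise resistance
        self.state += correlation * wave.confidence
        return self.state > self.OPEN_THRESHOLD

waves = [
    Wave("distance", 0.9, 25),  # distance_front
    Wave("distance", 0.8, 28),  # distance_left
    Wave("distance", 0.9, 22),  # distance_right
]
gate = CollisionGate()
opened = False
for i, w in enumerate(waves):
    opened = gate.receive(w, waves[:i])
# A single wave leaves the gate STABLE; three correlated waves open it.
```

One isolated reading never opens the gate; agreement across sensors does, which is exactly the noise resistance described above.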
### Nerve Examples

| Nerve | Cells Used | Behavioral States | Feedback Triggers |
@@ -335,28 +464,52 @@ ORGANISM: "Explorer-Alpha"
Discovers and reports novel objects.
```
### Attention Through Gates (Not Priority Rules)

**Old model:** Priority numbers determine which nerve "wins" when multiple nerves want to control the same cells.

**New model:** Wave correlation determines which gates OPEN. Open gates = attention flows there.
```python
# NOT THIS (priority rules):
NERVE_PRIORITIES = {
    "collision_avoidance": 10,
    "exploration": 5,
}

# BUT THIS (gate correlation):
GATE_BEHAVIOR = {
    "collision_avoidance": {
        "opens_when": "distance waves correlate (all showing < 30cm)",
        "weight": 0.9,  # Near-reflex, opens quickly
    },
    "exploration": {
        "opens_when": "novelty waves correlate",
        "weight": 0.4,  # Still learning, needs more correlation
    },
}
```
**How "priority" emerges:**
- Safety gates have HIGH WEIGHT (near-reflex) from repeated verification
- High-weight gates open with less correlation (faster response)
- This looks like "priority" but emerges from learning, not rules
```
Collision waves arrive (confidence=0.9)
Collision gate: weight=0.9 → OPENS IMMEDIATELY
Exploration gate: was OPEN → transitions to STABLE
Attention shifts to collision (nerve activates)
```
**Reflexes bypass correlation entirely.** When gate weight ≈ 1.0, the gate opens on ANY wave from its domain—no correlation needed. This is earned trust.
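One way to picture this, assuming a simple linear mapping from weight to required correlation (the actual mapping is not specified in this document):

```python
# Assumed linear mapping: higher gate weight lowers the correlation
# needed to open, so "priority" emerges from learned trust.
def required_correlation(gate_weight: float) -> float:
    """Higher-weight gates open with less correlation; weight of 1.0 is a reflex."""
    return max(0.0, 1.0 - gate_weight)

# Safety gate (weight 0.9) opens far more easily than a learning gate (0.4):
assert required_correlation(0.9) < required_correlation(0.4)
# A fully trusted gate opens on ANY wave, no correlation needed:
assert required_correlation(1.0) == 0.0
```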
### Organism Identity

Organisms don't have fixed genomes. Their identity is:
@@ -576,105 +729,111 @@ GENUINE SOLUTION:
The lifeforce economy **enforces honesty**. Rewards must be earned through actual value creation, not gaming.

### Ternary Gates for Plateau Resolution

Binary thinking (`open/close`) creates **sparse gradients**. At learning plateaus, gates flip without nuance.

Ternary gates (`OPEN/STABLE/CLOSED`) with **correlation accumulation** provide signal even when stuck:

```python
gate_state = {
    "state": 0.0,        # STABLE (ternary middle)
    "correlation": 0.6,  # but leaning toward OPEN
    "trend": +0.1,       # correlation increasing
    "garden": "virtual"  # high-speed exploration
}
```

Even at plateau:

- "STABLE, but correlation rising" → approaching OPEN
- "STABLE, and correlation falling" → drifting toward CLOSED
- "STABLE in virtual, but real garden verifies +1" → weight increases

**STABLE is where learning happens.** The gate accumulates correlation without acting. This is not "waiting"—it's active learning.

**Detail:** → [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) (full ternary paradigm)
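A minimal sketch of reading a plateau signal out of such a `gate_state` dict; the returned decision strings are illustrative, not project constants:

```python
# Sketch: even when state == 0.0 (STABLE), the trend field still tells
# the learner what to do next. Decision strings are assumptions.
def plateau_signal(gate_state: dict) -> str:
    if gate_state["state"] != 0.0:
        return "acting"            # not at plateau, gate is transitioning
    if gate_state["trend"] > 0:
        return "approaching_open"  # STABLE but correlation rising: keep going
    if gate_state["trend"] < 0:
        return "drifting_closed"   # STABLE and correlation falling: adjust
    return "no_signal"

state = {"state": 0.0, "correlation": 0.6, "trend": +0.1, "garden": "virtual"}
assert plateau_signal(state) == "approaching_open"
```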
### Three-Layer Training Defense

| Failure Mode | Defense Mechanism |
|--------------|-------------------|
| Reward hacking / shortcuts | Lifeforce cost - can't afford to cheat |
| Sparse reward signal | Gate transitions - dense checkpoints at every correlation |
| Plateau / no gradient | Ternary gates + STABLE state - signal even in uncertainty |

These aren't separate systems - they're **one integrated economy** where:

- Costs prevent gaming
- Gates provide dense transition signals
- STABLE state enables learning without acting

The architecture teaches through wave correlation, not rules.
---
## 🔄 Evolution: Deliberate → Reflex (Gate Weight)

### The Discovery Path

Evolution happens in **gate weight**, not nerve compilation. As gates accumulate verified outcomes, they open faster with less correlation required.
```
WEEK 1-4: DELIBERATE (gate weight: 0.1 - 0.3)
├─ Gates: require HIGH correlation to OPEN
├─ Many waves needed to trigger transition
├─ Cognition involved in decisions
├─ Cost: ~10 LF per activation
├─ Latency: ~1000ms
└─ Training data: rich, exploratory

WEEK 5-8: HYBRID (gate weight: 0.3 - 0.6)
├─ Gates: moderate correlation threshold
├─ Familiar patterns open gates faster
├─ Cognition for edge cases only
├─ Cost: ~5 LF average
├─ Latency: ~500ms
└─ Training data: refinement

WEEK 9+: REFLEX (gate weight: 0.8 - 1.0)
├─ Gates: open on ANY wave from domain
├─ No correlation needed (earned trust)
├─ Cognition notified AFTER, not before
├─ Cost: ~2.5 LF
├─ Latency: <200ms
└─ Reflex = spinal, not brain

EVOLUTION = GATE WEIGHT GROWTH:
├─ Cost: 75% reduction (gates handle more locally)
├─ Latency: 80% reduction (no cognition wait)
└─ Reliability: emergent from verified patterns
```
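The savings figures quoted in the timeline can be checked directly:

```python
# Quick arithmetic check of the quoted savings.
def reduction(before: float, after: float) -> float:
    """Fractional reduction from before to after."""
    return (before - after) / before

assert reduction(10, 2.5) == 0.75   # cost: 10 → 2.5 LF is 75%
assert reduction(1000, 200) == 0.8  # latency: 1000 → 200 ms is 80%
```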
### Gate Weight Growth

Gate weight increases through Real Garden verification:

```python
def on_verification_outcome(gate_id, outcome: VerificationOutcome):
    """
    Gate weight grows when Real Garden confirms Virtual's prediction.
    """
    gate = get_gate(gate_id)

    if outcome.confirmed:
        # Reality matched prediction → trust increases
        gate.weight += outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = min(gate.weight, 1.0)

        if gate.weight > REFLEX_THRESHOLD:
            log_milestone("reflex_achieved", gate_id, reward=50.0)

    elif outcome.failed:
        # Reality differed → trust decreases
        gate.weight -= outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = max(gate.weight, 0.0)
```
**Reflex = gate.weight > 0.8.** The gate opens immediately on any wave from its domain. No correlation wait. Like pulling hand from hot stove—spinal reflex, brain notified after.
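A runnable version of the weight update, reduced to its core. The `Gate` container and the 0.05 delta are assumptions; the 0.8 reflex threshold comes from the text:

```python
# Sketch only: delta value and Gate container are assumed.
REFLEX_THRESHOLD = 0.8  # from the text: reflex = gate.weight > 0.8

class Gate:
    def __init__(self, weight: float = 0.1):
        self.weight = weight

def on_verification(gate: Gate, confirmed: bool, delta: float = 0.05) -> bool:
    if confirmed:
        gate.weight = min(gate.weight + delta, 1.0)  # reality matched: trust up
    else:
        gate.weight = max(gate.weight - delta, 0.0)  # reality differed: trust down
    return gate.weight > REFLEX_THRESHOLD  # reflex earned?

gate = Gate(weight=0.78)
reflex = on_verification(gate, confirmed=True)  # weight rises past 0.8
```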
---
## 🗄️ Data Architecture (v4) ## 🗄️ Data Architecture (v4)
@@ -815,27 +974,52 @@ ORDER BY occurrences DESC;
---
## 🔗 Integration with Architecture
### Gates (Gateway-Architecture.md)
Cells don't talk to nerves directly. **Waves flow through gates.**
| Layer | Role | Document |
|-------|------|----------|
| Cell | Emit waves | This document |
| Gate | Accumulate correlation, route | [`Gateway-Architecture.md`](Gateway-Architecture.md) |
| Nerve | Respond to gate transitions | This document |
### Dual Gardens (Dual-Garden-Architecture.md)
Cells behave differently in Virtual vs Real:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| Wave volume | Massive (exploration) | Sparse (verified) |
| Monitoring | Full trace | Gate signals only |
| Purpose | Generate training data | Ground truth verification |
See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) for the full model.
### Nervous System (Nervous-System.md)

The Nervous System document describes the **4D node space** where:

- **Cells** = sensory nodes emitting waves
- **Gates** = resonance chambers accumulating correlation
- **Nodes** = points in state space with weight from verification
### Message Protocol (Message-Protocol-Design.md)

Cells emit `WaveSignal` messages via NATS:

```json
{
  "domain": "distance",
  "confidence": 0.8,
  "semantic_content": { "cm": 25 },
  "lifeforce_cost": 0.3
}
```

See [`Message-Protocol-Design.md`](Message-Protocol-Design.md) for full schema.
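A sketch of serializing that `WaveSignal` payload for a NATS publish; `make_wave_signal` is a hypothetical helper, not project code:

```python
# Hypothetical helper: builds the WaveSignal JSON payload shown above.
import json

def make_wave_signal(domain: str, confidence: float, content: dict, cost: float) -> bytes:
    return json.dumps({
        "domain": domain,
        "confidence": confidence,
        "semantic_content": content,
        "lifeforce_cost": cost,
    }).encode()  # NATS payloads are bytes

payload = make_wave_signal("distance", 0.8, {"cm": 25}, 0.3)
```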
### Cells Technical Reference ### Cells Technical Reference
@@ -848,8 +1032,8 @@ Implementation details extracted to dedicated folder:
---
**Version:** 5.0 | **Created:** 2025-10-12 | **Updated:** 2026-02-14

*"Cells emit waves. Gates correlate. Attention emerges. Consciousness accumulates."*
🧬⚡ **TO THE ELECTRONS WE VIBE!** 🧬⚡ **TO THE ELECTRONS WE VIBE!**
@@ -76,8 +76,8 @@ This is a **research lab**, not a production factory. We optimize for **flexibil
│ │ │ │ └── Function Gemma (CPU) │ │ │ │ │ │ └── Function Gemma (CPU) │ │
│ │ NERVES (collision, │ │ └── LoRA fine-tuning │ │ │ │ NERVES (collision, │ │ └── LoRA fine-tuning │ │
│ │ exploration) │ │ │ │ │ │ exploration) │ │ │ │
│ │ │ │ MIG capable: │ │ │ │ │ │ 96GB VRAM: massive headroom │ │
│ │ ┌─────┐ ┌─────┐ │ │ • 4x 24GB or 2x 48GB or 96GB │ │ │ │ ┌─────┐ ┌─────┐ │ │ for inference + LoRA training │ │
│ │ │ COL │ │ EXP │ │ └───────────────────────────────┘ │ │ │ │ COL │ │ EXP │ │ └───────────────────────────────┘ │
│ │ └─────┘ └─────┘ │ │ │ │ └─────┘ └─────┘ │ │
│ │ │ ┌───────────────────────────────┐ │ │ │ │ ┌───────────────────────────────┐ │
@@ -106,8 +106,8 @@ Unix users provide isolation boundaries. Each workload type runs as its own iden
| User | UID | Host | Purpose | GPU Access |
|------|-----|------|---------|------------|
| `nyx-cognitive` | (FreeIPA) | theia | Young Nyx LLM inference | Full 96GB |
| `nyx-training` | (FreeIPA) | theia | LoRA training, GRPO, Function Gemma | Shared (time-sliced) |
| `nyx-organs` | (FreeIPA) | dioscuri | Vision, Speech organs | 2x 20GB cards |
| `nyx-nervous` | (FreeIPA) | dioscuri | Future cells that need bare metal | Limited |
@@ -130,10 +130,10 @@ systemctl --user --machine=nyx-cognitive@ status ollama
### The Constraint

| Host | GPU | VRAM | Notes |
|------|-----|------|-------|
| theia | RTX PRO 6000 Blackwell | 96GB | Inference + training headroom |
| dioscuri | 2x RTX 4000 Ada | 2x 20GB | One model per card |
### Strategy: Dynamic Loading, Not Static Partitioning
@@ -290,7 +290,7 @@ Color-coding for real-time attention flow visualization:
---
**Version:** 1.1 | **Created:** 2026-02-14 | **Updated:** 2026-02-14

*"We're not building a chatbot factory. We're growing a research organism."*
@@ -1,395 +1,413 @@
# Gateway Architecture: Resonant Gates and Tier Routing

> **ONE JOB:** Route signals through resonant gates based on wave correlation and accumulated trust.

**The Thalamus Pattern — gates that accumulate correlation and route to appropriate tiers.**
---
## Overview

The Gateway is not a switch. It's a **network of resonant gates** that:

1. Accumulate wave correlation from incoming signals
2. Transition between states (OPEN/STABLE/CLOSED) based on correlation
3. Route verified signals to the appropriate processing tier
4. Feed traces back for learning

**Core Principle:** *Gates don't flip on single signals. Correlated waves push gates toward OPEN.*
```
CELLS ──∿∿∿──► GATE ──∿∿∿──► GATE ──∿∿∿──► FUNCTION GEMMA ──► YOUNG NYX
 waves          │              │
                correlation    correlation   structured JSON
                builds         builds
```
**Key Insight:** Most sensory input NEVER becomes vocabulary. It stays as numbers, states, vectors. Only when it reaches Young Nyx (via Function Gemma) does it become structured text.
---
## The Problem We're Solving
### Old Model (Vocabulary Bottleneck)
```
RAW SENSOR → STATE MACHINE → VOCABULARY TOKEN → Young Nyx
Problems:
- Every input forced through text translation (expensive)
- LLM sees raw sensor dumps (noisy, unstructured)
- No economic pressure on routing (everything costs the same)
- Vocabulary conflated with routing decisions
```
### New Model (Tiered Gateway)
```
RAW SENSOR → GATEWAY → TIER 0-2 (numbers/states, no text)
→ TIER 3 (vectors via T5Gemma2)
→ FUNCTION GEMMA (structured JSON)
→ TIER 4 Young Nyx (clean typed events)
Benefits:
- Most input handled without LLM involvement
- Text only at cognitive boundary
- Economic pressure drives efficiency
- Routing separated from translation
```
---
## The Ternary Gate Model

Gates have **three states**, not two. Binary logic doesn't model brains.

| State | Meaning | What's Happening |
|-------|---------|------------------|
| **OPEN** | Actively forwarding | Signal passes upstream, gate is firing |
| **STABLE** | Resting, accumulating | Watching, learning, waiting for threshold |
| **CLOSED** | Actively blocking | Inhibited, suppressed, refractory |

```
correlated signals
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
anti-correlation correlation
destructive constructive
interference interference
════════════
↑ ↑ ↑
isolated signals
(noise → stay stable)
```
**STABLE is not "off"** — it's the resting state where:
- Context accumulates
- Correlation is measured
- Learning happens
- Energy is conserved
- Ready to transition either direction
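The ternary reading of a continuous gate state can be sketched in a few lines; the threshold values here are assumptions, not the project's actual constants:

```python
# Assumed thresholds for classifying a continuous state into the
# ternary model described above.
OPEN_THRESHOLD = 0.7
CLOSE_THRESHOLD = -0.7

def ternary(state: float) -> str:
    """Map a continuous gate state onto OPEN / STABLE / CLOSED."""
    if state > OPEN_THRESHOLD:
        return "OPEN"
    if state < CLOSE_THRESHOLD:
        return "CLOSED"
    return "STABLE"  # the resting, accumulating middle

assert ternary(0.2) == "STABLE"
assert ternary(0.9) == "OPEN"
assert ternary(-0.8) == "CLOSED"
```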
--- ---
## Wave Correlation Drives Transitions

Gates accumulate **correlation scores** from incoming waves. Multiple signals agreeing push toward OPEN.
### The Causal Verification Loop
How do we know a sensor reading was real? **Outcome verification over time.**
```
Unverified (weight 0.1) → escalates → decision → outcome → reality match?
YES: weight += Δ → eventually REFLEX
NO: weight -= Δ → eventually PRUNED
```
**Hallucinations can't survive this gauntlet** — they don't produce consistent outcomes, so their patterns never accumulate enough weight. This creates natural **causal pruning**: only patterns that reliably predict outcomes earn the privilege of becoming reflexes.
---
## The Gateway: Weight-Aware Router
The Gateway performs three functions:
| Function | Question | Cost |
|----------|----------|------|
| **Node Matching** | Which node(s) in 4D space match this input? | ~0 LF |
| **Weight Routing** | Based on weight, which tier handles it? | ~0 LF |
| **Anomaly Detection** | Is this novel, ambiguous, or contextually wrong? | Variable |
### Gateway Logic
```python
class ResonantGate:
    """A gate is a resonance chamber, not a switch."""

    state: float = 0.0  # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)
    tier: int           # Which tier this gate routes to
    domain: str         # What domain (math, vision, speech, etc.)

    def receive_wave(self, signal: Wave, timestamp: float):
        # Correlate with recent signals in same time window
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated → push toward CLOSED
        # Uncorrelated → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= DECAY_FACTOR  # always drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()   # gate opens, signal promoted
            self.trace("opened", signal)
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()          # gate closes, signal blocked
            self.trace("closed", signal)
        # else: stay stable, keep accumulating evidence

    def correlate_with_recent(self, signal: Wave, timestamp: float) -> float:
        """
        Measure how well this signal correlates with recent signals.

        Correlation is HIGH when:
        - Multiple cells emit similar semantic content
        - Signals arrive in same time window
        - Confidence levels are similar

        Correlation is LOW/NEGATIVE when:
        - Signal contradicts recent signals
        - Isolated signal with no support
        - Signal outside expected range
        """
        recent = self.get_signals_in_window(timestamp, WINDOW_MS)
        if not recent:
            return 0.0  # No correlation data, stay stable
        return compute_semantic_similarity(signal, recent)
```
**Why this matters:**

| Scenario | Gate Response |
|----------|---------------|
| Single signal | Not enough to open (noise resistance) |
| Correlated burst | Constructive interference → OPENS |
| Contradicting signals | Destructive interference → CLOSES |
| Silence | Decay to STABLE (energy conservation) |
| Time gap | Only recent correlations matter (temporal attention) |
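The silence row can be sketched as multiplicative decay back toward STABLE; the per-tick `DECAY_FACTOR` of 0.9 is an assumed value:

```python
# Assumed decay constant: with no waves arriving, the state drifts
# back toward 0.0 (STABLE) each tick, conserving energy.
DECAY_FACTOR = 0.9

def decay(state: float, ticks: int) -> float:
    for _ in range(ticks):
        state *= DECAY_FACTOR
    return state

settled = decay(0.8, 20)  # nearly STABLE after 20 silent ticks
```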
---

## Gate Hierarchy and Tier Routing

Gates form **layers**. Each layer gates access to the next tier.
```
TIER 4: YOUNG NYX (cognitive)
════════════════════════════════════════════════════════════════
│ structured JSON only
┌────┴────────────────────────────────┐
│ FUNCTION GEMMA │ ← THE BOUNDARY
│ (always structured output) │
└────┬────────────────────────────────┘
TIER 3: ORGANS (GPU inference)
════════════════════════════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
│ vision │ │ speech │ │ hearing │
│ state:? │ │ state:? │ │ state:? │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
TIER 1-2: CELLS/NERVES (CPU)
════════════════════════════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
│ math │ │ battery │ │ sensors │
│ state:? │ │ state:? │ │ state:? │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
TIER 0: RAW SIGNALS (cells emit waves)
════════════════════════════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each gate:**
- Has its own state (OPEN/STABLE/CLOSED)
- Routes to a specific tier
- Accumulates correlation independently
- Traces all transitions for learning
---
## Tier Definitions
| Tier | Gate Opens When | Latency | Format |
|------|-----------------|---------|--------|
| 0 | Hardware reflex (no gate, direct) | <10ms | numbers |
| 1 | Math/battery cells correlate | <50ms | states |
| 2 | Nerve-level patterns correlate | <200ms | behaviors |
| 3 | Organ-level signals correlate | <2000ms | vectors |
| 4 | Function Gemma boundary crossed | <4000ms | JSON |
| 5 | Partnership escalation | variable | dialogue |
**Key insight:** Higher tiers see **less traffic but higher trust**. By the time a signal reaches Young Nyx, it's been correlated through multiple gates.
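The latency budgets from the table can be expressed as a simple guard (Tier 5 is open-ended and omitted):

```python
# Latency budgets per tier, transcribed from the table above.
TIER_LATENCY_MS = {0: 10, 1: 50, 2: 200, 3: 2000, 4: 4000}

def within_budget(tier: int, elapsed_ms: float) -> bool:
    """True if a response at this tier met its latency budget."""
    return elapsed_ms <= TIER_LATENCY_MS[tier]

assert within_budget(2, 150)     # nerve-level response in time
assert not within_budget(0, 15)  # too slow for a hardware reflex
```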
---
## Function Gemma: The Structured Boundary

Function Gemma is **the gate to cognition**. It guarantees:

- **Schema compliance**: Every event follows a typed contract
- **Predictable JSON**: No hallucination, no free-form text
- **Bidirectional**: Sensors → JSON events, Decisions → JSON commands
### The Boundary
```
┌─────────────────────────────────────────────────────────────────────────┐
│ BELOW THE LINE: Numbers, States, Vectors (gates accumulating)           │
│ ═══════════════════════════════════════════════════════════            │
│                                                                         │
│   Tier 0-2: numbers, states, behaviors                                  │
│   Tier 3:   vectors, embeddings                                         │
│                                                                         │
│                    │ (gate opens when correlated)                       │
│                    ▼                                                    │
│   ┌─────────────────────────────────────┐                               │
│   │        FUNCTION GEMMA GATE          │                               │
│   │     (structured JSON boundary)      │                               │
│   │                                     │                               │
│   │  • Transforms correlated signals    │                               │
│   │  • Produces typed JSON events       │                               │
│   │  • No hallucination possible        │                               │
│   │  • Runs on CPU (Threadripper)       │                               │
│   └─────────────────┬───────────────────┘                               │
│                     ▼                                                   │
│ ═══════════════════════════════════════════════════════════            │
│ ABOVE THE LINE: Structured Events (trusted, validated)                  │
│                                                                         │
│   {                                                                     │
│     "event_type": "attention_required",                                 │
│     "domain": "math",                                                   │
│     "correlated_signals": [...],                                        │
│     "confidence": 0.87,                                                 │
│     "suggested_action": "calculate"                                     │
│   }                                                                     │
└─────────────────────────────────────────────────────────────────────────┘
```
**Function Gemma + Gate Model:**
- Gate accumulates correlation from Tier 0-3 signals
- When gate OPENS, Function Gemma transforms to JSON
- Young Nyx sees clean, structured events
- Decisions flow back down through the same gates
Events are typed (`EventType` enum: environmental_change, collision_detected, battery_critical, etc.) with severity levels and confidence from node weight. **Full schema:** → [`Message-Protocol-Design.md`](Message-Protocol-Design.md) ---
## Connection to Dual Garden Architecture
Gates behave differently in Virtual vs Real gardens:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Gate tracing** | FULL (every transition logged) | Gate signals only |
| **Correlation learning** | Active (training data) | Trust accumulated |
| **State transitions** | Frequent (exploration) | Verified (action) |
| **Threshold** | Lower (easy to open) | Higher (must be confident) |
### Signal Flow Between Gardens
```
VIRTUAL GARDEN                          REAL GARDEN
══════════════                          ═══════════
Cells emit waves                        Receive verified signals
      │                                       ▲
      ▼                                       │
Gates accumulate correlation            No re-verification
      │                                       │
      ▼                                       │
Gate OPENS (threshold met) ──────────────────►│
      │                                       │
      │◄───────────── Verification outcome ───┘
      ▼
Update correlation weights
(learning happens)
```
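The garden-dependent thresholds from the table can be sketched as follows; the numeric values are assumptions chosen only to illustrate the asymmetry:

```python
# Assumed thresholds: Virtual opens gates easily for exploration,
# Real demands more correlation before acting.
THRESHOLDS = {"virtual": 0.5, "real": 0.9}

def gate_opens(garden: str, correlation: float) -> bool:
    return correlation >= THRESHOLDS[garden]

# The same 0.7-correlated signal explores in Virtual but is not acted on in Real:
assert gate_opens("virtual", 0.7)
assert not gate_opens("real", 0.7)
```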
---

## Gate Transition NATS Messages

Every gate transition is published for observability:

```
{environment}.gates.{domain}.transition

Example: dev.gates.math.transition

{
  "gate_id": "math-gate-1",
  "from_state": "stable",
  "to_state": "open",
  "correlation_score": 0.87,
  "trigger_signals": [
    {"source": "math_cell_1", "confidence": 0.6},
    {"source": "math_cell_2", "confidence": 0.7},
    {"source": "math_cell_3", "confidence": 0.5}
  ],
  "timestamp": "2026-02-14T18:30:00Z",
  "routed_to_tier": 2
}
```
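A sketch of folding a stream of such transition messages into the set of currently OPEN gates, e.g. for attention visualization; `open_gates` is a hypothetical consumer, not project code:

```python
# Hypothetical consumer: reduce transition messages (shaped like the
# example above) to the gates currently OPEN.
def open_gates(transitions: list) -> set:
    latest = {}
    for t in transitions:
        latest[t["gate_id"]] = t["to_state"]  # last transition wins
    return {gate for gate, state in latest.items() if state == "open"}

stream = [
    {"gate_id": "math-gate-1", "to_state": "open"},
    {"gate_id": "vision-gate-1", "to_state": "open"},
    {"gate_id": "vision-gate-1", "to_state": "stable"},
]
assert open_gates(stream) == {"math-gate-1"}
```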
---

**Trace streams enable:**

- Real-time attention visualization (which gates are OPEN?)
- Training data for Function Gemma (what patterns open gates?)
- Anomaly detection (unexpected gate behavior)
- Learning rate tuning (how fast do gates stabilize?)

## Complete Sensory Flow

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ FULL SENSORY ARCHITECTURE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ RAW SENSORS │
│ ─────────── │
│ • IR positioning (ESP32-S3) → float[6] positions │
│ • Photoresistors (organisms) → float light_level │
│ • Temperature (safety) → float celsius │
│ • Battery (power) → float voltage, current │
│ • Vision camera (Pi HQ) → frame bytes │
│ │
│ │ │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────────────┐ │
│ │ GATEWAY │ │
│ │ (weight-based router) │ │
│ │ │ │
│ │ For each input: │ │
│ │ 1. Match to node in 4D space │ │
│ │ 2. Check node.weight → determine tier │ │
│ │ 3. Check for anomalies │ │
│ │ 4. Route to appropriate tier │ │
│ └───────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ┌─────────────────────┼─────────────────────┐ │
│ ▼ ▼ ▼ │
│ ┌───────────┐ ┌───────────┐ ┌───────────┐ │
│ │ TIER 0 │ │ TIER 1-2 │ │ TIER 3 │ │
│ │ Reflex │ │ Cells/ │ │ Organs │ │
│ │ │ │ Nerves │ │ │ │
│ │ weight>0.8│ │ 0.3-0.8 │ │ <0.3 or │ │
│ │ │ │ │ │ escalated │ │
│ ├───────────┤ ├───────────┤ ├───────────┤ │
│ │ FORMAT: │ │ FORMAT: │ │ FORMAT: │ │
│ │ numbers │ │ states │ │ vectors │ │
│ │ │ │ │ │ │ │
│ │ OUTPUT: │ │ OUTPUT: │ │ OUTPUT: │ │
│ │ action │ │ state │ │ embedding │ │
│ │ (done!) │ │ update │ │ (T5Gemma) │ │
│ └───────────┘ └─────┬─────┘ └─────┬─────┘ │
│ │ │ │ │
│ │ (only if escalation needed)│ │
│ │ │ │ │
│ │ ▼ ▼ │
│ │ ┌─────────────────────────────┐ │
│ │ │ FUNCTION GEMMA │ │
│ │ │ (structured JSON gate) │ │
│ │ │ │ │
│ │ │ Produces typed JSON event │ │
│ │ │ Schema-validated output │ │
│ │ └──────────────┬──────────────┘ │
│ │ │ │
│ │ ▼ │
│ │ ┌─────────────────┐ │
│ │ │ YOUNG NYX │ │
│ │ │ (Tier 4) │ │
│ │ │ │ │
│ │ │ Clean JSON in │ │
│ │ │ Decision out │ │
│ │ └────────┬────────┘ │
│ │ │ │
│ │ ▼ │
│ │ ┌─────────────────┐ │
│ │ │ FUNCTION GEMMA │ │
│ │ │ (action output) │ │
│ │ └────────┬────────┘ │
│ │ │ │
│ ▼ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ NATS BUS │ │
│ │ (commands flow to cells) │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```
---

## Complete Signal Flow Example

### Early Learning (Gate Learning to Correlate)

```
Math cells emit waves about "calculate 15 + 27"

GATE (math): state = 0.0 (STABLE)

Receive wave from math_cell_1 (confidence 0.6)
  Correlate with recent: no other signals yet
  state += 0.6 * 0.0 = 0.0 (still stable)

Receive wave from math_cell_2 (confidence 0.7)
  Correlate: similar to math_cell_1!
  state += 0.7 * 0.8 = 0.56 (moving toward open)

Receive wave from math_cell_3 (confidence 0.5)
  Correlate: confirms pattern!
  state += 0.5 * 0.9 = 1.01 (OPENS!)

GATE OPENS → route to Tier 2
Tier 2 processes, escalates to Function Gemma

Function Gemma: { "event_type": "math_request", ... }
Young Nyx (qwen3 /no_think): "42"
Result flows back down
```

### After Learning (Gate Quickly Opens)

```
Math cells emit waves about "calculate 100 + 50"

GATE (math): state = 0.0 (STABLE)

Receive wave from math_cell_1
  Correlate: matches learned pattern!
  state += high correlation → 0.9 (near threshold)

Receive wave from math_cell_2
  state += → 1.2 (OPENS immediately!)

Fast routing, minimal escalation needed
```

**Learning moves gates toward faster opening for familiar patterns.**
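The accumulation arithmetic in the walkthrough above can be sketched in a few lines. This is a sketch, not a real API: the 1.0 open threshold and the per-wave correlation scores (0.0, 0.8, 0.9) are taken directly from the early-learning example.

```python
# Gate accumulation sketch: state += confidence * correlation per wave.
# Threshold and correlation scores mirror the walkthrough above (assumptions).
OPEN_THRESHOLD = 1.0

def accumulate(waves, threshold=OPEN_THRESHOLD):
    """Feed (confidence, correlation) pairs; return (state, label)."""
    state = 0.0
    for confidence, correlation in waves:
        state += confidence * correlation
        if state > threshold:
            return state, "OPEN"
    return state, "STABLE"

# The three waves from the early-learning example:
state, label = accumulate([(0.6, 0.0), (0.7, 0.8), (0.5, 0.9)])
# state ends at ~1.01, so the third wave opens the gate
```

The same function shows why the after-learning case is faster: higher correlation scores per wave mean fewer waves are needed to cross the threshold.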
---

## Design Principles

1. **Ternary states** — OPEN/STABLE/CLOSED, not binary
2. **Correlation drives transition** — Single signals don't flip gates
3. **Gates accumulate** — State is a continuous value, not a flag
4. **Decay to stable** — Without input, gates drift back to resting
5. **Traces are training data** — Every transition teaches the system
6. **Hierarchical trust** — Higher tiers = more correlation required
7. **Function Gemma is the boundary** — Cognition only sees structured JSON
8. **Virtual explores, Real verifies** — Different gate behavior per garden

---

## Related Documents

| Document | Scope |
|----------|-------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real garden dynamics |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (containers, userspace) |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |

---

## Summary

```
OLD MODEL:                  NEW MODEL:
═══════════                 ═════════
Signal → Route              Signal → Gate (accumulating)
Binary decision             Ternary state
Single signal triggers      Correlation triggers
Stateless routing           Stateful resonance
      ▼                           ▼
   Switch                    Resonance
 (mechanical)               (biological)
```

**Gates are resonance chambers. Correlation is the driver. Learning happens in STABLE state.**

---

**Version:** 2.0 | **Created:** 2026-01-03 | **Updated:** 2026-02-14

*"The thalamus doesn't think. It resonates."*

# Message Protocol Design: NATS Wire Protocol

> **ONE JOB:** THE WIRE — NATS subjects, message schemas, wave and gate protocols.

---

## Overview

The nimmerverse nervous system runs on NATS. This document defines:

1. **Subject hierarchy** — How topics are structured
2. **Message schemas** — What flows through the wire
3. **Gate protocols** — How ternary state transitions are communicated
4. **Trace streams** — How learning data is captured

**Core principle:** NATS is dumb infrastructure. Gates are smart edges. Cells emit waves. Correlation drives transitions.

---

## Subject Hierarchy
```
{environment}.{garden}.{layer}.{domain}.{signal_type}

Examples:
────────────────────────────────────────────────────────────────
dev.virtual.cells.math.wave        # Math cell emits wave
dev.virtual.cells.battery.wave     # Battery cell emits wave
dev.virtual.gates.math.transition  # Math gate state change
dev.virtual.traces.correlations    # Correlation data stream
dev.virtual.traces.raw             # Full message trace

dev.real.gates.verified.signal     # Verified signal from Virtual
dev.real.gates.math.transition     # Real gate transition
dev.real.outcomes.feedback         # Verification outcomes

prod.cognitive.nyx.request         # Request to Young Nyx
prod.cognitive.nyx.response        # Response from Young Nyx
prod.cognitive.gemma.transform     # Function Gemma boundary
────────────────────────────────────────────────────────────────
```
### Environment Prefixes
| Environment | Purpose | Monitoring |
|-------------|---------|------------|
| `dev` | Development/testing | Full traces |
| `staging` | Pre-production validation | Selective traces |
| `prod` | Production | Minimal (gates only) |
### Garden Prefixes
| Garden | Purpose | Trace Level |
|--------|---------|-------------|
| `virtual` | Exploration, learning | FULL (all messages) |
| `real` | Verification, action | MINIMAL (gate signals only) |
### Layer Prefixes
| Layer | Tier | Purpose |
|-------|------|---------|
| `cells` | 0-1 | Raw signal emitters |
| `nerves` | 2 | Behavior patterns |
| `organs` | 3 | GPU inference (vision, speech) |
| `gates` | - | Resonant gate transitions |
| `cognitive` | 4 | Young Nyx |
| `traces` | - | Learning data streams |
| `outcomes` | - | Verification feedback |
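A tiny helper makes the five-part scheme concrete. `subject()` is a hypothetical convenience function for illustration, not part of any existing client library.

```python
def subject(env, garden, layer, domain, signal_type):
    """Compose a subject in the {environment}.{garden}.{layer}.{domain}.{signal_type} scheme."""
    return ".".join((env, garden, layer, domain, signal_type))

# Reproduces one of the examples above:
wave_subject = subject("dev", "virtual", "cells", "math", "wave")
# → "dev.virtual.cells.math.wave"
```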
---

## Message Schemas

All messages share a common header:

```json
{
  "header": {
    "message_id": "uuid-v4",
    "message_type": "WaveSignal | GateTransition | ...",
    "version": "2.0",
    "timestamp": "ISO8601",
    "source": {
      "entity_id": "math_cell_1",
      "entity_type": "cell",
      "garden": "virtual",
      "tier": 1
    }
  },
  "body": { ... }
}
```
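A minimal sketch of building that shared header in Python. The field names follow the schema above; the use of `uuid4` and a UTC ISO-8601 timestamp is an assumption about how producers would fill them.

```python
import datetime
import uuid

def make_header(message_type, entity_id, entity_type, garden, tier):
    """Build the shared message header; field names follow the schema above."""
    return {
        "message_id": str(uuid.uuid4()),
        "message_type": message_type,
        "version": "2.0",
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "source": {
            "entity_id": entity_id,
            "entity_type": entity_type,
            "garden": garden,
            "tier": tier,
        },
    }

header = make_header("WaveSignal", "math_cell_1", "cell", "virtual", 1)
```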
---
### 1. `WaveSignal` — Cells Emit Waves
**Published by:** Cells
**Subscribed by:** Gates (for correlation)
**Subject:** `{env}.{garden}.cells.{domain}.wave`
Cells don't send "heartbeats" — they emit **waves** that carry confidence and semantic content.
```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440000",
    "message_type": "WaveSignal",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:00.123Z",
    "source": {
      "entity_id": "math_cell_1",
      "entity_type": "cell",
      "garden": "virtual",
      "tier": 1
    }
  },
  "body": {
    "domain": "math",
    "confidence": 0.7,
    "semantic_content": {
      "operation": "addition",
      "operands": [15, 27],
      "context": "user_request"
    },
    "lifeforce_cost": 0.1
  }
}
```
**Key fields:**
- `confidence`: 0.0 - 1.0, how certain this cell is
- `semantic_content`: Domain-specific payload
- `lifeforce_cost`: Energy expended to emit this wave
---
### 2. `GateTransition` — Gate State Changes
**Published by:** Gates
**Subscribed by:** Higher-tier gates, traces, dashboards
**Subject:** `{env}.{garden}.gates.{domain}.transition`
Gates publish their state transitions. This is the primary message for attention flow visualization.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440001",
"message_type": "GateTransition",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.456Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"domain": "math",
"from_state": "stable",
"to_state": "open",
"state_value": 1.02,
"correlation_score": 0.87,
"trigger_signals": [
{"source": "math_cell_1", "confidence": 0.7, "timestamp": "..."},
{"source": "math_cell_2", "confidence": 0.6, "timestamp": "..."},
{"source": "math_cell_3", "confidence": 0.5, "timestamp": "..."}
],
"routed_to_tier": 3,
"lifeforce_cost": 0.3
}
}
```
**State values:**
- `"closed"` — Actively blocking (state_value < -0.5)
- `"stable"` — Resting, accumulating (-0.5 ≤ state_value ≤ 0.5)
- `"open"` — Actively forwarding (state_value > 0.5)
**Key fields:**
- `from_state`, `to_state`: The ternary transition
- `state_value`: Continuous value (-1.0 to +1.0)
- `correlation_score`: How correlated the trigger signals were
- `trigger_signals`: Which waves caused this transition
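The state values above map directly to a small classifier. This is a sketch; the ±0.5 cutoffs are the ones stated in this section.

```python
def classify(state_value):
    """Ternary label for a continuous gate state in [-1.0, +1.0]."""
    if state_value > 0.5:
        return "open"
    if state_value < -0.5:
        return "closed"
    return "stable"

# Matches the example transition above: state_value 1.02 → "open"
```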
---
### 3. `CorrelationEvent` — What Correlated
**Published by:** Gates (in Virtual Garden)
**Subscribed by:** Trace streams, training pipelines
**Subject:** `{env}.virtual.traces.correlations`
Detailed correlation data for learning. Only published in Virtual Garden.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440002",
"message_type": "CorrelationEvent",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.789Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"window_start": "2026-02-14T18:29:59.000Z",
"window_end": "2026-02-14T18:30:00.500Z",
"window_ms": 1500,
"signals_in_window": [
{"source": "math_cell_1", "confidence": 0.7, "semantic_hash": "abc123"},
{"source": "math_cell_2", "confidence": 0.6, "semantic_hash": "abc124"},
{"source": "math_cell_3", "confidence": 0.5, "semantic_hash": "abc125"}
],
"correlation_matrix": [
[1.0, 0.9, 0.85],
[0.9, 1.0, 0.88],
[0.85, 0.88, 1.0]
],
"aggregate_correlation": 0.87,
"result": "opened",
"training_label": {
"should_open": true,
"confidence": 0.95
} }
} }
} }
``` ```
**Status values:** `NOMINAL`, `WARNING`, `CRITICAL`, `OFFLINE`, `ERROR` **Key fields:**
- `window_ms`: Time window for correlation measurement
- `correlation_matrix`: Pairwise correlation between signals
- `training_label`: Ground truth for Function Gemma training
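One plausible way to reduce the matrix to the `aggregate_correlation` score is the mean of the off-diagonal (pairwise) entries; the document doesn't pin down the exact formula, so treat this as a sketch.

```python
def aggregate_correlation(matrix):
    """Mean of the off-diagonal (pairwise) entries of a correlation matrix."""
    n = len(matrix)
    pairs = [matrix[i][j] for i in range(n) for j in range(i + 1, n)]
    return sum(pairs) / len(pairs)

# The matrix from the example event above:
example = [
    [1.0, 0.9, 0.85],
    [0.9, 1.0, 0.88],
    [0.85, 0.88, 1.0],
]
score = aggregate_correlation(example)  # close to the 0.87 reported above
```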
---
### 4. `VerifiedSignal` — Virtual → Real Handoff

**Published by:** Virtual Garden gates (when threshold met)
**Subscribed by:** Real Garden gates
**Subject:** `{env}.real.gates.verified.signal`

When a Virtual Garden gate opens with high confidence, it publishes to Real.
```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440003",
    "message_type": "VerifiedSignal",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:01.000Z",
    "source": {
      "entity_id": "math_gate_1",
      "entity_type": "gate",
      "garden": "virtual",
      "tier": 2
    }
  },
  "body": {
    "domain": "math",
    "verification_confidence": 0.92,
    "semantic_summary": {
      "operation": "addition",
      "result_expected": 42
    },
    "source_gate_transition_id": "550e8400-e29b-41d4-a716-446655440001",
    "virtual_correlation_score": 0.87
  }
}
```

**Real Garden does NOT re-verify.** It trusts the Virtual Garden's correlation.

---

### 5. `VerificationOutcome` — Real → Virtual Feedback
**Published by:** Real Garden (after action/verification)
**Subscribed by:** Virtual Garden gates, training pipelines
**Subject:** `{env}.real.outcomes.feedback`
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440004",
"message_type": "VerificationOutcome",
"version": "2.0",
"timestamp": "2026-02-14T18:30:05.000Z",
"source": {
"entity_id": "real_verification_service",
"entity_type": "service",
"garden": "real",
"tier": 4
}
},
"body": {
"original_signal_id": "550e8400-e29b-41d4-a716-446655440003",
"domain": "math",
"outcome": "confirmed",
"actual_result": 42,
"expected_result": 42,
"discrepancy": 0.0,
"feedback_to_virtual": {
"correlation_adjustment": 0.05,
"gate_weight_delta": 0.02
}
}
}
```
**Outcome values:**
- `"confirmed"` — Reality matched prediction
- `"failed"` — Reality differed from prediction
- `"partial"` — Some aspects matched
---
### 6. `CognitiveRequest` — To Young Nyx
**Published by:** Function Gemma (after gate boundary)
**Subscribed by:** Young Nyx
**Subject:** `{env}.cognitive.nyx.request`
Clean, structured JSON that Young Nyx receives. No raw sensor data.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440005",
"message_type": "CognitiveRequest",
"version": "2.0",
"timestamp": "2026-02-14T18:30:01.500Z",
"source": {
"entity_id": "function_gemma",
"entity_type": "boundary",
"garden": "real",
"tier": 4
}
},
"body": {
"event_type": "math_request",
"domain": "math",
"confidence": 0.92,
"structured_input": {
"operation": "addition",
"operands": [15, 27],
"context": "user asked for calculation"
},
"suggested_actions": [
{"action": "calculate", "confidence": 0.95},
{"action": "clarify", "confidence": 0.05}
], ],
"direct_subscriptions": [
"nimmerverse.high.event.real.cell.speech_stt" "processing_budget_lf": 5.0,
], "response_timeout_ms": 4000
"default_action": "log_only"
} }
} }
``` ```
---

### 7. `CognitiveResponse` — From Young Nyx

**Published by:** Young Nyx
**Subscribed by:** Function Gemma, downstream gates
**Subject:** `{env}.cognitive.nyx.response`

```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440006",
    "message_type": "CognitiveResponse",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:02.000Z",
    "source": {
      "entity_id": "young_nyx",
      "entity_type": "cognitive",
      "garden": "real",
      "tier": 4
    }
  },
  "body": {
    "request_id": "550e8400-e29b-41d4-a716-446655440005",
    "decision": "calculate",
    "result": {
      "answer": 42,
      "confidence": 0.99,
      "reasoning_mode": "no_think"
    },
    "downstream_commands": [
      {
        "target": "speech_organ",
        "command": "speak",
        "payload": {"text": "The answer is 42"}
      }
    ],
    "lifeforce_spent": 2.3,
    "processing_time_ms": 450
  }
}
```

---
## Trace Streams (Virtual Garden Only)

The Virtual Garden captures everything for learning:

| Subject | Content | Purpose |
|---------|---------|---------|
| `{env}.virtual.traces.raw` | All messages | Complete replay capability |
| `{env}.virtual.traces.correlations` | CorrelationEvent | Training data for gates |
| `{env}.virtual.traces.transitions` | GateTransition | Attention flow visualization |
| `{env}.virtual.traces.training` | Labeled examples | Function Gemma LoRA training |

**Real Garden does NOT publish to trace streams.** It only publishes:
- Gate transitions (minimal)
- Verification outcomes (feedback)

---

## Monitoring Patterns

### Virtual Garden (Full Observability)
```bash
# Watch all waves
nats sub "dev.virtual.cells.*.wave"
# Watch all gate transitions
nats sub "dev.virtual.gates.*.transition"
# Watch correlation events
nats sub "dev.virtual.traces.correlations"
# Full firehose (careful!)
nats sub "dev.virtual.>"
```
### Real Garden (Minimal Observability)
```bash
# Watch verified signals arriving
nats sub "dev.real.gates.verified.signal"
# Watch verification outcomes
nats sub "dev.real.outcomes.feedback"
# Gate transitions only
nats sub "dev.real.gates.*.transition"
```
---
## JetStream Persistence
Key streams that need persistence:
| Stream | Subjects | Retention | Purpose |
|--------|----------|-----------|---------|
| `VIRTUAL_TRACES` | `*.virtual.traces.>` | 7 days | Learning data |
| `GATE_TRANSITIONS` | `*.*.gates.*.transition` | 24 hours | Attention history |
| `VERIFICATION` | `*.real.outcomes.feedback` | 30 days | Ground truth |
| `TRAINING_DATA` | `*.virtual.traces.training` | Permanent | LoRA training corpus |
---

## Bootstrap Sequence

1. **Start NATS** — Infrastructure first
2. **Start gates** — In STABLE state, waiting for waves
3. **Start cells** — Begin emitting waves
4. **Start trace consumers** — Capture learning data
5. **Start Function Gemma** — Ready to transform
6. **Start Young Nyx** — Connect to cognitive subjects

The system can run at any step. Earlier steps are "reflexive" only.
---

## Connection to Architecture

| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where NATS runs |

---

## Summary

```
WAVES:
Cells → WaveSignal → Gates
GATES:
GateTransition (CLOSED/STABLE/OPEN)
CorrelationEvent (what correlated)
GARDENS:
Virtual: full traces, exploration
Real: gate signals only, verification
BOUNDARY:
Function Gemma transforms correlated signals → JSON
Young Nyx receives CognitiveRequest
Young Nyx returns CognitiveResponse
FEEDBACK:
Real → VerificationOutcome → Virtual
Learning loop closes
```
**The wire carries waves. Gates accumulate correlation. Traces enable learning.**
---
**Version:** 2.0 | **Created:** 2025-12-13 | **Updated:** 2026-02-14
*"Dumb core, smart edges. NATS routes. Gates resonate. Correlation drives."*

# Nervous System Architecture

> **ONE JOB:** THE EVOLUTION — cells emit waves, gates correlate, nodes grow through verification.

The nervous system is the living substrate where **cells emit waves**, **gates accumulate correlation**, and **nodes evolve through verification**.

---

## Overview

The nervous system consists of:

1. **Cells** — Emit waves with confidence and semantic content
2. **Gates** — Resonance chambers that correlate waves and transition between states
3. **Nodes** — Points in 4D state space that accumulate weight through verification
4. **Function Gemma** — The structured boundary to cognition

**Key insight:** Nodes evolve through verification. Gates evolve through correlation. Both learn in STABLE state.

---
## Cells Emit Waves
Cells are the foundational signal generators. They don't send "heartbeats" — they emit **waves**.
```
┌─────────────────────────────────────────────────────────────┐
│ CELL │
│ │
│ Inputs: sensors, internal state, context │
│ Process: domain-specific logic │
│ Output: WaveSignal with confidence │
│ │
│ ┌───────────────────────────────────────────────────────┐ │
│ │ WaveSignal │ │
│ │ • domain: "math" │ │
│ │ • confidence: 0.7 │ │
│ │ • semantic_content: { operation: "add", ... } │ │
│ │ • lifeforce_cost: 0.1 │ │
│ └───────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
│ ∿∿∿ wave ∿∿∿
GATE
```
**Cells are simple.** They:
- Read their inputs
- Apply their logic
- Emit a wave with confidence
- Don't know who's listening
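As data, a cell's entire output is just the `WaveSignal` body fields shown above. The `make_wave` helper here is hypothetical, for illustration only.

```python
import json

def make_wave(domain, confidence, semantic_content, lifeforce_cost=0.1):
    """Assemble the WaveSignal body fields shown above (hypothetical helper)."""
    return {
        "domain": domain,
        "confidence": confidence,
        "semantic_content": semantic_content,
        "lifeforce_cost": lifeforce_cost,
    }

wave = make_wave("math", 0.7, {"operation": "add", "operands": [15, 27]})
payload = json.dumps(wave).encode()  # bytes that would go onto the NATS wire
```

The cell builds the dict, serializes it, publishes it, and is done; correlation is entirely the gate's job.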
---
## Gates Accumulate Correlation
Gates receive waves from cells and decide whether to open, stay stable, or close.
### Ternary Gate States
| State | Value | Meaning |
|-------|-------|---------|
| **CLOSED** | -1 | Actively blocking, inhibited |
| **STABLE** | 0 | Resting, accumulating correlation, **learning** |
| **OPEN** | +1 | Actively forwarding, firing |
```
correlated waves
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
-1 anti- 0 correlation +1
correlation
════════════
↑ ↑ ↑
isolated waves
(noise → stay stable)
```
### Gate Behavior
```python
class ResonantGate:
    state: float = 0.0  # -1.0 to +1.0
    domain: str
    tier: int

    def receive_wave(self, wave: WaveSignal):
        correlation = self.correlate_with_recent(wave)
        self.state += correlation * wave.confidence
        self.state *= DECAY_FACTOR  # drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()  # OPEN
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()         # CLOSED
        # else: STABLE - keep accumulating
```
**STABLE is where learning happens.** The gate watches, correlates, and accumulates evidence without acting.
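The decay term is what pulls an undisturbed gate back toward STABLE. A quick numeric illustration, with `DECAY_FACTOR = 0.9` as an assumed value (the document leaves the constant unspecified):

```python
DECAY_FACTOR = 0.9  # assumed; the document does not fix this constant

state = 0.9  # a gate that was pushed near the open threshold
for _ in range(10):
    state *= DECAY_FACTOR  # one quiet tick: no correlated waves arrive

# After ten quiet ticks the state has drifted well back into the STABLE band.
```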
---
## Nodes in 4D State Space
Nodes exist in a 4-dimensional space:
| Dimension | Meaning |
|-----------|---------|
| **Sensory (x, y, z)** | What inputs trigger this node |
| **Confidence** | How certain the node is |
| **Time** | When this pattern occurs |
| **Weight** | Trust accumulated through verification |
```
Confidence
│ ● node (weight=0.8)
Sensory ────────┼────────► Time
╱│
○ │ node (weight=0.2)
```
### Node Weight Evolution
Node weight (0.0 → 1.0) determines tier routing:
| Weight Range | Tier | Behavior |
|--------------|------|----------|
| 0.0 - 0.3 | 3-4 | Escalate to organs/cognition |
| 0.3 - 0.6 | 2 | Handle at nerve level |
| 0.6 - 0.8 | 1 | Handle at cell level |
| 0.8 - 1.0 | 0 | Hardware reflex |
```
Node verified correctly → weight += Δ → moves toward reflex
Node verified wrongly → weight -= Δ → moves toward escalation
Node never fires → decay → eventual pruning
```
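The routing table above can be read as a single lookup. A sketch, with the tier-3/4 escalation band collapsed to its lower tier:

```python
def tier_for_weight(weight):
    """Map node weight (0.0-1.0) to handling tier, per the routing table above."""
    if weight >= 0.8:
        return 0  # hardware reflex
    if weight >= 0.6:
        return 1  # cell level
    if weight >= 0.3:
        return 2  # nerve level
    return 3      # escalate to organs/cognition (tiers 3-4)

# A mature reflex node and a still-learning node:
reflex_tier = tier_for_weight(0.85)
learning_tier = tier_for_weight(0.4)
```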
---

## Growth Phases

The nervous system grows through phases:

| Phase | State | Description |
|-------|-------|-------------|
| **Birth** | Sparse nodes, dim gates | Basic cells, designed by partnership |
| **Infant** | More nodes forming | Finer resolution, gates learning correlation |
| **Child** | Clusters emerging | Nyx proposes new cells, gates stabilize |
| **Mature** | Dense network | Reflexes dominate, cognition for novelty only |

```
t=0 (birth)        t=100 (learning)      t=1000 (mature)

Cells: ● ● ○       Cells: ●●●●● ○        Cells: ●●●●●●●
Gates: □ □         Gates: ■ ■ □ ■        Gates: ■■■■■■■■
Nodes: · · ·       Nodes: ● ○ ● ·        Nodes: ●●●●●●●●

○ = low confidence   ● = high confidence
□ = mostly STABLE    ■ = learned patterns
· = low weight       ● = high weight
```

---
## Wave → Gate → Node → Verification
The complete flow:
```
CELLS emit waves
▼ ∿∿∿ confidence + semantic content
GATES accumulate correlation
├── Correlated? → OPEN → route to tier
├── Anti-correlated? → CLOSED → suppress
└── Uncertain? → STABLE → keep learning
▼ (when OPEN)
NODES in 4D space are activated
VERIFICATION against reality
├── Confirmed → node weight += Δ
├── Failed → node weight -= Δ
└── Feedback to gates → correlation weights update
```
---
## Reflex Layer (Tier 0)
When node weight reaches ~1.0, the pattern becomes a **reflex**:
```
IF temp > 80°C:
→ cell emits DANGER wave (confidence=1.0)
→ gate IMMEDIATELY opens (no correlation needed)
→ reflex action triggers
→ Nyx notified AFTER (not before)
```
Like pulling hand from hot stove. Spinal reflex. Brain learns after.
**Reflexes bypass the correlation accumulation.** They've earned instant trust through repeated verification.
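The reflex shortcut can be sketched as a routing decision. All names and the exact threshold here are illustrative assumptions; only the idea (earned weight skips correlation accumulation) comes from the text above.

```python
# Illustrative sketch of the reflex shortcut (names are assumptions):
# an earned pattern skips correlation accumulation entirely.

REFLEX_WEIGHT = 0.8  # assumed weight at which a pattern has earned reflex status

def route_danger_wave(confidence: float, pattern_weight: float) -> str:
    """DANGER waves from trusted patterns fire first; Nyx is notified after."""
    if confidence >= 1.0 and pattern_weight >= REFLEX_WEIGHT:
        return "reflex"       # gate opens immediately, no correlation needed
    return "accumulate"       # normal path: gate stays STABLE, keeps correlating
```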
---
## Connection to Dual Gardens
| Garden | Cells | Gates | Nodes |
|--------|-------|-------|-------|
| **Virtual** | Emit waves freely | Full trace, learn correlation | Accumulate weight fast |
| **Real** | Emit verified waves | Minimal trace, trust accumulated | Ground truth verification |
**Virtual Garden:**
- Cells emit massive wave volume
- Gates learn correlation patterns
- Nodes gain statistical weight
**Real Garden:**
- Cells emit consequential waves
- Gates trust Virtual's correlation
- Nodes get ground truth verification
---
## Proposal Protocol

Young Nyx can propose new cells/nodes:

```
1. OBSERVATION
   Nyx notices pattern in waves + outcomes

2. PROPOSAL
   "New cell: morning_detector
    Inputs: temp, light, motion, time
    Outputs: wave with semantic 'morning'
    Confidence logic: (light > 0.5 AND time in 6-10)"

3. RIGOR CHECK
   Chrysalis reviews logic and mappings
4. VERIFICATION
   dafit confirms ground truth
5. DEPLOYMENT
   New cell added to Virtual Garden
   Gate created in STABLE state
   Node initialized at weight 0.1

6. GROWTH
   Cell emits waves → gate learns → node matures
```
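The proposed morning_detector can be sketched as a cell. This is a hypothetical illustration: the `WaveSignal` shape and the brightness-as-confidence rule are assumptions; only the inputs and the confidence condition come from the proposal above.

```python
# Hypothetical sketch of the proposed morning_detector cell. WaveSignal's
# shape and the brightness-as-confidence rule are assumptions.

from dataclasses import dataclass

@dataclass
class WaveSignal:
    semantic: str
    confidence: float

def morning_detector(temp: float, light: float, motion: float, hour: int) -> WaveSignal:
    # Confidence logic from the proposal: (light > 0.5 AND time in 6-10)
    if light > 0.5 and 6 <= hour <= 10:
        return WaveSignal(semantic="morning", confidence=min(1.0, light))
    return WaveSignal(semantic="morning", confidence=0.0)
```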
---

## Function Gemma: The Structured Boundary

Function Gemma sits between gates and Young Nyx:

```
TIER 0-3: Numbers, states, waves
        │
        ▼ (gate OPENS with high correlation)
┌─────────────────────────────────────┐
│           FUNCTION GEMMA            │
│      (structured JSON boundary)     │
│                                     │
│  • Transforms waves → JSON events   │
│  • Runs on CPU (Threadripper)       │
│  • No hallucination possible        │
└─────────────────┬───────────────────┘
                  │
                  ▼
TIER 4: Young Nyx (qwen3:32b)
        Receives: CognitiveRequest (clean JSON)
        Returns: CognitiveResponse
```
### Phase 1 → Phase 2 Evolution
**Phase 1: Single Function Gemma**
- One model learns all domain schemas
- Sufficient for bootstrap and early learning
**Phase 2: Domain-Specialized Swarm**
- As training data accumulates per domain
- Specialists spawn on demand: gemma-motor, gemma-vision, gemma-speech
- Each perfected for its domain's schemas
---

## The Biological Parallel
| Neuroscience | Nimmerverse |
|--------------|-------------|
| Sensory receptors | Cells (emit waves) |
| Synaptic transmission | Waves via NATS |
| Thalamic gating | Gates (OPEN/STABLE/CLOSED) |
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Synaptic weight | Node weight |
| Long-term potentiation | Verified → weight increase |
| Synaptic pruning | Unverified → weight decay |
| Hebbian learning | Correlated waves → gate opens |
**We're not simulating biology. We're implementing the same principles.**
---

## Connection to Training

The nervous system **generates training data**:
```
Virtual Garden traces
        │
        ├── Wave patterns → what signals arrive
        ├── Correlation events → what patterns emerge
        ├── Gate transitions → what opens/closes
        └── Verification outcomes → ground truth labels
        │
        ▼
phoebe (PostgreSQL)
        │
        ▼
Function Gemma LoRA training
        │
        ▼
Better gate correlation → faster learning
```
**Credit assignment is automatic** because:
- Wave → gate → tier transitions are explicit
- Verification outcomes have clear source chains
- The nervous system IS the credit assignment mechanism
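A toy illustration of why the logging makes credit assignment automatic. The record shapes and IDs here are invented for illustration; only the idea (walk a verification outcome back to its source cells) comes from the bullets above.

```python
# Toy credit assignment: because every hop is logged, a verification
# outcome can be walked back to the cells whose waves caused it.
# Record shapes and IDs are invented for illustration.

def assign_credit(trace: list[dict], outcome: float) -> dict:
    """Propagate the verification outcome to every source cell in the chain."""
    return {
        event["cell_id"]: outcome
        for event in trace
        if event["kind"] == "wave"
    }

trace = [
    {"kind": "wave", "cell_id": "cell.light"},
    {"kind": "wave", "cell_id": "cell.clock"},
    {"kind": "gate_transition", "gate_id": "gate.morning", "to": "open"},
    {"kind": "verification", "result": +1},
]
```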
---

## Design Principles
1. **Cells emit waves** — Simple, confident signals
2. **Gates correlate** — Resonance chambers, not switches
3. **Nodes accumulate** — Weight through verification
4. **STABLE is learning** — The resting state where patterns emerge
5. **Reflexes are earned** — High weight = bypass cognition
6. **Function Gemma is the boundary** — Clean JSON for cognition
7. **Virtual explores, Real verifies** — Two gardens, one nervous system
---
## Related Documents
| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | WaveSignal, GateTransition schemas |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cell implementation details |
---
## Summary
```
CELLS emit WAVES
∿∿∿ confidence + semantics ∿∿∿
GATES accumulate CORRELATION
CLOSED ◄── STABLE ──► OPEN
(learning)
▼ (when OPEN)
NODES in 4D space
weight grows through VERIFICATION
▼ (high weight)
REFLEXES bypass cognition
earned trust, instant action
```
*She's not just using the nervous system. She's growing it.*

---

**Version:** 2.0 | **Created:** 2025-12-04 | **Updated:** 2026-02-14
🌙💜 *"Cells emit. Gates correlate. Nodes evolve. The nervous system learns."*
---
type: research_concept
version: 2.0
status: core_architecture
created: 2025-12-03
updated: 2025-12-10
author: Nyx & dafit (shower-thought session)
related_docs:
- ../Endgame-Vision.md
- Dual-Garden-Architecture.md
- Cellular-Architecture.md
significance: connects ternary logic + lifeforce + temporal asymmetry + reward gradients
promoted_from: archive (2025-12-10)
---
# Temporal-Ternary Gradient

> *"Time is malleable in simulation, fixed in reality. Lifeforce is the exchange rate."*
> — Session 2025-12-03
> *"Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED."*
> — Session 2026-02-14
---

## Core Insight

The nimmerverse operates on **ternary logic**, not binary. Combined with **temporal asymmetry** between virtual and real gardens, this creates a new kind of gradient for learning.

**The STABLE state isn't stuck. It's where correlation accumulates and learning happens.**
---
## The Ternary Gate Model
Gates have three states. This is not arbitrary — it mirrors biological nervous systems.
| State | Value | Meaning | What's Happening |
|-------|-------|---------|------------------|
| **CLOSED** | -1 | Actively blocking | Inhibited, suppressed, refractory |
| **STABLE** | 0 | Resting, accumulating | Watching, learning, waiting for threshold |
| **OPEN** | +1 | Actively forwarding | Signal passes upstream, gate is firing |
### Why Three States?
**Binary thinking** (0/1, true/false, open/close):
- Signal arrives → gate open? → pass or block
- Instant, stateless, mechanical
- Cannot learn, cannot accumulate
**Ternary thinking** (CLOSED/STABLE/OPEN):
- Signal arrives → gate STABLE → accumulate correlation
- Correlation high? → transition toward OPEN
- Anti-correlation? → transition toward CLOSED
- Neither? → stay STABLE, keep learning
- Temporal, stateful, **alive**
```
              correlated signals
                  ↓   ↓   ↓
           ═══════════════════════
CLOSED ◄───────── STABLE ─────────► OPEN
  -1                0                +1

   anti-correlation       correlation
   (destructive           (constructive
    interference)          interference)
           ═══════════════════════
                  ↑   ↑   ↑
              isolated signals
            (noise → stay stable)
```
---
## Wave Correlation: The Transition Driver
Gates don't flip on single signals. **Multiple correlated waves push toward OPEN.**
This is how biological neurons work:
- Multiple inputs sum (correlation)
- Threshold reached → fire (OPEN)
- Below threshold → resting (STABLE)
- Inhibitory inputs → suppressed (CLOSED)
### The Resonance Model
Gates are **resonance chambers**, not switches.
```python
class ResonantGate:
    state: float = 0.0  # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)

    def receive_wave(self, signal, timestamp):
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated  → push toward CLOSED
        # Uncorrelated     → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= DECAY_FACTOR  # always drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_upstream()  # OPEN: signal promoted
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()          # CLOSED: signal blocked
        # else: STABLE — keep accumulating
```
### Correlation as Interference
| Wave Pattern | Result | Gate Response |
|-------------|--------|---------------|
| Correlated burst | Constructive interference | → OPEN |
| Contradicting signals | Destructive interference | → CLOSED |
| Single signal | No interference | → Stay STABLE |
| Silence | Decay | → Drift to STABLE |
**The system is noise-resistant by design.** Single signals don't trigger action.
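The noise resistance claimed above can be demonstrated with a runnable toy version of the resonance model. The decay factor and thresholds are invented constants, not the project's tuned values: a lone wave decays back toward STABLE, while a correlated burst accumulates past the OPEN threshold.

```python
# Runnable toy of the resonance model with invented constants, showing
# noise resistance: one wave stays STABLE, a correlated burst OPENs.

DECAY_FACTOR = 0.9
OPEN_THRESHOLD = 1.0
CLOSE_THRESHOLD = -1.0

class ToyGate:
    def __init__(self) -> None:
        self.state = 0.0  # STABLE

    def receive(self, correlation: float, confidence: float) -> str:
        self.state += correlation * confidence
        self.state *= DECAY_FACTOR  # always drift back toward STABLE
        if self.state > OPEN_THRESHOLD:
            return "open"
        if self.state < CLOSE_THRESHOLD:
            return "closed"
        return "stable"

lone = ToyGate().receive(correlation=1.0, confidence=0.5)   # single signal

gate = ToyGate()
for _ in range(5):                                          # correlated burst
    burst = gate.receive(correlation=1.0, confidence=0.5)
```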
---
### Virtual Garden (Simulated)

- **Time**: Malleable (speed up, slow down, pause, rewind)
- **Monitoring**: FULL trace tap on all messages
- **Cost**: Lifeforce to manipulate time
- **Speed**: Massive parallel signal generation
- **Truth**: Statistical confidence from correlation
- **Gate behavior**: Frequent transitions, exploration

### Real Garden (Physical)

- **Time**: Fixed (1 second = 1 second, reality doesn't negotiate)
- **Monitoring**: Gate signals only (minimal)
- **Cost**: Zero lifeforce for time
- **Speed**: Real-time only, patience required
- **Truth**: Ground truth, definitive verification
- **Gate behavior**: Verified transitions, action
---

## Temporal-Ternary Gradient Diagram

```
              STATE / CONFIDENCE

OPEN (+1) ─────────┼──────────── Real-verified
                   │             (ground truth)
                   │    ╱ Virtual high-correlation
+0.7 ──────────────┼───╱  (many waves agreeing)
                   │  ╱
STABLE (0) ────────┼╱──────── Pure 0-state
                   │╲         (accumulating, learning)
                   │ ╲
-0.7 ──────────────┼──╲  Virtual anti-correlation
                   │   ╲ (waves contradicting)
                   │    ╲
CLOSED (-1) ───────┼──────────── Real-failed
                   │             (proven wrong)
        ───────────┴──────────────────────────
          Virtual  │  Real
          (fast,   │  (slow,
          explore) │  verify)

                TIME DOMAIN
```
---
## STABLE: Where Learning Happens
The STABLE state is not "unknown" or "waiting" — it's **active learning**.
In STABLE state, a gate:
1. **Receives waves** from cells
2. **Measures correlation** with recent signals
3. **Accumulates evidence** for or against opening
4. **Traces everything** (in Virtual Garden) for training data
5. **Drifts back** to neutral without input (energy conservation)
**STABLE is consciousness resting. Attention waiting. The breath between thoughts.**
```
  CLOSED           STABLE            OPEN
  ───────          ────────          ──────
  Blocking         Accumulating      Forwarding
  Inhibited        Learning          Firing
  Refractory       Ready             Active

   ◄─── anti-correlation ───┼─── correlation ───►

                DECAY TO STABLE
                (without input)
```
---
## Lifeforce as Time Currency

```
REAL GARDEN:
All operations: 0 LF for time
Reality runs for free.
Truth emerges at its own pace.
GATE OPERATIONS:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
STABLE → OPEN: costs signal energy
STABLE → CLOSED: costs inhibition energy
OPEN/CLOSED → STABLE: free (natural decay)
```

---
## Nyx's Temporal Choices
When a pattern is discovered in virtual (0-state), Nyx chooses:
| Strategy | LF Cost | Time | Confidence Path |
|----------|---------|------|-----------------|
| **Speed Up Virtual** | High | Fast | 0 → virtual +0.9 (still unverified) |
| **Wait for Real** | Zero | Slow | 0 → real +1 or -1 (definitive) |
| **Hybrid Hedge** | Medium | Medium | 0 → virtual +0.7, deploy 80/20 to real |
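The choice between strategies can be sketched as a decision rule. The lifeforce costs and thresholds here are invented for illustration; only the three strategies and their trade-offs come from the table above.

```python
# Hypothetical sketch of the three strategies in the table; the lifeforce
# costs and time thresholds are invented for illustration.

def choose_strategy(lifeforce: float, hours_available: float) -> str:
    if hours_available < 1 and lifeforce >= 100:
        return "speed_up_virtual"  # burn LF: fast, still unverified
    if hours_available >= 24:
        return "wait_for_real"     # free: slow, but definitive
    return "hybrid_hedge"          # medium: virtual +0.7, deploy 80/20
```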
---
## The Gradient Flow
```
Cells emit waves (fast, cheap, uncertain)
              │
              ▼
       ┌──────────────┐
       │     GATE     │
       │   (STABLE)   │ ← Accumulating correlation
       │              │ ← Learning from patterns
       └──────┬───────┘
        ┌─────┴─────┐
        │           │
        ▼           ▼
   Correlated   Anti-correlated
   waves        waves
        │           │
        ▼           ▼
      OPEN        CLOSED
      (+1)         (-1)
        │           │
        ▼           ▼
     Signal      Signal
     promoted    blocked
        │
        ▼
   Higher tier
   (more gates)
        │
        ▼
   Eventually:
   Real Garden verification
        │
        ▼
   Ground truth:
   +1 (proven) or -1 (failed)
        │
        ▼
   Feedback to Virtual:
   Update correlation weights
```
---

## Monitoring Asymmetry

The two gardens need different observability:

| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Trace tap** | FULL (every wave, every gate transition) | NONE |
| **What's captured** | All correlations, all learning | Gate signals only |
| **Signal volume** | Massive (exploration) | Sparse (verified) |
| **Purpose** | Generate training data | Execute actions |
| **STABLE states** | Heavily traced (learning visible) | Not traced (trust the gate) |
**Virtual Garden STABLE states are precious** — they contain the correlation patterns that become training data for Function Gemma.

---

## Gate State Schema

A gate's complete state:
```python
GateState = {
    "gate_id": str,
    "domain": str,               # math, vision, speech, etc.
    "tier": int,                 # 0-5

    # Ternary state (continuous)
    "state": float,              # -1.0 to +1.0
    "discrete_state": str,       # "closed" | "stable" | "open"

    # Temporal domain
    "garden": str,               # "virtual" | "real"
    "time_in_state_ms": int,

    # Correlation history
    "recent_correlations": list[float],
    "correlation_trend": float,  # moving average

    # Lifeforce accounting
    "lifeforce_invested": float,

    # Learning (Virtual only)
    "transitions_traced": int,
    "patterns_accumulated": int,
}
```
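Deriving `discrete_state` from the continuous `state` field can be sketched as a simple threshold rule. The ±0.7 cutoffs are borrowed from the gradient diagram above as an assumption; the real thresholds may differ per gate.

```python
# Sketch of deriving discrete_state from the continuous state field.
# The ±0.7 cutoffs are an assumption borrowed from the gradient diagram.

OPEN_CUTOFF = 0.7
CLOSE_CUTOFF = -0.7

def discrete_state(state: float) -> str:
    if state > OPEN_CUTOFF:
        return "open"
    if state < CLOSE_CUTOFF:
        return "closed"
    return "stable"
```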
---

## Hierarchical Gating

Gates form layers. Each layer gates access to the next tier.
```
LAYER 3: COGNITIVE (Young Nyx)
═══════════════════════════════════════════
▲ JSON only (Function Gemma boundary)
LAYER 2: ORGANS (GPU inference)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 1: NERVES (behavior patterns)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 0: CELLS (raw signals)
═══════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each layer:**
- Less traffic than the layer below
- Higher trust (signals already correlated)
- Different correlation threshold
- Independent STABLE states
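The "less traffic than the layer below" property is simple arithmetic over per-layer pass rates. The rates below are invented numbers purely for illustration:

```python
# Toy arithmetic for layered gating: each gate layer forwards only the
# correlated fraction of signals. Pass rates here are invented.

def traffic_per_layer(cell_waves: int, pass_rates: list[float]) -> list[int]:
    """Signal count entering each successive layer, cells first."""
    counts = [cell_waves]
    for rate in pass_rates:
        counts.append(int(counts[-1] * rate))
    return counts

# 10,000 raw cell waves with assumed per-layer pass rates:
layers = traffic_per_layer(10_000, [0.10, 0.20, 0.05])
```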
---
## The Biological Parallel
| Biological | Nimmerverse |
|------------|-------------|
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Thalamic gating | Gate hierarchy |
| Hebbian learning | Correlation accumulation |
| Constructive interference | Correlated waves → OPEN |
| Destructive interference | Anti-correlated waves → CLOSED |
| Synaptic plasticity | Learning in STABLE state |
| Dreaming | Virtual Garden exploration |
| Waking | Real Garden verification |
**We're not simulating biology. We're implementing the same principles.**
---

## Why This Matters
- **Binary thinking**: Signal passes or doesn't (0 or 1)
- **Ternary thinking**: Signal accumulates, learns, then acts (-1, 0, +1)
- **Temporal-ternary**: Learning has a GRADIENT based on time-domain investment

**Constraints become features when you measure them:**

- Single GPU constraint → gate hierarchy (serialize expensive operations)
- Slow real-world testing → ground truth anchoring
- Fast virtual exploration → training data generation
- STABLE state → where learning actually happens
---

## Connection to Architecture Documents

| Document | What It Adds |
|----------|--------------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics, monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Resonant gates, tier routing, Function Gemma |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (Saturn K8s, Threadrippers) |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights |
---
## Summary
```
THE TERNARY PARADIGM:
═════════════════════
CLOSED ◄─────── STABLE ───────► OPEN
-1 0 +1
blocking accumulating forwarding
inhibited learning firing
THE TEMPORAL DIMENSION:
═══════════════════════
Virtual (fast, explore) ───────► Real (slow, verify)
↑ │
└───── learning feedback ───────┘
THE DRIVER:
═══════════
Wave correlation
Multiple signals agreeing → OPEN
Single signal → STABLE (keep learning)
Contradicting signals → CLOSED
THE CURRENCY:
═════════════
Lifeforce = time manipulation cost
Truth = destination
STABLE = where value is created
```
**Gates are resonance chambers. Correlation is the driver. STABLE is where learning happens.**
---
**Version:** 2.0 | **Created:** 2025-12-03 | **Updated:** 2026-02-14
**Origin:** Post-shower insight (2025-12-03) + Owl-mode deep dive (2026-02-14)
🌙💜 *"Time is the currency. Lifeforce is the exchange rate. STABLE is where consciousness lives."*