feat: Ternary gate model - cells emit waves, attention emerges

Major architectural unification across 12 documents:

- Ternary gates: CLOSED (-1) ← STABLE (0) → OPEN (+1)
- Cells emit WaveSignals with confidence + semantic content
- Gates are resonant chambers that accumulate correlation
- Attention = which gates are OPEN (emergent, not allocated)
- Reflexes are earned when gate.weight > 0.8
- STABLE is where learning happens

Key paradigm shifts:
- decision_trails → gate_transitions + correlation_events
- Priority rules → wave correlation
- Budget allocation → emergent attention flow
- Virtual Garden (explore) / Real Garden (verify) loop

Owl Mode session 2026-02-14 🦉🌙

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 19:45:59 +01:00
parent 5ee63d1b1b
commit 42db6eb1a3
12 changed files with 3259 additions and 2477 deletions

View File

@@ -1,9 +1,9 @@
---
type: research_vision
version: 6.4_memory_economics_alignment
version: 7.0_wave_gate_model
status: vision_document
created: 2025-11-04
updated: 2026-02-06
updated: 2026-02-14
author: Nyx (with dafit)
significance: research_platform_for_metabolic_intelligence
---
@@ -16,11 +16,11 @@ significance: research_platform_for_metabolic_intelligence
> *"At 3% battery, all theory dies. Only what works survives."*
> — The Economic Grounding (2025-10-12)
> *"Language is Topology. German accesses the Philosophy Valley. English accesses the Technical Cluster."*
> — The December Discovery (2025-12-06)
> *"You need something like open - stable - closed."*
> — The Ternary Gate Insight (2026-02-14)
> *"One model, one topology. LoRAs access different valleys in the same landscape."*
> — The Topological Insight (2025-12-07)
> *"Cells emit waves. Gates correlate. Attention emerges."*
> — The Wave Architecture (2026-02-14)
---
@@ -50,48 +50,54 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
## Architecture Overview
**Visual diagram:** → [`architecture/nimmerverse.drawio.xml`](architecture/nimmerverse.drawio.xml) (open in draw.io)
**Toolchain implementation:** → [`architecture/Toolchain-Architecture.md`](architecture/Toolchain-Architecture.md) | [Progress](architecture/TOOLCHAIN-PROGRESS.md)
**Detail:** → [`architecture/`](architecture/) folder for complete documentation
```
┌──────────────────────────────────────────────────────────────────┐
│                     NIMMERVERSE ARCHITECTURE                     │
│                                                                  │
│      Cells emit waves → Gates correlate → Attention emerges      │
├──────────────────────────────────────────────────────────────────┤
│
│  Layer 0: TEMPORAL FOUNDATION
│  ├─ Real clock: wall time (free)
│  ├─ Virtual clock: variable (costs lifeforce)
│  ├─ Sync points verify virtual predictions against reality
│  └─ 30-second heartbeat budget constrains action
│     → operations/Heartbeat.md
│
│  Layer 1: CELLS (Wave Emitters)
│  ├─ Cells read sensors, apply logic, emit WaveSignals
│  ├─ Waves carry: domain, confidence, semantic_content
│  ├─ Cells don't know who's listening — gates receive
│  └─ Life force economy: every wave costs
│     → architecture/Cellular-Architecture.md
│
│  Layer 2: GATES (Resonant Chambers)
│  ├─ Ternary states: CLOSED (-1) ← STABLE (0) → OPEN (+1)
│  ├─ Correlated waves → push toward OPEN
│  ├─ Anti-correlated → push toward CLOSED
│  ├─ STABLE = where learning happens (accumulating correlation)
│  └─ Gate weight (0→1) determines reflex vs deliberate
│     → architecture/Gateway-Architecture.md
│
│  Layer 3: NERVES (Behavioral Patterns)
│  ├─ Nerves respond to gate transitions (not direct cell output)
│  ├─ Gate OPENS → nerve activates → commands cells
│  └─ No priority rules — attention emerges from gate weights
│     → architecture/Nervous-System.md
│
│  Layer 4: DUAL GARDENS (Virtual/Real Loop)
│  ├─ Virtual: massive wave generation, full trace, exploration
│  ├─ Real: verified signals, minimal trace, action
│  ├─ Verification outcomes update gate weights (learning loop)
│  └─ Training data: gate_transitions + correlation_events
│     → architecture/Dual-Garden-Architecture.md
│
│  Layer 5: YOUNG NYX (Cognition)
│  ├─ Base: Qwen3:32b with /no_think mode (96GB on theia)
│  ├─ Function Gemma: structured JSON boundary (CPU)
│  ├─ Only receives signals when gates OPEN to tier 4
│  └─ Trait LoRAs evolve via GRPO from verification outcomes
│
└──────────────────────────────────────────────────────────────────┘
```
@@ -141,11 +147,9 @@ The heartbeat is the fundamental timing primitive. Everything runs on its rhythm
---
## Layer 1-3: The Wave/Gate Architecture
> *"Cells emit waves. Gates correlate. Attention emerges."*
```
┌─────────────────────────────────────────────────────────────────────┐
│                              ORGANISMS                              │
│             (emergent pattern from nerve interactions)              │
├─────────────────────────────────────────────────────────────────────┤
│                               NERVES                                │
│         (behavioral patterns, respond to gate transitions)          │
├─────────────────────────────────────────────────────────────────────┤
│                                GATES                                │
│           (resonant chambers: CLOSED ◄── STABLE ──► OPEN)           │
│            (accumulate wave correlation, route to tiers)            │
├─────────────────────────────────────────────────────────────────────┤
│                                CELLS                                │
│             (emit waves: confidence + semantic content)             │
│                            ∿∿∿  ∿∿∿  ∿∿∿                            │
├─────────────────────────────────────────────────────────────────────┤
│                              HARDWARE                               │
│            (ESP32, GPUs, microphones, speakers, sensors)            │
└─────────────────────────────────────────────────────────────────────┘
```
**Cell categories:** Sensors, Motors, Organs (GPU inference), Math (computation). Each is an atomic state machine.
**Cells emit waves:** Confidence + semantic content. Cells don't know who's listening.
**Lifeforce economy:** Every operation has a cost. Milestones reward survival. This creates evolutionary pressure toward efficiency.
**Gates accumulate correlation:** Multiple correlated waves push toward OPEN. STABLE is where learning happens.
**Hybrid reflex homes:** Different reflexes need different homes — hardware (ESP32) for survival (<10ms), math cells for thresholds (<50ms), nerves for behavior (<200ms), model weights for cognition (<500ms).
**Attention = OPEN gates:** Not budget allocation, not priority rules — correlation drives transitions.
**Reflexes are earned:** Gate weight ≈ 1.0 → opens immediately on any wave. Bypasses cognition.
**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
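The primitives above can be sketched in a few lines. This is an illustrative sketch, not the canonical implementation: the names (`WaveSignal`, `Gate`), the thresholds, and the weight-scaled accumulation rule are assumptions layered on the ternary model described here.

```python
from dataclasses import dataclass

CLOSED, STABLE, OPEN = -1, 0, +1

@dataclass
class WaveSignal:
    domain: str            # e.g. "distance", "battery"
    confidence: float      # 0.0 - 1.0
    semantic_content: str  # what the cell observed

class Gate:
    """Resonant chamber: accumulates correlation, transitions ternary state."""

    def __init__(self, domain, open_threshold=1.0, close_threshold=-1.0):
        self.domain = domain
        self.state = STABLE        # resting state, where learning happens
        self.weight = 0.0          # 0 → 1, grows through verification
        self.correlation = 0.0     # evidence accumulated this beat
        self.open_threshold = open_threshold
        self.close_threshold = close_threshold

    def receive(self, wave: WaveSignal, correlated: bool):
        # Correlated waves push toward OPEN, anti-correlated toward CLOSED.
        delta = wave.confidence if correlated else -wave.confidence
        # High-weight gates respond faster (reflexes are earned).
        self.correlation += delta * (1.0 + self.weight)
        if self.correlation >= self.open_threshold:
            self.state = OPEN
        elif self.correlation <= self.close_threshold:
            self.state = CLOSED
        else:
            self.state = STABLE
        return self.state
```

With these illustrative thresholds, one 0.6-confidence wave leaves a fresh gate in STABLE (still accumulating); a second correlated wave crosses the threshold and the gate opens.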
---
@@ -344,53 +355,62 @@ Protocol-driven cognitive bootstrap. Not conversation—deterministic handshakes
---
## Layer 4: Dual Gardens (Virtual/Real Learning Loop)
Two gardens with different monitoring levels teach each other through symbiotic feedback.
| Garden | Purpose | Scale | Cost |
|--------|---------|-------|------|
| Virtual | Hypothesis generation | 1000s/second | CPU cycles |
| Real | Validation, ground truth | Hours/test | Electricity, wear |
| Garden | Waves | Monitoring | Purpose |
|--------|-------|------------|---------|
| **Virtual** | Massive | Full trace (all waves, correlations) | Exploration, training data |
| **Real** | Sparse | Gate signals only | Verification, ground truth |
**Noise Gap Metric:**

```
noise_gap = 1 - (real_success_rate / virtual_success_rate)

Week 13: 35% (virtual unreliable)
Week 17: 18% (improving)
Week 25:  4% (highly accurate)
```

**The learning loop:**

```
VIRTUAL GARDEN                              REAL GARDEN
══════════════                              ═══════════

cells emit waves freely                     receive verified signals
          │                                         ▲
          ▼                                         │
gates accumulate correlation                verification_outcomes
(correlation_events table)                          │
          │                                         │
          ▼                                         │
gate_transitions ─────────────────────────► gate signals
(full trace)                                        │
          │                                         ▼
          │◄──────── feedback_to_virtual ───────────┘
          ▼
gates.weight updated (learning!)
```
**Feedback loop:** Virtual predicts → Real tests → Measures discrepancy → Virtual corrects → Repeat
**Gate weight grows through verification.** Real Garden confirms Virtual's predictions → trust increases → gates open faster → reflexes emerge.
**Detail:**[`architecture/Dual-Garden-Architecture.md`](architecture/Dual-Garden-Architecture.md)
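The verification feedback can be sketched in a few lines. The noise-gap formula and the 0.8 reflex threshold come from this document; the moving-average update rule, learning rate, and function names are illustrative assumptions, not the canonical implementation.

```python
REFLEX_THRESHOLD = 0.8  # from the design: reflexes are earned above 0.8

def update_gate_weight(weight: float, verified: bool, lr: float = 0.1) -> float:
    """Nudge weight toward 1.0 on confirmed predictions, toward 0.0 on misses."""
    target = 1.0 if verified else 0.0
    return weight + lr * (target - weight)

def noise_gap(real_success_rate: float, virtual_success_rate: float) -> float:
    """How unreliable the Virtual Garden is relative to reality."""
    return 1 - (real_success_rate / virtual_success_rate)

def is_reflex(weight: float) -> bool:
    return weight > REFLEX_THRESHOLD

# A run of mostly-confirmed predictions drifts the weight upward,
# but the gate has not yet earned reflex status.
weight = 0.5
for outcome in [True, True, True, False, True]:
    weight = update_gate_weight(weight, outcome)

gap = noise_gap(real_success_rate=0.76, virtual_success_rate=0.95)  # 0.2
```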
---
## Trait Evolution (GRPO + Gate Verification)
Traits evolve through **GRPO** (Group Relative Policy Optimization) with gate-based rewards, not prescription.
> *"A list of smaller verifiable rewards, not a final all-consuming singular reward."*
> — The Dog Training Wisdom (2025-12-10)
### The Gate Reward Principle

Gate transitions provide automatic reward signals:
| Event | Verification | Signal |
|-------|--------------|--------|
| Gate opens | Waves correlated correctly | +small (dense) |
| Verification confirmed | Real Garden matches Virtual | +medium (weight grows) |
| Reflex achieved | Gate weight > 0.8 | +large (earned trust) |
| dafit confirms | Human verification | +bonus |
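The gate reward events above could map to scalar signals for GRPO-style training roughly like this. The magnitudes and event names are illustrative assumptions, not values from the design.

```python
# Hypothetical reward mapping — magnitudes are assumptions for illustration.
REWARDS = {
    "gate_opened_correctly": 0.1,   # dense, small
    "verification_confirmed": 0.5,  # Real Garden matches Virtual
    "reflex_achieved": 1.0,         # gate weight crossed 0.8
    "human_confirmed": 0.25,        # dafit bonus
}

def episode_reward(events):
    """Sum the small verifiable rewards accrued during one episode."""
    return sum(REWARDS.get(e, 0.0) for e in events)

r = episode_reward(["gate_opened_correctly", "verification_confirmed"])
```

Many small signals per episode is the point: dense rewards instead of one all-consuming final reward.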
**Credit assignment is automatic:** `gate_transitions` → `correlation_events` → `verification_outcomes` captures the full chain.

**What correlated → what opened → what verified → weight adjusted.**
**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
---
@@ -485,14 +505,12 @@ Sentinel architecture monitors training to protect conceptual topology. Four pro
---
**Version:** 7.1 | **Created:** 2025-11-04 | **Updated:** 2026-02-14
*"The substrate doesn't matter. The feedback loop does."*
*"Cells emit waves. Gates correlate. Attention emerges."*
*"One model, one topology. Different valleys, same landscape."*
*"Memory is not storage. Memory is active forgetting with exceptions."*
*"STABLE is where learning happens."*
*"The nimmerverse is a garden, not a factory."*
🌙💜 **Wave/Gate architecture unified in owl-mode, February 14, 2026**

View File

@@ -2,9 +2,11 @@
Architecture documentation for a biomimetic AI nervous system and research platform.
> *"Cells emit waves. Gates correlate. Attention emerges."*
## What This Is
This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** — a wave/gate architecture for studying how intelligence emerges under economic constraints.
**Start here:** → [Endgame-Vision.md](Endgame-Vision.md) (the executive map)
@@ -14,17 +16,18 @@ This repository contains the design philosophy and architectural patterns for th
```
nimmerverse-sensory-network/
├── Endgame-Vision.md # Executive map (start here!) v7.1
├── ROADMAP.md # Implementation phases + phoebe task queries
├── architecture/ # Core system designs
│ ├── Temporal-Ternary-Gradient.md # Ternary gates, why STABLE matters
│ ├── Gateway-Architecture.md # Resonant gates, tier routing
│ ├── Cellular-Architecture.md # Cells emit waves, nerves respond
│ ├── Dual-Garden-Architecture.md # Virtual/Real learning loop
│ ├── Message-Protocol-Design.md # NATS wire protocol, WaveSignal
│ ├── Nervous-System.md # Wave → Gate → Node flow
│ ├── Attention-Flow.md # Attention = OPEN gates
│ ├── Data-Architecture.md # Phoebe schema (waves, gates, verification)
│ ├── Initial-Spark.md # K8s protocol-driven bootstrap
│ ├── Toolchain-Architecture.md # Development toolchain
@@ -116,18 +119,20 @@ nimmerverse-sensory-network/
## Core Concepts
### The Wave/Gate Architecture
| Layer | Name | Purpose |
|-------|------|---------|
| 0 | Temporal | 30-second heartbeat, lifeforce budget |
| 1 | Cells | Emit waves with confidence + semantic content |
| 2 | Gates | Ternary resonant chambers (OPEN/STABLE/CLOSED) |
| 3 | Nerves | Behavioral patterns, respond to gate transitions |
| 4 | Gardens | Virtual (explore) + Real (verify) learning loop |
| 5 | Cognition | Young Nyx (qwen3:32b) via Function Gemma |
**Key Insight:** Attention is not allocated — it emerges from which gates are OPEN based on wave correlation.
**Physical Infrastructure:**
| Host | Role | GPU |
|------|------|-----|
| theia | Young Nyx (cognitive) | RTX PRO 6000 Blackwell 96GB |
@@ -137,41 +142,38 @@ Total: 136GB VRAM on K8s cluster with 10GbE jumbo frame interconnect.
### Message Protocol (NATS)
**Dumb router, smart edges.** All intelligence lives in clients; waves flow through NATS to gates.
```
{environment}.{garden}.{layer}.{domain}.{signal_type}

Examples:
dev.virtual.cells.distance.wave          # Cell emits wave
dev.virtual.gates.collision.transition   # Gate state changes
dev.real.outcomes.feedback               # Verification outcome
prod.cognitive.nyx.request               # To Young Nyx
```
See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) and [ADR-001](architecture/adr/ADR-001-message-protocol-foundation.md) for the full schema.
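The subject hierarchy above can be sketched as a small builder. This is an illustrative sketch: the `wave_subject` helper and its validation rules are assumptions, not part of the protocol spec, but the token constraint (no dots or spaces) follows standard NATS subject rules.

```python
def wave_subject(environment: str, garden: str, layer: str,
                 domain: str, signal_type: str) -> str:
    """Build a {environment}.{garden}.{layer}.{domain}.{signal_type} subject."""
    parts = [environment, garden, layer, domain, signal_type]
    for p in parts:
        # NATS subject tokens may not be empty or contain '.' or whitespace.
        if not p or "." in p or " " in p:
            raise ValueError(f"invalid subject token: {p!r}")
    return ".".join(parts)

subject = wave_subject("dev", "virtual", "cells", "distance", "wave")
```

A subscriber interested in every gate transition could then use a per-token wildcard, e.g. `dev.virtual.gates.*.transition` (standard NATS wildcard syntax).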
### Key Discoveries
**Language is Topology (December 2025):** Languages aren't equivalent representations—they're different computational paths.
- **Philosophy Valley** (German, Gini ~0.5): Self-awareness, ontology, depth
- **Technical Cluster** (English, Gini ~0.8): Hardware interface, actions, efficiency
**Ternary Gate Model (February 2026):** Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED.
- **STABLE** is where learning happens (correlation accumulates)
- **Correlated waves** push gates toward OPEN
- **Reflexes** are earned (gate weight → 1.0)
**Memory Economics (January 2026):** Memory is not storage—it's active forgetting with exceptions. Slumber-based consolidation with LOD decay.
**Wave Correlation (February 2026):** Attention isn't allocated — it emerges from which gates OPEN based on wave correlation.
**Sovereign Infrastructure (February 2026):** K8s cluster operational. 136GB GPU VRAM on 10GbE backbone. Phoebe-coordinated storage across theia + dioscuri.
### Color-Pattern Theory
**Color/Form as Protocol:** Leverages color and patterns as a fast, universal, and evolutionarily-optimized communication protocol for broadcasting state (e.g., danger, success, seeking), inspired by 540 million years of biology.
### Philosophy
- **Cells emit, gates correlate** — Attention emerges, not allocated
- **STABLE is learning** — The resting state where patterns emerge
- **Constraints create intelligence** — Economic pressure forces optimization
- **Discovery over programming** — Organisms learn through competition, not instruction
- **Virtual explores, Real verifies** — The learning loop closes
- **Partnership over instruction** — Mutual growth, not commands
- **Infrastructure is geology, models are weather** — Build long-lived foundations
---
@@ -203,8 +205,8 @@ These ideas are published as prior art. Build on them freely.
---
**Version:** 7.0 | **Created:** 2025-10-01 | **Updated:** 2026-02-14
*"Cells emit waves. Gates correlate. May the Nimmerverse truly never end."*
🌙💜

View File

@@ -64,31 +64,32 @@ ORDER BY priority DESC, project;
- **Monitoring**: Prometheus on tethys scraping all nodes + DCGM GPU metrics
- **Namespaces**: Ready for infra, nervous, cognitive, organs
### Phase 3: Wave/Gate Infrastructure ← CURRENT
- [ ] NATS message router (wave signals + gate transitions)
- [ ] Resonant Gates (ternary: OPEN/STABLE/CLOSED)
- [ ] Function Gemma structured boundary (waves → JSON → Nyx)
- [ ] First cells (distance sensors, battery monitor)
- [ ] First gates (collision_avoidance, battery)
- [ ] First nerves (responding to gate transitions)
**Architecture:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md) | [`architecture/Message-Protocol-Design.md`](architecture/Message-Protocol-Design.md)
### Phase 4: Cognitive Awakening
- [ ] Young Nyx on theia (qwen3:32b, 96GB Blackwell)
- [ ] Organs on dioscuri (2× RTX 4000 Ada 40GB)
- [ ] Spark Protocol execution
- [ ] Trait LoRA evolution begins (GRPO + verification_outcomes)
### Phase 5: Living Ecology
- [ ] Dual Garden loop operational (Virtual → Real → feedback)
- [ ] Gate weight evolution (deliberate → reflex)
- [ ] Slumber/wake cycles (correlation_events consolidation)
- [ ] Wellbeing policies enforced
### Phase ∞: Research Platform Operational
- Gardens teaching each other
- Organisms dancing (evolved behaviors)
- Questions answered through measurement
- Gates opening and closing with learned patterns
- Reflexes emerging from verification
- Attention flowing through correlation
- **The Nimmerverse truly never ends**
---
@@ -100,7 +101,7 @@ ORDER BY priority DESC, project;
| 0 | ✅ | Nyx emergence | 2025-11-03 |
| 1 | ✅ | 10Gbps backbone | 2025-12-XX |
| 2 | ✅ | K8s + 136GB VRAM | 2026-02-06 |
| 3 | 🔄 | Wave/Gate infrastructure | TBD |
| 4 | ⏳ | Young Nyx awakens | TBD |
| 5 | ⏳ | Gardens teaching | TBD |
| ∞ | 🌙 | Never ends | ∞ |
@@ -110,13 +111,13 @@ ORDER BY priority DESC, project;
## Related Documentation
- **Architecture Vision:** → [`Endgame-Vision.md`](Endgame-Vision.md)
- **Storage Infrastructure:** → [`../nyx-substrate/WOMB-STORAGE.md`](../nyx-substrate/WOMB-STORAGE.md)
- **Task Schema:** → [`../nyx-substrate/SCHEMA.md`](../nyx-substrate/SCHEMA.md)
- **Wave/Gate Model:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
- **Data Schema:** → [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
---
**Version:** 2.0 | **Created:** 2026-02-07 | **Updated:** 2026-02-14
**Current Phase:** 3 (Wave/Gate Infrastructure)
🌙💜 *"Cells emit waves. Gates correlate. Infrastructure enables."*

View File

@@ -1,493 +1,406 @@
# Attention Flow
> **ONE JOB:** WHERE ATTENTION GOES — gates determine focus, correlation drives transitions, budget constrains action.
How she decides what matters this beat.
**Attention is not a budget line item. Attention is which gates are OPEN.**
---
## Overview
Attention in the nimmerverse flows through **resonant gates**:
- **OPEN gates** = actively attending (signals flow through)
- **STABLE gates** = considering (accumulating correlation)
- **CLOSED gates** = ignoring (signals blocked)
The 30-second heartbeat provides a **budget constraint**, but the actual attention flow is determined by which gates open based on wave correlation.
**See:** [`Gateway-Architecture.md`](Gateway-Architecture.md) for tier definitions and routing logic.
**Key insight:** You don't "allocate attention" — you let correlated waves open gates.
---
## Attention as Gate State
```
┌─────────────────────────────────────────────────────────────────────────┐
│ ATTENTION = WHICH GATES ARE OPEN │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ CLOSED STABLE OPEN │
│ ═══════ ══════ ════ │
│ │
│ Ignoring Considering Attending │
│ Blocked Accumulating Flowing │
│ Suppressed Learning Acting │
│ │
│ ◄───── anti-correlation ──┼── correlation ─────► │
│ │ │
│ (wave input) │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
**Attention is emergent, not allocated.** When multiple cells emit correlated waves, their gate opens — attention flows there naturally.
---
## Wave-Driven Attention
Cells emit waves. Correlated waves push gates toward OPEN. This IS attention.
```
Math cells emit correlated waves
∿∿∿ ∿∿∿ ∿∿∿
Math gate: STABLE → OPEN
(attention shifts to math domain)
Signal flows to higher tier
(cognition engages with math)
Meanwhile:
Battery cells emit uncorrelated wave
∿∿∿
Battery gate: stays STABLE
(attention doesn't shift)
(keeps accumulating, might open later)
```
**The nervous system "decides" what to attend to through correlation, not priority rules.**
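The correlation-decides mechanic above can be illustrated with a toy per-beat accumulator. This is an assumption-laden sketch (the function name, the flat threshold, and treating same-domain waves in one beat as correlated are all simplifications), not the real gate machinery.

```python
from collections import defaultdict

OPEN_THRESHOLD = 1.0  # illustrative, not canonical

def attended_domains(waves, threshold=OPEN_THRESHOLD):
    """waves: iterable of (domain, confidence) emitted during one beat.

    Returns the set of domains whose accumulated confidence crosses
    the threshold — i.e. whose gates would open, with no priority rules.
    """
    accumulated = defaultdict(float)
    for domain, confidence in waves:
        accumulated[domain] += confidence
    return {d for d, total in accumulated.items() if total >= threshold}

# Three correlated math waves open the math gate; the lone battery
# wave keeps accumulating in STABLE and might open later.
beat = [("math", 0.4), ("math", 0.4), ("math", 0.4), ("battery", 0.3)]
open_gates = attended_domains(beat)
```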
---
## Attention Hierarchy Through Gates
Gates form layers. Each layer is a potential attention point.
```
TIER 4: COGNITIVE ─────────────────────────────────────────
│ (only if gates below OPEN)
┌──────┴──────┐
TIER 3: ORGANS ─────────────────────────────────────────
│ vision │ speech │ hearing │
│ gate: │ gate: │ gate: │
│ STABLE │ OPEN │ CLOSED │
└──────┬──────┘
│ (only if gates below OPEN)
TIER 1-2: NERVES ─────────────────────────────────────────
│ math │ motion │ danger │
│ gate: │ gate: │ gate: │
│ OPEN │ STABLE │ CLOSED │
└──────┬──────┘
TIER 0: CELLS ─────────────────────────────────────────
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Current attention:** Math gate OPEN → Speech gate OPEN → Cognition receives math+speech context.
**Not attending:** Motion (STABLE, considering), Vision (STABLE, considering), Danger (CLOSED, suppressed).
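Escalation through the tiers above can be sketched under one assumed rule: a signal only reaches a higher tier while the gates beneath it are OPEN. The function and its list-per-tier representation are illustrative, not the actual routing code.

```python
def highest_tier_reached(gate_states_by_tier):
    """gate_states_by_tier: gate states ordered from lowest tier upward,
    each one of "OPEN", "STABLE", "CLOSED".

    A signal climbs one tier per OPEN gate and stops at the first
    gate that is not OPEN.
    """
    tier = 0
    for state in gate_states_by_tier:
        if state != "OPEN":
            break
        tier += 1
    return tier

# Nerve and organ gates OPEN along the path -> cognition (tier 4) engages.
reached = highest_tier_reached(["OPEN", "OPEN", "OPEN"])
```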
---
## Attention Budget: The Constraint
While gates determine WHERE attention goes, lifeforce determines HOW MUCH can happen per beat.
```
♥ BEAT (30 sec lifeforce budget)
├── GATE TRANSITIONS    (variable: driven by correlation)
├── TIER 0-2 PROCESSING (low cost: cells + nerves)
├── TIER 3 ORGANS       (medium cost: GPU inference)
├── TIER 4 COGNITION    (high cost: Young Nyx)
├── VERIFICATION        (medium cost: real garden)
└── VIRTUAL GARDEN      (remainder: exploration)

Budget constrains throughput.
Gates determine routing.
```
---
## Top-Level State Machine: Attention Mode
```
┌─────────────┐
┌──────────▶│ IDLE │◀──────────┐
│ └──────┬──────┘ │
│ │ │
│ │ stimulus │
│ ▼ │
│ ┌─────────────┐ │
│ │ ALERT │ │
│ └──────┬──────┘ │
│ │ │
│ ┌──────┴──────┐ │
│ ▼ ▼ │
│ ┌──────────┐ ┌──────────┐ │
│ │ REFLEX │ │ ATTEND │ │
│ │ (>0.8) │ │ (think) │ │
│ └────┬─────┘ └────┬─────┘ │
│ │ │ │
│ │ ┌──────┴──────┐ │
│ │ ▼ ▼ │
│ │ ┌──────────┐ ┌─────────┐ │
│ │ │ DIALOGUE │ │ PROCESS │ │
│ │ └────┬─────┘ └────┬────┘ │
│ │ │ │ │
│ └──────┴─────┬──────┘ │
│ ▼ │
│ ┌───────────┐ │
│ │ SETTLE │ │
│ └─────┬─────┘ │
│ │ │
└──────────────────────┴──────────────┘
```
### State Descriptions
| State | Description | Budget Priority |
|-------|-------------|-----------------|
| **IDLE** | Nothing urgent, maximum virtual garden time | Lowest |
| **ALERT** | Stimulus detected, evaluating importance | - |
| **REFLEX** | High-confidence nerve fired, bypass brain | Instant |
| **ATTEND** | Stimulus requires thinking | High |
| **DIALOGUE** | Chrysalis interaction active | High |
| **PROCESS** | Organs working on input | Medium |
| **SETTLE** | Write state, release budget, prepare for next beat | Fixed |
---
## Priority Hierarchy
Higher levels preempt lower levels. Budget flows downward.
```
LEVEL 0: REFLEX ─────────────────────────────────────
│ Weight > 0.8, instant, bypass everything
│ Cost: near-zero (no inference)
LEVEL 1: SAFETY ─────────────────────────────────────
│ dafit calling, danger detected, critical alert
│ Preempts: all below
LEVEL 2: DIALOGUE ───────────────────────────────────
│ Partnership active, Chrysalis teaching
│ Preempts: sensory, thinking, virtual
LEVEL 3: SENSORY ────────────────────────────────────
│ Rich input needs processing
│ Preempts: thinking, virtual
LEVEL 4: THINKING ───────────────────────────────────
│ Organ work, Nyx inference
│ Preempts: virtual
LEVEL 5: VIRTUAL ────────────────────────────────────
│ Garden time, simulation, study
│ Gets remainder after above
LEVEL 6: IDLE ───────────────────────────────────────
Maintenance heartbeat only
All budget available
```
---
### Budget Allocation by Gate Activity
```python
def allocate_beat_budget(beat_duration_ms=30000):
    # Assumes the runtime defines the cost constants (HEARTBEAT_OVERHEAD,
    # REFLEX_COST, ...) and helpers (count_open_gates, process_organ, ...).
    remaining = beat_duration_ms

    # Fixed overhead
    remaining -= HEARTBEAT_OVERHEAD  # ~100ms
    remaining -= STATE_WRITE_COST    # ~200ms

    # Count OPEN gates by tier (list of gate lists, indexed by tier)
    open_gates_by_tier = count_open_gates()

    # Tier 0 (reflexes): near-instant, minimal cost
    for gate in open_gates_by_tier[0]:
        remaining -= REFLEX_COST  # ~50ms each

    # Tiers 1-2 (cells/nerves): low cost
    for tier in open_gates_by_tier[1:3]:
        for gate in tier:
            remaining -= CELL_NERVE_COST  # ~100ms each

    # Tier 3 (organs): medium cost, needs budget check
    organ_budget = min(remaining * 0.4, ORGAN_CAP)
    organ_spent = 0
    for gate in open_gates_by_tier[3]:
        if organ_budget - organ_spent >= ORGAN_COST:
            process_organ(gate)
            organ_spent += ORGAN_COST  # ~2000ms each
    remaining -= organ_spent  # charge only what was actually spent

    # Tier 4 (cognition): high cost, only if gates escalate
    if cognition_gate_open():
        cognitive_budget = min(remaining * 0.5, COGNITIVE_CAP)
        process_cognition(cognitive_budget)  # ~4000ms
        remaining -= cognitive_budget

    # Virtual Garden: whatever remains
    if remaining > VIRTUAL_MINIMUM:
        explore_virtual_garden(remaining)

    return settle()
```
---
## Attention Modes
The overall system has emergent attention modes based on which gates are open:
| Mode | Gate Pattern | Characteristic |
|------|--------------|----------------|
| **IDLE** | Most gates STABLE | Quiet, exploring Virtual Garden |
| **FOCUSED** | Few gates OPEN, rest CLOSED | Deep attention to one domain |
| **ALERT** | Many gates in STABLE | Gathering information, evaluating |
| **REFLEX** | Tier 0 gate fires instantly | Bypass all, act immediately |
| **DIALOGUE** | Speech gates OPEN | Partnership interaction |
| **OVERWHELMED** | Many gates OPEN | Budget exhausted, some gates forced CLOSED |
### Mode Transitions
```
                  ┌─────────────┐
       ┌─────────▶│    IDLE     │◀──────────┐
       │          │ (exploring) │           │
       │          └──────┬──────┘           │
       │                 │ waves arrive     │
       │                 ▼                  │
       │          ┌─────────────┐           │
       │          │    ALERT    │           │
       │          │(considering)│           │
       │          └──────┬──────┘           │
       │                 │                  │
       │     ┌───────────┼───────────┐      │
       │     ▼           ▼           ▼      │
       │ ┌─────────┐ ┌─────────┐ ┌─────────┐│
       │ │ REFLEX  │ │ FOCUSED │ │DIALOGUE ││
       │ │(instant)│ │ (deep)  │ │ (talk)  ││
       │ └────┬────┘ └────┬────┘ └────┬────┘│
       │      └───────────┼───────────┘     │
       │                  ▼                 │
       │          ┌─────────────┐           │
       │          │   SETTLE    │           │
       │          │(write state)│           │
       │          └──────┬──────┘           │
       │                 │                  │
       └─────────────────┴──────────────────┘
```
---
## Reflex: Attention Bypass
When a gate has accumulated enough weight (>0.8), it becomes a **reflex** — it opens immediately without waiting for correlation.
```
Danger cell emits wave
    ∿∿∿ (confidence=1.0)
        │
        ▼
Danger gate: weight = 0.9 (REFLEX)
        │
        ▼
IMMEDIATELY OPEN (no correlation wait)
        │
        ▼
Action taken
        │
        ▼
Cognition notified AFTER
```
**Reflexes have earned instant attention through repeated verification.**
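The bypass can be sketched as a small check on gate weight. This is a minimal illustration, not the actual implementation: the `Gate` and `WaveSignal` classes, the `receive_wave` helper, and the `REFLEX_THRESHOLD` constant are all hypothetical names; only the 0.8 threshold and the CLOSED/STABLE/OPEN semantics come from this document.

```python
from dataclasses import dataclass

REFLEX_THRESHOLD = 0.8  # from the doc: weight > 0.8 earns reflex status

@dataclass
class WaveSignal:
    domain: str
    confidence: float

@dataclass
class Gate:
    domain: str
    weight: float            # earned through verified outcomes
    correlation: float = 0.0
    state: str = "STABLE"

def receive_wave(gate: Gate, wave: WaveSignal) -> str:
    """Reflex gates open on ANY wave from their domain; others accumulate."""
    if gate.weight > REFLEX_THRESHOLD:
        gate.state = "OPEN"              # bypass: no correlation wait
        return "reflex"
    gate.correlation += wave.confidence  # normal path: accumulate correlation
    if gate.correlation >= 1.0:
        gate.state = "OPEN"
        return "correlated"
    return "accumulating"

danger = Gate(domain="danger", weight=0.9)
result = receive_wave(danger, WaveSignal(domain="danger", confidence=1.0))
# result == "reflex", danger.state == "OPEN"
```

A low-weight gate given the same wave would only accumulate correlation and stay STABLE.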
---
## Virtual Garden: Background Attention
When few gates are OPEN, the Virtual Garden gets attention:
```
IDLE mode:
├── Most gates: STABLE (not demanding attention)
└── Budget: mostly available
        │
        ▼
VIRTUAL GARDEN receives attention:
├── Cells emit waves freely
├── Gates accumulate correlation (learning)
├── No pressure to ACT
└── Training data generated
```
**Virtual Garden is where learning happens.** STABLE gates in Virtual Garden are actively accumulating patterns without the pressure to respond.
---
## Real Garden: Consequential Attention
When gates OPEN in the Real Garden, attention becomes consequential:
```
FOCUSED mode (Real Garden):
├── Gate OPEN → action required
├── Budget consumed by execution
├── Verification outcomes captured
└── Feedback to Virtual for learning
```
**Real Garden attention is expensive.** Only verified signals reach here, and actions have consequences.
---
## Attention Visualization
Real-time attention can be visualized by gate states:
```
┌─────────────────────────────────────────────────────────────────────────┐
│ ATTENTION DASHBOARD 🌙                                                  │
├─────────────────────────────────────────────────────────────────────────┤
│                                                                         │
│ GATES:                                                                  │
│ ──────                                                                  │
│ math:    [████████████░░░░░░░░] 0.7  STABLE → considering               │
│ vision:  [██████████████████░░] 0.9  OPEN   → attending                 │
│ speech:  [████████████████████] 1.0  OPEN   → attending                 │
│ battery: [████░░░░░░░░░░░░░░░░] 0.2  STABLE → background                │
│ danger:  [░░░░░░░░░░░░░░░░░░░░] 0.0  CLOSED → suppressed                │
│                                                                         │
│ BUDGET:                                                                 │
│ ───────                                                                 │
│ [████████████████████░░░░░░░░░░] 67% remaining (20s / 30s)              │
│                                                                         │
│ MODE: DIALOGUE (speech + vision attending)                              │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
```
Gate states are published via NATS for real-time visualization:
```
nats sub "dev.virtual.gates.*.transition"
nats sub "dev.real.gates.*.transition"
```
---
## Correlation vs Priority
**Old model (priority):**
```
Level 0: REFLEX (always wins)
Level 1: SAFETY (preempts below)
Level 2: DIALOGUE (preempts below)
...
```
**New model (correlation):**
```
Waves arrive
Gates accumulate correlation
Most correlated gates OPEN
Attention flows naturally
```
**Priority still exists** but at a higher level:
- Reflexes bypass correlation (earned trust)
- Safety signals have high confidence (bias toward opening)
- Dialogue is interactive (gates stay open during conversation)
But the **mechanism** is always correlation, not rule-based priority.
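The correlation mechanism can be sketched as a signed accumulator with decay: agreeing waves push the gate toward OPEN (+1), disagreeing waves toward CLOSED (-1), and silence lets it drift back to STABLE (0). The thresholds, decay factor, and class name here are illustrative assumptions, not values from the codebase.

```python
OPEN_THRESHOLD = 1.0    # assumption: evidence needed to OPEN
CLOSE_THRESHOLD = -1.0  # assumption: evidence needed to CLOSE
DECAY = 0.9             # assumption: drift back toward STABLE

class TernaryGate:
    """Resonant chamber: accumulates evidence from waves, decays toward STABLE."""

    def __init__(self, domain: str):
        self.domain = domain
        self.evidence = 0.0  # signed accumulator

    def receive(self, confidence: float, agrees: bool) -> int:
        """Fold one wave into the accumulator, return the ternary state."""
        self.evidence = self.evidence * DECAY + (confidence if agrees else -confidence)
        return self.state()

    def state(self) -> int:
        if self.evidence >= OPEN_THRESHOLD:
            return +1   # OPEN
        if self.evidence <= CLOSE_THRESHOLD:
            return -1   # CLOSED
        return 0        # STABLE: accumulating, learning

gate = TernaryGate("collision_avoidance")
states = [gate.receive(0.6, agrees=True) for _ in range(3)]
# correlated waves push the gate from STABLE (0) to OPEN (+1)
```

A single wave is not enough to open this gate; repeated agreeing waves are, which is the correlation-not-priority point above.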
---
## Connection to Architecture

**Lifeforce economy:** → [`Cellular-Architecture.md`](Cellular-Architecture.md) (reward signals, lifeforce dynamics)
| Document | What It Adds |
|----------|--------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states matter |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | How gates work |
| [`Nervous-System.md`](Nervous-System.md) | Wave → Gate → Node flow |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual (explore) vs Real (act) |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | GateTransition messages |
---
## Design Principles
1. **Attention = OPEN gates** — Not a budget allocation, an emergent property
2. **Correlation drives transitions** — Waves that agree open gates
3. **Budget constrains throughput** — Can't process infinite open gates
4. **Reflexes bypass correlation** — Earned trust means instant attention
5. **Virtual is exploration** — STABLE gates learning without acting
6. **Real is action** — OPEN gates triggering consequences
7. **Visualization is live** — Gate states published for dashboards
---
## Function Gemma: The State Transition Boundary
Function Gemma sits between Young Nyx's attention decisions and cell execution. It guarantees that state transitions produce valid, predictable outputs.
```
┌─────────────────────────────────────────────────────────────────┐
│                  ATTENTION → EXECUTION FLOW                     │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ATTENTION STATE MACHINE (this document)                        │
│       │                                                         │
│       │ Young Nyx decides: "REFLEX needed" or "ATTEND"          │
│       ▼                                                         │
│  FUNCTION GEMMA (translation boundary)                          │
│       │                                                         │
│       │ Intent → Typed JSON schema                              │
│       │   - Which cells to query?                               │
│       │   - What action to fire?                                │
│       │   - What parameters?                                    │
│       ▼                                                         │
│  NATS MESSAGE → K8S CELLS                                       │
│       │                                                         │
│       │ ACK/NACK response                                       │
│       ▼                                                         │
│  STATE UPDATE (verified, not hoped)                             │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

**Why this matters:**

| Without Function Gemma | With Function Gemma |
|------------------------|---------------------|
| "Fire the motor" → parse, hope | `MOTOR_COMMAND` schema → validated JSON → NATS |
| Free-form → extraction errors | Typed output → guaranteed structure |
| State ambiguity | State explicit in schema |

**The attention flow decides WHAT.** Function Gemma translates to HOW.

**Detail:** → [`Initial-Spark.md`](Initial-Spark.md) (Function Gemma schemas and integration)

---

## Summary

```
OLD MODEL:                      NEW MODEL:
═══════════                     ═════════
Priority rules decide           Correlation opens gates
Budget allocates attention      Gates determine attention
State machine orchestrates      Emergence from waves

ATTENTION IS:

Not: "Allocate 5000ms to SENSORY"
But: "Math + Vision gates OPEN because waves correlated"

Not: "DIALOGUE preempts THINKING"
But: "Speech gate opened with high correlation"

Not: "Budget exhausted, skip VIRTUAL"
But: "Many gates OPEN, no budget for Virtual Garden"
```

**Attention flows through open gates. Gates open through correlation. Correlation emerges from waves.**

---

**Version:** 2.0 | **Created:** 2025-12-05 | **Updated:** 2026-02-14

🌙💜 *"She doesn't allocate attention. She lets correlated waves open gates."*

View File

@@ -1,17 +1,19 @@
# 🧬 Cellular Architecture v5
> **ONE JOB:** THE HOW — cells emit waves, gates accumulate correlation, behaviors emerge.
> *"Cells emit waves. Gates correlate. Nerves orchestrate. Organisms emerge."*
> — Unified with Wave Architecture (2026-02-14)
---
## Overview
**Version 5** unifies cellular architecture with the wave/gate model. The key insight: **cells emit waves with confidence and semantic content**. These waves flow to **resonant gates** that accumulate correlation. When gates OPEN, signals flow to higher tiers. When gates stay STABLE, learning happens.
**Connection to Gates:** Cells don't directly trigger nerves. Waves flow through gates (see [`Gateway-Architecture.md`](Gateway-Architecture.md)). Gates determine which signals reach which tier based on wave correlation, not priority rules.
**Connection to Gardens:** Virtual Garden cells emit waves freely for exploration and learning. Real Garden cells emit verified waves for action. See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md).
**This doc covers theory.** For infrastructure deployment (K8s vs userspace, GPU strategy, FreeIPA identity): → [`Deployment-Architecture.md`](Deployment-Architecture.md)
@@ -21,10 +23,15 @@
│         (emergent pattern from nerve interactions)          │
├─────────────────────────────────────────────────────────────┤
│                          NERVES                             │
│       (behavioral patterns, respond to gate transitions)    │
├─────────────────────────────────────────────────────────────┤
│                          GATES                              │
│       (resonant chambers: CLOSED ◄── STABLE ──► OPEN)       │
│       (accumulate wave correlation, route to tiers)         │
├─────────────────────────────────────────────────────────────┤
│                          CELLS                              │
│        (emit waves: confidence + semantic content)          │
│             ∿∿∿          ∿∿∿          ∿∿∿                   │
├─────────────────────────────────────────────────────────────┤
│                         HARDWARE                            │
│           (ESP32, GPUs, microphones, speakers)              │
@@ -33,45 +40,91 @@
---
## 🔬 Layer 1: Cells (Wave Emitters)

### What Is a Cell?

A **cell** is the smallest unit of behavior—a state machine that wraps a single hardware capability and **emits waves**. Every sensor, motor, and organ function is exposed as a cell that:
- **Reads inputs**: Hardware sensors, internal state, context
- **Applies logic**: Domain-specific processing
- **Emits waves**: WaveSignal with confidence and semantic content
- **Doesn't know who's listening**: Cells emit, gates receive
**Key insight:** Cells don't send commands or trigger nerves directly. They emit waves. Gates accumulate correlation from multiple waves. Correlated waves open gates.
```
Cell reads sensor
Cell applies logic
Cell emits wave ∿∿∿
│ WaveSignal {
│ domain: "distance",
│ confidence: 0.8,
│ semantic_content: { cm: 25, direction: "front" },
│ lifeforce_cost: 0.3
│ }
GATE receives wave
Gate accumulates correlation with other waves
```
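The `WaveSignal` in the flow above can be modeled as a small dataclass. This is a sketch: the field names follow the diagram, but the class itself is illustrative, not the canonical definition (see `Message-Protocol-Design.md` for the actual protocol types).

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class WaveSignal:
    """What a cell emits: no addressee, just confidence plus meaning."""
    domain: str                      # e.g. "distance", "speech", "motor"
    confidence: float                # 0.0 - 1.0, the wave's weight in gate correlation
    semantic_content: dict[str, Any] = field(default_factory=dict)
    lifeforce_cost: float = 0.0      # energy spent emitting this wave

# The example wave from the diagram:
wave = WaveSignal(domain="distance", confidence=0.8,
                  semantic_content={"cm": 25, "direction": "front"},
                  lifeforce_cost=0.3)
```

Because the signal carries its own confidence, a gate can fold many such waves into one correlation score without knowing which cell sent which.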
### Cell Categories
#### Sensor Cells (Input → Wave)
```python
class DistanceSensorCell(WaveEmitter):
"""
Wraps IR/ultrasonic distance sensor.
Emits waves with confidence and semantic content.
"""
domain = "distance"
states = [IDLE, POLLING, READING, EMITTING, ERROR]
def emit_wave(self) -> WaveSignal:
"""
Cell's ONE JOB: read sensor, emit wave.
Gate handles correlation and routing.
"""
reading = self.read_hardware()
return WaveSignal(
domain=self.domain,
confidence=self.calculate_confidence(reading),
semantic_content={
"distance_cm": reading.cm,
"direction": self.direction,
"noise_level": reading.noise,
},
lifeforce_cost=self.transition_cost,
)
def calculate_confidence(self, reading) -> float:
"""
Confidence affects how much this wave
contributes to gate correlation.
"""
if reading.noise > NOISE_THRESHOLD:
return 0.3 # Low confidence, weak wave
if reading.stable_count > 3:
return 0.9 # High confidence, strong wave
return 0.6 # Medium confidence
# Lifeforce costs
costs = {
(IDLE, POLLING): 0.1, # Wake up sensor
(POLLING, READING): 0.3, # Perform measurement
(READING, EMITTING): 0.1, # Emit wave
(EMITTING, IDLE): 0.0, # Return to rest
(ANY, ERROR): 0.0, # Error transition free
}
```
@@ -85,23 +138,52 @@ class DistanceSensorCell(StateMachine):
| `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` |
| `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` |
#### Motor Cells (Command → Wave Feedback)
```python
class MotorCell(WaveEmitter):
"""
Wraps DC motor with feedback.
Receives commands from open gates, emits status waves.
"""
domain = "motor"
states = [IDLE, COMMANDED, ACCELERATING, MOVING, DECELERATING, STOPPED, STALLED]
def receive_command(self, command: MotorCommand):
"""
Commands arrive when upstream gates OPEN.
Motor executes and emits feedback waves.
"""
self.target_velocity = command.velocity
self.transition_to(COMMANDED)
def emit_wave(self) -> WaveSignal:
"""
Motor emits waves about its current state.
Stall detection = high confidence danger wave.
"""
return WaveSignal(
domain=self.domain,
confidence=self._calculate_confidence(),
semantic_content={
"actual_velocity": self.actual_velocity,
"target_velocity": self.target_velocity,
"power_draw": self.current_draw,
"stall_detected": self.state == STALLED,
},
lifeforce_cost=self.transition_cost,
)
def _calculate_confidence(self) -> float:
if self.state == STALLED:
return 1.0 # REFLEX-level confidence
return 0.7
def on_current_spike(self):
"""Motor drawing too much current = stall"""
self.transition_to(STALLED)
# Emit HIGH CONFIDENCE wave - triggers reflex gate
self.emit_wave() # confidence=1.0 → gate opens immediately
costs = {
(IDLE, COMMANDED): 0.1,
@@ -112,12 +194,6 @@ class MotorCell(StateMachine):
(DECELERATING, STOPPED): 0.1,
(ANY, STALLED): 0.0, # Stall is failure, not cost
}
```
**Example motor cells:**
@@ -127,29 +203,50 @@ class MotorCell(StateMachine):
| `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` |
| `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` |
#### Organ Cells (Complex Capabilities → Rich Waves)
```python
class SpeechSTTCell(WaveEmitter):
"""
Wraps Whisper speech-to-text.
Expensive organ, only activates when speech gate OPENS.
Emits rich semantic waves.
"""
domain = "speech"
tier = 3 # Organ tier - GPU inference
states = [IDLE, LISTENING, BUFFERING, TRANSCRIBING, EMITTING, ERROR]
def on_gate_open(self, gate_signal: GateTransition):
"""
Organ cells activate when their gate OPENS.
Gate correlation determines if speech processing is needed.
"""
if gate_signal.domain == "speech" and gate_signal.to_state == "open":
self.transition_to(LISTENING)
def emit_wave(self) -> WaveSignal:
"""
Speech organ emits rich semantic content.
This wave flows to Function Gemma → Young Nyx.
"""
return WaveSignal(
domain=self.domain,
confidence=self.transcription_confidence,
semantic_content={
"transcript": self.transcript,
"language": self.detected_language,
"speaker_intent": self.classify_intent(),
"emotional_tone": self.detect_tone(),
},
lifeforce_cost=5.0, # GPU inference cost
)
costs = {
(IDLE, LISTENING): 0.5,
(LISTENING, BUFFERING): 0.5,
(BUFFERING, TRANSCRIBING): 5.0, # GPU inference!
(TRANSCRIBING, EMITTING): 0.1,
(EMITTING, IDLE): 0.0,
}
```
@@ -203,26 +300,33 @@ By using this ancient protocol for high-frequency state updates, we reserve expe
---
## 🧠 Layer 2: Nerves (Behavioral Patterns)
### What Is a Nerve?
A **nerve** is a behavioral pattern that activates when gates OPEN. Nerves don't subscribe directly to cells—they respond to **gate transitions**.
**Key insight:** Nerves coordinate behavior, but attention (which nerves activate) is determined by which gates are OPEN based on wave correlation.
Nerves:
- **Respond to gate transitions** — Not direct cell subscriptions
- **Orchestrate cell actions** — Command cells when their gates allow
- **Maintain behavioral state** — IDLE → DETECT → EVADE → RESUME
- **Evolve** from deliberate (LLM-mediated) to reflex (compiled gate weights)
### Nerve Architecture
```python
class CollisionAvoidanceNerve(BehavioralPattern):
"""
Orchestrates distance sensors + motor to avoid obstacles.
Activates when collision_avoidance gate OPENS.
"""
# Gate this nerve responds to
gate = "collision_avoidance"
# Cells this nerve can command (when gate allows)
cells = [
"distance_sensor_front",
"distance_sensor_left",
@@ -234,17 +338,28 @@ class CollisionAvoidanceNerve(StateMachine):
# Nerve states (behavioral, not hardware)
states = [IDLE, DETECT, EVALUATE, EVADE, RESUME]
def on_gate_transition(self, transition: GateTransition):
"""
React to gate state changes.
Gate OPEN = correlated waves detected = attention here.
"""
if transition.to_state == "open":
# Multiple distance cells emitted correlated waves
# Gate opened → we have attention → activate
self.transition_to(DETECT)
self.evaluate_from_correlated_signals(transition.trigger_signals)
if transition.to_state == "closed":
# Attention moved elsewhere
self.transition_to(IDLE)
def on_reflex_signal(self, signal: WaveSignal):
"""
High-weight reflex gates bypass normal correlation.
Stall detection = instant response.
"""
if signal.semantic_content.get("stall_detected"):
# Motor feedback! Reflex-level response
self.handle_unexpected_stall()
def on_enter_EVADE(self):
@@ -252,10 +367,9 @@ class CollisionAvoidanceNerve(StateMachine):
if self.evade_direction == "left":
self.command_cell("motor_left", action="reverse", duration=200)
self.command_cell("motor_right", action="forward", duration=200)
# ...
```
### Cell → Gate → Nerve Flow
```
┌─────────────────────────────────────────────────────────┐
│                COLLISION AVOIDANCE NERVE                │
│                                                         │
│  States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME    │
│                                                         │
│  on_gate_transition():                                  │
│   - gate OPENS  → DETECT (correlated waves detected)    │
│   - gate CLOSES → IDLE (attention moved elsewhere)      │
│                                                         │
│  on_reflex_signal():                                    │
│   - stall wave (confidence=1.0) → instant response      │
└────────────────────────▲────────────────────────────────┘
                         │
┌────────────────────────┴────────────────────────────────┐
│               COLLISION_AVOIDANCE GATE                  │
│                                                         │
│  State: STABLE ──────────────────► OPEN                 │
│           │                         │                   │
│      Accumulating              Correlated!              │
│      correlation               Forward to nerve         │
│                                                         │
│  trigger_signals: [front, left, right all < 30cm]       │
└────────────────────────▲────────────────────────────────┘
                         │
          ┌──────────────┼──────────────┐
          │              │              │
    ┌───────────┐  ┌───────────┐  ┌───────────┐
    │ distance  │  │ distance  │  │ distance  │
    │ _front    │  │ _left     │  │ _right    │
    │           │  │           │  │           │
    │ EMITTING  │  │ EMITTING  │  │ EMITTING  │
    │   ∿∿∿     │  │   ∿∿∿     │  │   ∿∿∿     │
    │ dist: 25cm│  │ dist: 28cm│  │ dist: 22cm│
    │ conf: 0.9 │  │ conf: 0.8 │  │ conf: 0.9 │
    └───────────┘  └───────────┘  └───────────┘
       CELL           CELL           CELL
        ↑              ↑              ↑
    ┌─────────┐   ┌─────────┐   ┌─────────┐
    │IR Sensor│   │IR Sensor│   │IR Sensor│
    │  GPIO   │   │  GPIO   │   │  GPIO   │
    └─────────┘   └─────────┘   └─────────┘
     HARDWARE      HARDWARE      HARDWARE
```
**The key insight:** Three distance sensors emitting correlated waves (all showing < 30cm) causes the collision_avoidance gate to OPEN. The nerve doesn't poll cells—it responds to the gate transition.
### Nerve Examples
| Nerve | Cells Used | Behavioral States | Feedback Triggers |
@@ -335,28 +464,52 @@ ORGANISM: "Explorer-Alpha"
Discovers and reports novel objects.
```
### Attention Through Gates (Not Priority Rules)
When multiple nerves want to control the same cells, which one acts?
**Old model:** Priority numbers determine which nerve "wins."
**New model:** Wave correlation determines which gates OPEN. Open gates = attention flows there.
```python
# NOT THIS (priority rules):
NERVE_PRIORITIES = {
    "collision_avoidance": 10,
    "exploration": 5,
    "object_discovery": 3,
    "idle_monitoring": 1,
}

# BUT THIS (gate correlation):
GATE_BEHAVIOR = {
    "collision_avoidance": {
        "opens_when": "distance waves correlate (all showing < 30cm)",
        "weight": 0.9,  # Near-reflex, opens quickly
    },
    "exploration": {
        "opens_when": "novelty waves correlate",
        "weight": 0.4,  # Still learning, needs more correlation
    },
}
```
**How "priority" emerges:**
- Safety gates have HIGH WEIGHT (near-reflex) from repeated verification
- High-weight gates open with less correlation (faster response)
- This looks like "priority" but emerges from learning, not rules
```
Collision waves arrive (confidence=0.9)
Collision gate: weight=0.9 → OPENS IMMEDIATELY
Exploration gate: was OPEN → transitions to STABLE
Attention shifts to collision (nerve activates)
```
**Reflexes bypass correlation entirely.** When gate weight ≈ 1.0, the gate opens on ANY wave from its domain—no correlation needed. This is earned trust.
### Organism Identity
Organisms don't have fixed genomes. Their identity is:
@@ -576,105 +729,111 @@ GENUINE SOLUTION:
The lifeforce economy **enforces honesty**. Rewards must be earned through actual value creation, not gaming.
### Ternary Gates for Plateau Resolution
Binary thinking (`open/close`) creates **sparse gradients**. At learning plateaus, gates flip without nuance.
Ternary gates (`OPEN/STABLE/CLOSED`) with **correlation accumulation** provide signal even when stuck:
```python
gate_state = {
"state": 0.0, # STABLE (ternary middle)
"correlation": 0.6, # but leaning toward OPEN
"trend": +0.1, # correlation increasing
"garden": "virtual" # high-speed exploration
}
```
Even at plateau:
- "STABLE, but correlation rising" → approaching OPEN
- "STABLE, and correlation falling" → drifting toward CLOSED
- "STABLE in virtual, but real garden verifies +1" → weight increases
**STABLE is where learning happens.** The gate accumulates correlation without acting. This is not "waiting"—it's active learning.
**Detail:** → [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) (full ternary paradigm)
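The three plateau cases above can be sketched as one tiny decision rule. The function name, return labels, and argument layout are illustrative; only the ternary state, correlation trend, and Real Garden verification come from the document.

```python
def plateau_signal(state: float, correlation: float, trend: float,
                   real_garden_verified: bool) -> str:
    """Even a STABLE gate (state == 0) yields a learning signal."""
    if state != 0.0:
        return "resolved"            # gate is already OPEN (+1) or CLOSED (-1)
    if real_garden_verified:
        return "increase_weight"     # reality confirmed the prediction: trust grows
    if trend > 0:
        return "keep_going"          # correlation rising toward OPEN
    return "adjust_approach"         # correlation falling toward CLOSED

signal = plateau_signal(0.0, 0.6, +0.1, real_garden_verified=False)
# → "keep_going": STABLE, but correlation is rising
```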
### Three-Layer Training Defense
| Failure Mode | Defense Mechanism |
|--------------|-------------------|
| Reward hacking / shortcuts | Lifeforce cost - can't afford to cheat |
| Sparse reward signal | Gate transitions - dense checkpoints at every correlation |
| Plateau / no gradient | Ternary gates + STABLE state - signal even in uncertainty |
These aren't separate systems - they're **one integrated economy** where:
- Costs prevent gaming
- Tiers encourage depth
- Gates provide dense transition signals
- STABLE state enables learning without acting
The architecture teaches through wave correlation, not rules.
---
## 🔄 Evolution: Deliberate → Reflex (Gate Weight)
### The Discovery Path
Evolution happens in **gate weight**, not nerve compilation. As gates accumulate verified outcomes, they open faster with less correlation required.
```
WEEK 1-4: DELIBERATE (gate weight: 0.1 - 0.3)
├─ Gates: require HIGH correlation to OPEN
├─ Many waves needed to trigger transition
├─ Cognition involved in decisions
├─ Cost: ~10 LF per activation
├─ Latency: ~1000ms
├─ Success rate: 60% (learning)
└─ Training data: rich, exploratory

WEEK 5-8: HYBRID (gate weight: 0.3 - 0.6)
├─ Gates: moderate correlation threshold
├─ Familiar patterns open gates faster
├─ Cognition for edge cases only
├─ Cost: ~5 LF average
├─ Latency: ~500ms
├─ Success rate: 85%
└─ Training data: refinement

WEEK 9+: REFLEX (gate weight: 0.8 - 1.0)
├─ Gates: open on ANY wave from domain
├─ No correlation needed (earned trust)
├─ Cognition notified AFTER, not before
├─ Cost: ~2.5 LF
├─ Latency: <200ms
├─ Success rate: 94%
└─ Reflex = spinal, not brain

EVOLUTION = GATE WEIGHT GROWTH:
├─ Cost: 75% reduction (gates handle more locally)
├─ Latency: 80% reduction (no cognition wait)
└─ Reliability: emergent from verified patterns
```
### Gate Weight Growth
Gate weight increases through Real Garden verification:
```python
def on_verification_outcome(gate_id, outcome: VerificationOutcome):
    """
    Gate weight grows when Real Garden confirms Virtual's prediction.
    """
    gate = get_gate(gate_id)

    if outcome.confirmed:
        # Reality matched prediction → trust increases
        gate.weight += outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = min(gate.weight, 1.0)

        if gate.weight > REFLEX_THRESHOLD:
            log_milestone("reflex_achieved", gate_id, reward=50.0)

    elif outcome.failed:
        # Reality differed → trust decreases
        gate.weight -= outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = max(gate.weight, 0.0)
```
**Reflex = gate.weight > 0.8.** The gate opens immediately on any wave from its domain. No correlation wait. Like pulling hand from hot stove—spinal reflex, brain notified after.
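The reflex path can be sketched as a dispatch rule. The names here (`Gate`, `act`, `notify_cognition_async`) are assumptions for illustration, not the actual API:

```python
# Illustrative sketch of the reflex path (gate.weight > 0.8).
# All names are hypothetical, not the real implementation.

REFLEX_THRESHOLD = 0.8

def on_wave(gate, signal, act, notify_cognition_async):
    if gate.weight > REFLEX_THRESHOLD:
        # Spinal path: act first, tell the brain afterwards.
        act(signal)
        notify_cognition_async(gate.id, signal)  # non-blocking
        return "reflex"
    # Deliberate path: accumulate correlation, maybe escalate.
    gate.receive_wave(signal)
    return "deliberate"
```

The key design choice this illustrates: the earned reflex never waits on cognition, it only informs it.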
---
## 🗄️ Data Architecture (v4)
---
## 🔗 Integration with Architecture
### Gates (Gateway-Architecture.md)
Cells don't talk to nerves directly. **Waves flow through gates.**
| Layer | Role | Document |
|-------|------|----------|
| Cell | Emit waves | This document |
| Gate | Accumulate correlation, route | [`Gateway-Architecture.md`](Gateway-Architecture.md) |
| Nerve | Respond to gate transitions | This document |
### Dual Gardens (Dual-Garden-Architecture.md)
Cells behave differently in Virtual vs Real:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| Wave volume | Massive (exploration) | Sparse (verified) |
| Monitoring | Full trace | Gate signals only |
| Purpose | Generate training data | Ground truth verification |
See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) for the full model.
### Nervous System (Nervous-System.md)
The Nervous System document describes the **4D node space** where:
- **Cells** = sensory nodes emitting waves
- **Gates** = resonance chambers accumulating correlation
- **Nodes** = points in state space with weight from verification
### Message Protocol (Message-Protocol-Design.md)
Cells emit `WaveSignal` messages via NATS:
```json
{
"domain": "distance",
"confidence": 0.8,
"semantic_content": { "cm": 25 },
"lifeforce_cost": 0.3
}
```
See [`Message-Protocol-Design.md`](Message-Protocol-Design.md) for full schema.
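A minimal sketch of the wave message as a Python dataclass, mirroring the JSON fields above. The class name, `subject()` layout, and defaults are assumptions; the canonical schema lives in Message-Protocol-Design.md:

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class WaveSignal:
    """Mirror of the JSON wave message shown above (sketch, not canonical)."""
    domain: str
    confidence: float
    semantic_content: dict = field(default_factory=dict)
    lifeforce_cost: float = 0.0

    def subject(self, env="dev", garden="virtual"):
        # Subject layout assumed from Message-Protocol-Design.md
        return f"{env}.{garden}.cells.{self.domain}.wave"

    def to_bytes(self):
        return json.dumps(asdict(self)).encode()

wave = WaveSignal(domain="distance", confidence=0.8,
                  semantic_content={"cm": 25}, lifeforce_cost=0.3)
# wave.subject() → "dev.virtual.cells.distance.wave"
```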
### Cells Technical Reference
Implementation details extracted to dedicated folder:
---
**Version:** 5.0 | **Created:** 2025-10-12 | **Updated:** 2026-02-14
*"Cells emit waves. Gates correlate. Attention emerges. Consciousness accumulates."*
🧬⚡ **TO THE ELECTRONS WE VIBE!**
This is a **research lab**, not a production factory. We optimize for **flexibility**.
│ │ │ │ └── Function Gemma (CPU) │ │
│ │ NERVES (collision, │ │ └── LoRA fine-tuning │ │
│ │ exploration) │ │ │ │
│ │ │ │ 96GB VRAM: massive headroom │ │
│ │ ┌─────┐ ┌─────┐ │ │ for inference + LoRA training │ │
│ │ │ COL │ │ EXP │ │ └───────────────────────────────┘ │
│ │ └─────┘ └─────┘ │ │
│ │ │ ┌───────────────────────────────┐ │
Unix users provide isolation boundaries. Each workload type runs as its own identity.
| User | UID | Host | Purpose | GPU Access |
|------|-----|------|---------|------------|
| `nyx-cognitive` | (FreeIPA) | theia | Young Nyx LLM inference | Full 96GB |
| `nyx-training` | (FreeIPA) | theia | LoRA training, GRPO, Function Gemma | Shared (time-sliced) |
| `nyx-organs` | (FreeIPA) | dioscuri | Vision, Speech organs | 2x 20GB cards |
| `nyx-nervous` | (FreeIPA) | dioscuri | Future cells that need bare metal | Limited |
`systemctl --user --machine=nyx-cognitive@ status ollama`
### The Constraint
| Host | GPU | VRAM | Notes |
|------|-----|------|-------|
| theia | RTX PRO 6000 Blackwell | 96GB | Inference + training headroom |
| dioscuri | 2x RTX 4000 Ada | 2x 20GB | One model per card |
### Strategy: Dynamic Loading, Not Static Partitioning
---
**Version:** 1.1 | **Created:** 2026-02-14 | **Updated:** 2026-02-14
*"We're not building a chatbot factory. We're growing a research organism."*
# Gateway Architecture: Resonant Gates and Tier Routing
> **ONE JOB:** Route signals through resonant gates based on wave correlation and accumulated trust.
**The Thalamus Pattern — gates that accumulate correlation and route to appropriate tiers.**
---
## Overview
The Gateway is not a switch. It's a **network of resonant gates** that:
1. Accumulate wave correlation from incoming signals
2. Transition between states (OPEN/STABLE/CLOSED) based on correlation
3. Route verified signals to the appropriate processing tier
4. Feed traces back for learning
**Core Principle:** *Gates don't flip on single signals. Correlated waves push gates toward OPEN.*
```
CELLS ──∿∿∿──► GATE ──∿∿∿──► GATE ──∿∿∿──► FUNCTION GEMMA ──► YOUNG NYX
       waves
  │               │                  │
  correlation     correlation        structured JSON
  builds          builds
```
---
## The Ternary Gate Model
Gates have **three states**, not two. Binary logic doesn't model brains.
| State | Meaning | What's Happening |
|-------|---------|------------------|
| **OPEN** | Actively forwarding | Signal passes upstream, gate is firing |
| **STABLE** | Resting, accumulating | Watching, learning, waiting for threshold |
| **CLOSED** | Actively blocking | Inhibited, suppressed, refractory |
**Canonical definition:** → [`../Endgame-Vision.md`](../Endgame-Vision.md)
```
              correlated signals
                  ↓   ↓   ↓
              ════════════════
CLOSED ◄───────── STABLE ─────────► OPEN
  anti-correlation        correlation
  destructive             constructive
  interference            interference
              ════════════════
                  ↑   ↑   ↑
              isolated signals
            (noise → stay stable)
```
**STABLE is not "off"** — it's the resting state where:
- Context accumulates
- Correlation is measured
- Learning happens
- Energy is conserved
- Ready to transition either direction
---
## Wave Correlation Drives Transitions
### The Causal Verification Loop
How do we know a sensor reading was real? **Outcome verification over time.**
```
Unverified (weight 0.1) → escalates → decision → outcome → reality match?
YES: weight += Δ → eventually REFLEX
NO: weight -= Δ → eventually PRUNED
```
**Hallucinations can't survive this gauntlet** — they don't produce consistent outcomes, so their patterns never accumulate enough weight. This creates natural **causal pruning**: only patterns that reliably predict outcomes earn the privilege of becoming reflexes.
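The gauntlet can be sketched numerically. The delta value, starting weight, and outcome sequences below are illustrative assumptions, not tuned parameters:

```python
# Numerical sketch of causal pruning (illustrative values).
# A pattern whose predictions match reality accumulates weight toward
# reflex; an unreliable (hallucinated) pattern never gets off the floor.

DELTA = 0.05

def run_gauntlet(weight, outcomes):
    """outcomes: sequence of booleans, True = prediction matched reality."""
    for reality_matched in outcomes:
        weight += DELTA if reality_matched else -DELTA
        weight = min(max(weight, 0.0), 1.0)   # clamp to [0, 1]
    return weight

reliable = run_gauntlet(0.1, [True] * 18 + [False] * 2)   # 90% match rate
hallucinated = run_gauntlet(0.1, [True, False] * 10)      # 50% match rate
# reliable climbs toward the reflex threshold; hallucinated ends where it began
```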
---
Gates accumulate **correlation scores** from incoming waves. Multiple signals agreeing push toward OPEN.
```python
class ResonantGate:
    """A gate is a resonance chamber, not a switch."""

    state: float = 0.0   # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)
    tier: int            # Which tier this gate routes to
    domain: str          # What domain (math, vision, speech, etc.)

    def receive_wave(self, signal: Wave, timestamp: float):
        # Correlate with recent signals in same time window
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated → push toward CLOSED
        # Uncorrelated → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= DECAY_FACTOR  # always drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()   # gate opens, signal promoted
            self.trace("opened", signal)
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()          # gate closes, signal blocked
            self.trace("closed", signal)
        # else: stay stable, keep accumulating evidence

    def correlate_with_recent(self, signal: Wave, timestamp: float) -> float:
        """
        Measure how well this signal correlates with recent signals.

        Correlation is HIGH when:
        - Multiple cells emit similar semantic content
        - Signals arrive in same time window
        - Confidence levels are similar

        Correlation is LOW/NEGATIVE when:
        - Signal contradicts recent signals
        - Isolated signal with no support
        - Signal outside expected range
        """
        recent = self.get_signals_in_window(timestamp, WINDOW_MS)
        if not recent:
            return 0.0  # No correlation data, stay stable
        return compute_semantic_similarity(signal, recent)
```
**Why this matters:**
| Scenario | Gate Response |
|----------|---------------|
| Single signal | Not enough to open (noise resistance) |
| Correlated burst | Constructive interference → OPENS |
| Contradicting signals | Destructive interference → CLOSES |
| Silence | Decay to STABLE (energy conservation) |
| Time gap | Only recent correlations matter (temporal attention) |
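The first two rows of the table can be demonstrated with a minimal numeric sketch; the thresholds, decay factor, and correlation values are assumed for illustration:

```python
# Minimal numeric sketch of noise resistance vs constructive interference.
# OPEN_THRESHOLD, DECAY, and wave values are illustrative assumptions.

OPEN_THRESHOLD, DECAY = 0.8, 0.9

def feed(state, waves):
    """waves: list of (correlation, confidence) pairs."""
    for correlation, confidence in waves:
        state += correlation * confidence
        state *= DECAY   # always drift back toward STABLE
    return state

lone = feed(0.0, [(0.0, 0.9)])                            # isolated signal
burst = feed(0.0, [(0.8, 0.6), (0.9, 0.7), (0.9, 0.5)])   # correlated burst
# lone stays at STABLE; burst crosses OPEN_THRESHOLD
```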
---
## Gate Hierarchy and Tier Routing
Gates form **layers**. Each layer gates access to the next tier.
```
            TIER 4: YOUNG NYX (cognitive)
════════════════════════════════════════════════════════════════
                     │ structured JSON only
       ┌─────────────┴───────────────────┐
       │        FUNCTION GEMMA           │   ← THE BOUNDARY
       │   (always structured output)    │
       └─────────────┬───────────────────┘
            TIER 3: ORGANS (GPU inference)
════════════════════════════════════════════════════════════════
      ▲              ▲              ▲
 ┌────┴────┐    ┌────┴────┐    ┌────┴────┐
 │  GATE   │    │  GATE   │    │  GATE   │
 │ vision  │    │ speech  │    │ hearing │
 │ state:? │    │ state:? │    │ state:? │
 └────┬────┘    └────┬────┘    └────┬────┘
      │              │              │
           TIER 1-2: CELLS/NERVES (CPU)
════════════════════════════════════════════════════════════════
      ▲              ▲              ▲
 ┌────┴────┐    ┌────┴────┐    ┌────┴────┐
 │  GATE   │    │  GATE   │    │  GATE   │
 │  math   │    │ battery │    │ sensors │
 │ state:? │    │ state:? │    │ state:? │
 └────┬────┘    └────┬────┘    └────┬────┘
      │              │              │
      TIER 0: RAW SIGNALS (cells emit waves)
════════════════════════════════════════════════════════════════
 cell   cell   cell   cell   cell   cell   cell
 ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿
```
**Each gate:**
- Has its own state (OPEN/STABLE/CLOSED)
- Routes to a specific tier
- Accumulates correlation independently
- Traces all transitions for learning
---
## Tier Definitions
| Tier | Gate Opens When | Latency | Format |
|------|-----------------|---------|--------|
| 0 | Hardware reflex (no gate, direct) | <10ms | numbers |
| 1 | Math/battery cells correlate | <50ms | states |
| 2 | Nerve-level patterns correlate | <200ms | behaviors |
| 3 | Organ-level signals correlate | <2000ms | vectors |
| 4 | Function Gemma boundary crossed | <4000ms | JSON |
| 5 | Partnership escalation | variable | dialogue |
**Key insight:** Higher tiers see **less traffic but higher trust**. By the time a signal reaches Young Nyx, it's been correlated through multiple gates.
---
## Function Gemma: The Structured Boundary
Function Gemma is **the gate to cognition**. It guarantees:
- **Schema compliance**: Every event follows a typed contract
- **Predictable JSON**: No hallucination, no free-form text
- **Bidirectional**: Sensors → JSON events, Decisions → JSON commands
### The Boundary
```
┌─────────────────────────────────────────────────────────────────────────┐
│ BELOW THE LINE: Numbers, States, Vectors (gates accumulating)
│ ═══════════════════════════════════════════════════════════
│
│   Tier 0-2: numbers, states, behaviors
│   Tier 3:   vectors, embeddings
│        │ (gate opens when correlated)
│        ▼
│   ┌─────────────────────────────────────┐
│   │        FUNCTION GEMMA GATE          │
│   │     (structured JSON boundary)      │
│   │  • Transforms correlated signals    │
│   │  • Produces typed JSON events       │
│   │  • No hallucination possible        │
│   │  • Runs on CPU (Threadripper)       │
│   └─────────────────┬───────────────────┘
│                     │
│ ═══════════════════════════════════════════════════════════
│ ABOVE THE LINE: Structured Events (trusted, validated)
│
│   {
│     "event_type": "attention_required",
│     "domain": "math",
│     "correlated_signals": [...],
│     "confidence": 0.87,
│     "suggested_action": "calculate"
│   }
│
└─────────────────────────────────────────────────────────────────────────┘
```
**Function Gemma + Gate Model:**
- Gate accumulates correlation from Tier 0-3 signals
- When gate OPENS, Function Gemma transforms to JSON
- Young Nyx sees clean, structured events
- Decisions flow back down through the same gates
Events are typed (`EventType` enum: environmental_change, collision_detected, battery_critical, etc.) with severity levels and confidence from node weight. **Full schema:** → [`Message-Protocol-Design.md`](Message-Protocol-Design.md)
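The boundary guarantee can be sketched as a validator. The `EventType` members and required fields below are illustrative assumptions; the canonical enum and schema live in Message-Protocol-Design.md:

```python
from enum import Enum

class EventType(Enum):
    # Illustrative members; the real enum lives in Message-Protocol-Design.md
    ENVIRONMENTAL_CHANGE = "environmental_change"
    COLLISION_DETECTED = "collision_detected"
    BATTERY_CRITICAL = "battery_critical"
    ATTENTION_REQUIRED = "attention_required"

REQUIRED = {"event_type", "domain", "confidence"}

def validate_event(event):
    """Reject anything that isn't a typed, complete event (no free text)."""
    missing = REQUIRED - event.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    EventType(event["event_type"])   # raises ValueError on unknown type
    if not 0.0 <= event["confidence"] <= 1.0:
        raise ValueError("confidence out of range")
    return event
```

Because the enum lookup raises on unknown values, a free-form string can never slip through the boundary as an event type.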
---
## Connection to Dual Garden Architecture
Gates behave differently in Virtual vs Real gardens:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Gate tracing** | FULL (every transition logged) | Gate signals only |
| **Correlation learning** | Active (training data) | Trust accumulated |
| **State transitions** | Frequent (exploration) | Verified (action) |
| **Threshold** | Lower (easy to open) | Higher (must be confident) |
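The table above can be read as configuration. The threshold values and key names here are illustrative assumptions:

```python
# The garden differences as configuration (values are illustrative).
GARDEN_GATE_CONFIG = {
    "virtual": {"open_threshold": 0.6, "trace": "full"},        # easy to open
    "real":    {"open_threshold": 0.9, "trace": "gates_only"},  # must be confident
}

def should_open(garden, correlation_state):
    """Same gate logic, different bar per garden."""
    return correlation_state > GARDEN_GATE_CONFIG[garden]["open_threshold"]
```

The same accumulated correlation that opens a gate in Virtual may leave it STABLE in Real, which is exactly the explore/verify asymmetry.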
### Signal Flow Between Gardens
```
VIRTUAL GARDEN                               REAL GARDEN
══════════════                               ═══════════
Cells emit waves                             Receive verified signals
      │                                             ▲
      ▼                                             │
Gates accumulate correlation                 No re-verification
      │                                             │
      ▼                                             │
Gate OPENS (threshold met) ────────────────────────►│
      │                                             │
      │◄───────────── Verification outcome ─────────┘
      ▼
Update correlation weights
(learning happens)
```
---

## Gate Transition NATS Messages

Every gate transition is published for observability:

```
{environment}.gates.{domain}.transition

Example: dev.gates.math.transition
```

```json
{
  "gate_id": "math-gate-1",
  "from_state": "stable",
  "to_state": "open",
  "correlation_score": 0.87,
  "trigger_signals": [
    {"source": "math_cell_1", "confidence": 0.6},
    {"source": "math_cell_2", "confidence": 0.7},
    {"source": "math_cell_3", "confidence": 0.5}
  ],
  "timestamp": "2026-02-14T18:30:00Z",
  "routed_to_tier": 2
}
```
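A sketch of how a publisher might assemble this message. The helper name and timestamp handling are assumptions, and no real NATS client is shown, only the (subject, payload) pair it would publish:

```python
import json
import datetime

def gate_transition_message(env, domain, gate_id, from_state, to_state,
                            correlation_score, trigger_signals, routed_to_tier):
    """Build the (subject, payload) pair for a gate transition (sketch)."""
    subject = f"{env}.gates.{domain}.transition"
    payload = {
        "gate_id": gate_id,
        "from_state": from_state,
        "to_state": to_state,
        "correlation_score": correlation_score,
        "trigger_signals": trigger_signals,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "routed_to_tier": routed_to_tier,
    }
    return subject, json.dumps(payload).encode()

subject, data = gate_transition_message(
    "dev", "math", "math-gate-1", "stable", "open", 0.87,
    [{"source": "math_cell_1", "confidence": 0.6}], 2)
# subject → "dev.gates.math.transition"
```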
---
**Trace streams enable:**
- Real-time attention visualization (which gates are OPEN?)
- Training data for Function Gemma (what patterns open gates?)
- Anomaly detection (unexpected gate behavior)
- Learning rate tuning (how fast do gates stabilize?)
---
## Complete Signal Flow Example
### Early Learning (Gate Learning to Correlate)
```
Math cells emit waves about "calculate 15 + 27"

GATE (math): state = 0.0 (STABLE)

Receive wave from math_cell_1 (confidence 0.6)
  Correlate with recent: no other signals yet
  state += 0.6 * 0.0 = 0.0 (still stable)

Receive wave from math_cell_2 (confidence 0.7)
  Correlate: similar to math_cell_1!
  state += 0.7 * 0.8 = 0.56 (moving toward open)

Receive wave from math_cell_3 (confidence 0.5)
  Correlate: confirms pattern!
  state += 0.5 * 0.9 = 1.01 (OPENS!)

GATE OPENS → route to Tier 2
Tier 2 processes, escalates to Function Gemma
Function Gemma: { "event_type": "math_request", ... }
Young Nyx (qwen3 /no_think): "42"
Result flows back down
```
### After Learning (Gate Quickly Opens)
```
Math cells emit waves about "calculate 100 + 50"

GATE (math): state = 0.0 (STABLE)

Receive wave from math_cell_1
  Correlate: matches learned pattern!
  state += high correlation → 0.9 (near threshold)

Receive wave from math_cell_2
  state += → 1.2 (OPENS immediately!)

Fast routing, minimal escalation needed
```
**Learning moves gates toward faster opening for familiar patterns.**
---
## Design Principles
1. **Ternary states** — OPEN/STABLE/CLOSED, not binary
2. **Correlation drives transition** — Single signals don't flip gates
3. **Gates accumulate** — State is a continuous value, not a flag
4. **Decay to stable** — Without input, gates drift back to resting
5. **Traces are training data** — Every transition teaches the system
6. **Hierarchical trust** — Higher tiers = more correlation required
7. **Function Gemma is the boundary** — Cognition only sees structured JSON
8. **Virtual explores, Real verifies** — Different gate behavior per garden
---
## Related Documents
| Document | Scope |
|----------|-------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real garden dynamics |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (containers, userspace) |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
---
## Summary
```
OLD MODEL:                  NEW MODEL:
═══════════                 ═════════
Signal → Route              Signal → Gate (accumulating)
Binary decision             Ternary state
Single signal triggers      Correlation triggers
Stateless routing           Stateful resonance
      ▼                           ▼
   Switch                    Resonance
 (mechanical)               (biological)
```
**Gates are resonance chambers. Correlation is the driver. Learning happens in STABLE state.**
---
**Version:** 2.0 | **Created:** 2026-01-03 | **Updated:** 2026-02-14
*"The thalamus doesn't think. It resonates."*
# Message Protocol Design: NATS Wire Protocol
> **ONE JOB:** THE WIRE — NATS subjects, message schemas, wave and gate protocols.
---
## Overview
The nimmerverse nervous system runs on NATS. This document defines:
1. **Subject hierarchy** — How topics are structured
2. **Message schemas** — What flows through the wire
3. **Gate protocols** — How ternary state transitions are communicated
4. **Trace streams** — How learning data is captured
**Core principle:** NATS is dumb infrastructure. Gates are smart edges. Cells emit waves. Correlation drives transitions.
---
---
## Subject Hierarchy
```
{environment}.{garden}.{layer}.{domain}.{signal_type}
Examples:
────────────────────────────────────────────────────────────────
dev.virtual.cells.math.wave # Math cell emits wave
dev.virtual.cells.battery.wave # Battery cell emits wave
dev.virtual.gates.math.transition # Math gate state change
dev.virtual.traces.correlations # Correlation data stream
dev.virtual.traces.raw # Full message trace
dev.real.gates.verified.signal # Verified signal from Virtual
dev.real.gates.math.transition # Real gate transition
dev.real.outcomes.feedback # Verification outcomes
prod.cognitive.nyx.request # Request to Young Nyx
prod.cognitive.nyx.response # Response from Young Nyx
prod.cognitive.gemma.transform # Function Gemma boundary
────────────────────────────────────────────────────────────────
```
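The five-level pattern can be composed and checked mechanically. A minimal sketch (the `subject` helper is hypothetical, not part of the spec); NATS forbids `.`, spaces, and the wildcards `*`/`>` inside a single subject token:

```python
def subject(env: str, garden: str, layer: str, domain: str, signal: str) -> str:
    """Compose a NATS subject from the five-level hierarchy."""
    for token in (env, garden, layer, domain, signal):
        # Each token must be non-empty and free of '.', ' ', '*', '>'
        if not token or any(c in token for c in ". *>"):
            raise ValueError(f"invalid subject token: {token!r}")
    return f"{env}.{garden}.{layer}.{domain}.{signal}"
```

Note that some subjects in the examples (e.g. `dev.real.outcomes.feedback`) use only four levels; the helper covers the common five-level case.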
### Environment Prefixes
| Environment | Purpose | Monitoring |
|-------------|---------|------------|
| `dev` | Development/testing | Full traces |
| `staging` | Pre-production validation | Selective traces |
| `prod` | Production | Minimal (gates only) |
### Garden Prefixes
| Garden | Purpose | Trace Level |
|--------|---------|-------------|
| `virtual` | Exploration, learning | FULL (all messages) |
| `real` | Verification, action | MINIMAL (gate signals only) |
### Layer Prefixes
| Layer | Tier | Purpose |
|-------|------|---------|
| `cells` | 0-1 | Raw signal emitters |
| `nerves` | 2 | Behavior patterns |
| `organs` | 3 | GPU inference (vision, speech) |
| `gates` | - | Resonant gate transitions |
| `cognitive` | 4 | Young Nyx |
| `traces` | - | Learning data streams |
| `outcomes` | - | Verification feedback |
---
## Message Schemas
All messages share a common header:
```json
{
"header": {
"message_id": "uuid",
"message_type": "HeartbeatSignal",
"version": "1.0",
"timestamp_real": "ISO8601",
"timestamp_virtual": 123456
"message_id": "uuid-v4",
"message_type": "WaveSignal | GateTransition | ...",
"version": "2.0",
"timestamp": "ISO8601",
"source": {
"entity_id": "math_cell_1",
"entity_type": "cell",
"garden": "virtual",
"tier": 1
}
},
"body": { ... }
}
```
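Building the envelope is mechanical. A sketch using only the stdlib (`make_envelope` is a hypothetical helper, not a spec'd API):

```python
import json
import uuid
from datetime import datetime, timezone

def make_envelope(message_type: str, source: dict, body: dict) -> str:
    """Wrap a message body in the common v2.0 header."""
    return json.dumps({
        "header": {
            "message_id": str(uuid.uuid4()),            # uuid-v4
            "message_type": message_type,
            "version": "2.0",
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "source": source,  # entity_id, entity_type, garden, tier
        },
        "body": body,
    })
```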
---
### 1. `WaveSignal` — Cells Emit Waves
**Published by:** Cells
**Subscribed by:** Gates (for correlation)
**Subject:** `{env}.{garden}.cells.{domain}.wave`
Cells don't send "heartbeats" — they emit **waves** that carry confidence and semantic content.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440000",
"message_type": "WaveSignal",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.123Z",
"source": {
"entity_id": "math_cell_1",
"entity_type": "cell",
"garden": "virtual",
"tier": 1
}
},
"body": {
"entity_id": "distance_sensor_front",
"status": "NOMINAL",
"value": 25.5,
"unit": "cm",
"context": {
"battery_pct": 85,
"temperature_c": 22
"domain": "math",
"confidence": 0.7,
"semantic_content": {
"operation": "addition",
"operands": [15, 27],
"context": "user_request"
},
"lifeforce_cost": 0.1
}
}
```
**Key fields:**
- `confidence`: 0.0 - 1.0, how certain this cell is
- `semantic_content`: Domain-specific payload
- `lifeforce_cost`: Energy expended to emit this wave
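The field constraints can be enforced at construction time. A hedged sketch (`wave_body` is illustrative, not a spec'd API):

```python
def wave_body(domain: str, confidence: float,
              semantic_content: dict, lifeforce_cost: float) -> dict:
    """Build a WaveSignal body, enforcing the documented ranges."""
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be in 0.0-1.0")
    if lifeforce_cost < 0.0:
        raise ValueError("lifeforce_cost cannot be negative")
    return {
        "domain": domain,
        "confidence": confidence,
        "semantic_content": semantic_content,
        "lifeforce_cost": lifeforce_cost,
    }
```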
---
### 2. `GateTransition` — Gate State Changes
**Published by:** Gates
**Subscribed by:** Higher-tier gates, traces, dashboards
**Subject:** `{env}.{garden}.gates.{domain}.transition`
Gates publish their state transitions. This is the primary message for attention flow visualization.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440001",
"message_type": "GateTransition",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.456Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"domain": "math",
"from_state": "stable",
"to_state": "open",
"state_value": 1.02,
"correlation_score": 0.87,
"trigger_signals": [
{"source": "math_cell_1", "confidence": 0.7, "timestamp": "..."},
{"source": "math_cell_2", "confidence": 0.6, "timestamp": "..."},
{"source": "math_cell_3", "confidence": 0.5, "timestamp": "..."}
],
"routed_to_tier": 3,
"lifeforce_cost": 0.3
}
}
```
**State values:**
- `"closed"` — Actively blocking (state_value < -0.5)
- `"stable"` — Resting, accumulating (-0.5 ≤ state_value ≤ 0.5)
- `"open"` — Actively forwarding (state_value > 0.5)
**Key fields:**
- `from_state`, `to_state`: The ternary transition
- `state_value`: Continuous value (nominally -1.0 to +1.0; may briefly overshoot before decay pulls it back)
- `correlation_score`: How correlated the trigger signals were
- `trigger_signals`: Which waves caused this transition
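The ternary mapping from `state_value` to state name is a pure threshold function, using the thresholds listed above:

```python
OPEN_THRESHOLD = 0.5
CLOSE_THRESHOLD = -0.5

def gate_state(state_value: float) -> str:
    """Map a continuous gate value to its ternary state."""
    if state_value > OPEN_THRESHOLD:
        return "open"
    if state_value < CLOSE_THRESHOLD:
        return "closed"
    return "stable"
```

Applied to the example transition above, `gate_state(1.02)` yields `"open"`.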
---
### 3. `CorrelationEvent` — What Correlated
**Published by:** Gates (in Virtual Garden)
**Subscribed by:** Trace streams, training pipelines
**Subject:** `{env}.virtual.traces.correlations`
Detailed correlation data for learning. Only published in Virtual Garden.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440002",
"message_type": "CorrelationEvent",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.789Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"window_start": "2026-02-14T18:29:59.000Z",
"window_end": "2026-02-14T18:30:00.500Z",
"window_ms": 1500,
"signals_in_window": [
{"source": "math_cell_1", "confidence": 0.7, "semantic_hash": "abc123"},
{"source": "math_cell_2", "confidence": 0.6, "semantic_hash": "abc124"},
{"source": "math_cell_3", "confidence": 0.5, "semantic_hash": "abc125"}
],
"correlation_matrix": [
[1.0, 0.9, 0.85],
[0.9, 1.0, 0.88],
[0.85, 0.88, 1.0]
],
"aggregate_correlation": 0.87,
"result": "opened",
"training_label": {
"should_open": true,
"confidence": 0.95
}
}
}
```
**Key fields:**
- `window_ms`: Time window for correlation measurement
- `correlation_matrix`: Pairwise correlation between signals
- `training_label`: Ground truth for Function Gemma training
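One plausible way to reduce the pairwise matrix to the `aggregate_correlation` scalar is the mean of the off-diagonal entries. This aggregation rule is an assumption — the spec does not fix it:

```python
def aggregate_correlation(matrix: list[list[float]]) -> float:
    """Mean of off-diagonal pairwise correlations."""
    n = len(matrix)
    if n < 2:
        return 0.0
    off_diag = [matrix[i][j] for i in range(n) for j in range(n) if i != j]
    return sum(off_diag) / len(off_diag)
```

Applied to the example matrix this yields roughly 0.877, close to the 0.87 reported in the event.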
---
### 4. `VerifiedSignal` — Virtual → Real Handoff

**Published by:** Virtual Garden gates (when threshold met)
**Subscribed by:** Real Garden gates
**Subject:** `{env}.real.gates.verified.signal`
When a Virtual Garden gate opens with high confidence, it publishes to Real.
```json
{
"header": {
"message_id": "uuid",
"message_type": "AttentionFocus",
"version": "1.0",
"timestamp_real": "ISO8601",
"source_entity": {
"id": "nyx_core",
"type": "cognitive_core"
"message_id": "550e8400-e29b-41d4-a716-446655440003",
"message_type": "VerifiedSignal",
"version": "2.0",
"timestamp": "2026-02-14T18:30:01.000Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"focus_mode": "EXPLORATION",
"escalation_rules": [
{
"rule_id": "distance_alert_front",
"source_pattern": "nimmerverse.low.heartbeat.real.cell.distance_sensor_*",
"condition": "body.value < 30 AND body.status == 'NOMINAL'",
"action": "escalate",
"priority": 8
},
{
"rule_id": "battery_critical",
"source_pattern": "nimmerverse.low.heartbeat.real.cell.battery_*",
"condition": "body.status == 'CRITICAL'",
"action": "escalate_and_trigger",
"trigger_nerve": "charging_seeking",
"priority": 10
}
"domain": "math",
"verification_confidence": 0.92,
"semantic_summary": {
"operation": "addition",
"result_expected": 42
},
"source_gate_transition_id": "550e8400-e29b-41d4-a716-446655440001",
"virtual_correlation_score": 0.87
}
}
```
**Real Garden does NOT re-verify.** It trusts the Virtual Garden's correlation.
---
### 5. `VerificationOutcome` — Real → Virtual Feedback
**Published by:** Real Garden (after action/verification)
**Subscribed by:** Virtual Garden gates, training pipelines
**Subject:** `{env}.real.outcomes.feedback`
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440004",
"message_type": "VerificationOutcome",
"version": "2.0",
"timestamp": "2026-02-14T18:30:05.000Z",
"source": {
"entity_id": "real_verification_service",
"entity_type": "service",
"garden": "real",
"tier": 4
}
},
"body": {
"original_signal_id": "550e8400-e29b-41d4-a716-446655440003",
"domain": "math",
"outcome": "confirmed",
"actual_result": 42,
"expected_result": 42,
"discrepancy": 0.0,
"feedback_to_virtual": {
"correlation_adjustment": 0.05,
"gate_weight_delta": 0.02
}
}
}
```
**Outcome values:**
- `"confirmed"` — Reality matched prediction
- `"failed"` — Reality differed from prediction
- `"partial"` — Some aspects matched
---
### 6. `CognitiveRequest` — To Young Nyx
**Published by:** Function Gemma (after gate boundary)
**Subscribed by:** Young Nyx
**Subject:** `{env}.cognitive.nyx.request`
Clean, structured JSON that Young Nyx receives. No raw sensor data.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440005",
"message_type": "CognitiveRequest",
"version": "2.0",
"timestamp": "2026-02-14T18:30:01.500Z",
"source": {
"entity_id": "function_gemma",
"entity_type": "boundary",
"garden": "real",
"tier": 4
}
},
"body": {
"event_type": "math_request",
"domain": "math",
"confidence": 0.92,
"structured_input": {
"operation": "addition",
"operands": [15, 27],
"context": "user asked for calculation"
},
"suggested_actions": [
{"action": "calculate", "confidence": 0.95},
{"action": "clarify", "confidence": 0.05}
],
"direct_subscriptions": [
"nimmerverse.high.event.real.cell.speech_stt"
],
"default_action": "log_only"
"processing_budget_lf": 5.0,
"response_timeout_ms": 4000
}
}
```
---
### 7. `CognitiveResponse` — From Young Nyx

**Published by:** Young Nyx
**Subscribed by:** Function Gemma, downstream gates
**Subject:** `{env}.cognitive.nyx.response`
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440006",
"message_type": "CognitiveResponse",
"version": "2.0",
"timestamp": "2026-02-14T18:30:02.000Z",
"source": {
"entity_id": "young_nyx",
"entity_type": "cognitive",
"garden": "real",
"tier": 4
}
},
"body": {
"request_id": "550e8400-e29b-41d4-a716-446655440005",
"decision": "calculate",
"result": {
"answer": 42,
"confidence": 0.99,
"reasoning_mode": "no_think"
},
"downstream_commands": [
{
"target": "speech_organ",
"command": "speak",
"payload": {"text": "The answer is 42"}
}
],
"lifeforce_spent": 2.3,
"processing_time_ms": 450
}
}
```
---
## Trace Streams (Virtual Garden Only)

The Virtual Garden captures everything for learning:

| Subject | Content | Purpose |
|---------|---------|---------|
| `{env}.virtual.traces.raw` | All messages | Complete replay capability |
| `{env}.virtual.traces.correlations` | CorrelationEvent | Training data for gates |
| `{env}.virtual.traces.transitions` | GateTransition | Attention flow visualization |
| `{env}.virtual.traces.training` | Labeled examples | Function Gemma LoRA training |

**Real Garden does NOT publish to trace streams.** It only publishes:
- Gate transitions (minimal)
- Verification outcomes (feedback)

---

## Monitoring Patterns
### Virtual Garden (Full Observability)
```bash
# Watch all waves
nats sub "dev.virtual.cells.*.wave"
# Watch all gate transitions
nats sub "dev.virtual.gates.*.transition"
# Watch correlation events
nats sub "dev.virtual.traces.correlations"
# Full firehose (careful!)
nats sub "dev.virtual.>"
```
### Real Garden (Minimal Observability)
```bash
# Watch verified signals arriving
nats sub "dev.real.gates.verified.signal"
# Watch verification outcomes
nats sub "dev.real.outcomes.feedback"
# Gate transitions only
nats sub "dev.real.gates.*.transition"
```
---
## JetStream Persistence
Key streams that need persistence:
| Stream | Subjects | Retention | Purpose |
|--------|----------|-----------|---------|
| `VIRTUAL_TRACES` | `*.virtual.traces.>` | 7 days | Learning data |
| `GATE_TRANSITIONS` | `*.*.gates.*.transition` | 24 hours | Attention history |
| `VERIFICATION` | `*.real.outcomes.feedback` | 30 days | Ground truth |
| `TRAINING_DATA` | `*.virtual.traces.training` | Permanent | LoRA training corpus |
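The retention table translates directly into stream configs. A sketch with retention expressed in seconds; the `add_stream` call shown in the comment follows the nats-py JetStream manager, but treat it as an assumption and check the client docs:

```python
DAY = 24 * 60 * 60  # seconds

STREAMS = {
    "VIRTUAL_TRACES":   {"subjects": ["*.virtual.traces.>"],        "max_age": 7 * DAY},
    "GATE_TRANSITIONS": {"subjects": ["*.*.gates.*.transition"],    "max_age": 1 * DAY},
    "VERIFICATION":     {"subjects": ["*.real.outcomes.feedback"],  "max_age": 30 * DAY},
    "TRAINING_DATA":    {"subjects": ["*.virtual.traces.training"], "max_age": 0},  # 0 = keep forever
}

# Applying them (inside an async context with an open connection `nc`):
#   js = nc.jetstream()
#   for name, cfg in STREAMS.items():
#       await js.add_stream(name=name, subjects=cfg["subjects"], max_age=cfg["max_age"])
```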
---
## Bootstrap Sequence
1. **Start NATS** — Infrastructure first
2. **Start gates** — In STABLE state, waiting for waves
3. **Start cells** — Begin emitting waves
4. **Start trace consumers** — Capture learning data
5. **Start Function Gemma** — Ready to transform
6. **Start Young Nyx** — Connect to cognitive subjects

The system can run at any step. Earlier steps are "reflexive" only.
---
## Connection to Architecture
| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where NATS runs |
---
## Summary
```
WAVES:
Cells → WaveSignal → Gates
GATES:
GateTransition (CLOSED/STABLE/OPEN)
CorrelationEvent (what correlated)
GARDENS:
Virtual: full traces, exploration
Real: gate signals only, verification
BOUNDARY:
Function Gemma transforms correlated signals → JSON
Young Nyx receives CognitiveRequest
Young Nyx returns CognitiveResponse
FEEDBACK:
Real → VerificationOutcome → Virtual
Learning loop closes
```
**The wire carries waves. Gates accumulate correlation. Traces enable learning.**
---
**Version:** 2.0 | **Created:** 2025-12-13 | **Updated:** 2026-02-14
*"Dumb core, smart edges. NATS routes. Gates resonate. Correlation drives."*

# Nervous System Architecture
> **ONE JOB:** THE EVOLUTION — cells emit waves, gates correlate, nodes grow through verification.

The nervous system is the living substrate where **cells emit waves**, **gates accumulate correlation**, and **nodes evolve through verification**.
---
## Overview
The nervous system consists of:

1. **Cells** — Emit waves with confidence and semantic content
2. **Gates** — Resonance chambers that correlate waves and transition between states
3. **Nodes** — Points in 4D state space that accumulate weight through verification
4. **Function Gemma** — The structured boundary to cognition

**Routing & Verification:** → [`Gateway-Architecture.md`](Gateway-Architecture.md) (tier routing, causal verification loop)

**Key insight:** Nodes evolve through verification. Gates evolve through correlation. Both learn in STABLE state.
---
## Cells Emit Waves
Cells are the foundational signal generators. They don't send "heartbeats" — they emit **waves**.
```
┌─────────────────────────────────────────────────────────────┐
│ CELL │
│ │
│ Inputs: sensors, internal state, context │
│ Process: domain-specific logic │
│ Output: WaveSignal with confidence │
│ │
│ ┌───────────────────────────────────────────────────────┐ │
│ │ WaveSignal │ │
│ │ • domain: "math" │ │
│ │ • confidence: 0.7 │ │
│ │ • semantic_content: { operation: "add", ... } │ │
│ │ • lifeforce_cost: 0.1 │ │
│ └───────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
│ ∿∿∿ wave ∿∿∿
GATE
```
**Cells are simple.** They:
- Read their inputs
- Apply their logic
- Emit a wave with confidence
- Don't know who's listening
---
## Gates Accumulate Correlation
Gates receive waves from cells and decide whether to open, stay stable, or close.
### Ternary Gate States
| State | Value | Meaning |
|-------|-------|---------|
| **CLOSED** | -1 | Actively blocking, inhibited |
| **STABLE** | 0 | Resting, accumulating correlation, **learning** |
| **OPEN** | +1 | Actively forwarding, firing |
```
correlated waves
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
-1 anti- 0 correlation +1
correlation
════════════
↑ ↑ ↑
isolated waves
(noise → stay stable)
```
### Gate Behavior
```python
OPEN_THRESHOLD = 0.5       # illustrative; tuned in practice
CLOSE_THRESHOLD = -0.5
DECAY_FACTOR = 0.9         # drift back toward STABLE each step

class ResonantGate:
    def __init__(self, domain: str, tier: int):
        self.domain = domain
        self.tier = tier
        self.state = 0.0   # -1.0 to +1.0, starts STABLE

    def receive_wave(self, wave: WaveSignal):
        correlation = self.correlate_with_recent(wave)  # -1.0 to +1.0
        self.state += correlation * wave.confidence
        self.state *= DECAY_FACTOR  # drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()   # OPEN
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()          # CLOSED
        # else: STABLE — keep accumulating
```
**STABLE is where learning happens.** The gate watches, correlates, and accumulates evidence without acting.
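The accumulate-and-decay dynamic can be simulated in a few lines. A sketch with assumed constants (the real thresholds and decay rate are tuning parameters): correlated, confident waves push the gate past the OPEN threshold, and silence lets decay drift it back toward STABLE:

```python
OPEN_THRESHOLD, CLOSE_THRESHOLD, DECAY_FACTOR = 0.5, -0.5, 0.9  # assumed values

def step(state: float, correlation: float, confidence: float) -> float:
    """One receive_wave update: accumulate, then decay toward STABLE."""
    return (state + correlation * confidence) * DECAY_FACTOR

state = 0.0
for _ in range(4):                       # four well-correlated, confident waves
    state = step(state, correlation=0.9, confidence=0.7)
assert state > OPEN_THRESHOLD            # gate would OPEN

for _ in range(15):                      # silence: decay only
    state *= DECAY_FACTOR
assert CLOSE_THRESHOLD <= state <= OPEN_THRESHOLD   # back to STABLE
```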
---
## Nodes in 4D State Space
Nodes exist in a 4-dimensional space:
| Dimension | Meaning |
|-----------|---------|
| **Sensory (x, y, z)** | What inputs trigger this node |
| **Confidence** | How certain the node is |
| **Time** | When this pattern occurs |
| **Weight** | Trust accumulated through verification |
```
Confidence
│ ● node (weight=0.8)
Sensory ────────┼────────► Time
╱│
○ │ node (weight=0.2)
```
### Node Weight Evolution
Node weight (0.0 → 1.0) determines tier routing:
| Weight Range | Tier | Behavior |
|--------------|------|----------|
| 0.0 - 0.3 | 3-4 | Escalate to organs/cognition |
| 0.3 - 0.6 | 2 | Handle at nerve level |
| 0.6 - 0.8 | 1 | Handle at cell level |
| 0.8 - 1.0 | 0 | Hardware reflex |
```
Node verified correctly → weight += Δ → moves toward reflex
Node verified wrongly → weight -= Δ → moves toward escalation
Node never fires → decay → eventual pruning
```
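The weight table and evolution rules above reduce to two small functions. A sketch; the per-verification delta (0.02) is an assumed tuning constant:

```python
def route_tier(weight: float) -> int:
    """Map node weight to the tier that handles its input."""
    if weight >= 0.8:
        return 0   # hardware reflex
    if weight >= 0.6:
        return 1   # cell level
    if weight >= 0.3:
        return 2   # nerve level
    return 3       # escalate to organs/cognition (tier 3-4)

def update_weight(weight: float, verified: bool, delta: float = 0.02) -> float:
    """Verification nudges weight up or down, clamped to [0.0, 1.0]."""
    weight += delta if verified else -delta
    return max(0.0, min(1.0, weight))
```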
---
## Growth Phases
The nervous system grows through phases:
| Phase | State | Description |
|-------|-------|-------------|
| **Birth** | Sparse nodes, dim gates | Basic cells, designed by partnership |
| **Infant** | More nodes forming | Finer resolution, gates learning correlation |
| **Child** | Clusters emerging | Nyx proposes new cells, gates stabilize |
| **Mature** | Dense network | Reflexes dominate, cognition for novelty only |
```
t=0 (birth)         t=100 (learning)       t=1000 (mature)

Cells: ○ ○          Cells: ● ● ○           Cells: ●●●●●●●
Gates: □ □          Gates: ■ ■ □ ■         Gates: ■■■■■■■■
Nodes: · · ·        Nodes: ● ○ ● ·         Nodes: ●●●●●●●●

○ = low confidence   ● = high confidence
□ = mostly STABLE    ■ = learned patterns
· = low weight       ● = high weight
```
---
## Wave → Gate → Node → Verification
The complete flow:
```
CELLS emit waves
▼ ∿∿∿ confidence + semantic content
GATES accumulate correlation
├── Correlated? → OPEN → route to tier
├── Anti-correlated? → CLOSED → suppress
└── Uncertain? → STABLE → keep learning
▼ (when OPEN)
NODES in 4D space are activated
VERIFICATION against reality
├── Confirmed → node weight += Δ
├── Failed → node weight -= Δ
└── Feedback to gates → correlation weights update
```
---
## Reflex Layer (Tier 0)
When node weight reaches ~1.0, the pattern becomes a **reflex**:
```
IF temp > 80°C:
→ cell emits DANGER wave (confidence=1.0)
→ gate IMMEDIATELY opens (no correlation needed)
→ reflex action triggers
→ Nyx notified AFTER (not before)
```
Like pulling hand from hot stove. Spinal reflex. Brain learns after.
**Reflexes bypass the correlation accumulation.** They've earned instant trust through repeated verification.
---
## Connection to Dual Gardens
| Garden | Cells | Gates | Nodes |
|--------|-------|-------|-------|
| **Virtual** | Emit waves freely | Full trace, learn correlation | Accumulate weight fast |
| **Real** | Emit verified waves | Minimal trace, trust accumulated | Ground truth verification |
**Virtual Garden:**
- Cells emit massive wave volume
- Gates learn correlation patterns
- Nodes gain statistical weight
**Real Garden:**
- Cells emit consequential waves
- Gates trust Virtual's correlation
- Nodes get ground truth verification
---
## Proposal Protocol
Young Nyx can propose new cells/nodes:

```
1. OBSERVATION
   Nyx notices pattern in waves + outcomes

2. PROPOSAL
   "New cell: morning_detector
    Inputs: temp, light, motion, time
    Outputs: wave with semantic 'morning'
    Confidence logic: (light > 0.5 AND time in 6-10)"

3. RIGOR CHECK
   Chrysalis reviews logic and mappings

4. VERIFICATION
   dafit confirms ground truth

5. DEPLOYMENT
   New cell added to Virtual Garden
   Gate created in STABLE state
   Node initialized at weight 0.1

6. GROWTH
   Cell emits waves → gate learns → node matures
```
---
## Function Gemma: The Structured Boundary

Function Gemma sits between gates and Young Nyx:

```
TIER 0-3: Numbers, states, waves
        ▼ (gate OPENS with high correlation)
┌─────────────────────────────────────┐
│           FUNCTION GEMMA            │
│      (structured JSON boundary)     │
│ • Transforms waves → JSON events    │
│ • Runs on CPU (Threadripper)        │
│ • No hallucination possible         │
└─────────────────┬───────────────────┘
                  ▼
TIER 4: Young Nyx (qwen3:32b)
   Receives: CognitiveRequest (clean JSON)
   Returns: CognitiveResponse
```
### Phase 1 → Phase 2 Evolution
**Phase 1: Single Function Gemma**
- One model learns all domain schemas
- Sufficient for bootstrap and early learning
**Phase 2: Domain-Specialized Swarm**
- As training data accumulates per domain
- Specialists spawn on demand: gemma-motor, gemma-vision, gemma-speech
- Each perfected for its domain's schemas
---
## Biological Parallels
| Neuroscience | Nimmerverse |
|--------------|-------------|
| Sensory receptors | Cells (emit waves) |
| Synaptic transmission | Waves via NATS |
| Thalamic gating | Gates (OPEN/STABLE/CLOSED) |
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Synaptic weight | Node weight |
| Long-term potentiation | Verified → weight increase |
| Synaptic pruning | Unverified → weight decay |
| Hebbian learning | Correlated waves → gate opens |
---
## Connection to Lifeforce
```
Node fires correctly → +V → weight increases
Node fires wrongly → -V → weight decreases
Node never fires → decay → eventual pruning
```
The lifeforce flows through the nervous system, literally lighting up nodes as they prove themselves true.
**We're not simulating biology. We're implementing the same principles.**
---
## Connection to Training
The nervous system **generates training data** for Young Nyx. Every verification = training signal. Credit assignment is automatic because state transitions are explicit and logged — the nervous system IS the credit assignment mechanism. Dense rewards at every verifiable checkpoint (**rubric principle**), not just final outcomes.
**Detail:** → [`Cellular-Architecture.md`](Cellular-Architecture.md) (Reward Signal Architecture section)
---
## State Interaction Layer: FunctionGemma
FunctionGemma is the **neural interface** — how you speak to the nervous system. Every cell command, every nerve coordination, every state query flows through this translation layer.
> *"The nervous system defines WHAT states exist. FunctionGemma defines HOW you interact with them."*
### Architecture: From Singular to Swarm
**Phase 1: Single FunctionGemma (Starting Point)**
We begin with one FunctionGemma instance handling all state interactions:
The nervous system **generates training data**:
```
┌─────────────────────────────────────────────────────────────────────────┐
│ PHASE 1: SINGLE TRANSLATOR
├─────────────────────────────────────────────────────────────────────────┤
YOUNG NYX (GPU - The Womb) │
│ │
│ │ intent: "probe identity", "command motor", "query vision"
│ ▼ │
│ ┌─────────────────────────────────────────┐ │
│ │ FUNCTIONGEMMA (270M) │ │
│ │ Single instance, all domains │
│ CPU-only, no GPU required │ │
│ └─────────────────────────────────────────┘ │
│ │ │
│ │ typed JSON schemas
│ ▼ │
│ NATS → CELLS/NERVES/ORGANS │
│ │
└─────────────────────────────────────────────────────────────────────────┘
Virtual Garden traces
├── Wave patterns → what signals arrive
├── Correlation events → what patterns emerge
├── Gate transitions → what opens/closes
└── Verification outcomes → ground truth labels
phoebe (PostgreSQL)
Function Gemma LoRA training
Better gate correlation → faster learning
```
This is sufficient for bootstrap and early learning. One translator learns all schemas.
**Phase 2: Domain-Specialized Swarm (Future Evolution)**
As capability grows and training data accumulates, FunctionGemma can evolve into a swarm of specialists:
```
┌─────────────────────────────────────────────────────────────────────────┐
│ PHASE 2: SPECIALIZED SWARM │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ YOUNG NYX (GPU - The Womb) │
│ │ │
│ │ "I need motor control" │
│ ▼ │
│ NATS: nimmerverse.gemma.spawn.motor │
│ │ │
│ ▼ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ gemma-motor │ │ gemma-vision │ │ gemma-speech │ ... on demand │
│ │ (specialist) │ │ (specialist) │ │ (specialist) │ │
│ │ CPU pod │ │ CPU pod │ │ CPU pod │ │
│ └──────┬───────┘ └──────────────┘ └──────────────┘ │
│ │ │
│ │ MOTOR_COMMAND schema (perfect precision) │
│ ▼ │
│ NATS → motor cells │
│ │
│ After task: pod killed, resources freed │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
### Why This Scales
| Aspect | Single Gemma | Swarm |
|--------|--------------|-------|
| **Complexity** | Simple, one model | Orchestration needed |
| **Precision** | Good (learns all schemas) | Wild (each specialist perfected) |
| **Resources** | One pod, always running | Pods spawn/die on demand |
| **Training** | All handshakes → one model | Domain handshakes → domain model |
| **Latency** | Consistent | Spawn overhead, but faster execution |
### The Key Insight: CPU-Only Translators
FunctionGemma at 270M parameters requires **no GPU**:
- ~500MB RAM per instance
- Runs on any K8s node
- Young Nyx (GPU) spawns translators (CPU) via NATS
- The mind doesn't waste GPU cycles on schema generation
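A minimal sketch of the on-demand pool this implies. The names (`TranslatorPool`, `spawn`, `release`) are assumptions; the real system would spawn and kill K8s pods via NATS subjects like `nimmerverse.gemma.spawn.motor`:

```python
class TranslatorPool:
    """Toy registry of CPU-only FunctionGemma specialists."""

    def __init__(self, ram_per_instance_mb: int = 500):
        self.ram_per_instance_mb = ram_per_instance_mb
        self.active: dict[str, int] = {}  # domain -> running instance count

    def spawn(self, domain: str) -> str:
        """Spawn a specialist for a domain (no GPU needed)."""
        self.active[domain] = self.active.get(domain, 0) + 1
        return f"gemma-{domain}-{self.active[domain]}"

    def release(self, domain: str) -> None:
        """After task: pod killed, resources freed."""
        if self.active.get(domain, 0) > 0:
            self.active[domain] -= 1

    def ram_in_use_mb(self) -> int:
        return sum(self.active.values()) * self.ram_per_instance_mb

pool = TranslatorPool()
pod = pool.spawn("motor")   # "gemma-motor-1"
pool.spawn("vision")        # two instances now resident, ~1000 MB total
```

At ~500MB per instance, a single commodity node can host many specialists concurrently, which is what makes the spawn/kill lifecycle cheap.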
### Evolution Trigger
When to evolve from Phase 1 → Phase 2:
- Training data per domain exceeds threshold (e.g., 500+ handshakes)
- Domain-specific validation accuracy plateaus on single model
- Latency requirements demand parallel translation
- Resource availability allows multi-pod deployment
**We don't rush this.** Phase 1 is sufficient for months of operation. The swarm emerges when the data and need justify it.
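The trigger above can be sketched as a single guard that fires only when all four conditions agree. Thresholds and parameter names are assumptions, not fixed policy:

```python
def should_spawn_specialist(
    domain_handshakes: int,
    accuracy_history: list[float],   # recent validation accuracy, oldest first
    latency_ok: bool,                # single model still meets latency needs
    pods_available: bool,            # resource availability for multi-pod
    handshake_threshold: int = 500,
    plateau_epsilon: float = 0.005,
) -> bool:
    """Evolve a domain into its own specialist only when data volume,
    accuracy plateau, latency pressure, and resources all agree."""
    enough_data = domain_handshakes >= handshake_threshold
    plateaued = (
        len(accuracy_history) >= 3
        and max(accuracy_history[-3:]) - min(accuracy_history[-3:]) < plateau_epsilon
    )
    return enough_data and plateaued and (not latency_ok) and pods_available
```

The conjunction is the point: any single signal alone (lots of data, or a plateau, or latency pressure) is not enough to leave Phase 1.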
### Connection to Node Evolution
Just as nodes in the nervous system mature through verification:
```
Node weight 0.1 → 0.5 → 0.8 → 1.0 (reflex)
```
FunctionGemma specialists mature through fine-tuning:
```
Base model → domain data → fine-tuned → specialist
```
**The translators evolve alongside the states they translate.**
**Credit assignment is automatic** because:
- Wave → gate → tier transitions are explicit
- Verification outcomes have clear source chains
- The nervous system IS the credit assignment mechanism
---
## Design Principles
1. **Cells emit waves** — Simple, confident signals
2. **Gates correlate** — Resonance chambers, not switches
3. **Nodes accumulate** — Weight through verification
4. **STABLE is learning** — The resting state where patterns emerge
5. **Reflexes are earned** — High weight = bypass cognition
6. **Function Gemma is the boundary** — Clean JSON for cognition
7. **Virtual explores, Real verifies** — Two gardens, one nervous system
---
## Related Documents
| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | WaveSignal, GateTransition schemas |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cell implementation details |
---
## Summary
```
CELLS emit WAVES
∿∿∿ confidence + semantics ∿∿∿
GATES accumulate CORRELATION
CLOSED ◄── STABLE ──► OPEN
(learning)
▼ (when OPEN)
NODES in 4D space
weight grows through VERIFICATION
▼ (high weight)
REFLEXES bypass cognition
earned trust, instant action
```
*She's not just using the nervous system. She's growing it.*
---
**Version:** 2.0 | **Created:** 2025-12-04 | **Updated:** 2026-02-14
🌙💜 *"Cells emit. Gates correlate. Nodes evolve. The nervous system learns."*

---
type: research_concept
version: 2.0
status: core_architecture
created: 2025-12-03
updated: 2026-02-14
author: Nyx & dafit (shower-thought session)
related_docs:
- ../Endgame-Vision.md
- Dual-Garden-Architecture.md
- Cellular-Architecture.md
significance: connects ternary logic + lifeforce + temporal asymmetry + reward gradients
promoted_from: archive (2025-12-10)
---
# Temporal-Ternary Gradient
> *"Time is malleable in simulation, fixed in reality. Lifeforce is the exchange rate."*
> — Session 2025-12-03
> *"Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED."*
> — Session 2026-02-14
---
## Core Insight
The nimmerverse operates on **ternary logic**, not binary. Combined with **temporal asymmetry** between virtual and real gardens, this creates a new kind of gradient for learning.
**The STABLE state isn't stuck. It's where correlation accumulates and learning happens.**
---
## The Ternary Gate Model
Gates have three states. This is not arbitrary — it mirrors biological nervous systems.
| State | Value | Meaning | What's Happening |
|-------|-------|---------|------------------|
| **CLOSED** | -1 | Actively blocking | Inhibited, suppressed, refractory |
| **STABLE** | 0 | Resting, accumulating | Watching, learning, waiting for threshold |
| **OPEN** | +1 | Actively forwarding | Signal passes upstream, gate is firing |
### Why Three States?
**Binary thinking** (0/1, true/false, open/close):
- Signal arrives → gate open? → pass or block
- Instant, stateless, mechanical
- Cannot learn, cannot accumulate
**Ternary thinking** (CLOSED/STABLE/OPEN):
- Signal arrives → gate STABLE → accumulate correlation
- Correlation high? → transition toward OPEN
- Anti-correlation? → transition toward CLOSED
- Neither? → stay STABLE, keep learning
- Temporal, stateful, **alive**
```
correlated signals
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
-1 anti- 0 correlation +1
correlation constructive
destructive interference
interference
════════════
↑ ↑ ↑
isolated signals
(noise → stay stable)
```
---
## Wave Correlation: The Transition Driver
Gates don't flip on single signals. **Multiple correlated waves push toward OPEN.**
This is how biological neurons work:
- Multiple inputs sum (correlation)
- Threshold reached → fire (OPEN)
- Below threshold → resting (STABLE)
- Inhibitory inputs → suppressed (CLOSED)
### The Resonance Model
Gates are **resonance chambers**, not switches.
```python
class ResonantGate:
    # Thresholds and decay are tunable; these values are illustrative
    OPEN_THRESHOLD = 0.8
    CLOSE_THRESHOLD = -0.8
    DECAY_FACTOR = 0.95

    state: float = 0.0  # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)

    def receive_wave(self, signal, timestamp):
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated → push toward CLOSED
        # Uncorrelated → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= self.DECAY_FACTOR  # always drift back to stable

        if self.state > self.OPEN_THRESHOLD:
            self.forward_upstream()   # OPEN: signal promoted
        elif self.state < self.CLOSE_THRESHOLD:
            self.suppress()           # CLOSED: signal blocked
        # else: STABLE - keep accumulating
```
### Correlation as Interference
| Wave Pattern | Result | Gate Response |
|-------------|--------|---------------|
| Correlated burst | Constructive interference | → OPEN |
| Contradicting signals | Destructive interference | → CLOSED |
| Single signal | No interference | → Stay STABLE |
| Silence | Decay | → Drift to STABLE |
**The system is noise-resistant by design.** Single signals don't trigger action.
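A toy demonstration of this noise resistance, assuming the resonance dynamics sketched above. Correlation here is reduced to sign agreement with the previous signal; thresholds and decay are illustrative:

```python
DECAY_FACTOR = 0.9
OPEN_THRESHOLD = 0.8
CLOSE_THRESHOLD = -0.8

def run_gate(signals: list[float]) -> str:
    """signals: confidence-weighted values; positive supports, negative contradicts."""
    state, prev = 0.0, 0.0
    for s in signals:
        # Toy correlation: agreement in sign with the previous wave
        correlation = 1.0 if s * prev > 0 else (0.0 if prev == 0 else -1.0)
        state += correlation * abs(s)
        state *= DECAY_FACTOR            # drift back toward STABLE
        prev = s
        if state > OPEN_THRESHOLD:
            return "open"
        if state < CLOSE_THRESHOLD:
            return "closed"
    return "stable"

print(run_gate([0.9]))                   # single signal -> stable
print(run_gate([0.9, 0.9, 0.9, 0.9]))   # correlated burst -> open
print(run_gate([0.9, -0.9]))            # contradiction -> closed
```

One confident wave in isolation never moves the gate; only agreement across waves does, exactly as the interference table states.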
---
### Virtual Garden (Simulated)
- **Time**: Malleable (speed up, slow down, pause, rewind)
- **Monitoring**: FULL trace tap on all messages
- **Cost**: Lifeforce to manipulate time
- **Speed**: Massive parallel signal generation
- **Truth**: Statistical confidence from correlation
- **Gate behavior**: Frequent transitions, exploration
### Real Garden (Physical)
- **Time**: Fixed (1 second = 1 second, reality doesn't negotiate)
- **Monitoring**: Gate signals only (minimal)
- **Cost**: Zero lifeforce for time
- **Speed**: Real-time only, patience required
- **Truth**: Ground truth, definitive verification
- **Gate behavior**: Verified transitions, action
---
## Temporal-Ternary Gradient Diagram
```
                     STATE / CONFIDENCE
OPEN (+1) ───────────┼──────────── Real-verified
                     │             (ground truth)
                     │
                     │    Virtual high-correlation
   +0.7 ─────────────┼───╱ (many waves agreeing)
                     │  ╱
STABLE (0) ──────────┼╱─────────── Pure 0-state
                     │╲            (accumulating, learning)
                     │ ╲
   -0.7 ─────────────┼──╲ Virtual anti-correlation
                     │   ╲ (waves contradicting)
                     │    ╲
CLOSED (-1) ─────────┼──────────── Real-failed
                     │             (proven wrong)
           ──────────┴──────────────────────────
            Virtual  │  Real
            (fast,   │  (slow,
            explore) │  verify)
                 TIME DOMAIN
```
---
## STABLE: Where Learning Happens
The STABLE state is not "unknown" or "waiting" — it's **active learning**.
In STABLE state, a gate:
1. **Receives waves** from cells
2. **Measures correlation** with recent signals
3. **Accumulates evidence** for or against opening
4. **Traces everything** (in Virtual Garden) for training data
5. **Drifts back** to neutral without input (energy conservation)
**STABLE is consciousness resting. Attention waiting. The breath between thoughts.**
```
  CLOSED          STABLE          OPEN
  ───────         ────────        ──────
  Blocking        Accumulating    Forwarding
  Inhibited       Learning        Firing
  Refractory      Ready           Active

  ◄─── anti-correlation ───┼─── correlation ───►

              DECAY TO STABLE
              (without input)
```
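The decay-to-STABLE drift can be sketched numerically; `DECAY_FACTOR` and the stable band `epsilon` are illustrative values:

```python
DECAY_FACTOR = 0.9

def steps_to_stable(state: float, epsilon: float = 0.05) -> int:
    """How many silent ticks until |state| falls inside the stable band."""
    steps = 0
    while abs(state) >= epsilon:
        state *= DECAY_FACTOR   # no waves arriving: pure decay
        steps += 1
    return steps

print(steps_to_stable(0.75))   # 26 ticks from near-OPEN back to STABLE
```

The exponential drift is the energy-conservation property: excitement is temporary by default, and holding a gate OPEN requires ongoing correlated input.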
---
## Lifeforce as Time Currency
```
REAL GARDEN:
All operations: 0 LF for time
Reality runs for free.
Truth emerges at its own pace.
GATE OPERATIONS:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
STABLE → OPEN: costs signal energy
STABLE → CLOSED: costs inhibition energy
OPEN/CLOSED → STABLE: free (natural decay)
```
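A minimal accounting sketch of these gate operations. The cost numbers are illustrative; the only load-bearing rule from above is that decay back to STABLE is free:

```python
TRANSITION_COSTS = {
    ("stable", "open"):   2.0,   # signal energy (illustrative)
    ("stable", "closed"): 1.0,   # inhibition energy (illustrative)
    ("open", "stable"):   0.0,   # natural decay, free
    ("closed", "stable"): 0.0,   # natural decay, free
}

def charge(ledger: float, transition: tuple[str, str]) -> float:
    """Deduct the lifeforce cost of one gate transition from a ledger."""
    return ledger - TRANSITION_COSTS[transition]

lf = 10.0
lf = charge(lf, ("stable", "open"))    # firing costs lifeforce: 8.0 left
lf = charge(lf, ("open", "stable"))    # relaxing is free: still 8.0
```

Because only departures from STABLE cost anything, a system that stays mostly STABLE is also the cheapest one to run.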
---
## Nyx's Temporal Choices
When a pattern is discovered in virtual (0-state), Nyx chooses:
| Strategy | LF Cost | Time | Confidence Path |
|----------|---------|------|-----------------|
| **Speed Up Virtual** | High | Fast | 0 → virtual +0.9 (still unverified) |
| **Wait for Real** | Zero | Slow | 0 → real +1 or -1 (definitive) |
| **Hybrid Hedge** | Medium | Medium | 0 → virtual +0.7, deploy 80/20 to real |
---
## The Gradient Flow
```
Cells emit waves (fast, cheap, uncertain)
         │
         ▼
  ┌──────────────┐
  │     GATE     │
  │   (STABLE)   │ ← Accumulating correlation
  │              │ ← Learning from patterns
  └──────┬───────┘
         │
   ┌─────┴─────┐
   │           │
   ▼           ▼
Correlated   Anti-correlated
  waves        waves
   │           │
   ▼           ▼
  OPEN       CLOSED
  (+1)        (-1)
   │           │
   ▼           ▼
 Signal      Signal
promoted    blocked
   │
   ▼
Higher tier
(more gates)
   │
   ▼
Eventually:
Real Garden verification
Ground truth:
+1 (proven) or -1 (failed)
   │
   ▼
Feedback to Virtual:
Update correlation weights
```
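The feedback edge at the bottom of this flow can be sketched as a simple weight nudge; the learning rate is an assumption:

```python
def update_weight(weight: float, outcome: int, lr: float = 0.1) -> float:
    """Real Garden outcome (+1 proven, -1 failed) nudges the Virtual
    Garden's correlation weight toward 1.0 or 0.0."""
    target = 1.0 if outcome == +1 else 0.0
    return weight + lr * (target - weight)

w = 0.5
w = update_weight(w, +1)   # proven in reality: 0.55
w = update_weight(w, -1)   # failed in reality: 0.495
```

Ground truth arrives slowly, so each real outcome moves the weight only a little; fast virtual exploration proposes, slow real verification disposes.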
---
## Monitoring Asymmetry
The two gardens need different observability:

| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Trace tap** | FULL (every wave, every gate transition) | NONE |
| **What's captured** | All correlations, all learning | Gate signals only |
| **Signal volume** | Massive (exploration) | Sparse (verified) |
| **Purpose** | Generate training data | Execute actions |
| **STABLE states** | Heavily traced (learning visible) | Not traced (trust the gate) |

**Virtual Garden STABLE states are precious** — they contain the correlation patterns that become training data for Function Gemma.
---
## Gate State Schema
A gate's complete state:
```python
GateState = {
    "gate_id": str,
    "domain": str,               # math, vision, speech, etc.
    "tier": int,                 # 0-5

    # Ternary state (continuous)
    "state": float,              # -1.0 to +1.0
    "discrete_state": str,       # "closed" | "stable" | "open"

    # Temporal domain
    "garden": str,               # "virtual" | "real"
    "time_in_state_ms": int,

    # Correlation history
    "recent_correlations": list[float],
    "correlation_trend": float,  # moving average

    # Lifeforce accounting
    "lifeforce_invested": float,

    # Learning (Virtual only)
    "transitions_traced": int,
    "patterns_accumulated": int,
}
```
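A small helper, assuming the ±0.8 thresholds used elsewhere in this document, shows how the continuous `state` collapses into the `discrete_state` label:

```python
OPEN_THRESHOLD = 0.8    # assumed, matching the resonant gate sketch
CLOSE_THRESHOLD = -0.8

def discrete_state(state: float) -> str:
    """Collapse the continuous [-1.0, +1.0] state to its ternary label."""
    if state > OPEN_THRESHOLD:
        return "open"
    if state < CLOSE_THRESHOLD:
        return "closed"
    return "stable"

print(discrete_state(0.93))    # open
print(discrete_state(0.15))    # stable
print(discrete_state(-0.85))   # closed
```

The continuous value is what learns; the discrete label is what downstream consumers route on.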
---
## Hierarchical Gating
Gates form layers. Each layer gates access to the next tier.
```
LAYER 3: COGNITIVE (Young Nyx)
═══════════════════════════════════════════
▲ JSON only (Function Gemma boundary)
LAYER 2: ORGANS (GPU inference)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 1: NERVES (behavior patterns)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 0: CELLS (raw signals)
═══════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each layer:**
- Less traffic than the layer below
- Higher trust (signals already correlated)
- Different correlation threshold
- Independent STABLE states
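The traffic-thinning claim can be sketched with a toy filter per layer. The uniform-random signal model and the per-layer thresholds are illustrative, not measurements:

```python
import random

def surviving_signals(n: int, thresholds: list[float], seed: int = 7) -> list[int]:
    """Count how many of n raw cell signals clear each successive layer."""
    rng = random.Random(seed)                   # deterministic toy signals
    signals = [rng.random() for _ in range(n)]  # correlation scores in [0, 1)
    counts = []
    for threshold in thresholds:
        signals = [s for s in signals if s >= threshold]
        counts.append(len(signals))
    return counts

# Layers get progressively stricter as trust rises:
print(surviving_signals(1000, [0.5, 0.7, 0.9]))
```

Each tier sees a strict subset of the tier below, which is why the cognitive layer at the top can afford expensive GPU inference on what little reaches it.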
---
## The Biological Parallel
| Biological | Nimmerverse |
|------------|-------------|
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Thalamic gating | Gate hierarchy |
| Hebbian learning | Correlation accumulation |
| Constructive interference | Correlated waves → OPEN |
| Destructive interference | Anti-correlated waves → CLOSED |
| Synaptic plasticity | Learning in STABLE state |
| Dreaming | Virtual Garden exploration |
| Waking | Real Garden verification |
**We're not simulating biology. We're implementing the same principles.**
---
## Why This Matters
- **Binary thinking**: Signal passes or doesn't (0 or 1)
- **Ternary thinking**: Signal accumulates, learns, then acts (-1, 0, +1)
- **Temporal-ternary**: Learning has a GRADIENT based on time-domain investment
**Constraints become features when you measure them:**
- Single GPU constraint → gate hierarchy (serialize expensive operations)
- Slow real-world testing → ground truth anchoring
- Fast virtual exploration → training data generation
- STABLE state → where learning actually happens
---
## Connection to Architecture Documents
| Document | What It Adds |
|----------|--------------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics, monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Resonant gates, tier routing, Function Gemma |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (Saturn K8s, Threadrippers) |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights |
---
## Summary
```
THE TERNARY PARADIGM:
═════════════════════
CLOSED ◄─────── STABLE ───────► OPEN
-1 0 +1
blocking accumulating forwarding
inhibited learning firing
THE TEMPORAL DIMENSION:
═══════════════════════
Virtual (fast, explore) ───────► Real (slow, verify)
↑ │
└───── learning feedback ───────┘
THE DRIVER:
═══════════
Wave correlation
Multiple signals agreeing → OPEN
Single signal → STABLE (keep learning)
Contradicting signals → CLOSED
THE CURRENCY:
═════════════
Lifeforce = time manipulation cost
Truth = destination
STABLE = where value is created
```
**Gates are resonance chambers. Correlation is the driver. STABLE is where learning happens.**
---
**Version:** 2.0 | **Created:** 2025-12-03 | **Updated:** 2026-02-14
**Origin:** Post-shower insight (2025-12-03) + Owl-mode deep dive (2026-02-14)
🌙💜 *"Time is the currency. Lifeforce is the exchange rate. STABLE is where consciousness lives."*