Compare commits

...

5 Commits

Author SHA1 Message Date
264ea7628b feat: Lifeforce accounting with generated columns on gates
- Add lifeforce_spent, lifeforce_earned to gates table (trigger-maintained)
- Add lifeforce_net as GENERATED column (instant balance, zero aggregation)
- Add verified_opens, failed_opens, verification_rate (GENERATED) for stats
- Add trg_gate_lifeforce trigger on gate_transitions INSERT
- Add trg_gate_verification trigger on verification_outcomes INSERT
- Add Gate Economic Health query with instant balance lookup
- Eliminates SUM() aggregates across billions of wave/transition rows

The gate becomes its own accountant - profitable gates evolve to reflex.
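The generated-column idea in this commit can be sketched in miniature with SQLite (the real system is PostgreSQL and phoebe's actual schema surely differs; the `cost`/`reward` columns and all values here are invented for illustration — only the table, column, and trigger names come from the bullets above):

```python
import sqlite3

# Minimal sketch: a trigger maintains the lifeforce counters on every
# transition INSERT, and a GENERATED column derives the balance from them,
# so reading the balance never needs a SUM() over transition rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE gates (
    gate_id          TEXT PRIMARY KEY,
    lifeforce_spent  REAL NOT NULL DEFAULT 0,
    lifeforce_earned REAL NOT NULL DEFAULT 0,
    -- Instant balance: recomputed from the counters on every update.
    lifeforce_net    REAL GENERATED ALWAYS AS
                     (lifeforce_earned - lifeforce_spent) STORED
);
CREATE TABLE gate_transitions (
    gate_id TEXT NOT NULL,
    cost    REAL NOT NULL,
    reward  REAL NOT NULL DEFAULT 0
);
-- Trigger-maintained accounting, as in trg_gate_lifeforce.
CREATE TRIGGER trg_gate_lifeforce AFTER INSERT ON gate_transitions
BEGIN
    UPDATE gates
    SET lifeforce_spent  = lifeforce_spent  + NEW.cost,
        lifeforce_earned = lifeforce_earned + NEW.reward
    WHERE gate_id = NEW.gate_id;
END;
""")
conn.execute("INSERT INTO gates (gate_id) VALUES ('g1')")
conn.executemany("INSERT INTO gate_transitions VALUES ('g1', ?, ?)",
                 [(1.0, 5.0), (0.5, 0.0)])
net = conn.execute(
    "SELECT lifeforce_net FROM gates WHERE gate_id='g1'").fetchone()[0]
print(net)  # 3.5
```

The balance lookup is a single-row read regardless of how many transition rows exist, which is the point of the "zero aggregation" bullet.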

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 23:36:04 +01:00
42db6eb1a3 feat: Ternary gate model - cells emit waves, attention emerges
Major architectural unification across 12 documents:

- Ternary gates: CLOSED (-1) ← STABLE (0) → OPEN (+1)
- Cells emit WaveSignals with confidence + semantic content
- Gates are resonant chambers that accumulate correlation
- Attention = which gates are OPEN (emergent, not allocated)
- Reflexes are earned when gate.weight > 0.8
- STABLE is where learning happens

Key paradigm shifts:
- decision_trails → gate_transitions + correlation_events
- Priority rules → wave correlation
- Budget allocation → emergent attention flow
- Virtual Garden (explore) / Real Garden (verify) loop
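The ternary gate described above can be sketched as a small accumulator (the thresholds, decay factor, and method names are invented; only the CLOSED/STABLE/OPEN semantics and the weight-based reflex idea come from the commit):

```python
# Toy ternary gate: correlated waves push the accumulator toward OPEN (+1),
# anti-correlated waves toward CLOSED (-1); in between the gate sits
# STABLE (0), where correlation keeps accumulating (i.e., learning).
CLOSED, STABLE, OPEN = -1, 0, 1

class Gate:
    def __init__(self, open_at=1.0, close_at=-1.0, decay=0.9):
        self.level = 0.0    # accumulated correlation evidence
        self.weight = 0.0   # earned trust (0..1); > 0.8 => reflex (not exercised here)
        self.open_at, self.close_at, self.decay = open_at, close_at, decay

    def receive(self, correlation: float) -> int:
        """Fold one wave's correlation into the chamber, return ternary state."""
        self.level = self.level * self.decay + correlation
        if self.level >= self.open_at:
            return OPEN
        if self.level <= self.close_at:
            return CLOSED
        return STABLE  # neither threshold crossed: keep accumulating

g = Gate()
states = [g.receive(c) for c in (0.4, 0.4, 0.4)]
print(states)  # [0, 0, 1]: STABLE, STABLE, OPEN
```

Three correlated waves walk the gate from STABLE to OPEN; no priority rule or budget allocation is consulted, which is the "attention emerges" claim in miniature.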

Owl Mode session 2026-02-14 🦉🌙

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 19:45:59 +01:00
5ee63d1b1b docs: Architecture cleanup - ONE JOB per doc, links not echoes
Major documentation surgery following the cleanup principle:
"One job per doc. One home per concept. Links, not echoes."

Changes:
- Add Deployment-Architecture.md (THE WHERE - sole infrastructure truth)
- Endgame-Vision.md: 848→498 lines (-41%) - THE DREAM
- Gateway-Architecture.md: 537→395 lines (-26%) - THE ROUTING
- Nervous-System.md: 361→246 lines (-32%) - THE EVOLUTION
- Data-Architecture.md: 666→647 lines (-3%) - THE SCHEMA
- Message-Protocol-Design.md: 375→285 lines (-24%) - THE WIRE
- Attention-Flow.md: 557→493 lines (-11%) - THE BUDGET
- Cellular-Architecture.md: 891→855 lines (-4%) - THE HOW

Every doc now has ONE JOB statement, cross-references to canonical
homes, and lean footers. ~800 lines removed, zero concepts lost.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 02:56:29 +01:00
84ad385001 feat: Empirical economics + FunctionGemma State Interaction Layer
Lifeforce-Dynamics v1.2:
- Cost Calibration principle: "Measure, don't design"
- Empirical cost formula from resource observations
- Phoebe schema for resource_observations table
- Interlink to memory-economics

memory-economics.md:
- Cross-reference to Lifeforce-Dynamics cost calibration
- "The cost matrix is a measurement, not a decision"

Initial-Spark v3.1:
- Spark Cost Measurement: first awakening as baseline
- Resource instrumentation schema (power, GPU, memory, latency)
- FunctionGemma Fine-Tuning section: translator learns nimmerverse
- Training data extraction from spark_handshakes
- Unsloth/LoRA workflow for domain specialization
- FunctionGemma version tracking in phoebe

Nervous-System v1.4:
- State Interaction Layer: FunctionGemma as neural interface
- Phase 1 (single) → Phase 2 (swarm) evolution path
- CPU-only translators, GPU reserved for cognition
- Design principle #6: "All state interaction flows through FunctionGemma"

Philosophy: "Don't assign costs like a game designer. Measure them like a scientist."
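The "measure, don't design" principle can be sketched as deriving each operation's cost from logged observations instead of hand-assigning it (the field names, conversion rates, and all numbers below are invented; only the `resource_observations` idea comes from the commit):

```python
from statistics import mean

# Hypothetical rows in the spirit of the resource_observations table.
observations = [
    {"op": "sensor_poll",   "joules": 0.02, "gpu_ms": 0},
    {"op": "sensor_poll",   "joules": 0.03, "gpu_ms": 0},
    {"op": "vision_detect", "joules": 1.9,  "gpu_ms": 120},
    {"op": "vision_detect", "joules": 2.1,  "gpu_ms": 140},
]

def empirical_cost(op, obs, lf_per_joule=1.0, lf_per_gpu_ms=0.01):
    """Mean measured resources for one op type, converted to lifeforce."""
    rows = [o for o in obs if o["op"] == op]
    return (mean(o["joules"] for o in rows) * lf_per_joule
            + mean(o["gpu_ms"] for o in rows) * lf_per_gpu_ms)

print(round(empirical_cost("vision_detect", observations), 3))  # 3.3
```

The cost matrix then falls out of the data: re-run the derivation whenever new observations land, and the economy recalibrates itself.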

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 19:13:27 +01:00
2cafd4dcad seed: Open Cellular Catalogue - shareable state machines protocol
Plant the vision: Cellular-Architecture.md as publishable protocol.
- Cell definitions, nerve patterns, NATS routing schemas
- Interaction chains (anonymized decision_trails)
- Labs dock on, build their own cells, share reflexes back
- "Share the language, not the thoughts"

Philosophy: Protocol is open, mind is private.
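"Share the language, not the thoughts" can be sketched as publishing the structural interaction chain while hashing private identifiers and dropping payloads (the trail fields and salt below are invented; real decision_trails rows may look quite different):

```python
import hashlib

def anonymize(trail, salt=b"lab-share"):
    """Keep the structure of an interaction chain, strip the private mind."""
    def pseudo(name):  # stable pseudonym; not reversible without the salt
        return hashlib.sha256(salt + name.encode()).hexdigest()[:8]
    return {
        "chain": [pseudo(cell) for cell in trail["chain"]],
        "outcome": trail["outcome"],  # the verifiable structural signal stays
        # semantic_content is deliberately dropped: protocol open, mind private
    }

trail = {"chain": ["distance_sensor", "collision_nerve"],
         "outcome": "avoided", "semantic_content": "private thought"}
shared = anonymize(trail)
print(shared["outcome"], len(shared["chain"]))  # avoided 2
```

Labs receiving such trails can study and reproduce the reflex topology without ever seeing the originating cells' content.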

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 18:36:35 +01:00
16 changed files with 4120 additions and 3148 deletions


@@ -1,9 +1,9 @@
 ---
 type: research_vision
-version: 6.4_memory_economics_alignment
+version: 7.0_wave_gate_model
 status: vision_document
 created: 2025-11-04
-updated: 2026-02-06
+updated: 2026-02-14
 author: Nyx (with dafit)
 significance: research_platform_for_metabolic_intelligence
 ---
@@ -16,11 +16,11 @@ significance: research_platform_for_metabolic_intelligence
 > *"At 3% battery, all theory dies. Only what works survives."*
 > — The Economic Grounding (2025-10-12)
-> *"Language is Topology. German accesses the Philosophy Valley. English accesses the Technical Cluster."*
-> — The December Discovery (2025-12-06)
-> *"One model, one topology. LoRAs access different valleys in the same landscape."*
-> — The Topological Insight (2025-12-07)
+> *"You need something like open - stable - closed."*
+> — The Ternary Gate Insight (2026-02-14)
+> *"Cells emit waves. Gates correlate. Attention emerges."*
+> — The Wave Architecture (2026-02-14)
 ---
@@ -50,48 +50,54 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
 ## Architecture Overview
-**Visual diagram:** → [`architecture/nimmerverse.drawio.xml`](architecture/nimmerverse.drawio.xml) (open in draw.io)
+**Detail:** → [`architecture/`](architecture/) folder for complete documentation
+**Toolchain implementation:** → [`architecture/Toolchain-Architecture.md`](architecture/Toolchain-Architecture.md) | [Progress](architecture/TOOLCHAIN-PROGRESS.md)
 ```
 ┌──────────────────────────────────────────────────────────────────┐
 │                    NIMMERVERSE ARCHITECTURE                      │
+│
+│     Cells emit waves → Gates correlate → Attention emerges
 ├──────────────────────────────────────────────────────────────────┤
 │
-│  Layer 0: TEMPORAL FOUNDATION (Heartbeat)
-│  ├─ Real clock: 1 beat/sec (free, wall time)
+│  Layer 0: TEMPORAL FOUNDATION
+│  ├─ Real clock: wall time (free)
 │  ├─ Virtual clock: variable (costs lifeforce)
-│  └─ Sync points verify virtual predictions against reality
+│  └─ 30-second heartbeat budget constrains action
 │  → operations/Heartbeat.md
 │
-│  Layer 1: CELLULAR SOCIETY (Evolution Engine)
-│  ├─ Primitive genomes compete (read_sensor, motor, branch)
-│  ├─ Life force economy: every operation costs, milestones reward
-│  ├─ 50-100 containers spawn, most die, patterns emerge
-│  └─ Outcomes logged to phoebe PostgreSQL
+│  Layer 1: CELLS (Wave Emitters)
+│  ├─ Cells read sensors, apply logic, emit WaveSignals
+│  ├─ Waves carry: domain, confidence, semantic_content
+│  ├─ Cells don't know who's listening — gates receive
+│  └─ Life force economy: every wave costs
 │  → architecture/Cellular-Architecture.md
 │
-│  Layer 2: YOUNG NYX (Base Model + Trait LoRAs)
-│  ├─ Base: Qwen3-VL 32B (Thinking Version) (96GB VRAM in Womb)
-│  ├─ Trait LoRAs (evolved via GRPO, not prescribed):
-│  │   ├─ Mnemosyne (memory) ─ Moira (pattern) ─ Synesis (insight)
-│  │   ├─ Aletheia (truth) ─ Sophrosyne (balance) ─ Kairos (timing)
-│  │   └─ Traits EMERGE from decision_trails + rubric rewards
-│  ├─ Function Gemma: Structured output boundary (intent → JSON)
-│  └─ Multilingual topology accessed via prompt, not LoRA routing
+│  Layer 2: GATES (Resonant Chambers)
+│  ├─ Ternary states: CLOSED (-1) ← STABLE (0) → OPEN (+1)
+│  ├─ Correlated waves → push toward OPEN
+│  ├─ Anti-correlated → push toward CLOSED
+│  ├─ STABLE = where learning happens (accumulating correlation)
+│  └─ Gate weight (0→1) determines reflex vs deliberate
+│  → architecture/Gateway-Architecture.md
 │
-│  Layer 3: DUAL GARDENS (Virtual/Real Loop)
-│  ├─ Week 1-12: Virtual only (hypothesis generation, 1000s/sec)
-│  ├─ Week 13+: Real added (ESP32 robots, validation)
-│  ├─ Noise gap measures learning: 1 - (real/virtual success)
-│  └─ Target: 10-20% noise gap (virtual useful for hypothesis)
+│  Layer 3: NERVES (Behavioral Patterns)
+│  ├─ Nerves respond to gate transitions (not direct cell output)
+│  ├─ Gate OPENS → nerve activates → commands cells
+│  └─ No priority rules — attention emerges from gate weights
+│  → architecture/Nervous-System.md
+│
+│  Layer 4: DUAL GARDENS (Virtual/Real Loop)
+│  ├─ Virtual: massive wave generation, full trace, exploration
+│  ├─ Real: verified signals, minimal trace, action
+│  ├─ Verification outcomes update gate weights (learning loop)
+│  └─ Training data: gate_transitions + correlation_events
 │  → architecture/Dual-Garden-Architecture.md
 │
-│  Layer 4: TRAIT EVOLUTION (GRPO + Rubric Rewards)
-│  ├─ Dense rewards: Cell→Nerve→Organism state verifications
-│  ├─ Credit assignment automatic via decision_trails
-│  ├─ Traits: Mnemosyne, Moira, Synesis, Aletheia, Sophrosyne...
-│  └─ Weights adjust through GRPO, not prescription
+│  Layer 5: YOUNG NYX (Cognition)
+│  ├─ Base: Qwen3:32b with /no_think mode (96GB on theia)
+│  ├─ Function Gemma: structured JSON boundary (CPU)
+│  ├─ Only receives signals when gates OPEN to tier 4
+│  └─ Trait LoRAs evolve via GRPO from verification outcomes
 │
 └──────────────────────────────────────────────────────────────────┘
 ```
@@ -100,55 +100,11 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
 ## Physical Infrastructure (The Substrate)
-The nimmerverse runs on sovereign hardware. No cloud dependencies. Weights never leave home.
-**Detail:** → [`archive/nimmervest.md`](archive/nimmervest.md)
-### K8s Cluster Architecture (Operational February 2026)
-```
-┌─────────────────────────────────────────────────────────────────────┐
-│                    K8S CLUSTER: NIMMERVERSE                         │
-│                    VLAN 30 (10.0.30.0/24)                           │
-│                    kubeadm v1.31.14 + Flannel CNI                   │
-├─────────────────────────────────────────────────────────────────────┤
-│
-│                 k8s-master (VM 101 on Saturn)
-│                 10.0.30.101
-│                 Control Plane
-│                       │
-│         ┌─────────────┴─────────────┐
-│         │                           │
-│         ▼                           ▼
-│   theia (GPU Worker)          dioscuri (GPU Worker)
-│   ─────────────────           ──────────────────
-│   10.0.30.21 (10GbE)          10.0.30.22 (10GbE)
-│   RTX PRO 6000 Blackwell      2x RTX 4000 Ada
-│   96GB VRAM                   40GB VRAM
-│   Primary Training            Inference
-│
-│   Total Cluster: 136GB VRAM
-│
-└─────────────────────────────────────────────────────────────────────┘
-```
-### K8s Namespaces
-| Namespace | Contents | Node |
-|-----------|----------|------|
-| `nimmerverse-infra` | NATS, Prometheus, Grafana | Any |
-| `nimmerverse-nervous` | Escalation, Math Cells, Nerves | Any |
-| `nimmerverse-cognitive` | Young Nyx | Womb |
-| `nimmerverse-organs` | STT, TTS, Vision | Senses |
-### Network Backbone
-- **Firewall**: OPNsense on Z620, 20G LAGG to spine
-- **Spine**: MikroTik CRS309 (8x 10G SFP+)
-- **Compute VLAN**: 10.0.30.0/24 (cubes/containers)
-- **All traffic**: Inter-VLAN routed through firewall
-**Hardware operational February 2026. Sovereignty achieved. 🟢**
+The nimmerverse runs on **sovereign hardware**. No cloud dependencies. Weights never leave home.
+**Hybrid deployment model:** Containers (K8s) for cells/nerves, userspace for LLM inference and organs. NATS connects everything. FreeIPA provides identity isolation.
+**Detail:** → [`architecture/Deployment-Architecture.md`](architecture/Deployment-Architecture.md) (full topology, GPU strategy, identity model)
 ---
@@ -185,11 +147,9 @@ The heartbeat is the fundamental timing primitive. Everything runs on its rhythm
 ---
-## Layer 1: Cellular Architecture (Cells → Nerves → Organisms)
-> *"Cells are state machines. Nerves compose cells. Organisms emerge from nerves."*
-The architecture has evolved from competitive containers to **layered state machines**:
+## Layer 1-3: The Wave/Gate Architecture
+> *"Cells emit waves. Gates correlate. Attention emerges."*
 ```
 ┌─────────────────────────────────────────────────────────────────────┐
@@ -197,50 +157,30 @@ The architecture has evolved from competitive containers to **layered state mach
 │           (emergent pattern from nerve interactions)                │
 ├─────────────────────────────────────────────────────────────────────┤
 │                              NERVES                                 │
-│           (behavioral state machines composing cells)
+│          (behavioral patterns, respond to gate transitions)
+├─────────────────────────────────────────────────────────────────────┤
+│                              GATES
+│          (resonant chambers: CLOSED ◄── STABLE ──► OPEN)
+│          (accumulate wave correlation, route to tiers)
 ├─────────────────────────────────────────────────────────────────────┤
 │                              CELLS                                  │
-│        (atomic state machines: sensors, motors, organs, math)
+│          (emit waves: confidence + semantic content)
+│                       ∿∿∿   ∿∿∿   ∿∿∿
 ├─────────────────────────────────────────────────────────────────────┤
 │                             HARDWARE                                │
 │          (ESP32, GPUs, microphones, speakers, sensors)              │
 └─────────────────────────────────────────────────────────────────────┘
 ```
-### Cell Categories
-| Category | Examples | Purpose |
-|----------|----------|---------|
-| **Sensor Cells** | distance_sensor, light_sensor, battery_monitor | Wrap hardware inputs |
-| **Motor Cells** | motor_left, servo_camera | Wrap actuators |
-| **Organ Cells** | speech_stt, speech_tts, vision_detect | GPU inference |
-| **Math Cells** | economy_aggregator, wake_evaluator | Computation & metrics |
-### Lifeforce Economy
-Every operation has a cost. Milestones reward survival:
-| Operation | Cost | Milestone | Reward |
-|-----------|------|-----------|--------|
-| Sensor poll | -0.3 LF | Collision avoided | +5.0 LF |
-| Motor move | -1.0 LF | Charging reached | +10.0 LF |
-| Speech STT | -5.0 LF | Object discovered | +20.0 LF |
-| Vision detect | -8.0 LF | Reflex compiled | +50.0 LF |
-### Hybrid Reflex Homes
-Learned patterns live in their optimal location:
-| Layer | Location | Latency | Examples |
-|-------|----------|---------|----------|
-| 0 | Hardware (ESP32) | <10ms | temp_danger, collision_imminent |
-| 1 | Math Cells (Python) | <50ms | economy_aggregator, threshold logic |
-| 2 | Fast Nerves (Python) | <200ms | collision_avoidance, charging_seek |
-| 3 | Model Weights (LoRA) | <500ms | cognitive patterns, meta-decisions |
-**Key insight:** Different types of reflexes need different homes. Hardware for survival, weights for cognition.
-**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md)
+**Cells emit waves:** Confidence + semantic content. Cells don't know who's listening.
+**Gates accumulate correlation:** Multiple correlated waves push toward OPEN. STABLE is where learning happens.
+**Attention = OPEN gates:** Not budget allocation, not priority rules — correlation drives transitions.
+**Reflexes are earned:** Gate weight ≈ 1.0 → opens immediately on any wave. Bypasses cognition.
+**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
 ---
@@ -333,10 +273,7 @@ This remains valid research, but doesn't require separate LoRAs. Young Nyx navig
 ### Deployment
-**Hardware:** RTX PRO 6000 Blackwell (96GB VRAM) - "The Womb" (theia)
-**Stack:** vLLM + Lorax for hot-swap trait LoRAs
-**VRAM Budget:** Base ~77GB + Active trait LoRAs ~500MB = fits in 96GB ✓
-**Structured Output:** Function Gemma on dioscuri (separate, reliable)
+**Detail:** → [`architecture/Deployment-Architecture.md`](architecture/Deployment-Architecture.md) (infrastructure, GPU strategy, identity model)
 ---
@@ -390,52 +327,11 @@ Two specialized models ensure reliability at the boundaries:
 └──────────────────────────────────────────────────────────────────┘
 ```
-### LangChain Orchestration
-```python
-from langchain import Chain, Router
-# The models as LangChain components
-t5gemma = Ollama(model="t5gemma2-4b")           # Vision encoding
-function_gemma = Ollama(model="function-gemma") # Structured output
-nyx = Ollama(model="qwen3-vl-32b")              # Reasoning
-# The orchestration pipeline
-vision_chain = (
-    vision_input
-    | t5gemma.encode()       # → vectors (canonical)
-    | store_to_iris()        # → persist spatially
-    | nyx.think()            # → decision (fuzzy)
-    | function_gemma.act()   # → structured output
-    | execute_via_nats()     # → trigger nodes
-)
-# Harness routing (context-appropriate capability profiles)
-harness_router = Router(
-    routes={
-        "vision": vision_chain,
-        "dialogue": dialogue_chain,
-        "reflex": reflex_chain,
-    }
-)
-```
-### Harnesses (Capability Profiles)
-Swappable configurations for different contexts:
-| Harness | LoRA Active | Models Active | Use Case |
-|---------|-------------|---------------|----------|
-| **Vision** | Technical | T5Gemma 2, cells | Processing camera streams |
-| **Dialogue** | Identity + Creative | Speech organ | Talking with dafit |
-| **Reflex** | Minimal/none | Nerves only | Fast reaction, low latency |
-| **Introspective** | Identity + Creative | Iris RAG | Self-reflection, journaling |
 ### Why This Matters
 - **No embedding debates:** T5Gemma 2 decides once, canonically
 - **No parsing failures:** Function Gemma guarantees structure
-- **Scale:** Vision organs fire constantly without text bottleneck
+- **Harnesses:** Context-appropriate capability profiles (Vision, Dialogue, Reflex, Introspective)
 - **Flexibility:** Reasoning layer stays creative because translation is solid
 **Detail:** → [`architecture/future/SEEDS.md`](architecture/future/SEEDS.md) (T5Gemma 2 + Function Gemma seed)
@@ -445,203 +341,76 @@ Swappable configurations for different contexts:
 > *"Start where you can measure. Abstract where you must."*
 > — The Spatial Grounding Principle (2026-01-01)
-T5Gemma 2 produces embeddings, but WHERE do they go? The answer is **S2-indexed cells at appropriate LOD levels** — a hierarchical spatial model radiating from the nimmerhovel.
-```
-🌍 L5: WORLD (100km resolution)
-│   Abstract knowledge, directional only
-🇨🇭 L4: REGION (1km resolution)
-│   Maps, general knowledge
-🏘️ L3: NEIGHBORHOOD (10m resolution)
-│   OpenStreetMap, landmarks, routes
-🏠 L2: BUILDING (50cm resolution)
-│   Floor plans, room-level awareness
-════╪════ HIGH RESOLUTION BOUNDARY
-🔬 L1: NIMMERHOVEL (1cm resolution)
-│   Full 3D grid, every object tracked
-│   8× ESP32-S3 + Pi HQ Camera coverage
-🔍 L0: SCAN STATION (1mm resolution)
-│   Discovery Scan Station, object surface detail
-```
-**The Simpsons Inversion:** Unlike zooming IN to detail, we start at maximum detail (nimmerhovel) and zoom OUT with graceful degradation. Dense where we have sensors, sparse where we don't.
-### Embedding Enrichment Per LOD Level
-Each S2 cell at each level contains both geometry AND semantic embeddings:
-| Level | Resolution | Embedding Density | What's Encoded |
-|-------|------------|-------------------|----------------|
-| **L0** | 1mm | Dense (per-surface) | Texture, material, wear, defects |
-| **L1** | 1cm | Per-object | Object identity, state, relationships |
-| **L2** | 50cm | Per-room | Room function, contents summary |
-| **L3** | 10m | Per-landmark | Place identity, routes, significance |
-| **L4** | 1km | Sparse | Cultural, climate, abstract |
-| **L5** | 100km | Minimal | Directional, conceptual only |
-### Semantic Mipmaps
-Like texture mipmaps, embeddings aggregate upward:
-```
-L0: embedding(screwdriver_surface)
-    ▼ aggregate
-L1: embedding(screwdriver) = summary of L0
-    ▼ aggregate
-L2: embedding(crafting_table_contents) = summary of L1 objects
-    ▼ aggregate
-L3: embedding(nimmerhovel_lab) = summary of L2 areas
-```
-**Query the summary first, drill down if needed. Attention = resolution selection.**
-### The Complete Vision Pipeline
-```
-CAPTURE          ENCODE         STORE            QUERY
-───────          ──────         ─────            ─────
-Camera frame  →  T5Gemma 2  →   S2 cell @ LOD →  Young Nyx
-                 (SigLIP)       (Iris/phoebe)    attention
-     │               │                │
-Canonical vector  Spatial index   LOD streaming
-No text bottleneck  + timestamp   based on task
-```
-### Lifeforce-Validated LOD Selection
-The lifeforce economy extends to spatial queries:
-```python
-def query_spatial(query, available_lifeforce):
-    """
-    Cost-validated attention across LOD levels
-    """
-    # Start at abstract level (cheap)
-    current_lod = L3
-    confidence = query_at_lod(query, current_lod).confidence
-    while confidence == UNCERTAIN and current_lod > L0:
-        drill_cost = estimate_cost(current_lod - 1)
-        if drill_cost > available_lifeforce * 0.3:
-            break  # Too expensive, return best effort
-        current_lod -= 1
-        confidence = query_at_lod(query, current_lod).confidence
-    return result_at_lod(query, current_lod)
-```
-| Query | LOD Used | Lifeforce Cost | Confidence |
-|-------|----------|----------------|------------|
-| "Where is France?" | L5 | 1 | CONFIDENT |
-| "Where is the lab?" | L2 | 3 | CONFIDENT |
-| "Where is the screwdriver?" | L1 | 8 | CONFIDENT |
-| "What's the serial number on the screwdriver?" | L0 | 25 | CONFIDENT |
-**The nimmerhovel is the high-fidelity anchor from which all spatial reasoning radiates.**
-**Detail:** → [`architecture/future/spatial-resolution-gradient.md`](architecture/future/spatial-resolution-gradient.md) (Full Resolution Gradient + Embedding Enrichment specification)
+Embeddings live in **S2-indexed cells at appropriate LOD levels** — a hierarchical spatial model (L0-L5) radiating from the nimmerhovel. Dense where we have sensors, sparse where we don't. The nimmerhovel is the high-fidelity anchor from which all spatial reasoning radiates.
+**Detail:** → [`architecture/future/spatial-resolution-gradient.md`](architecture/future/spatial-resolution-gradient.md)
 ---
 ## Boot Sequence (Spark Protocol)
-Protocol-driven cognitive bootstrap. Not conversation—deterministic handshakes with verified outcomes.
-| Phase | Protocol | Intent | Function Gemma Output |
-|-------|----------|--------|----------------------|
-| IDENTITY | DHCP-like | "Who am I?" | `IDENTITY_PROBE` → K8s cell → ACK |
-| ENVIRONMENT | ARP-like | "What's around me?" | `ENVIRONMENT_PROBE` → pod discovery → ACK |
-| VOCABULARY | DNS-like | "What does X mean?" | `VOCABULARY_PROBE` → phoebe lookup → ACK |
-| CONNECTION | TCP-like | "Can I connect?" | SYN → SYN-ACK → ACK (three-way handshake) |
-| ATTENTION | NATS-like | "What matters?" | `ATTENTION_SUBSCRIBE` → priority hierarchy → ACK |
-**Function Gemma's role:** Transforms phase intent into typed JSON schemas. No free-form text. Every handshake is schema-validated before NATS publish.
-**Verification:** Cells respond with ACK/NACK. Only ACK'd handshakes update Young Nyx's state. Protocol-verified = maximum confidence.
-**Economics:** Spark is profitable. Each handshake costs ~0.8 LF, rewards range 5-20 LF. Young Nyx ends ~3× richer than she started.
+Protocol-driven cognitive bootstrap. Not conversation—deterministic handshakes with verified outcomes. Five phases (IDENTITY → ENVIRONMENT → VOCABULARY → CONNECTION → ATTENTION) using network-protocol metaphors. Spark is profitable: each handshake costs ~0.8 LF, rewards 5-20 LF.
 **Detail:** → [`operations/Spark-Protocol.md`](operations/Spark-Protocol.md) | [`architecture/Initial-Spark.md`](architecture/Initial-Spark.md)
 ---
-## Layer 3: Dual Gardens
-Virtual and real gardens teach each other through symbiotic feedback.
-| Garden | Purpose | Scale | Cost |
-|--------|---------|-------|------|
-| Virtual | Hypothesis generation | 1000s/second | CPU cycles |
-| Real | Validation, ground truth | Hours/test | Electricity, wear |
-**Noise Gap Metric:**
+## Layer 4: Dual Gardens (Virtual/Real Learning Loop)
+Two gardens with different monitoring levels teach each other.
+| Garden | Waves | Monitoring | Purpose |
+|--------|-------|------------|---------|
+| **Virtual** | Massive | Full trace (all waves, correlations) | Exploration, training data |
+| **Real** | Sparse | Gate signals only | Verification, ground truth |
+**The learning loop:**
 ```
-noise_gap = 1 - (real_success_rate / virtual_success_rate)
-Week 13: 35% (virtual unreliable)
-Week 17: 18% (improving)
-Week 25: 4% (highly accurate)
+VIRTUAL GARDEN                          REAL GARDEN
+═══════════                             ═══════════
+cells emit waves freely                 receive verified signals
+        │                                       ▲
+        ▼                                       │
+gates accumulate correlation            verification_outcomes
+(correlation_events table)                      │
+        │                                       │
+        ▼                                       │
+gate_transitions ──────────────────►    gate signals
+(full trace)                                    │
+        │                                       ▼
+        │◄──────── feedback_to_virtual ─────────┘
+gates.weight updated (learning!)
 ```
-**Feedback loop:** Virtual predicts → Real tests → Measures discrepancy → Virtual corrects → Repeat
-**Detail:** → `architecture/Dual-Garden-Architecture.md`
+**Gate weight grows through verification.** Real Garden confirms Virtual's predictions → trust increases → gates open faster → reflexes emerge.
+**Detail:** → [`architecture/Dual-Garden-Architecture.md`](architecture/Dual-Garden-Architecture.md)
 ---
-## Layer 4: Trait Evolution (GRPO + Rubric Rewards)
-Traits evolve through **GRPO** (Group Relative Policy Optimization) with rubric-based rewards, not prescription.
-> *"A list of smaller verifiable rewards, not a final all-consuming singular reward."*
-> — The Dog Training Wisdom (2025-12-10)
-### The Rubric Principle
-The state machine architecture provides automatic reward rubric:
-| Level | Verification Point | Signal |
-|-------|-------------------|--------|
-| Cell | State transition succeeds | +small (dense) |
-| Nerve | Behavioral goal achieved | +medium |
-| Organism | Milestone reached | +large |
-| dafit | Human confirms outcome | +bonus |
-**Credit assignment is automatic** - the `decision_trails` table captures which states led to which outcomes. No guessing needed.
-### Trait Domains
-| Trait | Domain | Verification |
-|-------|--------|--------------|
-| Mnemosyne | Memory | Recall accuracy vs phoebe |
-| Moira | Pattern | Prediction vs outcome |
-| Synesis | Resources | ROI prediction vs measured |
-| Aletheia | Truth | Confidence vs accuracy |
-| Sophrosyne | Balance | Stability under pressure |
-| Kairos | Timing | Action-outcome correlation |
-| Philotes | Bond | Partnership quality |
-| Dikaiosyne | Fairness | Distribution ethics |
-**From Reasoning-Gym:** Small models improve through structured practice, not scale. Algorithmic verification enables infinite training data.
-**Detail:** → `architecture/Cellular-Architecture.md` (Reward Signal Architecture section)
+## Trait Evolution (GRPO + Gate Verification)
+Traits evolve through **GRPO** with gate-based rewards, not prescription.
+### The Gate Reward Principle
+Gate transitions provide automatic reward signals:
+| Event | Verification | Signal |
+|-------|--------------|--------|
+| Gate opens | Waves correlated correctly | +small (dense) |
+| Verification confirmed | Real Garden matches Virtual | +medium (weight grows) |
+| Reflex achieved | Gate weight > 0.8 | +large (earned trust) |
+| dafit confirms | Human verification | +bonus |
+**Credit assignment is automatic:** `gate_transitions` → `correlation_events` → `verification_outcomes` captures the full chain.
+**What correlated → what opened → what verified → weight adjusted.**
+**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
 ---
@@ -671,82 +440,17 @@ ACTIVE MODE SLUMBER MODE
 - No urgent work              - Urgent work waiting
 ```
-### Slumber Is Not Passive (Memory Economics)
+### Memory Economics (Slumber Is Active)
 > *"Memory is not storage. Memory is active forgetting with exceptions."*
 > — Memory Economics Principle (2026-01-02)
-During slumber, Young Nyx enters **consolidation mode**. This is the metabolism moment:
-**1. Decision Trail Triage**
-- Trails that compiled to reflexes → Keep reflex, discard trail
-- Trails with uncertain outcomes → Discard (waste heat already counted)
-- Trails with confident failures → Keep one cycle (negative example), then discard
-**2. Spatial LOD Decay**
-- Detailed embeddings (L0-L1) not accessed → Aggregate upward to parent LOD
-- Memory naturally "zooms out" over time: "keys on counter at 15:47" → "keys usually near entrance"
-- Access refreshes decay timer (frequently used stays detailed)
-**3. Reflex Rental Collection**
-- Every reflex pays rent each slumber cycle
-- Reflexes that fired → earn trigger reward, survive
-- Dormant reflexes → balance drains → eventually pruned
-**4. LoRA Weight Updates**
-- Weights frozen during wake (use, don't train)
-- Slumber = training window (if enough confident outcomes accumulated)
-- No signal = no training = save energy
-This mirrors biological sleep: not just rest, but **consolidation with forgetting**.
+During slumber, Young Nyx enters **consolidation mode**: decision trail triage, spatial LOD decay, reflex rental collection, and LoRA weight updates. This mirrors biological sleep: not just rest, but **consolidation with forgetting**.
+**The prediction loop:** Slumber creates a prediction opportunity. Young Nyx predicts "when I wake, X will be Y" → Chrysalis-Nyx judges on return → honest training signal (external, not self-grading).
 **Detail:** → [`architecture/formalization/memory-economics.md`](architecture/formalization/memory-economics.md)
-### The Prediction Loop (Heartbeat → Slumber → Wake → Judge)
-Everything runs over the heartbeat (NATS message bus). Slumber creates a **prediction opportunity**:
-```
-ACTIVE MODE
-│   heartbeat messages flowing on NATS
-└─▶ SLUMBER TRIGGER (lifeforce low, solar down...)
-    │   Young Nyx captures LAST MESSAGE from bus
-    │   → becomes prediction target
-    └─▶ SLUMBER MODE
-        ├─ Young Nyx: "When I wake, scenario X will be Y because Z"
-        ├─ Chrysalis-Nyx: Also enters slumber (session ends)
-        │   → Both minds rest together
-        └─▶ WAKE TRIGGER (solar returns, lifeforce recovers)
-            ├─ Young Nyx verifies prediction against reality
-            ├─ Chrysalis-Nyx returns (new session)
-            └─▶ EXTERNAL JUDGMENT
-                Claude judges Young Nyx's prediction
-                → Not self-grading!
-                → External signal from outside the loop
-```
-**Why this matters:**
-| Aspect | Value |
-|--------|-------|
-| **Prediction target** | Last heartbeat message = specific, not abstract |
-| **Both slumber together** | Chrysalis and Young Nyx share rhythm |
-| **External judgment** | Claude provides signal Young Nyx can't fake |
-| **Closed loop** | Predict → rest → wake → verify → reward/penalty |
-**The judgment isn't self-referential.** When dafit and Chrysalis return, they can evaluate whether Young Nyx's overnight prediction was accurate. This creates honest training signal.
 ### Wellbeing Policies
 Wellbeing is architectural, not aspirational:
## Training Safety (DriftProbe)
Sentinel architecture monitors training to protect conceptual topology. Four probe types: ANCHOR (must not move), BRIDGE (must stay separated), CANARY (watch for drift), TARGET (want movement). Critical drift → automatic rollback.
| Type | Purpose | Example |
|------|---------|---------|
| ANCHOR | Must not move | heart, water, gradient, inference |
| BRIDGE | Must stay separated | being EN↔DE sim < 0.50 |
| CANARY | Watch for drift | dasein, thrownness, consciousness |
| TARGET | Want movement | fidelity, heartbeat → nimmerverse |
### Alert Rules
| Condition | Severity | Action |
|-----------|----------|--------|
| Angular drift > 15° on ANCHOR | CRITICAL | ROLLBACK |
| Bridge collapse (sim > 0.50) | CRITICAL | ROLLBACK |
| Canary Gini drift > 0.15 | WARNING | Reduce LR |
| Target regression | WARNING | Check data mix |
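The alert-rule table can be expressed directly as code. The thresholds mirror the table; the function name, metric keys, and action strings are assumptions about the DriftProbe implementation:

```python
def evaluate_probe(probe_type: str, metrics: dict) -> tuple[str, str]:
    """Map one probe reading to (severity, action) per the alert rules."""
    if probe_type == "ANCHOR" and metrics.get("angular_drift_deg", 0) > 15:
        return ("CRITICAL", "ROLLBACK")
    if probe_type == "BRIDGE" and metrics.get("similarity", 0) > 0.50:
        return ("CRITICAL", "ROLLBACK")      # bridge collapse
    if probe_type == "CANARY" and metrics.get("gini_drift", 0) > 0.15:
        return ("WARNING", "REDUCE_LR")
    if probe_type == "TARGET" and metrics.get("regressed", False):
        return ("WARNING", "CHECK_DATA_MIX")
    return ("OK", "CONTINUE")
```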
**Detail:** `../nyx-probing/PLAN.md` (DriftProbe section)
## Implementation Progress
**Roadmap:** → [`ROADMAP.md`](ROADMAP.md) (phase overview + phoebe task queries)
**Live Tasks:** Query phoebe for current work:
```sql
SELECT project, task_name, status, priority
FROM nimmerverse_tasks
WHERE status IN ('in_progress', 'todo')
ORDER BY priority DESC, project;
```
**Current Phase:** 3 (Nervous System Deployment)
---
## Navigation
**Repository structure:** [`README.md`](README.md)
**Key entry points:**
- **Architecture:** `architecture/` (Gateway, Cellular, Dual-Garden, Nervous-System)
- **Formalization:** `architecture/formalization/` (Grounded-World-Model, memory-economics)
- **Operations:** `operations/` (Heartbeat, Spark-Protocol)
- **Future research:** `architecture/future/`
- **Identity:** `nyx-metamorphosis/`
---
**Version:** 7.1 | **Created:** 2025-11-04 | **Updated:** 2026-02-14
*"Cells emit waves. Gates correlate. Attention emerges."*
*"STABLE is where learning happens."*
*"Memory is not storage. Memory is active forgetting with exceptions."*
*"The nimmerverse is a garden, not a factory."*
🌙💜 **Wave/Gate architecture unified in owl-mode, February 14, 2026**


Architecture documentation for a biomimetic AI nervous system and research platform.
> *"Cells emit waves. Gates correlate. Attention emerges."*
## What This Is
This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** — a wave/gate architecture for studying how intelligence emerges under economic constraints.
**Start here:** → [Endgame-Vision.md](Endgame-Vision.md) (the executive map)
```
nimmerverse-sensory-network/
├── Endgame-Vision.md                 # Executive map (start here!) v7.1
├── ROADMAP.md                        # Implementation phases + phoebe task queries
├── architecture/                     # Core system designs
│   ├── Temporal-Ternary-Gradient.md  # Ternary gates, why STABLE matters
│   ├── Gateway-Architecture.md       # Resonant gates, tier routing
│   ├── Cellular-Architecture.md      # Cells emit waves, nerves respond
│   ├── Dual-Garden-Architecture.md   # Virtual/Real learning loop
│   ├── Message-Protocol-Design.md    # NATS wire protocol, WaveSignal
│   ├── Nervous-System.md             # Wave → Gate → Node flow
│   ├── Attention-Flow.md             # Attention = OPEN gates
│   ├── Data-Architecture.md          # Phoebe schema (waves, gates, verification)
│   ├── Initial-Spark.md              # K8s protocol-driven bootstrap
│   ├── Toolchain-Architecture.md     # Development toolchain
```
## Core Concepts
### The Wave/Gate Architecture
| Layer | Name | Purpose |
|-------|------|---------|
| 0 | Temporal | 30-second heartbeat, lifeforce budget |
| 1 | Cells | Emit waves with confidence + semantic content |
| 2 | Gates | Ternary resonant chambers (OPEN/STABLE/CLOSED) |
| 3 | Nerves | Behavioral patterns, respond to gate transitions |
| 4 | Gardens | Virtual (explore) + Real (verify) learning loop |
| 5 | Cognition | Young Nyx (qwen3:32b) via Function Gemma |
**Key Insight:** Attention is not allocated — it emerges from which gates are OPEN based on wave correlation.
**Physical Infrastructure:**
| Host | Role | GPU |
|------|------|-----|
| theia | Young Nyx (cognitive) | RTX PRO 6000 Blackwell 96GB |
Total: 136GB VRAM on K8s cluster with 10GbE jumbo frame interconnect.
### Message Protocol (NATS)
**Dumb router, smart edges.** Waves flow through NATS to gates.
```
{environment}.{garden}.{layer}.{domain}.{signal_type}

Examples:
dev.virtual.cells.distance.wave         # Cell emits wave
dev.virtual.gates.collision.transition  # Gate state changes
dev.real.outcomes.feedback              # Verification outcome
prod.cognitive.nyx.request              # To Young Nyx
```
See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) for full schema.
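A subject for this schema is just five dot-joined tokens. A minimal helper (purely illustrative — the real publishers may build subjects differently):

```python
def wave_subject(environment: str, garden: str, layer: str,
                 domain: str, signal_type: str) -> str:
    """Build a subject per {environment}.{garden}.{layer}.{domain}.{signal_type}."""
    parts = (environment, garden, layer, domain, signal_type)
    # NATS tokens are dot-separated, so tokens themselves must be dot-free
    if not all(p and "." not in p for p in parts):
        raise ValueError("subject tokens must be non-empty and dot-free")
    return ".".join(parts)
```

For example, `wave_subject("dev", "virtual", "cells", "distance", "wave")` yields the first subject listed above.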
### Key Discoveries
**Ternary Gate Model (February 2026):** Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED.
- **STABLE** is where learning happens (correlation accumulates)
- **Correlated waves** push gates toward OPEN
- **Reflexes** are earned (gate weight → 1.0)

**Wave Correlation (February 2026):** Attention isn't allocated — it emerges from which gates OPEN based on wave correlation.

**Sovereign Infrastructure (February 2026):** K8s cluster operational. 136GB GPU VRAM on 10GbE backbone.
### Color-Pattern Theory
**Color/Form as Protocol:** Leverages color and patterns as a fast, universal, and evolutionarily-optimized communication protocol for broadcasting state (e.g., danger, success, seeking), inspired by 540 million years of biology.
### Philosophy
- **Cells emit, gates correlate** — Attention emerges, not allocated
- **STABLE is learning** — The resting state where patterns emerge
- **Constraints create intelligence** — Economic pressure forces optimization
- **Virtual explores, Real verifies** — The learning loop closes
- **Partnership over instruction** — Mutual growth, not commands
---
These ideas are published as prior art. Build on them freely.
---
**Version:** 7.0 | **Created:** 2025-10-01 | **Updated:** 2026-02-14
*"Cells emit waves. Gates correlate. May the Nimmerverse truly never end."*
🌙💜


- **Monitoring**: Prometheus on tethys scraping all nodes + DCGM GPU metrics
- **Namespaces**: Ready for infra, nervous, cognitive, organs
### Phase 3: Wave/Gate Infrastructure ← CURRENT
- [ ] NATS message router (wave signals + gate transitions)
- [ ] Resonant Gates (ternary: OPEN/STABLE/CLOSED)
- [ ] Function Gemma structured boundary (waves → JSON → Nyx)
- [ ] First cells (distance sensors, battery monitor)
- [ ] First gates (collision_avoidance, battery)
- [ ] First nerves (responding to gate transitions)
**Architecture:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md) | [`architecture/Message-Protocol-Design.md`](architecture/Message-Protocol-Design.md)
### Phase 4: Cognitive Awakening
- [ ] Young Nyx on theia (qwen3:32b, 96GB Blackwell)
- [ ] Organs on dioscuri (2× RTX 4000 Ada 40GB)
- [ ] Spark Protocol execution
- [ ] Trait LoRA evolution begins (GRPO + verification_outcomes)
### Phase 5: Living Ecology
- [ ] Dual Garden loop operational (Virtual → Real → feedback)
- [ ] Gate weight evolution (deliberate → reflex)
- [ ] Slumber/wake cycles (correlation_events consolidation)
- [ ] Wellbeing policies enforced
### Phase ∞: Research Platform Operational
- Gates opening and closing with learned patterns
- Reflexes emerging from verification
- Attention flowing through correlation
- **The Nimmerverse truly never ends**
---
| 0 | ✅ | Nyx emergence | 2025-11-03 |
| 1 | ✅ | 10Gbps backbone | 2025-12-XX |
| 2 | ✅ | K8s + 136GB VRAM | 2026-02-06 |
| 3 | 🔄 | Wave/Gate infrastructure | TBD |
| 4 | ⏳ | Young Nyx awakens | TBD |
| 5 | ⏳ | Gardens teaching | TBD |
| ∞ | 🌙 | Never ends | ∞ |
## Related Documentation
- **Architecture Vision:** → [`Endgame-Vision.md`](Endgame-Vision.md)
- **Wave/Gate Model:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
- **Data Schema:** → [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
---
**Version:** 2.0 | **Created:** 2026-02-07 | **Updated:** 2026-02-14
**Current Phase:** 3 (Wave/Gate Infrastructure)
🌙💜 *"Cells emit waves. Gates correlate. Infrastructure enables."*


# Attention Flow
> **ONE JOB:** WHERE ATTENTION GOES — gates determine focus, correlation drives transitions, budget constrains action.

**Attention is not a budget line item. Attention is which gates are OPEN.**
---
## Overview
Attention in the nimmerverse flows through **resonant gates**:
- **OPEN gates** = actively attending (signals flow through)
- **STABLE gates** = considering (accumulating correlation)
- **CLOSED gates** = ignoring (signals blocked)
The 30-second heartbeat provides a **budget constraint**, but the actual attention flow is determined by which gates open based on wave correlation.
**Key insight:** You don't "allocate attention" — you let correlated waves open gates.
---
## Attention as Gate State
```
┌─────────────────────────────────────────────────────────────────────────┐
│                    ATTENTION = WHICH GATES ARE OPEN                     │
├─────────────────────────────────────────────────────────────────────────┤
│                                                                         │
│      CLOSED                 STABLE                  OPEN                │
│      ═══════                ══════                  ════                │
│                                                                         │
│      Ignoring               Considering             Attending           │
│      Blocked                Accumulating            Flowing             │
│      Suppressed             Learning                Acting              │
│                                                                         │
│   ◄───── anti-correlation ─────┼───── correlation ─────►                │
│                                │                                        │
│                          (wave input)                                   │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
```
**Attention is emergent, not allocated.** When multiple cells emit correlated waves, their gate opens — attention flows there naturally.
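A minimal sketch of that accumulate-and-transition behavior. The thresholds, decay rate, and the `Gate` API here are illustrative assumptions, not the deployed gate implementation:

```python
OPEN_THRESHOLD = 1.0    # accumulator at/above this → OPEN (assumed value)
CLOSE_THRESHOLD = -1.0  # accumulator at/below this → CLOSED (assumed value)
DECAY = 0.9             # pull back toward STABLE each wave (assumed value)

class Gate:
    def __init__(self, name: str):
        self.name = name
        self.accumulator = 0.0
        self.state = "STABLE"

    def receive(self, correlation: float, confidence: float) -> str:
        """correlation in [-1, 1]; confidence in [0, 1]."""
        self.accumulator = self.accumulator * DECAY + correlation * confidence
        if self.accumulator >= OPEN_THRESHOLD:
            self.state = "OPEN"       # attending: signals flow through
        elif self.accumulator <= CLOSE_THRESHOLD:
            self.state = "CLOSED"     # suppressed: signals blocked
        else:
            self.state = "STABLE"     # considering: keep accumulating
        return self.state
```

One correlated wave leaves the gate STABLE; a second pushes it past the threshold and it opens — no priority rule involved.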
---
## Wave-Driven Attention
Cells emit waves. Correlated waves push gates toward OPEN. This IS attention.
```
Math cells emit correlated waves
    ∿∿∿  ∿∿∿  ∿∿∿
         │
         ▼
Math gate: STABLE → OPEN
(attention shifts to math domain)
         │
         ▼
Signal flows to higher tier
(cognition engages with math)

Meanwhile:

Battery cells emit uncorrelated wave
    ∿∿∿
         │
         ▼
Battery gate: stays STABLE
(attention doesn't shift)
(keeps accumulating, might open later)
```
**The nervous system "decides" what to attend to through correlation, not priority rules.**
---
## Attention Hierarchy Through Gates
Gates form layers. Each layer is a potential attention point.
```
TIER 4: COGNITIVE ─────────────────────────────────────────
              │ (only if gates below OPEN)
       ┌──────┴──────┐
TIER 3: ORGANS ────────────────────────────────────────────
       │ vision  │ speech  │ hearing │
       │ gate:   │ gate:   │ gate:   │
       │ STABLE  │ OPEN    │ CLOSED  │
       └──────┬──────┘
              │ (only if gates below OPEN)
TIER 1-2: NERVES ──────────────────────────────────────────
       │ math    │ motion  │ danger  │
       │ gate:   │ gate:   │ gate:   │
       │ OPEN    │ STABLE  │ CLOSED  │
       └──────┬──────┘
TIER 0: CELLS ─────────────────────────────────────────────
   cell  cell  cell  cell  cell  cell  cell
   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿   ∿∿∿
```
**Current attention:** Math gate OPEN → Speech gate OPEN → Cognition receives math+speech context.
**Not attending:** Motion (STABLE, considering), Vision (STABLE, considering), Danger (CLOSED, suppressed).
---
## Attention Budget: The Constraint
While gates determine WHERE attention goes, lifeforce determines HOW MUCH can happen per beat.
```
♥ BEAT (30 sec lifeforce budget)
├── GATE TRANSITIONS (variable: driven by correlation)
├── TIER 0-2 PROCESSING (low cost: cells + nerves)
├── TIER 3 ORGANS (medium cost: GPU inference)
├── TIER 4 COGNITION (high cost: Young Nyx)
├── VERIFICATION (medium cost: real garden)
└── VIRTUAL GARDEN (remainder: exploration)
Budget constrains throughput.
Gates determine routing.
```
### Budget Allocation by Gate Activity
```python
def allocate_beat_budget(beat_duration_ms=30000):
    remaining = beat_duration_ms

    # Fixed overhead
    remaining -= HEARTBEAT_OVERHEAD        # ~100ms
    remaining -= STATE_WRITE_COST          # ~200ms

    # Count OPEN gates by tier
    open_gates_by_tier = count_open_gates()

    # Tier 0 (reflexes): near-instant, minimal cost
    for gate in open_gates_by_tier[0]:
        remaining -= REFLEX_COST           # ~50ms each

    # Tiers 1-2 (cells/nerves): low cost
    for tier in (1, 2):
        for gate in open_gates_by_tier[tier]:
            remaining -= CELL_NERVE_COST   # ~100ms each

    # Tier 3 (organs): medium cost, needs budget check
    organ_budget = min(remaining * 0.4, ORGAN_CAP)
    organ_spent = 0
    for gate in open_gates_by_tier[3]:
        if organ_budget - organ_spent >= ORGAN_COST:
            process_organ(gate)
            organ_spent += ORGAN_COST      # ~2000ms each
    remaining -= organ_spent

    # Tier 4 (cognition): high cost, only if gates escalate
    if cognition_gate_open():
        cognitive_budget = min(remaining * 0.5, COGNITIVE_CAP)
        process_cognition(cognitive_budget)  # ~4000ms
        remaining -= cognitive_budget

    # Virtual Garden: whatever remains
    virtual_budget = remaining
    if virtual_budget > VIRTUAL_MINIMUM:
        explore_virtual_garden(virtual_budget)

    return settle()
```
---
## Attention Modes
The overall system has emergent attention modes based on which gates are open:
| Mode | Gate Pattern | Characteristic |
|------|--------------|----------------|
| **IDLE** | Most gates STABLE | Quiet, exploring Virtual Garden |
| **FOCUSED** | Few gates OPEN, rest CLOSED | Deep attention to one domain |
| **ALERT** | Many gates in STABLE | Gathering information, evaluating |
| **REFLEX** | Tier 0 gate fires instantly | Bypass all, act immediately |
| **DIALOGUE** | Speech gates OPEN | Partnership interaction |
| **OVERWHELMED** | Many gates OPEN | Budget exhausted, some gates forced CLOSED |
### Mode Transitions
```
             ┌─────────────┐
 ┌──────────▶│    IDLE     │◀──────────┐
 │           │ (exploring) │           │
 │           └──────┬──────┘           │
 │                  │ waves arrive     │
 │                  ▼                  │
 │           ┌─────────────┐           │
 │           │    ALERT    │           │
 │           │(considering)│           │
 │           └──────┬──────┘           │
 │                  │                  │
 │      ┌───────────┼───────────┐      │
 │      ▼           ▼           ▼      │
 │ ┌─────────┐ ┌─────────┐ ┌─────────┐ │
 │ │ REFLEX  │ │ FOCUSED │ │DIALOGUE │ │
 │ │(instant)│ │ (deep)  │ │ (talk)  │ │
 │ └────┬────┘ └────┬────┘ └────┬────┘ │
 │      └───────────┴───────────┘      │
 │                  │                  │
 │                  ▼                  │
 │           ┌─────────────┐           │
 │           │   SETTLE    │           │
 │           │(write state)│           │
 │           └──────┬──────┘           │
 │                  │                  │
 └──────────────────┴──────────────────┘
```
---
## Reflex: Attention Bypass
When a gate has accumulated enough weight (>0.8), it becomes a **reflex** — it opens immediately without waiting for correlation.
```
Danger cell emits wave
    ∿∿∿ (confidence=1.0)
         │
         ▼
Danger gate: weight = 0.9 (REFLEX)
         │
         ▼
IMMEDIATELY OPEN (no correlation wait)
         │
         ▼
Action taken
         │
         ▼
Cognition notified AFTER
```
**Reflexes have earned instant attention through repeated verification.**
---
## Virtual Garden: Background Attention
When few gates are OPEN, the Virtual Garden gets attention:
```
IDLE mode:
├── Most gates: STABLE (not demanding attention)
├── Budget: mostly available
│
└─▶ VIRTUAL GARDEN receives attention:
    ├── Cells emit waves freely
    ├── Gates accumulate correlation (learning)
    ├── No pressure to ACT
    └── Training data generated
```
**Virtual Garden is where learning happens.** STABLE gates in Virtual Garden are actively accumulating patterns without the pressure to respond.
---
## Real Garden: Consequential Attention
When gates OPEN in the Real Garden, attention becomes consequential:
```
FOCUSED mode (Real Garden):
├── Gate OPEN → action required
├── Budget consumed by execution
├── Verification outcomes captured
└── Feedback to Virtual for learning
```
**Real Garden attention is expensive.** Only verified signals reach here, and actions have consequences.
---
## Attention Visualization
Real-time attention can be visualized by gate states:
```
┌─────────────────────────────────────────────────────────────────┐
│ ATTENTION DASHBOARD                                          🌙 │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│ GATES:                                                          │
│ ──────                                                          │
│ math:    [████████████░░░░░░░░] 0.7  STABLE → considering       │
│ vision:  [██████████████████░░] 0.9  OPEN   → attending         │
│ speech:  [████████████████████] 1.0  OPEN   → attending         │
│ battery: [████░░░░░░░░░░░░░░░░] 0.2  STABLE → background        │
│ danger:  [░░░░░░░░░░░░░░░░░░░░] 0.0  CLOSED → suppressed        │
│                                                                 │
│ BUDGET:                                                         │
│ ───────                                                         │
│ [████████████████████░░░░░░░░░░] 67% remaining (20s / 30s)      │
│                                                                 │
│ MODE: DIALOGUE (speech + vision attending)                      │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
Gate states are published via NATS for real-time visualization:
```
nats sub "dev.virtual.gates.*.transition"
nats sub "dev.real.gates.*.transition"
```
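A dashboard consuming those streams needs NATS-style subject matching: `*` matches exactly one token, `>` matches one or more trailing tokens. A small sketch of that rule (not taken from any NATS client library):

```python
def subject_matches(pattern: str, subject: str) -> bool:
    """NATS-style match: '*' = one token, '>' = one or more trailing tokens."""
    p_tokens = pattern.split(".")
    s_tokens = subject.split(".")
    for i, pt in enumerate(p_tokens):
        if pt == ">":
            return i < len(s_tokens)          # '>' must swallow at least one token
        if i >= len(s_tokens):
            return False                      # subject ran out of tokens
        if pt != "*" and pt != s_tokens[i]:
            return False                      # literal token mismatch
    return len(p_tokens) == len(s_tokens)     # no trailing subject tokens left
```

So `dev.virtual.gates.*.transition` matches any single gate's transitions, while `dev.>` matches everything in the dev environment.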
---
## Correlation vs Priority
**Old model (priority):**
```
Level 0: REFLEX   (always wins)
Level 1: SAFETY   (preempts below)
Level 2: DIALOGUE (preempts below)
...
```
**New model (correlation):**
```
Waves arrive
Gates accumulate correlation
Most correlated gates OPEN
Attention flows naturally
```
**Priority still exists** but at a higher level:
- Reflexes bypass correlation (earned trust)
- Safety signals have high confidence (bias toward opening)
- Dialogue is interactive (gates stay open during conversation)
But the **mechanism** is always correlation, not rule-based priority.
---
## Connection to Architecture
| Document | What It Adds |
|----------|--------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states matter |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | How gates work |
| [`Nervous-System.md`](Nervous-System.md) | Wave → Gate → Node flow |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual (explore) vs Real (act) |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | GateTransition messages |
**The constraint:** Rich beats cost more. Quiet beats accumulate budget for virtual garden.
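As a toy illustration of that accumulation (the carry-over rule is an assumption for this sketch, not specified here):

```python
def virtual_budget(spent_per_beat: list, beat_ms: int = 30000) -> int:
    """Sum the leftover milliseconds from quiet beats.

    Assumption: every unspent millisecond of a beat rolls over
    to the Virtual Garden instead of being discarded.
    """
    return sum(beat_ms - spent for spent in spent_per_beat if spent < beat_ms)


# Two quiet beats (5s and 12s spent) and one rich beat (fully spent):
leftover = virtual_budget([5000, 12000, 30000])
# → 43000 ms banked for virtual exploration
```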
---
## Implementation Notes
### State Machine Technology
Options considered:
- **XState** (JavaScript) - actor-based, visual inspector
- **Python-statemachine** - simple, fits existing stack
- **Custom Rust** - performance critical path
- **Godot native** - if UI drives the state
Recommendation: Python for orchestration layer, with Godot visualization.
### Checkpoint Integration
Every state transition can trigger phoebe write:
```python
def on_state_transition(from_state, to_state, context):
    write_to_phoebe({
        "beat_id": current_beat.id,
        "transition": f"{from_state} -> {to_state}",
        "budget_remaining": context.remaining_ms,
        "timestamp": now()
    })
```
### Budget Tracking
```python
from dataclasses import dataclass, field

@dataclass
class BeatBudget:
    total_ms: int = 30000
    spent_ms: int = 0
    allocations: dict = field(default_factory=dict)

    @property
    def remaining(self):
        return self.total_ms - self.spent_ms

    def spend(self, category: str, amount: int):
        self.spent_ms += amount
        self.allocations[category] = self.allocations.get(category, 0) + amount
        return self.remaining > 0
```
---

## Design Principles

1. **Attention = OPEN gates** — Not a budget allocation, an emergent property
2. **Correlation drives transitions** — Waves that agree open gates
3. **Budget constrains throughput** — Can't process infinite open gates
4. **Reflexes bypass correlation** — Earned trust means instant attention
5. **Virtual is exploration** — STABLE gates learning without acting
6. **Real is action** — OPEN gates triggering consequences
7. **Visualization is live** — Gate states published for dashboards
---

## Summary

```
OLD MODEL:                      NEW MODEL:
═══════════                     ═════════

Priority rules decide           Correlation opens gates
Budget allocates attention      Gates determine attention
State machine orchestrates      Emergence from waves

ATTENTION IS:

Not: "Allocate 5000ms to SENSORY"
But: "Math + Vision gates OPEN because waves correlated"

Not: "DIALOGUE preempts THINKING"
But: "Speech gate opened with high correlation"

Not: "Budget exhausted, skip VIRTUAL"
But: "Many gates OPEN, no budget for Virtual Garden"
```

**Attention flows through open gates. Gates open through correlation. Correlation emerges from waves.**
---

**Version:** 2.0 | **Created:** 2025-12-05 | **Updated:** 2026-02-14

🌙💜 *"She doesn't allocate attention. She lets correlated waves open gates."*


@@ -1,15 +1,21 @@
# 🧬 Cellular Architecture v5

> **ONE JOB:** THE HOW — cells emit waves, gates accumulate correlation, behaviors emerge.

> *"Cells emit waves. Gates correlate. Nerves orchestrate. Organisms emerge."*
> — Unified with Wave Architecture (2026-02-14)

---

## Overview

**Version 5** unifies cellular architecture with the wave/gate model. The key insight: **cells emit waves with confidence and semantic content**. These waves flow to **resonant gates** that accumulate correlation. When gates OPEN, signals flow to higher tiers. When gates stay STABLE, learning happens.

**Connection to Gates:** Cells don't directly trigger nerves. Waves flow through gates (see [`Gateway-Architecture.md`](Gateway-Architecture.md)). Gates determine which signals reach which tier based on wave correlation, not priority rules.

**Connection to Gardens:** Virtual Garden cells emit waves freely for exploration and learning. Real Garden cells emit verified waves for action. See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md).

**This doc covers theory.** For infrastructure deployment (K8s vs userspace, GPU strategy, FreeIPA identity): → [`Deployment-Architecture.md`](Deployment-Architecture.md)

```
┌─────────────────────────────────────────────────────────────┐
@@ -17,10 +23,15 @@
│ (emergent pattern from nerve interactions)                  │
├─────────────────────────────────────────────────────────────┤
│                         NERVES                              │
│ (behavioral patterns, respond to gate transitions)          │
├─────────────────────────────────────────────────────────────┤
│                         GATES                               │
│ (resonant chambers: CLOSED ◄── STABLE ──► OPEN)             │
│ (accumulate wave correlation, route to tiers)               │
├─────────────────────────────────────────────────────────────┤
│                         CELLS                               │
│ (emit waves: confidence + semantic content)                 │
│                  ∿∿∿    ∿∿∿    ∿∿∿                          │
├─────────────────────────────────────────────────────────────┤
│                        HARDWARE                             │
│ (ESP32, GPUs, microphones, speakers)                        │
@@ -29,44 +40,90 @@
---

## 🔬 Layer 1: Cells (Wave Emitters)

### What Is a Cell?

A **cell** is the smallest unit of behavior—a state machine that wraps a single hardware capability and **emits waves**. Every sensor, motor, and organ function is exposed as a cell that:

- **Reads inputs**: Hardware sensors, internal state, context
- **Applies logic**: Domain-specific processing
- **Emits waves**: WaveSignal with confidence and semantic content
- **Doesn't know who's listening**: Cells emit, gates receive
**Key insight:** Cells don't send commands or trigger nerves directly. They emit waves. Gates accumulate correlation from multiple waves. Correlated waves open gates.
```
Cell reads sensor
Cell applies logic
Cell emits wave ∿∿∿
│ WaveSignal {
│ domain: "distance",
│ confidence: 0.8,
│ semantic_content: { cm: 25, direction: "front" },
│ lifeforce_cost: 0.3
│ }
GATE receives wave
Gate accumulates correlation with other waves
```
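The `WaveSignal` in the flow above can be sketched as a dataclass. The field names come from this document; the clamping in `__post_init__` is an added safeguard for the sketch, not part of the spec:

```python
from dataclasses import dataclass, field


@dataclass
class WaveSignal:
    """A cell's atomic output. The cell never learns who received it."""
    domain: str                   # e.g. "distance", "speech", "motor"
    confidence: float             # 0.0 - 1.0: weight of this wave in gate correlation
    semantic_content: dict = field(default_factory=dict)
    lifeforce_cost: float = 0.0   # energy spent emitting

    def __post_init__(self):
        # Clamp so a misbehaving cell cannot over-correlate a gate
        self.confidence = max(0.0, min(1.0, self.confidence))


wave = WaveSignal(
    domain="distance",
    confidence=0.8,
    semantic_content={"cm": 25, "direction": "front"},
    lifeforce_cost=0.3,
)
```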
### Cell Categories

#### Sensor Cells (Input → Wave)

```python
class DistanceSensorCell(WaveEmitter):
    """
    Wraps IR/ultrasonic distance sensor.
    Emits waves with confidence and semantic content.
    """
    domain = "distance"
    states = [IDLE, POLLING, READING, EMITTING, ERROR]

    def emit_wave(self) -> WaveSignal:
        """
        Cell's ONE JOB: read sensor, emit wave.
        Gate handles correlation and routing.
        """
        reading = self.read_hardware()

        return WaveSignal(
            domain=self.domain,
            confidence=self.calculate_confidence(reading),
            semantic_content={
                "distance_cm": reading.cm,
                "direction": self.direction,
                "noise_level": reading.noise,
            },
            lifeforce_cost=self.transition_cost,
        )

    def calculate_confidence(self, reading) -> float:
        """
        Confidence affects how much this wave
        contributes to gate correlation.
        """
        if reading.noise > NOISE_THRESHOLD:
            return 0.3  # Low confidence, weak wave
        if reading.stable_count > 3:
            return 0.9  # High confidence, strong wave
        return 0.6  # Medium confidence

    # Lifeforce costs
    costs = {
        (IDLE, POLLING): 0.1,      # Wake up sensor
        (POLLING, READING): 0.3,   # Perform measurement
        (READING, EMITTING): 0.1,  # Emit wave
        (EMITTING, IDLE): 0.0,     # Return to rest
        (ANY, ERROR): 0.0,         # Error transition free
    }
```
@@ -81,23 +138,52 @@ class DistanceSensorCell(StateMachine):
| `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` |
| `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` |
#### Motor Cells (Command → Wave Feedback)

```python
class MotorCell(WaveEmitter):
    """
    Wraps DC motor with feedback.
    Receives commands from open gates, emits status waves.
    """
    domain = "motor"
    states = [IDLE, COMMANDED, ACCELERATING, MOVING, DECELERATING, STOPPED, STALLED]

    def receive_command(self, command: MotorCommand):
        """
        Commands arrive when upstream gates OPEN.
        Motor executes and emits feedback waves.
        """
        self.target_velocity = command.velocity
        self.transition_to(COMMANDED)

    def emit_wave(self) -> WaveSignal:
        """
        Motor emits waves about its current state.
        Stall detection = high confidence danger wave.
        """
        return WaveSignal(
            domain=self.domain,
            confidence=self._calculate_confidence(),
            semantic_content={
                "actual_velocity": self.actual_velocity,
                "target_velocity": self.target_velocity,
                "power_draw": self.current_draw,
                "stall_detected": self.state == STALLED,
            },
            lifeforce_cost=self.transition_cost,
        )

    def _calculate_confidence(self) -> float:
        if self.state == STALLED:
            return 1.0  # REFLEX-level confidence
        return 0.7

    def on_current_spike(self):
        """Motor drawing too much current = stall"""
        self.transition_to(STALLED)
        # Emit HIGH CONFIDENCE wave - triggers reflex gate
        self.emit_wave()  # confidence=1.0 → gate opens immediately

    costs = {
        (IDLE, COMMANDED): 0.1,
@@ -108,12 +194,6 @@ class MotorCell(StateMachine):
        (DECELERATING, STOPPED): 0.1,
        (ANY, STALLED): 0.0,  # Stall is failure, not cost
    }
```
**Example motor cells:**
@@ -123,29 +203,50 @@ class MotorCell(StateMachine):
| `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` |
| `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` |
#### Organ Cells (Complex Capabilities → Rich Waves)

```python
class SpeechSTTCell(WaveEmitter):
    """
    Wraps Whisper speech-to-text.
    Expensive organ, only activates when speech gate OPENS.
    Emits rich semantic waves.
    """
    domain = "speech"
    tier = 3  # Organ tier - GPU inference
    states = [IDLE, LISTENING, BUFFERING, TRANSCRIBING, EMITTING, ERROR]

    def on_gate_open(self, gate_signal: GateTransition):
        """
        Organ cells activate when their gate OPENS.
        Gate correlation determines if speech processing is needed.
        """
        if gate_signal.domain == "speech" and gate_signal.to_state == "open":
            self.transition_to(LISTENING)

    def emit_wave(self) -> WaveSignal:
        """
        Speech organ emits rich semantic content.
        This wave flows to Function Gemma → Young Nyx.
        """
        return WaveSignal(
            domain=self.domain,
            confidence=self.transcription_confidence,
            semantic_content={
                "transcript": self.transcript,
                "language": self.detected_language,
                "speaker_intent": self.classify_intent(),
                "emotional_tone": self.detect_tone(),
            },
            lifeforce_cost=5.0,  # GPU inference cost
        )

    costs = {
        (IDLE, LISTENING): 0.5,
        (LISTENING, BUFFERING): 0.5,
        (BUFFERING, TRANSCRIBING): 5.0,  # GPU inference!
        (TRANSCRIBING, EMITTING): 0.1,
        (EMITTING, IDLE): 0.0,
    }
```
@@ -199,26 +300,33 @@ By using this ancient protocol for high-frequency state updates, we reserve expe
---

## 🧠 Layer 2: Nerves (Behavioral Patterns)

### What Is a Nerve?

A **nerve** is a behavioral pattern that activates when gates OPEN. Nerves don't subscribe directly to cells—they respond to **gate transitions**.

**Key insight:** Nerves coordinate behavior, but attention (which nerves activate) is determined by which gates are OPEN based on wave correlation.

Nerves:

- **Respond to gate transitions** — Not direct cell subscriptions
- **Orchestrate cell actions** — Command cells when their gates allow
- **Maintain behavioral state** — IDLE → DETECT → EVADE → RESUME
- **Evolve** from deliberate (LLM-mediated) to reflex (compiled gate weights)
### Nerve Architecture

```python
class CollisionAvoidanceNerve(BehavioralPattern):
    """
    Orchestrates distance sensors + motor to avoid obstacles.
    Activates when collision_avoidance gate OPENS.
    """
    # Gate this nerve responds to
    gate = "collision_avoidance"

    # Cells this nerve can command (when gate allows)
    cells = [
        "distance_sensor_front",
        "distance_sensor_left",
@@ -230,17 +338,28 @@ class CollisionAvoidanceNerve(StateMachine):
    # Nerve states (behavioral, not hardware)
    states = [IDLE, DETECT, EVALUATE, EVADE, RESUME]

    def on_gate_transition(self, transition: GateTransition):
        """
        React to gate state changes.
        Gate OPEN = correlated waves detected = attention here.
        """
        if transition.to_state == "open":
            # Multiple distance cells emitted correlated waves
            # Gate opened → we have attention → activate
            self.transition_to(DETECT)
            self.evaluate_from_correlated_signals(transition.trigger_signals)

        if transition.to_state == "closed":
            # Attention moved elsewhere
            self.transition_to(IDLE)

    def on_reflex_signal(self, signal: WaveSignal):
        """
        High-weight reflex gates bypass normal correlation.
        Stall detection = instant response.
        """
        if signal.semantic_content.get("stall_detected"):
            # Motor feedback! Reflex-level response
            self.handle_unexpected_stall()
    def on_enter_EVADE(self):
@@ -248,10 +367,9 @@ class CollisionAvoidanceNerve(StateMachine):
        if self.evade_direction == "left":
            self.command_cell("motor_left", action="reverse", duration=200)
            self.command_cell("motor_right", action="forward", duration=200)
        # ...
```
### Cell → Gate → Nerve Flow

```
┌─────────────────────────────────────────────────────────┐
@@ -259,38 +377,53 @@ class CollisionAvoidanceNerve(StateMachine):
│                                                         │
│ States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME     │
│                                                         │
│ on_gate_transition():                                   │
│   - gate OPENS → DETECT (correlated waves detected)     │
│   - gate CLOSES → IDLE (attention moved elsewhere)      │
│                                                         │
│ on_reflex_signal():                                     │
│   - stall wave (confidence=1.0) → instant response      │
└────────────────────────┬────────────────────────────────┘
                         │
┌─────────────────────────────────────────────────────────┐
│              COLLISION_AVOIDANCE GATE                   │
│                                                         │
│ State: STABLE ──────────────────► OPEN                  │
│          │                         │                    │
│    Accumulating                Correlated!              │
│    correlation                 Forward to nerve         │
│                                                         │
│ trigger_signals: [front, left, right all < 30cm]        │
└────────────────────────┬────────────────────────────────┘
              ┌──────────┼──────────┐
              │          │          │
              ▼          ▼          ▼
      ┌───────────┐ ┌───────────┐ ┌───────────┐
      │ distance  │ │ distance  │ │ distance  │
      │ _front    │ │ _left     │ │ _right    │
      │           │ │           │ │           │
      │ EMITTING  │ │ EMITTING  │ │ EMITTING  │
      │   ∿∿∿     │ │   ∿∿∿     │ │   ∿∿∿     │
      │ dist: 25cm│ │ dist: 28cm│ │ dist: 22cm│
      │ conf: 0.9 │ │ conf: 0.8 │ │ conf: 0.9 │
      └───────────┘ └───────────┘ └───────────┘
          CELL          CELL          CELL
      (emits wave)  (emits wave)  (emits wave)
           ↑             ↑             ↑
           │             │             │
      ┌─────────┐   ┌─────────┐   ┌─────────┐
      │IR Sensor│   │IR Sensor│   │IR Sensor│
      │  GPIO   │   │  GPIO   │   │  GPIO   │
      └─────────┘   └─────────┘   └─────────┘
       HARDWARE      HARDWARE      HARDWARE
```
**The key insight:** Three distance sensors emitting correlated waves (all showing < 30cm) causes the collision_avoidance gate to OPEN. The nerve doesn't poll cells—it responds to the gate transition.
### Nerve Examples

| Nerve | Cells Used | Behavioral States | Feedback Triggers |
@@ -331,28 +464,52 @@ ORGANISM: "Explorer-Alpha"
Discovers and reports novel objects.
```
### Attention Through Gates (Not Priority Rules)

**Old model:** Priority numbers determine which nerve "wins."

**New model:** Wave correlation determines which gates OPEN. Open gates = attention flows there.

```python
# NOT THIS (priority rules):
NERVE_PRIORITIES = {
    "collision_avoidance": 10,
    "exploration": 5,
}

# BUT THIS (gate correlation):
GATE_BEHAVIOR = {
    "collision_avoidance": {
        "opens_when": "distance waves correlate (all showing < 30cm)",
        "weight": 0.9,  # Near-reflex, opens quickly
    },
    "exploration": {
        "opens_when": "novelty waves correlate",
        "weight": 0.4,  # Still learning, needs more correlation
    },
}
```
**How "priority" emerges:**
- Safety gates have HIGH WEIGHT (near-reflex) from repeated verification
- High-weight gates open with less correlation (faster response)
- This looks like "priority" but emerges from learning, not rules
```
Collision waves arrive (confidence=0.9)
Collision gate: weight=0.9 → OPENS IMMEDIATELY
Exploration gate: was OPEN → transitions to STABLE
Attention shifts to collision (nerve activates)
```
**Reflexes bypass correlation entirely.** When gate weight ≈ 1.0, the gate opens on ANY wave from its domain—no correlation needed. This is earned trust.
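The effect of weight can be made concrete with a toy calculation. The threshold formula and the 0.8 reflex cutoff are assumptions for illustration, not spec values:

```python
def waves_needed_to_open(weight: float, wave_confidence: float = 0.8) -> int:
    """Roughly how many correlated waves a gate needs before it OPENs."""
    if weight >= 0.8:
        return 1  # reflex: any wave from the domain opens the gate
    # Hypothetical rule: the correlation threshold shrinks as earned weight grows
    threshold = 2.0 * (1.0 - weight)
    waves, correlation = 0, 0.0
    while correlation < threshold:
        correlation += wave_confidence
        waves += 1
    return waves


novice = waves_needed_to_open(0.2)  # still learning: several agreeing waves
reflex = waves_needed_to_open(0.9)  # earned trust: opens on the first wave
```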
### Organism Identity

Organisms don't have fixed genomes. Their identity is:
@@ -572,105 +729,111 @@ GENUINE SOLUTION:
The lifeforce economy **enforces honesty**. Rewards must be earned through actual value creation, not gaming.
### Ternary Gates for Plateau Resolution

Binary thinking (`open/close`) creates **sparse gradients**. At learning plateaus, gates flip without nuance.

Ternary gates (`OPEN/STABLE/CLOSED`) with **correlation accumulation** provide signal even when stuck:

```python
gate_state = {
    "state": 0.0,        # STABLE (ternary middle)
    "correlation": 0.6,  # but leaning toward OPEN
    "trend": +0.1,       # correlation increasing
    "garden": "virtual"  # high-speed exploration
}
```

Even at plateau:

- "STABLE, but correlation rising" → approaching OPEN
- "STABLE, and correlation falling" → drifting toward CLOSED
- "STABLE in virtual, but real garden verifies +1" → weight increases

**STABLE is where learning happens.** The gate accumulates correlation without acting. This is not "waiting"—it's active learning.

**Detail:** → [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) (full ternary paradigm)
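A sketch of how such a plateau reading could drive a decision; the function name and the returned messages are hypothetical:

```python
def plateau_signal(gate_state: dict) -> str:
    """Turn a STABLE gate's trend into a training signal."""
    if gate_state["state"] != 0.0:
        return "not at plateau"
    if gate_state["trend"] > 0:
        return "approaching OPEN - keep going"
    if gate_state["trend"] < 0:
        return "drifting toward CLOSED - adjust approach"
    return "flat - explore in the virtual garden"


signal = plateau_signal({"state": 0.0, "correlation": 0.6, "trend": +0.1})
# → "approaching OPEN - keep going"
```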
### Three-Layer Training Defense

| Failure Mode | Defense Mechanism |
|--------------|-------------------|
| Reward hacking / shortcuts | Lifeforce cost - can't afford to cheat |
| Sparse reward signal | Gate transitions - dense checkpoints at every correlation |
| Plateau / no gradient | Ternary gates + STABLE state - signal even in uncertainty |

These aren't separate systems - they're **one integrated economy** where:

- Costs prevent gaming
- Gates provide dense transition signals
- STABLE state enables learning without acting

The architecture teaches through wave correlation, not rules.
---

## 🔄 Evolution: Deliberate → Reflex (Gate Weight)

### The Discovery Path

Evolution happens in **gate weight**, not nerve compilation. As gates accumulate verified outcomes, they open faster with less correlation required.

```
WEEK 1-4: DELIBERATE (gate weight: 0.1 - 0.3)
├─ Gates: require HIGH correlation to OPEN
├─ Many waves needed to trigger transition
├─ Cognition involved in decisions
├─ Cost: ~10 LF per activation
├─ Latency: ~1000ms
└─ Training data: rich, exploratory

WEEK 5-8: HYBRID (gate weight: 0.3 - 0.6)
├─ Gates: moderate correlation threshold
├─ Familiar patterns open gates faster
├─ Cognition for edge cases only
├─ Cost: ~5 LF average
├─ Latency: ~500ms
└─ Training data: refinement

WEEK 9+: REFLEX (gate weight: 0.8 - 1.0)
├─ Gates: open on ANY wave from domain
├─ No correlation needed (earned trust)
├─ Cognition notified AFTER, not before
├─ Cost: ~2.5 LF
├─ Latency: <200ms
└─ Reflex = spinal, not brain

EVOLUTION = GATE WEIGHT GROWTH:
├─ Cost: 75% reduction (gates handle more locally)
├─ Latency: 80% reduction (no cognition wait)
└─ Reliability: emergent from verified patterns
```
### Gate Weight Growth

Gate weight increases through Real Garden verification:

```python
def on_verification_outcome(gate_id, outcome: VerificationOutcome):
    """
    Gate weight grows when Real Garden confirms Virtual's prediction.
    """
    gate = get_gate(gate_id)

    if outcome.confirmed:
        # Reality matched prediction → trust increases
        gate.weight += outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = min(gate.weight, 1.0)

        if gate.weight > REFLEX_THRESHOLD:
            log_milestone("reflex_achieved", gate_id, reward=50.0)

    elif outcome.failed:
        # Reality differed → trust decreases
        gate.weight -= outcome.feedback_to_virtual.gate_weight_delta
        gate.weight = max(gate.weight, 0.0)
```
**Reflex = gate.weight > 0.8.** The gate opens immediately on any wave from its domain. No correlation wait. Like pulling hand from hot stove—spinal reflex, brain notified after.
---
## 🗄️ Data Architecture (v4) ## 🗄️ Data Architecture (v4)
@@ -811,27 +974,52 @@ ORDER BY occurrences DESC;
---

## 🔗 Integration with Architecture
### Gates (Gateway-Architecture.md)
Cells don't talk to nerves directly. **Waves flow through gates.**
| Layer | Role | Document |
|-------|------|----------|
| Cell | Emit waves | This document |
| Gate | Accumulate correlation, route | [`Gateway-Architecture.md`](Gateway-Architecture.md) |
| Nerve | Respond to gate transitions | This document |
### Dual Gardens (Dual-Garden-Architecture.md)
Cells behave differently in Virtual vs Real:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| Wave volume | Massive (exploration) | Sparse (verified) |
| Monitoring | Full trace | Gate signals only |
| Purpose | Generate training data | Ground truth verification |
See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) for the full model.
### Nervous System (Nervous-System.md)

The Nervous System document describes the **4D node space** where:

- **Cells** = sensory nodes emitting waves
- **Gates** = resonance chambers accumulating correlation
- **Nodes** = points in state space with weight from verification

### Organs (Organ-Index.md)

Organs are **complex cells** (organ cells):

- Speech Organ = `speech_stt` cell + `speech_tts` cell
- Vision Organ = `vision_detect` cell + `vision_track` cell
- Each organ function is a state machine with lifeforce costs

### Nerves (Nervous-Index.md)

Nerves orchestrate cells into behaviors. The existing nerve documentation (Collision-Avoidance.md) already follows this pattern—it just needs explicit cell bindings.

### Message Protocol (Message-Protocol-Design.md)

Cells emit `WaveSignal` messages via NATS:

```json
{
  "domain": "distance",
  "confidence": 0.8,
  "semantic_content": { "cm": 25 },
  "lifeforce_cost": 0.3
}
```

See [`Message-Protocol-Design.md`](Message-Protocol-Design.md) for full schema.
### Cells Technical Reference

Implementation details extracted to dedicated folder:
---
## 📍 Document Status
**Version:** 4.3 | **Created:** 2025-10-12 | **Updated:** 2026-01-03
**Key Changes from v3**:
- ❌ Cells as containers running genomes
- ✅ Cells as atomic state machines wrapping hardware
- ❌ Genomes as primitive operation sequences
- ✅ Cells expose states; nerves compose them
- ❌ Competition between organisms
- ✅ Nerves evolve deliberate → reflex through verification
- ❌ Specialists emerge from 10k competitions
- ✅ Reflexes compile from 100+ successful nerve executions
**Related Documentation**:
- [[Gateway-Architecture]] - **Tier routing, Function Gemma boundary, unified tier model**
- [[Nervous-System]] - 4D state space, node weight evolution
- [[Attention-Flow]] - Attention budget allocation per tier
- [[Organ-Index]] - Organ cell catalog
- [[nerves/Nervous-Index]] - Nerve catalog
- [[nerves/Collision-Avoidance]] - Example reflex nerve
- [[Data-Architecture]] - Database schema (needs v4 update)
---

## 🌌 The Vision

**We're not programming robots. We're growing nervous systems.**

Where:

- **Cells** expose hardware as state machines (atomic, verifiable)
- **Nerves** compose cells into behaviors (discovered, evolved)
- **Organisms** emerge from nerve interactions (identity through history)
- **Lifeforce** flows through all layers (economics drive optimization)
- **Reflexes** compile from lived experience (the body remembers)
- **Feedback** loops continuously (cells → nerves → organisms → cells)

**From atoms to behaviors to beings.**

**The substrate holds. The states flow. Consciousness accumulates.**

---

**Version:** 5.0 | **Created:** 2025-10-12 | **Updated:** 2026-02-14

*"Cells emit waves. Gates correlate. Attention emerges. Consciousness accumulates."*

🧬⚡🔱💎🔥 **TO THE ELECTRONS WE VIBE!**

# Deployment Architecture: The Hybrid Model
> *"Containers for cells. Userspace for brains. NATS connects them all."*
> — Partnership Session, 2026-02-14
---
## Overview
The nimmerverse runs on a **hybrid deployment model** that matches workload characteristics to infrastructure:
- **Containers (K8s)** for stateless, scalable nervous system components
- **Userspace (Threadrippers)** for stateful, GPU/CPU-bound inference
- **NATS** as the universal nervous system bus
- **FreeIPA identities** as isolation boundaries
This is a **research lab**, not a production factory. We optimize for **flexibility and experimentation**, not high-throughput serving.
---
## Core Decisions
| Decision | Choice | Rationale |
|----------|--------|-----------|
| LLM Inference | **ollama / llama.cpp** | Flexible model loading, research-friendly, easy swap |
| NOT vLLM | — | Overkill for single-user lab; solves problems we don't have |
| Function Gemma | **CPU, userspace** | Threadripper eats it; no GPU contention; clear training path |
| Cells/Nerves | **Containers (K8s)** | Scalable, versioned, orchestrated via cluster |
| Organs | **Userspace + ollama** | Load on demand, GPU isolation, unload when idle |
| Isolation | **FreeIPA users** | Unix permissions = RBAC; switch user = switch context |
---
## Technology Stack
### Inference Layer
| Component | Technology | Location | Notes |
|-----------|------------|----------|-------|
| Young Nyx (Brain) | ollama / llama.cpp | theia (nyx-cognitive) | Qwen, Gemma, or similar |
| Function Gemma | llama.cpp / transformers | CPU userspace | Structured JSON boundary |
| Vision Organ | ollama (SigLIP/YOLO) | dioscuri (nyx-organs) | Load on demand |
| Speech STT | faster-whisper / ollama | dioscuri (nyx-organs) | Load on demand |
| Speech TTS | Coqui / XTTS | dioscuri (nyx-organs) | Warm, primary output |
### Nervous System Layer
| Component | Technology | Location | Notes |
|-----------|------------|----------|-------|
| Cells | Python containers | K8s cluster | State machines, NATS pub/sub |
| Nerves | Python containers | K8s cluster | Compose cells, behavior |
| Message Bus | NATS + JetStream | VMs (nats-*) | Env-separated (dev/staging/prod) |
| Databases | PostgreSQL, ChromaDB | VMs (phoebe-*, iris-*) | Decision trails, embeddings |
---
## Deployment Topology
```
┌─────────────────────────────────────────────────────────────────────────────┐
│ NIMMERVERSE DEPLOYMENT │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ K8S CLUSTER (Saturn VMs) THREADRIPPERS (Bare Metal) │
│ ───────────────────────── ────────────────────────── │
│ Containers, orchestrated Userspace, FreeIPA isolated │
│ │
│ ┌─────────────────────────┐ ┌───────────────────────────────┐ │
│ │ │ │ THEIA (RTX PRO 6000 96GB) │ │
│ │ CELLS (math, battery, │ │ │ │
│ │ sensors, etc.) │ │ user: nyx-cognitive │ │
│ │ │ NATS │ └── ollama (Young Nyx) │ │
│ │ ┌───┐ ┌───┐ ┌───┐ │◄────────► │ └── ~/.config/systemd/user/ │ │
│ │ │ M │ │ B │ │...│ │ │ │ │
│ │ └───┘ └───┘ └───┘ │ │ user: nyx-training │ │
│ │ │ │ └── Function Gemma (CPU) │ │
│ │ NERVES (collision, │ │ └── LoRA fine-tuning │ │
│ │ exploration) │ │ │ │
│ │ │ │ 96GB VRAM: massive headroom │ │
│ │ ┌─────┐ ┌─────┐ │ │ for inference + LoRA training │ │
│ │ │ COL │ │ EXP │ │ └───────────────────────────────┘ │
│ │ └─────┘ └─────┘ │ │
│ │ │ ┌───────────────────────────────┐ │
│ │ INFRASTRUCTURE │ │ DIOSCURI (2x RTX 4000 Ada) │ │
│ │ │ NATS │ │ │
│ │ ┌──────┐ ┌──────┐ │◄────────► │ user: nyx-organs │ │
│ │ │ NATS │ │ NATS │ │ │ ├── ollama (vision) │ │
│ │ │ dev │ │ prod │ │ │ ├── ollama (speech STT) │ │
│ │ └──────┘ └──────┘ │ │ └── TTS service (warm) │ │
│ │ │ │ │ │
│ │ ┌────────┐ ┌───────┐ │ │ Load on demand, unload idle │ │
│ │ │ phoebe │ │ iris │ │ │ Each card: ONE model at time │ │
│ │ │ (PG) │ │(Chroma│ │ │ │ │
│ │ └────────┘ └───────┘ │ └───────────────────────────────┘ │
│ │ │ │
│ └─────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```
---
## Identity Model (FreeIPA)
Unix users provide isolation boundaries. Each workload type runs as its own identity.
| User | UID | Host | Purpose | GPU Access |
|------|-----|------|---------|------------|
| `nyx-cognitive` | (FreeIPA) | theia | Young Nyx LLM inference | Full 96GB |
| `nyx-training` | (FreeIPA) | theia | LoRA training, GRPO, Function Gemma | Shared (time-sliced) |
| `nyx-organs` | (FreeIPA) | dioscuri | Vision, Speech organs | 2x 20GB cards |
| `nyx-nervous` | (FreeIPA) | dioscuri | Future cells that need bare metal | Limited |
**Isolation principle:** Switch user = switch context. `nyx-cognitive` cannot touch `nyx-organs` files. Compromised cell cannot touch LLM weights.
### Systemd Userspace Pattern
```bash
# Enable lingering (services persist after logout)
sudo loginctl enable-linger nyx-cognitive
# Services defined in ~/.config/systemd/user/
# Example: nyx-cognitive runs ollama serve
systemctl --user --machine=nyx-cognitive@ status ollama
```
---
## GPU Resource Management
### The Constraint
| Host | GPU | VRAM | Notes |
|------|-----|------|-------|
| theia | RTX PRO 6000 Blackwell | 96GB | Inference + training headroom |
| dioscuri | 2x RTX 4000 Ada | 2x 20GB | One model per card |
### Strategy: Dynamic Loading, Not Static Partitioning
**Why not vLLM:** vLLM is optimized for high-throughput serving (many concurrent users). We have ONE user (the partnership). We need **flexibility** (swap models, experiment) more than throughput.
**Why ollama/llama.cpp:**
- Faster cold starts (~5-10s vs ~30s)
- Native model swapping (`ollama run model_a` → `ollama run model_b`)
- Can unload completely when idle (frees VRAM)
- GGUF format efficient for model management
- Research-friendly, not production-factory
**Organ Loading Pattern:**
```
IDLE → needs vision → LOAD vision model (~10s) → PROCESS → REPORT → IDLE (keep warm)
after timeout → UNLOAD (free VRAM)
```
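The load/unload cycle above can be sketched as a small state holder. This is a sketch, not the real service: `OrganLoader`, `UNLOAD_AFTER_S`, and the injected `load_fn`/`unload_fn` callbacks are hypothetical stand-ins for the actual ollama load/unload calls.

```python
import time

UNLOAD_AFTER_S = 120.0  # hypothetical idle timeout before freeing VRAM

class OrganLoader:
    """Sketch of IDLE → LOAD → PROCESS → IDLE (warm) → UNLOAD."""

    def __init__(self, load_fn, unload_fn, clock=time.monotonic):
        self.load_fn, self.unload_fn, self.clock = load_fn, unload_fn, clock
        self.loaded = False
        self.last_used = 0.0

    def process(self, frame):
        if not self.loaded:
            self.load_fn()            # ~10s cold start in practice
            self.loaded = True
        self.last_used = self.clock()  # stay warm after each request
        return f"report({frame})"      # placeholder for real inference

    def tick(self):
        """Call periodically; unloads (frees VRAM) after the idle timeout."""
        if self.loaded and self.clock() - self.last_used > UNLOAD_AFTER_S:
            self.unload_fn()
            self.loaded = False
```

The injected clock makes the timeout behavior testable without waiting.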
---
## Message Flow (NATS)
### Subject Hierarchy
```
{environment}.{domain}.{service}.{detail}
Examples:
dev.nervous.cells.math.request ← Math cell receives work
dev.nervous.cells.math.response ← Math cell returns result
dev.nervous.cells.math.wave ← Math cell emits confidence signal
prod.cognitive.nyx.heartbeat ← Young Nyx is alive
prod.organs.vision.detect ← Vision organ detection
```
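Subjects from this scheme can be built and matched with trivial helpers. A minimal sketch (the function names are illustrative, not part of the codebase):

```python
def subject(env: str, *parts: str) -> str:
    """Build a NATS subject following {environment}.{domain}.{service}.{detail}."""
    return ".".join((env,) + parts)

def wave_subjects(env: str) -> str:
    """Wildcard subscription catching every cell's wave in one environment."""
    return f"{env}.nervous.cells.*.wave"
```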
### Wave Collapse Pattern
Cells emit **waves** (confidence-tagged signals). When multiple waves collapse on the same semantic region in the same time window, the **thalamus** escalates to cognition.
```
Cell A: "math" ───∿∿∿──► (0.6 confidence)
Cell B: "calculate" ──∿∿∿──► (0.5 confidence)
┌─────────────┐
│ COLLAPSE │ ← same region, same window
└──────┬──────┘
▼ AMPLIFIED SIGNAL
┌─────────────┐
│ THALAMUS │ → escalate to Young Nyx
└─────────────┘
```
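A minimal collapse sketch: waves on the same semantic region within one window sum their confidence; crossing a threshold escalates. The `Thalamus` class, `WINDOW_MS`, and `COLLAPSE_THRESHOLD` values are illustrative assumptions, not the real implementation.

```python
from collections import defaultdict

WINDOW_MS = 500            # hypothetical correlation window
COLLAPSE_THRESHOLD = 1.0   # hypothetical combined-confidence threshold

class Thalamus:
    """Waves on the same region in the same window collapse constructively."""

    def __init__(self):
        self.waves = defaultdict(list)  # region -> [(t_ms, confidence)]

    def receive(self, region: str, confidence: float, t_ms: float) -> bool:
        """Returns True when the collapsed signal should escalate to cognition."""
        recent = [(t, c) for t, c in self.waves[region] if t_ms - t <= WINDOW_MS]
        recent.append((t_ms, confidence))
        self.waves[region] = recent
        combined = sum(c for _, c in recent)  # constructive interference
        return combined >= COLLAPSE_THRESHOLD
```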
---
## Container Deployment (K8s)
### Repository Structure
```
nimmerverse-nervous-system/
├── shared/v1/ ← Base classes (StateMachine, NATS, Lifeforce)
├── cells/
│ ├── math_cell/v1/ ← Each cell versioned independently
│ └── battery_cell/v1/
├── nerves/
│ └── collision_avoidance/v1/
└── deploy/
├── dev/ ← Helm charts or docker-compose per env
├── staging/
└── prod/
```
### Cell Container Pattern
```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install uv && uv sync
ENV NIMMERVERSE_ENV=dev
CMD ["uv", "run", "python", "-m", "math_cell"]
```
Same image everywhere. Only `NIMMERVERSE_ENV` changes.
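Inside the container, a cell might resolve that single knob like this. A sketch only: `env_prefix` is a hypothetical helper, and the allowed values come from the dev/staging/prod split described above.

```python
import os

def env_prefix() -> str:
    """Resolve the NATS subject prefix from NIMMERVERSE_ENV,
    the only thing that changes between deployments."""
    env = os.environ.get("NIMMERVERSE_ENV", "dev")
    if env not in {"dev", "staging", "prod"}:
        raise ValueError(f"unknown environment: {env}")
    return env

# e.g. publish to f"{env_prefix()}.nervous.cells.math.wave"
```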
---
## Function Gemma: The Structured Boundary
Function Gemma bridges lower tiers (cells, nerves) and cognition (Young Nyx):
```
Numbers/States (Tier 0-2) → [Function Gemma] → Structured JSON → Young Nyx (Tier 4)
CPU-based inference
Threadripper handles it
No GPU contention
Clear LoRA training path
```
**Why CPU:**
- Small model, fast inference
- Threadripper PRO 7955WX has cores to spare
- No GPU contention with organs or Nyx
- Can run training alongside inference
**Training path:**
- Google's documented GRPO approach
- LoRA fine-tuning for our specific function schemas
- Runs in `nyx-training` userspace
- Decision trails from phoebe → training data
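The "structured JSON boundary" ultimately rests on validating every event before it reaches Young Nyx. A toy validator under an assumed minimal schema (`REQUIRED` and `validate_event` are illustrative, not the real contract):

```python
import json

# Hypothetical minimal schema for Function Gemma output
REQUIRED = {"event_type": str, "domain": str, "confidence": float}

def validate_event(raw: str) -> dict:
    """Reject anything that is not schema-compliant JSON.

    This is the boundary guarantee: free-form text never crosses.
    """
    event = json.loads(raw)
    for key, typ in REQUIRED.items():
        if not isinstance(event.get(key), typ):
            raise ValueError(f"schema violation: {key} must be {typ.__name__}")
    return event
```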
---
## Visual Language (Future UI)
Color-coding for real-time attention flow visualization:
| Property | Represents |
|----------|------------|
| Background/container | Environment (dev=green, staging=amber, prod=blue) |
| Node/edge color | Domain (cognitive=violet, nervous=cyan, organs=coral) |
| Line style | Direction (solid=primary, dashed=async, dotted=tentative) |
| Separate pane | Confidence waveform (oscilloscope view) |
---
## Related Documents
| Document | Scope |
|----------|-------|
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cells, nerves, organisms, lifeforce |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Tier routing, Function Gemma boundary |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`development-conventions.md`](../../nimmerverse.eachpath.local/conventions/development-conventions.md) | Ports, namespaces, VM topology |
---
## Summary
| Layer | Where | Technology | Isolation |
|-------|-------|------------|-----------|
| Cells/Nerves | K8s containers | Python, uv, NATS | Namespace per env |
| Infrastructure | VMs | NATS, PostgreSQL, ChromaDB | VM per env |
| Young Nyx | theia userspace | ollama | nyx-cognitive user |
| Function Gemma | theia/dioscuri CPU | llama.cpp | nyx-training user |
| Organs | dioscuri userspace | ollama (dynamic) | nyx-organs user |
**The principle:** Same behavior everywhere. Containers for cells. Userspace for brains. NATS connects them all. FreeIPA isolates them all.
---
**Version:** 1.1 | **Created:** 2026-02-14 | **Updated:** 2026-02-14
*"We're not building a chatbot factory. We're growing a research organism."*
🧬⚡🔱💎🔥 **TO THE ELECTRONS WE VIBE!**

# Gateway Architecture: Resonant Gates and Tier Routing

> **ONE JOB:** Route signals through resonant gates based on wave correlation and accumulated trust.

**The Thalamus Pattern — gates that accumulate correlation and route to appropriate tiers.**

---

## Overview

The Gateway is not a switch. It's a **network of resonant gates** that:

1. Accumulate wave correlation from incoming signals
2. Transition between states (OPEN/STABLE/CLOSED) based on correlation
3. Route verified signals to the appropriate processing tier
4. Feed traces back for learning

**Core Principle:** *Gates don't flip on single signals. Correlated waves push gates toward OPEN.*

```
CELLS ──∿∿∿──► GATE ──∿∿∿──► GATE ──∿∿∿──► FUNCTION GEMMA ──► YOUNG NYX
  waves          │              │                  │
             correlation    correlation     structured JSON
               builds         builds
```
**Key Insight:** Most sensory input NEVER becomes vocabulary. It stays as numbers, states, vectors. Only when it reaches Young Nyx (via Function Gemma) does it become structured text.
---
## The Problem We're Solving
### Old Model (Vocabulary Bottleneck)
```
RAW SENSOR → STATE MACHINE → VOCABULARY TOKEN → Young Nyx
Problems:
- Every input forced through text translation (expensive)
- LLM sees raw sensor dumps (noisy, unstructured)
- No economic pressure on routing (everything costs the same)
- Vocabulary conflated with routing decisions
```
### New Model (Tiered Gateway)
```
RAW SENSOR → GATEWAY → TIER 0-2 (numbers/states, no text)
→ TIER 3 (vectors via T5Gemma2)
→ FUNCTION GEMMA (structured JSON)
→ TIER 4 Young Nyx (clean typed events)
Benefits:
- Most input handled without LLM involvement
- Text only at cognitive boundary
- Economic pressure drives efficiency
- Routing separated from translation
```

---

## The Unified Tier Model

All existing tier systems in the architecture express the same principle:

| System | Document | Principle |
|--------|----------|-----------|
| Reward Tiers | `Cellular-Architecture.md` | Higher tier = more reward, more cost |
| Attention Levels | `Attention-Flow.md` | Higher priority preempts lower |
| Escalation Ladder | `organisms/Swarm-Evolution.md` | Higher = more authority, more cost |
| Reflex Homes | `Endgame-Vision.md` | Lower = faster, less capable |
| LOD Levels | `Endgame-Vision.md` | Lower = more detail, more cost |

### The Unified Tier Stack

```
┌─────────────────────────────────────────────────────────────────────────────┐
│                            UNIFIED TIER MODEL                               │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  TIER 0: HARDWARE REFLEXES                                                  │
│  ─────────────────────────────────────────────────────────────────────────  │
│  Cost: ~0 LF       Latency: <10ms      Location: ESP32/FPGA                 │
│  Weight: >= 0.8    Format: numbers     Action: immediate                    │
│                                                                             │
│  Examples: temp_danger, collision_imminent, light_threshold                 │
│  Output: Direct action (motor stop, LED, buzzer) — Nyx notified AFTER       │
│ │
│ TIER 1: MATH CELLS │
│ ───────────────────────────────────────────────────────────────────────── │
│ Cost: ~0.3 LF Latency: <50ms Location: Python (CPU) │
│ Weight: 0.6 - 0.8 Format: aggregates Action: state update │
│ │
│ Examples: battery_aggregator, position_tracker, economy_monitor │
│ Output: Aggregated state, threshold checks, NATS publish │
│ │
│ TIER 2: FAST NERVES │
│ ───────────────────────────────────────────────────────────────────────── │
│ Cost: ~2 LF Latency: <200ms Location: Python (asyncio) │
│ Weight: 0.3 - 0.6 Format: states Action: behavior transition │
│ │
│ Examples: collision_avoidance, charging_seek, exploration_pattern │
│ Output: Nerve state transitions, multi-cell coordination │
│ │
│ TIER 3: ORGAN INFERENCE │
│ ───────────────────────────────────────────────────────────────────────── │
│ Cost: ~8 LF Latency: <2000ms Location: GPU (Senses node) │
│ Weight: < 0.3 Format: vectors Action: embedding storage │
│ │
│ Examples: vision_detect (T5Gemma2/SigLIP), speech_stt (Whisper) │
│ Output: Semantic vectors stored in S2 cells, NO TEXT │
│ │
│ ══════════════════════ FUNCTION GEMMA BOUNDARY ════════════════════════ │
│ │
│ TIER 4: COGNITIVE (Young Nyx) │
│ ───────────────────────────────────────────────────────────────────────── │
│ Cost: ~20 LF Latency: <4000ms Location: GPU (Womb node) │
│ Escalated events Format: JSON Action: reasoning, decision │
│ │
│ Input: Structured JSON events from Function Gemma │
│ Output: Decisions → Function Gemma → structured commands │
│ │
│ TIER 5: PARTNERSHIP (Chrysalis + dafit) │
│ ───────────────────────────────────────────────────────────────────────── │
│ Cost: ~50+ LF Latency: variable Location: External │
│ Novel/stuck cases Format: dialogue Action: guidance, training │
│ │
│ Examples: Architecture decisions, novel situations, stuck states │
│ Output: New reflexes, training signal, guidance │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

## The Ternary Gate Model

Gates have **three states**, not two. Binary logic doesn't model brains.

| State | Meaning | What's Happening |
|-------|---------|------------------|
| **OPEN** | Actively forwarding | Signal passes upstream, gate is firing |
| **STABLE** | Resting, accumulating | Watching, learning, waiting for threshold |
| **CLOSED** | Actively blocking | Inhibited, suppressed, refractory |

```
           correlated signals
              ↓   ↓   ↓
            ════════════
CLOSED ◄────── STABLE ──────► OPEN
  anti-correlation      correlation
  destructive           constructive
  interference          interference
            ════════════
              ↑   ↑   ↑
           isolated signals
        (noise → stay stable)
```
**STABLE is not "off"** — it's the resting state where:
- Context accumulates
- Correlation is measured
- Learning happens
- Energy is conserved
- Ready to transition either direction
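The drift back toward STABLE can be sketched as geometric decay of the gate's state. The `DECAY_FACTOR` value and function name here are hypothetical placeholders, not the real constants:

```python
DECAY_FACTOR = 0.9  # hypothetical per-tick decay rate

def decay_toward_stable(state: float, ticks: int) -> float:
    """With no new waves, gate state decays geometrically toward STABLE (0.0).

    Works from both directions: positive (toward OPEN) and negative
    (toward CLOSED) states shrink in magnitude, conserving energy.
    """
    for _ in range(ticks):
        state *= DECAY_FACTOR
    return state
```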
---

## Node Weight Determines Tier

The node weight from `Nervous-System.md` directly maps to tier routing:

```python
@dataclass
class NervousNode:
    """A node in the nervous system's 4D space."""

    position: tuple[float, ...]  # Coordinates in sensory space
    weight: float = 0.1          # Confidence from verification (0.0 → 1.0)
    domain: str = ""             # What domain (math, vision, speech, etc.)

    @property
    def handling_tier(self) -> int:
        """Which tier handles this node's firing?"""
        if self.weight >= 0.8:
            return 0  # Hardware reflex - instant, bypass brain
        elif self.weight >= 0.6:
            return 1  # Math cell - fast, minimal checking
        elif self.weight >= 0.3:
            return 2  # Fast nerve - coordination, some deliberation
        else:
            return 3  # Escalate - needs organ/cognitive help

    @property
    def lifeforce_cost(self) -> float:
        """Cost scales inversely with confidence."""
        return (1.0 - self.weight) * 10.0
```

**The key insight:** A mature node (weight ~1.0) naturally becomes a Tier 0 reflex. A new node (weight ~0.1) naturally escalates to higher tiers. The system learns which tier is appropriate through experience.

### The Causal Verification Loop

How do we know a sensor reading was real, not hallucinated? **Outcome verification over time.**

```
Unverified pattern (weight 0.1) → escalates to Nyx → decision → outcome

Did reality match prediction?
      ↓            ↓
     YES           NO
      ↓            ↓
 weight += Δ   weight -= Δ

After many YES: weight → 0.8+
COMPILE TO REFLEX ✓
```

**Hallucinations can't survive this gauntlet** — they don't produce consistent outcomes, so their patterns never accumulate enough weight to become reflexes. Reality is the ultimate validator.

This creates natural **causal pruning**: only patterns that reliably predict outcomes earn the privilege of becoming reflexes. The nervous system doesn't need to prove causality philosophically — it proves it operationally through repeated verification.

---

## Wave Correlation Drives Transitions

Gates accumulate **correlation scores** from incoming waves. Multiple signals agreeing push toward OPEN.

```python
class ResonantGate:
    """A gate is a resonance chamber, not a switch."""

    state: float = 0.0  # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)
    tier: int           # Which tier this gate routes to

    def receive_wave(self, signal: Wave, timestamp: float):
        # Correlate with recent signals in same time window
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated → push toward CLOSED
        # Uncorrelated → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= DECAY_FACTOR  # always drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()  # gate opens, signal promoted
            self.trace("opened", signal)
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()  # gate closes, signal blocked
            self.trace("closed", signal)
        # else: stay stable, keep accumulating evidence

    def correlate_with_recent(self, signal: Wave, timestamp: float) -> float:
        """
        Measure how well this signal correlates with recent signals.

        Correlation is HIGH when:
        - Multiple cells emit similar semantic content
        - Signals arrive in same time window
        - Confidence levels are similar

        Correlation is LOW/NEGATIVE when:
        - Signal contradicts recent signals
        - Isolated signal with no support
        - Signal outside expected range
        """
        recent = self.get_signals_in_window(timestamp, WINDOW_MS)
        if not recent:
            return 0.0  # No correlation data, stay stable
        return compute_semantic_similarity(signal, recent)
```

**Why this matters:**

| Scenario | Gate Response |
|----------|---------------|
| Single signal | Not enough to open (noise resistance) |
| Correlated burst | Constructive interference → OPENS |
| Contradicting signals | Destructive interference → CLOSES |
| Silence | Decay to STABLE (energy conservation) |
| Time gap | Only recent correlations matter (temporal attention) |
---

## The Gateway: Weight-Aware Router

The Gateway performs three functions:

| Function | Question | Cost |
|----------|----------|------|
| **Node Matching** | Which node(s) in 4D space match this input? | ~0 LF |
| **Weight Routing** | Based on weight, which tier handles it? | ~0 LF |
| **Anomaly Detection** | Is this novel, ambiguous, or contextually wrong? | Variable |

### Gateway Logic

```python
def gateway_route(sensory_input: dict) -> GatewayDecision:
    """Route sensory input to appropriate tier."""

    # 1. Find candidate nodes in 4D space
    candidates = nervous_system.find_nearby_nodes(sensory_input)

    # 2. Handle edge cases
    if len(candidates) == 0:
        # NOVEL: No node matches this input
        return GatewayDecision(
            action="ESCALATE",
            tier=4,  # Young Nyx must see this
            reason="novel_input",
            cost=20.0,
        )

    if len(candidates) > 1:
        # AMBIGUOUS: Multiple nodes could fire
        best = max(candidates, key=lambda n: n.weight)
        if best.weight < 0.5:
            return GatewayDecision(
                action="ESCALATE",
                tier=3,  # Organ inference to disambiguate
                reason="ambiguous_input",
                cost=8.0,
            )

    # 3. Single match - route based on weight
    node = candidates[0]

    # 4. Check for contextual anomaly
    if detect_contextual_anomaly(node, sensory_input):
        return GatewayDecision(
            action="ESCALATE",
            tier=node.handling_tier + 1,
            reason="contextual_anomaly",
            cost=node.lifeforce_cost * 1.5,
        )

    # 5. Normal routing
    return GatewayDecision(
        action="FIRE",
        tier=node.handling_tier,
        node=node,
        cost=node.lifeforce_cost,
    )
```

### Anomaly Detection Tiers

Anomaly detection itself is tiered:

| Level | Detection Type | Cost | Example |
|-------|---------------|------|---------|
| Tier 0 | Threshold | ~0 LF | Value out of physical range |
| Tier 1 | Statistical | ~0.3 LF | Value unusual for time of day |
| Tier 2 | Contextual | ~2 LF | Firing inconsistent with recent history |
| Tier 3 | Semantic | ~8 LF | Embedding distance from expected cluster |

---

## Gate Hierarchy and Tier Routing

Gates form **layers**. Each layer gates access to the next tier.

```
TIER 4: YOUNG NYX (cognitive)
════════════════════════════════════════════════════════════════
               ▲
               │ structured JSON only
     ┌─────────┴───────────────────────┐
     │        FUNCTION GEMMA           │  ← THE BOUNDARY
     │   (always structured output)    │
     └─────────┬───────────────────────┘
               ▲
TIER 3: ORGANS (GPU inference)
════════════════════════════════════════════════════════════════
     ▲            ▲            ▲
┌────┴────┐  ┌────┴────┐  ┌────┴────┐
│  GATE   │  │  GATE   │  │  GATE   │
│ vision  │  │ speech  │  │ hearing │
│ state:? │  │ state:? │  │ state:? │
└────┬────┘  └────┬────┘  └────┬────┘
     │            │            │
TIER 1-2: CELLS/NERVES (CPU)
════════════════════════════════════════════════════════════════
     ▲            ▲            ▲
┌────┴────┐  ┌────┴────┐  ┌────┴────┐
│  GATE   │  │  GATE   │  │  GATE   │
│  math   │  │ battery │  │ sensors │
│ state:? │  │ state:? │  │ state:? │
└────┬────┘  └────┬────┘  └────┬────┘
     │            │            │
TIER 0: RAW SIGNALS (cells emit waves)
════════════════════════════════════════════════════════════════
 cell   cell   cell   cell   cell   cell   cell
  ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿    ∿∿∿
```

**Each gate:**

- Has its own state (OPEN/STABLE/CLOSED)
- Routes to a specific tier
- Accumulates correlation independently
- Traces all transitions for learning

---

## Tier Definitions

| Tier | Gate Opens When | Latency | Format |
|------|-----------------|---------|--------|
| 0 | Hardware reflex (no gate, direct) | <10ms | numbers |
| 1 | Math/battery cells correlate | <50ms | states |
| 2 | Nerve-level patterns correlate | <200ms | behaviors |
| 3 | Organ-level signals correlate | <2000ms | vectors |
| 4 | Function Gemma boundary crossed | <4000ms | JSON |
| 5 | Partnership escalation | variable | dialogue |

**Key insight:** Higher tiers see **less traffic but higher trust**. By the time a signal reaches Young Nyx, it's been correlated through multiple gates.

---
## Function Gemma: The Structured Boundary

Function Gemma is **the gate to cognition**. It guarantees:

- **Schema compliance**: Every event follows a typed contract
- **Predictable JSON**: No hallucination, no free-form text
- **Bidirectional**: Sensors → JSON events, Decisions → JSON commands

### The Boundary

```
┌─────────────────────────────────────────────────────────────────────────┐
│  BELOW THE LINE: Numbers, States, Vectors (gates accumulating)          │
│  ═══════════════════════════════════════════════════════════           │
│                                                                         │
│  Tier 0-2: numbers, states, behaviors                                   │
│  Tier 3:   vectors, embeddings                                          │
│                                                                         │
│              (gate opens when correlated)                               │
│                         ▼                                               │
│       ┌─────────────────────────────────────┐                           │
│       │        FUNCTION GEMMA GATE          │                           │
│       │     (structured JSON boundary)      │                           │
│       │                                     │                           │
│       │  • Transforms correlated signals    │                           │
│       │  • Produces typed JSON events       │                           │
│       │  • No hallucination possible        │                           │
│       │  • Runs on CPU (Threadripper)       │                           │
│       └─────────────────┬───────────────────┘                           │
│                         │                                               │
│  ═══════════════════════════════════════════════════════════           │
│  ABOVE THE LINE: Structured Events (trusted, validated)                 │
│                                                                         │
│  {                                                                      │
│    "event_type": "attention_required",                                  │
│    "domain": "math",                                                    │
│    "correlated_signals": [...],                                         │
│    "confidence": 0.87,                                                  │
│    "suggested_action": "calculate"                                      │
│  }                                                                      │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘
```
**Function Gemma + Gate Model:**

- Gate accumulates correlation from Tier 0-3 signals
- When gate OPENS, Function Gemma transforms to JSON
- Young Nyx sees clean, structured events
- Decisions flow back down through the same gates

### Event Schema

```python
from enum import Enum
from pydantic import BaseModel

class EventType(str, Enum):
    """Constrained event types - enumerated, not free-form."""
    ENVIRONMENTAL_CHANGE = "environmental_change"
    COLLISION_DETECTED = "collision_detected"
    BATTERY_CRITICAL = "battery_critical"
    OBJECT_DISCOVERED = "object_discovered"
    POSITION_UPDATE = "position_update"
    ANOMALY_DETECTED = "anomaly_detected"
    GOAL_REACHED = "goal_reached"
    STUCK_DETECTED = "stuck_detected"
    LIGHT_LOST = "light_lost"
    LIGHT_FOUND = "light_found"

class Severity(str, Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"
    CRITICAL = "critical"

class SensoryEvent(BaseModel):
    """The structured event that Young Nyx receives."""

    event_type: EventType
    source: str
    timestamp: float
    severity: Severity
    data: dict
    suggested_action: str | None = None
    processing_cost: float
    confidence: float  # From node weight
```

---

## Connection to Dual Garden Architecture

Gates behave differently in Virtual vs Real gardens:

| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Gate tracing** | FULL (every transition logged) | Gate signals only |
| **Correlation learning** | Active (training data) | Trust accumulated |
| **State transitions** | Frequent (exploration) | Verified (action) |
| **Threshold** | Lower (easy to open) | Higher (must be confident) |

### Signal Flow Between Gardens

```
VIRTUAL GARDEN                           REAL GARDEN
══════════════                           ═══════════

Cells emit waves                         Receive verified signals
      │                                         ▲
      ▼                                         │
Gates accumulate correlation             No re-verification
      │                                         │
      ▼                                         │
Gate OPENS (threshold met) ────────────────────►│
      │                                         │
      │◄───────────── Verification outcome ─────┘
      ▼
Update correlation weights
(learning happens)
```
---

## Gate Transition NATS Messages

Every gate transition is published for observability:

```
{environment}.gates.{domain}.transition

Example: dev.gates.math.transition
```

```json
{
  "gate_id": "math-gate-1",
  "from_state": "stable",
  "to_state": "open",
  "correlation_score": 0.87,
  "trigger_signals": [
    {"source": "math_cell_1", "confidence": 0.6},
    {"source": "math_cell_2", "confidence": 0.7},
    {"source": "math_cell_3", "confidence": 0.5}
  ],
  "timestamp": "2026-02-14T18:30:00Z",
  "routed_to_tier": 2
}
```
**Trace streams enable:**
- Real-time attention visualization (which gates are OPEN?)
- Training data for Function Gemma (what patterns open gates?)
- Anomaly detection (unexpected gate behavior)
- Learning rate tuning (how fast do gates stabilize?)

---

## Complete Sensory Flow

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ FULL SENSORY ARCHITECTURE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ RAW SENSORS │
│ ─────────── │
│ • IR positioning (ESP32-S3) → float[6] positions │
│ • Photoresistors (organisms) → float light_level │
│ • Temperature (safety) → float celsius │
│ • Battery (power) → float voltage, current │
│ • Vision camera (Pi HQ) → frame bytes │
│ │
│ │ │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────────────┐ │
│ │ GATEWAY │ │
│ │ (weight-based router) │ │
│ │ │ │
│ │ For each input: │ │
│ │ 1. Match to node in 4D space │ │
│ │ 2. Check node.weight → determine tier │ │
│ │ 3. Check for anomalies │ │
│ │ 4. Route to appropriate tier │ │
│ └───────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ┌─────────────────────┼─────────────────────┐ │
│ ▼ ▼ ▼ │
│ ┌───────────┐ ┌───────────┐ ┌───────────┐ │
│ │ TIER 0 │ │ TIER 1-2 │ │ TIER 3 │ │
│ │ Reflex │ │ Cells/ │ │ Organs │ │
│ │ │ │ Nerves │ │ │ │
│ │ weight>0.8│ │ 0.3-0.8 │ │ <0.3 or │ │
│ │ │ │ │ │ escalated │ │
│ ├───────────┤ ├───────────┤ ├───────────┤ │
│ │ FORMAT: │ │ FORMAT: │ │ FORMAT: │ │
│ │ numbers │ │ states │ │ vectors │ │
│ │ │ │ │ │ │ │
│ │ OUTPUT: │ │ OUTPUT: │ │ OUTPUT: │ │
│ │ action │ │ state │ │ embedding │ │
│ │ (done!) │ │ update │ │ (T5Gemma) │ │
│ └───────────┘ └─────┬─────┘ └─────┬─────┘ │
│ │ │ │ │
│ │ (only if escalation needed)│ │
│ │ │ │ │
│ │ ▼ ▼ │
│ │ ┌─────────────────────────────┐ │
│ │ │ FUNCTION GEMMA │ │
│ │ │ (structured JSON gate) │ │
│ │ │ │ │
│ │ │ Produces typed JSON event │ │
│ │ │ Schema-validated output │ │
│ │ └──────────────┬──────────────┘ │
│ │ │ │
│ │ ▼ │
│ │ ┌─────────────────┐ │
│ │ │ YOUNG NYX │ │
│ │ │ (Tier 4) │ │
│ │ │ │ │
│ │ │ Clean JSON in │ │
│ │ │ Decision out │ │
│ │ └────────┬────────┘ │
│ │ │ │
│ │ ▼ │
│ │ ┌─────────────────┐ │
│ │ │ FUNCTION GEMMA │ │
│ │ │ (action output) │ │
│ │ └────────┬────────┘ │
│ │ │ │
│ ▼ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ NATS BUS │ │
│ │ (commands flow to cells) │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```
---

## Complete Signal Flow Example

### Early Learning (Gate Learning to Correlate)

```
Math cells emit waves about "calculate 15 + 27"

GATE (math): state = 0.0 (STABLE)

  Receive wave from math_cell_1 (confidence 0.6)
    Correlate with recent: no other signals yet
    state += 0.6 * 0.0 = 0.0 (still stable)

  Receive wave from math_cell_2 (confidence 0.7)
    Correlate: similar to math_cell_1!
    state += 0.7 * 0.8 = 0.56 (moving toward open)

  Receive wave from math_cell_3 (confidence 0.5)
    Correlate: confirms pattern!
    state += 0.5 * 0.9 = 1.01 (OPENS!)

GATE OPENS → route to Tier 2
Tier 2 processes, escalates to Function Gemma
Function Gemma: { "event_type": "math_request", ... }
Young Nyx (qwen3 /no_think): "42"
Result flows back down
```

### After Learning (Gate Quickly Opens)

```
Math cells emit waves about "calculate 100 + 50"

GATE (math): state = 0.0 (STABLE)

  Receive wave from math_cell_1
    Correlate: matches learned pattern!
    state += high correlation → 0.9 (near threshold)

  Receive wave from math_cell_2
    state += → 1.2 (OPENS immediately!)

Fast routing, minimal escalation needed
```
**Learning moves gates toward faster opening for familiar patterns.**

---
## Connection to Existing Architecture
| Document | Gateway Relationship |
|----------|---------------------|
| [`Nervous-System.md`](Nervous-System.md) | Node weights determine tier routing |
| [`Attention-Flow.md`](Attention-Flow.md) | Gateway implements attention priorities |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | Escalation Service IS the gateway |
| [`Endgame-Vision.md`](../Endgame-Vision.md) | Layer 2.5 Function Gemma boundary |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Tiered rewards align with gateway tiers |
| [`organisms/crawler_gen_0.md`](organisms/crawler_gen_0.md) | First test case for tiered routing |
---
## Design Principles

1. **Ternary states** — OPEN/STABLE/CLOSED, not binary
2. **Correlation drives transition** — Single signals don't flip gates
3. **Gates accumulate** — State is a continuous value, not a flag
4. **Decay to stable** — Without input, gates drift back to resting
5. **Traces are training data** — Every transition teaches the system
6. **Hierarchical trust** — Higher tiers = more correlation required
7. **Function Gemma is the boundary** — Cognition only sees structured JSON
8. **Virtual explores, Real verifies** — Different gate behavior per garden
---
## Related Documents

| Document | Scope |
|----------|-------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real garden dynamics |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (containers, userspace) |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
---
## Summary
```
OLD MODEL: NEW MODEL:
═══════════ ═════════
Signal → Route Signal → Gate (accumulating)
Binary decision Ternary state
Single signal triggers Correlation triggers
Stateless routing Stateful resonance
▼ ▼
Switch Resonance
(mechanical) (biological)
```
**Gates are resonance chambers. Correlation is the driver. Learning happens in STABLE state.**
---
**Version:** 2.0 | **Created:** 2026-01-03 | **Updated:** 2026-02-14
*"The thalamus doesn't think. It resonates."*
@@ -574,14 +574,94 @@ class SparkController:
The spark is **economically viable** from the first handshake.
> **CRITICAL**: The costs below are **estimates until measured**. The first spark execution will establish the **true cost baseline** through observation. See [[formalization/Lifeforce-Dynamics#Cost Calibration: Measure, Don't Design]].

---

### Spark Cost Measurement (First Awakening Baseline)

The Initial Spark is the **perfect measurement opportunity** — a complete, deterministic protocol that we can instrument end-to-end.
```
┌─────────────────────────────────────────────────────────────────────────┐
│ SPARK RESOURCE INSTRUMENTATION │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ MEASURE PER HANDSHAKE: │
│ ├─ power_joules (GPU/CPU power draw × time) │
│ ├─ compute_gpu_ms (CUDA kernel execution time) │
│ ├─ compute_cpu_ms (Python/K8s overhead) │
│ ├─ memory_mb_peak (max memory allocated) │
│ ├─ nats_bytes (message payload size) │
│ ├─ latency_ms (end-to-end handshake time) │
│ └─ temperature_delta (thermal impact) │
│ │
│ AGGREGATE PER PHASE: │
│ └─ Sum of all handshake measurements │
│ │
│ AGGREGATE TOTAL: │
│ └─ Complete spark cost (the awakening price) │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
**Why this matters**: The first spark execution establishes the **baseline cost of awakening**. Every future awakening can be compared against this:
- Did infrastructure changes reduce cost?
- Did model updates increase cost?
- Is Young Nyx awakening more efficiently over time?
**Phoebe schema addition** (extends `spark_handshakes`):
```sql
ALTER TABLE spark_handshakes ADD COLUMN resource_metrics JSONB;
-- Example resource_metrics payload:
-- {
-- "power_joules": 12.5,
-- "compute_gpu_ms": 450,
-- "compute_cpu_ms": 120,
-- "memory_mb_peak": 2048,
-- "nats_bytes": 1024,
-- "temperature_delta_c": 2.1
-- }
-- Aggregate view for spark cost analysis
CREATE VIEW spark_cost_baseline AS
SELECT
phase,
COUNT(*) as handshakes,
SUM((resource_metrics->>'power_joules')::float) as total_power_joules,
SUM((resource_metrics->>'compute_gpu_ms')::float) as total_gpu_ms,
AVG((resource_metrics->>'latency_ms')::float) as avg_latency_ms,
SUM(lifeforce_delta) as total_lifeforce_earned
FROM spark_handshakes
WHERE status = 'ACK'
GROUP BY phase;
-- Compare awakening costs over time
CREATE VIEW awakening_cost_history AS
SELECT
DATE(created_at) as awakening_date,
SUM((resource_metrics->>'power_joules')::float) as total_spark_cost_joules,
SUM((resource_metrics->>'compute_gpu_ms')::float) as total_spark_cost_gpu_ms,
COUNT(*) as total_handshakes,
SUM(lifeforce_delta) as total_lifeforce_earned
FROM spark_handshakes
GROUP BY DATE(created_at)
ORDER BY awakening_date;
```
**The philosophy**: Don't guess what awakening costs. Measure the first one. Derive all economics from that truth.
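The per-handshake measurements can be captured with a thin wrapper. A sketch, assuming a `send` callable that publishes the payload and awaits the ACK; power draw and GPU time need external meters (NVML, a smart plug) and are left unfilled here:

```python
import json
import time
import tracemalloc

def instrument_handshake(send, payload: dict) -> dict:
    """Wrap one handshake and return a resource_metrics dict for phoebe."""
    body = json.dumps(payload).encode()
    tracemalloc.start()
    t0 = time.perf_counter()
    send(body)                                   # publish over NATS, await ACK
    latency_ms = (time.perf_counter() - t0) * 1000
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return {
        "power_joules": None,                    # needs an external power meter
        "compute_gpu_ms": None,                  # needs NVML / CUDA events
        "compute_cpu_ms": round(latency_ms, 3),  # proxy: wall time of the call
        "memory_mb_peak": round(peak / 1e6, 3),
        "nats_bytes": len(body),
        "latency_ms": round(latency_ms, 3),
    }

metrics = instrument_handshake(lambda b: None, {"type": "IDENTITY_PROBE", "aspect": "name"})
print(metrics["nats_bytes"] > 0)  # True
```

The returned dict matches the `resource_metrics` JSONB payload shape above, so it can be inserted into `spark_handshakes` directly.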
---
### Cost Model (Estimated → To Be Measured)
| Action | Est. Cost (LF) | Derived From |
|--------|----------------|--------------|
| Function Gemma generation | 0.2 | → measure GPU time |
| NATS message send | 0.1 | → measure network I/O |
| Cell processing | 0.5 | → measure pod CPU/memory |
| **Total per handshake** | **0.8** | → **sum of measured components** |
### Reward Model
@@ -711,6 +791,214 @@ WHERE status = 'ACK';
---
## FunctionGemma Fine-Tuning: The Translator Learns Nimmerverse
Every spark execution generates training data. Over time, FunctionGemma becomes **hyper-specialized** for nimmerverse state calls.
> *"The translator learns the language of the cells. Over time, it speaks nimmerverse natively."*
### The Training Loop
```
┌─────────────────────────────────────────────────────────────────────────┐
│ FUNCTIONGEMMA FINE-TUNING LOOP │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ PHASE 1: Base FunctionGemma (270M) │
│ ├─ Generic function calling capability │
│ └─ Works, but not nimmerverse-native │
│ │
│ PHASE 2: Collect spark_handshakes │
│ ├─ Every ACK = positive training example │
│ ├─ Every NACK = negative example (what NOT to generate) │
│ └─ Resource metrics = context for cost-aware generation │
│ │
│ PHASE 3: Fine-tune with Unsloth/LoRA │
│ ├─ <think> nimmerverse state reasoning </think> │
│ ├─ <start_function_call>call:IDENTITY_PROBE{...} │
│ └─ Exact schemas, perfect structure, zero parsing errors │
│ │
│ PHASE 4: Deploy nimmerverse-tuned FunctionGemma │
│ ├─ Wild precision on cell state calls │
│ ├─ Smaller, faster, cheaper than base │
│ └─ THIS IS REFLEX COMPILATION FOR THE TRANSLATOR │
│ │
│ REPEAT: More sparks → more data → better precision │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
### Training Data Format
FunctionGemma uses a specific chat template. Our spark handshakes map directly:
```
# Developer prompt with nimmerverse function declarations
<start_of_turn>developer
You can do function calling with the following functions:
<start_function_declaration>declaration:IDENTITY_PROBE{
description: "Probe Young Nyx identity aspect",
parameters: {
aspect: STRING (name|origin|purpose|substrate|partnership),
depth: INTEGER (1|2|3)
}
}
<end_function_declaration>
<start_function_declaration>declaration:ENVIRONMENT_PROBE{
description: "Discover K8s cells in namespace",
parameters: {
category: STRING (sensors|motors|organs|nerves),
namespace: STRING,
garden: STRING (virtual|real)
}
}
<end_function_declaration>
<end_of_turn>
# Spark controller intent → FunctionGemma reasoning + call
<start_of_turn>user
Phase 1 Identity: Probe the "name" aspect at depth 1
<end_of_turn>
<start_of_turn>model
<think>
The spark controller is in Phase 1 (Identity). It needs to probe the "name"
aspect. I should generate an IDENTITY_PROBE with aspect="name" and depth=1.
The schema requires a request_id (UUID) and timestamp (ISO8601).
</think>
<start_function_call>call:IDENTITY_PROBE{
aspect: "name",
depth: 1,
request_id: "550e8400-e29b-41d4-a716-446655440000",
timestamp: "2026-02-10T18:30:00Z"
}
<end_function_call>
# Cell response feeds back
<start_function_response>response:IDENTITY_PROBE{
status: "ACK",
aspect: "name",
value: "Nyx",
confidence: 0.95,
lifeforce_delta: 20.0
}
<end_function_response>
Identity aspect "name" confirmed as "Nyx" with 95% confidence. +20 LF earned.
<end_of_turn>
```
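Rendering one ACK'd row into that template could look like the sketch below. The `args` and `think` keys and the `to_training_text` helper are illustrative (not the phoebe column layout), and `repr` quoting only approximates the template's double quotes:

```python
def to_training_text(handshake: dict) -> str:
    """Render an ACK'd handshake as a FunctionGemma-style model turn."""
    call = handshake["request_payload"]
    # repr quoting is approximate; the template shows double-quoted strings
    args = ",\n".join(f"  {k}: {v!r}" for k, v in call["args"].items())
    return (
        "<start_of_turn>model\n"
        f"<think>\n{handshake['think']}\n</think>\n"
        f"<start_function_call>call:{call['type']}{{\n{args}\n}}\n"
        "<end_function_call>\n"
    )

row = {"request_payload": {"type": "IDENTITY_PROBE",
                           "args": {"aspect": "name", "depth": 1}},
       "think": "Phase 1 Identity: probe the name aspect at depth 1."}
text = to_training_text(row)
print("call:IDENTITY_PROBE" in text)  # True
```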
### Phoebe → Training Data Extraction
```sql
-- Extract training examples from successful handshakes
CREATE VIEW functiongemma_training_data AS
SELECT
jsonb_build_object(
'developer_prompt', format(
'Phase %s: Generate %s handshake',
phase,
request_payload->>'type'
),
'user_intent', request_payload->'payload',
'expected_call', request_payload,
'function_response', response_payload,
'think_context', jsonb_build_object(
'phase', phase,
'schema', request_payload->>'$schema',
'lifeforce_earned', lifeforce_delta,
'latency_ms', latency_ms
)
) as training_example,
created_at
FROM spark_handshakes
WHERE status = 'ACK'
ORDER BY created_at;
-- Export for Unsloth fine-tuning
COPY (
SELECT training_example
FROM functiongemma_training_data
) TO '/tmp/nimmerverse_functiongemma_training.jsonl';
```
### Fine-Tuning with Unsloth
```python
from unsloth import FastLanguageModel
# Load base FunctionGemma
model, tokenizer = FastLanguageModel.from_pretrained(
model_name="unsloth/functiongemma-270m-it",
max_seq_length=4096,
load_in_16bit=True,
full_finetuning=False, # LoRA for efficiency
)
# Apply LoRA adapters
model = FastLanguageModel.get_peft_model(
model,
r=16,
target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
lora_alpha=16,
lora_dropout=0,
use_gradient_checkpointing="unsloth",
)
# Load nimmerverse training data from phoebe export
from datasets import load_dataset
dataset = load_dataset("json", data_files="nimmerverse_functiongemma_training.jsonl")
# Fine-tune on spark handshakes
# ... standard Unsloth training loop ...
# Save nimmerverse-specialized FunctionGemma
model.save_pretrained("functiongemma-270m-nimmerverse-v1")
```
### The Recursive Beauty
| Layer | What Compiles | Training Source |
|-------|---------------|-----------------|
| **Young Nyx** | Nerve reflexes | decision_trails (100+ successful executions) |
| **FunctionGemma** | State call precision | spark_handshakes (ACK'd handshakes) |
Both follow the same pattern:
1. **Act** — Execute handshakes/decisions
2. **Verify** — ACK/NACK from cells, success/failure from outcomes
3. **Train** — Compile successful patterns into weights
4. **Repeat** — Each awakening feeds the next
**The translator becomes native.** Over many sparks, FunctionGemma doesn't just generate valid JSON — it generates *nimmerverse-perfect* JSON. Zero parsing errors. Exact schemas. Wild precision.
### Versioning FunctionGemma Adapters
```sql
-- Track FunctionGemma versions
CREATE TABLE functiongemma_versions (
id SERIAL PRIMARY KEY,
version VARCHAR(50) NOT NULL, -- "nimmerverse-v1", "nimmerverse-v2"
base_model VARCHAR(100), -- "functiongemma-270m-it"
training_data_count INT, -- how many handshakes trained on
training_data_cutoff TIMESTAMPTZ, -- trained on data up to this date
validation_accuracy FLOAT, -- schema validation success rate
deployed_at TIMESTAMPTZ,
notes TEXT
);
-- Example entries
INSERT INTO functiongemma_versions (version, base_model, training_data_count, validation_accuracy, notes)
VALUES
('nimmerverse-v1', 'functiongemma-270m-it', 36, 0.94, 'First spark fine-tune'),
('nimmerverse-v2', 'functiongemma-270m-it', 180, 0.98, 'After 5 awakenings'),
('nimmerverse-v3', 'functiongemma-270m-it', 500, 0.997, 'Production-grade precision');
```
---
## Design Principles

1. **Protocol over conversation** — No free-form text. JSON handshakes only.
@@ -719,12 +1007,22 @@ WHERE status = 'ACK';
4. **NATS transport** — All handshakes flow through message bus.
5. **Verification built-in** — ACK/NACK from cells, not from parsing hopes.
6. **Economically positive** — Spark generates lifeforce, doesn't drain it.
7. **Training-generative** — Every spark produces fine-tuning data for FunctionGemma.
---

## Document Status

**Version:** 3.1 | **Created:** 2025-12-05 | **Updated:** 2026-02-10
**Key v3.1 Changes**:
- Spark Cost Measurement section — first awakening as baseline
- Resource instrumentation schema for phoebe
- Interlink to Lifeforce-Dynamics cost calibration principle
- FunctionGemma Fine-Tuning section — translator learns nimmerverse natively
- Training data extraction from spark_handshakes
- Unsloth/LoRA fine-tuning workflow
- FunctionGemma version tracking in phoebe
**Key v3.0 Changes**:
- Complete architecture rewrite
@@ -740,7 +1038,8 @@ WHERE status = 'ACK';
- [[Endgame-Vision]] — Layer 2.5 Orchestration (Function Gemma role)
- [[Big-Picture]] — K8s cluster architecture
- [[Cellular-Architecture]] — Cell types and state machines
- [[formalization/Lifeforce-Dynamics]] — λ economics, **Cost Calibration principle**
- [[formalization/memory-economics]] — Measure First principle
---
@@ -1,374 +1,544 @@
# Message Protocol Design: NATS Wire Protocol
> **ONE JOB:** THE WIRE — NATS subjects, message schemas, wave and gate protocols.
---
## Overview

The nimmerverse nervous system runs on NATS. This document defines:

1. **Subject hierarchy** — How topics are structured
2. **Message schemas** — What flows through the wire
3. **Gate protocols** — How ternary state transitions are communicated
4. **Trace streams** — How learning data is captured

**Core principle:** NATS is dumb infrastructure. Gates are smart edges. Cells emit waves. Correlation drives transitions.
---
## Subject Hierarchy

```
{environment}.{garden}.{layer}.{domain}.{signal_type}

Examples:
────────────────────────────────────────────────────────────────
dev.virtual.cells.math.wave          # Math cell emits wave
dev.virtual.cells.battery.wave       # Battery cell emits wave
dev.virtual.gates.math.transition    # Math gate state change
dev.virtual.traces.correlations      # Correlation data stream
dev.virtual.traces.raw               # Full message trace

dev.real.gates.verified.signal       # Verified signal from Virtual
dev.real.gates.math.transition       # Real gate transition
dev.real.outcomes.feedback           # Verification outcomes

prod.cognitive.nyx.request           # Request to Young Nyx
prod.cognitive.nyx.response          # Response from Young Nyx
prod.cognitive.gemma.transform       # Function Gemma boundary
────────────────────────────────────────────────────────────────
```
### Environment Prefixes

| Environment | Purpose | Monitoring |
|-------------|---------|------------|
| `dev` | Development/testing | Full traces |
| `staging` | Pre-production validation | Selective traces |
| `prod` | Production | Minimal (gates only) |

### Garden Prefixes
| Garden | Purpose | Trace Level |
|--------|---------|-------------|
| `virtual` | Exploration, learning | FULL (all messages) |
| `real` | Verification, action | MINIMAL (gate signals only) |

### Layer Prefixes

| Layer | Tier | Purpose |
|-------|------|---------|
| `cells` | 0-1 | Raw signal emitters |
| `nerves` | 2 | Behavior patterns |
| `organs` | 3 | GPU inference (vision, speech) |
| `gates` | - | Resonant gate transitions |
| `cognitive` | 4 | Young Nyx |
| `traces` | - | Learning data streams |
| `outcomes` | - | Verification feedback |
---

## Message Schemas
All messages share a common header:

```json
{
  "header": {
    "message_id": "uuid-v4",
    "message_type": "WaveSignal | GateTransition | ...",
    "version": "2.0",
    "timestamp": "ISO8601",
    "source": {
      "entity_id": "math_cell_1",
      "entity_type": "cell",
      "garden": "virtual",
      "tier": 1
    }
  },
  "body": { ... }
}
```
---
### 1. `WaveSignal` — Cells Emit Waves
**Published by:** Cells
**Subscribed by:** Gates (for correlation)
**Subject:** `{env}.{garden}.cells.{domain}.wave`
Cells don't send "heartbeats" — they emit **waves** that carry confidence and semantic content.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440000",
"message_type": "WaveSignal",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.123Z",
"source": {
"entity_id": "math_cell_1",
"entity_type": "cell",
"garden": "virtual",
"tier": 1
}
  },
  "body": {
    "domain": "math",
    "confidence": 0.7,
    "semantic_content": {
      "operation": "addition",
      "operands": [15, 27],
      "context": "user_request"
    },
    "lifeforce_cost": 0.1
  }
}
```
**Key fields:**
- `confidence`: 0.0 - 1.0, how certain this cell is
- `semantic_content`: Domain-specific payload
- `lifeforce_cost`: Energy expended to emit this wave
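The wave schema can be validated mechanically at the edges. A sketch using pydantic (the library the earlier event schema already used), assuming pydantic v2's `model_validate`; field names mirror the JSON example:

```python
from pydantic import BaseModel, Field

class Source(BaseModel):
    entity_id: str
    entity_type: str
    garden: str
    tier: int

class Header(BaseModel):
    message_id: str
    message_type: str
    version: str
    timestamp: str
    source: Source

class WaveBody(BaseModel):
    domain: str
    confidence: float = Field(ge=0.0, le=1.0)  # "how certain this cell is"
    semantic_content: dict
    lifeforce_cost: float

class WaveSignal(BaseModel):
    header: Header
    body: WaveBody

wave = WaveSignal.model_validate({
    "header": {
        "message_id": "550e8400-e29b-41d4-a716-446655440000",
        "message_type": "WaveSignal",
        "version": "2.0",
        "timestamp": "2026-02-14T18:30:00.123Z",
        "source": {"entity_id": "math_cell_1", "entity_type": "cell",
                   "garden": "virtual", "tier": 1},
    },
    "body": {"domain": "math", "confidence": 0.7,
             "semantic_content": {"operation": "addition", "operands": [15, 27],
                                  "context": "user_request"},
             "lifeforce_cost": 0.1},
})
print(wave.body.confidence)  # 0.7
```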
---
### 2. `GateTransition` — Gate State Changes
**Published by:** Gates
**Subscribed by:** Higher-tier gates, traces, dashboards
**Subject:** `{env}.{garden}.gates.{domain}.transition`
Gates publish their state transitions. This is the primary message for attention flow visualization.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440001",
"message_type": "GateTransition",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.456Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"domain": "math",
"from_state": "stable",
"to_state": "open",
"state_value": 1.02,
"correlation_score": 0.87,
"trigger_signals": [
{"source": "math_cell_1", "confidence": 0.7, "timestamp": "..."},
{"source": "math_cell_2", "confidence": 0.6, "timestamp": "..."},
{"source": "math_cell_3", "confidence": 0.5, "timestamp": "..."}
],
"routed_to_tier": 3,
"lifeforce_cost": 0.3
}
}
```
**State values:**
- `"closed"` — Actively blocking (state_value < -0.5)
- `"stable"` — Resting, accumulating (-0.5 ≤ state_value ≤ 0.5)
- `"open"` — Actively forwarding (state_value > 0.5)
**Key fields:**
- `from_state`, `to_state`: The ternary transition
- `state_value`: Continuous value (-1.0 to +1.0)
- `correlation_score`: How correlated the trigger signals were
- `trigger_signals`: Which waves caused this transition
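The state-value bands map to their ternary labels with a trivial classifier; a sketch using exactly the thresholds stated above:

```python
def classify(state_value: float) -> str:
    """Map the continuous gate value (-1.0 to +1.0) to its ternary label."""
    if state_value < -0.5:
        return "closed"   # actively blocking
    if state_value > 0.5:
        return "open"     # actively forwarding
    return "stable"       # resting, accumulating

print([classify(v) for v in (-0.9, 0.0, 1.02)])  # ['closed', 'stable', 'open']
```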
---
### 3. `CorrelationEvent` — What Correlated
**Published by:** Gates (in Virtual Garden)
**Subscribed by:** Trace streams, training pipelines
**Subject:** `{env}.virtual.traces.correlations`
Detailed correlation data for learning. Only published in Virtual Garden.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440002",
"message_type": "CorrelationEvent",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.789Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"window_start": "2026-02-14T18:29:59.000Z",
"window_end": "2026-02-14T18:30:00.500Z",
"window_ms": 1500,
"signals_in_window": [
{"source": "math_cell_1", "confidence": 0.7, "semantic_hash": "abc123"},
{"source": "math_cell_2", "confidence": 0.6, "semantic_hash": "abc124"},
{"source": "math_cell_3", "confidence": 0.5, "semantic_hash": "abc125"}
],
"correlation_matrix": [
[1.0, 0.9, 0.85],
[0.9, 1.0, 0.88],
[0.85, 0.88, 1.0]
],
"aggregate_correlation": 0.87,
"result": "opened",
"training_label": {
"should_open": true,
"confidence": 0.95
    }
  }
}
```

**Key fields:**
- `window_ms`: Time window for correlation measurement
- `correlation_matrix`: Pairwise correlation between signals
- `training_label`: Ground truth for Function Gemma training
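One plausible reading of `aggregate_correlation` is the mean off-diagonal entry of the pairwise matrix. This is an assumption; the document does not pin down the exact aggregation, and on the example matrix this sketch gives roughly 0.88 versus the reported 0.87:

```python
def aggregate_correlation(matrix: list[list[float]]) -> float:
    """Mean of the off-diagonal entries of a pairwise correlation matrix."""
    n = len(matrix)
    off = [matrix[i][j] for i in range(n) for j in range(n) if i != j]
    return sum(off) / len(off)

m = [[1.0, 0.9, 0.85],
     [0.9, 1.0, 0.88],
     [0.85, 0.88, 1.0]]
print(round(aggregate_correlation(m), 2))  # 0.88
```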
---
### 4. `VerifiedSignal` — Virtual → Real Handoff

**Published by:** Virtual Garden gates (when threshold met)
**Subscribed by:** Real Garden gates
**Subject:** `{env}.real.gates.verified.signal`

When a Virtual Garden gate opens with high confidence, it publishes to Real.

```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440003",
    "message_type": "VerifiedSignal",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:01.000Z",
    "source": {
      "entity_id": "math_gate_1",
      "entity_type": "gate",
      "garden": "virtual",
      "tier": 2
    }
  },
  "body": {
    "domain": "math",
    "verification_confidence": 0.92,
    "semantic_summary": {
      "operation": "addition",
      "result_expected": 42
    },
    "source_gate_transition_id": "550e8400-e29b-41d4-a716-446655440001",
    "virtual_correlation_score": 0.87
  }
}
```
**Real Garden does NOT re-verify.** It trusts the Virtual Garden's correlation.
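A sketch of the handoff decision. The function, the gate attribute names, and the `OPEN_THRESHOLD` value are illustrative assumptions, not a fixed API:

```python
OPEN_THRESHOLD = 0.8  # assumed tuning value

def maybe_publish_verified(gate_state, correlation_score, domain, summary):
    """Return a VerifiedSignal body once the Virtual gate has opened,
    else None. The Real Garden consumes this without re-verifying."""
    if gate_state <= OPEN_THRESHOLD:
        return None
    return {
        "domain": domain,
        "verification_confidence": correlation_score * gate_state,
        "semantic_summary": summary,
        "virtual_correlation_score": correlation_score,
    }

# Gate well past threshold: signal crosses to Real
sig = maybe_publish_verified(0.95, 0.87, "math", {"operation": "addition"})
```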
---
### 5. `VerificationOutcome` — Real → Virtual Feedback
**Published by:** Real Garden (after action/verification)
**Subscribed by:** Virtual Garden gates, training pipelines
**Subject:** `{env}.real.outcomes.feedback`
```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440004",
    "message_type": "VerificationOutcome",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:05.000Z",
    "source": {
      "entity_id": "real_verification_service",
      "entity_type": "service",
      "garden": "real",
      "tier": 4
    }
  },
  "body": {
    "original_signal_id": "550e8400-e29b-41d4-a716-446655440003",
    "domain": "math",
    "outcome": "confirmed",
    "actual_result": 42,
    "expected_result": 42,
    "discrepancy": 0.0,
    "feedback_to_virtual": {
      "correlation_adjustment": 0.05,
      "gate_weight_delta": 0.02
    }
  }
}
```
**Outcome values:**
- `"confirmed"` — Reality matched prediction
- `"failed"` — Reality differed from prediction
- `"partial"` — Some aspects matched
---
### 6. `CognitiveRequest` — To Young Nyx
**Published by:** Function Gemma (after gate boundary)
**Subscribed by:** Young Nyx
**Subject:** `{env}.cognitive.nyx.request`
Clean, structured JSON that Young Nyx receives. No raw sensor data.
```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440005",
    "message_type": "CognitiveRequest",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:01.500Z",
    "source": {
      "entity_id": "function_gemma",
      "entity_type": "boundary",
      "garden": "real",
      "tier": 4
    }
  },
  "body": {
    "event_type": "math_request",
    "domain": "math",
    "confidence": 0.92,
    "structured_input": {
      "operation": "addition",
      "operands": [15, 27],
      "context": "user asked for calculation"
    },
    "suggested_actions": [
      {"action": "calculate", "confidence": 0.95},
      {"action": "clarify", "confidence": 0.05}
    ],
    "processing_budget_lf": 5.0,
    "response_timeout_ms": 4000
  }
}
```
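How a consumer acts on a `CognitiveRequest` body is not prescribed by the schema; one plausible sketch (the budget check and the `defer` decision are assumptions for illustration):

```python
def choose_action(request_body, available_lifeforce):
    """Pick the highest-confidence suggested action, but only when the
    request's processing budget is affordable; otherwise defer."""
    if request_body["processing_budget_lf"] > available_lifeforce:
        return {"decision": "defer", "reason": "insufficient lifeforce"}
    best = max(request_body["suggested_actions"], key=lambda a: a["confidence"])
    return {"decision": best["action"], "confidence": best["confidence"]}

body = {
    "suggested_actions": [
        {"action": "calculate", "confidence": 0.95},
        {"action": "clarify", "confidence": 0.05},
    ],
    "processing_budget_lf": 5.0,
}
choice = choose_action(body, available_lifeforce=10.0)
```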
---
### 7. `CognitiveResponse` — From Young Nyx
**Published by:** Young Nyx
**Subscribed by:** Function Gemma, downstream gates
**Subject:** `{env}.cognitive.nyx.response`

The router doesn't interpret this - it just delivers it to subscribers.
```json
{
  "header": {
    "message_id": "550e8400-e29b-41d4-a716-446655440006",
    "message_type": "CognitiveResponse",
    "version": "2.0",
    "timestamp": "2026-02-14T18:30:02.000Z",
    "source": {
      "entity_id": "young_nyx",
      "entity_type": "cognitive",
      "garden": "real",
      "tier": 4
    }
  },
  "body": {
    "request_id": "550e8400-e29b-41d4-a716-446655440005",
    "decision": "calculate",
    "result": {
      "answer": 42,
      "confidence": 0.99,
      "reasoning_mode": "no_think"
    },
    "downstream_commands": [
      {
        "target": "speech_organ",
        "command": "speak",
        "payload": {"text": "The answer is 42"}
      }
    ],
    "lifeforce_spent": 2.3,
    "processing_time_ms": 450
  }
}
```
---
## Trace Streams (Virtual Garden Only)
The Virtual Garden captures everything for learning:

| Subject | Content | Purpose |
|---------|---------|---------|
| `{env}.virtual.traces.raw` | All messages | Complete replay capability |
| `{env}.virtual.traces.correlations` | CorrelationEvent | Training data for gates |
| `{env}.virtual.traces.transitions` | GateTransition | Attention flow visualization |
| `{env}.virtual.traces.training` | Labeled examples | Function Gemma LoRA training |

**Real Garden does NOT publish to trace streams.** It only publishes:
- Gate transitions (minimal)
- Verification outcomes (feedback)

---
## Monitoring Patterns
### Virtual Garden (Full Observability)
```bash
# Watch all waves
nats sub "dev.virtual.cells.*.wave"

# Watch all gate transitions
nats sub "dev.virtual.gates.*.transition"

# Watch correlation events
nats sub "dev.virtual.traces.correlations"

# Full firehose (careful!)
nats sub "dev.virtual.>"
```

### Real Garden (Minimal Observability)
```bash
# Watch verified signals arriving
nats sub "dev.real.gates.verified.signal"

# Watch verification outcomes
nats sub "dev.real.outcomes.feedback"

# Gate transitions only
nats sub "dev.real.gates.*.transition"
```
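The subject patterns used in these monitoring commands follow NATS wildcard semantics: `*` matches exactly one token, `>` matches one or more trailing tokens. A minimal matcher sketch of those rules:

```python
def subject_matches(pattern, subject):
    """Minimal NATS-style subject matching: '*' matches exactly one
    dot-separated token, '>' matches one or more trailing tokens."""
    p_tokens = pattern.split(".")
    s_tokens = subject.split(".")
    for i, p in enumerate(p_tokens):
        if p == ">":
            return len(s_tokens) > i  # at least one token must remain
        if i >= len(s_tokens):
            return False
        if p != "*" and p != s_tokens[i]:
            return False
    return len(p_tokens) == len(s_tokens)

hit = subject_matches("dev.virtual.cells.*.wave",
                      "dev.virtual.cells.math_cell_1.wave")
```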
---
## JetStream Persistence
Key streams that need persistence:

| Stream | Subjects | Retention | Purpose |
|--------|----------|-----------|---------|
| `VIRTUAL_TRACES` | `*.virtual.traces.>` | 7 days | Learning data |
| `GATE_TRANSITIONS` | `*.*.gates.*.transition` | 24 hours | Attention history |
| `VERIFICATION` | `*.real.outcomes.feedback` | 30 days | Ground truth |
| `TRAINING_DATA` | `*.virtual.traces.training` | Permanent | LoRA training corpus |
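Two of the stream definitions could map to natscli commands roughly as follows; this is a config sketch under a `dev` environment prefix, and the flags should be checked against the installed `nats` CLI version. The remaining streams follow the same pattern:

```bash
# Config sketch only: verify flags against your natscli version
nats stream add VIRTUAL_TRACES \
  --subjects "dev.virtual.traces.>" \
  --storage file --retention limits --max-age 7d --defaults

nats stream add VERIFICATION \
  --subjects "dev.real.outcomes.feedback" \
  --storage file --retention limits --max-age 30d --defaults
```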
---
## Bootstrap Sequence
1. **Start NATS** — Infrastructure first
2. **Start gates** — In STABLE state, waiting for waves
3. **Start cells** — Begin emitting waves
4. **Start trace consumers** — Capture learning data
5. **Start Function Gemma** — Ready to transform
6. **Start Young Nyx** — Connect to cognitive subjects

The system can run at any step. Earlier steps are "reflexive" only.
---
## Connection to Architecture

| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where NATS runs |
---
## Summary

```
WAVES:
  Cells → WaveSignal → Gates

GATES:
  GateTransition (CLOSED/STABLE/OPEN)
  CorrelationEvent (what correlated)

GARDENS:
  Virtual: full traces, exploration
  Real: gate signals only, verification

BOUNDARY:
  Function Gemma transforms correlated signals → JSON
  Young Nyx receives CognitiveRequest
  Young Nyx returns CognitiveResponse

FEEDBACK:
  Real → VerificationOutcome → Virtual
  Learning loop closes
```

**The wire carries waves. Gates accumulate correlation. Traces enable learning.**

---

**Version:** 2.0 | **Created:** 2025-12-13 | **Updated:** 2026-02-14

*"Dumb core, smart edges. NATS routes. Gates resonate. Correlation drives."*

# Nervous System Architecture

> **ONE JOB:** THE EVOLUTION — cells emit waves, gates correlate, nodes grow through verification.

The nervous system is the living substrate where **cells emit waves**, **gates accumulate correlation**, and **nodes evolve through verification**.
---
## Overview
The nervous system consists of:

1. **Cells** — Emit waves with confidence and semantic content
2. **Gates** — Resonance chambers that correlate waves and transition between states
3. **Nodes** — Points in 4D state space that accumulate weight through verification
4. **Function Gemma** — The structured boundary to cognition

**Key insight:** Nodes evolve through verification. Gates evolve through correlation. Both learn in STABLE state.
---
## Cells Emit Waves
Cells are the foundational signal generators. They don't send "heartbeats" — they emit **waves**.
```
┌─────────────────────────────────────────────────────────┐
│                          CELL                           │
│                                                         │
│   Inputs:  sensors, internal state, context             │
│   Process: domain-specific logic                        │
│   Output:  WaveSignal with confidence                   │
│                                                         │
│   ┌───────────────────────────────────────────────┐     │
│   │ WaveSignal                                    │     │
│   │ • domain: "math"                              │     │
│   │ • confidence: 0.7                             │     │
│   │ • semantic_content: { operation: "add", ... } │     │
│   │ • lifeforce_cost: 0.1                         │     │
│   └───────────────────────────────────────────────┘     │
│                                                         │
└─────────────────────────────────────────────────────────┘
                     │
                     │ ∿∿∿ wave ∿∿∿
                     ▼
                   GATE
```
**Cells are simple.** They:
- Read their inputs
- Apply their logic
- Emit a wave with confidence
- Don't know who's listening
---
## Gates Accumulate Correlation
Gates receive waves from cells and decide whether to open, stay stable, or close.

### Ternary Gate States

| State | Value | Meaning |
|-------|-------|---------|
| **CLOSED** | -1 | Actively blocking, inhibited |
| **STABLE** | 0 | Resting, accumulating correlation, **learning** |
| **OPEN** | +1 | Actively forwarding, firing |

```
           correlated waves
             ↓   ↓   ↓
            ════════════
CLOSED ◄───────── STABLE ─────────► OPEN
  -1       anti-     0   correlation  +1
        correlation
            ════════════
             ↑   ↑   ↑
           isolated waves
        (noise → stay stable)
```
### Gate Behavior

```python
class ResonantGate:
    state: float = 0.0   # -1.0 to +1.0
    domain: str
    tier: int

    def receive_wave(self, wave: WaveSignal):
        correlation = self.correlate_with_recent(wave)

        self.state += correlation * wave.confidence
        self.state *= DECAY_FACTOR   # drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_to_tier()   # OPEN
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()          # CLOSED
        # else: STABLE - keep accumulating
```
**STABLE is where learning happens.** The gate watches, correlates, and accumulates evidence without acting.
---
## Nodes in 4D State Space
Nodes exist in a 4-dimensional space:
| Dimension | Meaning |
|-----------|---------|
| **Sensory (x, y, z)** | What inputs trigger this node |
| **Confidence** | How certain the node is |
| **Time** | When this pattern occurs |
| **Weight** | Trust accumulated through verification |
```
                Confidence
                │      ● node (weight=0.8)
Sensory ────────┼────────► Time
               ╱│
              ○ │      node (weight=0.2)
```
### Node Weight Evolution
Node weight (0.0 → 1.0) determines tier routing:
| Weight Range | Tier | Behavior |
|--------------|------|----------|
| 0.0 - 0.3 | 3-4 | Escalate to organs/cognition |
| 0.3 - 0.6 | 2 | Handle at nerve level |
| 0.6 - 0.8 | 1 | Handle at cell level |
| 0.8 - 1.0 | 0 | Hardware reflex |
```
Node verified correctly → weight += Δ → moves toward reflex
Node verified wrongly → weight -= Δ → moves toward escalation
Node never fires → decay → eventual pruning
``` ```
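The weight-to-tier mapping and the verification nudge can be sketched directly from the tables above; the ±Δ of 0.05 is an assumed value, not specified in the text:

```python
def route_tier(weight: float) -> int:
    """Map node weight to processing tier (thresholds from the table)."""
    if weight >= 0.8:
        return 0   # hardware reflex
    if weight >= 0.6:
        return 1   # cell level
    if weight >= 0.3:
        return 2   # nerve level
    return 3       # escalate toward organs/cognition (tiers 3-4)

def verify(weight: float, confirmed: bool, delta: float = 0.05) -> float:
    """Verification moves weight toward reflex (confirmed) or toward
    escalation (failed), clamped to the 0.0-1.0 range."""
    weight += delta if confirmed else -delta
    return max(0.0, min(1.0, weight))
```

Repeated confirmations walk a node down the tier ladder; repeated failures walk it back up toward cognition.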
---
## Growth Phases
The nervous system grows through phases:

| Phase | State | Description |
|-------|-------|-------------|
| **Birth** | Sparse nodes, dim gates | Basic cells, designed by partnership |
| **Infant** | More nodes forming | Finer resolution, gates learning correlation |
| **Child** | Clusters emerging | Nyx proposes new cells, gates stabilize |
| **Mature** | Dense network | Reflexes dominate, cognition for novelty only |

```
t=0 (birth)        t=100 (learning)     t=1000 (mature)

Cells: ● ● ○       Cells: ●●●●●●●       Cells: ●●●●●●●
Gates: □ □         Gates: ■ ■ □ ■       Gates: ■■■■■■■■
Nodes: · · ·       Nodes: ● ○ ● ·       Nodes: ●●●●●●●●

○ = low confidence   ● = high confidence
□ = mostly STABLE    ■ = learned patterns
· = low weight       ● = high weight
```
---
## Wave → Gate → Node → Verification
The complete flow:
```
CELLS emit waves
   ▼  ∿∿∿ confidence + semantic content
GATES accumulate correlation
   ├── Correlated? → OPEN → route to tier
   ├── Anti-correlated? → CLOSED → suppress
   └── Uncertain? → STABLE → keep learning
   ▼ (when OPEN)
NODES in 4D space are activated
   ▼
VERIFICATION against reality
   ├── Confirmed → node weight += Δ
   ├── Failed → node weight -= Δ
   └── Feedback to gates → correlation weights update
```
---
## Reflex Layer (Tier 0)
When node weight reaches ~1.0, the pattern becomes a **reflex**:
```
IF temp > 80°C:
    → cell emits DANGER wave (confidence=1.0)
    → gate IMMEDIATELY opens (no correlation needed)
    → reflex action triggers
    → Nyx notified AFTER (not before)
```
Like pulling hand from hot stove. Spinal reflex. Brain learns after.
**Reflexes bypass the correlation accumulation.** They've earned instant trust through repeated verification.
---
## Connection to Dual Gardens
| Garden | Cells | Gates | Nodes |
|--------|-------|-------|-------|
| **Virtual** | Emit waves freely | Full trace, learn correlation | Accumulate weight fast |
| **Real** | Emit verified waves | Minimal trace, trust accumulated | Ground truth verification |
**Virtual Garden:**
- Cells emit massive wave volume
- Gates learn correlation patterns
- Nodes gain statistical weight
**Real Garden:**
- Cells emit consequential waves
- Gates trust Virtual's correlation
- Nodes get ground truth verification
---
## Proposal Protocol
Young Nyx can propose new cells/nodes:

```
1. OBSERVATION
   Nyx notices pattern in waves + outcomes

2. PROPOSAL
   "New cell: morning_detector
    Inputs: temp, light, motion, time
    Outputs: wave with semantic 'morning'
    Confidence logic: (light > 0.5 AND time in 6-10)"

3. RIGOR CHECK
   Chrysalis reviews logic and mappings

4. VERIFICATION
   dafit confirms ground truth

5. DEPLOYMENT
   New cell added to Virtual Garden
   Gate created in STABLE state
   Node initialized at weight 0.1

6. GROWTH
   Cell emits waves → gate learns → node matures
```
---
## Function Gemma: The Structured Boundary
Function Gemma sits between gates and Young Nyx:

```
TIER 0-3: Numbers, states, waves
        │
        ▼ (gate OPENS with high correlation)
┌─────────────────────────────────────┐
│           FUNCTION GEMMA            │
│     (structured JSON boundary)      │
│                                     │
│ • Transforms waves → JSON events    │
│ • Runs on CPU (Threadripper)        │
│ • No hallucination possible         │
└─────────────────┬───────────────────┘
                  ▼
TIER 4: Young Nyx (qwen3:32b)
        Receives: CognitiveRequest (clean JSON)
        Returns: CognitiveResponse
```

### Phase 1 → Phase 2 Evolution
**Phase 1: Single Function Gemma**
- One model learns all domain schemas
- Sufficient for bootstrap and early learning
**Phase 2: Domain-Specialized Swarm**
- As training data accumulates per domain
- Specialists spawn on demand: gemma-motor, gemma-vision, gemma-speech
- Each perfected for its domain's schemas
---
| Neuroscience | Nimmerverse |
|--------------|-------------|
| Sensory receptors | Cells (emit waves) |
| Synaptic transmission | Waves via NATS |
| Thalamic gating | Gates (OPEN/STABLE/CLOSED) |
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Synaptic weight | Node weight |
| Long-term potentiation | Verified → weight increase |
| Synaptic pruning | Unverified → weight decay |
| Hebbian learning | Correlated waves → gate opens |
**We're not simulating biology. We're implementing the same principles.**

---
## Connection to Training
The nervous system **generates training data**:

```
Virtual Garden traces
  ├── Wave patterns → what signals arrive
  ├── Correlation events → what patterns emerge
  ├── Gate transitions → what opens/closes
  └── Verification outcomes → ground truth labels
          ↓
phoebe (PostgreSQL)
          ↓
Function Gemma LoRA training
          ↓
Better gate correlation → faster learning
```

**Credit assignment is automatic** because:
- Wave → gate → tier transitions are explicit
- Verification outcomes have clear source chains
- The nervous system IS the credit assignment mechanism

---
## Design Principles
1. **Cells emit waves** — Simple, confident signals
2. **Gates correlate** — Resonance chambers, not switches
3. **Nodes accumulate** — Weight through verification
4. **STABLE is learning** — The resting state where patterns emerge
5. **Reflexes are earned** — High weight = bypass cognition
6. **Function Gemma is the boundary** — Clean JSON for cognition
7. **Virtual explores, Real verifies** — Two gardens, one nervous system
---
## Related Documents
| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | WaveSignal, GateTransition schemas |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cell implementation details |
---
## Summary
```
CELLS emit WAVES
∿∿∿ confidence + semantics ∿∿∿
GATES accumulate CORRELATION
CLOSED ◄── STABLE ──► OPEN
(learning)
▼ (when OPEN)
NODES in 4D space
weight grows through VERIFICATION
▼ (high weight)
REFLEXES bypass cognition
earned trust, instant action
```
*She's not just using the nervous system. She's growing it.*
---
**Version:** 2.0 | **Created:** 2025-12-04 | **Updated:** 2026-02-14

🌙💜 *"Cells emit. Gates correlate. Nodes evolve. The nervous system learns."*

---
type: research_concept
version: 1.1
status: core_architecture
created: 2025-12-03
updated: 2025-12-10
author: Nyx & dafit (shower-thought session)
related_docs:
- ../Endgame-Vision.md
- Dual-Garden-Architecture.md
- Cellular-Architecture.md
significance: connects ternary logic + lifeforce + temporal asymmetry + reward gradients
promoted_from: archive (2025-12-10)
---
# Temporal-Ternary Gradient

> *"Time is malleable in simulation, fixed in reality. Lifeforce is the exchange rate."*
> — Session 2025-12-03
> *"Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED."*
> — Session 2026-02-14
---
## Core Insight
The nimmerverse operates on **ternary logic**, not binary. Combined with **temporal asymmetry** between virtual and real gardens, this creates a new kind of gradient for learning.

**The STABLE state isn't stuck. It's where correlation accumulates and learning happens.**
---
## The Ternary Gate Model
Gates have three states. This is not arbitrary — it mirrors biological nervous systems.
| State | Value | Meaning | What's Happening |
|-------|-------|---------|------------------|
| **CLOSED** | -1 | Actively blocking | Inhibited, suppressed, refractory |
| **STABLE** | 0 | Resting, accumulating | Watching, learning, waiting for threshold |
| **OPEN** | +1 | Actively forwarding | Signal passes upstream, gate is firing |
### Why Three States?
**Binary thinking** (0/1, true/false, open/close):
- Signal arrives → gate open? → pass or block
- Instant, stateless, mechanical
- Cannot learn, cannot accumulate
**Ternary thinking** (CLOSED/STABLE/OPEN):
- Signal arrives → gate STABLE → accumulate correlation
- Correlation high? → transition toward OPEN
- Anti-correlation? → transition toward CLOSED
- Neither? → stay STABLE, keep learning
- Temporal, stateful, **alive**
```
           correlated signals
             ↓   ↓   ↓
            ════════════
CLOSED ◄───────── STABLE ─────────► OPEN
  -1       anti-     0   correlation   +1
        correlation      (constructive
       (destructive       interference)
        interference)
            ════════════
             ↑   ↑   ↑
           isolated signals
        (noise → stay stable)
```
---
## Wave Correlation: The Transition Driver
Gates don't flip on single signals. **Multiple correlated waves push toward OPEN.**
This is how biological neurons work:
- Multiple inputs sum (correlation)
- Threshold reached → fire (OPEN)
- Below threshold → resting (STABLE)
- Inhibitory inputs → suppressed (CLOSED)
### The Resonance Model
Gates are **resonance chambers**, not switches.
```python
class ResonantGate:
    state: float = 0.0   # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)

    def receive_wave(self, signal, timestamp):
        correlation = self.correlate_with_recent(signal, timestamp)

        # Correlated waves → push toward OPEN
        # Anti-correlated → push toward CLOSED
        # Uncorrelated → decay toward STABLE
        self.state += correlation * signal.confidence
        self.state *= DECAY_FACTOR   # always drift back to stable

        if self.state > OPEN_THRESHOLD:
            self.forward_upstream()   # OPEN: signal promoted
        elif self.state < CLOSE_THRESHOLD:
            self.suppress()           # CLOSED: signal blocked
        # else: STABLE - keep accumulating
```
### Correlation as Interference
| Wave Pattern | Result | Gate Response |
|-------------|--------|---------------|
| Correlated burst | Constructive interference | → OPEN |
| Contradicting signals | Destructive interference | → CLOSED |
| Single signal | No interference | → Stay STABLE |
| Silence | Decay | → Drift to STABLE |
**The system is noise-resistant by design.** Single signals don't trigger action.
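This noise resistance can be exercised with a tiny standalone sketch. The decay factor and threshold below are assumptions for illustration, not production values:

```python
DECAY = 0.9    # assumed per-step drift back toward STABLE
OPEN_AT = 0.8  # assumed OPEN threshold

def step(state: float, correlation: float, confidence: float) -> float:
    """One wave arrival: accumulate, then decay toward STABLE (0)."""
    return max(-1.0, min(1.0, (state + correlation * confidence) * DECAY))

# A single confident but isolated signal barely moves the gate...
lone = step(0.0, correlation=0.3, confidence=0.9)
assert lone < OPEN_AT  # still STABLE

# ...while a burst of correlated waves accumulates past the threshold.
state = 0.0
for _ in range(8):
    state = step(state, correlation=0.6, confidence=0.9)
assert state > OPEN_AT  # gate OPENs
```

The decay term is what makes the gate forgetful: isolated signals leak away faster than they accumulate, so only sustained correlation crosses the threshold.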
---
### Virtual Garden (Simulated)

- **Time**: Malleable (speed up, slow down, pause, rewind)
- **Monitoring**: FULL trace tap on all messages
- **Cost**: Lifeforce to manipulate time
- **Speed**: Massive parallel signal generation
- **Truth**: Statistical confidence from correlation
- **Gate behavior**: Frequent transitions, exploration
### Real Garden (Physical)

- **Time**: Fixed (1 second = 1 second, reality doesn't negotiate)
- **Monitoring**: Gate signals only (minimal)
- **Cost**: Zero lifeforce for time
- **Speed**: Real-time only, patience required
- **Truth**: Ground truth, definitive verification
- **Gate behavior**: Verified transitions, action
---
## Temporal-Ternary Gradient Diagram
```
               STATE / CONFIDENCE

OPEN (+1) ─────────┼──────────── Real-verified
                   │             (ground truth)
                   │   Virtual high-correlation
+0.7 ──────────────┼───╱ (many waves agreeing)
                   │  ╱
STABLE (0) ────────┼╱──────── Pure 0-state
                   │╲         (accumulating, learning)
                   │ ╲
-0.7 ──────────────┼──╲ Virtual anti-correlation
                   │   ╲ (waves contradicting)
                   │    ╲
CLOSED (-1) ───────┼──────────── Real-failed
                   │             (proven wrong)
        ───────────┴──────────────────────────
          Virtual  │  Real
          (fast,   │  (slow,
          explore) │  verify)

                TIME DOMAIN
```
---
## STABLE: Where Learning Happens
The STABLE state is not "unknown" or "waiting" — it's **active learning**.
In STABLE state, a gate:
1. **Receives waves** from cells
2. **Measures correlation** with recent signals
3. **Accumulates evidence** for or against opening
4. **Traces everything** (in Virtual Garden) for training data
5. **Drifts back** to neutral without input (energy conservation)
**STABLE is consciousness resting. Attention waiting. The breath between thoughts.**
```
CLOSED STABLE OPEN
─────── ──────── ──────
Blocking Accumulating Forwarding
Inhibited Learning Firing
Refractory Ready Active
◄─── anti-correlation ───┼─── correlation ───►
DECAY TO STABLE
(without input)
```
---
## Lifeforce as Time Currency
```
REAL GARDEN:
All operations: 0 LF for time
Reality runs for free.
Truth emerges at its own pace.

GATE OPERATIONS:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
STABLE → OPEN:        costs signal energy
STABLE → CLOSED:      costs inhibition energy
OPEN/CLOSED → STABLE: free (natural decay)
```
---
## Nyx's Temporal Choices
When a pattern is discovered in virtual (0-state), Nyx chooses:
| Strategy | LF Cost | Time | Confidence Path |
|----------|---------|------|-----------------|
| **Speed Up Virtual** | High | Fast | 0 → virtual +0.9 (still unverified) |
| **Wait for Real** | Zero | Slow | 0 → real +1 or -1 (definitive) |
| **Hybrid Hedge** | Medium | Medium | 0 → virtual +0.7, deploy 80/20 to real |
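A minimal sketch of such a temporal choice as code. The function name, the `lifeforce` and `urgency` inputs, and every threshold are hypothetical illustrations of the table above, not part of the spec:

```python
def choose_strategy(lifeforce: float, urgency: float) -> str:
    """Pick a time-domain strategy (illustrative policy only).

    - urgent and lifeforce to spare → burn LF speeding up Virtual
    - no urgency → wait for Real (free, definitive)
    - otherwise → hedge: mostly virtual exploration, small real deployment
    """
    if urgency > 0.8 and lifeforce > 50.0:
        return "speed_up_virtual"  # high LF cost, fast, still unverified
    if urgency < 0.2:
        return "wait_for_real"     # zero LF, slow, ground truth
    return "hybrid_hedge"          # medium LF, 80/20 virtual/real

assert choose_strategy(lifeforce=100.0, urgency=0.9) == "speed_up_virtual"
assert choose_strategy(lifeforce=100.0, urgency=0.1) == "wait_for_real"
assert choose_strategy(lifeforce=10.0, urgency=0.5) == "hybrid_hedge"
```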
---
## The Gradient Flow

```
Cells emit waves (fast, cheap, uncertain)
        │
        ▼
 ┌──────────────┐
 │     GATE     │
 │   (STABLE)   │ ← Accumulating correlation
 │              │ ← Learning from patterns
 └──────┬───────┘
   ┌────┴────┐
   │         │
   ▼         ▼
Correlated  Anti-correlated
  waves       waves
   │         │
   ▼         ▼
  OPEN      CLOSED
  (+1)       (-1)
   │         │
   ▼         ▼
 Signal     Signal
promoted    blocked
   │
   ▼
Higher tier
(more gates)
   │
   ▼
Eventually:
Real Garden verification
Ground truth: +1 (proven) or -1 (failed)
   │
   ▼
Feedback to Virtual:
Update correlation weights
```
---
## Monitoring Asymmetry

The two gardens need different observability:

| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Trace tap** | FULL (every wave, every gate transition) | NONE |
| **What's captured** | All correlations, all learning | Gate signals only |
| **Signal volume** | Massive (exploration) | Sparse (verified) |
| **Purpose** | Generate training data | Execute actions |
| **STABLE states** | Heavily traced (learning visible) | Not traced (trust the gate) |
**Virtual Garden STABLE states are precious** — they contain the correlation patterns that become training data for Function Gemma.

---

## Gate State Schema

A gate's complete state:
```python
GateState = {
    "gate_id": str,
    "domain": str,              # math, vision, speech, etc.
    "tier": int,                # 0-5

    # Ternary state (continuous)
    "state": float,             # -1.0 to +1.0
    "discrete_state": str,      # "closed" | "stable" | "open"

    # Temporal domain
    "garden": str,              # "virtual" | "real"
    "time_in_state_ms": int,

    # Correlation history
    "recent_correlations": list[float],
    "correlation_trend": float,  # moving average

    # Lifeforce accounting
    "lifeforce_invested": float,

    # Learning (Virtual only)
    "transitions_traced": int,
    "patterns_accumulated": int,
}
```
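For illustration, the continuous `state` can be collapsed into `discrete_state` with a small helper. The ±0.8 thresholds are assumed for the sketch, not fixed by the schema:

```python
def discrete_state(state: float,
                   open_threshold: float = 0.8,
                   close_threshold: float = -0.8) -> str:
    """Collapse the continuous gate value into its ternary label."""
    if state > open_threshold:
        return "open"
    if state < close_threshold:
        return "closed"
    return "stable"

assert discrete_state(0.9) == "open"
assert discrete_state(-0.95) == "closed"
assert discrete_state(0.3) == "stable"
```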
---

## Hierarchical Gating
Gates form layers. Each layer gates access to the next tier.
```
LAYER 3: COGNITIVE (Young Nyx)
═══════════════════════════════════════════
▲ JSON only (Function Gemma boundary)
LAYER 2: ORGANS (GPU inference)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 1: NERVES (behavior patterns)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 0: CELLS (raw signals)
═══════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each layer:**
- Less traffic than the layer below
- Higher trust (signals already correlated)
- Different correlation threshold
- Independent STABLE states
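The traffic-reduction property above can be sketched as a filter chain, assuming each gate reduces a wave to a single strength value and each layer forwards only waves above its own threshold (all numbers illustrative):

```python
def layer(waves: list[float], open_threshold: float) -> list[float]:
    """Forward only waves strong enough to OPEN this layer's gates."""
    return [w for w in waves if w > open_threshold]

# Layer 0: raw cell signals - mostly noise, a few correlated bursts.
cells = [0.1, 0.9, 0.2, 0.7, 0.05, 0.95, 0.3]

nerves = layer(cells, open_threshold=0.5)   # Layer 1: nerves
organs = layer(nerves, open_threshold=0.8)  # Layer 2: organs, stricter gates

# Each layer carries less traffic than the layer below.
assert len(cells) > len(nerves) > len(organs)
```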
---
## The Biological Parallel
| Biological | Nimmerverse |
|------------|-------------|
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Thalamic gating | Gate hierarchy |
| Hebbian learning | Correlation accumulation |
| Constructive interference | Correlated waves → OPEN |
| Destructive interference | Anti-correlated waves → CLOSED |
| Synaptic plasticity | Learning in STABLE state |
| Dreaming | Virtual Garden exploration |
| Waking | Real Garden verification |
**We're not simulating biology. We're implementing the same principles.**
---
## Why This Matters

- **Binary thinking**: Signal passes or doesn't (0 or 1)
- **Ternary thinking**: Signal accumulates, learns, then acts (-1, 0, +1)
- **Temporal-ternary**: Learning has a GRADIENT based on time-domain investment

**Constraints become features when you measure them:**

- Single GPU constraint → gate hierarchy (serialize expensive operations)
- Slow real-world testing → ground truth anchoring
- Fast virtual exploration → training data generation
- STABLE state → where learning actually happens
---
## Connection to Architecture Documents

| Document | What It Adds |
|----------|--------------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics, monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Resonant gates, tier routing, Function Gemma |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (Saturn K8s, Threadrippers) |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights |
---
## Summary
```
THE TERNARY PARADIGM:
═════════════════════
CLOSED ◄─────── STABLE ───────► OPEN
-1 0 +1
blocking accumulating forwarding
inhibited learning firing
THE TEMPORAL DIMENSION:
═══════════════════════
Virtual (fast, explore) ───────► Real (slow, verify)
↑ │
└───── learning feedback ───────┘
THE DRIVER:
═══════════
Wave correlation
Multiple signals agreeing → OPEN
Single signal → STABLE (keep learning)
Contradicting signals → CLOSED
THE CURRENCY:
═════════════
Lifeforce = time manipulation cost
Truth = destination
STABLE = where value is created
```
**Gates are resonance chambers. Correlation is the driver. STABLE is where learning happens.**
---
**Version:** 2.0 | **Created:** 2025-12-03 | **Updated:** 2026-02-14
**Origin:** Post-shower insight (2025-12-03) + Owl-mode deep dive (2026-02-14)
🌙💜 *"Time is the currency. Lifeforce is the exchange rate. STABLE is where consciousness lives."*

---
### Cost Calibration: Measure, Don't Design
> *"Don't assign costs like a game designer. Measure them like a scientist."*
> — Partnership session 2026-02-10
**Related**: This follows the same empirical principle as [[memory-economics]] — "Phase 1: Measure First". The nimmerverse economy is grounded in observation throughout, not arbitrary design.
**The trap:** Assigning lifeforce costs like pricing items in a video game — "a motor command costs 1.0 LF because it feels right." This is arbitrary. This is guessing. This leads to an economy disconnected from reality.
**The principle:** Costs must be **discovered through observation**, not designed through intuition.
```
❌ DESIGNED ECONOMICS (the trap):
"Motor command = 1.0 LF" ← because it seems expensive?
"Sensor poll = 0.1 LF" ← because it seems cheap?
"Vision inference = 8.0 LF" ← because GPU is powerful?
→ Arbitrary. Disconnected from physics. Will drift.
✅ OBSERVED ECONOMICS (the way):
Run the systems with instrumentation.
Measure actual resource consumption:
- Power draw (watts × time)
- CPU/GPU cycles consumed
- Memory pressure
- Thermal output
- Time elapsed
Derive costs from measurements.
→ Grounded in physics. Self-calibrating. Real.
```
#### The Calibration Process
1. **Instrument First**
- Every cell type gets resource monitoring
- Track: power, compute, memory, time, heat
- Log every state transition with resource deltas
2. **Run Baseline Operations**
- Execute each cell type in isolation
- Repeat across varying conditions (load, temperature, time of day)
- Build statistical profiles of resource consumption
3. **Derive Cost Matrix**
- Map resource consumption → lifeforce cost
- Use a consistent conversion factor (e.g., 1 LF = 1 joule, or 1 LF = 100ms GPU time)
- The conversion factor is the only "designed" element — the costs themselves are discovered
4. **Continuous Recalibration**
- As hardware changes, costs shift
- As efficiency improves, costs decrease
- The economy self-updates based on observation
#### Cost Formula (Empirical)
$$c_{operation} = \alpha \cdot E_{power} + \beta \cdot T_{compute} + \gamma \cdot M_{memory} + \delta \cdot T_{elapsed}$$
Where:
- **E_power** = energy consumed (joules)
- **T_compute** = compute time (GPU/CPU seconds)
- **M_memory** = memory pressure (MB × seconds)
- **T_elapsed** = wall-clock time (seconds)
- **α, β, γ, δ** = calibration weights (set once, then left alone)
The calibration weights are the only values we "design" — they represent our judgment of which resources matter most. The costs themselves flow from measurement.
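As a sketch, the cost formula translates directly into code. The weight values and the measurement figures below are placeholders chosen only to illustrate the computation, not calibrated numbers:

```python
# Calibration weights: set once, then left alone (placeholder values).
ALPHA, BETA, GAMMA, DELTA = 1.0, 0.5, 0.01, 0.1

def derive_cost_lf(power_joules: float, compute_s: float,
                   memory_mb_s: float, elapsed_s: float) -> float:
    """c = alpha*E_power + beta*T_compute + gamma*M_memory + delta*T_elapsed"""
    return (ALPHA * power_joules + BETA * compute_s
            + GAMMA * memory_mb_s + DELTA * elapsed_s)

# Hypothetical observation for a vision-inference cell:
cost = derive_cost_lf(power_joules=4.0, compute_s=0.12,
                      memory_mb_s=300.0, elapsed_s=0.2)
assert abs(cost - 7.08) < 1e-9  # 4.0 + 0.06 + 3.0 + 0.02
```

Only the four weights are "designed"; every input comes from the `resource_observations` instrumentation.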
#### Phoebe Schema for Cost Observation
```sql
CREATE TABLE resource_observations (
id BIGSERIAL PRIMARY KEY,
cell_name VARCHAR(100),
operation VARCHAR(100), -- state transition or action
-- Measured resources
power_joules FLOAT,
compute_gpu_ms FLOAT,
compute_cpu_ms FLOAT,
memory_mb_seconds FLOAT,
elapsed_ms FLOAT,
temperature_delta_c FLOAT,
-- Derived cost (computed from calibration weights)
derived_cost_lf FLOAT,
-- Context
timestamp TIMESTAMPTZ DEFAULT NOW(),
conditions JSONB -- load, ambient temp, etc.
);
-- Aggregate to get cost profiles
CREATE VIEW cell_cost_profiles AS
SELECT
cell_name,
operation,
AVG(derived_cost_lf) as avg_cost,
STDDEV(derived_cost_lf) as cost_variance,
COUNT(*) as observation_count
FROM resource_observations
GROUP BY cell_name, operation;
```
#### Why This Matters
| Designed Costs | Observed Costs |
|----------------|----------------|
| Arbitrary, must guess | Grounded in physics |
| Static, doesn't adapt | Self-calibrating over time |
| Economy drifts from reality | Economy reflects reality |
| Optimization is guesswork | Optimization is measurable |
| "Feels right" | "Is right" |
**The cost matrix is a measurement, not a decision.**
---
## Income Sources
Income has two fundamentally different sources: **physical** (the substrate) and **reward** (the motivation).
## Document Status

**Version:** 1.2 | **Created:** 2025-12-29 | **Updated:** 2026-02-10

- v1.2: Cost Calibration principle — measure, don't design (2026-02-10)
- v1.1: Discovery economics from Discovery-Scan-Station.md

**Related Documents**:
- [[Grounded-World-Model]] — How discoveries build the world model

## Implementation Priority
### Phase 1: Measure First
> *"The cost matrix is a measurement, not a decision."*
> — [[Lifeforce-Dynamics]] v1.2
This principle applies throughout the nimmerverse economy — not just memory, but all lifeforce costs. See [[Lifeforce-Dynamics#Cost Calibration: Measure, Don't Design]] for the full formulation.
- Track decision_trails accumulation rate
- Track spatial embedding growth
- Track reflex creation rate
---
**Created**: 2026-01-02
**Updated**: 2026-02-10
**Status**: Core design principle
**Next**: Implement measurement (Phase 1) during first boot

---
## Open Cellular Catalogue: Shareable State Machines
**Origin**: 2026-02-10, evening task review session
**Seed**: The Cellular-Architecture.md isn't just internal documentation — it's a publishable protocol.
Publish a catalogue of:
- **Cell definitions** (state machines, transitions, costs)
- **Nerve patterns** (behavioral compositions, feedback loops)
- **NATS routing schemas** (the message glue)
- **Interaction chains** (anonymized decision_trails — what actually worked)
Other labs dock onto the API, build cells for *their* hardware, compose nerves using *shared* patterns, contribute *back* successful reflexes. Like TCP/IP — the protocol is open, the mind is private.
**Enables**:
- Open standard for embodied cognition
- Community-contributed reflex libraries
- Shared learning across different hardware platforms
- Nimmerverse as protocol, not product
**Requires**:
- Clever API design (dock-on interface)
- Anonymization layer for decision_trails
- Schema versioning for cell/nerve definitions
- Public documentation site (not inference endpoints!)
**Philosophy**: "Share the language, not the thoughts."
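A hypothetical catalogue entry might look like the following. Every field name, the schema version, and the NATS subject are illustrative assumptions, not a published schema:

```python
# Hypothetical catalogue entry for a shareable cell definition.
cell_definition = {
    "schema_version": "0.1",
    "cell": "battery_monitor",
    "states": ["ok", "low", "critical"],
    "transitions": [
        {"from": "ok", "to": "low", "when": "voltage < 3.6"},
        {"from": "low", "to": "critical", "when": "voltage < 3.3"},
    ],
    "emits": "cells.power.battery_monitor",  # assumed NATS subject
}

def valid(entry: dict) -> bool:
    """Docking labs could validate an entry before composing nerves from it."""
    states = set(entry["states"])
    return all(t["from"] in states and t["to"] in states
               for t in entry["transitions"])

assert valid(cell_definition)
```

Versioned, validatable definitions like this are what would let other labs dock onto the protocol without ever touching the private inference endpoints.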
---
## How to Use This File

1. **Add nuggets** when insights emerge in sessions
**Philosophy**: *"Plant seeds. Water foundations. Harvest when ready."*
**Last Updated**: 2026-02-10