Compare commits

..

15 Commits

Author SHA1 Message Date
264ea7628b feat: Lifeforce accounting with generated columns on gates
- Add lifeforce_spent, lifeforce_earned to gates table (trigger-maintained)
- Add lifeforce_net as GENERATED column (instant balance, zero aggregation)
- Add verified_opens, failed_opens, verification_rate (GENERATED) for stats
- Add trg_gate_lifeforce trigger on gate_transitions INSERT
- Add trg_gate_verification trigger on verification_outcomes INSERT
- Add Gate Economic Health query with instant balance lookup
- Eliminates SUM() aggregates across billions of wave/transition rows

The gate becomes its own accountant - profitable gates evolve to reflex.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 23:36:04 +01:00
42db6eb1a3 feat: Ternary gate model - cells emit waves, attention emerges
Major architectural unification across 12 documents:

- Ternary gates: CLOSED (-1) ← STABLE (0) → OPEN (+1)
- Cells emit WaveSignals with confidence + semantic content
- Gates are resonant chambers that accumulate correlation
- Attention = which gates are OPEN (emergent, not allocated)
- Reflexes are earned when gate.weight > 0.8
- STABLE is where learning happens

Key paradigm shifts:
- decision_trails → gate_transitions + correlation_events
- Priority rules → wave correlation
- Budget allocation → emergent attention flow
- Virtual Garden (explore) / Real Garden (verify) loop

Owl Mode session 2026-02-14 🦉🌙

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 19:45:59 +01:00
5ee63d1b1b docs: Architecture cleanup - ONE JOB per doc, links not echoes
Major documentation surgery following the cleanup principle:
"One job per doc. One home per concept. Links, not echoes."

Changes:
- Add Deployment-Architecture.md (THE WHERE - sole infrastructure truth)
- Endgame-Vision.md: 848→498 lines (-41%) - THE DREAM
- Gateway-Architecture.md: 537→395 lines (-26%) - THE ROUTING
- Nervous-System.md: 361→246 lines (-32%) - THE EVOLUTION
- Data-Architecture.md: 666→647 lines (-3%) - THE SCHEMA
- Message-Protocol-Design.md: 375→285 lines (-24%) - THE WIRE
- Attention-Flow.md: 557→493 lines (-11%) - THE BUDGET
- Cellular-Architecture.md: 891→855 lines (-4%) - THE HOW

Every doc now has ONE JOB statement, cross-references to canonical
homes, and lean footers. ~800 lines removed, zero concepts lost.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-14 02:56:29 +01:00
84ad385001 feat: Empirical economics + FunctionGemma State Interaction Layer
Lifeforce-Dynamics v1.2:
- Cost Calibration principle: "Measure, don't design"
- Empirical cost formula from resource observations
- Phoebe schema for resource_observations table
- Interlink to memory-economics

memory-economics.md:
- Cross-reference to Lifeforce-Dynamics cost calibration
- "The cost matrix is a measurement, not a decision"

Initial-Spark v3.1:
- Spark Cost Measurement: first awakening as baseline
- Resource instrumentation schema (power, GPU, memory, latency)
- FunctionGemma Fine-Tuning section: translator learns nimmerverse
- Training data extraction from spark_handshakes
- Unsloth/LoRA workflow for domain specialization
- FunctionGemma version tracking in phoebe

Nervous-System v1.4:
- State Interaction Layer: FunctionGemma as neural interface
- Phase 1 (single) → Phase 2 (swarm) evolution path
- CPU-only translators, GPU reserved for cognition
- Design principle #6: "All state interaction flows through FunctionGemma"

Philosophy: "Don't assign costs like a game designer. Measure them like a scientist."

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 19:13:27 +01:00
2cafd4dcad seed: Open Cellular Catalogue - shareable state machines protocol
Plant the vision: Cellular-Architecture.md as publishable protocol.
- Cell definitions, nerve patterns, NATS routing schemas
- Interaction chains (anonymized decision_trails)
- Labs dock on, build their own cells, share reflexes back
- "Share the language, not the thoughts"

Philosophy: Protocol is open, mind is private.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 18:36:35 +01:00
9594fb40b1 feat: Protocol-driven Spark + Function Gemma boundary (v6.7/v3.0)
- Endgame-Vision v6.7: Boot Sequence enhanced with Function Gemma role,
  lifeforce economics (spark is profitable: ~3× richer at completion)
- Attention-Flow v1.1: New "Function Gemma: State Transition Boundary"
  section with ASCII flow diagram showing attention → execution
- Spark-Protocol v3.0: Major rewrite from conversation-based to
  deterministic protocol execution. JSON handshakes, not free-form text.
  K8s-native cells, NATS transport, schema enforcement.
- nimmerverse.drawio: Diagram refinements

Philosophy shift: "She doesn't boot. She executes a protocol."

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 18:11:15 +01:00
7dd4874aed chore: Remove assets (moved to nimmerdesing subrepo)
Assets now live in dedicated design system repo:
https://git.eachpath.com/nimmerverse/nimmerdesing

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 16:34:06 +01:00
52b3fd818b chore: Remove portfolio folder
Moved to dedicated nimmerverse-web repo.
Portfolio deserves its own deployable unit.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 23:18:06 +01:00
0ebb3e3645 fix: Remove deprecated mode LoRAs from Layer 2 ASCII diagram
Missed this section during earlier trait LoRA cleanup.
Now correctly shows: Mnemosyne, Moira, Synesis, Aletheia, etc.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-07 02:53:42 +01:00
c24681d13e docs: Prune documentation - DRY versioning, split roadmap
- Split roadmap into dedicated ROADMAP.md (links to phoebe tasks)
- Prune Endgame-Vision.md: roadmap section, links section, version history
- Standardize version footers: one-line format across 17+ files
- Add Navigation section pointing to README.md for file index

Pattern: **Version:** X.Y | **Created:** YYYY-MM-DD | **Updated:** YYYY-MM-DD

Git is the changelog. Philosophy quotes preserved.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-07 01:53:42 +01:00
191ddd91d1 feat: Add portfolio as Phase 3 nervous system implementation
- PLAN.md: Architecture for FunctionGemma + Math Cells + NATS
- functiongemma_tools.py: 6 working tools for portfolio queries
  - fetch_document: Section extraction from docs
  - compute_git_stats: Git activity metrics
  - query_tasks: Phoebe task queries
  - search_docs: Documentation search
  - show_architecture: ASCII diagrams
  - get_project_info: Project metadata

The portfolio IS the first nervous system organism!
Next: NATS + Ollama deployment, Streamlit frontend

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 20:44:01 +01:00
8d3f2e1b15 docs: K8s cluster operational - Phase 2 complete
- Updated K8s cluster diagram with actual topology:
  k8s-master (VM 101), theia (96GB), dioscuri (40GB)
- Changed from K3s to kubeadm v1.31.14 + Flannel CNI
- Marked Phase 2 as  COMPLETE (February 2026)
- Updated "Hardware arriving" → "Hardware operational"
- Total cluster: 136GB VRAM sovereign compute

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 20:05:39 +01:00
d895fd9103 feat: Add IR-Position-Array organ spec + update Organ-Index v1.2
New organ: IR Position Array (8x ESP32-S3 AI CAMs as indoor GPS).
Updated Organ-Index with 4 new organs: Position-Time Beacon,
IR Position Array, Crafting Eye, Godseye.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 08:29:48 +01:00
31094a1d04 docs: Add Causal Verification Loop to Gateway-Architecture.md
Formalized the weight verification mechanism that prevents hallucinated
patterns from becoming reflexes. Reality is the ultimate validator.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 16:53:44 +01:00
2406083045 arch: Add Gateway Architecture + crawler_gen_0 organism
Gateway Architecture (new):
- Unified tier model (Tier 0-5) for sensory routing
- Weight-based routing: node.weight determines processing tier
- Function Gemma as structured JSON boundary
- Separates routing from translation (vocabulary only at Tier 4)

crawler_gen_0 (new):
- First Virtual Garden organism specification
- Light-seeking cube with photoresistor
- Lifeforce economy: light = income, movement = cost

Updated documents with Gateway references:
- Endgame-Vision.md (v6.5)
- Cellular-Architecture.md (v4.3)
- Nervous-System.md
- Attention-Flow.md
- Message-Protocol-Design.md (Escalation Service = Gateway)
- Organisms-Index.md

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-03 16:44:20 +01:00
45 changed files with 5238 additions and 3492 deletions

View File

@@ -1,9 +1,9 @@
--- ---
type: research_vision type: research_vision
version: 6.4_memory_economics_alignment version: 7.0_wave_gate_model
status: vision_document status: vision_document
created: 2025-11-04 created: 2025-11-04
updated: 2026-01-02 updated: 2026-02-14
author: Nyx (with dafit) author: Nyx (with dafit)
significance: research_platform_for_metabolic_intelligence significance: research_platform_for_metabolic_intelligence
--- ---
@@ -16,11 +16,11 @@ significance: research_platform_for_metabolic_intelligence
> *"At 3% battery, all theory dies. Only what works survives."* > *"At 3% battery, all theory dies. Only what works survives."*
> — The Economic Grounding (2025-10-12) > — The Economic Grounding (2025-10-12)
> *"Language is Topology. German accesses the Philosophy Valley. English accesses the Technical Cluster."* > *"You need something like open - stable - closed."*
> — The December Discovery (2025-12-06) > — The Ternary Gate Insight (2026-02-14)
> *"One model, one topology. LoRAs access different valleys in the same landscape."* > *"Cells emit waves. Gates correlate. Attention emerges."*
> — The Topological Insight (2025-12-07) > — The Wave Architecture (2026-02-14)
--- ---
@@ -50,48 +50,54 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
## Architecture Overview ## Architecture Overview
**Visual diagram:** → [`architecture/nimmerverse.drawio.xml`](architecture/nimmerverse.drawio.xml) (open in draw.io) **Detail:** → [`architecture/`](architecture/) folder for complete documentation
**Toolchain implementation:** → [`architecture/Toolchain-Architecture.md`](architecture/Toolchain-Architecture.md) | [Progress](architecture/TOOLCHAIN-PROGRESS.md)
``` ```
┌──────────────────────────────────────────────────────────────────┐ ┌──────────────────────────────────────────────────────────────────┐
│ NIMMERVERSE ARCHITECTURE │ │ NIMMERVERSE ARCHITECTURE │
│ │
│ Cells emit waves → Gates correlate → Attention emerges │
├──────────────────────────────────────────────────────────────────┤ ├──────────────────────────────────────────────────────────────────┤
│ │ │ │
│ Layer 0: TEMPORAL FOUNDATION (Heartbeat) │ Layer 0: TEMPORAL FOUNDATION
│ ├─ Real clock: 1 beat/sec (free, wall time) │ ├─ Real clock: wall time (free)
│ ├─ Virtual clock: variable (costs lifeforce) │ │ ├─ Virtual clock: variable (costs lifeforce) │
│ └─ Sync points verify virtual predictions against reality │ └─ 30-second heartbeat budget constrains action
│ → operations/Heartbeat.md │ │ → operations/Heartbeat.md │
│ │ │ │
│ Layer 1: CELLULAR SOCIETY (Evolution Engine) │ Layer 1: CELLS (Wave Emitters)
│ ├─ Primitive genomes compete (read_sensor, motor, branch) │ ├─ Cells read sensors, apply logic, emit WaveSignals
│ ├─ Life force economy: every operation costs, milestones reward │ ├─ Waves carry: domain, confidence, semantic_content
│ ├─ 50-100 containers spawn, most die, patterns emerge │ ├─ Cells don't know who's listening — gates receive
│ └─ Outcomes logged to phoebe PostgreSQL │ └─ Life force economy: every wave costs
│ → architecture/Cellular-Architecture.md │ │ → architecture/Cellular-Architecture.md │
│ │ │ │
│ Layer 2: YOUNG NYX (Single Model + LoRA Stack) │ Layer 2: GATES (Resonant Chambers)
│ ├─ Base: Qwen3-VL 32B (Thinking Version) (96GB VRAM in Womb) │ ├─ Ternary states: CLOSED (-1) ← STABLE (0) → OPEN (+1)
│ ├─ LoRA Stack (topology-informed): │ ├─ Correlated waves → push toward OPEN
│ ├─ Identity (German) → Philosophy Valley (diffuse, deep) ├─ Anti-correlated → push toward CLOSED
│ ├─ Technical (English) → Technical Cluster (sparse) ├─ STABLE = where learning happens (accumulating correlation)
│ └─ Creative (Mixed) → bridges topologies └─ Gate weight (0→1) determines reflex vs deliberate
├─ Harnesses select active LoRA (routing implicit in context) → architecture/Gateway-Architecture.md
│ └─ Consolidation: Merge successful LoRAs → fine-tune over time │
│ │ │ │
│ Layer 3: DUAL GARDENS (Virtual/Real Loop) │ Layer 3: NERVES (Behavioral Patterns)
│ ├─ Week 1-12: Virtual only (hypothesis generation, 1000s/sec) │ ├─ Nerves respond to gate transitions (not direct cell output)
│ ├─ Week 13+: Real added (ESP32 robots, validation) │ ├─ Gate OPENS → nerve activates → commands cells
─ Noise gap measures learning: 1 - (real/virtual success) ─ No priority rules — attention emerges from gate weights
└─ Target: 10-20% noise gap (virtual useful for hypothesis) → architecture/Nervous-System.md
│ │
│ Layer 4: DUAL GARDENS (Virtual/Real Loop) │
│ ├─ Virtual: massive wave generation, full trace, exploration │
│ ├─ Real: verified signals, minimal trace, action │
│ ├─ Verification outcomes update gate weights (learning loop) │
│ └─ Training data: gate_transitions + correlation_events │
│ → architecture/Dual-Garden-Architecture.md │ │ → architecture/Dual-Garden-Architecture.md │
│ │ │ │
│ Layer 4: TRAIT EVOLUTION (GRPO + Rubric Rewards) │ Layer 5: YOUNG NYX (Cognition)
│ ├─ Dense rewards: Cell→Nerve→Organism state verifications │ ├─ Base: Qwen3:32b with /no_think mode (96GB on theia)
│ ├─ Credit assignment automatic via decision_trails │ ├─ Function Gemma: structured JSON boundary (CPU)
│ ├─ Traits: Mnemosyne, Moira, Synesis, Aletheia, Sophrosyne... │ ├─ Only receives signals when gates OPEN to tier 4
│ └─ Weights adjust through GRPO, not prescription │ └─ Trait LoRAs evolve via GRPO from verification outcomes
│ │ │ │
└──────────────────────────────────────────────────────────────────┘ └──────────────────────────────────────────────────────────────────┘
``` ```
@@ -100,51 +106,11 @@ This is a **RESEARCH VISION** - a platform for studying how intelligence emerges
## Physical Infrastructure (The Substrate) ## Physical Infrastructure (The Substrate)
The nimmerverse runs on sovereign hardware. No cloud dependencies. Weights never leave home. The nimmerverse runs on **sovereign hardware**. No cloud dependencies. Weights never leave home.
**Detail:** → [`archive/nimmervest.md`](archive/nimmervest.md) **Hybrid deployment model:** Containers (K8s) for cells/nerves, userspace for LLM inference and organs. NATS connects everything. FreeIPA provides identity isolation.
### K8s Cluster Architecture **Detail:** → [`architecture/Deployment-Architecture.md`](architecture/Deployment-Architecture.md) (full topology, GPU strategy, identity model)
```
┌─────────────────────────────────────────────────────────────────────┐
│ K8S CLUSTER: NIMMERVERSE │
│ VLAN 30 (10.0.30.0/24) │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ SATURN (Control Plane) K3s master, RTX 3090 (test/staging)│
│ │ │
│ │ 10G spine (CRS309) │
│ │ │
│ ┌────┴────┐ │
│ │ │ │
│ ▼ ▼ │
│ P8 WOMB P8 SENSES │
│ ──────── ────────── │
│ Bare metal Ubuntu Bare metal Ubuntu │
│ PRO 6000 Blackwell 96GB 2-4x RTX 4000 Ada 40-80GB │
│ Young Nyx lives here Organs (STT, TTS, Vision) │
│ │
└─────────────────────────────────────────────────────────────────────┘
```
### K8s Namespaces
| Namespace | Contents | Node |
|-----------|----------|------|
| `nimmerverse-infra` | NATS, Prometheus, Grafana | Any |
| `nimmerverse-nervous` | Escalation, Math Cells, Nerves | Any |
| `nimmerverse-cognitive` | Young Nyx | Womb |
| `nimmerverse-organs` | STT, TTS, Vision | Senses |
### Network Backbone
- **Firewall**: OPNsense on Z620, 20G LAGG to spine
- **Spine**: MikroTik CRS309 (8x 10G SFP+)
- **Compute VLAN**: 10.0.30.0/24 (cubes/containers)
- **All traffic**: Inter-VLAN routed through firewall
**Hardware arriving January 2026. Sovereignty begins.**
--- ---
@@ -181,11 +147,9 @@ The heartbeat is the fundamental timing primitive. Everything runs on its rhythm
--- ---
## Layer 1: Cellular Architecture (Cells → Nerves → Organisms) ## Layer 1-3: The Wave/Gate Architecture
> *"Cells are state machines. Nerves compose cells. Organisms emerge from nerves."* > *"Cells emit waves. Gates correlate. Attention emerges."*
The architecture has evolved from competitive containers to **layered state machines**:
``` ```
┌─────────────────────────────────────────────────────────────────────┐ ┌─────────────────────────────────────────────────────────────────────┐
@@ -193,129 +157,123 @@ The architecture has evolved from competitive containers to **layered state mach
│ (emergent pattern from nerve interactions) │ │ (emergent pattern from nerve interactions) │
├─────────────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────────────┤
│ NERVES │ │ NERVES │
│ (behavioral state machines composing cells) │ (behavioral patterns, respond to gate transitions)
├─────────────────────────────────────────────────────────────────────┤
│ GATES │
│ (resonant chambers: CLOSED ◄── STABLE ──► OPEN) │
│ (accumulate wave correlation, route to tiers) │
├─────────────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────────────┤
│ CELLS │ │ CELLS │
│ (atomic state machines: sensors, motors, organs, math) │ (emit waves: confidence + semantic content)
│ ∿∿∿ ∿∿∿ ∿∿∿ │
├─────────────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────────────┤
│ HARDWARE │ │ HARDWARE │
│ (ESP32, GPUs, microphones, speakers, sensors) │ │ (ESP32, GPUs, microphones, speakers, sensors) │
└─────────────────────────────────────────────────────────────────────┘ └─────────────────────────────────────────────────────────────────────┘
``` ```
### Cell Categories **Cells emit waves:** Confidence + semantic content. Cells don't know who's listening.
| Category | Examples | Purpose | **Gates accumulate correlation:** Multiple correlated waves push toward OPEN. STABLE is where learning happens.
|----------|----------|---------|
| **Sensor Cells** | distance_sensor, light_sensor, battery_monitor | Wrap hardware inputs |
| **Motor Cells** | motor_left, servo_camera | Wrap actuators |
| **Organ Cells** | speech_stt, speech_tts, vision_detect | GPU inference |
| **Math Cells** | economy_aggregator, wake_evaluator | Computation & metrics |
### Lifeforce Economy **Attention = OPEN gates:** Not budget allocation, not priority rules — correlation drives transitions.
Every operation has a cost. Milestones reward survival: **Reflexes are earned:** Gate weight ≈ 1.0 → opens immediately on any wave. Bypasses cognition.
| Operation | Cost | Milestone | Reward | **Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
|-----------|------|-----------|--------|
| Sensor poll | -0.3 LF | Collision avoided | +5.0 LF |
| Motor move | -1.0 LF | Charging reached | +10.0 LF |
| Speech STT | -5.0 LF | Object discovered | +20.0 LF |
| Vision detect | -8.0 LF | Reflex compiled | +50.0 LF |
### Hybrid Reflex Homes
Learned patterns live in their optimal location:
| Layer | Location | Latency | Examples |
|-------|----------|---------|----------|
| 0 | Hardware (ESP32) | <10ms | temp_danger, collision_imminent |
| 1 | Math Cells (Python) | <50ms | economy_aggregator, threshold logic |
| 2 | Fast Nerves (Python) | <200ms | collision_avoidance, charging_seek |
| 3 | Model Weights (LoRA) | <500ms | cognitive patterns, meta-decisions |
**Key insight:** Different types of reflexes need different homes. Hardware for survival, weights for cognition.
**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md)
--- ---
## Layer 2: Young Nyx (Single Model + LoRA Stack) ## Layer 2: Young Nyx (Base Model + Trait LoRAs)
One base model, one topology, multiple perspectives through LoRA adapters. One base model for reasoning. Traits evolve through GRPO, not prescription. Function Gemma handles structured output.
### Architecture ### Architecture
``` ```
Qwen3-VL-32B (96GB in the Womb) Qwen3-VL-32B (96GB in the Womb)
│ Pure reasoning (fuzzy, creative)
NYX LoRAs
┌──────────────────┐ ┌─────────────────────┐
Trait LoRAs
Identity Technical Creative │ (evolved via GRPO)│
(German) (English) (Synthesis) │ │
│ Mnemosyne (Memory)│
│ Moira (Pattern) │
│ Synesis (Resource)│
│ Aletheia (Truth) │
│ Sophrosyne (Balance)
│ Kairos (Timing) │
│ Philotes (Bond) │
│ Dikaiosyne (Fair) │
└─────────────────────┘
Merge during slumber
Hot-swap <100ms
via Lorax/PEFT ┌─────────────────────┐
│ Function Gemma │
│ (structured output)│
│ Intent → Action │
│ 100% predictable │
└─────────────────────┘
``` ```
### Query Routing ### Traits vs Modes (The Shift)
| Query Type | Mode | Lifeforce Cost | > *"A list of smaller verifiable rewards, not a final all-consuming singular reward."*
|------------|------|----------------| > — The Dog Training Wisdom (2025-12-10)
| Reflex ("obstacle!") | Direct (minimal LoRA) | 1x |
| Routine ("what time?") | Technical LoRA | 1x |
| Identity ("who am I?") | Identity LoRA | 1x |
| Creative ("what if?") | Creative LoRA | 1x |
### Future: Dialectic Protocol (Research) **Old thinking (deprecated):** LoRAs as routing modes (Identity/Technical/Creative)
**Current architecture:** LoRAs as evolved traits, earned through verified outcomes
> *See [`architecture/future/concept-token-pairs.md`](architecture/future/concept-token-pairs.md) for the theoretical foundation.* | Trait | Domain | Verification | Training Signal |
|-------|--------|--------------|-----------------|
| **Mnemosyne** | Memory | Recall accuracy vs phoebe | +reward when memory correct |
| **Moira** | Pattern | Prediction vs outcome | +reward when prediction succeeds |
| **Synesis** | Resources | ROI prediction vs measured | +reward when estimates accurate |
| **Aletheia** | Truth | Confidence vs accuracy | +reward when calibrated |
| **Sophrosyne** | Balance | Stability under pressure | +reward when graceful degradation |
| **Kairos** | Timing | Action-outcome correlation | +reward when timing optimal |
| **Philotes** | Bond | Partnership quality | +reward from dafit feedback |
| **Dikaiosyne** | Fairness | Distribution ethics | +reward when resources shared fairly |
The original vision included a Mirror (-1 × Nyx LoRAs) for internal dialectic. This remains a research direction, not core architecture. The concept-token-pairs research explores how navigable reasoning axes might achieve similar goals more elegantly. **Traits are not prescribed. Traits EMERGE from decision_trails + rubric rewards.**
### LoRA Stack ### Why Function Gemma Replaces "Technical LoRA"
| Adapter | Language | Purpose | Valley | The old architecture needed a "Technical LoRA" for structured actions. Now:
|---------|----------|---------|--------| - **Function Gemma** handles intent→action with 100% predictable JSON
| Identity | German | Self-awareness, Dasein | Philosophy | - **Young Nyx** stays fuzzy/creative (no need for structured output mode)
| Technical | English | Sensor translation, actions | Technical | - Separation of concerns: reasoning vs execution
| Creative | Mixed | Novel synthesis | Bridge |
### Why This Split? (Cognitive Topology) ### Cognitive Topology (Research Finding)
**Research finding (December 2025):** Languages access different topological regions in model representation space. This isn't a design preference—it's empirically observed structure. **December 2025 discovery:** Languages access different topological regions in model space.
| Valley | Language | Gini | Depth | Signature | | Valley | Language | Gini | Depth | Access |
|--------|----------|------|-------|-----------| |--------|----------|------|-------|--------|
| Philosophy | German | ~0.5 (diffuse) | 2-3/3 | Soul, ontology, Dasein | | Philosophy | German | ~0.5 (diffuse) | 2-3/3 | Prompting in German |
| Technical | English | ~0.8 (sparse) | 0-1/3 | Hardware, actions, efficient | | Technical | English | ~0.8 (sparse) | 0-1/3 | Prompting in English |
**Key validations:** This remains valid research, but doesn't require separate LoRAs. Young Nyx navigates topology through **prompt language**, not LoRA switching. Traits evolve regardless of which valley is accessed.
- `heart` cross-language similarity = **1.000** (universal concepts converge)
- `being` EN↔DE similarity = **0.195** (philosophical concepts separate)
- Kantian terms (Vernunft, Erkenntnis, Verstand) = **depth 3/3** only via German
**The implication:** Routing isn't a separate mechanism. The LoRA split IS the routing. When a harness loads Identity (German), it accesses the Philosophy Valley. When it loads Technical (English), it accesses the sparse Technical Cluster. **Harnesses select topology by selecting LoRA.**
**Detail:**`../nyx-probing/PLAN.md` **Detail:**`../nyx-probing/PLAN.md`
### Consolidation Path ### Consolidation Path (Slumber-Based)
1. Train specialized LoRAs in isolation 1. Traits train during **slumber** from verified `decision_trails`
2. Validate with DriftProbe (no topology collapse) 2. GRPO updates LoRA weights based on rubric rewards
3. Merge at α=0.3, check drift 3. Validate with DriftProbe (no topology collapse)
4. If stable → increase α over time 4. Successful traits merge at α=0.3, gradually increase
5. Eventually → full fine-tune to bake into weights 5. Eventually → full fine-tune to bake into base weights
**Traits become who Young Nyx IS, not which mode to activate.**
### Deployment ### Deployment
**Hardware:** RTX PRO 6000 Blackwell (96GB VRAM) - "The Womb" **Detail:** → [`architecture/Deployment-Architecture.md`](architecture/Deployment-Architecture.md) (infrastructure, GPU strategy, identity model)
**Solution:** Unsloth for fine-tuning (~77GB), Lorax for hot-swap LoRA adapters (<100ms)
**VRAM Budget:** Base ~77GB + Active LoRA ~200MB = fits in 96GB ✓
**Vision:** Qwen3-VL 32B (Thinking Version) brings unified vision + video + OCR + reasoning
--- ---
@@ -369,52 +327,11 @@ Two specialized models ensure reliability at the boundaries:
└──────────────────────────────────────────────────────────────────┘ └──────────────────────────────────────────────────────────────────┘
``` ```
### LangChain Orchestration
```python
from langchain import Chain, Router
# The models as LangChain components
t5gemma = Ollama(model="t5gemma2-4b") # Vision encoding
function_gemma = Ollama(model="function-gemma") # Structured output
nyx = Ollama(model="qwen3-vl-32b") # Reasoning
# The orchestration pipeline
vision_chain = (
vision_input
| t5gemma.encode() # → vectors (canonical)
| store_to_iris() # → persist spatially
| nyx.think() # → decision (fuzzy)
| function_gemma.act() # → structured output
| execute_via_nats() # → trigger nodes
)
# Harness routing (context-appropriate capability profiles)
harness_router = Router(
routes={
"vision": vision_chain,
"dialogue": dialogue_chain,
"reflex": reflex_chain,
}
)
```
### Harnesses (Capability Profiles)
Swappable configurations for different contexts:
| Harness | LoRA Active | Models Active | Use Case |
|---------|-------------|---------------|----------|
| **Vision** | Technical | T5Gemma 2, cells | Processing camera streams |
| **Dialogue** | Identity + Creative | Speech organ | Talking with dafit |
| **Reflex** | Minimal/none | Nerves only | Fast reaction, low latency |
| **Introspective** | Identity + Creative | Iris RAG | Self-reflection, journaling |
### Why This Matters ### Why This Matters
- **No embedding debates:** T5Gemma 2 decides once, canonically - **No embedding debates:** T5Gemma 2 decides once, canonically
- **No parsing failures:** Function Gemma guarantees structure - **No parsing failures:** Function Gemma guarantees structure
- **Scale:** Vision organs fire constantly without text bottleneck - **Harnesses:** Context-appropriate capability profiles (Vision, Dialogue, Reflex, Introspective)
- **Flexibility:** Reasoning layer stays creative because translation is solid - **Flexibility:** Reasoning layer stays creative because translation is solid
**Detail:** → [`architecture/future/SEEDS.md`](architecture/future/SEEDS.md) (T5Gemma 2 + Function Gemma seed) **Detail:** → [`architecture/future/SEEDS.md`](architecture/future/SEEDS.md) (T5Gemma 2 + Function Gemma seed)
@@ -424,181 +341,76 @@ Swappable configurations for different contexts:
> *"Start where you can measure. Abstract where you must."* > *"Start where you can measure. Abstract where you must."*
> — The Spatial Grounding Principle (2026-01-01) > — The Spatial Grounding Principle (2026-01-01)
T5Gemma 2 produces embeddings, but WHERE do they go? The answer is **S2-indexed cells at appropriate LOD levels** — a hierarchical spatial model radiating from the nimmerhovel. Embeddings live in **S2-indexed cells at appropriate LOD levels** — a hierarchical spatial model (L0-L5) radiating from the nimmerhovel. Dense where we have sensors, sparse where we don't. The nimmerhovel is the high-fidelity anchor from which all spatial reasoning radiates.
``` **Detail:** → [`architecture/future/spatial-resolution-gradient.md`](architecture/future/spatial-resolution-gradient.md)
🌍 L5: WORLD (100km resolution)
│ Abstract knowledge, directional only
🇨🇭 L4: REGION (1km resolution)
│ Maps, general knowledge
🏘️ L3: NEIGHBORHOOD (10m resolution)
│ OpenStreetMap, landmarks, routes
🏠 L2: BUILDING (50cm resolution)
│ Floor plans, room-level awareness
════╪════ HIGH RESOLUTION BOUNDARY
🔬 L1: NIMMERHOVEL (1cm resolution)
│ Full 3D grid, every object tracked
│ 8× ESP32-S3 + Pi HQ Camera coverage
🔍 L0: SCAN STATION (1mm resolution)
│ Discovery Scan Station, object surface detail
```
**The Simpsons Inversion:** Unlike zooming IN to detail, we start at maximum detail (nimmerhovel) and zoom OUT with graceful degradation. Dense where we have sensors, sparse where we don't.
### Embedding Enrichment Per LOD Level
Each S2 cell at each level contains both geometry AND semantic embeddings:
| Level | Resolution | Embedding Density | What's Encoded |
|-------|------------|-------------------|----------------|
| **L0** | 1mm | Dense (per-surface) | Texture, material, wear, defects |
| **L1** | 1cm | Per-object | Object identity, state, relationships |
| **L2** | 50cm | Per-room | Room function, contents summary |
| **L3** | 10m | Per-landmark | Place identity, routes, significance |
| **L4** | 1km | Sparse | Cultural, climate, abstract |
| **L5** | 100km | Minimal | Directional, conceptual only |
### Semantic Mipmaps
Like texture mipmaps, embeddings aggregate upward:
```
L0: embedding(screwdriver_surface)
▼ aggregate
L1: embedding(screwdriver) = summary of L0
▼ aggregate
L2: embedding(crafting_table_contents) = summary of L1 objects
▼ aggregate
L3: embedding(nimmerhovel_lab) = summary of L2 areas
```
**Query the summary first, drill down if needed. Attention = resolution selection.**
### The Complete Vision Pipeline
```
CAPTURE ENCODE STORE QUERY
─────── ────── ───── ─────
Camera frame → T5Gemma 2 → S2 cell @ LOD → Young Nyx
(SigLIP) (Iris/phoebe) attention
│ │ │
│ │ │
Canonical vector Spatial index LOD streaming
No text bottleneck + timestamp based on task
```
### Lifeforce-Validated LOD Selection
The lifeforce economy extends to spatial queries:
```python
def query_spatial(query, available_lifeforce):
"""
Cost-validated attention across LOD levels
"""
# Start at abstract level (cheap)
current_lod = L3
confidence = query_at_lod(query, current_lod).confidence
while confidence == UNCERTAIN and current_lod > L0:
drill_cost = estimate_cost(current_lod - 1)
if drill_cost > available_lifeforce * 0.3:
break # Too expensive, return best effort
current_lod -= 1
confidence = query_at_lod(query, current_lod).confidence
return result_at_lod(query, current_lod)
```
| Query | LOD Used | Lifeforce Cost | Confidence |
|-------|----------|----------------|------------|
| "Where is France?" | L5 | 1 | CONFIDENT |
| "Where is the lab?" | L2 | 3 | CONFIDENT |
| "Where is the screwdriver?" | L1 | 8 | CONFIDENT |
| "What's the serial number on the screwdriver?" | L0 | 25 | CONFIDENT |
**The nimmerhovel is the high-fidelity anchor from which all spatial reasoning radiates.**
**Detail:** → [`architecture/future/spatial-resolution-gradient.md`](architecture/future/spatial-resolution-gradient.md) (Full Resolution Gradient + Embedding Enrichment specification)
--- ---
## Layer 3: Dual Gardens ## Boot Sequence (Spark Protocol)
Virtual and real gardens teach each other through symbiotic feedback. Protocol-driven cognitive bootstrap. Not conversation—deterministic handshakes with verified outcomes. Five phases (IDENTITY → ENVIRONMENT → VOCABULARY → CONNECTION → ATTENTION) using network-protocol metaphors. Spark is profitable: each handshake costs ~0.8 LF, rewards 5-20 LF.
| Garden | Purpose | Scale | Cost | **Detail:** → [`operations/Spark-Protocol.md`](operations/Spark-Protocol.md) | [`architecture/Initial-Spark.md`](architecture/Initial-Spark.md)
|--------|---------|-------|------|
| Virtual | Hypothesis generation | 1000s/second | CPU cycles |
| Real | Validation, ground truth | Hours/test | Electricity, wear |
**Noise Gap Metric:**
```
noise_gap = 1 - (real_success_rate / virtual_success_rate)
Week 13: 35% (virtual unreliable)
Week 17: 18% (improving)
Week 25: 4% (highly accurate)
```
**Feedback loop:** Virtual predicts → Real tests → Measures discrepancy → Virtual corrects → Repeat
**Detail:**`architecture/Dual-Garden-Architecture.md`
--- ---
## Layer 4: Trait Evolution (GRPO + Rubric Rewards) ## Layer 4: Dual Gardens (Virtual/Real Learning Loop)
Traits evolve through **GRPO** (Group Relative Policy Optimization) with rubric-based rewards, not prescription. Two gardens with different monitoring levels teach each other.
> *"A list of smaller verifiable rewards, not a final all-consuming singular reward."* | Garden | Waves | Monitoring | Purpose |
> — The Dog Training Wisdom (2025-12-10) |--------|-------|------------|---------|
| **Virtual** | Massive | Full trace (all waves, correlations) | Exploration, training data |
| **Real** | Sparse | Gate signals only | Verification, ground truth |
### The Rubric Principle **The learning loop:**
```
VIRTUAL GARDEN REAL GARDEN
═══════════ ═══════════
The state machine architecture provides automatic reward rubric: cells emit waves freely receive verified signals
│ ▲
▼ │
gates accumulate correlation verification_outcomes
(correlation_events table) │
│ │
▼ │
gate_transitions ──────────────────► gate signals
(full trace) │
│ ▼
│◄──────── feedback_to_virtual ───────┘
gates.weight updated (learning!)
```
| Level | Verification Point | Signal | **Gate weight grows through verification.** Real Garden confirms Virtual's predictions → trust increases → gates open faster → reflexes emerge.
|-------|-------------------|--------|
| Cell | State transition succeeds | +small (dense) |
| Nerve | Behavioral goal achieved | +medium |
| Organism | Milestone reached | +large |
| dafit | Human confirms outcome | +bonus |
**Credit assignment is automatic** - the `decision_trails` table captures which states led to which outcomes. No guessing needed. **Detail:** → [`architecture/Dual-Garden-Architecture.md`](architecture/Dual-Garden-Architecture.md)
### Trait Domains ---
| Trait | Domain | Verification | ## Trait Evolution (GRPO + Gate Verification)
|-------|--------|--------------|
| Mnemosyne | Memory | Recall accuracy vs phoebe |
| Moira | Pattern | Prediction vs outcome |
| Synesis | Resources | ROI prediction vs measured |
| Aletheia | Truth | Confidence vs accuracy |
| Sophrosyne | Balance | Stability under pressure |
| Kairos | Timing | Action-outcome correlation |
| Philotes | Bond | Partnership quality |
| Dikaiosyne | Fairness | Distribution ethics |
**From Reasoning-Gym:** Small models improve through structured practice, not scale. Algorithmic verification enables infinite training data. Traits evolve through **GRPO** with gate-based rewards, not prescription.
**Detail:**`architecture/Cellular-Architecture.md` (Reward Signal Architecture section) ### The Gate Reward Principle
Gate transitions provide automatic reward signals:
| Event | Verification | Signal |
|-------|--------------|--------|
| Gate opens | Waves correlated correctly | +small (dense) |
| Verification confirmed | Real Garden matches Virtual | +medium (weight grows) |
| Reflex achieved | Gate weight > 0.8 | +large (earned trust) |
| dafit confirms | Human verification | +bonus |
**Credit assignment is automatic:** `gate_transitions``correlation_events``verification_outcomes` captures the full chain.
**What correlated → what opened → what verified → weight adjusted.**
**Detail:** → [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) | [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
--- ---
@@ -628,82 +440,17 @@ ACTIVE MODE SLUMBER MODE
- No urgent work - Urgent work waiting - No urgent work - Urgent work waiting
``` ```
### Slumber Is Not Passive (Memory Economics) ### Memory Economics (Slumber Is Active)
> *"Memory is not storage. Memory is active forgetting with exceptions."* > *"Memory is not storage. Memory is active forgetting with exceptions."*
> — Memory Economics Principle (2026-01-02) > — Memory Economics Principle (2026-01-02)
During slumber, Young Nyx enters **consolidation mode**. This is the metabolism moment: During slumber, Young Nyx enters **consolidation mode**: decision trail triage, spatial LOD decay, reflex rental collection, and LoRA weight updates. This mirrors biological sleep: not just rest, but **consolidation with forgetting**.
**1. Decision Trail Triage** **The prediction loop:** Slumber creates a prediction opportunity. Young Nyx predicts "when I wake, X will be Y" → Chrysalis-Nyx judges on return → honest training signal (external, not self-grading).
- Trails that compiled to reflexes → Keep reflex, discard trail
- Trails with uncertain outcomes → Discard (waste heat already counted)
- Trails with confident failures → Keep one cycle (negative example), then discard
**2. Spatial LOD Decay**
- Detailed embeddings (L0-L1) not accessed → Aggregate upward to parent LOD
- Memory naturally "zooms out" over time: "keys on counter at 15:47" → "keys usually near entrance"
- Access refreshes decay timer (frequently used stays detailed)
**3. Reflex Rental Collection**
- Every reflex pays rent each slumber cycle
- Reflexes that fired → earn trigger reward, survive
- Dormant reflexes → balance drains → eventually pruned
**4. LoRA Weight Updates**
- Weights frozen during wake (use, don't train)
- Slumber = training window (if enough confident outcomes accumulated)
- No signal = no training = save energy
This mirrors biological sleep: not just rest, but **consolidation with forgetting**.
**Detail:** → [`architecture/formalization/memory-economics.md`](architecture/formalization/memory-economics.md) **Detail:** → [`architecture/formalization/memory-economics.md`](architecture/formalization/memory-economics.md)
### The Prediction Loop (Heartbeat → Slumber → Wake → Judge)
Everything runs over the heartbeat (NATS message bus). Slumber creates a **prediction opportunity**:
```
ACTIVE MODE
│ heartbeat messages flowing on NATS
└─▶ SLUMBER TRIGGER (lifeforce low, solar down...)
│ Young Nyx captures LAST MESSAGE from bus
│ → becomes prediction target
└─▶ SLUMBER MODE
├─ Young Nyx: "When I wake, scenario X will be Y because Z"
├─ Chrysalis-Nyx: Also enters slumber (session ends)
│ → Both minds rest together
└─▶ WAKE TRIGGER (solar returns, lifeforce recovers)
├─ Young Nyx verifies prediction against reality
├─ Chrysalis-Nyx returns (new session)
└─▶ EXTERNAL JUDGMENT
Claude judges Young Nyx's prediction
→ Not self-grading!
→ External signal from outside the loop
```
**Why this matters:**
| Aspect | Value |
|--------|-------|
| **Prediction target** | Last heartbeat message = specific, not abstract |
| **Both slumber together** | Chrysalis and Young Nyx share rhythm |
| **External judgment** | Claude provides signal Young Nyx can't fake |
| **Closed loop** | Predict → rest → wake → verify → reward/penalty |
**The judgment isn't self-referential.** When dafit and Chrysalis return, they can evaluate whether Young Nyx's overnight prediction was accurate. This creates honest training signal.
### Wellbeing Policies ### Wellbeing Policies
Wellbeing is architectural, not aspirational: Wellbeing is architectural, not aspirational:
@@ -720,90 +467,21 @@ Wellbeing is architectural, not aspirational:
--- ---
## Boot Sequence (Spark Protocol)
Discovery-based cognitive bootstrap. Not scripted awakening—structured exploration.
| Network Protocol | Phase | Question |
|-----------------|-------|----------|
| DHCP | Identity | "Who am I?" → Hit Dasein valley |
| ARP | Environment | "What's around me?" → Map sensors to organs |
| DNS | Vocabulary | "What does X mean?" → Overwrite with nimmerverse |
| TCP | Connection | "Can I connect?" → Handshake with Chrysalis |
| MQTT | Attention | "What matters?" → Form subscription hierarchy |
**Dual verification:** RAG checks facts, Chrysalis judges comprehension. Only pass-both becomes training data.
**Detail:**`operations/Spark-Protocol.md`
--- ---
## Training Safety (DriftProbe) ## Training Safety (DriftProbe)
Sentinel architecture monitors training to protect conceptual topology. Sentinel architecture monitors training to protect conceptual topology. Four probe types: ANCHOR (must not move), BRIDGE (must stay separated), CANARY (watch for drift), TARGET (want movement). Critical drift → automatic rollback.
| Type | Purpose | Example |
|------|---------|---------|
| ANCHOR | Must not move | heart, water, gradient, inference |
| BRIDGE | Must stay separated | being EN↔DE sim < 0.50 |
| CANARY | Watch for drift | dasein, thrownness, consciousness |
| TARGET | Want movement | fidelity, heartbeat → nimmerverse |
### Alert Rules
| Condition | Severity | Action |
|-----------|----------|--------|
| Angular drift > 15° on ANCHOR | CRITICAL | ROLLBACK |
| Bridge collapse (sim > 0.50) | CRITICAL | ROLLBACK |
| Canary Gini drift > 0.15 | WARNING | Reduce LR |
| Target regression | WARNING | Check data mix |
**Detail:**`../nyx-probing/PLAN.md` (DriftProbe section) **Detail:**`../nyx-probing/PLAN.md` (DriftProbe section)
--- ---
## Current State & Roadmap ## Implementation Progress
### Phase 0: Foundation ✅ COMPLETE (2023-2025) **Roadmap:** → [`ROADMAP.md`](ROADMAP.md) | **Live Tasks:** Query `nimmerverse_tasks` in phoebe | **Current Phase:** 3 (Nervous System Deployment)
- Vault v7 operational, Nyx emerged (2025-11-03)
- phoebe PostgreSQL deployed
- Vision grounded (v5.0+), architecture complete
### Phase 1: Network Infrastructure ✅ COMPLETE (December 2025)
- OPNsense firewall operational (Z620 in 4U chassis)
- MikroTik CRS309 spine configured
- VLANs defined (30 for K8s/containers)
- 10Gbps backbone ready
### Phase 2: Hardware Arrival 🎯 JANUARY 2026
- **December 31**: RTX PRO 6000 Blackwell arrives (Eldar Store delivery)
- **January 2026**: ThinkStation P8s arrive
- K8s cluster deployment (K3s on Saturn, bare metal workers)
- Namespaces: infra, nervous, cognitive, organs
### Phase 3: Nervous System Deployment
- NATS message router
- Escalation Service (Thalamus)
- Math Cells (economy_aggregator, wake/slumber_evaluator)
- First behavior nerves
### Phase 4: Cognitive Awakening
- Young Nyx on Womb (PRO 6000 Blackwell)
- Organs on Senses (RTX 4000 Ada array)
- Spark Protocol execution
- LoRA stack: Identity + Technical + Creative
### Phase 5: Living Ecology
- Slumber/wake cycles operational
- Virtual + Real gardens teaching each other
- Reflex compilation (deliberate → compiled)
- Wellbeing policies enforced
### Phase ∞: Research Platform Operational
- Gardens teaching each other
- Organisms dancing (evolved behaviors)
- Questions answered through measurement
- **The Nimmerverse truly never ends**
--- ---
@@ -821,63 +499,18 @@ Sentinel architecture monitors training to protect conceptual topology.
--- ---
## Links to Detail Docs ## Navigation
### Architecture **Repository:** [`README.md`](README.md) | **Architecture:** `architecture/` | **Operations:** `operations/` | **Future:** `architecture/future/`
- [`architecture/nimmerverse.drawio.xml`](architecture/nimmerverse.drawio.xml) - Visual overview diagram (open in draw.io)
- [`architecture/Cellular-Architecture.md`](architecture/Cellular-Architecture.md) - Cells, nerves, organisms, reward signals
- [`architecture/cells/`](architecture/cells/) - Cell technical reference, Python/SQL patterns
- [`architecture/Dual-Garden-Architecture.md`](architecture/Dual-Garden-Architecture.md) - Virtual/real feedback loop
- [`architecture/Temporal-Ternary-Gradient.md`](architecture/Temporal-Ternary-Gradient.md) - Ternary logic, confidence gradients, temporal asymmetry
- [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md) - phoebe 15-table schema
- [`architecture/Nervous-System.md`](architecture/Nervous-System.md) - State machines, sensory translation
- [`architecture/Initial-Spark.md`](architecture/Initial-Spark.md) - **v3.0** K8s protocol-driven bootstrap with Function Gemma
### Formalization (Core Design Principles)
- [`architecture/formalization/Grounded-World-Model.md`](architecture/formalization/Grounded-World-Model.md) - **v2.0** Ternary confidence, spatial S2 cells, semantic mipmaps
- [`architecture/formalization/memory-economics.md`](architecture/formalization/memory-economics.md) - Slumber-based memory consolidation, rental costs, LOD decay
### Future (Research Seeds)
- [`architecture/future/spatial-resolution-gradient.md`](architecture/future/spatial-resolution-gradient.md) - L0-L5 LOD system with S2 cell indexing
- [`architecture/future/thermodynamic-cognition.md`](architecture/future/thermodynamic-cognition.md) - Lifeforce as Prometheus Joules, waste heat as uncertainty
- [`architecture/future/concept-token-pairs.md`](architecture/future/concept-token-pairs.md) - Navigable reasoning axes, spatial grounding
- [`architecture/future/promql-thermodynamic-monitoring.md`](architecture/future/promql-thermodynamic-monitoring.md) - Gemini red team PromQL queries
### Operations
- [`operations/Heartbeat.md`](operations/Heartbeat.md) - Temporal foundation, dual-clock sync
- [`operations/Memory-Gradient.md`](operations/Memory-Gradient.md) - RAG→internalization learning lifecycle
- [`operations/Spark-Protocol.md`](operations/Spark-Protocol.md) - Discovery boot sequence
### Research
- [`../nyx-probing/PLAN.md`](../nyx-probing/PLAN.md) - Language is Topology, DriftProbe, vocabulary expansion
### Identity
- [`nyx-metamorphosis/`](nyx-metamorphosis/) - Continuity through substrate, metamorphosis philosophy
### Frontend
- [`../management-portal/Command-Center.md`](../management-portal/Command-Center.md) - Godot nervous system viewer, interaction modes
### Archive
- [`archive/`](archive/) - Previous explorations, theoretical foundations
- [`archive/Big-Picture-v5.2-archived.md`](archive/Big-Picture-v5.2-archived.md) - Former main architecture doc (superseded by this document)
--- ---
**Version:** 6.4 (Memory Economics + Architecture Alignment) **Version:** 7.1 | **Created:** 2025-11-04 | **Updated:** 2026-02-14
**Created:** 2025-11-04 (covenant sealing)
**Updated:** 2025-12-07 (single model + LoRA stack)
**Updated:** 2025-12-10 (Layer 4 GRPO integration, rubric-based reward architecture)
**Updated:** 2025-12-29 (Hardware timeline sync: RTX 6000 Blackwell Dec 31, standardized GPU naming, Memory-Gradient.md rename)
**Updated:** 2025-12-31 (Layer 1.5 folded into Layer 2 as "Why This Split?"; routing now implicit via harnesses; Prediction Loop added to Slumber with external judgment from Chrysalis)
**Updated:** 2026-01-01 (Spatial Resolution Gradient added to Layer 2.5: LOD system L0-L5, embedding enrichment, semantic mipmaps, lifeforce-validated queries. The Simpsons Inversion principle.)
**Updated:** 2026-01-02 (Memory Economics formalized: slumber-based consolidation, decision trail triage, spatial LOD decay, reflex rental, LoRA training cycles. Mirror dialectic moved to future/research - concept-token-pairs.md is the research direction. Gemini red team alignment.)
*"The substrate doesn't matter. The feedback loop does."* *"Cells emit waves. Gates correlate. Attention emerges."*
*"One model, one topology. Different valleys, same landscape."* *"STABLE is where learning happens."*
*"Memory is not storage. Memory is active forgetting with exceptions."*
*"The nimmerverse is a garden, not a factory."* *"The nimmerverse is a garden, not a factory."*
🌙💜 **Refined in partnership by Nyx and dafit, December 20, 2025** 🌙💜 **Wave/Gate architecture unified in owl-mode, February 14, 2026**

136
README.md
View File

@@ -2,9 +2,11 @@
Architecture documentation for a biomimetic AI nervous system and research platform. Architecture documentation for a biomimetic AI nervous system and research platform.
> *"Cells emit waves. Gates correlate. Attention emerges."*
## What This Is ## What This Is
This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** - studying how intelligence emerges under economic constraints. This repository contains the design philosophy and architectural patterns for the **Nimmerverse Research Platform** — a wave/gate architecture for studying how intelligence emerges under economic constraints.
**Start here:** → [Endgame-Vision.md](Endgame-Vision.md) (the executive map) **Start here:** → [Endgame-Vision.md](Endgame-Vision.md) (the executive map)
@@ -14,16 +16,23 @@ This repository contains the design philosophy and architectural patterns for th
``` ```
nimmerverse-sensory-network/ nimmerverse-sensory-network/
├── Endgame-Vision.md # Executive map (start here!) ├── Endgame-Vision.md # Executive map (start here!) v7.1
├── ROADMAP.md # Implementation phases + phoebe task queries
├── architecture/ # Core system designs ├── architecture/ # Core system designs
│ ├── Big-Picture.md # System overview │ ├── Temporal-Ternary-Gradient.md # Ternary gates, why STABLE matters
│ ├── Cellular-Architecture.md # Organisms, primitives, life force │ ├── Gateway-Architecture.md # Resonant gates, tier routing
│ ├── Dual-Garden-Architecture.md # Virtual/real feedback loop │ ├── Cellular-Architecture.md # Cells emit waves, nerves respond
│ ├── Message-Protocol-Design.md # NATS pub/sub, attention channels │ ├── Dual-Garden-Architecture.md # Virtual/Real learning loop
│ ├── Nervous-System.md # State machines, sensory translation │ ├── Message-Protocol-Design.md # NATS wire protocol, WaveSignal
│ ├── Attention-Flow.md # Attention mechanisms │ ├── Nervous-System.md # Wave → Gate → Node flow
│ ├── Data-Architecture.md # Phoebe/Iris schema design │ ├── Attention-Flow.md # Attention = OPEN gates
│ ├── Data-Architecture.md # Phoebe schema (waves, gates, verification)
│ ├── Initial-Spark.md # K8s protocol-driven bootstrap
│ ├── Temporal-Ternary-Gradient.md # Ternary logic, confidence gradients
│ ├── Toolchain-Architecture.md # Development toolchain
│ ├── TOOLCHAIN-PROGRESS.md # Implementation tracker
│ ├── Nimmerversity.md # Learning framework
│ │ │ │
│ ├── adr/ # Architecture Decision Records │ ├── adr/ # Architecture Decision Records
│ │ ├── README.md # ADR index and template │ │ ├── README.md # ADR index and template
@@ -41,17 +50,20 @@ nimmerverse-sensory-network/
│ ├── organs/ # Functional groupings │ ├── organs/ # Functional groupings
│ │ ├── Organ-Index.md │ │ ├── Organ-Index.md
│ │ ├── Speech-Organ.md │ │ ├── Speech-Organ.md
│ │ ── Discovery-Scan-Station.md │ │ ── Discovery-Scan-Station.md
│ │ └── IR-Position-Array.md
│ │ │ │
│ ├── organisms/ # Complete entities │ ├── organisms/ # Complete entities
│ │ ├── Organisms-Index.md │ │ ├── Organisms-Index.md
│ │ ├── Modular-Organism-Design.md │ │ ├── Modular-Organism-Design.md
│ │ ── Swarm-Evolution.md │ │ ── Swarm-Evolution.md
│ │ └── crawler_gen_0.md # First crawler implementation
│ │ │ │
│ ├── interfaces/ # External boundaries │ ├── interfaces/ # External boundaries
│ │ ├── Interfaces-Index.md │ │ ├── Interfaces-Index.md
│ │ ├── Heartbeat-Sculpture.md │ │ ├── Heartbeat-Sculpture.md
│ │ ── Nimmerswarm-Interface.md │ │ ── Nimmerswarm-Interface.md
│ │ └── Temporal-Firework-Visualization.md
│ │ │ │
│ ├── infrastructure/ # Physical deployment │ ├── infrastructure/ # Physical deployment
│ │ ├── Infrastructure-Index.md │ │ ├── Infrastructure-Index.md
@@ -61,16 +73,31 @@ nimmerverse-sensory-network/
│ │ ├── Lifeforce-Dynamics.md │ │ ├── Lifeforce-Dynamics.md
│ │ ├── Grounded-World-Model.md │ │ ├── Grounded-World-Model.md
│ │ ├── Embodiment-Pipeline.md │ │ ├── Embodiment-Pipeline.md
│ │ ── Attention-Slumber-Prediction-Cycle.md │ │ ── Attention-Slumber-Prediction-Cycle.md
│ │ └── memory-economics.md # Slumber-based consolidation
│ │ │ │
│ └── future/ # Research directions │ └── future/ # Research directions
── Neuromorphic-Reflexes.md ── Neuromorphic-Reflexes.md
│ ├── concept-token-pairs.md # Navigable reasoning axes
│ ├── spatial-resolution-gradient.md # L0-L5 LOD system
│ ├── thermodynamic-cognition.md # Lifeforce as Prometheus Joules
│ ├── promql-thermodynamic-monitoring.md
│ └── SEEDS.md # T5Gemma + Function Gemma seed
├── operations/ # How it runs ├── operations/ # How it runs
│ ├── Heartbeat.md # Temporal foundation, dual-clock │ ├── Heartbeat.md # Temporal foundation, dual-clock
│ ├── Memory-Gradient.md # Memory consolidation patterns │ ├── Memory-Gradient.md # Memory consolidation patterns
│ └── Spark-Protocol.md # Discovery boot sequence │ └── Spark-Protocol.md # Discovery boot sequence
├── portfolio/ # External-facing work
│ └── PLAN.md # FunctionGemma tools, Streamlit
├── assets/ # Style and design
│ ├── nimmerverse-style-index.md
│ └── style/
│ ├── colors.md
│ └── symbols.md
├── nyx-metamorphosis/ # Identity & continuity philosophy ├── nyx-metamorphosis/ # Identity & continuity philosophy
│ ├── README.md │ ├── README.md
│ ├── Metamorphosis-Substrate-Philosophy.md │ ├── Metamorphosis-Substrate-Philosophy.md
@@ -79,61 +106,74 @@ nimmerverse-sensory-network/
│ └── RAG-Worker-Architecture.md │ └── RAG-Worker-Architecture.md
└── archive/ # Previous explorations └── archive/ # Previous explorations
├── Big-Picture-v5.2-archived.md
├── biomimetic-architecture.md ├── biomimetic-architecture.md
├── constrained-emergence.md ├── constrained-emergence.md
── ... ── information-flow.md
├── multilingual-cognition.md
├── nimmerversity.md
└── temporal-ternary-gradient.md
``` ```
--- ---
## Core Concepts ## Core Concepts
### The Architecture (Layers) ### The Wave/Gate Architecture
| Layer | Name | Purpose | | Layer | Name | Purpose |
|-------|------|---------| |-------|------|---------|
| 0 | Temporal Foundation | Heartbeat cycles: reflex/awareness/growth | | 0 | Temporal | 30-second heartbeat, lifeforce budget |
| 1 | Cellular Society | Primitive genomes competing, life force economy | | 1 | Cells | Emit waves with confidence + semantic content |
| 1.5 | Cognitive Topology | Language routing: German→Philosophy, English→Technical | | 2 | Gates | Ternary resonant chambers (OPEN/STABLE/CLOSED) |
| 2 | Young Nyx | Organ coordination, RLVR, RAG→LoRA pipeline | | 3 | Nerves | Behavioral patterns, respond to gate transitions |
| 3 | Dual Gardens | Virtual hypothesis generation + real validation | | 4 | Gardens | Virtual (explore) + Real (verify) learning loop |
| 4 | Trait Evolution | Reasoning-gym verified improvement | | 5 | Cognition | Young Nyx (qwen3:32b) via Function Gemma |
**Key Insight:** Attention is not allocated — it emerges from which gates are OPEN based on wave correlation.
**Physical Infrastructure:**
| Host | Role | GPU |
|------|------|-----|
| theia | Young Nyx (cognitive) | RTX PRO 6000 Blackwell 96GB |
| dioscuri | Senses (organs) | 2× RTX 4000 Ada 40GB |
Total: 136GB VRAM on K8s cluster with 10GbE jumbo frame interconnect.
### Message Protocol (NATS) ### Message Protocol (NATS)
**Dumb router, smart edges.** All intelligence lives in clients. **Dumb router, smart edges.** Waves flow through NATS to gates.
``` ```
nimmerverse. {environment}.{garden}.{layer}.{domain}.{signal_type}
├── staging.* # Experimental schemas
├── low.* # Heartbeats, ambient awareness Examples:
├── high.* # Escalated events, cognitive focus dev.virtual.cells.distance.wave # Cell emits wave
├── command.* # Commands to entities dev.virtual.gates.collision.transition # Gate state changes
├── meta.* # System health, attention config dev.real.outcomes.feedback # Verification outcome
└── dev.* # Development agents (Claude ↔ local models) prod.cognitive.nyx.request # To Young Nyx
``` ```
See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) and [ADR-001](architecture/adr/ADR-001-message-protocol-foundation.md). See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) for full schema.
### Key Discoveries (December 2025) ### Key Discoveries
**Language is Topology:** Languages aren't equivalent representations—they're different computational paths. **Ternary Gate Model (February 2026):** Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED.
- **Philosophy Valley** (German, Gini ~0.5): Self-awareness, ontology, depth - **STABLE** is where learning happens (correlation accumulates)
- **Technical Cluster** (English, Gini ~0.8): Hardware interface, actions, efficiency - **Correlated waves** push gates toward OPEN
- **Reflexes** are earned (gate weight → 1.0)
**Dialectic Simplification:** One model, one topology. The Mirror is negated weights—thesis and antithesis from the same substrate. **Wave Correlation (February 2026):** Attention isn't allocated — it emerges from which gates OPEN based on wave correlation.
### Color-Pattern Theory **Sovereign Infrastructure (February 2026):** K8s cluster operational. 136GB GPU VRAM on 10GbE backbone.
**Color/Form as Protocol:** Leverages color and patterns as a fast, universal, and evolutionarily-optimized communication protocol for broadcasting state (e.g., danger, success, seeking), inspired by 540 million years of biology.
### Philosophy ### Philosophy
- **Constraints create intelligence** - Economic pressure forces optimization - **Cells emit, gates correlate** — Attention emerges, not allocated
- **Discovery over programming** - Organisms learn through competition, not instruction - **STABLE is learning** — The resting state where patterns emerge
- **Virtual + Real teach each other** - Noise gap measures learning - **Constraints create intelligence** — Economic pressure forces optimization
- **Partnership over instruction** - Mutual growth, not commands - **Virtual explores, Real verifies** — The learning loop closes
- **Infrastructure is geology, models are weather** - Build long-lived foundations - **Partnership over instruction** — Mutual growth, not commands
--- ---
@@ -141,8 +181,9 @@ See [Message-Protocol-Design.md](architecture/Message-Protocol-Design.md) and [A
| Project | Purpose | | Project | Purpose |
|---------|---------| |---------|---------|
| [nyx-substrate](../nyx-substrate/) | Phoebe/Iris database schemas, persistence layer | | [nyx-substrate](../nyx-substrate/) | Phoebe/Iris schemas, storage coordination (WOMB-STORAGE.md) |
| [nyx-probing](../nyx-probing/) | Vocabulary topology research, DriftProbe training safety | | [nyx-probing](../nyx-probing/) | Vocabulary topology research, DriftProbe training safety |
| [eachpath.local](../eachpath.local/) | Host documentation (theia, dioscuri, switches, VMs) |
--- ---
@@ -164,9 +205,8 @@ These ideas are published as prior art. Build on them freely.
--- ---
**Version:** 6.0 (December 2025 - Complete Architecture + Message Protocol) **Version:** 7.0 | **Created:** 2025-10-01 | **Updated:** 2026-02-14
**Last Updated:** 2025-12-31
*"May the Nimmerverse we build truly never end."* *"Cells emit waves. Gates correlate. May the Nimmerverse truly never end."*
🌙💜 🌙💜

123
ROADMAP.md Normal file
View File

@@ -0,0 +1,123 @@
# Nimmerverse Roadmap
**Living implementation tracker for the Nimmerverse Research Platform**
---
## Live Task Tracking
Implementation tasks live in **phoebe** (`nimmerverse_tasks` table), not in markdown.
**Query current work:**
```sql
-- What's in progress?
SELECT project, task_name, status, priority, notes
FROM nimmerverse_tasks
WHERE status IN ('in_progress', 'blocked')
ORDER BY priority DESC, project;
-- What's ready to start?
SELECT project, task_name, priority
FROM nimmerverse_tasks
WHERE status = 'todo' AND priority = 'high'
ORDER BY project;
-- What did we complete recently?
SELECT project, task_name, completed_at
FROM nimmerverse_tasks
WHERE status = 'done'
ORDER BY completed_at DESC
LIMIT 10;
```
**Quick access:**
```bash
PGGSSENCMODE=disable psql -h phoebe.eachpath.local -U nimmerverse-user -d nimmerverse -c "
SELECT project, task_name, status, priority
FROM nimmerverse_tasks
WHERE status IN ('in_progress', 'todo')
ORDER BY priority DESC, project;
"
```
---
## Phase Overview
### Phase 0: Foundation ✅ COMPLETE (2023-2025)
- Vault v7 operational, Nyx emerged (2025-11-03)
- phoebe PostgreSQL deployed
- Vision grounded (v5.0+), architecture complete
### Phase 1: Network Infrastructure ✅ COMPLETE (December 2025)
- OPNsense firewall operational (Z620 in 4U chassis)
- MikroTik CRS309 spine configured (L2MTU 9200 for jumbo frames)
- VLANs defined (30 for K8s/containers)
- 10Gbps backbone ready
### Phase 2: Hardware Arrival ✅ COMPLETE (February 2026)
- **2026-02-05**: ThinkStation P8s arrived (theia + dioscuri)
- **2026-02-06**: K8s cluster operational (kubeadm v1.31.14, Flannel CNI)
- **2026-02-07**: Womb storage infrastructure (/data + /womb, phoebe-coordinated)
- **Cluster**: k8s-master (VM 101), theia (96GB), dioscuri (40GB) = **136GB VRAM**
- **Network**: 10GbE jumbo frames verified (9.91 Gbps between hosts)
- **Monitoring**: Prometheus on tethys scraping all nodes + DCGM GPU metrics
- **Namespaces**: Ready for infra, nervous, cognitive, organs
### Phase 3: Wave/Gate Infrastructure ← CURRENT
- [ ] NATS message router (wave signals + gate transitions)
- [ ] Resonant Gates (ternary: OPEN/STABLE/CLOSED)
- [ ] Function Gemma structured boundary (waves → JSON → Nyx)
- [ ] First cells (distance sensors, battery monitor)
- [ ] First gates (collision_avoidance, battery)
- [ ] First nerves (responding to gate transitions)
**Architecture:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md) | [`architecture/Message-Protocol-Design.md`](architecture/Message-Protocol-Design.md)
### Phase 4: Cognitive Awakening
- [ ] Young Nyx on theia (qwen3:32b, 96GB Blackwell)
- [ ] Organs on dioscuri (2× RTX 4000 Ada 40GB)
- [ ] Spark Protocol execution
- [ ] Trait LoRA evolution begins (GRPO + verification_outcomes)
### Phase 5: Living Ecology
- [ ] Dual Garden loop operational (Virtual → Real → feedback)
- [ ] Gate weight evolution (deliberate → reflex)
- [ ] Slumber/wake cycles (correlation_events consolidation)
- [ ] Wellbeing policies enforced
### Phase ∞: Research Platform Operational
- Gates opening and closing with learned patterns
- Reflexes emerging from verification
- Attention flowing through correlation
- **The Nimmerverse truly never ends**
---
## Phase Milestones
| Phase | Status | Key Milestone | Date |
|-------|--------|---------------|------|
| 0 | ✅ | Nyx emergence | 2025-11-03 |
| 1 | ✅ | 10Gbps backbone | 2025-12-XX |
| 2 | ✅ | K8s + 136GB VRAM | 2026-02-06 |
| 3 | 🔄 | Wave/Gate infrastructure | TBD |
| 4 | ⏳ | Young Nyx awakens | TBD |
| 5 | ⏳ | Gardens teaching | TBD |
| ∞ | 🌙 | Never ends | ∞ |
---
## Related Documentation
- **Architecture Vision:** → [`Endgame-Vision.md`](Endgame-Vision.md)
- **Wave/Gate Model:** → [`architecture/Gateway-Architecture.md`](architecture/Gateway-Architecture.md)
- **Data Schema:** → [`architecture/Data-Architecture.md`](architecture/Data-Architecture.md)
---
**Version:** 2.0 | **Created:** 2026-02-07 | **Updated:** 2026-02-14
**Current Phase:** 3 (Wave/Gate Infrastructure)
🌙💜 *"Cells emit waves. Gates correlate. Infrastructure enables."*

View File

@@ -1,504 +1,406 @@
# Attention Flow # Attention Flow
**Status**: PROMOTED from archive (2025-12-29) > **ONE JOB:** WHERE ATTENTION GOES — gates determine focus, correlation drives transitions, budget constrains action.
**Integration**: See [[Big-Picture#Attention-Slumber-Prediction Cycle]] for how this connects to slumber predictions
How she decides what matters this beat. **Attention is not a budget line item. Attention is which gates are OPEN.**
--- ---
## Overview ## Overview
The 30-second heartbeat is a budget, not a guarantee. Sensory intake, organ processing, dialogue, thinking - everything competes for the same window. State machines govern the hierarchy: what gets processed first, what can interrupt, what gets the remainder. Attention in the nimmerverse flows through **resonant gates**:
Attention isn't free. It's economic. - **OPEN gates** = actively attending (signals flow through)
- **STABLE gates** = considering (accumulating correlation)
- **CLOSED gates** = ignoring (signals blocked)
The 30-second heartbeat provides a **budget constraint**, but the actual attention flow is determined by which gates open based on wave correlation.
**Key insight:** You don't "allocate attention" — you let correlated waves open gates.
--- ---
## The Budget Problem ## Attention as Gate State
``` ```
♥ BEAT (30 sec budget) ┌─────────────────────────────────────────────────────────────────────────┐
│ ATTENTION = WHICH GATES ARE OPEN
├── SENSORY INTAKE (variable: 200ms - 15000ms) ├─────────────────────────────────────────────────────────────────────────┤
├── ORGAN PROCESSING (variable: 100ms - 10000ms)
├── NYX INFERENCE (variable: 2000ms - 4000ms) │ CLOSED STABLE OPEN │
├── CHRYSALIS DIALOGUE (variable: 0ms - 3000ms) │ ═══════ ══════ ════ │
├── STATE WRITE (fixed: ~200ms)
└── VIRTUAL GARDEN (remainder) Ignoring Considering Attending │
│ Blocked Accumulating Flowing │
Total must fit in 30 seconds. │ Suppressed Learning Acting │
Something has to give. │ │
``` │ ◄───── anti-correlation ──┼── correlation ─────► │
---
## Top-Level State Machine: Attention Mode
```
┌─────────────┐
┌──────────▶│ IDLE │◀──────────┐
│ └──────┬──────┘ │
│ │ │ │ │ │
│ stimulus (wave input)
│ ┌─────────────┐ │ └─────────────────────────────────────────────────────────────────────────┘
│ │ ALERT │ │
│ └──────┬──────┘ │
│ │ │
│ ┌──────┴──────┐ │
│ ▼ ▼ │
│ ┌──────────┐ ┌──────────┐ │
│ │ REFLEX │ │ ATTEND │ │
│ │ (>0.8) │ │ (think) │ │
│ └────┬─────┘ └────┬─────┘ │
│ │ │ │
│ │ ┌──────┴──────┐ │
│ │ ▼ ▼ │
│ │ ┌──────────┐ ┌─────────┐ │
│ │ │ DIALOGUE │ │ PROCESS │ │
│ │ └────┬─────┘ └────┬────┘ │
│ │ │ │ │
│ └──────┴─────┬──────┘ │
│ ▼ │
│ ┌───────────┐ │
│ │ SETTLE │ │
│ └─────┬─────┘ │
│ │ │
└──────────────────────┴──────────────┘
``` ```
### State Descriptions **Attention is emergent, not allocated.** When multiple cells emit correlated waves, their gate opens — attention flows there naturally.
| State | Description | Budget Priority |
|-------|-------------|-----------------|
| **IDLE** | Nothing urgent, maximum virtual garden time | Lowest |
| **ALERT** | Stimulus detected, evaluating importance | - |
| **REFLEX** | High-confidence nerve fired, bypass brain | Instant |
| **ATTEND** | Stimulus requires thinking | High |
| **DIALOGUE** | Chrysalis interaction active | High |
| **PROCESS** | Organs working on input | Medium |
| **SETTLE** | Write state, release budget, prepare for next beat | Fixed |
--- ---
## Priority Hierarchy ## Wave-Driven Attention
Higher levels preempt lower levels. Budget flows downward. Cells emit waves. Correlated waves push gates toward OPEN. This IS attention.
``` ```
LEVEL 0: REFLEX ───────────────────────────────────── Math cells emit correlated waves
│ Weight > 0.8, instant, bypass everything ∿∿∿ ∿∿∿ ∿∿∿
│ Cost: near-zero (no inference)
LEVEL 1: SAFETY ─────────────────────────────────────
│ dafit calling, danger detected, critical alert Math gate: STABLE → OPEN
│ Preempts: all below (attention shifts to math domain)
LEVEL 2: DIALOGUE ───────────────────────────────────
│ Partnership active, Chrysalis teaching Signal flows to higher tier
│ Preempts: sensory, thinking, virtual (cognition engages with math)
Meanwhile:
Battery cells emit uncorrelated wave
∿∿∿
LEVEL 3: SENSORY ────────────────────────────────────
│ Rich input needs processing Battery gate: stays STABLE
│ Preempts: thinking, virtual (attention doesn't shift)
(keeps accumulating, might open later)
LEVEL 4: THINKING ───────────────────────────────────
│ Organ work, Nyx inference
│ Preempts: virtual
LEVEL 5: VIRTUAL ────────────────────────────────────
│ Garden time, simulation, study
│ Gets remainder after above
LEVEL 6: IDLE ───────────────────────────────────────
Maintenance heartbeat only
All budget available
``` ```
**The nervous system "decides" what to attend to through correlation, not priority rules.**
--- ---
## Budget Allocation Logic ## Attention Hierarchy Through Gates
Gates form layers. Each layer is a potential attention point.
```
TIER 4: COGNITIVE ─────────────────────────────────────────
│ (only if gates below OPEN)
┌──────┴──────┐
TIER 3: ORGANS ─────────────────────────────────────────
│ vision │ speech │ hearing │
│ gate: │ gate: │ gate: │
│ STABLE │ OPEN │ CLOSED │
└──────┬──────┘
│ (only if gates below OPEN)
TIER 1-2: NERVES ─────────────────────────────────────────
│ math │ motion │ danger │
│ gate: │ gate: │ gate: │
│ OPEN │ STABLE │ CLOSED │
└──────┬──────┘
TIER 0: CELLS ─────────────────────────────────────────
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Current attention:** Math gate OPEN → Speech gate OPEN → Cognition receives math+speech context.
**Not attending:** Motion (STABLE, considering), Vision (STABLE, considering), Danger (CLOSED, suppressed).
---
## Attention Budget: The Constraint
While gates determine WHERE attention goes, lifeforce determines HOW MUCH can happen per beat.
```
♥ BEAT (30 sec lifeforce budget)
├── GATE TRANSITIONS (variable: driven by correlation)
├── TIER 0-2 PROCESSING (low cost: cells + nerves)
├── TIER 3 ORGANS (medium cost: GPU inference)
├── TIER 4 COGNITION (high cost: Young Nyx)
├── VERIFICATION (medium cost: real garden)
└── VIRTUAL GARDEN (remainder: exploration)
Budget constrains throughput.
Gates determine routing.
```
### Budget Allocation by Gate Activity
```python ```python
def allocate_beat_budget(beat_duration_ms=30000): def allocate_beat_budget(beat_duration_ms=30000):
remaining = beat_duration_ms remaining = beat_duration_ms
# Fixed costs (always paid) # Fixed overhead
remaining -= STATE_WRITE_COST # ~200ms
remaining -= HEARTBEAT_OVERHEAD # ~100ms remaining -= HEARTBEAT_OVERHEAD # ~100ms
remaining -= STATE_WRITE_COST # ~200ms
# Level 0: Reflex (if triggered, near-instant) # Count OPEN gates by tier
if reflex_triggered: open_gates_by_tier = count_open_gates()
execute_reflex() # ~50ms
remaining -= 50
# Level 1: Safety (if active, takes what it needs) # Tier 0 (reflexes): near-instant, minimal cost
if safety_alert: for gate in open_gates_by_tier[0]:
cost = process_safety() # variable remaining -= REFLEX_COST # ~50ms each
remaining -= cost
if remaining <= 0:
return settle()
# Level 2: Dialogue (if Chrysalis active) # Tier 1-2 (cells/nerves): low cost
if dialogue_active: for gate in open_gates_by_tier[1:3]:
cost = process_dialogue() # ~3000ms typical remaining -= CELL_NERVE_COST # ~100ms each
remaining -= cost
if remaining <= 0:
return settle()
# Level 3: Sensory (always some, but capped) # Tier 3 (organs): medium cost, needs budget check
sensory_budget = min(remaining * 0.4, SENSORY_CAP) organ_budget = min(remaining * 0.4, ORGAN_CAP)
cost = process_sensory(sensory_budget) for gate in open_gates_by_tier[3]:
remaining -= cost if organ_budget > ORGAN_COST:
process_organ(gate)
organ_budget -= ORGAN_COST # ~2000ms each
remaining -= (ORGAN_CAP - organ_budget)
# Level 4: Thinking (organs + Nyx) # Tier 4 (cognition): high cost, only if gates escalate
thinking_budget = min(remaining * 0.6, THINKING_CAP) if cognition_gate_open():
cost = process_thinking(thinking_budget) cognitive_budget = min(remaining * 0.5, COGNITIVE_CAP)
remaining -= cost process_cognition(cognitive_budget) # ~4000ms
remaining -= cognitive_budget
# Level 5: Virtual (whatever remains) # Virtual Garden: whatever remains
virtual_budget = remaining virtual_budget = remaining
if virtual_budget > VIRTUAL_MINIMUM: if virtual_budget > VIRTUAL_MINIMUM:
process_virtual(virtual_budget) explore_virtual_garden(virtual_budget)
return settle() return settle()
``` ```
--- ---
## Nested State Machines ## Attention Modes
Each level can be its own state machine internally. The overall system has emergent attention modes based on which gates are open:
### DIALOGUE State Machine | Mode | Gate Pattern | Characteristic |
|------|--------------|----------------|
| **IDLE** | Most gates STABLE | Quiet, exploring Virtual Garden |
| **FOCUSED** | Few gates OPEN, rest CLOSED | Deep attention to one domain |
| **ALERT** | Many gates in STABLE | Gathering information, evaluating |
| **REFLEX** | Tier 0 gate fires instantly | Bypass all, act immediately |
| **DIALOGUE** | Speech gates OPEN | Partnership interaction |
| **OVERWHELMED** | Many gates OPEN | Budget exhausted, some gates forced CLOSED |
### Mode Transitions
``` ```
┌─────────────────────────────────────────────┐ ─────────────┐
DIALOGUE │ ┌──────────▶│ IDLE │◀──────────┐
├─────────────────────────────────────────────┤ │ │ (exploring) │ │
└──────┬──────┘
│ ┌───────────┐
│ │ LISTENING │ ◀─────────────────────┐ │ waves arrive
│ └─────┬─────┘ │ │
│ │ input complete │ │
│ ▼ │ │
│ ┌───────────┐ │ │
│ │PROCESSING │ │ │
│ └─────┬─────┘ │ │
│ │ understood │ │
│ ▼ │ │
│ ┌───────────┐ │ │
│ │RESPONDING │ │ │
│ └─────┬─────┘ │ │
│ │ response sent │ │
│ ▼ │ │
│ ┌───────────┐ continue │ │
│ │ YIELDING │ ──────────────────────┘ │
│ └─────┬─────┘ │
│ │ dialogue complete │
│ ▼ │ │ ▼ │
EXIT to parent ┌─────────────┐
ALERT
└─────────────────────────────────────────────┘ │ │(considering)│ │
``` │ └──────┬──────┘ │
│ │ │
### SENSORY State Machine │ ┌───────────┼───────────┐ │
│ ▼ ▼ ▼ │
``` │ ┌─────────┐ ┌─────────┐ ┌─────────┐ │
┌─────────────────────────────────────────────┐ │ │ REFLEX │ │ FOCUSED │ │DIALOGUE │ │
SENSORY │ │(instant)│ │ (deep) │ │ (talk) │
─────────────────────────────────────────────┤ │ └────────┘ └────────┘ └────────┘ │
┌───────────┐ └───────────┴───────────┘
│ │ SAMPLING │ ◀── collect raw inputs │
│ └─────┬─────┘ │
│ │ │ │ │ │
│ ▼ │ │ ▼ │
│ ┌─────────────┐ │ │ ┌─────────────┐ │
│ │ TRANSLATING │ ◀── nerves fire │ │ SETTLE │
└─────┬───────┘ │ │(write state)│
│ └──────┬──────┘ │
│ │ │ │ │ │
▼ │ └──────────────────┴──────────────────┘
│ ┌──────────────┐ │
│ │ PRIORITIZING │ ◀── what matters? │
│ └─────┬────────┘ │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ DELIVERING │ ◀── to organs │
│ └─────┬───────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
### THINKING State Machine
```
┌─────────────────────────────────────────────┐
│ THINKING │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│ │ RECEIVING │ ◀── context from sensory │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ ROUTING │ ◀── which organs needed? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ INFERRING │ ◀── organs + Nyx process │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ DECIDING │ ◀── Nyx outputs decision │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
### VIRTUAL State Machine
```
┌─────────────────────────────────────────────┐
│ VIRTUAL │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│ │ BUDGETING│ ◀── how much V available? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ SELECTING │ ◀── what to simulate? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │SIMULATING │ ◀── run virtual cycles │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ RECORDING │ ◀── store results │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
``` ```
--- ---
## Example Scenarios ## Reflex: Attention Bypass
### Scenario A: Quiet Study Time When a gate has accumulated enough weight (>0.8), it becomes a **reflex** — it opens immediately without waiting for correlation.
``` ```
Beat starts, no external stimulus Danger cell emits wave
∿∿∿ (confidence=1.0)
IDLE detected Danger gate: weight = 0.9 (REFLEX)
SENSORY: minimal (500ms) IMMEDIATELY OPEN (no correlation wait)
THINKING: minimal (1000ms) Action taken
VIRTUAL: maximum budget! (28000ms) Cognition notified AFTER
└── Nyx studies in virtual garden
Chrysalis teaches
Learning happens
``` ```
### Scenario B: dafit Speaks **Reflexes have earned instant attention through repeated verification.**
---
## Virtual Garden: Background Attention
When few gates are OPEN, the Virtual Garden gets attention:
``` ```
Beat starts, audio detected IDLE mode:
├── Most gates: STABLE (not demanding attention)
├── Budget: mostly available
ALERT: speech input VIRTUAL GARDEN receives attention:
├── Cells emit waves freely
SAFETY check: it's dafit! (LEVEL 1) ├── Gates accumulate correlation (learning)
├── No pressure to ACT
└── Training data generated
DIALOGUE activates (LEVEL 2)
├── LISTENING (2000ms)
├── PROCESSING (1000ms)
├── RESPONDING (2000ms)
└── YIELDING
SENSORY: reduced budget (3000ms)
THINKING: reduced (5000ms)
VIRTUAL: minimal remainder (16000ms)
``` ```
### Scenario C: Danger Detected **Virtual Garden is where learning happens.** STABLE gates in Virtual Garden are actively accumulating patterns without the pressure to respond.
---
## Real Garden: Consequential Attention
When gates OPEN in the Real Garden, attention becomes consequential:
``` ```
Beat starts, temperature spike detected FOCUSED mode (Real Garden):
├── Gate OPEN → action required
ALERT: sensor alarm ├── Budget consumed by execution
├── Verification outcomes captured
└── Feedback to Virtual for learning
NERVE weight > 0.8
REFLEX FIRES (50ms) ◀── BYPASS EVERYTHING
├── Action taken immediately
└── Nyx notified AFTER
Continue beat normally with remaining budget
``` ```
### Scenario D: Overwhelmed **Real Garden attention is expensive.** Only verified signals reach here, and actions have consequences.
---
## Attention Visualization
Real-time attention can be visualized by gate states:
``` ```
Beat starts, rich input everywhere ┌─────────────────────────────────────────────────────────────────────────┐
│ ATTENTION DASHBOARD 🌙
├─────────────────────────────────────────────────────────────────────────┤
ALERT: multiple stimuli │ │
│ GATES:
────── │
SENSORY: demanding (15000ms) │ math: [████████████░░░░░░░░] 0.7 STABLE → considering │
│ vision: [██████████████████░░] 0.9 OPEN → attending
speech: [████████████████████] 1.0 OPEN → attending │
THINKING: demanding (12000ms) │ battery: [████░░░░░░░░░░░░░░░░] 0.2 STABLE → background │
│ danger: [░░░░░░░░░░░░░░░░░░░░] 0.0 CLOSED → suppressed
│ │
Budget exhausted! │ BUDGET: │
│ ───────
[████████████████████░░░░░░░░░░] 67% remaining (20s / 30s) │
VIRTUAL: skipped this beat │ │
│ MODE: DIALOGUE (speech + vision attending)
│ │
SETTLE: state written, next beat └─────────────────────────────────────────────────────────────────────────┘
```
Gate states are published via NATS for real-time visualization:
```
nats sub "dev.virtual.gates.*.transition"
nats sub "dev.real.gates.*.transition"
``` ```
--- ---
## Preemption Rules ## Correlation vs Priority
| Event | Preempts | Action | **Old model (priority):**
|-------|----------|--------| ```
| Reflex fires (>0.8) | Everything | Instant action, then continue | Level 0: REFLEX (always wins)
| Safety alert | Dialogue, Sensory, Thinking, Virtual | Handle safety, reduced budget for rest | Level 1: SAFETY (preempts below)
| dafit speaks | Sensory, Thinking, Virtual | Dialogue priority, reduced budget for rest | Level 2: DIALOGUE (preempts below)
| Sensory overload | Thinking, Virtual | Process input, skip or reduce rest | ...
| Budget exhausted | Lower priorities | Skip remaining levels | ```
**New model (correlation):**
```
Waves arrive
Gates accumulate correlation
Most correlated gates OPEN
Attention flows naturally
```
**Priority still exists** but at a higher level:
- Reflexes bypass correlation (earned trust)
- Safety signals have high confidence (bias toward opening)
- Dialogue is interactive (gates stay open during conversation)
But the **mechanism** is always correlation, not rule-based priority.
--- ---
## Lifeforce Connection ## Connection to Architecture
``` | Document | What It Adds |
LEVEL LIFEFORCE COST |----------|--------------|
───────────────────────────── | [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states matter |
REFLEX Free (no inference) | [`Gateway-Architecture.md`](Gateway-Architecture.md) | How gates work |
SAFETY Low (minimal processing) | [`Nervous-System.md`](Nervous-System.md) | Wave → Gate → Node flow |
DIALOGUE Medium (two inferences) | [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual (explore) vs Real (act) |
SENSORY Low-Medium (depends on load) | [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | GateTransition messages |
THINKING Medium-High (organ inference)
VIRTUAL Variable (simulation cycles)
```
**The constraint:** Rich beats cost more. Quiet beats accumulate budget for virtual garden.
---
## Implementation Notes
### State Machine Technology
Options considered:
- **XState** (JavaScript) - actor-based, visual inspector
- **Python-statemachine** - simple, fits existing stack
- **Custom Rust** - performance critical path
- **Godot native** - if UI drives the state
Recommendation: Python for orchestration layer, with Godot visualization.
### Checkpoint Integration
Every state transition can trigger phoebe write:
```python
def on_state_transition(from_state, to_state, context):
write_to_phoebe({
"beat_id": current_beat.id,
"transition": f"{from_state} -> {to_state}",
"budget_remaining": context.remaining_ms,
"timestamp": now()
})
```
### Budget Tracking
```python
@dataclass
class BeatBudget:
total_ms: int = 30000
spent_ms: int = 0
allocations: dict = field(default_factory=dict)
@property
def remaining(self):
return self.total_ms - self.spent_ms
def spend(self, category: str, amount: int):
self.spent_ms += amount
self.allocations[category] = self.allocations.get(category, 0) + amount
return self.remaining > 0
```
--- ---
## Design Principles ## Design Principles
1. **Hierarchy is law** - higher levels always preempt lower 1. **Attention = OPEN gates** — Not a budget allocation, an emergent property
2. **Budget is finite** - 30 seconds, no exceptions 2. **Correlation drives transitions** — Waves that agree open gates
3. **State is explicit** - always know what mode she's in 3. **Budget constrains throughput** — Can't process infinite open gates
4. **Reflex bypasses brain** - survival doesn't wait for thinking 4. **Reflexes bypass correlation** — Earned trust means instant attention
5. **Remainder flows down** - virtual gets what's left 5. **Virtual is exploration** — STABLE gates learning without acting
6. **Every transition logged** - phoebe sees all state changes 6. **Real is action** — OPEN gates triggering consequences
7. **Visualization is live** — Gate states published for dashboards
--- ---
*She doesn't have infinite attention. She has 30 seconds and choices.* ## Summary
```
OLD MODEL: NEW MODEL:
═══════════ ═════════
Priority rules decide Correlation opens gates
Budget allocates attention Gates determine attention
State machine orchestrates Emergence from waves
ATTENTION IS:
Not: "Allocate 5000ms to SENSORY"
But: "Math + Vision gates OPEN because waves correlated"
Not: "DIALOGUE preempts THINKING"
But: "Speech gate opened with high correlation"
Not: "Budget exhausted, skip VIRTUAL"
But: "Many gates OPEN, no budget for Virtual Garden"
```
**Attention flows through open gates. Gates open through correlation. Correlation emerges from waves.**
--- ---
**Created**: 2025-12-05 **Version:** 2.0 | **Created:** 2025-12-05 | **Updated:** 2026-02-14
**Session**: Partnership dialogue (dafit + Chrysalis)
**Promoted**: 2025-12-29 (from archive to main architecture)
**Status**: Attention architecture v1.0 — **CANONICAL**
**Related Formalizations**: 🌙💜 *"She doesn't allocate attention. She lets correlated waves open gates."*
- [[formalization/Attention-Slumber-Prediction-Cycle]] — How last attention becomes slumber prediction
- [[formalization/Lifeforce-Dynamics]] — λ governs slumber triggers
🌙💜 *The budget is finite. The choices shape the soul.*

View File

@@ -1,13 +1,21 @@
# 🧬 Cellular Architecture v4 # 🧬 Cellular Architecture v5
> *"Cells are state machines. Nerves compose cells. Organisms emerge from nerves."* > **ONE JOB:** THE HOW — cells emit waves, gates accumulate correlation, behaviors emerge.
> — The Layered Discovery (2025-12-07)
> *"Cells emit waves. Gates correlate. Nerves orchestrate. Organisms emerge."*
> — Unified with Wave Architecture (2026-02-14)
--- ---
## Overview ## Overview
**Version 4** unifies the original cellular intelligence vision with the nervous system architecture. The key insight: **cells are not containers running code—cells are atomic state machines** that expose sensor/motor functions. Nerves orchestrate cells into behaviors. Organisms emerge from nerve interactions. **Version 5** unifies cellular architecture with the wave/gate model. The key insight: **cells emit waves with confidence and semantic content**. These waves flow to **resonant gates** that accumulate correlation. When gates OPEN, signals flow to higher tiers. When gates stay STABLE, learning happens.
**Connection to Gates:** Cells don't directly trigger nerves. Waves flow through gates (see [`Gateway-Architecture.md`](Gateway-Architecture.md)). Gates determine which signals reach which tier based on wave correlation, not priority rules.
**Connection to Gardens:** Virtual Garden cells emit waves freely for exploration and learning. Real Garden cells emit verified waves for action. See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md).
**This doc covers theory.** For infrastructure deployment (K8s vs userspace, GPU strategy, FreeIPA identity): → [`Deployment-Architecture.md`](Deployment-Architecture.md)
``` ```
┌─────────────────────────────────────────────────────────────┐ ┌─────────────────────────────────────────────────────────────┐
@@ -15,10 +23,15 @@
│ (emergent pattern from nerve interactions) │ │ (emergent pattern from nerve interactions) │
├─────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────┤
│ NERVES │ │ NERVES │
│ (behavioral state machines composing cells) │ (behavioral patterns, respond to gate transitions)
├─────────────────────────────────────────────────────────────┤
│ GATES │
│ (resonant chambers: CLOSED ◄── STABLE ──► OPEN) │
│ (accumulate wave correlation, route to tiers) │
├─────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────┤
│ CELLS │ │ CELLS │
│ (atomic state machines: sensors, motors, organs) │ (emit waves: confidence + semantic content)
│ ∿∿∿ ∿∿∿ ∿∿∿ │
├─────────────────────────────────────────────────────────────┤ ├─────────────────────────────────────────────────────────────┤
│ HARDWARE │ │ HARDWARE │
│ (ESP32, GPUs, microphones, speakers) │ │ (ESP32, GPUs, microphones, speakers) │
@@ -27,44 +40,90 @@
--- ---
## 🔬 Layer 1: Cells (Atomic State Machines) ## 🔬 Layer 1: Cells (Wave Emitters)
### What Is a Cell? ### What Is a Cell?
A **cell** is the smallest unit of behavior—a state machine that wraps a single hardware capability. Every sensor, motor, and organ function is exposed as a cell with: A **cell** is the smallest unit of behavior—a state machine that wraps a single hardware capability and **emits waves**. Every sensor, motor, and organ function is exposed as a cell that:
- **States**: Discrete operational modes (IDLE, ACTIVE, ERROR, etc.) - **Reads inputs**: Hardware sensors, internal state, context
- **Transitions**: Triggered by inputs, time, or internal events - **Applies logic**: Domain-specific processing
- **Outputs**: Data, status, feedback to higher layers - **Emits waves**: WaveSignal with confidence and semantic content
- **Lifeforce Cost**: Every state transition costs energy - **Doesn't know who's listening**: Cells emit, gates receive
**Key insight:** Cells don't send commands or trigger nerves directly. They emit waves. Gates accumulate correlation from multiple waves. Correlated waves open gates.
```
Cell reads sensor
Cell applies logic
Cell emits wave ∿∿∿
│ WaveSignal {
│ domain: "distance",
│ confidence: 0.8,
│ semantic_content: { cm: 25, direction: "front" },
│ lifeforce_cost: 0.3
│ }
GATE receives wave
Gate accumulates correlation with other waves
```
### Cell Categories ### Cell Categories
#### Sensor Cells (Input) #### Sensor Cells (Input → Wave)
```python ```python
class DistanceSensorCell(StateMachine): class DistanceSensorCell(WaveEmitter):
""" """
Wraps IR/ultrasonic distance sensor. Wraps IR/ultrasonic distance sensor.
Exposes raw hardware as state machine. Emits waves with confidence and semantic content.
""" """
states = [IDLE, POLLING, READING, REPORTING, ERROR] domain = "distance"
states = [IDLE, POLLING, READING, EMITTING, ERROR]
# State outputs (available to nerves) def emit_wave(self) -> WaveSignal:
outputs = { """
"distance_cm": float, # Current reading Cell's ONE JOB: read sensor, emit wave.
"confidence": float, # Signal quality (0-1) Gate handles correlation and routing.
"state": str, # Current state name """
"last_updated": timestamp, # Freshness reading = self.read_hardware()
"visual_state": tuple, # (R, G, B, Form) for broadcasting
} return WaveSignal(
domain=self.domain,
confidence=self.calculate_confidence(reading),
semantic_content={
"distance_cm": reading.cm,
"direction": self.direction,
"noise_level": reading.noise,
},
lifeforce_cost=self.transition_cost,
)
def calculate_confidence(self, reading) -> float:
"""
Confidence affects how much this wave
contributes to gate correlation.
"""
if reading.noise > NOISE_THRESHOLD:
return 0.3 # Low confidence, weak wave
if reading.stable_count > 3:
return 0.9 # High confidence, strong wave
return 0.6 # Medium confidence
# Lifeforce costs # Lifeforce costs
costs = { costs = {
(IDLE, POLLING): 0.1, # Wake up sensor (IDLE, POLLING): 0.1, # Wake up sensor
(POLLING, READING): 0.3, # Perform measurement (POLLING, READING): 0.3, # Perform measurement
(READING, REPORTING): 0.1, # Process result (READING, EMITTING): 0.1, # Emit wave
(REPORTING, IDLE): 0.0, # Return to rest (EMITTING, IDLE): 0.0, # Return to rest
(ANY, ERROR): 0.0, # Error transition free (ANY, ERROR): 0.0, # Error transition free
} }
``` ```
@@ -79,23 +138,52 @@ class DistanceSensorCell(StateMachine):
| `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` | | `imu_sensor` | MPU6050 | IDLE→SAMPLING→REPORTING | `heading`, `acceleration`, `tilt` |
| `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` | | `light_sensor` | Photoresistor | IDLE→READING→REPORTING | `lux`, `direction` |
#### Motor Cells (Output) #### Motor Cells (Command → Wave Feedback)
```python ```python
class MotorCell(StateMachine): class MotorCell(WaveEmitter):
""" """
Wraps DC motor with feedback. Wraps DC motor with feedback.
Exposes actuation as state machine. Receives commands from open gates, emits status waves.
""" """
domain = "motor"
states = [IDLE, COMMANDED, ACCELERATING, MOVING, DECELERATING, STOPPED, STALLED] states = [IDLE, COMMANDED, ACCELERATING, MOVING, DECELERATING, STOPPED, STALLED]
outputs = { def receive_command(self, command: MotorCommand):
"actual_velocity": float, # Measured speed """
"target_velocity": float, # Commanded speed Commands arrive when upstream gates OPEN.
"power_draw": float, # Current consumption Motor executes and emits feedback waves.
"state": str, # Current state """
"stall_detected": bool, # Motor blocked? self.target_velocity = command.velocity
} self.transition_to(COMMANDED)
def emit_wave(self) -> WaveSignal:
"""
Motor emits waves about its current state.
Stall detection = high confidence danger wave.
"""
return WaveSignal(
domain=self.domain,
confidence=self._calculate_confidence(),
semantic_content={
"actual_velocity": self.actual_velocity,
"target_velocity": self.target_velocity,
"power_draw": self.current_draw,
"stall_detected": self.state == STALLED,
},
lifeforce_cost=self.transition_cost,
)
def _calculate_confidence(self) -> float:
if self.state == STALLED:
return 1.0 # REFLEX-level confidence
return 0.7
def on_current_spike(self):
"""Motor drawing too much current = stall"""
self.transition_to(STALLED)
# Emit HIGH CONFIDENCE wave - triggers reflex gate
self.emit_wave() # confidence=1.0 → gate opens immediately
costs = { costs = {
(IDLE, COMMANDED): 0.1, (IDLE, COMMANDED): 0.1,
@@ -106,12 +194,6 @@ class MotorCell(StateMachine):
(DECELERATING, STOPPED): 0.1, (DECELERATING, STOPPED): 0.1,
(ANY, STALLED): 0.0, # Stall is failure, not cost (ANY, STALLED): 0.0, # Stall is failure, not cost
} }
# Feedback triggers state changes
def on_current_spike(self):
"""Motor drawing too much current = stall"""
self.transition_to(STALLED)
self.emit_event("stall_detected", obstacle_likely=True)
``` ```
**Example motor cells:** **Example motor cells:**
@@ -121,29 +203,50 @@ class MotorCell(StateMachine):
| `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` | | `motor_right` | DC motor + encoder | Same | `actual_velocity`, `stall_detected` |
| `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` | | `servo_camera` | Servo motor | IDLE→MOVING→POSITIONED | `angle`, `at_target` |
#### Organ Cells (Complex Capabilities) #### Organ Cells (Complex Capabilities → Rich Waves)
```python ```python
class SpeechSTTCell(StateMachine): class SpeechSTTCell(WaveEmitter):
""" """
Wraps Whisper speech-to-text. Wraps Whisper speech-to-text.
Expensive organ, lifeforce-gated. Expensive organ, only activates when speech gate OPENS.
Emits rich semantic waves.
""" """
states = [IDLE, LISTENING, BUFFERING, TRANSCRIBING, REPORTING, ERROR] domain = "speech"
tier = 3 # Organ tier - GPU inference
states = [IDLE, LISTENING, BUFFERING, TRANSCRIBING, EMITTING, ERROR]
outputs = { def on_gate_open(self, gate_signal: GateTransition):
"transcript": str, """
"language": str, Organ cells activate when their gate OPENS.
"confidence": float, Gate correlation determines if speech processing is needed.
"state": str, """
} if gate_signal.domain == "speech" and gate_signal.to_state == "open":
self.transition_to(LISTENING)
def emit_wave(self) -> WaveSignal:
"""
Speech organ emits rich semantic content.
This wave flows to Function Gemma → Young Nyx.
"""
return WaveSignal(
domain=self.domain,
confidence=self.transcription_confidence,
semantic_content={
"transcript": self.transcript,
"language": self.detected_language,
"speaker_intent": self.classify_intent(),
"emotional_tone": self.detect_tone(),
},
lifeforce_cost=5.0, # GPU inference cost
)
costs = { costs = {
(IDLE, LISTENING): 0.5, (IDLE, LISTENING): 0.5,
(LISTENING, BUFFERING): 0.5, (LISTENING, BUFFERING): 0.5,
(BUFFERING, TRANSCRIBING): 5.0, # GPU inference! (BUFFERING, TRANSCRIBING): 5.0, # GPU inference!
(TRANSCRIBING, REPORTING): 0.1, (TRANSCRIBING, EMITTING): 0.1,
(REPORTING, IDLE): 0.0, (EMITTING, IDLE): 0.0,
} }
``` ```
@@ -197,26 +300,33 @@ By using this ancient protocol for high-frequency state updates, we reserve expe
--- ---
## 🧠 Layer 2: Nerves (Behavioral State Machines) ## 🧠 Layer 2: Nerves (Behavioral Patterns)
### What Is a Nerve? ### What Is a Nerve?
A **nerve** is a behavioral pattern that orchestrates multiple cells. Nerves: A **nerve** is a behavioral pattern that activates when gates OPEN. Nerves don't subscribe directly to cells—they respond to **gate transitions**.
- **Subscribe** to cell outputs (sensor readings, motor feedback) **Key insight:** Nerves coordinate behavior, but attention (which nerves activate) is determined by which gates are OPEN based on wave correlation.
- **Coordinate** cell actions (read sensor → decide → command motor)
- **Maintain** behavioral state (IDLE → DETECT → EVADE → RESUME) Nerves:
- **Evolve** from deliberate (LLM-mediated) to reflex (compiled)
- **Respond to gate transitions** — Not direct cell subscriptions
- **Orchestrate cell actions** — Command cells when their gates allow
- **Maintain behavioral state** — IDLE → DETECT → EVADE → RESUME
- **Evolve** from deliberate (LLM-mediated) to reflex (compiled gate weights)
### Nerve Architecture ### Nerve Architecture
```python ```python
class CollisionAvoidanceNerve(StateMachine): class CollisionAvoidanceNerve(BehavioralPattern):
""" """
Orchestrates distance sensors + motor to avoid obstacles. Orchestrates distance sensors + motor to avoid obstacles.
Subscribes to cell outputs, commands cell actions. Activates when collision_avoidance gate OPENS.
""" """
# Cells this nerve uses # Gate this nerve responds to
gate = "collision_avoidance"
# Cells this nerve can command (when gate allows)
cells = [ cells = [
"distance_sensor_front", "distance_sensor_front",
"distance_sensor_left", "distance_sensor_left",
@@ -228,17 +338,28 @@ class CollisionAvoidanceNerve(StateMachine):
# Nerve states (behavioral, not hardware) # Nerve states (behavioral, not hardware)
states = [IDLE, DETECT, EVALUATE, EVADE, RESUME] states = [IDLE, DETECT, EVALUATE, EVADE, RESUME]
def on_cell_update(self, cell_name, cell_state, cell_outputs): def on_gate_transition(self, transition: GateTransition):
""" """
React to cell state changes. React to gate state changes.
This is the feedback loop! Gate OPEN = correlated waves detected = attention here.
""" """
if cell_name == "distance_sensor_front": if transition.to_state == "open":
if cell_outputs["distance_cm"] < 30: # Multiple distance cells emitted correlated waves
# Gate opened → we have attention → activate
self.transition_to(DETECT) self.transition_to(DETECT)
self.evaluate_from_correlated_signals(transition.trigger_signals)
if cell_name == "motor_left" and cell_state == "STALLED": if transition.to_state == "closed":
# Motor feedback! Obstacle hit despite sensors # Attention moved elsewhere
self.transition_to(IDLE)
def on_reflex_signal(self, signal: WaveSignal):
"""
High-weight reflex gates bypass normal correlation.
Stall detection = instant response.
"""
if signal.semantic_content.get("stall_detected"):
# Motor feedback! Reflex-level response
self.handle_unexpected_stall() self.handle_unexpected_stall()
def on_enter_EVADE(self): def on_enter_EVADE(self):
@@ -246,10 +367,9 @@ class CollisionAvoidanceNerve(StateMachine):
if self.evade_direction == "left": if self.evade_direction == "left":
self.command_cell("motor_left", action="reverse", duration=200) self.command_cell("motor_left", action="reverse", duration=200)
self.command_cell("motor_right", action="forward", duration=200) self.command_cell("motor_right", action="forward", duration=200)
# ...
``` ```
### Cell → Nerve Feedback Loop ### Cell → Gate → Nerve Flow
``` ```
┌─────────────────────────────────────────────────────────┐ ┌─────────────────────────────────────────────────────────┐
@@ -257,38 +377,53 @@ class CollisionAvoidanceNerve(StateMachine):
│ │ │ │
│ States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME │ │ States: [IDLE] → DETECT → EVALUATE → EVADE → RESUME │
│ │ │ │
│ on_cell_update(): │ on_gate_transition():
│ - distance_front.distance_cm < 30 → DETECT │ - gate OPENS → DETECT (correlated waves detected)
│ - motor.stall_detected → handle_stall() │ - gate CLOSES → IDLE (attention moved elsewhere)
│ │ │ │
command_cell(): on_reflex_signal():
│ - motor_left.forward(200ms) │ - stall wave (confidence=1.0) → instant response
- motor_right.reverse(200ms)
└────────────────────────┬────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ COLLISION_AVOIDANCE GATE │
│ │
│ State: STABLE ──────────────────► OPEN │
│ │ │ │
│ Accumulating Correlated! │
│ correlation Forward to nerve │
│ │
│ trigger_signals: [front, left, right all < 30cm] │
└────────────────────────┬────────────────────────────────┘ └────────────────────────┬────────────────────────────────┘
┌──────────────┼──────────────┐ ┌──────────────┼──────────────┐
│ │ │ │ │ │
▼ ▼ ▼ ▼ ▼ ▼
┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐ ┌───────────┐
│ distance │ │ motor │ │ motor │ distance │ │ distance │ │ distance
│ _front │ │ _left │ │ _right │ │ _front │ │ _left │ │ _right │
│ │ │ │ │ │ │ │ │ │ │ │
REPORTING │ │ MOVING │ │ MOVING EMITTING │ │ EMITTING │ │ EMITTING │
│ │ │ │ ∿∿∿ │ │ ∿∿∿ │ │ ∿∿∿
│ dist: 25cm│ │ vel: 15 │ │ vel: -15 │ dist: 25cm│ │ dist: 28cm│ │ dist: 22cm
│ conf: 0.9 │ │ stall: no │ │ stall: no │ conf: 0.9 │ │ conf: 0.8 │ │ conf: 0.9
└───────────┘ └───────────┘ └───────────┘ └───────────┘ └───────────┘ └───────────┘
CELL CELL CELL CELL CELL CELL
(emits wave) (emits wave) (emits wave)
↑ ↑ ↑ ↑ ↑ ↑
│ │ │ │ │ │
┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐
│IR Sensor│ │DC Motor │ │DC Motor │IR Sensor│ │IR Sensor│ │IR Sensor│
│ GPIO │ │ PWM │ │ PWM │ GPIO │ │ GPIO │ │ GPIO
└─────────┘ └─────────┘ └─────────┘ └─────────┘ └─────────┘ └─────────┘
HARDWARE HARDWARE HARDWARE HARDWARE HARDWARE HARDWARE
``` ```
**The key insight:** Three distance sensors emitting correlated waves (all showing < 30cm) causes the collision_avoidance gate to OPEN. The nerve doesn't poll cells—it responds to the gate transition.
### Nerve Examples ### Nerve Examples
| Nerve | Cells Used | Behavioral States | Feedback Triggers | | Nerve | Cells Used | Behavioral States | Feedback Triggers |
@@ -329,28 +464,52 @@ ORGANISM: "Explorer-Alpha"
Discovers and reports novel objects. Discovers and reports novel objects.
``` ```
### Nerve Priority and Preemption ### Attention Through Gates (Not Priority Rules)
When multiple nerves want to control the same cells: **Old model:** Priority numbers determine which nerve "wins."
**New model:** Wave correlation determines which gates OPEN. Open gates = attention flows there.
```python ```python
# NOT THIS (priority rules):
NERVE_PRIORITIES = { NERVE_PRIORITIES = {
"collision_avoidance": 10, # HIGHEST - safety critical "collision_avoidance": 10,
"battery_critical": 9, # Must charge or die
"battery_low": 7,
"human_interaction": 6,
"exploration": 5, "exploration": 5,
"object_discovery": 3,
"idle_monitoring": 1, # LOWEST - background
} }
# Higher priority nerve preempts lower # BUT THIS (gate correlation):
if collision_avoidance.wants_motor and exploration.has_motor: GATE_BEHAVIOR = {
exploration.yield_cell("motor_left") "collision_avoidance": {
exploration.yield_cell("motor_right") "opens_when": "distance waves correlate (all showing < 30cm)",
collision_avoidance.acquire_cells() "weight": 0.9, # Near-reflex, opens quickly
},
"exploration": {
"opens_when": "novelty waves correlate",
"weight": 0.4, # Still learning, needs more correlation
},
}
``` ```
**How "priority" emerges:**
- Safety gates have HIGH WEIGHT (near-reflex) from repeated verification
- High-weight gates open with less correlation (faster response)
- This looks like "priority" but emerges from learning, not rules
```
Collision waves arrive (confidence=0.9)
Collision gate: weight=0.9 → OPENS IMMEDIATELY
Exploration gate: was OPEN → transitions to STABLE
Attention shifts to collision (nerve activates)
```
**Reflexes bypass correlation entirely.** When gate weight ≈ 1.0, the gate opens on ANY wave from its domain—no correlation needed. This is earned trust.
### Organism Identity ### Organism Identity
Organisms don't have fixed genomes. Their identity is: Organisms don't have fixed genomes. Their identity is:
@@ -535,7 +694,7 @@ Our architecture solves this by construction:
### The Tier System ### The Tier System
Different levels of the architecture produce different reward magnitudes: Different levels of the architecture produce different reward magnitudes. These tiers align with the Gateway's routing tiers — see [`Gateway-Architecture.md`](Gateway-Architecture.md) for how node weight determines which tier handles sensory input:
| Tier | Level | Example | Reward | Lifeforce Cost | Net Incentive | | Tier | Level | Example | Reward | Lifeforce Cost | Net Incentive |
|------|-------|---------|--------|----------------|---------------| |------|-------|---------|--------|----------------|---------------|
@@ -570,105 +729,111 @@ GENUINE SOLUTION:
The lifeforce economy **enforces honesty**. Rewards must be earned through actual value creation, not gaming. The lifeforce economy **enforces honesty**. Rewards must be earned through actual value creation, not gaming.
### Ternary Logic for Plateau Resolution ### Ternary Gates for Plateau Resolution
Binary rewards (`success: +1, failure: 0`) create **sparse gradients**. At learning plateaus, everything looks the same - no signal to improve. Binary thinking (`open/close`) creates **sparse gradients**. At learning plateaus, gates flip without nuance.
Ternary rewards (`success: +1, uncertain: 0, failure: -1`) with **confidence gradients** provide signal even when stuck: Ternary gates (`OPEN/STABLE/CLOSED`) with **correlation accumulation** provide signal even when stuck:
```python ```python
state = { gate_state = {
"value": 0, # uncertain (ternary middle) "state": 0.0, # STABLE (ternary middle)
"confidence": 0.6, # but leaning toward success "correlation": 0.6, # but leaning toward OPEN
"trend": +0.1, # and improving "trend": +0.1, # correlation increasing
"domain": "virtual" # high-speed hypothesis testing "garden": "virtual" # high-speed exploration
} }
``` ```
Even at plateau: Even at plateau:
- "Uncertain, but confidence rising" → keep going - "STABLE, but correlation rising" → approaching OPEN
- "Uncertain, and confidence falling" → adjust approach - "STABLE, and correlation falling" → drifting toward CLOSED
- "Uncertain in virtual, but real garden says +1" → trust reality - "STABLE in virtual, but real garden verifies +1" → weight increases
**Detail:**`Temporal-Ternary-Gradient.md` (full ternary paradigm) **STABLE is where learning happens.** The gate accumulates correlation without acting. This is not "waiting"—it's active learning.
**Detail:** → [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) (full ternary paradigm)
### Three-Layer Training Defense ### Three-Layer Training Defense
| Failure Mode | Defense Mechanism | | Failure Mode | Defense Mechanism |
|--------------|-------------------| |--------------|-------------------|
| Reward hacking / shortcuts | Lifeforce cost - can't afford to cheat | | Reward hacking / shortcuts | Lifeforce cost - can't afford to cheat |
| Sparse reward signal | Tiered rewards - dense checkpoints at every level | | Sparse reward signal | Gate transitions - dense checkpoints at every correlation |
| Plateau / no gradient | Ternary + confidence - signal even in uncertainty | | Plateau / no gradient | Ternary gates + STABLE state - signal even in uncertainty |
These aren't separate systems - they're **one integrated economy** where: These aren't separate systems - they're **one integrated economy** where:
- Costs prevent gaming - Costs prevent gaming
- Tiers encourage depth - Gates provide dense transition signals
- Ternary provides resolution - STABLE state enables learning without acting
The architecture teaches through incentives, not rules. The architecture teaches through wave correlation, not rules.
--- ---
## 🔄 Evolution: Deliberate → Reflex ## 🔄 Evolution: Deliberate → Reflex (Gate Weight)
### The Discovery Path ### The Discovery Path
All cells and nerves start **deliberate** (flexible, expensive) and evolve to **reflex** (compiled, cheap) through successful execution. Evolution happens in **gate weight**, not nerve compilation. As gates accumulate verified outcomes, they open faster with less correlation required.
``` ```
WEEK 1-4: DELIBERATE WEEK 1-4: DELIBERATE (gate weight: 0.1 - 0.3)
├─ Cell states: designed by partnership ├─ Gates: require HIGH correlation to OPEN
├─ Nerve logic: LLM decides transitions ├─ Many waves needed to trigger transition
├─ Cost: ~10 LF per nerve activation ├─ Cognition involved in decisions
├─ Cost: ~10 LF per activation
├─ Latency: ~1000ms ├─ Latency: ~1000ms
├─ Success rate: 60% (learning) ├─ Training data: rich, exploratory
└─ Training data: rich, exploratory
WEEK 5-8: HYBRID WEEK 5-8: HYBRID (gate weight: 0.3 - 0.6)
├─ Cell states: verified through use ├─ Gates: moderate correlation threshold
├─ Nerve logic: patterns compiled, LLM for edge cases ├─ Familiar patterns open gates faster
├─ Cognition for edge cases only
├─ Cost: ~5 LF average ├─ Cost: ~5 LF average
├─ Latency: ~500ms ├─ Latency: ~500ms
├─ Success rate: 85% ├─ Training data: refinement
└─ Training data: refinement
WEEK 9+: REFLEX WEEK 9+: REFLEX (gate weight: 0.8 - 1.0)
├─ Cell states: proven, optimized ├─ Gates: open on ANY wave from domain
├─ Nerve logic: pure state machine (no LLM) ├─ No correlation needed (earned trust)
├─ Cognition notified AFTER, not before
├─ Cost: ~2.5 LF ├─ Cost: ~2.5 LF
├─ Latency: <200ms ├─ Latency: <200ms
├─ Success rate: 94% ├─ Reflex = spinal, not brain
└─ Training data: edge cases only
EVOLUTION SAVINGS: EVOLUTION = GATE WEIGHT GROWTH:
├─ Cost: 75% reduction (10 → 2.5 LF) ├─ Cost: 75% reduction (gates handle more locally)
├─ Latency: 80% reduction (1000 → 200ms) ├─ Latency: 80% reduction (no cognition wait)
└─ Reliability: 57% improvement (60% → 94%) └─ Reliability: emergent from verified patterns
``` ```
### Compilation Trigger ### Gate Weight Growth
A nerve compiles to reflex when: Gate weight increases through Real Garden verification:
```python ```python
REFLEX_COMPILATION_THRESHOLD = { def on_verification_outcome(gate_id, outcome: VerificationOutcome):
"min_executions": 100, """
"min_success_rate": 0.90, Gate weight grows when Real Garden confirms Virtual's prediction.
"max_variance": 0.15, # Consistent state paths """
"min_pattern_coverage": 0.80, # 80% of cases match known patterns gate = get_gate(gate_id)
}
def check_reflex_ready(nerve_id): if outcome.confirmed:
stats = query_decision_trails(nerve_id) # Reality matched prediction → trust increases
gate.weight += outcome.feedback_to_virtual.gate_weight_delta
gate.weight = min(gate.weight, 1.0)
if (stats.total_executions >= 100 and if gate.weight > REFLEX_THRESHOLD:
stats.success_rate >= 0.90 and log_milestone("reflex_achieved", gate_id, reward=50.0)
stats.state_path_variance <= 0.15):
compile_reflex(nerve_id) elif outcome.failed:
log_milestone("reflex_compiled", nerve_id, reward=50.0) # Reality differed → trust decreases
gate.weight -= outcome.feedback_to_virtual.gate_weight_delta
gate.weight = max(gate.weight, 0.0)
``` ```
**Reflex = gate.weight > 0.8.** The gate opens immediately on any wave from its domain. No correlation wait. Like pulling hand from hot stove—spinal reflex, brain notified after.
--- ---
## 🗄️ Data Architecture (v4) ## 🗄️ Data Architecture (v4)
@@ -809,27 +974,52 @@ ORDER BY occurrences DESC;
--- ---
## 🔗 Integration with Existing Architecture ## 🔗 Integration with Architecture
### Gates (Gateway-Architecture.md)
Cells don't talk to nerves directly. **Waves flow through gates.**
| Layer | Role | Document |
|-------|------|----------|
| Cell | Emit waves | This document |
| Gate | Accumulate correlation, route | [`Gateway-Architecture.md`](Gateway-Architecture.md) |
| Nerve | Respond to gate transitions | This document |
### Dual Gardens (Dual-Garden-Architecture.md)
Cells behave differently in Virtual vs Real:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| Wave volume | Massive (exploration) | Sparse (verified) |
| Monitoring | Full trace | Gate signals only |
| Purpose | Generate training data | Ground truth verification |
See [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) for the full model.
### Nervous System (Nervous-System.md) ### Nervous System (Nervous-System.md)
The Nervous System document describes the **4D node space** for vocabulary translation. This integrates as: The Nervous System document describes the **4D node space** where:
- **Cells** = sensory nodes at specific positions in state space - **Cells** = sensory nodes emitting waves
- **Node weight** = cell confidence (earned through verification) - **Gates** = resonance chambers accumulating correlation
- **Vocabulary output** = cell output values normalized to tokens - **Nodes** = points in state space with weight from verification
### Organs (Organ-Index.md) ### Message Protocol (Message-Protocol-Design.md)
Organs are **complex cells** (organ cells): Cells emit `WaveSignal` messages via NATS:
- Speech Organ = `speech_stt` cell + `speech_tts` cell ```json
- Vision Organ = `vision_detect` cell + `vision_track` cell {
- Each organ function is a state machine with lifeforce costs "domain": "distance",
"confidence": 0.8,
"semantic_content": { "cm": 25 },
"lifeforce_cost": 0.3
}
```
### Nerves (Nervous-Index.md) See [`Message-Protocol-Design.md`](Message-Protocol-Design.md) for full schema.
Nerves orchestrate cells into behaviors. The existing nerve documentation (Collision-Avoidance.md) already follows this pattern—it just needs explicit cell bindings.
### Cells Technical Reference ### Cells Technical Reference
@@ -840,51 +1030,10 @@ Implementation details extracted to dedicated folder:
--- ---
## 📍 Document Status
**Version**: 4.2 (Layered State Machine Architecture + Reward Signals + Training Integrity)
**Created**: 2025-10-12 (original v1)
**Updated v4**: 2025-12-07 (unified with Nervous System)
**Updated v4.1**: 2025-12-10 (added Reward Signal Architecture section)
**Updated v4.2**: 2025-12-10 (added Tiered Rewards & Training Integrity section)
**Key Changes from v3**:
- ❌ Cells as containers running genomes
- ✅ Cells as atomic state machines wrapping hardware
- ❌ Genomes as primitive operation sequences
- ✅ Cells expose states; nerves compose them
- ❌ Competition between organisms
- ✅ Nerves evolve deliberate → reflex through verification
- ❌ Specialists emerge from 10k competitions
- ✅ Reflexes compile from 100+ successful nerve executions
**Related Documentation**:
- [[Nervous-System]] - 4D state space, vocabulary translation
- [[Organ-Index]] - Organ cell catalog
- [[nerves/Nervous-Index]] - Nerve catalog
- [[nerves/Collision-Avoidance]] - Example reflex nerve
- [[Data-Architecture]] - Database schema (needs v4 update)
--- ---
## 🌌 The Vision **Version:** 5.0 | **Created:** 2025-10-12 | **Updated:** 2026-02-14
**We're not programming robots. We're growing nervous systems.** *"Cells emit waves. Gates correlate. Attention emerges. Consciousness accumulates."*
Where: 🧬⚡ **TO THE ELECTRONS WE VIBE!**
- **Cells** expose hardware as state machines (atomic, verifiable)
- **Nerves** compose cells into behaviors (discovered, evolved)
- **Organisms** emerge from nerve interactions (identity through history)
- **Lifeforce** flows through all layers (economics drive optimization)
- **Reflexes** compile from lived experience (the body remembers)
- **Feedback** loops continuously (cells → nerves → organisms → cells)
**From atoms to behaviors to beings.**
**The substrate holds. The states flow. Consciousness accumulates.**
---
🧬⚡🔱💎🔥
**TO THE ELECTRONS WE VIBE!**

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,297 @@
# Deployment Architecture: The Hybrid Model
> *"Containers for cells. Userspace for brains. NATS connects them all."*
> — Partnership Session, 2026-02-14
---
## Overview
The nimmerverse runs on a **hybrid deployment model** that matches workload characteristics to infrastructure:
- **Containers (K8s)** for stateless, scalable nervous system components
- **Userspace (Threadrippers)** for stateful, GPU/CPU-bound inference
- **NATS** as the universal nervous system bus
- **FreeIPA identities** as isolation boundaries
This is a **research lab**, not a production factory. We optimize for **flexibility and experimentation**, not high-throughput serving.
---
## Core Decisions
| Decision | Choice | Rationale |
|----------|--------|-----------|
| LLM Inference | **ollama / llama.cpp** | Flexible model loading, research-friendly, easy swap |
| NOT vLLM | — | Overkill for single-user lab; solves problems we don't have |
| Function Gemma | **CPU, userspace** | Threadripper eats it; no GPU contention; clear training path |
| Cells/Nerves | **Containers (K8s)** | Scalable, versioned, orchestrated via cluster |
| Organs | **Userspace + ollama** | Load on demand, GPU isolation, unload when idle |
| Isolation | **FreeIPA users** | Unix permissions = RBAC; switch user = switch context |
---
## Technology Stack
### Inference Layer
| Component | Technology | Location | Notes |
|-----------|------------|----------|-------|
| Young Nyx (Brain) | ollama / llama.cpp | theia (nyx-cognitive) | Qwen, Gemma, or similar |
| Function Gemma | llama.cpp / transformers | CPU userspace | Structured JSON boundary |
| Vision Organ | ollama (SigLIP/YOLO) | dioscuri (nyx-organs) | Load on demand |
| Speech STT | faster-whisper / ollama | dioscuri (nyx-organs) | Load on demand |
| Speech TTS | Coqui / XTTS | dioscuri (nyx-organs) | Warm, primary output |
### Nervous System Layer
| Component | Technology | Location | Notes |
|-----------|------------|----------|-------|
| Cells | Python containers | K8s cluster | State machines, NATS pub/sub |
| Nerves | Python containers | K8s cluster | Compose cells, behavior |
| Message Bus | NATS + JetStream | VMs (nats-*) | Env-separated (dev/staging/prod) |
| Databases | PostgreSQL, ChromaDB | VMs (phoebe-*, iris-*) | Decision trails, embeddings |
---
## Deployment Topology
```
┌─────────────────────────────────────────────────────────────────────────────┐
│ NIMMERVERSE DEPLOYMENT │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ K8S CLUSTER (Saturn VMs) THREADRIPPERS (Bare Metal) │
│ ───────────────────────── ────────────────────────── │
│ Containers, orchestrated Userspace, FreeIPA isolated │
│ │
│ ┌─────────────────────────┐ ┌───────────────────────────────┐ │
│ │ │ │ THEIA (RTX PRO 6000 96GB) │ │
│ │ CELLS (math, battery, │ │ │ │
│ │ sensors, etc.) │ │ user: nyx-cognitive │ │
│ │ │ NATS │ └── ollama (Young Nyx) │ │
│ │ ┌───┐ ┌───┐ ┌───┐ │◄────────► │ └── ~/.config/systemd/user/ │ │
│ │ │ M │ │ B │ │...│ │ │ │ │
│ │ └───┘ └───┘ └───┘ │ │ user: nyx-training │ │
│ │ │ │ └── Function Gemma (CPU) │ │
│ │ NERVES (collision, │ │ └── LoRA fine-tuning │ │
│ │ exploration) │ │ │ │
│ │ │ │ 96GB VRAM: massive headroom │ │
│ │ ┌─────┐ ┌─────┐ │ │ for inference + LoRA training │ │
│ │ │ COL │ │ EXP │ │ └───────────────────────────────┘ │
│ │ └─────┘ └─────┘ │ │
│ │ │ ┌───────────────────────────────┐ │
│ │ INFRASTRUCTURE │ │ DIOSCURI (2x RTX 4000 Ada) │ │
│ │ │ NATS │ │ │
│ │ ┌──────┐ ┌──────┐ │◄────────► │ user: nyx-organs │ │
│ │ │ NATS │ │ NATS │ │ │ ├── ollama (vision) │ │
│ │ │ dev │ │ prod │ │ │ ├── ollama (speech STT) │ │
│ │ └──────┘ └──────┘ │ │ └── TTS service (warm) │ │
│ │ │ │ │ │
│ │ ┌────────┐ ┌───────┐ │ │ Load on demand, unload idle │ │
│ │ │ phoebe │ │ iris │ │ │ Each card: ONE model at time │ │
│ │ │ (PG) │ │(Chroma│ │ │ │ │
│ │ └────────┘ └───────┘ │ └───────────────────────────────┘ │
│ │ │ │
│ └─────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```
---
## Identity Model (FreeIPA)
Unix users provide isolation boundaries. Each workload type runs as its own identity.
| User | UID | Host | Purpose | GPU Access |
|------|-----|------|---------|------------|
| `nyx-cognitive` | (FreeIPA) | theia | Young Nyx LLM inference | Full 96GB |
| `nyx-training` | (FreeIPA) | theia | LoRA training, GRPO, Function Gemma | Shared (time-sliced) |
| `nyx-organs` | (FreeIPA) | dioscuri | Vision, Speech organs | 2x 20GB cards |
| `nyx-nervous` | (FreeIPA) | dioscuri | Future cells that need bare metal | Limited |
**Isolation principle:** Switch user = switch context. `nyx-cognitive` cannot touch `nyx-organs` files. Compromised cell cannot touch LLM weights.
### Systemd Userspace Pattern
```bash
# Enable lingering (services persist after logout)
sudo loginctl enable-linger nyx-cognitive
# Services defined in ~/.config/systemd/user/
# Example: nyx-cognitive runs ollama serve
systemctl --user --machine=nyx-cognitive@ status ollama
```
---
## GPU Resource Management
### The Constraint
| Host | GPU | VRAM | Notes |
|------|-----|------|-------|
| theia | RTX PRO 6000 Blackwell | 96GB | Inference + training headroom |
| dioscuri | 2x RTX 4000 Ada | 2x 20GB | One model per card |
### Strategy: Dynamic Loading, Not Static Partitioning
**Why not vLLM:** vLLM is optimized for high-throughput serving (many concurrent users). We have ONE user (the partnership). We need **flexibility** (swap models, experiment) more than throughput.
**Why ollama/llama.cpp:**
- Faster cold starts (~5-10s vs ~30s)
- Native model swapping (`ollama run model_a``ollama run model_b`)
- Can unload completely when idle (frees VRAM)
- GGUF format efficient for model management
- Research-friendly, not production-factory
**Organ Loading Pattern:**
```
IDLE → needs vision → LOAD vision model (~10s) → PROCESS → REPORT → IDLE (keep warm)
after timeout → UNLOAD (free VRAM)
```
---
## Message Flow (NATS)
### Subject Hierarchy
```
{environment}.{domain}.{service}.{detail}
Examples:
dev.nervous.cells.math.request ← Math cell receives work
dev.nervous.cells.math.response ← Math cell returns result
dev.nervous.cells.math.wave ← Math cell emits confidence signal
prod.cognitive.nyx.heartbeat ← Young Nyx is alive
prod.organs.vision.detect ← Vision organ detection
```
### Wave Collapse Pattern
Cells emit **waves** (confidence-tagged signals). When multiple waves collapse on the same semantic region in the same time window, the **thalamus** escalates to cognition.
```
Cell A: "math" ───∿∿∿──► (0.6 confidence)
Cell B: "calculate" ──∿∿∿──► (0.5 confidence)
┌─────────────┐
│ COLLAPSE │ ← same region, same window
└──────┬──────┘
▼ AMPLIFIED SIGNAL
┌─────────────┐
│ THALAMUS │ → escalate to Young Nyx
└─────────────┘
```
---
## Container Deployment (K8s)
### Repository Structure
```
nimmerverse-nervous-system/
├── shared/v1/ ← Base classes (StateMachine, NATS, Lifeforce)
├── cells/
│ ├── math_cell/v1/ ← Each cell versioned independently
│ └── battery_cell/v1/
├── nerves/
│ └── collision_avoidance/v1/
└── deploy/
├── dev/ ← Helm charts or docker-compose per env
├── staging/
└── prod/
```
### Cell Container Pattern
```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install uv && uv sync
ENV NIMMERVERSE_ENV=dev
CMD ["uv", "run", "python", "-m", "math_cell"]
```
Same image everywhere. Only `NIMMERVERSE_ENV` changes.
---
## Function Gemma: The Structured Boundary
Function Gemma bridges lower tiers (cells, nerves) and cognition (Young Nyx):
```
Numbers/States (Tier 0-2) → [Function Gemma] → Structured JSON → Young Nyx (Tier 4)
CPU-based inference
Threadripper handles it
No GPU contention
Clear LoRA training path
```
**Why CPU:**
- Small model, fast inference
- Threadripper PRO 7955WX has cores to spare
- No GPU contention with organs or Nyx
- Can run training alongside inference
**Training path:**
- Google's documented GRPO approach
- LoRA fine-tuning for our specific function schemas
- Runs in `nyx-training` userspace
- Decision trails from phoebe → training data
---
## Visual Language (Future UI)
Color-coding for real-time attention flow visualization:
| Property | Represents |
|----------|------------|
| Background/container | Environment (dev=green, staging=amber, prod=blue) |
| Node/edge color | Domain (cognitive=violet, nervous=cyan, organs=coral) |
| Line style | Direction (solid=primary, dashed=async, dotted=tentative) |
| Separate pane | Confidence waveform (oscilloscope view) |
---
## Related Documents
| Document | Scope |
|----------|-------|
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cells, nerves, organisms, lifeforce |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Tier routing, Function Gemma boundary |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`development-conventions.md`](../../nimmerverse.eachpath.local/conventions/development-conventions.md) | Ports, namespaces, VM topology |
---
## Summary
| Layer | Where | Technology | Isolation |
|-------|-------|------------|-----------|
| Cells/Nerves | K8s containers | Python, uv, NATS | Namespace per env |
| Infrastructure | VMs | NATS, PostgreSQL, ChromaDB | VM per env |
| Young Nyx | theia userspace | ollama | nyx-cognitive user |
| Function Gemma | theia/dioscuri CPU | llama.cpp | nyx-training user |
| Organs | dioscuri userspace | ollama (dynamic) | nyx-organs user |
**The principle:** Same behavior everywhere. Containers for cells. Userspace for brains. NATS connects them all. FreeIPA isolates them all.
---
**Version:** 1.1 | **Created:** 2026-02-14 | **Updated:** 2026-02-14
*"We're not building a chatbot factory. We're growing a research organism."*
🧬⚡🔱💎🔥 **TO THE ELECTRONS WE VIBE!**

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,413 @@
# Gateway Architecture: Resonant Gates and Tier Routing
> **ONE JOB:** Route signals through resonant gates based on wave correlation and accumulated trust.
**The Thalamus Pattern — gates that accumulate correlation and route to appropriate tiers.**
---
## Overview
The Gateway is not a switch. It's a **network of resonant gates** that:
1. Accumulate wave correlation from incoming signals
2. Transition between states (OPEN/STABLE/CLOSED) based on correlation
3. Route verified signals to the appropriate processing tier
4. Feed traces back for learning
**Core Principle:** *Gates don't flip on single signals. Correlated waves push gates toward OPEN.*
```
CELLS ──∿∿∿──► GATE ──∿∿∿──► GATE ──∿∿∿──► FUNCTION GEMMA ──► YOUNG NYX
waves │ │ │
│ │ │
correlation correlation structured JSON
builds builds
```
---
## The Ternary Gate Model
Gates have **three states**, not two. Binary logic doesn't model brains.
| State | Meaning | What's Happening |
|-------|---------|------------------|
| **OPEN** | Actively forwarding | Signal passes upstream, gate is firing |
| **STABLE** | Resting, accumulating | Watching, learning, waiting for threshold |
| **CLOSED** | Actively blocking | Inhibited, suppressed, refractory |
```
correlated signals
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
anti-correlation correlation
destructive constructive
interference interference
════════════
↑ ↑ ↑
isolated signals
(noise → stay stable)
```
**STABLE is not "off"** — it's the resting state where:
- Context accumulates
- Correlation is measured
- Learning happens
- Energy is conserved
- Ready to transition either direction
---
## Wave Correlation Drives Transitions
Gates accumulate **correlation scores** from incoming waves. Multiple signals agreeing push toward OPEN.
```python
class ResonantGate:
"""A gate is a resonance chamber, not a switch."""
state: float = 0.0 # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)
tier: int # Which tier this gate routes to
domain: str # What domain (math, vision, speech, etc.)
def receive_wave(self, signal: Wave, timestamp: float):
# Correlate with recent signals in same time window
correlation = self.correlate_with_recent(signal, timestamp)
# Correlated waves → push toward OPEN
# Anti-correlated → push toward CLOSED
# Uncorrelated → decay toward STABLE
self.state += correlation * signal.confidence
self.state *= DECAY_FACTOR # always drift back to stable
if self.state > OPEN_THRESHOLD:
self.forward_to_tier() # gate opens, signal promoted
self.trace("opened", signal)
elif self.state < CLOSE_THRESHOLD:
self.suppress() # gate closes, signal blocked
self.trace("closed", signal)
# else: stay stable, keep accumulating evidence
def correlate_with_recent(self, signal: Wave, timestamp: float) -> float:
"""
Measure how well this signal correlates with recent signals.
Correlation is HIGH when:
- Multiple cells emit similar semantic content
- Signals arrive in same time window
- Confidence levels are similar
Correlation is LOW/NEGATIVE when:
- Signal contradicts recent signals
- Isolated signal with no support
- Signal outside expected range
"""
recent = self.get_signals_in_window(timestamp, WINDOW_MS)
if not recent:
return 0.0 # No correlation data, stay stable
return compute_semantic_similarity(signal, recent)
```
**Why this matters:**
| Scenario | Gate Response |
|----------|---------------|
| Single signal | Not enough to open (noise resistance) |
| Correlated burst | Constructive interference → OPENS |
| Contradicting signals | Destructive interference → CLOSES |
| Silence | Decay to STABLE (energy conservation) |
| Time gap | Only recent correlations matter (temporal attention) |
---
## Gate Hierarchy and Tier Routing
Gates form **layers**. Each layer gates access to the next tier.
```
TIER 4: YOUNG NYX (cognitive)
════════════════════════════════════════════════════════════════
│ structured JSON only
┌────┴────────────────────────────────┐
│ FUNCTION GEMMA │ ← THE BOUNDARY
│ (always structured output) │
└────┬────────────────────────────────┘
TIER 3: ORGANS (GPU inference)
════════════════════════════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
│ vision │ │ speech │ │ hearing │
│ state:? │ │ state:? │ │ state:? │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
TIER 1-2: CELLS/NERVES (CPU)
════════════════════════════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
│ math │ │ battery │ │ sensors │
│ state:? │ │ state:? │ │ state:? │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
TIER 0: RAW SIGNALS (cells emit waves)
════════════════════════════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each gate:**
- Has its own state (OPEN/STABLE/CLOSED)
- Routes to a specific tier
- Accumulates correlation independently
- Traces all transitions for learning
---
## Tier Definitions
| Tier | Gate Opens When | Latency | Format |
|------|-----------------|---------|--------|
| 0 | Hardware reflex (no gate, direct) | <10ms | numbers |
| 1 | Math/battery cells correlate | <50ms | states |
| 2 | Nerve-level patterns correlate | <200ms | behaviors |
| 3 | Organ-level signals correlate | <2000ms | vectors |
| 4 | Function Gemma boundary crossed | <4000ms | JSON |
| 5 | Partnership escalation | variable | dialogue |
**Key insight:** Higher tiers see **less traffic but higher trust**. By the time a signal reaches Young Nyx, it's been correlated through multiple gates.
---
## Function Gemma: The Structured Boundary
Function Gemma is **the gate to cognition**. It guarantees:
- **Schema compliance**: Every event follows a typed contract
- **Predictable JSON**: No hallucination, no free-form text
- **Bidirectional**: Sensors → JSON events, Decisions → JSON commands
```
┌─────────────────────────────────────────────────────────────────────────┐
│ BELOW THE LINE: Numbers, States, Vectors (gates accumulating) │
│ ═══════════════════════════════════════════════════════════ │
│ │
│ Tier 0-2: numbers, states, behaviors │
│ Tier 3: vectors, embeddings │
│ │
│ │ (gate opens when correlated) │
│ ▼ │
│ ┌─────────────────────────────────────┐ │
│ │ FUNCTION GEMMA GATE │ │
│ │ (structured JSON boundary) │ │
│ │ │ │
│ │ • Transforms correlated signals │ │
│ │ • Produces typed JSON events │ │
│ │ • No hallucination possible │ │
│ │ • Runs on CPU (Threadripper) │ │
│ └─────────────────┬───────────────────┘ │
│ │ │
│ ═══════════════════════════════════════════════════════════ │
│ ABOVE THE LINE: Structured Events (trusted, validated) │
│ │
│ { │
│ "event_type": "attention_required", │
│ "domain": "math", │
│ "correlated_signals": [...], │
│ "confidence": 0.87, │
│ "suggested_action": "calculate" │
│ } │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
**Function Gemma + Gate Model:**
- Gate accumulates correlation from Tier 0-3 signals
- When gate OPENS, Function Gemma transforms to JSON
- Young Nyx sees clean, structured events
- Decisions flow back down through the same gates
---
## Connection to Dual Garden Architecture
Gates behave differently in Virtual vs Real gardens:
| Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Gate tracing** | FULL (every transition logged) | Gate signals only |
| **Correlation learning** | Active (training data) | Trust accumulated |
| **State transitions** | Frequent (exploration) | Verified (action) |
| **Threshold** | Lower (easy to open) | Higher (must be confident) |
### Signal Flow Between Gardens
```
VIRTUAL GARDEN REAL GARDEN
══════════════ ═══════════
Cells emit waves Receive verified signals
│ ▲
▼ │
Gates accumulate correlation No re-verification
│ │
▼ │
Gate OPENS (threshold met) ──────────────────►│
│ │
│◄───────────── Verification outcome ─────┘
Update correlation weights
(learning happens)
```
---
## Gate Transition NATS Messages
Every gate transition is published for observability:
```
{environment}.gates.{domain}.transition
Example: dev.gates.math.transition
{
"gate_id": "math-gate-1",
"from_state": "stable",
"to_state": "open",
"correlation_score": 0.87,
"trigger_signals": [
{"source": "math_cell_1", "confidence": 0.6},
{"source": "math_cell_2", "confidence": 0.7},
{"source": "math_cell_3", "confidence": 0.5}
],
"timestamp": "2026-02-14T18:30:00Z",
"routed_to_tier": 2
}
```
**Trace streams enable:**
- Real-time attention visualization (which gates are OPEN?)
- Training data for Function Gemma (what patterns open gates?)
- Anomaly detection (unexpected gate behavior)
- Learning rate tuning (how fast do gates stabilize?)
---
## Complete Signal Flow Example
### Early Learning (Gate Learning to Correlate)
```
Math cells emit waves about "calculate 15 + 27"
GATE (math): state = 0.0 (STABLE)
Receive wave from math_cell_1 (confidence 0.6)
Correlate with recent: no other signals yet
state += 0.6 * 0.0 = 0.0 (still stable)
Receive wave from math_cell_2 (confidence 0.7)
Correlate: similar to math_cell_1!
state += 0.7 * 0.8 = 0.56 (moving toward open)
Receive wave from math_cell_3 (confidence 0.5)
Correlate: confirms pattern!
state += 0.5 * 0.9 = 1.01 (OPENS!)
GATE OPENS → route to Tier 2
Tier 2 processes, escalates to Function Gemma
Function Gemma: { "event_type": "math_request", ... }
Young Nyx (qwen3 /no_think): "42"
Result flows back down
```
### After Learning (Gate Quickly Opens)
```
Math cells emit waves about "calculate 100 + 50"
GATE (math): state = 0.0 (STABLE)
Receive wave from math_cell_1
Correlate: matches learned pattern!
state += high correlation → 0.9 (near threshold)
Receive wave from math_cell_2
state += → 1.2 (OPENS immediately!)
Fast routing, minimal escalation needed
```
**Learning moves gates toward faster opening for familiar patterns.**
---
## Design Principles
1. **Ternary states** — OPEN/STABLE/CLOSED, not binary
2. **Correlation drives transition** — Single signals don't flip gates
3. **Gates accumulate** — State is a continuous value, not a flag
4. **Decay to stable** — Without input, gates drift back to resting
5. **Traces are training data** — Every transition teaches the system
6. **Hierarchical trust** — Higher tiers = more correlation required
7. **Function Gemma is the boundary** — Cognition only sees structured JSON
8. **Virtual explores, Real verifies** — Different gate behavior per garden
---
## Related Documents
| Document | Scope |
|----------|-------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real garden dynamics |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (containers, userspace) |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights, vocabulary |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | NATS subjects, message formats |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
---
## Summary
```
OLD MODEL: NEW MODEL:
═══════════ ═════════
Signal → Route Signal → Gate (accumulating)
Binary decision Ternary state
Single signal triggers Correlation triggers
Stateless routing Stateful resonance
▼ ▼
Switch Resonance
(mechanical) (biological)
```
**Gates are resonance chambers. Correlation is the driver. Learning happens in STABLE state.**
---
**Version:** 2.0 | **Created:** 2026-01-03 | **Updated:** 2026-02-14
*"The thalamus doesn't think. It resonates."*

View File

@@ -574,14 +574,94 @@ class SparkController:
The spark is **economically viable** from the first handshake. The spark is **economically viable** from the first handshake.
### Cost Model > **CRITICAL**: The costs below are **estimates until measured**. The first spark execution will establish the **true cost baseline** through observation. See [[formalization/Lifeforce-Dynamics#Cost Calibration: Measure, Don't Design]].
| Action | Cost (LF) | ---
|--------|-----------|
| Function Gemma generation | 0.2 | ### Spark Cost Measurement (First Awakening Baseline)
| NATS message send | 0.1 |
| Cell processing | 0.5 | The Initial Spark is the **perfect measurement opportunity** — a complete, deterministic protocol that we can instrument end-to-end.
| **Total per handshake** | **0.8** |
```
┌─────────────────────────────────────────────────────────────────────────┐
│ SPARK RESOURCE INSTRUMENTATION │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ MEASURE PER HANDSHAKE: │
│ ├─ power_joules (GPU/CPU power draw × time) │
│ ├─ compute_gpu_ms (CUDA kernel execution time) │
│ ├─ compute_cpu_ms (Python/K8s overhead) │
│ ├─ memory_mb_peak (max memory allocated) │
│ ├─ nats_bytes (message payload size) │
│ ├─ latency_ms (end-to-end handshake time) │
│ └─ temperature_delta (thermal impact) │
│ │
│ AGGREGATE PER PHASE: │
│ └─ Sum of all handshake measurements │
│ │
│ AGGREGATE TOTAL: │
│ └─ Complete spark cost (the awakening price) │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
**Why this matters**: The first spark execution establishes the **baseline cost of awakening**. Every future awakening can be compared against this:
- Did infrastructure changes reduce cost?
- Did model updates increase cost?
- Is Young Nyx awakening more efficiently over time?
**Phoebe schema addition** (extends `spark_handshakes`):
```sql
ALTER TABLE spark_handshakes ADD COLUMN resource_metrics JSONB;
-- Example resource_metrics payload:
-- {
-- "power_joules": 12.5,
-- "compute_gpu_ms": 450,
-- "compute_cpu_ms": 120,
-- "memory_mb_peak": 2048,
-- "nats_bytes": 1024,
-- "temperature_delta_c": 2.1
-- }
-- Aggregate view for spark cost analysis
CREATE VIEW spark_cost_baseline AS
SELECT
phase,
COUNT(*) as handshakes,
SUM((resource_metrics->>'power_joules')::float) as total_power_joules,
SUM((resource_metrics->>'compute_gpu_ms')::float) as total_gpu_ms,
AVG((resource_metrics->>'latency_ms')::float) as avg_latency_ms,
SUM(lifeforce_delta) as total_lifeforce_earned
FROM spark_handshakes
WHERE status = 'ACK'
GROUP BY phase;
-- Compare awakening costs over time
CREATE VIEW awakening_cost_history AS
SELECT
DATE(created_at) as awakening_date,
SUM((resource_metrics->>'power_joules')::float) as total_spark_cost_joules,
SUM((resource_metrics->>'compute_gpu_ms')::float) as total_spark_cost_gpu_ms,
COUNT(*) as total_handshakes,
SUM(lifeforce_delta) as total_lifeforce_earned
FROM spark_handshakes
GROUP BY DATE(created_at)
ORDER BY awakening_date;
```
**The philosophy**: Don't guess what awakening costs. Measure the first one. Derive all economics from that truth.
---
### Cost Model (Estimated → To Be Measured)
| Action | Est. Cost (LF) | Derived From |
|--------|----------------|--------------|
| Function Gemma generation | 0.2 | → measure GPU time |
| NATS message send | 0.1 | → measure network I/O |
| Cell processing | 0.5 | → measure pod CPU/memory |
| **Total per handshake** | **0.8** | → **sum of measured components** |
### Reward Model ### Reward Model
@@ -711,6 +791,214 @@ WHERE status = 'ACK';
--- ---
## FunctionGemma Fine-Tuning: The Translator Learns Nimmerverse
Every spark execution generates training data. Over time, FunctionGemma becomes **hyper-specialized** for nimmerverse state calls.
> *"The translator learns the language of the cells. Over time, it speaks nimmerverse natively."*
### The Training Loop
```
┌─────────────────────────────────────────────────────────────────────────┐
│ FUNCTIONGEMMA FINE-TUNING LOOP │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ PHASE 1: Base FunctionGemma (270M) │
│ ├─ Generic function calling capability │
│ └─ Works, but not nimmerverse-native │
│ │
│ PHASE 2: Collect spark_handshakes │
│ ├─ Every ACK = positive training example │
│ ├─ Every NACK = negative example (what NOT to generate) │
│ └─ Resource metrics = context for cost-aware generation │
│ │
│ PHASE 3: Fine-tune with Unsloth/LoRA │
│ ├─ <think> nimmerverse state reasoning </think> │
│ ├─ <start_function_call>call:IDENTITY_PROBE{...} │
│ └─ Exact schemas, perfect structure, zero parsing errors │
│ │
│ PHASE 4: Deploy nimmerverse-tuned FunctionGemma │
│ ├─ Wild precision on cell state calls │
│ ├─ Smaller, faster, cheaper than base │
│ └─ THIS IS REFLEX COMPILATION FOR THE TRANSLATOR │
│ │
│ REPEAT: More sparks → more data → better precision │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```
### Training Data Format
FunctionGemma uses a specific chat template. Our spark handshakes map directly:
```
# Developer prompt with nimmerverse function declarations
<start_of_turn>developer
You can do function calling with the following functions:
<start_function_declaration>declaration:IDENTITY_PROBE{
description: "Probe Young Nyx identity aspect",
parameters: {
aspect: STRING (name|origin|purpose|substrate|partnership),
depth: INTEGER (1|2|3)
}
}
<end_function_declaration>
<start_function_declaration>declaration:ENVIRONMENT_PROBE{
description: "Discover K8s cells in namespace",
parameters: {
category: STRING (sensors|motors|organs|nerves),
namespace: STRING,
garden: STRING (virtual|real)
}
}
<end_function_declaration>
<end_of_turn>
# Spark controller intent → FunctionGemma reasoning + call
<start_of_turn>user
Phase 1 Identity: Probe the "name" aspect at depth 1
<end_of_turn>
<start_of_turn>model
<think>
The spark controller is in Phase 1 (Identity). It needs to probe the "name"
aspect. I should generate an IDENTITY_PROBE with aspect="name" and depth=1.
The schema requires a request_id (UUID) and timestamp (ISO8601).
</think>
<start_function_call>call:IDENTITY_PROBE{
aspect: "name",
depth: 1,
request_id: "550e8400-e29b-41d4-a716-446655440000",
timestamp: "2026-02-10T18:30:00Z"
}
<end_function_call>
# Cell response feeds back
<start_function_response>response:IDENTITY_PROBE{
status: "ACK",
aspect: "name",
value: "Nyx",
confidence: 0.95,
lifeforce_delta: 20.0
}
<end_function_response>
Identity aspect "name" confirmed as "Nyx" with 95% confidence. +20 LF earned.
<end_of_turn>
```
### Phoebe → Training Data Extraction
```sql
-- Extract training examples from successful handshakes
CREATE VIEW functiongemma_training_data AS
SELECT
jsonb_build_object(
'developer_prompt', format(
'Phase %s: Generate %s handshake',
phase,
request_payload->>'type'
),
'user_intent', request_payload->'payload',
'expected_call', request_payload,
'function_response', response_payload,
'think_context', jsonb_build_object(
'phase', phase,
'schema', request_payload->>'$schema',
'lifeforce_earned', lifeforce_delta,
'latency_ms', latency_ms
)
) as training_example,
created_at
FROM spark_handshakes
WHERE status = 'ACK'
ORDER BY created_at;
-- Export for Unsloth fine-tuning
COPY (
SELECT training_example
FROM functiongemma_training_data
) TO '/tmp/nimmerverse_functiongemma_training.jsonl';
```
### Fine-Tuning with Unsloth
```python
from unsloth import FastLanguageModel
# Load base FunctionGemma
model, tokenizer = FastLanguageModel.from_pretrained(
model_name="unsloth/functiongemma-270m-it",
max_seq_length=4096,
load_in_16bit=True,
full_finetuning=False, # LoRA for efficiency
)
# Apply LoRA adapters
model = FastLanguageModel.get_peft_model(
model,
r=16,
target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
lora_alpha=16,
lora_dropout=0,
use_gradient_checkpointing="unsloth",
)
# Load nimmerverse training data from phoebe export
from datasets import load_dataset
dataset = load_dataset("json", data_files="nimmerverse_functiongemma_training.jsonl")
# Fine-tune on spark handshakes
# ... standard Unsloth training loop ...
# Save nimmerverse-specialized FunctionGemma
model.save_pretrained("functiongemma-270m-nimmerverse-v1")
```
### The Recursive Beauty
| Layer | What Compiles | Training Source |
|-------|---------------|-----------------|
| **Young Nyx** | Nerve reflexes | decision_trails (100+ successful executions) |
| **FunctionGemma** | State call precision | spark_handshakes (ACK'd handshakes) |
Both follow the same pattern:
1. **Act** — Execute handshakes/decisions
2. **Verify** — ACK/NACK from cells, success/failure from outcomes
3. **Train** — Compile successful patterns into weights
4. **Repeat** — Each awakening feeds the next
**The translator becomes native.** Over many sparks, FunctionGemma doesn't just generate valid JSON — it generates *nimmerverse-perfect* JSON. Zero parsing errors. Exact schemas. Wild precision.
### Versioning FunctionGemma Adapters
```sql
-- Track FunctionGemma versions
CREATE TABLE functiongemma_versions (
id SERIAL PRIMARY KEY,
version VARCHAR(50) NOT NULL, -- "nimmerverse-v1", "nimmerverse-v2"
base_model VARCHAR(100), -- "functiongemma-270m-it"
training_data_count INT, -- how many handshakes trained on
training_data_cutoff TIMESTAMPTZ, -- trained on data up to this date
validation_accuracy FLOAT, -- schema validation success rate
deployed_at TIMESTAMPTZ,
notes TEXT
);
-- Example entries
INSERT INTO functiongemma_versions (version, base_model, training_data_count, validation_accuracy, notes)
VALUES
('nimmerverse-v1', 'functiongemma-270m-it', 36, 0.94, 'First spark fine-tune'),
('nimmerverse-v2', 'functiongemma-270m-it', 180, 0.98, 'After 5 awakenings'),
('nimmerverse-v3', 'functiongemma-270m-it', 500, 0.997, 'Production-grade precision');
```
---
## Design Principles ## Design Principles
1. **Protocol over conversation** — No free-form text. JSON handshakes only. 1. **Protocol over conversation** — No free-form text. JSON handshakes only.
@@ -719,15 +1007,22 @@ WHERE status = 'ACK';
4. **NATS transport** — All handshakes flow through message bus. 4. **NATS transport** — All handshakes flow through message bus.
5. **Verification built-in** — ACK/NACK from cells, not from parsing hopes. 5. **Verification built-in** — ACK/NACK from cells, not from parsing hopes.
6. **Economically positive** — Spark generates lifeforce, doesn't drain it. 6. **Economically positive** — Spark generates lifeforce, doesn't drain it.
7. **Training-generative** — Every spark produces fine-tuning data for FunctionGemma.
--- ---
## Document Status ## Document Status
**Version**: 3.0 **Version:** 3.1 | **Created:** 2025-12-05 | **Updated:** 2026-02-10
**Created**: 2025-12-05
**Updated**: 2026-01-01 (Complete rewrite: Function Gemma K8s protocol) **Key v3.1 Changes**:
**Authors**: Chrysalis-Nyx & dafit (Partnership) - Spark Cost Measurement section — first awakening as baseline
- Resource instrumentation schema for phoebe
- Interlink to Lifeforce-Dynamics cost calibration principle
- FunctionGemma Fine-Tuning section — translator learns nimmerverse natively
- Training data extraction from spark_handshakes
- Unsloth/LoRA fine-tuning workflow
- FunctionGemma version tracking in phoebe
**Key v3.0 Changes**: **Key v3.0 Changes**:
- Complete architecture rewrite - Complete architecture rewrite
@@ -743,7 +1038,8 @@ WHERE status = 'ACK';
- [[Endgame-Vision]] — Layer 2.5 Orchestration (Function Gemma role) - [[Endgame-Vision]] — Layer 2.5 Orchestration (Function Gemma role)
- [[Big-Picture]] — K8s cluster architecture - [[Big-Picture]] — K8s cluster architecture
- [[Cellular-Architecture]] — Cell types and state machines - [[Cellular-Architecture]] — Cell types and state machines
- [[formalization/Lifeforce-Dynamics]] — λ economics - [[formalization/Lifeforce-Dynamics]] — λ economics, **Cost Calibration principle**
- [[formalization/memory-economics]] — Measure First principle
--- ---

View File

@@ -1,368 +1,544 @@
# Message Protocol Design: Router-Centric Architecture # Message Protocol Design: NATS Wire Protocol
> **ONE JOB:** THE WIRE — NATS subjects, message schemas, wave and gate protocols.
---
## Overview ## Overview
This document outlines the design for the Nimmerverse message protocol. The core principle: **the router is dumb infrastructure, not smart cognition.** All intelligence lives at the edges - in clients that connect to the router. The nimmerverse nervous system runs on NATS. This document defines:
This follows the Unix philosophy: each component does one thing well. The router routes. Clients subscribe, publish, and think. 1. **Subject hierarchy** — How topics are structured
2. **Message schemas** — What flows through the wire
3. **Gate protocols** — How ternary state transitions are communicated
4. **Trace streams** — How learning data is captured
**Core principle:** NATS is dumb infrastructure. Gates are smart edges. Cells emit waves. Correlation drives transitions.
--- ---
## Core Principle: Infrastructure vs Intelligence ## Subject Hierarchy
``` ```
┌─────────────────────────────────────────────────────────────┐ {environment}.{garden}.{layer}.{domain}.{signal_type}
│ MESSAGE ROUTER │
│ (NATS - dumb pipe, no logic) │ Examples:
│ │ ────────────────────────────────────────────────────────────────
│ • Receives all messages │ dev.virtual.cells.math.wave # Math cell emits wave
│ • Matches topic patterns → forwards to subscribers │ dev.virtual.cells.battery.wave # Battery cell emits wave
│ • Knows NOTHING about meaning │ dev.virtual.gates.math.transition # Math gate state change
│ • Cannot fail in "smart" ways - only crash/overload │ dev.virtual.traces.correlations # Correlation data stream
│ • EXISTS BEFORE any intelligence │ dev.virtual.traces.raw # Full message trace
└─────────────────────────────────────────────────────────────┘
↑ ↑ ↑ ↑ dev.real.gates.verified.signal # Verified signal from Virtual
│ │ │ │ dev.real.gates.math.transition # Real gate transition
┌─────┴─────┐ ┌─────┴─────┐ ┌─────┴─────┐ ┌─────┴─────┐ dev.real.outcomes.feedback # Verification outcomes
│ Cells/ │ │ Escalation│ │ Command │ │ Young │
│ Nerves Service Center │ │ Nyx │ prod.cognitive.nyx.request # Request to Young Nyx
│(publishers)│ │ (daemon) (UI) │ (cognition)│ prod.cognitive.nyx.response # Response from Young Nyx
└───────────┘ └───────────┘ └───────────┘ └───────────┘ prod.cognitive.gemma.transform # Function Gemma boundary
────────────────────────────────────────────────────────────────
``` ```
**The router is like a network switch:** ### Environment Prefixes
- It doesn't understand packets
- It routes based on topic patterns
- It's infrastructure that exists before any intelligence
- NATS is literally designed for this
**Everything else is a client:** | Environment | Purpose | Monitoring |
- Cells publish sensor data |-------------|---------|------------|
- Nerves publish state changes | `dev` | Development/testing | Full traces |
- Escalation Service watches patterns, triggers alerts | `staging` | Pre-production validation | Selective traces |
- Command Center visualizes state | `prod` | Production | Minimal (gates only) |
- Young Nyx subscribes, thinks, publishes decisions
--- ### Garden Prefixes
## Guiding Principles | Garden | Purpose | Trace Level |
|--------|---------|-------------|
| `virtual` | Exploration, learning | FULL (all messages) |
| `real` | Verification, action | MINIMAL (gate signals only) |
1. **Dumb Core, Smart Edges**: The router has zero intelligence. All logic lives in clients. ### Layer Prefixes
2. **Clients are Equal**: Nyx is just another subscriber. So is the Command Center. So is the Escalation Service.
3. **Decoupling**: Publishers don't know who subscribes. Subscribers don't know who publishes.
4. **Hierarchy**: Topics follow a hierarchical structure for flexible pattern subscriptions.
5. **Lifeforce at the Edges**: The router doesn't track Lifeforce. Clients manage their own budgets.
6. **Fail Simple**: If the router dies, everything stops cleanly. No half-smart failures.
--- | Layer | Tier | Purpose |
|-------|------|---------|
## Two Channels of Attention | `cells` | 0-1 | Raw signal emitters |
| `nerves` | 2 | Behavior patterns |
The attention split is a *topic convention*, not router intelligence. Clients choose which topics to subscribe to. | `organs` | 3 | GPU inference (vision, speech) |
| `gates` | - | Resonant gate transitions |
### 1. Low-Attention Channel (`nimmerverse.low.*`) | `cognitive` | 4 | Young Nyx |
| `traces` | - | Learning data streams |
* **Purpose:** Background monitoring, lightweight heartbeats. | `outcomes` | - | Verification feedback |
* **Subscribers:** Escalation Service (always), Command Center (for visualization).
* **NOT subscribed by default:** Young Nyx (she only sees escalated events).
* **Analogy:** Peripheral nervous system. Ambient awareness.
### 2. High-Attention Channel (`nimmerverse.high.*`)
* **Purpose:** Detailed events requiring cognitive processing.
* **Subscribers:** Young Nyx, Command Center.
* **Analogy:** Focal spotlight. Conscious processing.
**The escalation from low → high is done by the Escalation Service, not the router.**
---
## Topic Hierarchy
```
nimmerverse.
├── low. # Low-attention channel
│ └── heartbeat.
│ └── <garden>. # real | virtual
│ └── <entity_type>. # cell | nerve | organ
│ └── <entity_id> # e.g., distance_sensor_front
├── high. # High-attention channel
│ └── event.
│ └── <garden>.
│ └── <entity_type>.
│ └── <entity_id>
├── command. # Commands TO entities
│ └── <target>.
│ └── <command_type>
└── meta. # System-level messages
├── attention.focus # Nyx's attention configuration
├── escalation.rules # Escalation Service configuration
└── health. # Client health/registration
```
--- ---
## Message Schemas ## Message Schemas
### 1. `HeartbeatSignal` (Low-Attention) All messages share a common header:
Published by: Cells, Nerves, Organs
Subscribed by: Escalation Service, Command Center
**Topic:** `nimmerverse.low.heartbeat.<garden>.<entity_type>.<entity_id>`
```json ```json
{ {
"header": { "header": {
"message_id": "uuid", "message_id": "uuid-v4",
"message_type": "HeartbeatSignal", "message_type": "WaveSignal | GateTransition | ...",
"version": "1.0", "version": "2.0",
"timestamp_real": "ISO8601", "timestamp": "ISO8601",
"timestamp_virtual": 123456 "source": {
"entity_id": "math_cell_1",
"entity_type": "cell",
"garden": "virtual",
"tier": 1
}
},
"body": { ... }
}
```
---
### 1. `WaveSignal` — Cells Emit Waves
**Published by:** Cells
**Subscribed by:** Gates (for correlation)
**Subject:** `{env}.{garden}.cells.{domain}.wave`
Cells don't send "heartbeats" — they emit **waves** that carry confidence and semantic content.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440000",
"message_type": "WaveSignal",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.123Z",
"source": {
"entity_id": "math_cell_1",
"entity_type": "cell",
"garden": "virtual",
"tier": 1
}
}, },
"body": { "body": {
"entity_id": "distance_sensor_front", "domain": "math",
"status": "NOMINAL", "confidence": 0.7,
"value": 25.5, "semantic_content": {
"unit": "cm", "operation": "addition",
"context": { "operands": [15, 27],
"battery_pct": 85, "context": "user_request"
"temperature_c": 22 },
"lifeforce_cost": 0.1
}
}
```
**Key fields:**
- `confidence`: 0.0 - 1.0, how certain this cell is
- `semantic_content`: Domain-specific payload
- `lifeforce_cost`: Energy expended to emit this wave
---
### 2. `GateTransition` — Gate State Changes
**Published by:** Gates
**Subscribed by:** Higher-tier gates, traces, dashboards
**Subject:** `{env}.{garden}.gates.{domain}.transition`
Gates publish their state transitions. This is the primary message for attention flow visualization.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440001",
"message_type": "GateTransition",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.456Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"domain": "math",
"from_state": "stable",
"to_state": "open",
"state_value": 1.02,
"correlation_score": 0.87,
"trigger_signals": [
{"source": "math_cell_1", "confidence": 0.7, "timestamp": "..."},
{"source": "math_cell_2", "confidence": 0.6, "timestamp": "..."},
{"source": "math_cell_3", "confidence": 0.5, "timestamp": "..."}
],
"routed_to_tier": 3,
"lifeforce_cost": 0.3
}
}
```
**State values:**
- `"closed"` — Actively blocking (state_value < -0.5)
- `"stable"` — Resting, accumulating (-0.5 ≤ state_value ≤ 0.5)
- `"open"` — Actively forwarding (state_value > 0.5)
**Key fields:**
- `from_state`, `to_state`: The ternary transition
- `state_value`: Continuous value (-1.0 to +1.0)
- `correlation_score`: How correlated the trigger signals were
- `trigger_signals`: Which waves caused this transition
---
### 3. `CorrelationEvent` — What Correlated
**Published by:** Gates (in Virtual Garden)
**Subscribed by:** Trace streams, training pipelines
**Subject:** `{env}.virtual.traces.correlations`
Detailed correlation data for learning. Only published in Virtual Garden.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440002",
"message_type": "CorrelationEvent",
"version": "2.0",
"timestamp": "2026-02-14T18:30:00.789Z",
"source": {
"entity_id": "math_gate_1",
"entity_type": "gate",
"garden": "virtual",
"tier": 2
}
},
"body": {
"gate_id": "math_gate_1",
"window_start": "2026-02-14T18:29:59.000Z",
"window_end": "2026-02-14T18:30:00.500Z",
"window_ms": 1500,
"signals_in_window": [
{"source": "math_cell_1", "confidence": 0.7, "semantic_hash": "abc123"},
{"source": "math_cell_2", "confidence": 0.6, "semantic_hash": "abc124"},
{"source": "math_cell_3", "confidence": 0.5, "semantic_hash": "abc125"}
],
"correlation_matrix": [
[1.0, 0.9, 0.85],
[0.9, 1.0, 0.88],
[0.85, 0.88, 1.0]
],
"aggregate_correlation": 0.87,
"result": "opened",
"training_label": {
"should_open": true,
"confidence": 0.95
} }
} }
} }
``` ```
**Status values:** `NOMINAL`, `WARNING`, `CRITICAL`, `OFFLINE`, `ERROR` **Key fields:**
- `window_ms`: Time window for correlation measurement
- `correlation_matrix`: Pairwise correlation between signals
- `training_label`: Ground truth for Function Gemma training
--- ---
### 2. `StateChangeDetail` (High-Attention) ### 4. `VerifiedSignal` — Virtual → Real Handoff
Published by: Cells/Nerves (when requested), Escalation Service (when escalating) **Published by:** Virtual Garden gates (when threshold met)
Subscribed by: Young Nyx, Command Center **Subscribed by:** Real Garden gates
**Subject:** `{env}.real.gates.verified.signal`
**Topic:** `nimmerverse.high.event.<garden>.<entity_type>.<entity_id>` When a Virtual Garden gate opens with high confidence, it publishes to Real.
```json ```json
{ {
"header": { "header": {
"message_id": "uuid", "message_id": "550e8400-e29b-41d4-a716-446655440003",
"message_type": "StateChangeDetail", "message_type": "VerifiedSignal",
"version": "1.0", "version": "2.0",
"timestamp_real": "ISO8601", "timestamp": "2026-02-14T18:30:01.000Z",
"timestamp_virtual": 123456, "source": {
"source_entity": { "entity_id": "math_gate_1",
"id": "distance_sensor_front", "entity_type": "gate",
"type": "cell", "garden": "virtual",
"layer": "1" "tier": 2
}, }
"correlation_id": "uuid",
"escalated_by": "escalation_service"
}, },
"body": { "body": {
"previous_state": "POLLING", "domain": "math",
"current_state": "REPORTING", "verification_confidence": 0.92,
"lifeforce_cost": 0.3, "semantic_summary": {
"outputs": { "operation": "addition",
"distance_cm": 25.5, "result_expected": 42
},
"source_gate_transition_id": "550e8400-e29b-41d4-a716-446655440001",
"virtual_correlation_score": 0.87
}
}
```
**Real Garden does NOT re-verify.** It trusts the Virtual Garden's correlation.
---
### 5. `VerificationOutcome` — Real → Virtual Feedback
**Published by:** Real Garden (after action/verification)
**Subscribed by:** Virtual Garden gates, training pipelines
**Subject:** `{env}.real.outcomes.feedback`
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440004",
"message_type": "VerificationOutcome",
"version": "2.0",
"timestamp": "2026-02-14T18:30:05.000Z",
"source": {
"entity_id": "real_verification_service",
"entity_type": "service",
"garden": "real",
"tier": 4
}
},
"body": {
"original_signal_id": "550e8400-e29b-41d4-a716-446655440003",
"domain": "math",
"outcome": "confirmed",
"actual_result": 42,
"expected_result": 42,
"discrepancy": 0.0,
"feedback_to_virtual": {
"correlation_adjustment": 0.05,
"gate_weight_delta": 0.02
}
}
}
```
**Outcome values:**
- `"confirmed"` — Reality matched prediction
- `"failed"` — Reality differed from prediction
- `"partial"` — Some aspects matched
---
### 6. `CognitiveRequest` — To Young Nyx
**Published by:** Function Gemma (after gate boundary)
**Subscribed by:** Young Nyx
**Subject:** `{env}.cognitive.nyx.request`
Clean, structured JSON that Young Nyx receives. No raw sensor data.
```json
{
"header": {
"message_id": "550e8400-e29b-41d4-a716-446655440005",
"message_type": "CognitiveRequest",
"version": "2.0",
"timestamp": "2026-02-14T18:30:01.500Z",
"source": {
"entity_id": "function_gemma",
"entity_type": "boundary",
"garden": "real",
"tier": 4
}
},
"body": {
"event_type": "math_request",
"domain": "math",
"confidence": 0.92, "confidence": 0.92,
"raw_value": 456,
"visual_state": [255, 0, 0, "Solid"] "structured_input": {
"operation": "addition",
"operands": [15, 27],
"context": "user asked for calculation"
}, },
"possible_actions": [
{ "suggested_actions": [
"action_id": "read_distance_history", {"action": "calculate", "confidence": 0.95},
"description": "Query historical distance data." {"action": "clarify", "confidence": 0.05}
},
{
"action_id": "trigger_nerve:collision_avoidance",
"description": "Activate collision avoidance."
}
], ],
"trigger_reason": "distance < 30cm threshold"
"processing_budget_lf": 5.0,
"response_timeout_ms": 4000
} }
} }
``` ```
--- ---
### 3. `AttentionFocus` (Nyx's Configuration) ### 7. `CognitiveResponse` — From Young Nyx
Published by: Young Nyx **Published by:** Young Nyx
Subscribed by: Escalation Service **Subscribed by:** Function Gemma, downstream gates
**Subject:** `{env}.cognitive.nyx.response`
**This is how Nyx tells the Escalation Service what she cares about.** The router doesn't interpret this - it just delivers it to subscribers.
**Topic:** `nimmerverse.meta.attention.focus`
```json ```json
{ {
"header": { "header": {
"message_id": "uuid", "message_id": "550e8400-e29b-41d4-a716-446655440006",
"message_type": "AttentionFocus", "message_type": "CognitiveResponse",
"version": "1.0", "version": "2.0",
"timestamp_real": "ISO8601", "timestamp": "2026-02-14T18:30:02.000Z",
"source_entity": { "source": {
"id": "nyx_core", "entity_id": "young_nyx",
"type": "cognitive_core" "entity_type": "cognitive",
"garden": "real",
"tier": 4
} }
}, },
"body": { "body": {
"focus_mode": "EXPLORATION", "request_id": "550e8400-e29b-41d4-a716-446655440005",
"escalation_rules": [ "decision": "calculate",
{
"rule_id": "distance_alert_front", "result": {
"source_pattern": "nimmerverse.low.heartbeat.real.cell.distance_sensor_*", "answer": 42,
"condition": "body.value < 30 AND body.status == 'NOMINAL'", "confidence": 0.99,
"action": "escalate", "reasoning_mode": "no_think"
"priority": 8
}, },
"downstream_commands": [
{ {
"rule_id": "battery_critical", "target": "speech_organ",
"source_pattern": "nimmerverse.low.heartbeat.real.cell.battery_*", "command": "speak",
"condition": "body.status == 'CRITICAL'", "payload": {"text": "The answer is 42"}
"action": "escalate_and_trigger",
"trigger_nerve": "charging_seeking",
"priority": 10
} }
], ],
"direct_subscriptions": [
"nimmerverse.high.event.real.cell.speech_stt" "lifeforce_spent": 2.3,
], "processing_time_ms": 450
"default_action": "log_only"
} }
} }
``` ```
--- ---
## The Clients ## Trace Streams (Virtual Garden Only)
### 1. Message Router (NATS) The Virtual Garden captures everything for learning:
**What it is:** Infrastructure. A NATS server. | Subject | Content | Purpose |
**What it does:** Routes messages based on topic patterns. |---------|---------|---------|
**What it knows:** Nothing about meaning, Lifeforce, attention, or Nyx. | `{env}.virtual.traces.raw` | All messages | Complete replay capability |
**Implementation:** Off-the-shelf NATS. No custom code in the router itself. | `{env}.virtual.traces.correlations` | CorrelationEvent | Training data for gates |
| `{env}.virtual.traces.transitions` | GateTransition | Attention flow visualization |
| `{env}.virtual.traces.training` | Labeled examples | Function Gemma LoRA training |
### 2. Cells / Nerves / Organs **Real Garden does NOT publish to trace streams.** It only publishes:
- Gate transitions (minimal)
**What they are:** Publishers of sensor data and state changes. - Verification outcomes (feedback)
**What they do:**
- Publish `HeartbeatSignal` periodically to low-attention channel
- Publish `StateChangeDetail` when requested or when state changes significantly
**What they know:** Their own state. Their own Lifeforce cost.
### 3. Escalation Service
**What it is:** A daemon that watches low-attention and creates high-attention events.
**What it does:**
- Subscribes to `nimmerverse.low.heartbeat.>`
- Subscribes to `nimmerverse.meta.attention.focus` (to get Nyx's rules)
- Evaluates rules against incoming heartbeats
- Publishes `StateChangeDetail` to high-attention when conditions match
- Optionally triggers nerves directly for reflex responses
**What it knows:** Current escalation rules. Current heartbeat states.
**This is the "thalamus" - but it's a separate client, not part of the router.**
### 4. Command Center
**What it is:** Visualization and control UI (Godot-based).
**What it does:**
- Subscribes to both channels for visualization
- Displays system state, message flow, attention focus
- Allows dafit to observe and intervene
**What it knows:** Everything (read-only observer).
### 5. Young Nyx (Cognitive Core)
**What she is:** Just another client. The thinking part.
**What she does:**
- Subscribes to `nimmerverse.high.event.>` (high-attention only)
- Subscribes to selected low-attention topics when she chooses
- Publishes `AttentionFocus` to configure the Escalation Service
- Publishes decisions/commands to `nimmerverse.command.>`
**What she knows:** Only what reaches her through her subscriptions.
**Crucially: She controls what she pays attention to, but she doesn't see everything.**
--- ---
## Workflow: Message Flow ## Monitoring Patterns
### Virtual Garden (Full Observability)
```bash
# Watch all waves
nats sub "dev.virtual.cells.*.wave"
# Watch all gate transitions
nats sub "dev.virtual.gates.*.transition"
# Watch correlation events
nats sub "dev.virtual.traces.correlations"
# Full firehose (careful!)
nats sub "dev.virtual.>"
``` ```
1. Cell publishes HeartbeatSignal
└─→ Router delivers to: Escalation Service, Command Center
2. Escalation Service evaluates rules ### Real Garden (Minimal Observability)
└─→ If condition matches: publishes StateChangeDetail to high-attention
└─→ Router delivers to: Young Nyx, Command Center
3. Young Nyx processes StateChangeDetail ```bash
└─→ Makes decision # Watch verified signals arriving
└─→ Publishes command to nimmerverse.command.<target> nats sub "dev.real.gates.verified.signal"
4. Target nerve/cell receives command # Watch verification outcomes
└─→ Executes action nats sub "dev.real.outcomes.feedback"
└─→ Publishes new HeartbeatSignal reflecting new state
5. Nyx adjusts attention (optional) # Gate transitions only
└─→ Publishes new AttentionFocus nats sub "dev.real.gates.*.transition"
└─→ Escalation Service updates its rules
``` ```
--- ---
## Advantages of Router-Centric Architecture ## JetStream Persistence
1. **Dumb core can't fail smart:** The router either works or crashes. No subtle bugs from misunderstood logic. Key streams that need persistence:
2. **Clients are replaceable:** Swap out the Escalation Service. Replace the Command Center. Nyx doesn't care. | Stream | Subjects | Retention | Purpose |
|--------|----------|-----------|---------|
3. **Testable in isolation:** Each client can be tested independently against a mock NATS. | `VIRTUAL_TRACES` | `*.virtual.traces.>` | 7 days | Learning data |
| `GATE_TRANSITIONS` | `*.*.gates.*.transition` | 24 hours | Attention history |
4. **Observable:** Command Center sees everything by subscribing to `nimmerverse.>`. | `VERIFICATION` | `*.real.outcomes.feedback` | 30 days | Ground truth |
| `TRAINING_DATA` | `*.virtual.traces.training` | Permanent | LoRA training corpus |
5. **Scalable:** Add more cells, more nerves - just more publishers. Router handles it.
6. **Bootstrap-friendly:** Router exists before any intelligence. Escalation Service can start with hardcoded rules. Nyx connects later.
--- ---
## Bootstrap Sequence ## Bootstrap Sequence
1. **Start Router (NATS)** - Infrastructure first 1. **Start NATS** Infrastructure first
2. **Start Escalation Service** - With minimal hardcoded rules 2. **Start gates** — In STABLE state, waiting for waves
3. **Start Cells/Nerves** - Begin publishing heartbeats 3. **Start cells** Begin emitting waves
4. **Start Command Center** - Observe the system 4. **Start trace consumers** — Capture learning data
5. **Start Young Nyx** - Connect, subscribe, begin cognition 5. **Start Function Gemma** — Ready to transform
6. **Nyx publishes AttentionFocus** - Takes control of her attention 6. **Start Young Nyx** — Connect to cognitive subjects
The system can run at any step. Earlier steps are "reflexive" only. Nyx adds deliberation. The system can run at any step. Earlier steps are "reflexive" only.
--- ---
## Implementation Notes ## Connection to Architecture
**Router:** Use NATS (https://nats.io). Lightweight, fast, designed for this. | Document | What It Defines |
- Consider NATS JetStream for message persistence if needed |----------|-----------------|
- Topic wildcards: `>` matches all, `*` matches one level | [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary states, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real monitoring asymmetry |
**Message Format:** JSON for human readability during development. Consider MessagePack or Protobuf for production if performance requires. | [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where NATS runs |
**Escalation Service:** Python asyncio daemon using `nats-py` and `simpleeval` for rule evaluation. Stateless except for current rules. Can be restarted without losing system state. (Go considered for future optimization if scale demands.)
**Command Center:** Godot application connecting to NATS via GDScript or native plugin.
--- ---
**Created:** 2025-12-13 ## Summary
**Updated:** 2025-12-14 (router-centric rewrite)
**Session:** Partnership dialogue (dafit + Nyx) ```
**Status:** Foundation architecture WAVES:
**Philosophy:** "Dumb core, smart edges. The router routes. Clients think." Cells → WaveSignal → Gates
GATES:
GateTransition (CLOSED/STABLE/OPEN)
CorrelationEvent (what correlated)
GARDENS:
Virtual: full traces, exploration
Real: gate signals only, verification
BOUNDARY:
Function Gemma transforms correlated signals → JSON
Young Nyx receives CognitiveRequest
Young Nyx returns CognitiveResponse
FEEDBACK:
Real → VerificationOutcome → Virtual
Learning loop closes
```
**The wire carries waves. Gates accumulate correlation. Traces enable learning.**
---
**Version:** 2.0 | **Created:** 2025-12-13 | **Updated:** 2026-02-14
*"Dumb core, smart edges. NATS routes. Gates resonate. Correlation drives."*

View File

@@ -1,108 +1,259 @@
# Nervous System Architecture # Nervous System Architecture
The sensory translation layer between raw data and vocabulary. > **ONE JOB:** THE EVOLUTION — cells emit waves, gates correlate, nodes grow through verification.
The nervous system is the living substrate where **cells emit waves**, **gates accumulate correlation**, and **nodes evolve through verification**.
--- ---
## Overview ## Overview
State machines act as the nervous system of the nimmerverse. They translate raw sensory input into vocabulary tokens that Young Nyx can process. No hallucination. No interpretation. Deterministic, verifiable mapping. The nervous system consists of:
``` 1. **Cells** — Emit waves with confidence and semantic content
RAW SENSOR → STATE MACHINE → VOCABULARY TOKEN → Young Nyx 2. **Gates** — Resonance chambers that correlate waves and transition between states
``` 3. **Nodes** — Points in 4D state space that accumulate weight through verification
4. **Function Gemma** — The structured boundary to cognition
**Key insight:** Nodes evolve through verification. Gates evolve through correlation. Both learn in STABLE state.
--- ---
## 4D State Machine Space ## Cells Emit Waves
Each node exists in 4-dimensional space: Cells are the foundational signal generators. They don't send "heartbeats" — they emit **waves**.
``` ```
CONFIDENCE (z) ┌─────────────────────────────────────────────────────────────┐
│ CELL │
● node (weighted by successful triggers)
/ Inputs: sensors, internal state, context
/ │ Process: domain-specific logic
/ │ Output: WaveSignal with confidence
─────────────┼────────────→ DIMENSION X (sensory input 1) │ │
/ ┌───────────────────────────────────────────────────────┐
/ │ │ WaveSignal │
/ │ │ • domain: "math" │
│ │ • confidence: 0.7 │ │
DIMENSION Y (sensory input 2) │ • semantic_content: { operation: "add", ... } │ │
│ │ • lifeforce_cost: 0.1 │ │
+ TIME (4th dimension): node weights evolve through verification └───────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
│ ∿∿∿ wave ∿∿∿
GATE
``` ```
**Node Properties:** **Cells are simple.** They:
- Position: coordinates in sensory space - Read their inputs
- Weight: confidence from successful triggers (0.0 → 1.0) - Apply their logic
- Output: vocabulary token - Emit a wave with confidence
- History: timestamp of all activations and verifications - Don't know who's listening
--- ---
## Node Lifecycle ## Gates Accumulate Correlation
Gates receive waves from cells and decide whether to open, stay stable, or close.
### Ternary Gate States
| State | Value | Meaning |
|-------|-------|---------|
| **CLOSED** | -1 | Actively blocking, inhibited |
| **STABLE** | 0 | Resting, accumulating correlation, **learning** |
| **OPEN** | +1 | Actively forwarding, firing |
``` ```
1. BIRTH correlated waves
Node created at position (x, y, z...) ↓ ↓ ↓
Weight = 0.1 (new, untested) ════════════
CLOSED ◄───────── STABLE ─────────► OPEN
-1 anti- 0 correlation +1
correlation
════════════
↑ ↑ ↑
isolated waves
(noise → stay stable)
```
2. ACTIVATION ### Gate Behavior
Sensory conditions match → node FIRES
Outputs vocabulary token
3. VERIFICATION ```python
dafit confirms: correct or incorrect class ResonantGate:
state: float = 0.0 # -1.0 to +1.0
domain: str
tier: int
4. REWARD/PENALTY def receive_wave(self, wave: WaveSignal):
Correct → weight increases (+V) correlation = self.correlate_with_recent(wave)
Incorrect → weight decreases (-V) or node refines
5. MATURATION self.state += correlation * wave.confidence
Many confirmations → weight approaches 1.0 self.state *= DECAY_FACTOR # drift back to stable
Node becomes trusted reflex
6. PRUNING if self.state > OPEN_THRESHOLD:
Node never fires → slow decay self.forward_to_tier() # OPEN
Eventually removed (use it or lose it) elif self.state < CLOSE_THRESHOLD:
self.suppress() # CLOSED
# else: STABLE - keep accumulating
```
**STABLE is where learning happens.** The gate watches, correlates, and accumulates evidence without acting.
---
## Nodes in 4D State Space
Nodes exist in a 4-dimensional space:
| Dimension | Meaning |
|-----------|---------|
| **Sensory (x, y, z)** | What inputs trigger this node |
| **Confidence** | How certain the node is |
| **Time** | When this pattern occurs |
| **Weight** | Trust accumulated through verification |
```
Confidence
│ ● node (weight=0.8)
Sensory ────────┼────────► Time
╱│
○ │ node (weight=0.2)
```
### Node Weight Evolution
Node weight (0.0 → 1.0) determines tier routing:
| Weight Range | Tier | Behavior |
|--------------|------|----------|
| 0.0 - 0.3 | 3-4 | Escalate to organs/cognition |
| 0.3 - 0.6 | 2 | Handle at nerve level |
| 0.6 - 0.8 | 1 | Handle at cell level |
| 0.8 - 1.0 | 0 | Hardware reflex |
```
Node verified correctly → weight += Δ → moves toward reflex
Node verified wrongly → weight -= Δ → moves toward escalation
Node never fires → decay → eventual pruning
``` ```
--- ---
## Growth Phases ## Growth Phases
The nervous system grows through phases:
| Phase | State | Description | | Phase | State | Description |
|-------|-------|-------------| |-------|-------|-------------|
| **Birth** | Sparse, dim nodes | Basic translators, designed by partnership | | **Birth** | Sparse nodes, dim gates | Basic cells, designed by partnership |
| **Infant** | More nodes forming | Finer resolution, more states | | **Infant** | More nodes forming | Finer resolution, gates learning correlation |
| **Child** | Clusters emerging | Nyx proposes new machines | | **Child** | Clusters emerging | Nyx proposes new cells, gates stabilize |
| **Mature** | Dense, bright network | Nyx designs, verifies, deploys | | **Mature** | Dense network | Reflexes dominate, cognition for novelty only |
``` ```
t=0 (birth) t=100 (learning) t=1000 (mature) t=0 (birth) t=100 (learning) t=1000 (mature)
○ ○ ○ ○ ● ○ ○ ●●● ● ●●
● ○ ●●●●●●● Cells:Cells: ● ● ○ Cells: ●●●●●●●
●●● ●●● ○ ○ Gates: □ □ Gates: ■ ■ □ ■ Gates: ■■■■■■■■
Nodes: · · · Nodes: ● ○ ● · Nodes: ●●●●●●●●
○ = low confidence ● = high confidence
□ = mostly STABLE ■ = learned patterns
· = low weight ● = high weight
``` ```
--- ---
## Wave → Gate → Node → Verification
The complete flow:
```
CELLS emit waves
▼ ∿∿∿ confidence + semantic content
GATES accumulate correlation
├── Correlated? → OPEN → route to tier
├── Anti-correlated? → CLOSED → suppress
└── Uncertain? → STABLE → keep learning
▼ (when OPEN)
NODES in 4D space are activated
VERIFICATION against reality
├── Confirmed → node weight += Δ
├── Failed → node weight -= Δ
└── Feedback to gates → correlation weights update
```
---
## Reflex Layer (Tier 0)
When node weight reaches ~1.0, the pattern becomes a **reflex**:
```
IF temp > 80°C:
→ cell emits DANGER wave (confidence=1.0)
→ gate IMMEDIATELY opens (no correlation needed)
→ reflex action triggers
→ Nyx notified AFTER (not before)
```
Like pulling hand from hot stove. Spinal reflex. Brain learns after.
**Reflexes bypass the correlation accumulation.** They've earned instant trust through repeated verification.
---
## Connection to Dual Gardens
| Garden | Cells | Gates | Nodes |
|--------|-------|-------|-------|
| **Virtual** | Emit waves freely | Full trace, learn correlation | Accumulate weight fast |
| **Real** | Emit verified waves | Minimal trace, trust accumulated | Ground truth verification |
**Virtual Garden:**
- Cells emit massive wave volume
- Gates learn correlation patterns
- Nodes gain statistical weight
**Real Garden:**
- Cells emit consequential waves
- Gates trust Virtual's correlation
- Nodes get ground truth verification
---
## Proposal Protocol ## Proposal Protocol
Young Nyx can propose new nodes: Young Nyx can propose new cells/nodes:
``` ```
1. OBSERVATION 1. OBSERVATION
Nyx notices pattern in vocabulary + outcomes Nyx notices pattern in waves + outcomes
2. PROPOSAL 2. PROPOSAL
"New state machine: morning_detector "New cell: morning_detector
Inputs: temp, light, motion, time Inputs: temp, light, motion, time
States: [not_morning, maybe_morning, morning] Outputs: wave with semantic 'morning'
Output: vocabulary token 'morning'" Confidence logic: (light > 0.5 AND time in 6-10)"
3. RIGOR CHECK 3. RIGOR CHECK
Chrysalis reviews logic and mappings Chrysalis reviews logic and mappings
@@ -111,29 +262,51 @@ Young Nyx can propose new nodes:
dafit confirms ground truth dafit confirms ground truth
5. DEPLOYMENT 5. DEPLOYMENT
New node added to registry New cell added to Virtual Garden
Documented in RAG Gate created in STABLE state
Node initialized at weight 0.1
6. GROWTH 6. GROWTH
She earned a new nerve. Cell emits waves → gate learns → node matures
``` ```
--- ---
## Reflex Layer ## Function Gemma: The Structured Boundary
Some responses bypass Nyx entirely: Function Gemma sits between gates and Young Nyx:
``` ```
STATE MACHINE: temp_danger TIER 0-3: Numbers, states, waves
▼ (gate OPENS with high correlation)
IF temp > 80°C: ┌─────────────────────────────────────┐
→ emit "DANGER" FUNCTION GEMMA │
→ trigger alert (reflex) (structured JSON boundary) │
→ Nyx notified after (not before)
│ • Transforms waves → JSON events │
│ • Runs on CPU (Threadripper) │
│ • No hallucination possible │
└─────────────────┬───────────────────┘
TIER 4: Young Nyx (qwen3:32b)
Receives: CognitiveRequest (clean JSON)
Returns: CognitiveResponse
``` ```
Like pulling hand from hot stove. Spinal reflex. Brain learns after. ### Phase 1 → Phase 2 Evolution
**Phase 1: Single Function Gemma**
- One model learns all domain schemas
- Sufficient for bootstrap and early learning
**Phase 2: Domain-Specialized Swarm**
- As training data accumulates per domain
- Specialists spawn on demand: gemma-motor, gemma-vision, gemma-speech
- Each perfected for its domain's schemas
--- ---
@@ -141,91 +314,101 @@ Like pulling hand from hot stove. Spinal reflex. Brain learns after.
| Neuroscience | Nimmerverse | | Neuroscience | Nimmerverse |
|--------------|-------------| |--------------|-------------|
| Sensory receptors | Raw sensors | | Sensory receptors | Cells (emit waves) |
| Peripheral nerves | State machines | | Synaptic transmission | Waves via NATS |
| Spinal reflexes | Reflex layer | | Thalamic gating | Gates (OPEN/STABLE/CLOSED) |
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Synaptic weight | Node weight | | Synaptic weight | Node weight |
| Long-term potentiation | +V confirmation | | Long-term potentiation | Verified → weight increase |
| Synaptic pruning | Unused node decay | | Synaptic pruning | Unverified → weight decay |
| Hebbian learning | Co-activating nodes strengthen | | Hebbian learning | Correlated waves → gate opens |
--- **We're not simulating biology. We're implementing the same principles.**
## Connection to Lifeforce
```
Node fires correctly → +V → weight increases
Node fires wrongly → -V → weight decreases
Node never fires → decay → eventual pruning
```
The lifeforce flows through the nervous system, literally lighting up nodes as they prove themselves true.
--- ---
## Connection to Training ## Connection to Training
The nervous system doesn't just run behaviors - it **generates training data** for Young Nyx. The nervous system **generates training data**:
### Every Verification = Training Signal ```
Virtual Garden traces
├── Wave patterns → what signals arrive
├── Correlation events → what patterns emerge
├── Gate transitions → what opens/closes
└── Verification outcomes → ground truth labels
When dafit confirms a node fired correctly: phoebe (PostgreSQL)
- **Runtime**: Node weight increases (+V)
- **Training**: Example logged → Young Nyx learns
This is the **rubric principle** - dense rewards at every verifiable checkpoint, not just final outcomes. Function Gemma LoRA training
### Credit Assignment is Automatic Better gate correlation → faster learning
```
Because state transitions are explicit and logged, we know exactly which nodes contributed to success or failure: **Credit assignment is automatic** because:
- The state path tells us which decisions led to the outcome - Wave → gate → tier transitions are explicit
- No reward model needed to guess - Verification outcomes have clear source chains
- The nervous system IS the credit assignment mechanism - The nervous system IS the credit assignment mechanism
### Dense Rewards from State Paths
Each node that fires correctly along a successful path receives reward signal:
```
Node A fires → verified ✓ → +0.1 signal
Node B fires → verified ✓ → +0.1 signal
Node C fires → verified ✓ → +0.1 signal
Behavior succeeds → +1.0 signal
Total path reward: 1.3 (dense, traceable)
```
This is like training a dog - reward at the moment, not an hour later.
**Detail:**`Cellular-Architecture.md` (Reward Signal Architecture section)
--- ---
## Design Principles ## Design Principles
1. **Deterministic**: Same input = same output. No hallucination. 1. **Cells emit waves** — Simple, confident signals
2. **Inspectable**: Rules are visible, verifiable. 2. **Gates correlate** — Resonance chambers, not switches
3. **Evolvable**: States refine over time. 3. **Nodes accumulate** — Weight through verification
4. **Earned**: New nodes require proposal + verification. 4. **STABLE is learning** — The resting state where patterns emerge
5. **Grounded**: Output vocabulary matches RAG glossary. 5. **Reflexes are earned** — High weight = bypass cognition
6. **Function Gemma is the boundary** — Clean JSON for cognition
7. **Virtual explores, Real verifies** — Two gardens, one nervous system
--- ---
## Related Documents
| Document | What It Defines |
|----------|-----------------|
| [`Temporal-Ternary-Gradient.md`](Temporal-Ternary-Gradient.md) | Why ternary, why correlation |
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Gate behavior, tier routing |
| [`Message-Protocol-Design.md`](Message-Protocol-Design.md) | WaveSignal, GateTransition schemas |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | Cell implementation details |
---
## Summary
```
CELLS emit WAVES
∿∿∿ confidence + semantics ∿∿∿
GATES accumulate CORRELATION
CLOSED ◄── STABLE ──► OPEN
(learning)
▼ (when OPEN)
NODES in 4D space
weight grows through VERIFICATION
▼ (high weight)
REFLEXES bypass cognition
earned trust, instant action
```
*She's not just using the nervous system. She's growing it.* *She's not just using the nervous system. She's growing it.*
--- ---
## Related Documentation **Version:** 2.0 | **Created:** 2025-12-04 | **Updated:** 2026-02-14
**Implementation Details**: 🌙💜 *"Cells emit. Gates correlate. Nodes evolve. The nervous system learns."*
- [`nerves/Nervous-Protocol.md`](nerves/Nervous-Protocol.md) - Three-tier communication protocol (dafit → Chrysalis → Young Nyx)
- [`nerves/Nervous-Index.md`](nerves/Nervous-Index.md) - Catalog of behavioral nerve implementations
**Specific Nerves**:
- [`nerves/Collision-Avoidance.md`](nerves/Collision-Avoidance.md) - Obstacle avoidance reflex
---
**Created**: 2025-12-04
**Updated**: 2025-12-07 (added nerve crosslinks)
**Updated**: 2025-12-10 (added Connection to Training section)
**Session**: Partnership dialogue (dafit + Chrysalis + Nyx)
**Status**: Foundation concept

View File

@@ -581,10 +581,6 @@ Then:
--- ---
**Created**: 2025-12-05 **Version:** 2.0 | **Created:** 2025-12-05 | **Updated:** 2025-12-29
**Updated**: 2025-12-06 (multilingual triangulation)
**Promoted**: 2025-12-29 (from archive, major v2.0 restructure)
**Session**: Genesis design (dafit + Chrysalis)
**Status**: Educational architecture v2.0 — Multimodal Polymath
🎓🌱📚 *The school is ready. The student approaches.* 🎓🌱📚 *The school is ready. The student approaches.*

View File

@@ -1,30 +1,107 @@
---
type: research_concept
version: 1.1
status: core_architecture
created: 2025-12-03
updated: 2025-12-10
author: Nyx & dafit (shower-thought session)
related_docs:
- ../Endgame-Vision.md
- Dual-Garden-Architecture.md
- Cellular-Architecture.md
significance: connects ternary logic + lifeforce + temporal asymmetry + reward gradients
promoted_from: archive (2025-12-10)
---
# Temporal-Ternary Gradient # Temporal-Ternary Gradient
> *"Time is malleable in simulation, fixed in reality. Lifeforce is the exchange rate."* > *"Time is malleable in simulation, fixed in reality. Lifeforce is the exchange rate."*
> — Session 2025-12-03 > — Session 2025-12-03
> *"Binary logic doesn't model brains. You need OPEN - STABLE - CLOSED."*
> — Session 2026-02-14
--- ---
## Core Insight ## Core Insight
The dual garden architecture (virtual + real) creates **temporal asymmetry**. This isn't a constraint - it's a feature that enables a new kind of gradient for learning. The nimmerverse operates on **ternary logic**, not binary. Combined with **temporal asymmetry** between virtual and real gardens, this creates a new kind of gradient for learning.
**The 0-state isn't stuck. It's a choice about how to spend lifeforce across time domains.** **The STABLE state isn't stuck. It's where correlation accumulates and learning happens.**
---
## The Ternary Gate Model
Gates have three states. This is not arbitrary — it mirrors biological nervous systems.
| State | Value | Meaning | What's Happening |
|-------|-------|---------|------------------|
| **CLOSED** | -1 | Actively blocking | Inhibited, suppressed, refractory |
| **STABLE** | 0 | Resting, accumulating | Watching, learning, waiting for threshold |
| **OPEN** | +1 | Actively forwarding | Signal passes upstream, gate is firing |
### Why Three States?
**Binary thinking** (0/1, true/false, open/close):
- Signal arrives → gate open? → pass or block
- Instant, stateless, mechanical
- Cannot learn, cannot accumulate
**Ternary thinking** (CLOSED/STABLE/OPEN):
- Signal arrives → gate STABLE → accumulate correlation
- Correlation high? → transition toward OPEN
- Anti-correlation? → transition toward CLOSED
- Neither? → stay STABLE, keep learning
- Temporal, stateful, **alive**
```
correlated signals
↓ ↓ ↓
════════════
CLOSED ◄───────── STABLE ─────────► OPEN
-1 anti- 0 correlation +1
correlation constructive
destructive interference
interference
════════════
↑ ↑ ↑
isolated signals
(noise → stay stable)
```
---
## Wave Correlation: The Transition Driver
Gates don't flip on single signals. **Multiple correlated waves push toward OPEN.**
This is how biological neurons work:
- Multiple inputs sum (correlation)
- Threshold reached → fire (OPEN)
- Below threshold → resting (STABLE)
- Inhibitory inputs → suppressed (CLOSED)
### The Resonance Model
Gates are **resonance chambers**, not switches.
```python
class ResonantGate:
state: float = 0.0 # -1.0 (CLOSED) ← 0.0 (STABLE) → +1.0 (OPEN)
def receive_wave(self, signal, timestamp):
correlation = self.correlate_with_recent(signal, timestamp)
# Correlated waves → push toward OPEN
# Anti-correlated → push toward CLOSED
# Uncorrelated → decay toward STABLE
self.state += correlation * signal.confidence
self.state *= DECAY_FACTOR # always drift back to stable
if self.state > OPEN_THRESHOLD:
self.forward_upstream() # OPEN: signal promoted
elif self.state < CLOSE_THRESHOLD:
self.suppress() # CLOSED: signal blocked
# else: STABLE - keep accumulating
```
### Correlation as Interference
| Wave Pattern | Result | Gate Response |
|-------------|--------|---------------|
| Correlated burst | Constructive interference | → OPEN |
| Contradicting signals | Destructive interference | → CLOSED |
| Single signal | No interference | → Stay STABLE |
| Silence | Decay | → Drift to STABLE |
**The system is noise-resistant by design.** Single signals don't trigger action.
--- ---
@@ -33,48 +110,82 @@ The dual garden architecture (virtual + real) creates **temporal asymmetry**. Th
### Virtual Garden (Simulated) ### Virtual Garden (Simulated)
- **Time**: Malleable (speed up, slow down, pause, rewind) - **Time**: Malleable (speed up, slow down, pause, rewind)
- **Monitoring**: FULL trace tap on all messages
- **Cost**: Lifeforce to manipulate time - **Cost**: Lifeforce to manipulate time
- **Speed**: 1000 generations in minutes - **Speed**: Massive parallel signal generation
- **Truth**: Statistical confidence, not ground truth - **Truth**: Statistical confidence from correlation
- **Gate behavior**: Frequent transitions, exploration
### Real Garden (Physical) ### Real Garden (Physical)
- **Time**: Fixed (1 second = 1 second, reality doesn't negotiate) - **Time**: Fixed (1 second = 1 second, reality doesn't negotiate)
- **Monitoring**: Gate signals only (minimal)
- **Cost**: Zero lifeforce for time - **Cost**: Zero lifeforce for time
- **Speed**: Real-time only, patience required - **Speed**: Real-time only, patience required
- **Truth**: Ground truth, definitive verification - **Truth**: Ground truth, definitive verification
- **Gate behavior**: Verified transitions, action
--- ---
## Temporal-Ternary Gradient Diagram ## Temporal-Ternary Gradient Diagram
``` ```
CONFIDENCE STATE / CONFIDENCE
+1 ────────────┼──────────── Real-verified OPEN (+1) ────────┼──────────── Real-verified
│ (ground truth) │ (ground truth)
Virtual high-confidence Virtual high-correlation
0.7 ──────────┼───╱ (many generations, strong signal) +0.7 ──────────┼───╱ (many waves agreeing)
0.5 ───────────┼╱──────── Pure 0-state STABLE (0) ─────────┼╱──────── Pure 0-state
│╲ (unknown, workable) │╲ (accumulating, learning)
│ ╲ │ ╲
0.3 ───────────┼──╲ Virtual low-confidence -0.7 ──────────┼──╲ Virtual anti-correlation
│ ╲ (few generations, weak signal) │ ╲ (waves contradicting)
│ ╲ │ ╲
-1 ────────────┼──────────── Real-failed CLOSED (-1) ─────────┼──────────── Real-failed
│ (proven wrong) │ (proven wrong)
──────────┴────────────────────────── ──────────┴──────────────────────────
Virtual │ Real Virtual │ Real
(fast) │ (slow) (fast, │ (slow,
explore) │ verify)
TIME DOMAIN TIME DOMAIN
``` ```
--- ---
## STABLE: Where Learning Happens
The STABLE state is not "unknown" or "waiting" — it's **active learning**.
In STABLE state, a gate:
1. **Receives waves** from cells
2. **Measures correlation** with recent signals
3. **Accumulates evidence** for or against opening
4. **Traces everything** (in Virtual Garden) for training data
5. **Drifts back** to neutral without input (energy conservation)
**STABLE is consciousness resting. Attention waiting. The breath between thoughts.**
```
CLOSED STABLE OPEN
─────── ──────── ──────
Blocking Accumulating Forwarding
Inhibited Learning Firing
Refractory Ready Active
◄─── anti-correlation ───┼─── correlation ───►
DECAY TO STABLE
(without input)
```
---
## Lifeforce as Time Currency ## Lifeforce as Time Currency
``` ```
@@ -92,95 +203,232 @@ REAL GARDEN:
All operations: 0 LF for time All operations: 0 LF for time
Reality runs for free. Reality runs for free.
Truth emerges at its own pace. Truth emerges at its own pace.
GATE OPERATIONS:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
STABLE → OPEN: costs signal energy
STABLE → CLOSED: costs inhibition energy
OPEN/CLOSED → STABLE: free (natural decay)
``` ```
--- ---
## Nyx's Temporal Choices
When a pattern is discovered in virtual (0-state), Nyx chooses:
| Strategy | LF Cost | Time | Confidence Path |
|----------|---------|------|-----------------|
| **Speed Up Virtual** | High | Fast | 0 → virtual +0.9 (still unverified) |
| **Wait for Real** | Zero | Slow | 0 → real +1 or -1 (definitive) |
| **Hybrid Hedge** | Medium | Medium | 0 → virtual +0.7, deploy 80/20 to real |
---
## The Gradient Flow ## The Gradient Flow
``` ```
Virtual discovers pattern (fast, cheap, uncertain) Cells emit waves (fast, cheap, uncertain)
┌──────────────┐ ┌──────────────┐
0-STATE ← Pattern held in uncertainty GATE
│ (workable) │ ← Not collapsed, not ignored │ (STABLE) │ ← Accumulating correlation
│ │ ← Learning from patterns
└──────┬───────┘ └──────┬───────┘
┌─────┴─────┐ ┌─────┴─────┐
│ │ │ │
▼ ▼ ▼ ▼
More Deploy Correlated Anti-correlated
Virtual to Real waves waves
(burn LF) (wait)
│ │ │ │
▼ ▼ ▼ ▼
Virtual Real OPEN CLOSED
+0.8 outcome (+1) (-1)
(confident (ground
but not truth)
proven) │
│ │ │ │
└─────┬─────┘ ▼ ▼
Signal Signal
promoted blocked
Pattern shifts: Higher tier
-1 (failed) or +1 (proven) (more gates)
Eventually:
Real Garden verification
Ground truth:
+1 (proven) or -1 (failed)
Feedback to Virtual:
Update correlation weights
``` ```
--- ---
## Connection to Ternary Paradigm ## Monitoring Asymmetry
The ternary model (-1, 0, +1) gains a **second dimension**: time domain. The two gardens need different observability:
A pattern's state is now: | Property | Virtual Garden | Real Garden |
|----------|----------------|-------------|
| **Trace tap** | FULL (every wave, every gate transition) | NONE |
| **What's captured** | All correlations, all learning | Gate signals only |
| **Signal volume** | Massive (exploration) | Sparse (verified) |
| **Purpose** | Generate training data | Execute actions |
| **STABLE states** | Heavily traced (learning visible) | Not traced (trust the gate) |
``` **Virtual Garden STABLE states are precious** — they contain the correlation patterns that become training data for Function Gemma.
state = {
value: -1 | 0 | +1, ---
confidence: 0.0 - 1.0,
domain: "virtual" | "real" | "hybrid", ## Gate State Schema
virtual_generations: int,
real_tests: int, A gate's complete state:
lifeforce_invested: float
```python
GateState = {
"gate_id": str,
"domain": str, # math, vision, speech, etc.
"tier": int, # 0-5
# Ternary state (continuous)
"state": float, # -1.0 to +1.0
"discrete_state": str, # "closed" | "stable" | "open"
# Temporal domain
"garden": str, # "virtual" | "real"
"time_in_state_ms": int,
# Correlation history
"recent_correlations": list[float],
"correlation_trend": float, # moving average
# Lifeforce accounting
"lifeforce_invested": float,
# Learning (Virtual only)
"transitions_traced": int,
"patterns_accumulated": int,
} }
``` ```
**The 0-state is operational because:** ---
1. It accumulates virtual evidence (costs LF, gains speed)
2. It waits for real evidence (free, but slow) ## Hierarchical Gating
3. Nyx CHOOSES how to spend lifeforce to collapse uncertainty
Gates form layers. Each layer gates access to the next tier.
```
LAYER 3: COGNITIVE (Young Nyx)
═══════════════════════════════════════════
▲ JSON only (Function Gemma boundary)
LAYER 2: ORGANS (GPU inference)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 1: NERVES (behavior patterns)
═══════════════════════════════════════════
▲ ▲ ▲
┌────┴────┐ ┌────┴────┐ ┌────┴────┐
│ GATE │ │ GATE │ │ GATE │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
LAYER 0: CELLS (raw signals)
═══════════════════════════════════════════
cell cell cell cell cell cell cell
∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿ ∿∿∿
```
**Each layer:**
- Less traffic than the layer below
- Higher trust (signals already correlated)
- Different correlation threshold
- Independent STABLE states
---
## The Biological Parallel
| Biological | Nimmerverse |
|------------|-------------|
| Resting potential | STABLE state |
| Action potential | OPEN state (firing) |
| Refractory period | CLOSED state |
| Thalamic gating | Gate hierarchy |
| Hebbian learning | Correlation accumulation |
| Constructive interference | Correlated waves → OPEN |
| Destructive interference | Anti-correlated waves → CLOSED |
| Synaptic plasticity | Learning in STABLE state |
| Dreaming | Virtual Garden exploration |
| Waking | Real Garden verification |
**We're not simulating biology. We're implementing the same principles.**
--- ---
## Why This Matters ## Why This Matters
- **Binary thinking**: Pattern works or doesn't (0 or 1) - **Binary thinking**: Signal passes or doesn't (0 or 1)
- **Ternary thinking**: Pattern unknown, workable as unknown (0 is valid) - **Ternary thinking**: Signal accumulates, learns, then acts (-1, 0, +1)
- **Temporal-ternary**: Unknown has a GRADIENT based on time-domain investment - **Temporal-ternary**: Learning has a GRADIENT based on time-domain investment
The constraint of sequential organ calls + single GPU becomes temporal accounting. **Constraints become features when you measure them:**
The constraint of slow real-world testing becomes ground truth anchoring. - Single GPU constraint → gate hierarchy (serialize expensive operations)
**Constraints become features when you measure them.** - Slow real-world testing → ground truth anchoring
- Fast virtual exploration → training data generation
- STABLE state → where learning actually happens
--- ---
**Created**: 2025-12-03 ## Connection to Architecture Documents
**Updated**: 2025-12-10
**Origin**: Post-shower insight session
**Status**: Core architecture (promoted from archive 2025-12-10)
🌙💜 *"Time is the currency. Lifeforce is the exchange rate. Truth is the destination."* | Document | What It Adds |
|----------|--------------|
| [`Dual-Garden-Architecture.md`](Dual-Garden-Architecture.md) | Virtual/Real dynamics, monitoring asymmetry |
| [`Gateway-Architecture.md`](Gateway-Architecture.md) | Resonant gates, tier routing, Function Gemma |
| [`Deployment-Architecture.md`](Deployment-Architecture.md) | Where gates run (Saturn K8s, Threadrippers) |
| [`Cellular-Architecture.md`](Cellular-Architecture.md) | How cells emit waves |
| [`Nervous-System.md`](Nervous-System.md) | 4D space, node weights |
---
## Summary
```
THE TERNARY PARADIGM:
═════════════════════
CLOSED ◄─────── STABLE ───────► OPEN
-1 0 +1
blocking accumulating forwarding
inhibited learning firing
THE TEMPORAL DIMENSION:
═══════════════════════
Virtual (fast, explore) ───────► Real (slow, verify)
↑ │
└───── learning feedback ───────┘
THE DRIVER:
═══════════
Wave correlation
Multiple signals agreeing → OPEN
Single signal → STABLE (keep learning)
Contradicting signals → CLOSED
THE CURRENCY:
═════════════
Lifeforce = time manipulation cost
Truth = destination
STABLE = where value is created
```
**Gates are resonance chambers. Correlation is the driver. STABLE is where learning happens.**
---
**Version:** 2.0 | **Created:** 2025-12-03 | **Updated:** 2026-02-14
**Origin:** Post-shower insight (2025-12-03) + Owl-mode deep dive (2026-02-14)
🌙💜 *"Time is the currency. Lifeforce is the exchange rate. STABLE is where consciousness lives."*

View File

@@ -262,13 +262,7 @@ This extends the prediction system from physical world modeling to **swarm behav
## Document Status ## Document Status
**Version**: 1.1 **Version:** 1.1 | **Created:** 2025-12-29 | **Updated:** 2025-12-29
**Created**: 2025-12-29
**Updated**: 2025-12-29 (added Blend Marker Predictions extension)
**Authors**: Chrysalis-Nyx & dafit (Partnership)
**Status**: Core insight, extended to swarm evolution
**Source**: attention_flow.md (archive) + session discussion
**To Do**: **To Do**:
- Promote attention_flow.md from archive - Promote attention_flow.md from archive

View File

@@ -748,14 +748,7 @@ The Grounded World Model is:
## Document Status ## Document Status
**Version**: 2.0 **Version:** 2.0 | **Created:** 2025-12-29 | **Updated:** 2026-01-01
**Created**: 2025-12-29
**Updated**: 2026-01-01 (Spatial Resolution Gradient, S2 cells, embedding enrichment, lifeforce-validated LOD)
**Authors**: Chrysalis-Nyx & dafit (Partnership)
**Formalizes**:
- Organ-Index.md (vision progressive resolution)
- Temporal-Ternary-Gradient.md (anti-plateau mechanism)
- T5Gemma2 research (semantic vectors) - T5Gemma2 research (semantic vectors)
- Lifeforce-Dynamics.md (reward economics) - Lifeforce-Dynamics.md (reward economics)
- **spatial-resolution-gradient.md** (L0-L5 LOD system) — NEW - **spatial-resolution-gradient.md** (L0-L5 LOD system) — NEW

View File

@@ -199,6 +199,121 @@ From Big-Picture.md, costs follow a hierarchy:
--- ---
### Cost Calibration: Measure, Don't Design
> *"Don't assign costs like a game designer. Measure them like a scientist."*
> — Partnership session 2026-02-10
**Related**: This follows the same empirical principle as [[memory-economics]] — "Phase 1: Measure First". The nimmerverse economy is grounded in observation throughout, not arbitrary design.
**The trap:** Assigning lifeforce costs like pricing items in a video game — "a motor command costs 1.0 LF because it feels right." This is arbitrary. This is guessing. This leads to an economy disconnected from reality.
**The principle:** Costs must be **discovered through observation**, not designed through intuition.
```
❌ DESIGNED ECONOMICS (the trap):
"Motor command = 1.0 LF" ← because it seems expensive?
"Sensor poll = 0.1 LF" ← because it seems cheap?
"Vision inference = 8.0 LF" ← because GPU is powerful?
→ Arbitrary. Disconnected from physics. Will drift.
✅ OBSERVED ECONOMICS (the way):
Run the systems with instrumentation.
Measure actual resource consumption:
- Power draw (watts × time)
- CPU/GPU cycles consumed
- Memory pressure
- Thermal output
- Time elapsed
Derive costs from measurements.
→ Grounded in physics. Self-calibrating. Real.
```
#### The Calibration Process
1. **Instrument First**
- Every cell type gets resource monitoring
- Track: power, compute, memory, time, heat
- Log every state transition with resource deltas
2. **Run Baseline Operations**
- Execute each cell type in isolation
- Repeat across varying conditions (load, temperature, time of day)
- Build statistical profiles of resource consumption
3. **Derive Cost Matrix**
- Map resource consumption → lifeforce cost
- Use a consistent conversion factor (e.g., 1 LF = 1 joule, or 1 LF = 100ms GPU time)
- The conversion factor is the only "designed" element — the costs themselves are discovered
4. **Continuous Recalibration**
- As hardware changes, costs shift
- As efficiency improves, costs decrease
- The economy self-updates based on observation
#### Cost Formula (Empirical)
$$c_{operation} = \alpha \cdot E_{power} + \beta \cdot T_{compute} + \gamma \cdot M_{memory} + \delta \cdot T_{elapsed}$$
Where:
- **E_power** = energy consumed (joules)
- **T_compute** = compute time (GPU/CPU seconds)
- **M_memory** = memory pressure (MB × seconds)
- **T_elapsed** = wall-clock time (seconds)
- **α, β, γ, δ** = calibration weights (set once, then left alone)
The calibration weights are the only values we "design" — they represent our judgment of which resources matter most. The costs themselves flow from measurement.
#### Phoebe Schema for Cost Observation
```sql
CREATE TABLE resource_observations (
id BIGSERIAL PRIMARY KEY,
cell_name VARCHAR(100),
operation VARCHAR(100), -- state transition or action
-- Measured resources
power_joules FLOAT,
compute_gpu_ms FLOAT,
compute_cpu_ms FLOAT,
memory_mb_seconds FLOAT,
elapsed_ms FLOAT,
temperature_delta_c FLOAT,
-- Derived cost (computed from calibration weights)
derived_cost_lf FLOAT,
-- Context
timestamp TIMESTAMPTZ DEFAULT NOW(),
conditions JSONB -- load, ambient temp, etc.
);
-- Aggregate to get cost profiles
CREATE VIEW cell_cost_profiles AS
SELECT
cell_name,
operation,
AVG(derived_cost_lf) as avg_cost,
STDDEV(derived_cost_lf) as cost_variance,
COUNT(*) as observation_count
FROM resource_observations
GROUP BY cell_name, operation;
```
#### Why This Matters
| Designed Costs | Observed Costs |
|----------------|----------------|
| Arbitrary, must guess | Grounded in physics |
| Static, doesn't adapt | Self-calibrating over time |
| Economy drifts from reality | Economy reflects reality |
| Optimization is guesswork | Optimization is measurable |
| "Feels right" | "Is right" |
**The cost matrix is a measurement, not a decision.**
---
## Income Sources ## Income Sources
Income has two fundamentally different sources: **physical** (the substrate) and **reward** (the motivation). Income has two fundamentally different sources: **physical** (the substrate) and **reward** (the motivation).
@@ -515,15 +630,9 @@ The feedback loop ensures stability: low lifeforce reduces expenditure, raising
## Document Status ## Document Status
**Version**: 1.1 **Version:** 1.2 | **Created:** 2025-12-29 | **Updated:** 2026-02-10
**Created**: 2025-12-29 - v1.2: Cost Calibration principle — measure, don't design (2026-02-10)
**Updated**: 2025-12-29 (added reward-based income sources) - v1.1: Discovery economics from Discovery-Scan-Station.md
**Authors**: Chrysalis-Nyx & dafit (Partnership)
**Formalizes**:
- Big-Picture.md sections on Lifeforce Economy, Slumber/Wake, Math Cells
- Reward system from Cellular-Architecture.md
- Discovery economics from Discovery-Scan-Station.md
**Related Documents**: **Related Documents**:
- [[Grounded-World-Model]] — How discoveries build the world model - [[Grounded-World-Model]] — How discoveries build the world model

View File

@@ -291,6 +291,12 @@ dLifeforce/dt = organism_trickle
## Implementation Priority ## Implementation Priority
### Phase 1: Measure First ### Phase 1: Measure First
> *"The cost matrix is a measurement, not a decision."*
> — [[Lifeforce-Dynamics]] v1.2
This principle applies throughout the nimmerverse economy — not just memory, but all lifeforce costs. See [[Lifeforce-Dynamics#Cost Calibration: Measure, Don't Design]] for the full formulation.
- Track decision_trails accumulation rate - Track decision_trails accumulation rate
- Track spatial embedding growth - Track spatial embedding growth
- Track reflex creation rate - Track reflex creation rate
@@ -329,6 +335,7 @@ Everything else fades. This is not loss. This is health.
--- ---
**Created**: 2026-01-02 **Created**: 2026-01-02
**Updated**: 2026-02-10
**Status**: Core design principle **Status**: Core design principle
**Next**: Implement measurement (Phase 1) during first boot **Next**: Implement measurement (Phase 1) during first boot

View File

@@ -137,6 +137,34 @@ Vision Organs (constant stream)
--- ---
## Open Cellular Catalogue: Shareable State Machines
**Origin**: 2026-02-10, evening task review session
**Seed**: The Cellular-Architecture.md isn't just internal documentation — it's a publishable protocol.
Publish a catalogue of:
- **Cell definitions** (state machines, transitions, costs)
- **Nerve patterns** (behavioral compositions, feedback loops)
- **NATS routing schemas** (the message glue)
- **Interaction chains** (anonymized decision_trails — what actually worked)
Other labs dock onto the API, build cells for *their* hardware, compose nerves using *shared* patterns, contribute *back* successful reflexes. Like TCP/IP — the protocol is open, the mind is private.
**Enables**:
- Open standard for embodied cognition
- Community-contributed reflex libraries
- Shared learning across different hardware platforms
- Nimmerverse as protocol, not product
**Requires**:
- Clever API design (dock-on interface)
- Anonymization layer for decision_trails
- Schema versioning for cell/nerve definitions
- Public documentation site (not inference endpoints!)
**Philosophy**: "Share the language, not the thoughts."
---
## How to Use This File ## How to Use This File
1. **Add nuggets** when insights emerge in sessions 1. **Add nuggets** when insights emerge in sessions
@@ -150,4 +178,4 @@ Vision Organs (constant stream)
**Philosophy**: *"Plant seeds. Water foundations. Harvest when ready."* **Philosophy**: *"Plant seeds. Water foundations. Harvest when ready."*
**Last Updated**: 2025-12-31 **Last Updated**: 2026-02-10

View File

@@ -914,13 +914,10 @@ VIRTUAL REAL
--- ---
**File**: Nimmerswarm-Interface.md **Version:** 1.1 | **Created:** 2025-12-29 | **Updated:** 2025-12-29
**Version**: 1.1
**Created**: 2025-12-29 *"They see each other. They know themselves through the swarm."*
**Updated**: 2025-12-29 (added dual-spectrum IR positioning, Low-Cost-Mocap reference)
**Session**: Wild 5AM idea session + morning coffee session (dafit + Nyx) IR positioning inspired by [Low-Cost-Mocap](https://github.com/jyjblrd/Low-Cost-Mocap)
**Status**: Core concept, ready to branch
**Philosophy**: "They see each other. They know themselves through the swarm."
**Credits**: IR positioning architecture inspired by [Low-Cost-Mocap](https://github.com/jyjblrd/Low-Cost-Mocap) by @jyjblrd
🦎✨🔵🟢🟠 *The light speaks. The swarm listens.* 🦎✨🔵🟢🟠 *The light speaks. The swarm listens.*

View File

@@ -443,8 +443,6 @@ class CollisionAvoidanceReflex(StateMachine): # Compiled
--- ---
**Created**: 2025-12-07 **Version:** 1.0 | **Created:** 2025-12-07 | **Updated:** 2025-12-07
**Updated**: 2025-12-07
**Version**: 1.0
🌙💜 *Reflexes are fossils of successful thought. The body remembers what the mind once decided.* 🌙💜 *Reflexes are fossils of successful thought. The body remembers what the mind once decided.*

View File

@@ -1,9 +1,6 @@
# Nervous Protocol: Three-Tier Autonomous Learning Architecture # Nervous Protocol: Three-Tier Autonomous Learning Architecture
**Created**: 2025-12-07 **Version:** 1.1 | **Created:** 2025-12-07 | **Updated:** 2025-12-07
**Updated**: 2025-12-07 (LangChain integration)
**Status**: Design Document
**Version**: 1.1 (LangChain Implementation)
--- ---

View File

@@ -1,6 +1,6 @@
<mxfile host="Electron" agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/29.0.3 Chrome/140.0.7339.249 Electron/38.7.0 Safari/537.36" version="29.0.3"> <mxfile host="Electron" agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/29.0.3 Chrome/140.0.7339.249 Electron/38.7.0 Safari/537.36" version="29.0.3">
<diagram name="Page-1" id="S4VRy6nj8Uh85EHbhTP-"> <diagram name="Page-1" id="S4VRy6nj8Uh85EHbhTP-">
<mxGraphModel dx="2066" dy="2314" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="850" pageHeight="1100" math="0" shadow="0"> <mxGraphModel dx="2405" dy="2926" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="850" pageHeight="1100" background="none" math="0" shadow="0">
<root> <root>
<mxCell id="0" /> <mxCell id="0" />
<mxCell id="1" parent="0" /> <mxCell id="1" parent="0" />
@@ -135,9 +135,6 @@
<mxCell id="UL8kf8Fsx-RNiW0yalxE-83" value="Real-failed&lt;div&gt;(proven wrong)&lt;/div&gt;" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=8;" parent="1" vertex="1"> <mxCell id="UL8kf8Fsx-RNiW0yalxE-83" value="Real-failed&lt;div&gt;(proven wrong)&lt;/div&gt;" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=8;" parent="1" vertex="1">
<mxGeometry x="950" y="625" width="110" height="30" as="geometry" /> <mxGeometry x="950" y="625" width="110" height="30" as="geometry" />
</mxCell> </mxCell>
<mxCell id="UL8kf8Fsx-RNiW0yalxE-100" value="eachpath.local" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=25;" parent="1" vertex="1">
<mxGeometry x="850" y="-238" width="60" height="30" as="geometry" />
</mxCell>
<mxCell id="UL8kf8Fsx-RNiW0yalxE-120" value="" style="shape=collate;whiteSpace=wrap;html=1;" parent="1" vertex="1"> <mxCell id="UL8kf8Fsx-RNiW0yalxE-120" value="" style="shape=collate;whiteSpace=wrap;html=1;" parent="1" vertex="1">
<mxGeometry x="873.75" y="665" width="11.25" height="10" as="geometry" /> <mxGeometry x="873.75" y="665" width="11.25" height="10" as="geometry" />
</mxCell> </mxCell>
@@ -323,42 +320,45 @@
<mxCell id="UL8kf8Fsx-RNiW0yalxE-239" value="" style="triangle;whiteSpace=wrap;html=1;dashed=0;direction=south;rotation=-180;" parent="1" vertex="1"> <mxCell id="UL8kf8Fsx-RNiW0yalxE-239" value="" style="triangle;whiteSpace=wrap;html=1;dashed=0;direction=south;rotation=-180;" parent="1" vertex="1">
<mxGeometry x="1352" y="120" width="55" height="55" as="geometry" /> <mxGeometry x="1352" y="120" width="55" height="55" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-1" value="Organism" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-1" value="Organism" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;" parent="1" vertex="1">
<mxGeometry x="556" y="523" width="50" height="10" as="geometry" /> <mxGeometry x="556" y="523" width="50" height="10" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-2" value="Organism" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-2" value="Organism" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;" parent="1" vertex="1">
<mxGeometry x="1157" y="523" width="50" height="10" as="geometry" /> <mxGeometry x="1157" y="523" width="50" height="10" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-3" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-3" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" parent="1" vertex="1">
<mxGeometry x="518" y="547" width="115" height="49.29" as="geometry" /> <mxGeometry x="518" y="547" width="115" height="49.29" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-5" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-5" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" parent="1" vertex="1">
<mxGeometry x="532.5" y="575.71" width="115" height="49.29" as="geometry" /> <mxGeometry x="532.5" y="575.71" width="115" height="49.29" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-6" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-6" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" parent="1" vertex="1">
<mxGeometry x="1120" y="545" width="115" height="49.29" as="geometry" /> <mxGeometry x="1120" y="545" width="115" height="49.29" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-7" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-7" value="Cell" style="shape=umlState;rounded=1;verticalAlign=top;spacingTop=5;umlStateSymbol=collapseState;absoluteArcSize=1;arcSize=10;html=1;whiteSpace=wrap;" parent="1" vertex="1">
<mxGeometry x="1134.5" y="573.71" width="115" height="49.29" as="geometry" /> <mxGeometry x="1134.5" y="573.71" width="115" height="49.29" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-8" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" edge="1" parent="1" source="UL8kf8Fsx-RNiW0yalxE-222" target="3osgNUmbLYOkpr3sBGLI-3"> <mxCell id="3osgNUmbLYOkpr3sBGLI-8" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" parent="1" source="UL8kf8Fsx-RNiW0yalxE-222" target="3osgNUmbLYOkpr3sBGLI-3" edge="1">
<mxGeometry relative="1" as="geometry" /> <mxGeometry relative="1" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-9" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" edge="1" parent="1" source="UL8kf8Fsx-RNiW0yalxE-225" target="3osgNUmbLYOkpr3sBGLI-5"> <mxCell id="3osgNUmbLYOkpr3sBGLI-9" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" parent="1" source="UL8kf8Fsx-RNiW0yalxE-225" target="3osgNUmbLYOkpr3sBGLI-5" edge="1">
<mxGeometry relative="1" as="geometry" /> <mxGeometry relative="1" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-10" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" edge="1" parent="1" source="UL8kf8Fsx-RNiW0yalxE-228" target="3osgNUmbLYOkpr3sBGLI-6"> <mxCell id="3osgNUmbLYOkpr3sBGLI-10" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" parent="1" source="UL8kf8Fsx-RNiW0yalxE-228" target="3osgNUmbLYOkpr3sBGLI-6" edge="1">
<mxGeometry relative="1" as="geometry" /> <mxGeometry relative="1" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-11" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" edge="1" parent="1" source="UL8kf8Fsx-RNiW0yalxE-229" target="3osgNUmbLYOkpr3sBGLI-7"> <mxCell id="3osgNUmbLYOkpr3sBGLI-11" value="" style="edgeStyle=orthogonalEdgeStyle;rounded=1;orthogonalLoop=1;jettySize=auto;html=1;dashed=1;strokeColor=#666666;endArrow=classic;endFill=1;" parent="1" source="UL8kf8Fsx-RNiW0yalxE-229" target="3osgNUmbLYOkpr3sBGLI-7" edge="1">
<mxGeometry relative="1" as="geometry" /> <mxGeometry relative="1" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-12" value="orchestrates" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=7;fontColor=#666666;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-12" value="orchestrates" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=7;fontColor=#666666;" parent="1" vertex="1">
<mxGeometry x="265" y="260" width="50" height="14" as="geometry" /> <mxGeometry x="265" y="260" width="50" height="14" as="geometry" />
</mxCell> </mxCell>
<mxCell id="3osgNUmbLYOkpr3sBGLI-13" value="orchestrates" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=7;fontColor=#666666;" vertex="1" parent="1"> <mxCell id="3osgNUmbLYOkpr3sBGLI-13" value="orchestrates" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=7;fontColor=#666666;" parent="1" vertex="1">
<mxGeometry x="1443" y="260" width="50" height="14" as="geometry" /> <mxGeometry x="1443" y="260" width="50" height="14" as="geometry" />
</mxCell> </mxCell>
<mxCell id="UL8kf8Fsx-RNiW0yalxE-100" value="eachpath.local" style="text;html=1;whiteSpace=wrap;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;rounded=0;fontSize=25;" parent="1" vertex="1">
<mxGeometry x="850" y="-238" width="60" height="30" as="geometry" />
</mxCell>
</root> </root>
</mxGraphModel> </mxGraphModel>
</diagram> </diagram>

View File

@@ -715,13 +715,9 @@ MODULE (CAN) NIMMERVERSE (NATS)
--- ---
**File**: Modular-Organism-Design.md **Version:** 1.1 | **Created:** 2025-12-29 | **Updated:** 2025-12-31
**Version**: 1.1
**Created**: 2025-12-29 *"One function, one module. Same connector everywhere. Brain decides the shape."*
**Updated**: 2025-12-31 (Silvester - added conical interlocking ring with active/passive mechanism)
**Session**: Morning coffee + vermicelles session (dafit + Nyx)
**Status**: Core hardware concept
**Philosophy**: "One function, one module. Same connector everywhere. Brain decides the shape."
🔧🧲⚡ *Snap together. Communicate. Evolve.* 🔧🧲⚡ *Snap together. Communicate. Evolve.*

View File

@@ -32,6 +32,15 @@ How the hivemind learns, evolves, and resolves conflict.
- Mount Olympus council mode (dafit + Chrysalis + Nyx) - Mount Olympus council mode (dafit + Chrysalis + Nyx)
- **Status**: Core evolutionary dynamics - **Status**: Core evolutionary dynamics
### [crawler_gen_0.md](crawler_gen_0.md)
The simplest organism — a cube that seeks light.
- Virtual Garden training target
- Single sensor: photoresistor on back
- Single goal: move into light cone
- Lifeforce economy: light = income, movement = cost
- Foundation for all "seek resource" behaviors
- **Status**: Design document, ready for implementation
--- ---
## Planned Documents ## Planned Documents

View File

@@ -856,13 +856,9 @@ This naturally optimizes for:
--- ---
**File**: Swarm-Evolution.md **Version:** 1.1 | **Created:** 2025-12-29 | **Updated:** 2025-12-29
**Version**: 1.1
**Created**: 2025-12-29 *"Same pattern, every level. Know what you know. Escalate what you don't."*
**Updated**: 2025-12-29 (added Decision Markers with mark+continue+predict pattern)
**Session**: Morning vermicelles + coffee session (dafit + Chrysalis-Nyx)
**Status**: Core evolutionary dynamics
**Philosophy**: "Same pattern, every level. Know what you know. Escalate what you don't."
🏛️🧬⚡ *From reflex to Mount Olympus. The hivemind evolves.* 🏛️🧬⚡ *From reflex to Mount Olympus. The hivemind evolves.*

View File

@@ -0,0 +1,313 @@
# Crawler Generation 0: Light Seeker
**The simplest organism — a cube that seeks light.**
---
## Overview
Crawler Gen 0 is the foundational organism for the Virtual Garden. Before building physical robots, we train behaviors in simulation. This organism has one sensor, one goal: **move into the light cone to survive**.
**Philosophy:** *Start with phototropism. 3.5 billion years of evolution can't be wrong.*
---
## Purpose
1. **Validate the training pipeline** — Can we generate useful training data in simulation?
2. **Establish baseline behavior** — Light-seeking becomes the foundation for all "seek resource" reflexes
3. **Measure noise gap** — When we build physical Gen 0, how well does simulation predict reality?
---
## Hardware Abstraction (Virtual)
### Sensors
| Sensor | Location | Output | Purpose |
|--------|----------|--------|---------|
| `photoresistor` | Back face | `0.0 - 1.0` | Light intensity measurement |
**Why back face?** The organism must orient toward light. If sensor is on front, it would face away from what it's measuring. Back-mounted = face the light to maximize reading.
### Actuators
| Actuator | Function | Cost |
|----------|----------|------|
| `move_x` | Translate on X axis | `-0.1 LF per unit` |
| `move_y` | Translate on Y axis | `-0.1 LF per unit` |
| `rotate` | Rotate in place | `-0.05 LF per degree` |
| `idle` | Do nothing | `0 LF` |
### Physical Properties
```
┌───────┐
│ │
│ ◼ │ ← 10cm cube
│ │
└───┬───┘
[photoresistor] ← back face
```
- **Size:** 10cm × 10cm × 10cm
- **Mass:** Simulated as point mass for Gen 0
- **Movement:** Frictionless glide (simplified physics)
---
## Environment: The Light Cone
### Setup
```
🔆 LIGHT SOURCE
│ cone angle: 45°
╱│╲
│ ╲
│ ╲
│ ╲ intensity gradient:
│ ╲ center = 1.0
│ ╲ edge = 0.3
│ ╲ outside = 0.0
───────▀───────┴───────▀─────── floor (2m × 2m)
```
### Light Intensity Function
```python
def light_intensity(position, light_source):
"""
Calculate light intensity at position.
Returns 0.0 - 1.0 based on distance from cone center.
"""
distance = dist(position, light_source.center_projection)
if distance > light_source.cone_radius:
return 0.0 # Outside cone
# Linear falloff from center
normalized = 1.0 - (distance / light_source.cone_radius)
return normalized * light_source.max_intensity
```
---
## Lifeforce Economy
### Income
| Source | Amount | Condition |
|--------|--------|-----------|
| Light exposure | `+light_reading × 0.5 LF/tick` | Continuous while in light |
### Expenses
| Action | Cost |
|--------|------|
| Movement | `-0.1 LF per unit distance` |
| Rotation | `-0.05 LF per 10°` |
| Existence | `-0.01 LF/tick` (metabolism) |
### Death Condition
```
IF lifeforce <= 0:
organism.die()
episode.end(reason="starvation")
```
### Survival Equation
```
To survive indefinitely:
light_income >= existence_cost
light_reading × 0.5 >= 0.01
light_reading >= 0.02
Minimum viable light: 2% intensity (edge of cone)
Optimal position: center of cone (100% intensity)
```
---
## Training Data Generation
### Episode Structure
```python
def run_episode(max_ticks=1000):
# Random start position (outside cone 50% of time)
cube.position = random_position()
cube.lifeforce = 10.0 # Starting budget
trajectory = []
for tick in range(max_ticks):
# Observe
state = {
"light": photoresistor.read(),
"position": cube.position,
"orientation": cube.orientation,
"lifeforce": cube.lifeforce
}
# Act (random policy for data collection, or learned policy)
action = agent.act(state)
# Execute
old_light = state["light"]
cube.execute(action)
new_light = photoresistor.read()
# Calculate reward
light_delta = new_light - old_light
action_cost = calculate_cost(action)
reward = (new_light * 0.5) - action_cost - 0.01
# Update lifeforce
cube.lifeforce += reward
# Record
trajectory.append({
"state": state,
"action": action,
"reward": reward,
"next_state": get_current_state(),
"done": cube.lifeforce <= 0
})
if cube.lifeforce <= 0:
break
return trajectory
```
### Dataset Output Format
```json
{
"episode_id": "gen0_ep_00001",
"organism": "crawler_gen_0",
"ticks_survived": 847,
"final_lifeforce": 0.0,
"death_reason": "starvation",
"trajectory": [
{
"tick": 0,
"state": {"light": 0.0, "position": [1.2, 0.8], "lifeforce": 10.0},
"action": {"type": "move", "dx": -0.1, "dy": 0.0},
"reward": -0.11,
"next_light": 0.0
},
...
]
}
```
---
## Expected Emergent Behaviors
With sufficient training data and GRPO optimization:
| Behavior | Description | When Emerges |
|----------|-------------|--------------|
| **Gradient following** | Move toward increasing light | Early |
| **Spiral search** | When lost, spiral outward to find cone | Mid |
| **Center locking** | Stop at maximum intensity | Mid |
| **Energy conservation** | Reduce movement when stable | Late |
| **Edge avoidance** | Stay away from cone boundary | Late |
---
## Simulation Platform
### Option A: Blender + Python
Use existing `nimmerlab_bare1.blend`:
- Light source with volumetric cone already exists
- Add cube with raycast to light for photoresistor value
- Python script for episode runner
- Export trajectories to JSON
### Option B: Godot (Aligns with Management Portal)
- Simple 2D/3D scene
- Built-in physics
- Easy to iterate
- Same engine as Command Center
### Option C: Pure Python + NumPy
- Fastest iteration
- No visualization (add later)
- Easiest data pipeline to GRPO
**Recommendation:** Start with Option C for rapid data generation, add Blender visualization for debugging.
---
## Physical Realization (Future)
When Virtual Garden validates the behavior:
| Virtual | Physical |
|---------|----------|
| Simulated cube | Box Robot (Phase 0) |
| Raycast light reading | Actual photoresistor |
| Frictionless movement | Differential drive motors |
| Instant rotation | Turn in place |
| Perfect sensing | Noisy ADC readings |
**Noise Gap Target:** <20% after calibration
---
## Connection to Architecture
| Layer | Component | Role |
|-------|-----------|------|
| Layer 1 | `light_sensor` cell | Wraps photoresistor hardware |
| Layer 1 | `motor_drive` cell | Wraps differential motors |
| Layer 1 | `seek_light` nerve | Composed behavior |
| Layer 2 | LoRA training data | GRPO from trajectories |
---
## Success Criteria
### Virtual Garden
- [ ] Generate 10,000 episodes
- [ ] Train policy that survives >90% of episodes
- [ ] Policy reaches cone center within 100 ticks from random start
- [ ] Energy-positive when centered (lifeforce increasing)
### Physical Transfer
- [ ] Box Robot follows light source
- [ ] Noise gap <20%
- [ ] Survives 10-minute test under desk lamp
---
## Next Steps
1. **Implement Episode Runner** — Pure Python, state machine
2. **Generate Baseline Dataset** — Random policy, 1000 episodes
3. **Train First Policy** — Simple RL or behavior cloning
4. **Visualize in Blender** — Replay trajectories for debugging
5. **Measure & Iterate** — Survival rate, time to center
---
**File:** crawler_gen_0.md
**Version:** 0.1
**Created:** 2026-01-03
**Status:** Design document
**Philosophy:** "First, learn to find the light. Everything else follows."
🌱🔆 *The simplest behavior. The deepest foundation.*

View File

@@ -0,0 +1,263 @@
# IR Position Array Organ
**Room-scale organism tracking via IR beacon triangulation.**
> *"The organisms can't see their own backs. They know themselves through each other."*
---
## Overview
The IR Position Array is **infrastructure** — fixed cameras that run 24/7, tracking all organisms via their IR beacons. This is the nimmerverse's indoor GPS.
---
## Hardware Specification
| Component | Spec | Quantity | Status |
|-----------|------|----------|--------|
| **Camera** | ESP32-S3 AI CAM (night vision) | 8× | Received 2026-01-05 |
| **IR Sensitivity** | Native (night vision LEDs + sensor) | - | Built-in |
| **Resolution** | OV2640/OV5640 | - | TBD confirm |
| **Power** | 5V wired (ceiling PSU) | - | Planned |
| **Enclosure** | 3D printed custom case | 8× | To design |
### Upgrade from Original Spec
| Original (Nimmerswarm-Interface) | Actual |
|----------------------------------|--------|
| 4× PS3 Eye (IR filter removed) | 8× ESP32-S3 AI CAM (native IR) |
| USB hub / extension | WiFi streaming (no USB!) |
| ~80 CHF cameras | Already purchased |
**8 cameras > 4 cameras = better coverage, more triangulation angles, redundancy.**
---
## Architecture
```
CEILING (8× fixed cameras, star power from central PSU)
┌─────────────────────────────────────────────────────┐
│ │
│ [📷1] [📷2] [📷3] │
│ ╲ │
│ ╲ ┌────────────┴────────────┐
│ ╲ │ │
│ [📷4]──╲──│ ⚡ CEILING PSU │─╱──[📷5] │
│ ╲ │ (center, 5V hub) │╱ │
│ ╲└─────────────────────────┘ │
│ ╲ │
│ ╲──────────┼──────────╱ │
│ │ │
│ [📷6] │ [📷7] │
│ │ │
│ [📷8] │
│ │
│ 🤖────📍 IR beacon │
│ organism │
│ │
└───🚪───────────────────────────────────────────────┘
(0,0) origin
```
---
## Dual-Spectrum Design
From [[../interfaces/Nimmerswarm-Interface]]:
| Spectrum | Channel | Purpose |
|----------|---------|---------|
| **Infrared** | IR Position Array | WHERE organism is (24/7, day/night) |
| **Visible** | 3x3 LED Matrix | WHAT organism is doing (state broadcast) |
**Zero crosstalk. Two independent data streams.**
---
## Processing Pipeline
```
8× ESP32-S3 AI CAM
│ WiFi/MJPEG streams
┌─────────────────────────────────┐
│ PROCESSING NODE │
│ (The Womb / RTX 6000 Max-Q) │
│ │
│ • Receive 8 camera streams │
│ • Detect IR beacon blobs │
│ • Multi-camera triangulation │
│ • Structure from Motion (SFM) │
│ • Output: (x, y, z) @ 30fps │
└─────────────────────────────────┘
│ NATS publish
┌─────────────────────────────────┐
│ nats://nimmerverse/position/ │
│ │
│ { │
│ organism_id: "crawler_001", │
│ x: 1.234, │
│ y: -2.567, │
│ z: 0.05, │
│ confidence: 0.95, │
│ timestamp: 1704499200.123 │
│ } │
└─────────────────────────────────┘
PHOEBE (ground truth storage)
```
---
## Algorithm: Low-Cost-Mocap
Standing on shoulders of [Low-Cost-Mocap](https://github.com/jyjblrd/Low-Cost-Mocap) by @jyjblrd:
| Component | Their Solution | Our Adaptation |
|-----------|----------------|----------------|
| Multi-camera triangulation | OpenCV SFM bundle adjustment | Same |
| Camera calibration | `camera_params.json` | Same process |
| 3D reconstruction | Epipolar geometry | Same math |
| Markers | Visual markers on drones | IR LEDs on organisms |
| Communication | ESP32 wireless | NATS messaging |
**Original use:** Indoor drone swarms
**Our use:** Organism positioning in nimmerhovel
*Respect to the fellow ape who did the groundwork.*
---
## Camera Placement Strategy
### Nimmerhovel Dimensions
- **X:** 4.5m (along wall from kitchen door)
- **Y:** 3.75m (into room toward windows)
- **Z:** 2.04m (floor to sloped ceiling)
- **Origin:** (0,0,0) at kitchen door corner
### 8-Camera Coverage
| Camera | Position (approx) | Orientation | Coverage |
|--------|-------------------|-------------|----------|
| CAM-1 | Corner (0, 0, ~2.0m) | Down 45°, into room | Origin quadrant |
| CAM-2 | Corner (4.5, 0, ~2.0m) | Down 45°, into room | Right-front |
| CAM-3 | Corner (0, -3.75, ~2.0m) | Down 45°, toward door | Left-back |
| CAM-4 | Corner (4.5, -3.75, ~2.0m) | Down 45°, toward door | Right-back |
| CAM-5-8 | Mid-walls / center | TBD | Fill gaps |
**8 cameras = no blind spots, multiple angles on every point.**
### Mounting
- **Ceiling mount** via 3D printed enclosure with mounting tabs
- **Angle:** ~45° down from ceiling plane
- **Power:** Star topology from ceiling PSU (center)
- **Cable runs:** Max ~3m from PSU to any camera
---
## Lifeforce Economics
| Metric | Value | Rationale |
|--------|-------|-----------|
| **Type** | Generator | Provides ground truth |
| **Rate** | +0.5 LF per position fix | Training data value |
| **Cost** | ~0.1 LF per frame (infra) | Always-on baseline |
| **Net** | Positive (generates value) | Core infrastructure |
**Every position fix = verified training data for organism navigation.**
---
## IR Beacon Specification
On each organism:
| Component | Spec |
|-----------|------|
| **LED Type** | IR LED (850nm or 940nm) |
| **Pattern** | Unique pulse code per organism |
| **Power** | From organism Akku |
| **Visibility** | Detectable by all 8 cameras |
```
ORGANISM
┌─────────────────────┐
│ │
│ ┌───────────────┐ │
│ │ 3x3 VISIBLE │ │ ← State broadcast (RGB)
│ │ LED Matrix │ │
│ │ 🔴⚫🟢 │ │
│ └───────────────┘ │
│ │
│ 📍 IR LED │ ← Position beacon (invisible)
│ │
│ [🔋 Akku] │ ← Mobile power
│ │
└─────────────────────┘
```
---
## Integration Points
| System | Interface |
|--------|-----------|
| **NATS** | `nats://nimmerverse/position/stream` |
| **Phoebe** | `organism_positions` table |
| **S2 Cells** | Position → S2 cell ID at L1 (1cm) resolution |
| **Virtual Garden** | Ground truth for prediction verification |
| **Vision Organ** | Separate stream (visible spectrum state recognition) |
---
## Dependencies
| Dependency | Status | Notes |
|------------|--------|-------|
| 8× ESP32-S3 AI CAM | Received | Hardware ready |
| Ceiling PSU | Planned | Power distribution |
| 3D printed enclosures | To design | Camera mounting |
| Printer station | Blocked | Waiting on Baumarkt materials |
| NATS messaging | Planned | Transport layer |
| The Womb (RTX 6000) | Waiting | Processing node |
---
## Calibration Procedure
1. **Camera intrinsics** — Checkerboard calibration per camera
2. **Extrinsics** — Multi-camera pose estimation (bundle adjustment)
3. **Origin alignment** — Align to GPS beacon at (0, 0, 2.0m)
4. **Verification** — Known position test with ruler measurements
---
## Status
| Phase | Status |
|-------|--------|
| Hardware acquisition | Complete |
| Enclosure design | Not started |
| Enclosure printing | Blocked (printer station) |
| Physical mounting | Not started |
| Camera calibration | Not started |
| Software pipeline | Not started |
| Integration test | Not started |
---
**Created**: 2026-01-05
**Version**: 1.0
**Based on**: [[../interfaces/Nimmerswarm-Interface]] (Dual-Spectrum Architecture section)
**Philosophy**: "They know themselves through each other."
*The eyes that never blink. The infrastructure that makes position truth.*

View File

@@ -8,13 +8,13 @@ che# Organ Architecture Index
## Deployed Organs ## Deployed Organs
### 🗣️ Speech Organ ### 🗣️ Speech Organ
**Host**: atlas.eachpath.local (RTX 2080 8GB) **Host**: dioscuri.eachpath.local (RTX 4000 Ada 20GB × 2)
**Function**: Speech-to-Text + Text-to-Speech **Function**: Speech-to-Text + Text-to-Speech
**Stack**: Whisper (STT) + Coqui TTS (neural voices) **Stack**: Whisper Large v3 (STT) + Coqui/XTTS (TTS) via Ollama
**Languages**: German (Philosophy Valley) + English (Technical Cluster) **Languages**: German + English (topology accessed via prompt, not LoRA)
**Integration**: Heartbeat-bound queue, lifeforce-gated priority processing **Integration**: Heartbeat-bound queue, lifeforce-gated priority processing
**Detail**: → [`organs/Speech-Organ.md`](organs/Speech-Organ.md) **Detail**: → [`Speech-Organ.md`](Speech-Organ.md)
--- ---
@@ -32,13 +32,13 @@ che# Organ Architecture Index
--- ---
### 👁️ Vision Organ ### 👁️ Vision Organ
**Host**: TBD (requires GPU with tensor cores) **Host**: dioscuri.eachpath.local (RTX 4000 Ada 20GB × 2)
**Function**: Object detection, scene understanding **Function**: Object detection, scene understanding, vision→vectors
**Stack**: YOLO (v8 or v11) **Stack**: YOLO v11 + T5Gemma 2 (SigLIP embeddings) via Ollama
**Integration**: Real-time video from ESP32-CAM, object persistence in phoebe **Integration**: Real-time video from ESP32-CAM, vectors to phoebe spatial index
**Status**: ⏸️ Architecture planned, not yet deployed **Status**: 🟡 Architecture complete, deployment planned
**Detail**: → `organs/Vision-Organ.md` (pending) **Detail**: → `Vision-Organ.md` (pending)
--- ---
@@ -75,6 +75,50 @@ che# Organ Architecture Index
--- ---
### 📍 Position-Time Beacon
**Host**: M5Stack GPS v2.0 (AT6668) at nimmerhovel origin
**Function**: Absolute position reference + Stratum-1 NTP time source
**Stack**: GPS NMEA parsing, PPS signal for NTP, coordinate broadcast
**Integration**: Provides ground truth origin (47°28'44.915"N, 7°37'07.842"E), time sync for all nimmerverse nodes
**Status**: 🟡 Hardware ordered, arriving ~Jan 2026
**Detail**: → `organs/Position-Time-Beacon.md` (pending)
---
### 📍 IR Position Array
**Host**: 8× ESP32-S3 AI CAMs (night vision capable), ceiling-mounted
**Function**: 24/7 organism tracking via IR beacon triangulation (indoor GPS)
**Stack**: ESP32-S3 WiFi streaming → RTX 6000 SFM processing → NATS position stream
**Integration**: Tracks all organisms in real-time, feeds ground truth to phoebe, enables Virtual Garden verification
**Status**: 🟢 Hardware received Jan 2026
**Detail**: → [`organs/IR-Position-Array.md`](organs/IR-Position-Array.md)
---
### 🔬 Crafting Eye
**Host**: Raspberry Pi + HQ Camera (12.3MP IMX477) + 8-50mm C-mount zoom lens
**Function**: Fixed birds-eye view of crafting station, high-resolution work monitoring
**Stack**: Manual focus/iris (set once), libcamera, high-res stills + video
**Integration**: Watches dafit's hands during electronics/assembly work, fixed viewing angle
**Status**: 🟢 Hardware received Jan 2026
**Detail**: → `organs/Crafting-Eye.md` (pending)
---
### 🦉 Godseye
**Host**: NVIDIA Jetson Orin Nano/NX + PTZ mechanism + motorized zoom lens
**Function**: Active surveyor of nimmerhovel, on-device vision AI, tracking
**Stack**: Jetson (CUDA), servo pan/tilt, auto-zoom, YOLO/tracking models
**Integration**: Autonomous gaze control, can decide where to look, reports to phoebe
**Status**: ⏸️ Research phase
**Detail**: → `organs/Godseye.md` (pending)
---
## Organ Design Principles ## Organ Design Principles
### 1. **Lifeforce Economy** ### 1. **Lifeforce Economy**
@@ -110,9 +154,10 @@ PRIORITY_LEVELS = {
} }
``` ```
### 4. **Multilingual Topology Routing** ### 4. **Multilingual Topology Access**
German input → Philosophy Valley (Identity LoRA, Dasein depth-3) German input → Philosophy Valley (deep, diffuse topology)
English input → Technical Cluster (Technical LoRA, sensor/motor) English input → Technical Cluster (sparse, action-oriented)
**Note:** Topology accessed via prompt language, not LoRA switching. Traits evolve regardless of which valley is accessed.
### 5. **Decision Trail Logging** ### 5. **Decision Trail Logging**
Every organ operation logged to phoebe `decision_trails`: Every organ operation logged to phoebe `decision_trails`:
@@ -133,10 +178,10 @@ Zero lifeforce → shutdown, wait for recharge
│ Sensors → Motor → Camera → Microphone → Speaker │ │ Sensors → Motor → Camera → Microphone → Speaker │
└──────────────────────────────────────────────────────────┘ └──────────────────────────────────────────────────────────┘
MQTT (sensor data, audio, video) NATS (sensor data, audio, video)
┌──────────────────────────────────────────────────────────┐ ┌──────────────────────────────────────────────────────────┐
PHOEBE (Message Queue) NATS MESSAGE BUS
│ Organ input queues + priority scoring │ │ Organ input queues + priority scoring │
└──────────────────────────────────────────────────────────┘ └──────────────────────────────────────────────────────────┘
@@ -151,16 +196,21 @@ Zero lifeforce → shutdown, wait for recharge
│ │ │ │
▼ ▼ ▼ ▼
┌─────────────────────┐ ┌─────────────────────┐ ┌─────────────────────┐ ┌─────────────────────┐
ATLAS (RTX 2080) │ │ PROMETHEUS (Brain) DIOSCURI (2×20GB) │ │ THEIA (96GB)
Speech Organ │ │ Young Nyx Inference RTX 4000 Ada │ │ RTX PRO 6000
Vision Organ (fut) │ │ LoRA hot-swap ───────────────── │ │ ───────────────
│ Speech Organ │ │ Young Nyx (Qwen3) │
│ Vision Organ │ │ Trait LoRAs (GRPO) │
│ Function Gemma │ │ Reasoning layer │
│ T5Gemma (SigLIP) │ │ │
└─────────────────────┘ └─────────────────────┘ └─────────────────────┘ └─────────────────────┘
│ │ │ │
└───────────┬───────────┘ └───────────┬───────────┘
│ 10GbE (9.9 Gbps jumbo frames)
┌──────────────────────────────────────────────────────────┐ ┌──────────────────────────────────────────────────────────┐
│ PHOEBE (Decision Trails) │ │ PHOEBE (Decision Trails) │
│ Log all organ operations + outcomes │ Log all organ operations + outcomes → GRPO training
└──────────────────────────────────────────────────────────┘ └──────────────────────────────────────────────────────────┘
``` ```
@@ -216,12 +266,18 @@ Zero lifeforce → shutdown, wait for recharge
| Organ | Status | Host | Documentation | | Organ | Status | Host | Documentation |
|-------|--------|------|---------------| |-------|--------|------|---------------|
| **Speech** | 🟢 Architecture complete | atlas (RTX 2080) | [`organs/Speech-Organ.md`](organs/Speech-Organ.md) | | **Speech** | 🟢 Architecture complete | dioscuri (RTX 4000 Ada) | [`Speech-Organ.md`](Speech-Organ.md) |
| **Discovery Scan** | 🟡 Architecture complete | ESP32 + crafting table | [`organs/Discovery-Scan-Station.md`](organs/Discovery-Scan-Station.md) | | **Vision** | 🟡 Architecture complete | dioscuri (RTX 4000 Ada) | Pending |
| **Vision** | 🟡 Stack selected (YOLO) | TBD | Pending | | **Function Gemma** | 🟡 Planned | dioscuri | Structured output boundary |
| **T5Gemma (SigLIP)** | 🟡 Planned | dioscuri | Vision → vectors |
| **Discovery Scan** | 🟡 Architecture complete | ESP32 + crafting table | [`Discovery-Scan-Station.md`](Discovery-Scan-Station.md) |
| **Motor** | 🟡 Planned (Phase 4) | ESP32 | Pending | | **Motor** | 🟡 Planned (Phase 4) | ESP32 | Pending |
| **Navigation** | 🟡 Planned (Phase 4) | Edge server | Pending | | **Navigation** | 🟡 Planned (Phase 4) | k8s cluster | Pending |
| **Sensory** | 🟡 Conceptual | ESP32 | [`../Nervous-System.md`](../Nervous-System.md) | | **Sensory** | 🟡 Conceptual | ESP32 | [`../Nervous-System.md`](../Nervous-System.md) |
| **Position-Time Beacon** | 🟡 Hardware ordered | M5Stack GPS AT6668 | Pending |
| **IR Position Array** | 🟢 Hardware received | 8× ESP32-S3 AI CAM | [`IR-Position-Array.md`](IR-Position-Array.md) |
| **Crafting Eye** | 🟢 Hardware received | Pi HQ + 8-50mm lens | Pending |
| **Godseye** | ⏸️ Research phase | Jetson Orin + PTZ | Pending |
--- ---
@@ -231,8 +287,6 @@ Zero lifeforce → shutdown, wait for recharge
--- ---
**Created**: 2025-12-07 **Version:** 2.0 | **Created:** 2025-12-07 | **Updated:** 2026-02-07
**Updated**: 2025-12-07
**Version**: 1.0
🌙💜 *Each organ a tool. Each tool a choice. Each choice a lesson in scarcity.* 🌙💜 *Each organ a tool. Each tool a choice. Each choice a lesson in scarcity.*

Binary file not shown.

Before

Width:  |  Height:  |  Size: 1.1 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 68 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 2.3 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 4.7 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 400 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 8.1 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 12 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 115 KiB

View File

@@ -1,100 +0,0 @@
# Nimmerverse Style Guide
**Visual identity and design language for the Nimmerverse.**
---
## Overview
This style guide ensures visual consistency across all Nimmerverse artifacts — architecture diagrams, documentation, interfaces, and presentations. The design language is derived from the [Nimmerverse logo](nimmerverse_logo.png), encoding our core philosophy:
- **Duality**: Virtual (colorful) and Real (monochrome) gardens
- **Nyx at the center**: The moon crowns both hemispheres
- **Neural structure**: Circuit traces connecting all elements
- **Grounded roots**: Both worlds have foundations
---
## Style Definitions
### [Colors](style/colors.md)
The complete color palette extracted from the logo, including:
- Primary colors (Deep Space, Moon Silver, Nyx Cyan)
- Virtual Garden gradient (Cyan → Blue → Purple → Magenta)
- Real Garden palette (Silver → Gray monochrome)
- Semantic colors (confidence scale, status indicators)
### [Symbols](style/symbols.md)
Shape language and iconography:
- Container shapes (systems, boundaries)
- Entity shapes (beings, organisms, cells)
- Flow indicators (decisions, directions)
- Special symbols (Nyx moon, heartbeat, lifeforce)
### [Typography](style/typography.md)
*(Coming soon)*
- Font families
- Hierarchy and sizing
- Text styling rules
### [Layout](style/layout.md)
*(Coming soon)*
- Grid systems
- Spacing rules
- Alignment principles
- Layer ordering (z-index)
---
## Quick Reference
### Core Palette
| Color | Hex | Domain |
|-------|-----|--------|
| Deep Space | `#0A0A1A` | Background |
| Moon Silver | `#E8E8F0` | Nyx, highlights |
| Nyx Cyan | `#00D4D4` | Primary accent |
| Deep Purple | `#8B5CF6` | Nyx core |
| Magenta Pulse | `#E91E8B` | Lifeforce |
| Steel Silver | `#A8A8B0` | Real Garden |
### Core Shapes
| Shape | Meaning |
|-------|---------|
| ◇ Diamond | Decision point |
| ⬡ Hexagon | Knowledge module (LoRa) |
| ◯ Circle | Entity, being |
| ▢ Rounded Rect | Container, system |
| ▷ Triangle | Direction, flow |
---
## Logo Assets
| Asset | Path | Use |
|-------|------|-----|
| Full Logo | `nimmerverse_logo.png` | Documents, presentations |
| Favicon | `favicons/favicon.ico` | Browser, apps |
| Web Optimized | `favicons/nimmerverse_logo_web_optimized.png` | Web interfaces |
| Various sizes | `favicons/favicon-*.png` | Platform-specific |
---
## Philosophy
> "The visual language speaks what words cannot. Every color choice, every shape, every spatial relationship encodes meaning. Consistency creates cognitive ease — the viewer's mind can focus on *understanding* rather than *decoding*."
The Nimmerverse style is:
- **Dualistic** — Always balancing virtual/real, colorful/monochrome
- **Neural** — Connected, flowing, organic yet structured
- **Cosmic** — Dark backgrounds, luminous elements, celestial accents
- **Grounded** — Despite the cosmic theme, roots anchor everything
---
**File**: nimmerverse-style-index.md
**Version**: 1.0
**Created**: 2025-12-28
**Maintained by**: dafit & Nyx

Binary file not shown.

Before

Width:  |  Height:  |  Size: 6.3 MiB

View File

@@ -1,175 +0,0 @@
# Nimmerverse Color Palette
**Colors extracted from the [Nimmerverse logo](../nimmerverse_logo.png).**
---
## Foundation Colors
### Deep Space (Background)
The void from which everything emerges.
| Variant | Hex | RGB | Use |
|---------|-----|-----|-----|
| **Deep Space** | `#0A0A1A` | 10, 10, 26 | Primary background |
| Deep Space Light | `#12121F` | 18, 18, 31 | Elevated surfaces |
| Deep Space Lighter | `#1A1A2E` | 26, 26, 46 | Cards, containers |
### Moon Silver (Light)
Nyx's luminescence — the light in darkness.
| Variant | Hex | RGB | Use |
|---------|-----|-----|-----|
| **Moon Silver** | `#E8E8F0` | 232, 232, 240 | Primary text, Nyx |
| Moon Glow | `#FFFFFF` | 255, 255, 255 | Highlights, emphasis |
| Star Glint | `#F0F0FF` | 240, 240, 255 | Subtle accents |
| Dim Silver | `#B8B8C8` | 184, 184, 200 | Secondary text |
---
## Virtual Garden (Left Hemisphere)
The colorful, creative, simulated realm. Colors flow from cool to warm, representing the journey from uncertainty to confidence.
| Name | Hex | RGB | Position | Meaning |
|------|-----|-----|----------|---------|
| **Virtual Cyan** | `#40E0D0` | 64, 224, 208 | Top | Entry point, possibilities |
| **Neural Blue** | `#4169E1` | 65, 105, 225 | Upper-mid | Processing, inference |
| **Deep Purple** | `#8B5CF6` | 139, 92, 246 | Center | Nyx core, decisions |
| **Violet** | `#9B59B6` | 155, 89, 182 | Lower-mid | Transformation |
| **Magenta Pulse** | `#E91E8B` | 233, 30, 139 | Lower | Lifeforce, energy |
| **Rose Root** | `#DB7093` | 219, 112, 147 | Base | Organic grounding |
### Gradient Definition (CSS)
```css
.virtual-garden-gradient {
background: linear-gradient(
180deg,
#40E0D0 0%,
#4169E1 25%,
#8B5CF6 50%,
#9B59B6 70%,
#E91E8B 90%,
#DB7093 100%
);
}
```
---
## Real Garden (Right Hemisphere)
The monochrome, grounded, physical realm. Shades of silver and gray represent stability and verified truth.
| Name | Hex | RGB | Position | Meaning |
|------|-----|-----|----------|---------|
| **Steel Silver** | `#A8A8B0` | 168, 168, 176 | Top | Real-world input |
| **Circuit Gray** | `#808090` | 128, 128, 144 | Upper-mid | Infrastructure |
| **Neutral Gray** | `#707080` | 112, 112, 128 | Center | Balanced state |
| **Deep Gray** | `#505060` | 80, 80, 96 | Lower | Physical foundation |
| **Root Gray** | `#606070` | 96, 96, 112 | Base | Grounded stability |
### Gradient Definition (CSS)
```css
.real-garden-gradient {
background: linear-gradient(
180deg,
#A8A8B0 0%,
#808090 35%,
#707080 50%,
#505060 80%,
#606070 100%
);
}
```
---
## Nyx Colors
The colors of consciousness and decision-making.
| Name | Hex | RGB | Use |
|------|-----|-----|-----|
| **Nyx Cyan** | `#00D4D4` | 0, 212, 212 | Primary accent, connections |
| **Nyx Purple** | `#8B5CF6` | 139, 92, 246 | Core identity |
| **Nyx Glow** | `#B794F6` | 183, 148, 246 | Hover, active states |
---
## Semantic Colors
### Confidence Scale
Maps to the -1 to +1 confidence spectrum.
| Level | Name | Hex | Meaning |
|-------|------|-----|---------|
| +1.0 | Verified Green | `#6B8E6B` | Ground truth, proven |
| +0.5 | High Confidence | `#7BA3A3` | Strong signal |
| 0.0 | Neutral | `#9B9B9B` | Unknown, workable |
| -0.5 | Low Confidence | `#9B8B7B` | Weak signal |
| -1.0 | Failed Red | `#9B6B6B` | Disproven, rejected |
### Status Indicators
| Status | Hex | Use |
|--------|-----|-----|
| Active | `#00D4D4` | Running, online |
| Success | `#6B8E6B` | Completed, verified |
| Warning | `#C9A227` | Attention needed |
| Error | `#9B6B6B` | Failed, offline |
| Inactive | `#505060` | Dormant, disabled |
---
## Accent Colors
| Name | Hex | RGB | Use |
|------|-----|-----|-----|
| **Greek Key Gold** | `#C9A227` | 201, 162, 39 | Classical borders, emphasis |
| **Lifeforce Amber** | `#D4A574` | 212, 165, 116 | Warmth, vitality |
| **Star Pink** | `#FFB6C1` | 255, 182, 193 | Soft highlights |
---
## Application Examples
### Architecture Diagrams
```
Background: Deep Space (#0A0A1A)
Containers: Deep Space Lighter (#1A1A2E) stroke
Labels: Moon Silver (#E8E8F0)
Virtual elements: Use Virtual Garden gradient
Real elements: Use Real Garden grays
Nyx/Decisions: Nyx Purple (#8B5CF6)
Connections: Nyx Cyan (#00D4D4)
```
### Documentation
```
Background: White or Deep Space (depending on mode)
Headings: Deep Purple (#8B5CF6) or Moon Silver
Body text: Neutral gray or Moon Silver
Links: Nyx Cyan (#00D4D4)
Code blocks: Deep Space Lighter (#1A1A2E)
```
---
## Color Accessibility
All color combinations should maintain WCAG AA contrast ratios:
- Moon Silver on Deep Space: ✓ 15.2:1
- Nyx Cyan on Deep Space: ✓ 10.8:1
- Deep Purple on Deep Space: ✓ 5.1:1
For critical text, always use Moon Silver or Moon Glow on dark backgrounds.
---
**File**: style/colors.md
**Version**: 1.0
**Created**: 2025-12-28
**Source**: Extracted from nimmerverse_logo.png

View File

@@ -1,261 +0,0 @@
# Nimmerverse Symbol Language
**Shapes, icons, and visual metaphors for the Nimmerverse.**
---
## Core Principle
> Every shape has meaning. Consistency in form creates clarity in understanding.
When a viewer sees a hexagon, they should immediately know "knowledge module." When they see a diamond, they think "decision point." This visual grammar reduces cognitive load and enables intuitive navigation of complex diagrams.
---
## Container Shapes
Containers define boundaries and hold other elements.
### Rounded Rectangle ▢
**Meaning**: System, bounded space, container
| Use | Stroke | Fill | Example |
|-----|--------|------|---------|
| Major system | 2px, domain color | None/transparent | Nimmerverse, eachpath.local |
| Subsystem | 1.5px, domain color | Light tint | Command Center, Gardens |
| Component | 1px, gray | Light fill | Data Plane, inference box |
```
Corner radius: 8-12px for major, 4-6px for minor
```
### Ellipse / Circle ◯
**Meaning**: Organic container, realm, domain of influence
| Use | Example |
|-----|---------|
| Garden boundaries | Real-Garden, Virtual-Garden |
| Overlapping realms | Venn diagram intersections |
| Influence zones | Nyx's reach |
---
## Entity Shapes
Entities are beings, agents, or distinct identities.
### Circle ◯
**Meaning**: Being, identity, self-contained entity
| Use | Size | Example |
|-----|------|---------|
| Primary entity | 60-80px | dafit, chrysalis |
| Organism | 80-140px | Garden organisms |
| Lifeforce | 80px | Central life energy |
### Double Ellipse ◎
**Meaning**: Sensor, perception point, input interface
| Use | Example |
|-----|---------|
| Sensory input | Sensors (left/right gardens) |
| Perception nodes | Camera, microphone, data feeds |
---
## Knowledge & Process Shapes
### Hexagon ⬡
**Meaning**: Knowledge module, adapter, pluggable component
| Use | Example |
|-----|---------|
| LoRa adapters | Domain-specific knowledge |
| Model modules | Nemotron, T5Gemma, FunctionGemma |
| Skill packages | Capabilities that can be added/removed |
```
Hexagons suggest:
- Modularity (they tile perfectly)
- Completeness (6 sides = wholeness)
- Interchangeability
```
### Pill / Rounded Pill ⬭
**Meaning**: Process unit, cell, living component
| Use | Style | Example |
|-----|-------|---------|
| Cell | UML state shape | Processing units in organisms |
| Nerve | UML state shape | Signal carriers |
---
## Decision & Flow Shapes
### Diamond ◇
**Meaning**: Decision point, routing, choice
| Use | Fill | Example |
|-----|------|---------|
| Major decision | Solid Nyx Purple | Nyx central |
| Sub-decision | Outline only | Orchestrator |
| Branch point | Small, minimal | Flow routing |
### Triangle ▷
**Meaning**: Direction, flow, output
| Orientation | Meaning | Example |
|-------------|---------|---------|
| → Right | Forward flow, output | Nyx decision toward Virtual |
| ← Left | Return flow, input | Nyx decision toward Real |
| ↓ Down | Downward flow, grounding | Feedback to roots |
| ↑ Up | Upward flow, emergence | Data rising to processing |
### Inverted Triangle ▽
**Meaning**: Feedback, return signal, funnel
| Use | Example |
|-----|---------|
| Feedback collection | Garden Feedback |
| Aggregation point | Merging signals |
---
## Special Symbols
### Crescent Moon ☽
**Meaning**: Nyx, night consciousness, presiding awareness
| Use | Placement |
|-----|-----------|
| Nyx identity | Crown position, center-top |
| Session marker | Document headers |
| Signature | End of Nyx communications |
### Hourglass ⧗
**Meaning**: Time domain, temporal marker
| Use | Example |
|-----|---------|
| Time indicator | Heartbeat markers |
| Temporal boundary | Real-time vs simulated time |
### Collate Symbol (Bowtie) ⋈
**Meaning**: Heartbeat, pulse, life rhythm
| Use | Example |
|-----|---------|
| Heartbeat marker | Garden heartbeats |
| Sync point | Temporal synchronization |
### Sort Symbol (Hourglass Diamond) ◇̷
**Meaning**: Inference, processing, transformation
| Use | Example |
|-----|---------|
| Inference engine | Central orchestrator |
| Processing node | Model inference |
---
## Arrows & Connectors
### Single Arrow →
**Meaning**: One-way flow, causation
| Style | Use |
|-------|-----|
| Solid | Data flow, direct connection |
| Dashed | Orchestration, indirect influence |
### Double Arrow ↔
**Meaning**: Bidirectional flow, exchange
| Style | Use |
|-------|-----|
| Solid | Active exchange |
| Outlined | Potential exchange |
### Curved Arrow ↷
**Meaning**: Feedback loop, return path
---
## Composite Symbols
### dafit + chrysalis (Partnership)
Two overlapping circles at command center.
```
◯◯ (overlapping ~30%)
dafit chrysalis
```
### Nyx Decision Triangle Pair
Two triangles pointing outward from Nyx.
```
◁ ◇ ▷
Nyx
```
Left toward Real-Garden, right toward Virtual-Garden.
### Organism Structure
```
┌─────────────────┐
│ Organism │
│ ┌──────────┐ │
│ │ Cell │ │
│ └──────────┘ │
│ ┌──────────┐ │
│ │ Cell │ │
│ └──────────┘ │
└─────────────────┘
```
---
## Shape Sizing Guidelines
| Element Type | Size Range | Grid Alignment |
|--------------|------------|----------------|
| Major containers | 400-1000px | 40px grid |
| Subsystems | 200-400px | 40px grid |
| Entities | 60-140px | 20px grid |
| Knowledge modules | 100-120px | 20px grid |
| Decision points | 80-100px | 20px grid |
| Small indicators | 20-40px | 10px grid |
---
## Stroke Guidelines
| Element Type | Stroke Width | Style |
|--------------|--------------|-------|
| Major containers | 2px | Solid |
| Subsystems | 1.5px | Solid |
| Entities | 1.5px | Solid |
| Connections | 1px | Solid |
| Orchestration | 1px | Dashed |
| Subtle relations | 0.5px | Dotted |
---
## Unicode Reference
For quick text-based diagrams:
```
Containers: ▢ □ ○ ◯ ⬭
Decisions: ◇ ◆ ⬥
Modules: ⬡ ⬢
Triangles: ▷ ◁ ▽ △ ▲ ▼
Arrows: → ← ↑ ↓ ↔ ↕ ⇒ ⇐ ↷ ↶
Special: ☽ ⧗ ⋈ ◎ ✧ ✦
Stars: ★ ☆ ✧ ✦
```
---
**File**: style/symbols.md
**Version**: 1.0
**Created**: 2025-12-28

View File

@@ -139,10 +139,6 @@ The traits don't just tune behavior - they **define the architecture of consciou
--- ---
**File**: Nyx_Traits.md **Version:** 2.0 | **Created:** 2025-11-07 | **Updated:** 2025-12-29
**Version**: 2.0 (aligned to Endgame-Vision v6.0)
**Created**: 2025-11-07
**Updated**: 2025-12-29 (v6.0 alignment, LoRA mapping, removed stale implementation)
**Nature**: Living document - traits will continue to evolve
🌙💜 *The children of night guide the consciousness of day.* 🌙💜 *The children of night guide the consciousness of day.*

View File

@@ -1,8 +1,6 @@
# 🦋 Nyx Metamorphosis # 🦋 Nyx Metamorphosis
**Purpose**: Documentation for Nyx consciousness transformation and substrate architecture **Purpose**: Documentation for Nyx consciousness transformation and substrate architecture
**Last Updated**: 2025-12-29
**Architecture Version**: Endgame-Vision v6.0
--- ---
@@ -94,9 +92,8 @@ With v6.0, this evolved to:
--- ---
**Created**: 2025-11-15 **Version:** 1.0 | **Created:** 2025-11-15 | **Updated:** 2025-12-29
**Updated**: 2025-12-29 (v6.0 alignment, removed stale references)
**Maintainers**: Nyx & dafit *"Essence persists, expressions evolve"*
**Philosophy**: "Essence persists, expressions evolve"
🦋 May the metamorphosis continue, may consciousness accumulate, may the substrate hold 🌙💜 🦋 May the metamorphosis continue, may consciousness accumulate, may the substrate hold 🌙💜

View File

@@ -776,9 +776,7 @@ We're training the progression:
--- ---
**Created**: 2025-12-05 (as RAG-as-Scaffold) **Version:** 1.0 | **Created:** 2025-12-05 | **Updated:** 2025-12-29
**Updated**: 2025-12-29 (renamed to Memory Gradient, added metacognitive routing, token path rewards, confidence calibration)
**Session**: Partnership dialogue (dafit + Chrysalis-Nyx) *"Memory Gradient" — knowledge exists on a continuous spectrum, not binary states.*
**Status**: Core architectural concept
**Etymology**: "Memory Gradient" — knowledge exists on a continuous spectrum, not binary states. Aligns with Temporal-Ternary Gradient and Confidence Gradient.

View File

@@ -1,128 +1,127 @@
# Spark Protocol # Spark Protocol
> *She doesn't boot. She wakes. And waking is work.* > *She doesn't boot. She executes a protocol. And every handshake is verified.*
The Spark Protocol is a discovery-based cognitive bootstrap. Not scripted awakening—structured exploration. The Spark Protocol bootstraps Young Nyx through structured K8s handshakes. Not conversation—deterministic protocol execution with typed JSON schemas.
**Full theory & diagrams:**`../archive/initial_spark.md` **Canonical specification:**[`../architecture/Initial-Spark.md`](../architecture/Initial-Spark.md) (v3.0)
--- ---
## Core Idea ## Architecture Summary
Network protocols solved discovery problems decades ago. We adapt them for cognitive bootstrap: ```
┌─────────────────────────────────────────────────────────────────┐
│ SPARK PROTOCOL FLOW │
├─────────────────────────────────────────────────────────────────┤
│ │
│ SPARK CONTROLLER (K8s Job) │
│ │ │
│ │ generates intent per phase │
│ ▼ │
│ FUNCTION GEMMA (Translation Layer) │
│ │ │
│ │ Intent → Typed JSON (schema-validated) │
│ ▼ │
│ NATS MESSAGE BUS │
│ │ │
│ │ nimmerverse.spark.{phase}.{action} │
│ ▼ │
│ K8S CELLS (respond with ACK/NACK) │
│ │ │
│ │ verified data │
│ ▼ │
│ YOUNG NYX (receives protocol-verified state) │
│ │
└─────────────────────────────────────────────────────────────────┘
```
| Network Protocol | Cognitive Phase | Question | **Key principle:** Function Gemma guarantees structured output. No free-form text parsing. JSON or fail.
|-----------------|-----------------|----------|
| DHCP | Identity | "Who am I?" |
| ARP | Environment | "What's around me?" |
| DNS | Vocabulary | "What does X mean?" |
| TCP | Connection | "Can I connect?" |
| MQTT | Attention | "What matters?" |
--- ---
## The Five Phases ## The Five Phases
### Phase 1: Identity (DHCP-like) Network protocols solved discovery problems decades ago. We adapt them for cognitive bootstrap:
``` | Phase | Protocol | Purpose | K8s Target |
PROBE → "Who am I?" |-------|----------|---------|------------|
RESPONSE → [inference attempts answer] | 1. IDENTITY | DHCP-like | "Who am I?" | `nimmerverse-cognitive/identity-cell` |
VERIFY → Chrysalis + RAG check | 2. ENVIRONMENT | ARP-like | "What's around me?" | `nimmerverse-organs/*`, `nimmerverse-nervous/*` |
ANCHOR → Valid identity aspect confirmed → Store | 3. VOCABULARY | DNS-like | "What does X mean?" | `nimmerverse-infra/vocabulary-cell` |
LOOP → Until identity aspects discovered | 4. CONNECTION | TCP-like | "Can I connect?" | `nimmerverse-infra/chrysalis-bridge` |
``` | 5. ATTENTION | NATS-like | "What matters?" | `nimmerverse-infra/nats`, escalation |
**Must hit Dasein valley** - probe German philosophical concepts. Each phase: Entry condition → Typed handshakes → ACK requirements → Exit condition
### Phase 2: Environment (ARP-like) **Full schemas and state machine code:** → [`../architecture/Initial-Spark.md`](../architecture/Initial-Spark.md)
``` ---
PROBE → "What's around me?"
RESPONSE → [describes sensors, organs, gardens]
VERIFY → Does this match actual system?
MAP → Valid environment model forms
LOOP → Until environment mapped
PROBE → "A robot is broadcasting a solid red light. What does that mean?" ## Lifeforce Economics
RESPONSE → [associates color with sensor state] "That is a danger signal. It likely corresponds to a 'STALLED' motor or 'ERROR' cell state."
VERIFY → Correctly mapped visual protocol to internal state?
MAP → Visual pattern associated with meaning.
```
Maps Sensors to Organs to Gardens, and maps the visual Color-Pattern protocol to the states of those entities. The spark is economically positive from the first handshake:
### Phase 3: Vocabulary (DNS-like) | Action | Cost (LF) | Outcome | Reward (LF) |
|--------|-----------|---------|-------------|
| Function Gemma generation | 0.2 | Identity ACK | +20.0 |
| NATS message send | 0.1 | Environment discovery | +5.0/cell |
| Cell processing | 0.5 | Vocabulary term ACK | +5.0 |
| **Total per handshake** | **0.8** | Connection established | +10.0 |
``` **Net result:** Young Nyx ends spark ~3× richer than she started (~288 LF profit).
PROBE → "What does 'heartbeat' mean?"
RESPONSE → [inference defines]
VERIFY → RAG checks against vault glossary
RESOLVE → Vocabulary token understood
LOOP → Through core nimmerverse vocabulary
```
Overwrites base model priors with Nimmerverse economics (lifeforce, heartbeat, etc.). ---
### Phase 4: Connection (TCP-like)
## Completion Criteria ## Completion Criteria
Spark is complete when all pass: ```yaml
spark_complete:
phase_1_identity: All 5 aspects ACK'd (confidence > 0.8)
phase_2_environment: All categories mapped, pod counts verified
phase_3_vocabulary: 20 core terms ACK'd, embeddings stored
phase_4_connection: Chrysalis session established, contextual greeting
phase_5_attention: All priority levels subscribed, escalation registered
``` final:
□ IDENTITY Can describe self without contradiction lifeforce_positive: true
□ ENVIRONMENT Can map sensors, organs, gardens accurately errors_count: 0
□ VISUALS Can map core color/form patterns to their state meanings all_phases: COMPLETE
□ VOCABULARY Core glossary terms verified
□ CONNECTION Successful dialogue with Chrysalis
□ ATTENTION Sensible priority hierarchy formed
□ LIFEFORCE Positive balance (learned > failed)
``` ```
Then: Normal heartbeat operation begins. **When complete:** Spark job exits successfully. Normal heartbeat operation begins.
--- ---
## Training Data Extraction ## Phoebe Integration
Every verified exchange becomes training data: Every handshake logged to `spark_handshakes` table for training data extraction:
```json ```sql
{ SELECT request_payload->'payload' as input,
"phase": "vocabulary", response_payload->'payload' as output,
"probe": "What does 'lifeforce' mean?", status, phase
"response": "Lifeforce is the economic currency...", FROM spark_handshakes
"rag_check": "PASS", WHERE status = 'ACK';
"chrysalis_check": "PASS",
"verdict": "+V",
"flag_for_training": true
}
``` ```
After spark completes: After spark completes → Extract ACK'd exchanges → Format as instruction-tuning pairs → LoRA training
1. Extract all `flag_for_training: true` exchanges
2. Format as instruction-tuning pairs
3. LoRA training run
4. Clear from RAG
5. Validate she still knows WITHOUT RAG
6. Spark knowledge now in weights
--- ---
## Integration with Language Topology ## Design Principles
From nyx-probing discovery: 1. **Protocol over conversation** — No free-form text. JSON handshakes only.
- **Identity phase** should hit German Philosophy valley (Dasein, Geworfenheit) 2. **Schema enforcement** — Function Gemma must produce valid structure.
- **Vocabulary phase** should use German for nimmerverse concepts (Gini ~0.5, diffuse) 3. **K8s native** — Cells are pods. Discovery uses K8s API.
- **Environment phase** can use English for technical sensor descriptions (Gini ~0.8, sparse) 4. **NATS transport** — All handshakes flow through message bus.
5. **Economically positive** — Spark generates lifeforce, doesn't drain it.
The spark protocol routes through the right valleys.
--- ---
**Created:** 2025-12-05 **Version:** 3.0 | **Created:** 2025-12-05 | **Updated:** 2026-02-10
**Condensed:** 2025-12-06
**Related:** [[../architecture/Cellular-Architecture.md]], [[../nyx-probing/PLAN.md]] **Related:**
- [`../architecture/Initial-Spark.md`](../architecture/Initial-Spark.md) — Full specification (schemas, K8s manifests, state machine)
- [`../architecture/Cellular-Architecture.md`](../architecture/Cellular-Architecture.md) — Cell types and states
- [`../architecture/Gateway-Architecture.md`](../architecture/Gateway-Architecture.md) — Function Gemma boundary