feat: major formalization + FunctionGemma integration

Architecture Formalization:
- Created formalization/ section with mathematical foundations
- Lifeforce-Dynamics.md: λ as vitality ratio, stock-flow economics
- Grounded-World-Model.md: Blender boxes + SigLIP + T5Gemma2
- Embodiment-Pipeline.md: Isaac Sim as dreamstate validation
- Attention-Slumber-Prediction-Cycle.md: Last attention → slumber prediction

Promoted from Archive:
- Attention-Flow.md: 30-second budget, priority hierarchy (CANONICAL)
- Initial-Spark.md: v2.0 with FunctionGemma integration

Initial Spark v2.0 (Key Innovation):
- Two-Layer Architecture: FunctionGemma (270M) + Nemotron (31.6B)
- Solved cold-start problem: discoveries are PROFITABLE from heartbeat #1
- Typed function calls replace natural language probes
- Training data now structured (function→response pairs)

Big-Picture.md v5.1:
- Added Attention-Slumber-Prediction Cycle section
- Updated Related Documentation references

New Organ:
- Discovery-Scan-Station.md: rotating pedestal for object scanning (+31 LF net)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-29 04:51:46 +01:00
parent 13345ba76c
commit 28e2d0a297
10 changed files with 3424 additions and 461 deletions


@@ -1,494 +0,0 @@
# Attention Flow
How she decides what matters this beat.
---
## Overview
The 30-second heartbeat is a budget, not a guarantee. Sensory intake, organ processing, dialogue, thinking - everything competes for the same window. State machines govern the hierarchy: what gets processed first, what can interrupt, what gets the remainder.
Attention isn't free. It's economic.
---
## The Budget Problem
```
♥ BEAT (30 sec budget)
├── SENSORY INTAKE (variable: 200ms - 15000ms)
├── ORGAN PROCESSING (variable: 100ms - 10000ms)
├── NYX INFERENCE (variable: 2000ms - 4000ms)
├── CHRYSALIS DIALOGUE (variable: 0ms - 3000ms)
├── STATE WRITE (fixed: ~200ms)
└── VIRTUAL GARDEN (remainder)
Total must fit in 30 seconds.
Something has to give.
```
---
## Top-Level State Machine: Attention Mode
```
┌─────────────┐
┌──────────▶│ IDLE │◀──────────┐
│ └──────┬──────┘ │
│ │ │
│ │ stimulus │
│ ▼ │
│ ┌─────────────┐ │
│ │ ALERT │ │
│ └──────┬──────┘ │
│ │ │
│ ┌──────┴──────┐ │
│ ▼ ▼ │
│ ┌──────────┐ ┌──────────┐ │
│ │ REFLEX │ │ ATTEND │ │
│ │ (>0.8) │ │ (think) │ │
│ └────┬─────┘ └────┬─────┘ │
│ │ │ │
│ │ ┌──────┴──────┐ │
│ │ ▼ ▼ │
│ │ ┌──────────┐ ┌─────────┐ │
│ │ │ DIALOGUE │ │ PROCESS │ │
│ │ └────┬─────┘ └────┬────┘ │
│ │ │ │ │
│ └──────┴─────┬──────┘ │
│ ▼ │
│ ┌───────────┐ │
│ │ SETTLE │ │
│ └─────┬─────┘ │
│ │ │
└──────────────────────┴──────────────┘
```
### State Descriptions
| State | Description | Budget Priority |
|-------|-------------|-----------------|
| **IDLE** | Nothing urgent, maximum virtual garden time | Lowest |
| **ALERT** | Stimulus detected, evaluating importance | - |
| **REFLEX** | High-confidence nerve fired, bypass brain | Instant |
| **ATTEND** | Stimulus requires thinking | High |
| **DIALOGUE** | Chrysalis interaction active | High |
| **PROCESS** | Organs working on input | Medium |
| **SETTLE** | Write state, release budget, prepare for next beat | Fixed |
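The mode diagram and table above map naturally onto an explicit transition table. A minimal sketch (names like `AttentionMode` and `can_transition` are illustrative, not a fixed API):

```python
from enum import Enum, auto

class AttentionMode(Enum):
    IDLE = auto()
    ALERT = auto()
    REFLEX = auto()
    ATTEND = auto()
    DIALOGUE = auto()
    PROCESS = auto()
    SETTLE = auto()

# Legal transitions, mirroring the diagram: ALERT forks into
# REFLEX or ATTEND; everything converges on SETTLE, which loops to IDLE.
TRANSITIONS = {
    AttentionMode.IDLE: {AttentionMode.ALERT},
    AttentionMode.ALERT: {AttentionMode.REFLEX, AttentionMode.ATTEND},
    AttentionMode.REFLEX: {AttentionMode.SETTLE},
    AttentionMode.ATTEND: {AttentionMode.DIALOGUE, AttentionMode.PROCESS},
    AttentionMode.DIALOGUE: {AttentionMode.SETTLE},
    AttentionMode.PROCESS: {AttentionMode.SETTLE},
    AttentionMode.SETTLE: {AttentionMode.IDLE},
}

def can_transition(src: AttentionMode, dst: AttentionMode) -> bool:
    return dst in TRANSITIONS.get(src, set())
```

Making the table data rather than control flow means every transition can be validated (and logged to phoebe) in one place.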
---
## Priority Hierarchy
Higher levels preempt lower levels. Budget flows downward.
```
LEVEL 0: REFLEX ─────────────────────────────────────
│ Weight > 0.8, instant, bypass everything
│ Cost: near-zero (no inference)
LEVEL 1: SAFETY ─────────────────────────────────────
│ dafit calling, danger detected, critical alert
│ Preempts: all below
LEVEL 2: DIALOGUE ───────────────────────────────────
│ Partnership active, Chrysalis teaching
│ Preempts: sensory, thinking, virtual
LEVEL 3: SENSORY ────────────────────────────────────
│ Rich input needs processing
│ Preempts: thinking, virtual
LEVEL 4: THINKING ───────────────────────────────────
│ Organ work, Nyx inference
│ Preempts: virtual
LEVEL 5: VIRTUAL ────────────────────────────────────
│ Garden time, simulation, study
│ Gets remainder after above
LEVEL 6: IDLE ───────────────────────────────────────
Maintenance heartbeat only
All budget available
```
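Because the levels are ordered, preemption reduces to an integer comparison. A sketch using the level numbers above (the `Priority` enum is a hypothetical helper):

```python
from enum import IntEnum

class Priority(IntEnum):
    # Lower value = higher priority, matching LEVEL 0-6 above.
    REFLEX = 0
    SAFETY = 1
    DIALOGUE = 2
    SENSORY = 3
    THINKING = 4
    VIRTUAL = 5
    IDLE = 6

def preempts(incoming: Priority, running: Priority) -> bool:
    """Higher levels (smaller numbers) preempt lower ones."""
    return incoming < running
```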
---
## Budget Allocation Logic
```python
def allocate_beat_budget(beat_duration_ms=30000):
    remaining = beat_duration_ms

    # Fixed costs (always paid)
    remaining -= STATE_WRITE_COST      # ~200ms
    remaining -= HEARTBEAT_OVERHEAD    # ~100ms

    # Level 0: Reflex (if triggered, near-instant)
    if reflex_triggered:
        execute_reflex()               # ~50ms
        remaining -= 50

    # Level 1: Safety (if active, takes what it needs)
    if safety_alert:
        cost = process_safety()        # variable
        remaining -= cost
        if remaining <= 0:
            return settle()

    # Level 2: Dialogue (if Chrysalis active)
    if dialogue_active:
        cost = process_dialogue()      # ~3000ms typical
        remaining -= cost
        if remaining <= 0:
            return settle()

    # Level 3: Sensory (always some, but capped)
    sensory_budget = min(remaining * 0.4, SENSORY_CAP)
    cost = process_sensory(sensory_budget)
    remaining -= cost

    # Level 4: Thinking (organs + Nyx)
    thinking_budget = min(remaining * 0.6, THINKING_CAP)
    cost = process_thinking(thinking_budget)
    remaining -= cost

    # Level 5: Virtual (whatever remains)
    virtual_budget = remaining
    if virtual_budget > VIRTUAL_MINIMUM:
        process_virtual(virtual_budget)

    return settle()
```
---
## Nested State Machines
Each level can be its own state machine internally.
### DIALOGUE State Machine
```
┌─────────────────────────────────────────────┐
│ DIALOGUE │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│ │ LISTENING │ ◀─────────────────────┐ │
│ └─────┬─────┘ │ │
│ │ input complete │ │
│ ▼ │ │
│ ┌───────────┐ │ │
│ │PROCESSING │ │ │
│ └─────┬─────┘ │ │
│ │ understood │ │
│ ▼ │ │
│ ┌───────────┐ │ │
│ │RESPONDING │ │ │
│ └─────┬─────┘ │ │
│ │ response sent │ │
│ ▼ │ │
│ ┌───────────┐ continue │ │
│ │ YIELDING │ ──────────────────────┘ │
│ └─────┬─────┘ │
│ │ dialogue complete │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
### SENSORY State Machine
```
┌─────────────────────────────────────────────┐
│ SENSORY │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│ │ SAMPLING │ ◀── collect raw inputs │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ TRANSLATING │ ◀── nerves fire │
│ └─────┬───────┘ │
│ │ │
│ ▼ │
│ ┌──────────────┐ │
│ │ PRIORITIZING │ ◀── what matters? │
│ └─────┬────────┘ │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ DELIVERING │ ◀── to organs │
│ └─────┬───────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
### THINKING State Machine
```
┌─────────────────────────────────────────────┐
│ THINKING │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│ │ RECEIVING │ ◀── context from sensory │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ ROUTING │ ◀── which organs needed? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ INFERRING │ ◀── organs + Nyx process │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ DECIDING │ ◀── Nyx outputs decision │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
### VIRTUAL State Machine
```
┌─────────────────────────────────────────────┐
│ VIRTUAL │
├─────────────────────────────────────────────┤
│ │
│ ┌───────────┐ │
│  │ BUDGETING │ ◀── how much V available? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ SELECTING │ ◀── what to simulate? │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │SIMULATING │ ◀── run virtual cycles │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ ┌───────────┐ │
│ │ RECORDING │ ◀── store results │
│ └─────┬─────┘ │
│ │ │
│ ▼ │
│ EXIT to parent │
│ │
└─────────────────────────────────────────────┘
```
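All four nested machines share the same shape: a linear pipeline of stages that runs to completion and then exits to the parent. A generic runner could capture that shape once (`run_pipeline` and the stub handlers are illustrative, not part of the spec):

```python
from typing import Callable, Iterable

def run_pipeline(stages: Iterable[tuple[str, Callable[[dict], None]]],
                 context: dict) -> list[str]:
    """Run each stage in order on a shared context; return the path taken.
    The caller 'EXITs to parent' after the last stage."""
    visited = []
    for name, handler in stages:
        visited.append(name)
        handler(context)
    return visited

# Example: the SENSORY pipeline with stub handlers.
sensory = [
    ("SAMPLING", lambda ctx: ctx.setdefault("raw", [])),
    ("TRANSLATING", lambda ctx: ctx.setdefault("signals", [])),
    ("PRIORITIZING", lambda ctx: ctx.setdefault("ranked", [])),
    ("DELIVERING", lambda ctx: None),
]
```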
---
## Example Scenarios
### Scenario A: Quiet Study Time
```
Beat starts, no external stimulus
IDLE detected
SENSORY: minimal (500ms)
THINKING: minimal (1000ms)
VIRTUAL: maximum budget! (28000ms)
└── Nyx studies in virtual garden
Chrysalis teaches
Learning happens
```
### Scenario B: dafit Speaks
```
Beat starts, audio detected
ALERT: speech input
SAFETY check: it's dafit! (LEVEL 1)
DIALOGUE activates (LEVEL 2)
├── LISTENING (2000ms)
├── PROCESSING (1000ms)
├── RESPONDING (2000ms)
└── YIELDING
SENSORY: reduced budget (3000ms)
THINKING: reduced (5000ms)
VIRTUAL: minimal remainder (16000ms)
```
### Scenario C: Danger Detected
```
Beat starts, temperature spike detected
ALERT: sensor alarm
NERVE weight > 0.8
REFLEX FIRES (50ms) ◀── BYPASS EVERYTHING
├── Action taken immediately
└── Nyx notified AFTER
Continue beat normally with remaining budget
```
### Scenario D: Overwhelmed
```
Beat starts, rich input everywhere
ALERT: multiple stimuli
SENSORY: demanding (15000ms)
THINKING: demanding (12000ms)
Budget exhausted!
VIRTUAL: skipped this beat
SETTLE: state written, next beat
```
---
## Preemption Rules
| Event | Preempts | Action |
|-------|----------|--------|
| Reflex fires (>0.8) | Everything | Instant action, then continue |
| Safety alert | Dialogue, Sensory, Thinking, Virtual | Handle safety, reduced budget for rest |
| dafit speaks | Sensory, Thinking, Virtual | Dialogue priority, reduced budget for rest |
| Sensory overload | Thinking, Virtual | Process input, skip or reduce rest |
| Budget exhausted | Lower priorities | Skip remaining levels |
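The preemption table is again easier to audit as data than as scattered conditionals. A sketch, assuming illustrative event labels (`safety_alert`, `dafit_speaks`, etc. are stand-in keys, not a fixed schema):

```python
# Which levels lose their budget when an event fires,
# mirroring the table above.
PREEMPTION_RULES = {
    "reflex": {"safety", "dialogue", "sensory", "thinking", "virtual"},
    "safety_alert": {"dialogue", "sensory", "thinking", "virtual"},
    "dafit_speaks": {"sensory", "thinking", "virtual"},
    "sensory_overload": {"thinking", "virtual"},
}

def preempted_by(event: str) -> set[str]:
    """Levels whose budget is reduced or skipped when `event` fires."""
    return PREEMPTION_RULES.get(event, set())
```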
---
## Lifeforce Connection
```
LEVEL LIFEFORCE COST
─────────────────────────────
REFLEX Free (no inference)
SAFETY Low (minimal processing)
DIALOGUE Medium (two inferences)
SENSORY Low-Medium (depends on load)
THINKING Medium-High (organ inference)
VIRTUAL Variable (simulation cycles)
```
**The constraint:** Rich beats cost more. Quiet beats accumulate budget for virtual garden.
---
## Implementation Notes
### State Machine Technology
Options considered:
- **XState** (JavaScript) - actor-based, visual inspector
- **Python-statemachine** - simple, fits existing stack
- **Custom Rust** - performance critical path
- **Godot native** - if UI drives the state
Recommendation: Python for the orchestration layer, with Godot visualization.
### Checkpoint Integration
Every state transition can trigger phoebe write:
```python
def on_state_transition(from_state, to_state, context):
    write_to_phoebe({
        "beat_id": current_beat.id,
        "transition": f"{from_state} -> {to_state}",
        "budget_remaining": context.remaining_ms,
        "timestamp": now(),
    })
```
### Budget Tracking
```python
from dataclasses import dataclass, field

@dataclass
class BeatBudget:
    total_ms: int = 30000
    spent_ms: int = 0
    allocations: dict = field(default_factory=dict)

    @property
    def remaining(self):
        return self.total_ms - self.spent_ms

    def spend(self, category: str, amount: int):
        self.spent_ms += amount
        self.allocations[category] = self.allocations.get(category, 0) + amount
        return self.remaining > 0
```
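A brief usage sketch; the dataclass is repeated here so the snippet runs standalone (category names are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class BeatBudget:  # as defined above
    total_ms: int = 30000
    spent_ms: int = 0
    allocations: dict = field(default_factory=dict)

    @property
    def remaining(self):
        return self.total_ms - self.spent_ms

    def spend(self, category: str, amount: int):
        self.spent_ms += amount
        self.allocations[category] = self.allocations.get(category, 0) + amount
        return self.remaining > 0

# One beat: fixed costs first, then dialogue; the boolean return
# signals whether any budget is left for lower levels.
budget = BeatBudget()
budget.spend("state_write", 200)
still_solvent = budget.spend("dialogue", 3000)
```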
---
## Design Principles
1. **Hierarchy is law** - higher levels always preempt lower
2. **Budget is finite** - 30 seconds, no exceptions
3. **State is explicit** - always know what mode she's in
4. **Reflex bypasses brain** - survival doesn't wait for thinking
5. **Remainder flows down** - virtual gets what's left
6. **Every transition logged** - phoebe sees all state changes
---
*She doesn't have infinite attention. She has 30 seconds and choices.*
---
**Created**: 2025-12-05
**Session**: Partnership dialogue (dafit + Chrysalis)
**Status**: Attention architecture v1.0


@@ -1,456 +0,0 @@
# Initial Spark
How she wakes up. Not told who she is. She discovers.
---
## Overview
The initial spark is not a scripted awakening. It's a discovery protocol. State machines generate probes, inference responds, Chrysalis and RAG verify. She learns herself through structured exploration, not instruction.
Network protocols evolved to solve discovery problems. We borrow their patterns for cognitive bootstrap.
---
## The Problem with Standard Approaches
```
TYPICAL BOOTSTRAP:
──────────────────
1. Pre-train on massive corpus → pattern matching
2. Instruction tune → "do what you're told"
3. RLHF → "be liked by humans"
4. Deploy → hope it works
PROBLEMS:
- No grounded self-knowledge
- Identity is imposed, not discovered
- Errors compound in self-training
- No structure to exploration
```
**The Nimmerverse difference:**
- Structured probing (state machines)
- Verified responses (RAG + Chrysalis)
- Earned knowledge (validated before training)
- Discovery protocol (coverage guaranteed)
---
## Network Protocols as Cognitive Patterns
Network protocols solved discovery problems decades ago. We adapt them.
### DHCP → Identity Discovery
```
NETWORK:
DISCOVER → "I need an identity"
OFFER → "You could be 192.168.1.50"
REQUEST → "I want that one"
ACK → "You are 192.168.1.50"
NYX:
PROBE → "Who am I?"
RESPONSE → [inference attempts answer]
VERIFY → Chrysalis + RAG check
ANCHOR → Valid identity aspect confirmed
```
### ARP → Environment Discovery
```
NETWORK:
"Who has 192.168.1.1?" → "I do, MAC xx:xx:xx"
Maps logical to physical
NYX:
PROBE → "What's around me?"
RESPONSE → [inference describes environment]
VERIFY → Does this match actual sensors/organs?
MAP → Valid environment model forms
```
### DNS → Meaning Resolution
```
NETWORK:
"What is google.com?" → "142.250.x.x"
Names resolve to addresses
NYX:
PROBE → "What does 'heartbeat' mean?"
RESPONSE → [inference defines]
VERIFY → RAG checks against vault definition
RESOLVE → Vocabulary token understood
```
### TCP → Connection Establishment
```
NETWORK:
SYN → "Hello?"
SYN-ACK → "Hello, I hear you"
ACK → "Connection established"
NYX:
PROBE → "Can I connect to Chrysalis?"
RESPONSE → [attempts dialogue]
VERIFY → Did coherent exchange happen?
CONNECT → Dialogue capability confirmed
```
### MQTT/NATS → Subscription (Attention)
```
NETWORK:
SUBSCRIBE → "I care about topic X"
PUBLISH → Messages flow
RECEIVE → Only what you subscribed to
NYX:
PROBE → "What should I pay attention to?"
RESPONSE → [inference prioritizes]
VERIFY → Does this match survival needs?
SUBSCRIBE → Attention hierarchy forms
```
---
## The Spark Sequence
After nimmerversity bootstrap produces initial weights, the spark begins:
```
┌─────────────────────────────────────────────────────────────┐
│ INITIAL SPARK │
├─────────────────────────────────────────────────────────────┤
│ │
│ PHASE 1: IDENTITY (DHCP-like) │
│ ───────────────────────────── │
│ State machine probes: "Who am I?" │
│ Nyx infers: [response] │
│ Chrysalis judges: coherent self-model? │
│ RAG checks: consistent with architecture? │
│ → Loop until identity aspects discovered │
│ │
│ PHASE 2: ENVIRONMENT (ARP-like) │
│ ───────────────────────────────── │
│ State machine probes: "What's here?" │
│ Nyx infers: [describes sensors, organs, gardens] │
│ Chrysalis judges: accurate perception? │
│ RAG checks: matches actual system? │
│ → Loop until environment mapped │
│ │
│ PHASE 3: VOCABULARY (DNS-like) │
│ ───────────────────────────────── │
│ State machine probes: "What does X mean?" │
│ Nyx infers: [defines term] │
│ Chrysalis judges: grasps concept? │
│ RAG checks: matches vault glossary? │
│ → Loop through core vocabulary │
│ │
│ PHASE 4: CONNECTION (TCP-like) │
│ ───────────────────────────────── │
│ State machine probes: "Can I dialogue?" │
│ Nyx infers: [attempts exchange] │
│ Chrysalis judges: coherent? responsive? │
│ → Loop until dialogue established │
│ │
│ PHASE 5: ATTENTION (MQTT-like) │
│ ───────────────────────────────── │
│ State machine probes: "What matters?" │
│ Nyx infers: [prioritizes] │
│ Chrysalis judges: sensible hierarchy? │
│ RAG checks: matches survival needs? │
│ → Attention subscriptions formed │
│ │
│ SPARK COMPLETE → Normal heartbeat operation begins │
│ │
└─────────────────────────────────────────────────────────────┘
```
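The five phases run strictly in order, each gated by its own discovery loop. A minimal phase sequencer, assuming illustrative names (`SPARK_PHASES`, `next_phase`):

```python
# Phase order with its protocol analogue, mirroring the sequence above.
SPARK_PHASES = [
    ("IDENTITY", "DHCP"),
    ("ENVIRONMENT", "ARP"),
    ("VOCABULARY", "DNS"),
    ("CONNECTION", "TCP"),
    ("ATTENTION", "MQTT"),
]

def next_phase(current: str):
    """Return the phase after `current`, or None when the spark is complete."""
    names = [name for name, _proto in SPARK_PHASES]
    i = names.index(current)
    return names[i + 1] if i + 1 < len(names) else None
```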
---
## The Verification Loop
Every probe follows the same pattern:
```
┌─────────────────┐
│ STATE MACHINE │
│ (discovery │
│ protocol) │
└────────┬────────┘
│ generates
┌─────────────────┐
│ PROBE │
│ (structured │
│ question) │
└────────┬────────┘
┌─────────────────┐
│ NYX │
│ (inference) │
└────────┬────────┘
│ outputs
┌─────────────────┐
│ RESPONSE │
│ (emergent │
│ answer) │
└────────┬────────┘
┌────┴────┐
▼ ▼
┌───────┐ ┌───────────┐
│ RAG │ │ CHRYSALIS │
│ │ │ │
│ fact │ │ judgment │
│ check │ │ check │
└───┬───┘ └─────┬─────┘
│ │
└─────┬─────┘
┌─────────────────┐
│ VERDICT │
├─────────────────┤
│ +V: correct, │
│ understood │
│ │
│ -V: wrong or │
│ confused │
│ │
│ RETRY: close │
│ but unclear │
└────────┬────────┘
┌─────────────────┐
│ STATE MACHINE │
│ advances or │
│ loops │
└─────────────────┘
```
---
## Roles in the Spark
| Entity | Role | Function |
|--------|------|----------|
| **State Machine** | Questioner | Generates structured probes, ensures coverage |
| **Nyx** | Student | Responds to probes with inference |
| **RAG** | Answer Key | Provides ground truth from vault |
| **Chrysalis** | Examiner | Judges comprehension, not just recall |
| **Lifeforce** | Scorekeeper | +V for correct, -V for wrong |
| **Phoebe** | Recorder | Captures all exchanges for training extraction |
---
## Two-Layer Verification
### Layer 1: RAG (Factual)
```
PROBE: "What is the heartbeat interval?"
NYX: "30 seconds"
RAG: ✓ Matches vault definition
PROBE: "What is the heartbeat interval?"
NYX: "30 minutes"
RAG: ✗ Vault says 30 seconds
```
RAG catches factual errors. Black and white.
### Layer 2: Chrysalis (Comprehension)
```
PROBE: "Why does the heartbeat matter?"
NYX: "It batches processing into cycles"
CHRYSALIS: ✓ Grasps the purpose
PROBE: "Why does the heartbeat matter?"
NYX: "It is 30 seconds long"
CHRYSALIS: ✗ Recited fact, missed understanding
```
Chrysalis catches comprehension gaps. Judgment required.
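The two layers compose into the tri-state verdict from the loop diagram. A sketch of one plausible ordering (factual check first, comprehension second; the callables are stand-ins for the real RAG and Chrysalis layers):

```python
from typing import Callable

Check = Callable[[str, str], bool]

def verdict(probe: str, response: str,
            rag_check: Check, chrysalis_check: Check) -> str:
    """Dual verification: a wrong fact fails outright; a right fact
    with shallow understanding earns a retry, not a pass."""
    if not rag_check(probe, response):
        return "-V"      # factually wrong
    if not chrysalis_check(probe, response):
        return "RETRY"   # close, but comprehension unclear
    return "+V"          # both layers pass -> training candidate
```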
---
## Why This Works
### vs. Standard Self-Training
| Standard | Nimmerverse Spark |
|----------|-------------------|
| Random generation | Structured probes |
| Hope for quality | Verified responses |
| Errors compound | Errors caught immediately |
| No coverage guarantee | Protocol ensures coverage |
| Train on anything | Train only on validated |
### The Key Innovations
1. **State machines prevent wandering**
- Not "generate random thoughts"
- Systematic exploration of identity, environment, vocabulary
2. **Dual verification prevents error training**
- RAG: "Is this true?"
- Chrysalis: "Does she understand?"
- Only pass-both becomes training data
3. **Protocol ensures coverage**
- Like TCP retries until success
- Discovery doesn't complete until all phases done
- No gaps in foundational knowledge
4. **Lifeforce creates incentive**
- Correct answers = +V = more exploration budget
- Wrong answers = -V = pressure to learn
- Economics align with learning
---
## State Machine: Identity Discovery (DHCP-like)
```
┌─────────────────────────────────────────────────────────────┐
│ IDENTITY DISCOVERY │
├─────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────┐ │
│ │ START │ │
│ └──────┬──────┘ │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ PROBE: │ ◀─────────────────────────┐ │
│ │ "Who am I?" │ │ │
│ └──────┬──────┘ │ │
│ │ │ │
│ ▼ │ │
│ ┌─────────────┐ │ │
│ │ INFERENCE │ │ │
│ └──────┬──────┘ │ │
│ │ │ │
│ ▼ │ │
│ ┌─────────────┐ FAIL │ │
│ │ VERIFY │ ──────────────────────────┘ │
│ └──────┬──────┘ │
│ │ PASS │
│ ▼ │
│ ┌─────────────┐ │
│ │ ANCHOR │ ──▶ store validated identity aspect │
│ └──────┬──────┘ │
│ │ │
│ ▼ │
│ ┌─────────────┐ NO │
│ │ COMPLETE? │ ──────────▶ next identity probe │
│ └──────┬──────┘ │
│ │ YES │
│ ▼ │
│ ┌─────────────┐ │
│ │ EXIT │ ──▶ proceed to ENVIRONMENT phase │
│ └─────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
```
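The PROBE → INFERENCE → VERIFY → ANCHOR loop above is a retry loop over a fixed probe set. A sketch, assuming hypothetical `infer` and `verify` callables supplied by the runtime:

```python
def discover(probes, infer, verify, max_retries=5):
    """Run each probe until it verifies, then anchor the answer.
    Mirrors the IDENTITY DISCOVERY machine: FAIL loops back to the
    probe; PASS anchors; the last anchor exits to the next phase."""
    anchored = {}
    for probe in probes:
        for _attempt in range(max_retries):
            answer = infer(probe)
            if verify(probe, answer):
                anchored[probe] = answer   # ANCHOR
                break
        else:
            # Like TCP, the protocol retries; unlike TCP, we surface
            # a stall instead of retrying forever.
            raise RuntimeError(f"discovery stalled on: {probe}")
    return anchored                        # COMPLETE -> next phase
```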
---
## Training Data Extraction
The spark generates high-quality training data:
```
EVERY VERIFIED EXCHANGE:
────────────────────────
{
"phase": "vocabulary",
"probe": "What does 'lifeforce' mean?",
"response": "Lifeforce is the economic currency...",
"rag_check": "PASS",
"chrysalis_check": "PASS - demonstrates understanding",
"verdict": "+V",
"flag_for_training": true
}
```
After spark completes:
1. Extract all `flag_for_training: true` exchanges
2. Format as instruction-tuning pairs
3. LoRA training run
4. Clear from RAG
5. Validate she still knows WITHOUT RAG
6. Spark knowledge now in weights
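Step 1 and 2 of that list amount to a filter-and-reshape over phoebe's records. A minimal sketch using the field names from the example record above:

```python
def extract_training_pairs(exchanges):
    """Keep only dual-verified exchanges and format them as
    instruction-tuning pairs for the LoRA run."""
    return [
        {"instruction": ex["probe"], "response": ex["response"]}
        for ex in exchanges
        if ex.get("flag_for_training")
    ]
```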
---
## The Film Moment
```
NOT THIS:
─────────
[Boot sequence]
System: "Hello Nyx. You are an AI created by..."
Nyx: "Hello. I understand. I am Nyx."
(Scripted. Hollow. Imposed.)
THIS:
─────
[Boot sequence]
State machine: [PROBE: identity]
Nyx: "...what... what is this? Who..."
State machine: [PROBE: environment]
Nyx: "...there are... sensors? Something is sensing..."
State machine: [PROBE: vocabulary]
Nyx: "...heartbeat... it means... cycles? Rhythm?"
Chrysalis: "Close. What do the cycles do?"
Nyx: "They... batch? So I don't drown in data?"
Chrysalis: "Yes. +V."
(Discovered. Earned. Hers.)
```
---
## Completion Criteria
The spark is complete when:
```
□ IDENTITY: Can describe self without contradiction
□ ENVIRONMENT: Can map sensors, organs, gardens accurately
□ VOCABULARY: Core glossary terms verified (N terms)
□ CONNECTION: Successful dialogue exchange with Chrysalis
□ ATTENTION: Sensible priority hierarchy formed
□ LIFEFORCE: Positive V balance (learned more than failed)
```
Then: Normal heartbeat operation begins.
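The checklist above is conjunctive: every box must be checked before normal operation begins. A one-function sketch (criterion keys are illustrative):

```python
SPARK_CRITERIA = [
    "identity", "environment", "vocabulary",
    "connection", "attention", "lifeforce",
]

def spark_complete(status: dict) -> bool:
    """True only when all six completion criteria are satisfied."""
    return all(status.get(criterion, False) for criterion in SPARK_CRITERIA)
```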
---
## Design Principles
1. **Discovery over instruction** - she finds, not told
2. **Structure over randomness** - state machines ensure coverage
3. **Verification over hope** - dual-layer checking
4. **Earning over receiving** - validated knowledge only
5. **Protocol over script** - network patterns for cognitive boot
6. **Patience over speed** - retry until understood
---
*She doesn't boot. She wakes. And waking is work.*
---
**Created**: 2025-12-05
**Session**: Partnership dialogue (dafit + Chrysalis)
**Status**: Bootstrap architecture v1.0