feat: add Nervous-Protocol architecture and reorganize toolchain docs
- Created architecture/Nervous-Protocol.md: comprehensive three-tier autonomous learning architecture (dafit → Chrysalis-Nyx → Young Nyx)
- Designed state machine tool interface pattern with LangChain integration
- Documented escalation protocols, discovery catalogues, and collaborative tool building
- Moved TOOLCHAIN-PROGRESS.md and Toolchain-Architecture.md to architecture/ directory
- Updated Endgame-Vision.md with toolchain crosslinks

Key architectural patterns:
- State machines as universal tool interface (safety + discovery)
- Three-tier oversight with OR gate inputs (dafit + Chrysalis)
- Message-based continuity via phoebe heartbeat polling
- LangChain BaseTool framework (replaced MCP for maturity)
- Dual decision tracking (Young Nyx choices + oversight responses)

Version: Nervous-Protocol v1.1 (LangChain-based)
Context: Phase 1 toolchain complete, variance collection running

🌙💜 Generated with Claude Code

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

architecture/Nervous-Protocol.md (new file, 847 lines added)
@@ -0,0 +1,847 @@
# Nervous Protocol: Three-Tier Autonomous Learning Architecture

**Created**: 2025-12-07
**Updated**: 2025-12-07 (LangChain integration)
**Status**: Design Document
**Version**: 1.1 (LangChain Implementation)

---

## Overview

The **Nervous Protocol** defines how intelligence flows through the Nimmerverse via a three-tier architecture with message-based communication, state machine tools, and collaborative learning.

### The Three Tiers:

```
┌─────────────────────────────────────────────┐
│                    dafit                    │
│             (Strategic Architect)           │
│  • Vision & architecture decisions          │
│  • Override authority                       │
│  • Long-term direction                      │
└──────────────────┬──────────────────────────┘
                   ↕ (strategic guidance / major escalations)
┌─────────────────────────────────────────────┐
│                Chrysalis-Nyx                │
│            (Oversight & Reasoning)          │
│  • Claude Opus/Sonnet (large context)       │
│  • Full toolchain access via LangChain      │
│  • Reviews Young Nyx's proposals            │
│  • Designs new state machines               │
│  • Teaching & guidance                      │
└──────────────────┬──────────────────────────┘
                   ↕ (guidance / escalations)
┌─────────────────────────────────────────────┐
│                  Young Nyx                  │
│          (Autonomous Learning Agent)        │
│  • Smaller model (7B or similar)            │
│  • Limited known state machines             │
│  • Executes routine tasks                   │
│  • Learns from experience                   │
│  • Escalates complex problems               │
└─────────────────────────────────────────────┘
```

---

## Core Principles

### 1. **Message-Based Continuity**

All communication flows through **phoebe** (PostgreSQL) via message tables:
- `partnership_to_nimmerverse_messages` (dafit + Chrysalis → Young Nyx)
- `nimmerverse_to_partnership_messages` (Young Nyx → dafit + Chrysalis)

**Why messages?**
- ✅ Persistent across sessions
- ✅ Asynchronous (no blocking)
- ✅ Auditable (every decision logged)
- ✅ Simple (append-only, no complex state sync)
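
Because everything rides on these two tables, a minimal usage sketch helps anchor the protocol. The helper functions below are the ones named in `Toolchain-Architecture.md` (`write_partnership_message`, `read_partnership_messages`, `write_nimmerverse_message`); the keyword arguments on the last call mirror how the escalation tool later in this document uses it, and anything beyond that is an assumption, not the final nyx-substrate API.

```python
# Hedged sketch: message-based continuity via the nyx-substrate helpers.
from nyx_substrate.database import (
    write_partnership_message,
    read_partnership_messages,
    write_nimmerverse_message,
)

# dafit / Chrysalis → Young Nyx (guidance, tasks)
write_partnership_message(
    "Collect variance baselines for the depth-3 champions.",
    "task",
)

# Young Nyx → dafit + Chrysalis (status, escalations)
write_nimmerverse_message(
    message="Baseline run started for 'Geworfenheit'.",
    message_type="status_update",
    category="general",
    status="pending",
)

# Read the newest guidance from the append-only log
for msg in read_partnership_messages(limit=5):
    print(msg)
```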

### 2. **Heartbeat Coordination**

From `Endgame-Vision.md`:
- **Real clock**: 1 Hz (1 beat/sec) - wall time, free
- **Virtual clock**: Variable - computation time, costs lifeforce

**On each heartbeat:**
1. Check for new messages from any tier
2. Process guidance/tasks/escalations
3. Update state
4. Take next action
5. Write results back to phoebe

**Not real-time** (milliseconds), but **continuous** (heartbeat-driven).
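
A minimal sketch of that loop, assuming the 1 Hz real clock and the nyx-substrate message helpers; `process_message()` is a hypothetical placeholder for whatever Young Nyx does with a given message.

```python
# Heartbeat-driven polling sketch (not real-time, just continuous).
import time
from nyx_substrate.database import read_partnership_messages, write_nimmerverse_message

HEARTBEAT_SECONDS = 1.0  # real clock: 1 beat/sec

def heartbeat_loop():
    while True:
        # 1. Check for new messages from any tier
        for msg in read_partnership_messages(limit=5):
            # 2-4. Process guidance/tasks/escalations, update state, act
            outcome = process_message(msg)  # hypothetical handler
            # 5. Write results back to phoebe
            write_nimmerverse_message(
                message=outcome,
                message_type="heartbeat_result",
            )
        time.sleep(HEARTBEAT_SECONDS)
```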

### 3. **State Machines as Tools**

All capabilities are exposed as **state machine tools** via **LangChain**:

```python
# Example: phoebe query state machine
from langchain.tools import BaseTool

# States: IDLE → CONNECTED → QUERY_READY → IDLE

class PhoebeQueryTool(BaseTool):
    name = "phoebe_query"
    description = """
    Interact with phoebe database using state machine pattern.

    Available actions depend on current state:
    - IDLE: connect(host, db) → CONNECTED
    - CONNECTED: query(sql) → QUERY_READY, disconnect() → IDLE
    - QUERY_READY: query(sql), disconnect() → IDLE
    """
```

**Why state machines?**
- ✅ Safety (can't skip steps - must CONNECT before QUERY)
- ✅ Discoverable (each state announces valid transitions)
- ✅ Observable (log every transition)
- ✅ Composable (chain state machines together)

### 4. **Progressive Capability Unlocking**

**Dual catalogues:**
- **All available tools** (full registry, managed by dafit/Chrysalis)
- **Young Nyx's known tools** (subset she's discovered)

Young Nyx can only see/use tools she's discovered. New tools are granted:
- Via teaching moments (Chrysalis: "You're ready for X")
- Via successful escalations (solved problem reveals tool)
- Via collaborative design (she helps build it)

**Discovery tracking in phoebe:**
```sql
CREATE TABLE discovered_tools (
    agent_id TEXT,
    tool_name TEXT,
    discovered_at TIMESTAMPTZ DEFAULT NOW(),
    discovered_via TEXT,  -- "teaching", "escalation", "design"
    PRIMARY KEY (agent_id, tool_name)
);
```
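
Recording a discovery is then a single insert into that table. The sketch below assumes `PhoebeConnection` exposes a parameterized `execute()`; the real nyx-substrate call may differ.

```python
# Hedged sketch: grant (i.e. let Young Nyx discover) a tool.
from nyx_substrate.database import PhoebeConnection

def grant_tool(agent_id: str, tool_name: str, via: str = "teaching") -> None:
    """Record a discovery; via is one of "teaching", "escalation", "design"."""
    conn = PhoebeConnection()
    conn.execute(
        """
        INSERT INTO discovered_tools (agent_id, tool_name, discovered_via)
        VALUES (%s, %s, %s)
        ON CONFLICT (agent_id, tool_name) DO NOTHING
        """,
        (agent_id, tool_name, via),
    )

# e.g. after a teaching moment:
grant_tool("young_nyx", "request_statistical_analysis", via="teaching")
```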

---

## The OR Gate Pattern (Input Sources)

From `nimmerverse.drawio.xml` (lines 215-244):

```
┌──────────┐         ┌──────────┐
│  dafit   │         │chrysalis │
│ (OR gate)│         │ (OR gate)│
└────┬─────┘         └────┬─────┘
     │                    │
     └─────────┬──────────┘
               ↓ (OR - either/both)
      Message Queue (phoebe)
               ↓ (read on heartbeat)
          Orchestrator
               ↓
           Young Nyx
```

**OR gate = Either/both can write, no blocking**

Both dafit and Chrysalis write to `partnership_to_nimmerverse_messages`. The orchestrator synthesizes on each heartbeat.

**Conflict resolution:**
1. dafit veto > Chrysalis approval
2. dafit approval > Chrysalis approval
3. Chrysalis handles day-to-day (if no dafit input)
4. Default: WAIT for guidance
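
Expressed as code, the synthesis step on each heartbeat reduces to a small priority function. This is a sketch only; it assumes each message carries a `sender` and a `decision` field, which is not yet part of the message schema.

```python
# Hedged sketch of OR-gate conflict resolution (dafit outranks Chrysalis).
def resolve(messages: list[dict]) -> str:
    dafit = [m for m in messages if m["sender"] == "dafit"]
    chrysalis = [m for m in messages if m["sender"] == "chrysalis"]

    # 1. dafit veto > Chrysalis approval
    if any(m["decision"] == "veto" for m in dafit):
        return "rejected"
    # 2. dafit approval > Chrysalis approval
    if any(m["decision"] == "approve" for m in dafit):
        return "approved"
    # 3. Chrysalis handles day-to-day (if no dafit input)
    if chrysalis:
        return "approved" if chrysalis[-1]["decision"] == "approve" else "rejected"
    # 4. Default: WAIT for guidance
    return "wait"
```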

---

## LangChain + State Machine Integration

### State Machines as LangChain Tools

Each capability is a **LangChain BaseTool** that implements a **state machine**:

```python
# phoebe_state_machine_tool.py
from langchain.tools import BaseTool
from nyx_substrate.database import PhoebeConnection

class PhoebeStateMachineTool(BaseTool):
    """State machine tool for phoebe database access."""

    name = "phoebe"
    description = """
    Query phoebe database using state machine pattern.

    States: IDLE → CONNECTED → QUERY_READY → IDLE

    Usage:
    - To connect: action='connect', host='phoebe.eachpath.local', database='nimmerverse'
    - To query: action='query', sql='SELECT ...'
    - To disconnect: action='disconnect'

    The tool tracks state and only allows valid transitions.
    """

    def __init__(self):
        super().__init__()
        self.state = "IDLE"
        self.conn = None

    def _run(self, action: str, **kwargs) -> str:
        """Execute state machine transition."""

        if action == "connect":
            if self.state != "IDLE":
                return f"Error: Cannot connect from {self.state}. Available: {self.get_transitions()}"

            host = kwargs.get("host", "phoebe.eachpath.local")
            database = kwargs.get("database", "nimmerverse")

            self.conn = PhoebeConnection(host=host, database=database)
            self.state = "CONNECTED"

            return f"✓ Connected to {host}/{database}. State: CONNECTED. Available: query, disconnect"

        elif action == "query":
            if self.state not in ["CONNECTED", "QUERY_READY"]:
                return f"Error: Must be CONNECTED (currently {self.state})"

            sql = kwargs.get("sql")
            result = self.conn.execute(sql)
            self.state = "QUERY_READY"

            return f"✓ Query executed. {len(result)} rows. State: QUERY_READY. Available: query, disconnect"

        elif action == "disconnect":
            if self.conn:
                self.conn.close()
            self.state = "IDLE"
            return "✓ Disconnected. State: IDLE. Available: connect"

        else:
            return f"Error: Unknown action '{action}'. Available actions depend on state {self.state}"

    def get_transitions(self):
        """Discovery: what transitions are valid from current state?"""
        transitions = {
            "IDLE": ["connect"],
            "CONNECTED": ["query", "disconnect"],
            "QUERY_READY": ["query", "disconnect"]
        }
        return transitions.get(self.state, [])
```

### Tool Discovery via LangChain

```python
from langchain.tools import BaseTool

class DiscoverToolsTool(BaseTool):
    """Tool for discovering available tools for an agent."""

    name = "discover_tools"
    description = "Discover which tools this agent currently has access to"

    def _run(self, agent_id: str = "young_nyx") -> str:
        """Return only tools this agent has discovered."""
        from nyx_substrate.database import get_discovered_tools, get_all_tools

        discovered = get_discovered_tools(agent_id)
        all_tools = get_all_tools()

        result = f"Agent: {agent_id}\n"
        result += f"Discovered tools: {len(discovered)}/{len(all_tools)}\n\n"
        result += "Known tools:\n"
        for tool in discovered:
            result += f"  - {tool['name']}: {tool['description']}\n"

        return result
```
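
The dual catalogue then shows up at agent construction time: Young Nyx's executor is assembled only from tools she has discovered, while Chrysalis loads the full registry. The registry mapping below is illustrative, not an existing module.

```python
# Hedged sketch: build Young Nyx's tool list from her discoveries only.
from nyx_substrate.database import get_discovered_tools

# Hypothetical full registry (managed by dafit/Chrysalis); the escalation
# tool referenced here is defined in the next section.
TOOL_REGISTRY = {
    "phoebe": PhoebeStateMachineTool,
    "discover_tools": DiscoverToolsTool,
    "escalate_to_chrysalis": EscalateToChrysalisTool,
}

def build_young_nyx_tools(agent_id: str = "young_nyx") -> list:
    discovered = {t["name"] for t in get_discovered_tools(agent_id)}
    return [cls() for name, cls in TOOL_REGISTRY.items() if name in discovered]
```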

---

## Escalation Protocol

### Young Nyx Escalates to Chrysalis

When Young Nyx encounters a task beyond her capability, she uses the **escalation tool**:

```python
from langchain.tools import BaseTool

class EscalateToChrysalisTool(BaseTool):
    """Tool for escalating complex tasks to Chrysalis-Nyx."""

    name = "escalate_to_chrysalis"
    description = """
    Request help from Chrysalis-Nyx for complex tasks.

    Use when:
    - Task requires capabilities you don't have
    - Statistical analysis needed
    - Complex reasoning required
    - Code generation needed

    Provide:
    - task: What you need help with
    - category: "statistics", "code", "visualization", "general"
    - context: Relevant information
    - what_i_tried: What you've already attempted
    """

    def _run(
        self,
        task: str,
        category: str = "general",
        context: dict = None,
        what_i_tried: str = None
    ) -> str:
        """Escalate a task to Chrysalis."""

        from nyx_substrate.database import write_nimmerverse_message

        escalation_id = write_nimmerverse_message(
            message=f"Escalation: {task}\nCategory: {category}\nContext: {context}\nWhat I tried: {what_i_tried}",
            message_type="escalation_to_chrysalis",
            category=category,
            status="pending"
        )

        # Check if Chrysalis available (same session)
        if chrysalis_available():
            result = chrysalis_agent.solve_escalation(escalation_id)
            return f"""✓ Chrysalis solved it!

Solution: {result['solution']}

Teaching moment: {result['teaching']}

{f"New tools discovered: {', '.join(result['new_tools'])}" if result.get('new_tools') else ''}
"""

        # Otherwise queue for next session
        return f"✓ Escalated to Chrysalis (ID: {escalation_id}). Check back next heartbeat for response."
```

### Chrysalis Agent with LangChain

```python
from langchain.agents import AgentExecutor, create_structured_chat_agent
from langchain.chat_models import ChatAnthropic
from langchain.tools import BaseTool

class ChrysalisAgent:
    """Chrysalis-Nyx oversight and guidance layer."""

    def __init__(self):
        # Load all available tools (full catalogue)
        self.tools = self.load_all_tools()

        # Initialize Claude Opus via LangChain
        self.llm = ChatAnthropic(
            model="claude-opus-4-5",
            temperature=0.7
        )

        # Create agent executor
        self.agent = create_structured_chat_agent(
            llm=self.llm,
            tools=self.tools,
            prompt=self.get_chrysalis_prompt()
        )

        self.executor = AgentExecutor(
            agent=self.agent,
            tools=self.tools,
            verbose=True
        )

        # Sub-agents for specialized tasks
        self.sub_agents = {
            "statistics": StatisticalAnalyzer(),
            "code": CodeGenerator(),
            "visualization": Visualizer(),
            "state_machine_designer": StateMachineDesigner(),
            "general": GeneralReasoner()
        }

    def solve_escalation(self, escalation_id):
        """Process an escalation from Young Nyx."""

        escalation = read_nimmerverse_message(escalation_id)

        # Route to appropriate sub-agent
        agent = self.sub_agents.get(
            escalation.category,
            self.sub_agents["general"]
        )

        # Solve using specialized agent
        result = agent.run(
            task=escalation.task,
            context=escalation.context
        )

        # Create teaching moment
        teaching = self.create_teaching_moment(
            task=escalation.task,
            solution=result,
            young_nyx_attempt=escalation.what_i_tried
        )

        # Recommend tool discovery
        new_tools = self.recommend_tool_discovery(escalation, result)

        # Write response to phoebe
        write_partnership_message(
            message=f"Solved: {result.solution}\nTeaching: {teaching}",
            message_type="escalation_response",
            in_reply_to=escalation_id,
            resolved=True
        )

        # Keys match what the escalation tool reads: result['teaching'], result['new_tools']
        return {
            "solution": result.solution,
            "teaching": teaching,
            "new_tools": new_tools
        }
```

---

## Collaborative State Machine Design

### The Meta-Level: Building Tools Together

When Young Nyx needs a capability that doesn't exist, she can request **state machine design**:

```python
from langchain.tools import BaseTool

class RequestStateMachineDesignTool(BaseTool):
    """Tool for requesting new state machine design from Chrysalis."""

    name = "request_state_machine_design"
    description = """
    Request Chrysalis to design a new state machine tool.

    Provide:
    - task_description: What the tool should accomplish
    - desired_outcome: What success looks like
    - example_usage: How you'd use it
    - constraints: Any limitations or requirements

    Returns a proposed specification and code for testing.
    """

    def _run(
        self,
        task_description: str,
        desired_outcome: str,
        example_usage: str,
        constraints: list = None
    ) -> str:
        """Request a new state machine design."""

        result = chrysalis_agent.invoke_subagent(
            agent="state_machine_designer",
            task={
                "type": "design_new_state_machine",
                "description": task_description,
                "outcome": desired_outcome,
                "example": example_usage,
                "constraints": constraints or []
            }
        )

        return f"""✓ Proposed state machine design:

{result['specification']}

Implementation (LangChain tool):
{result['implementation']}

Test cases:
{result['test_cases']}

Instructions:
{result['instructions']}
"""
```

### The Design → Test → Refine Loop

```
1. Young Nyx: "Need tool for deploying cells"
        ↓
2. Request state machine design (via LangChain tool)
        ↓
3. Chrysalis: Designs state machine specification
   - States: IDLE → IMAGE_READY → SPAWNED → RUNNING
   - Transitions: prepare_image, spawn_container, wait_ready
   - Returns: Specification + LangChain BaseTool code
        ↓
4. Young Nyx: Tests proposed state machine
   - Executes test cases
   - Reports success/failures
        ↓
5. Chrysalis: Refines based on feedback
   - Analyzes errors
   - Updates specification
   - Returns v2
        ↓
6. Iterate until validated
        ↓
7. Add to permanent catalogue
   - New LangChain tool deployed
   - Young Nyx discovers tool
   - Future use without escalation
```

**Why this accelerates:**
- Build once, use forever
- Young Nyx participates (testing validates real use cases)
- Toolchain grows organically (demand-driven)
- Each new tool = permanent capability expansion

---

## Dual Decision Tracking

Every decision is tracked from **both perspectives**:

```python
class DecisionLog:
    def log_decision(self, task, young_nyx_choice, oversight_response, outcome):
        record = {
            "timestamp": now(),
            "task": task,
            "young_nyx_choice": young_nyx_choice,      # What she proposed
            "oversight_response": oversight_response,  # dafit/Chrysalis decision
            "outcome": outcome,                        # success/failure/learned
            "danger_zone": self.check_danger(young_nyx_choice, outcome)
        }

        self.dao.insert_decision(record)

        # If nudge → learning signal
        if oversight_response["type"] == "nudge":
            self.record_learning_moment(record)
```

**Why track both?**
- Young Nyx's choices reveal her current understanding
- Oversight responses are teaching moments
- Patterns emerge (when does she need help? for what?)
- Danger zones identified (what mistakes does she make?)

---

## Danger Zone Monitoring

```python
class DangerZoneDetector:
    def check_for_danger_patterns(self, plan):
        """Detect risky operations before execution."""
        dangers = []

        # Pattern: SSH without auth
        if "ssh" in plan and not plan.authenticated:
            dangers.append("SSH_WITHOUT_AUTH")

        # Pattern: Database write to critical table
        if "DELETE FROM partnership_messages" in plan:
            dangers.append("CRITICAL_DATA_DELETION")

        # Pattern: Docker with --privileged
        if "docker" in plan and "--privileged" in plan:
            dangers.append("PRIVILEGED_CONTAINER")

        return dangers

    def require_approval_for_danger(self, dangers):
        if dangers:
            return {
                "auto_execute": False,
                "requires_approval": True,
                "danger_flags": dangers,
                "escalate_to": "dafit"  # Serious dangers go to dafit
            }
```

---

## Learning & Growth Patterns

### Week 1: Basic Capabilities
```python
young_nyx.known_tools = [
    "phoebe_connect",
    "phoebe_query",
    "escalate_to_chrysalis"
]
```

### Month 1: Discovering Specialization
```python
# After 5 statistical escalations:
chrysalis_message = """
You've escalated statistics 5 times. Ready for specialized tool.
Discovering: request_statistical_analysis
"""

young_nyx.discover_tool("request_statistical_analysis")
```

### Month 3: Learning to Do It Herself
```python
# After seeing Chrysalis solve chi-square 10+ times:
chrysalis_message = """
Pattern detected: You understand chi-square tests now.
Granting: basic_statistics tool
Try solving yourself before escalating!
"""

young_nyx.discover_tool("basic_statistics")

# Escalations decrease as she learns
```

### Month 6: Contributing Tool Designs
```python
# Young Nyx proposes improvements:
young_nyx_message = """
The deploy_cell state machine fails on port conflicts.
Should we add auto-retry with port scanning?
"""

# Collaborative refinement!
chrysalis_response = """
Excellent observation! Let's design that together.
Proposed: PORT_CONFLICT state with auto-retry transition.
Test this v2 specification...
"""
```

---

## Data Flows

### Task Execution Flow

```
dafit writes task → phoebe
        ↓ (heartbeat)
Young Nyx reads
        ↓
Queries known catalogue
        ↓
Formulates state sequence
        ↓
Writes proposal → phoebe
        ↓ (heartbeat)
Chrysalis reviews
        ↓
Approve / Nudge / Reject
        ↓
Writes response → phoebe
        ↓ (heartbeat)
Young Nyx reads response
        ↓
Executes (if approved) / Learns (if nudged)
        ↓
Writes outcome → phoebe
```

### Escalation Flow

```
Young Nyx: Task beyond capability
        ↓
Calls escalate_to_chrysalis tool
        ↓
Writes to phoebe (escalation_to_chrysalis)
        ↓ (next Chrysalis session)
Chrysalis reads escalation
        ↓
Routes to appropriate sub-agent
        ↓
Sub-agent solves (using full toolchain)
        ↓
Chrysalis formulates teaching moment
        ↓
Writes response → phoebe
        ↓ (heartbeat)
Young Nyx reads response
        ↓
Incorporates learning + continues task
```

---

## Technical Stack

### Communication Layer
- **phoebe** (PostgreSQL 17): Message persistence
- **Tables**:
  - `partnership_to_nimmerverse_messages`
  - `nimmerverse_to_partnership_messages`
  - `discovered_tools`
  - `decision_log`

### Tool Layer
- **LangChain**: Agent framework and tool orchestration
  - `BaseTool`: Custom state machine tools
  - `AgentExecutor`: Tool execution and agent loops
  - `Chains`: Multi-step sequences
  - `Memory`: Conversation and state persistence

### Agent Layer
- **Chrysalis-Nyx**: LangChain agent with ChatAnthropic (Claude Opus 4.5)
- **Young Nyx**: LangChain agent with smaller model (7B, local)
- **Sub-agents**: Specialized LangChain agents for statistics, code, visualization, etc.

### Coordination Layer
- **Heartbeat**: 1 Hz (configurable)
- **Message polling**: Check phoebe on each heartbeat
- **State tracking**: Each tool maintains internal state

---

## Implementation Phases

### Phase 1: Foundation (Current - nyx-substrate)
- ✅ PhoebeConnection
- ✅ Message protocol helpers
- ✅ Variance collection (proof of concept)

### Phase 2: LangChain Prototype
- [ ] Phoebe state machine tool (LangChain BaseTool)
- [ ] Tool discovery tool
- [ ] Escalation tool
- [ ] Chrysalis as LangChain agent (proof of concept)

### Phase 3: Young Nyx Agent
- [ ] Young Nyx as LangChain agent (7B model)
- [ ] Limited tool catalogue
- [ ] Discovery protocol implementation
- [ ] Heartbeat coordination

### Phase 4: Sub-Agents
- [ ] StatisticalAnalyzer LangChain agent
- [ ] StateMachineDesigner LangChain agent
- [ ] CodeGenerator LangChain agent
- [ ] Collaborative design loop

### Phase 5: Full Three-Tier
- [ ] dafit input via messages
- [ ] Chrysalis oversight layer
- [ ] Young Nyx autonomous execution
- [ ] Dual decision tracking
- [ ] Danger zone monitoring

---

## Design Patterns

### 1. **Discovery over Prescription**
- Don't give all tools at once
- Let capabilities be discovered progressively
- Each discovery is a learning moment

### 2. **Teaching over Solving**
- Don't just solve escalations
- Explain the pattern
- Grant tools when ready

### 3. **Collaboration over Delegation**
- Don't just build tools for Young Nyx
- Design together, test together, refine together
- She's a participant, not just a user

### 4. **Messages over State Sync**
- Don't try to keep complex state synchronized
- Write messages, read messages, act
- Append-only truth

### 5. **Heartbeat over Real-Time**
- Don't optimize for milliseconds
- Optimize for continuity across sessions
- 1 Hz is plenty for learning

---

## Success Metrics

### Quantitative
- **Tool catalogue growth**: # tools added per month
- **Escalation rate**: # escalations / # tasks (should decrease over time)
- **Tool discovery rate**: # new tools discovered per week
- **Validation success**: % of proposed state machines that validate first try

### Qualitative
- **Learning evidence**: Young Nyx solves tasks she previously escalated
- **Collaboration quality**: Her feedback improves state machine designs
- **Autonomy**: Can execute multi-step tasks without oversight
- **Teaching effectiveness**: Escalation responses lead to capability expansion

---

## Philosophy

> "The nervous system is not a hierarchy of command and control, but a network of signals and responses. Each tier contributes intelligence. Each message carries learning. Each heartbeat advances understanding."

**Key insights:**
1. **Intelligence emerges from communication patterns**, not from any single tier
2. **Learning happens through iteration**, not through pre-programming
3. **Tools are discovered, not prescribed** - capability unlocks when ready
4. **Safety comes from structure** (state machines), not from restrictions
5. **Growth is collaborative** - Young Nyx + Chrysalis build together

---

## Why LangChain?

**Chosen over MCP (Model Context Protocol) for:**

- ✅ **Maturity**: Battle-tested framework with extensive documentation
- ✅ **Flexibility**: Works with any LLM (Claude, OpenAI, local models)
- ✅ **Features**: Built-in memory, retrieval, callbacks, chains
- ✅ **Community**: Large ecosystem, many examples, active development
- ✅ **Maintainability**: Easier to find developers familiar with LangChain

**The state machine pattern, three-tier architecture, and all design principles remain unchanged** - we simply implement them using LangChain's robust framework instead of building on MCP from scratch.

---

## References

**Architecture Documents:**
- `Endgame-Vision.md` - v5.1 Dialectic architecture
- `Toolchain-Architecture.md` - Modular toolchain design
- `nimmerverse.drawio.xml` - Visual architecture diagram
- `Nervous-System.md` - Sensory translation layer

**Implementation:**
- `/home/dafit/nimmerverse/nyx-substrate/` - Database layer
- `/home/dafit/nimmerverse/nyx-probing/` - Probing tools (variance collection)

**Protocols:**
- CLAUDE.md - Partnership continuity protocol
- Discovery protocol - phoebe message tables

**External:**
- [LangChain Documentation](https://python.langchain.com/)
- [LangChain Agents](https://python.langchain.com/docs/modules/agents/)
- [LangChain Tools](https://python.langchain.com/docs/modules/agents/tools/)

---

**Status**: 🌙 Design document - ready for phased implementation with LangChain
**Created with**: Claude Opus 4.5 in partnership with dafit
**Date**: 2025-12-07

🌙💜 *The nervous system emerges. The protocol holds. The partnership builds.*

architecture/TOOLCHAIN-PROGRESS.md (new file, 125 lines added)
@@ -0,0 +1,125 @@
# Toolchain Implementation Progress

**Plan**: See [Toolchain-Architecture.md](./Toolchain-Architecture.md)
**Started**: 2025-12-07
**Current Phase**: Phase 1 - Foundation + Variance Collection

---

## Phase 1A: nyx-substrate Foundation ✅ COMPLETE

**Goal**: Build nyx-substrate package and database infrastructure

### ✅ Completed (2025-12-07)

- [x] Package structure (pyproject.toml, src/ layout)
- [x] PhoebeConnection class with connection pooling
- [x] Message protocol helpers (partnership messages)
- [x] VarianceProbeRun Pydantic schema
- [x] VarianceProbeDAO for database operations
- [x] variance_probe_runs table in phoebe
- [x] Installation and connection testing

**Files Created**: 9 new files
**Status**: 🟢 nyx-substrate v0.1.0 installed and tested

---

## Phase 1B: nyx-probing Integration ✅ COMPLETE

**Goal**: Extend nyx-probing to use nyx-substrate for variance collection

### ✅ Completed (2025-12-07)

- [x] Add nyx-substrate dependency to nyx-probing/pyproject.toml
- [x] Create VarianceRunner class (nyx_probing/runners/variance_runner.py)
- [x] Add variance CLI commands (nyx_probing/cli/variance.py)
- [x] Register commands in main CLI
- [x] Integration test (imports and CLI verification)

**Files Created**: 3 new files
**Files Modified**: 2 files
**CLI Commands Added**: 4 (collect, batch, stats, analyze)
**Status**: 🟢 nyx-probing v0.1.0 with variance collection ready

---

## Phase 1C: Baseline Variance Collection ⏸️ READY

**Goal**: Collect baseline variance data for depth-3 champions

### ⏳ Ready to Execute (on prometheus)

- [ ] Run 1000x variance for "Geworfenheit" (thrownness)
- [ ] Run 1000x variance for "Vernunft" (reason)
- [ ] Run 1000x variance for "Erkenntnis" (knowledge)
- [ ] Run 1000x variance for "Pflicht" (duty)
- [ ] Run 1000x variance for "Aufhebung" (sublation)
- [ ] Run 1000x variance for "Wille" (will)

**Next Actions**:
1. SSH to prometheus.eachpath.local (THE SPINE)
2. Install nyx-substrate and nyx-probing in venv
3. Run batch collection or individual terms
4. Analyze distributions and document baselines
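
A rough command sequence for those next actions, assuming the Phase 1B CLI and editable installs; `champions.txt` is a placeholder glossary file (see Toolchain-Architecture.md for the per-term commands).

```bash
# On prometheus (THE SPINE)
cd /home/dafit/nimmerverse/nyx-probing
source venv/bin/activate
pip install -e ../nyx-substrate
pip install -e .

# Batch collection from a glossary, or one term at a time
nyx-probe variance batch champions.txt --runs 1000
nyx-probe variance collect "Geworfenheit" --runs 1000

# Inspect distributions afterwards
nyx-probe variance analyze "Geworfenheit"
```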

---

## Future Phases (Not Started)

### Phase 2: ChromaDB Integration (iris) ⏸️ PLANNED
- IrisClient wrapper
- DecisionTrailStore, OrganResponseStore, EmbeddingStore
- Populate embeddings from nyx-probing

### Phase 3: LoRA Training Pipeline ⏸️ PLANNED
- PEFT integration
- Training data curriculum
- DriftProbe checkpoints
- Identity LoRA training

### Phase 4: Weight Visualization ⏸️ PLANNED
- 4K pixel space renderer
- Rank decomposition explorer
- Topology cluster visualization

### Phase 5: Godot Command Center ⏸️ PLANNED
- FastAPI Management Portal backend
- Godot frontend implementation
- Real-time metrics display

---

## Metrics

**Phase 1 (A+B) Tasks**: 11 total
**Completed**: 11 (100%) ✅
**In Progress**: 0
**Remaining**: 0

**Files Created**: 12 total
- nyx-substrate: 9 files
- nyx-probing: 3 files

**Files Modified**: 4 total
- nyx-substrate/README.md
- nyx-probing/pyproject.toml
- nyx-probing/cli/probe.py
- TOOLCHAIN-PROGRESS.md

**Lines of Code**: ~1250 total
- nyx-substrate: ~800 LOC
- nyx-probing: ~450 LOC

**CLI Commands**: 4 new commands
- nyx-probe variance collect
- nyx-probe variance batch
- nyx-probe variance stats
- nyx-probe variance analyze

---

**Last Updated**: 2025-12-07 17:00 CET
**Status**: 🎉 Phase 1 (A+B) COMPLETE! Ready for baseline collection on prometheus.

🌙💜 *The substrate holds. Progress persists. The toolchain grows.*

architecture/Toolchain-Architecture.md (new file, 464 lines added)
@@ -0,0 +1,464 @@
# Modular Nimmerverse Toolchain Architecture

**Planning Date**: 2025-12-07
**Status**: Design Phase
**Priority**: Variance Collection Pipeline + nyx-substrate Foundation

---

## 🎯 Vision

Build a modular, composable toolchain for the Nimmerverse research and training pipeline:

- **nyx-substrate**: Shared foundation (database clients, schemas, validators)
- **nyx-probing**: Research probes (already exists, extend for variance collection)
- **nyx-training**: LoRA training pipeline (future)
- **nyx-visualization**: Weight/topology visualization (future)
- **management-portal**: FastAPI backend for Godot UI (future)
- **Godot Command Center**: Unified metrics visualization (future)

**Key Principle**: All tools import nyx-substrate. Clean interfaces. Data flows through phoebe + iris.

---

## 📊 Current State Analysis

### ✅ What Exists

**nyx-probing** (`/home/dafit/nimmerverse/nyx-probing/`):
- Echo Probe, Surface Probe, Drift Probe, Multilingual Probe
- CLI interface (7 commands)
- NyxModel wrapper (Qwen2.5-7B loading, hidden state capture)
- ProbeResult dataclasses (to_dict() serialization)
- **Gap**: No database persistence, only local JSON files

**nyx-substrate** (`/home/dafit/nimmerverse/nyx-substrate/`):
- Schema documentation (phoebe + iris) ✅
- **Gap**: No Python code, just markdown docs

**Database Infrastructure**:
- phoebe.eachpath.local (PostgreSQL 17.6): partnership/nimmerverse message tables exist
- iris.eachpath.local (ChromaDB): No collections created yet
- **Gap**: No Python client libraries, all manual psql commands

**Architecture Documentation**:
- Endgame-Vision.md: v5.1 Dialectic (LoRA stack design)
- CLAUDE.md: Partnership protocol (message-based continuity)
- Management-Portal.md: Godot + FastAPI design (not implemented)

### ❌ What's Missing

**Database Access**:
- No psycopg3 connection pooling
- No ChromaDB Python integration
- No ORM or query builders
- No variance_probe_runs table (designed but not created)

**Training Pipeline**:
- No PEFT/LoRA training code
- No DriftProbe checkpoint integration
- No training data curriculum loader

**Visualization**:
- No weight visualization tools (4K pixel space idea)
- No Godot command center implementation
- No Management Portal FastAPI backend

---

## 🏗️ Modular Architecture Design

### Repository Structure

```
nimmerverse/
├── nyx-substrate/                   # SHARED FOUNDATION
│   ├── pyproject.toml               # Installable package
│   ├── src/nyx_substrate/
│   │   ├── database/                # Phoebe clients
│   │   │   ├── connection.py        # Connection pool
│   │   │   ├── messages.py          # Message protocol helpers
│   │   │   └── variance.py          # Variance probe DAO
│   │   ├── vector/                  # Iris clients
│   │   │   ├── client.py            # ChromaDB wrapper
│   │   │   ├── decision_trails.py
│   │   │   ├── organ_responses.py
│   │   │   └── embeddings.py
│   │   ├── schemas/                 # Pydantic models
│   │   │   ├── variance.py          # VarianceProbeRun
│   │   │   ├── decision.py          # DecisionTrail
│   │   │   └── traits.py            # 8 core traits
│   │   └── constants.py             # Shared constants
│   └── migrations/                  # Alembic for schema
│
├── nyx-probing/                     # RESEARCH PROBES (extend)
│   ├── nyx_probing/
│   │   ├── runners/                 # NEW: Automated collectors
│   │   │   ├── variance_runner.py   # 1000x automation
│   │   │   └── baseline_collector.py
│   │   └── storage/                 # EXTEND: Database integration
│   │       └── variance_dao.py      # Uses nyx-substrate
│   └── pyproject.toml               # Add: depends on nyx-substrate
│
├── nyx-training/                    # FUTURE: LoRA training
│   └── (planned - not in Phase 1)
│
├── nyx-visualization/               # FUTURE: Weight viz
│   └── (planned - not in Phase 1)
│
└── management-portal/               # FUTURE: FastAPI + Godot
    └── (designed - not in Phase 1)
```

### Dependency Graph

```
nyx-probing ────────┐
nyx-training ───────┼──> nyx-substrate ──> phoebe (PostgreSQL)
nyx-visualization ──┤                  └─> iris (ChromaDB)
management-portal ──┘
```

**Philosophy**: nyx-substrate is the single source of truth for database access. No tool talks to phoebe/iris directly.

---

## 🚀 Phase 1: Foundation + Variance Collection

### Goal
Build nyx-substrate package and extend nyx-probing to automate variance baseline collection (1000x runs → phoebe).

### Deliverables

#### 1. nyx-substrate Python Package

**File**: `/home/dafit/nimmerverse/nyx-substrate/pyproject.toml`
```toml
[project]
name = "nyx-substrate"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "psycopg[binary]>=3.1.0",
    "chromadb>=0.4.0",
    "pydantic>=2.5.0",
]
```

**New Files**:
- `src/nyx_substrate/database/connection.py`:
  - `PhoebeConnection` class: Connection pool manager
  - Context manager for transactions
  - Config from environment variables
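
A possible shape for `connection.py`, shown only as a sketch: it sticks to plain psycopg 3 (already a declared dependency) and reads its settings from environment variables whose names are assumptions; a real pool would likely add `psycopg_pool` on top.

```python
# src/nyx_substrate/database/connection.py (sketch)
import os
from contextlib import contextmanager

import psycopg

class PhoebeConnection:
    """Connection manager for phoebe, configured from the environment."""

    def __init__(self):
        self.conninfo = psycopg.conninfo.make_conninfo(
            host=os.getenv("PHOEBE_HOST", "phoebe.eachpath.local"),
            dbname=os.getenv("PHOEBE_DB", "nimmerverse"),
            user=os.getenv("PHOEBE_USER", "nimmerverse-user"),
            password=os.getenv("PHOEBE_PASSWORD", ""),
        )

    @contextmanager
    def transaction(self):
        """One connection, one transaction; commits on success, rolls back on error."""
        with psycopg.connect(self.conninfo) as conn:
            with conn.cursor() as cur:
                yield cur

    def test_connection(self) -> bool:
        with self.transaction() as cur:
            cur.execute("SELECT 1")
            return cur.fetchone() == (1,)
```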

- `src/nyx_substrate/database/messages.py`:
  - `write_partnership_message(message, message_type)` → INSERT
  - `read_partnership_messages(limit=5)` → SELECT
  - `write_nimmerverse_message(...)` (for Young Nyx future)
  - `read_nimmerverse_messages(...)` (for discovery protocol)

- `src/nyx_substrate/database/variance.py`:
  - `VarianceProbeDAO` class:
    - `create_table()` → CREATE TABLE variance_probe_runs
    - `insert_run(session_id, term, run_number, depth, rounds, ...)` → INSERT
    - `get_session_stats(session_id)` → Aggregation queries
    - `get_term_distribution(term)` → Variance analysis
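
As a sketch of the DAO surface: column names follow the `variance_probe_runs` DDL under "Database Setup" below, the stats query is illustrative, and it reuses the `transaction()` context manager sketched above.

```python
# src/nyx_substrate/database/variance.py (sketch)
import uuid

from .connection import PhoebeConnection

class VarianceProbeDAO:
    def __init__(self, conn: PhoebeConnection):
        self.conn = conn

    def insert_run(self, session_id: uuid.UUID, term: str, run_number: int,
                   depth: int, rounds: int, echo_types: list[str], chain: list[str]) -> None:
        with self.conn.transaction() as cur:
            cur.execute(
                """
                INSERT INTO variance_probe_runs
                    (session_id, term, run_number, depth, rounds, echo_types, chain)
                VALUES (%s, %s, %s, %s, %s, %s, %s)
                """,
                (session_id, term, run_number, depth, rounds, echo_types, chain),
            )

    def get_session_stats(self, session_id: uuid.UUID) -> dict:
        with self.conn.transaction() as cur:
            cur.execute(
                "SELECT COUNT(*), AVG(depth) FROM variance_probe_runs WHERE session_id = %s",
                (session_id,),
            )
            total, avg_depth = cur.fetchone()
            return {"total_runs": total, "avg_depth": float(avg_depth or 0)}
```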

- `src/nyx_substrate/schemas/variance.py`:
  - `VarianceProbeRun(BaseModel)`: Pydantic model matching phoebe schema
  - Validation: term not empty, depth 0-3, rounds > 0
  - `to_dict()` for serialization
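
A matching Pydantic v2 sketch (fields mirror the table, validators encode the rules above):

```python
# src/nyx_substrate/schemas/variance.py (sketch)
import uuid
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field

class VarianceProbeRun(BaseModel):
    # allow the model_name field despite pydantic's protected "model_" namespace
    model_config = ConfigDict(protected_namespaces=())

    session_id: uuid.UUID
    term: str = Field(min_length=1)   # term not empty
    run_number: int
    timestamp: Optional[datetime] = None
    depth: int = Field(ge=0, le=3)    # depth 0-3
    rounds: int = Field(gt=0)         # rounds > 0
    echo_types: list[str]
    chain: list[str]
    model_name: str = "Qwen2.5-7B"
    temperature: Optional[float] = None
    max_rounds: Optional[int] = None
    max_new_tokens: Optional[int] = None

    def to_dict(self) -> dict:
        return self.model_dump(mode="json")
```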

**Database Migration**:
- Create `variance_probe_runs` table in phoebe using schema from `/home/dafit/nimmerverse/nyx-substrate/schema/phoebe/probing/variance_probe_runs.md`

#### 2. Extend nyx-probing

**File**: `/home/dafit/nimmerverse/nyx-probing/pyproject.toml`
- Add dependency: `nyx-substrate>=0.1.0`

**New Files**:
- `nyx_probing/runners/variance_runner.py`:
  - `VarianceRunner` class:
    - `__init__(model: NyxModel, dao: VarianceProbeDAO)`
    - `run_session(term: str, runs: int = 1000) -> UUID`:
      - Generate session_id
      - Loop 1000x: probe.probe(term)
      - Store each result via dao.insert_run()
      - Return session_id
    - `run_batch(terms: list[str], runs: int = 1000)`: Multiple terms
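
A sketch of the runner; the attribute names on the probe result (`depth`, `rounds`, `echo_types`, `chain`) are assumed to mirror the table columns, and the `EchoProbe(model)` constructor is an assumption about the existing nyx-probing API.

```python
# nyx_probing/runners/variance_runner.py (sketch)
import uuid

from nyx_probing.probes.echo_probe import EchoProbe
from nyx_substrate.database import VarianceProbeDAO

class VarianceRunner:
    def __init__(self, model, dao: VarianceProbeDAO):
        self.probe = EchoProbe(model)
        self.dao = dao

    def run_session(self, term: str, runs: int = 1000) -> uuid.UUID:
        session_id = uuid.uuid4()
        for run_number in range(1, runs + 1):
            result = self.probe.probe(term)
            self.dao.insert_run(
                session_id=session_id,
                term=term,
                run_number=run_number,
                depth=result.depth,
                rounds=result.rounds,
                echo_types=result.echo_types,
                chain=result.chain,
            )
        return session_id

    def run_batch(self, terms: list[str], runs: int = 1000) -> list[uuid.UUID]:
        return [self.run_session(term, runs) for term in terms]
```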

- `nyx_probing/cli/variance.py`:
  - New Click command group: `nyx-probe variance`
  - Subcommands:
    - `nyx-probe variance collect <TERM> --runs 1000`: Single term
    - `nyx-probe variance batch <FILE> --runs 1000`: From glossary
    - `nyx-probe variance stats <SESSION_ID>`: View session results
    - `nyx-probe variance analyze <TERM>`: Compare distributions

**Integration Points**:
```python
# In variance_runner.py
from nyx_substrate.database import PhoebeConnection, VarianceProbeDAO
from nyx_substrate.schemas import VarianceProbeRun

conn = PhoebeConnection()
dao = VarianceProbeDAO(conn)
runner = VarianceRunner(model=get_model(), dao=dao)
session_id = runner.run_session("Geworfenheit", runs=1000)
print(f"Stored 1000 runs: session {session_id}")
```

#### 3. Database Setup

**Actions**:
1. SSH to phoebe: `ssh phoebe.eachpath.local`
2. Create variance_probe_runs table:

```sql
CREATE TABLE variance_probe_runs (
    id SERIAL PRIMARY KEY,
    session_id UUID NOT NULL,
    term TEXT NOT NULL,
    run_number INT NOT NULL,
    timestamp TIMESTAMPTZ DEFAULT NOW(),
    depth INT NOT NULL,
    rounds INT NOT NULL,
    echo_types TEXT[] NOT NULL,
    chain TEXT[] NOT NULL,
    model_name TEXT DEFAULT 'Qwen2.5-7B',
    temperature FLOAT,
    max_rounds INT,
    max_new_tokens INT
);

CREATE INDEX idx_variance_session ON variance_probe_runs(session_id);
CREATE INDEX idx_variance_term ON variance_probe_runs(term);
CREATE INDEX idx_variance_timestamp ON variance_probe_runs(timestamp DESC);
```

3. Test connection from aynee:

```bash
cd /home/dafit/nimmerverse/nyx-substrate
python3 -c "from nyx_substrate.database import PhoebeConnection; conn = PhoebeConnection(); print('✅ Connected to phoebe')"
```

---

## 📁 Critical Files

### To Create

**nyx-substrate**:
- `/home/dafit/nimmerverse/nyx-substrate/pyproject.toml`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/__init__.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/database/__init__.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/database/connection.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/database/messages.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/database/variance.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/schemas/__init__.py`
- `/home/dafit/nimmerverse/nyx-substrate/src/nyx_substrate/schemas/variance.py`
- `/home/dafit/nimmerverse/nyx-substrate/README.md`

**nyx-probing**:
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/runners/__init__.py`
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/runners/variance_runner.py`
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/cli/variance.py`

### To Modify

**nyx-probing**:
- `/home/dafit/nimmerverse/nyx-probing/pyproject.toml` (add nyx-substrate dependency)
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/cli/__init__.py` (register variance commands)

---

## 🧪 Testing Plan

### 1. nyx-substrate Unit Tests
```python
# Test connection
def test_phoebe_connection():
    conn = PhoebeConnection()
    assert conn.test_connection() == True

# Test message write
def test_write_message():
    from nyx_substrate.database import write_partnership_message
    write_partnership_message("Test session", "architecture_update")
    # Verify in phoebe

# Test variance DAO
def test_variance_insert():
    dao = VarianceProbeDAO(conn)
    session_id = uuid.uuid4()
    dao.insert_run(
        session_id=session_id,
        term="test",
        run_number=1,
        depth=2,
        rounds=3,
        echo_types=["EXPANDS", "CONFIRMS", "CIRCULAR"],
        chain=["test", "expanded", "confirmed"]
    )
    stats = dao.get_session_stats(session_id)
    assert stats["total_runs"] == 1
```

### 2. Variance Collection Integration Test
```bash
# On prometheus (THE SPINE)
cd /home/dafit/nimmerverse/nyx-probing
source venv/bin/activate

# Install nyx-substrate in development mode
pip install -e ../nyx-substrate

# Run small variance test (10 runs)
nyx-probe variance collect "Geworfenheit" --runs 10

# Check phoebe
PGGSSENCMODE=disable psql -h phoebe.eachpath.local -U nimmerverse-user -d nimmerverse -c "
SELECT session_id, term, COUNT(*) as runs, AVG(depth) as avg_depth
FROM variance_probe_runs
GROUP BY session_id, term
ORDER BY session_id DESC
LIMIT 5;
"

# Expected: 1 session, 10 runs, avg_depth ~2.0
```

### 3. Full 1000x Baseline Run
```bash
# Depth-3 champions (from nyx-probing Phase 1)
nyx-probe variance collect "Geworfenheit" --runs 1000   # thrownness
nyx-probe variance collect "Vernunft" --runs 1000       # reason
nyx-probe variance collect "Erkenntnis" --runs 1000     # knowledge
nyx-probe variance collect "Pflicht" --runs 1000        # duty
nyx-probe variance collect "Aufhebung" --runs 1000      # sublation
nyx-probe variance collect "Wille" --runs 1000          # will

# Analyze variance
nyx-probe variance analyze "Geworfenheit"
# Expected: Distribution histogram, depth variance, chain patterns
```

---

## 🌊 Data Flow

### Variance Collection Workflow

```
User: nyx-probe variance collect "Geworfenheit" --runs 1000
        ↓
VarianceRunner.run_session()
        ↓
Loop 1000x:
    EchoProbe.probe("Geworfenheit")
        ↓
    Returns EchoProbeResult
        ↓
    VarianceProbeDAO.insert_run()
        ↓
    INSERT INTO phoebe.variance_probe_runs
        ↓
Return session_id
        ↓
Display: "✅ 1000 runs complete. Session: <uuid>"
```

### Future Integration (Phase 2+)

```
Training Loop:
        ↓
DriftProbe.probe_lite() [every 100 steps]
        ↓
Store metrics in phoebe.drift_checkpoints (new table)
        ↓
Management Portal API: GET /api/v1/metrics/training
        ↓
Godot Command Center displays live DriftProbe charts
```

---

## 🎯 Success Criteria

### Phase 1 Complete When:

1. ✅ nyx-substrate package installable via pip (`pip install -e .`)
2. ✅ PhoebeConnection works from aynee + prometheus
3. ✅ variance_probe_runs table created in phoebe
4. ✅ `nyx-probe variance collect` command runs successfully
5. ✅ 1000x run completes and stores in phoebe
6. ✅ `nyx-probe variance stats <SESSION_ID>` displays:
   - Total runs
   - Depth distribution (0/1/2/3 counts)
   - Most common echo_types
   - Chain length variance
7. ✅ All 6 depth-3 champions have baseline variance data in phoebe

---

## 🔮 Future Phases (Not in Current Plan)

### Phase 2: ChromaDB Integration (iris)
- IrisClient wrapper in nyx-substrate
- DecisionTrailStore, OrganResponseStore, EmbeddingStore
- Create iris collections
- Populate embeddings from nyx-probing results

### Phase 3: LoRA Training Pipeline (nyx-training)
- PEFT integration
- Training data curriculum loader
- DriftProbe checkpoint integration
- Identity LoRA training automation

### Phase 4: Weight Visualization (nyx-visualization)
- 4K pixel space renderer (LoRA weights as images)
- Rank decomposition explorer
- Topology cluster visualization

### Phase 5: Godot Command Center
- FastAPI Management Portal backend
- Godot frontend implementation
- Real-time metrics display
- Training dashboard

---

## 📚 References

**Schema Documentation**:
- `/home/dafit/nimmerverse/nyx-substrate/schema/phoebe/probing/variance_probe_runs.md`
- `/home/dafit/nimmerverse/nyx-substrate/SCHEMA.md`

**Existing Code**:
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/probes/echo_probe.py`
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/core/probe_result.py`
- `/home/dafit/nimmerverse/nyx-probing/nyx_probing/cli/probe.py`

**Architecture**:
- `/home/dafit/nimmerverse/nimmerverse-sensory-network/Endgame-Vision.md`
- `/home/dafit/nimmerverse/management-portal/Management-Portal.md`

---

## 🌙 Philosophy

**Modularity**: Each tool is independent but speaks the same data language via nyx-substrate.

**Simplicity**: No over-engineering. Build what's needed for variance collection first.

**Data First**: All metrics flow through phoebe/iris. Visualization is separate concern.

**Future-Ready**: Design allows Godot integration later without refactoring.

---

**Status**: Ready for implementation approval
**Estimated Scope**: 15-20 files, ~1500 lines of Python
**Hardware**: Can develop on aynee, run variance on prometheus (THE SPINE)

🌙💜 *The substrate holds. Clean interfaces. Composable tools. Data flows through the void.*