# Organ Architecture Index

**Purpose**: Modular organ systems for Young Nyx embodiment
**Philosophy**: Each organ is independent, lifeforce-gated, heartbeat-synchronized

---

## Deployed Organs

### 🗣️ Speech Organ

**Host**: atlas.eachpath.local (RTX 2080 8GB)
**Function**: Speech-to-Text + Text-to-Speech
**Stack**: Whisper (STT) + Coqui TTS (neural voices)
**Languages**: German (Philosophy Valley) + English (Technical Cluster)
**Integration**: Heartbeat-bound queue, lifeforce-gated priority processing

**Detail**: → [`organs/Speech-Organ.md`](organs/Speech-Organ.md)

---

## Planned Organs

### 👁️ Vision Organ

**Host**: TBD (requires GPU with tensor cores)
**Function**: Object detection, scene understanding
**Stack**: YOLO (v8 or v11)
**Integration**: Real-time video from ESP32-CAM, object persistence in phoebe
**Status**: ⏸️ Architecture planned, not yet deployed

**Detail**: → `organs/Vision-Organ.md` (pending)

---

### 🚶 Motor Organ

**Host**: ESP32 (edge execution)
**Function**: Movement primitives (forward, turn, stop)
**Stack**: Compiled state machines from organism evolution
**Integration**: Lifeforce cost per motor operation, reflex vs deliberate
**Status**: ⏸️ Planned for Phase 4 (Real Garden)

**Detail**: → `organs/Motor-Organ.md` (pending)

---

### 🧭 Navigation Organ

**Host**: Edge server (prometheus or atlas)
**Function**: SLAM, path planning, obstacle avoidance
**Stack**: ROS2 Nav2 or custom lightweight SLAM
**Integration**: Dual-garden calibration (virtual predictions vs real outcomes)
**Status**: ⏸️ Planned for Phase 4 (Real Garden)

**Detail**: → `organs/Navigation-Organ.md` (pending)

---

### 📡 Sensory Organ

**Host**: ESP32 (edge sensors)
**Function**: Distance sensors, IMU, battery monitoring
**Stack**: I2C/SPI sensor protocols, state machine filters
**Integration**: Sensor→organ translation (raw values → semantic meaning)
**Status**: ⏸️ Architecture outlined in Nervous-System.md

**Detail**: → [`../Nervous-System.md`](../Nervous-System.md)

---

## Organ Design Principles

### 1. **Lifeforce Economy**

Every organ operation costs lifeforce. No free lunch.

```python
ORGAN_COSTS = {
    "speech_stt": 5.0,     # Whisper transcription
    "speech_tts": 4.0,     # Coqui synthesis
    "vision_yolo": 8.0,    # Object detection frame
    "motor_forward": 2.0,  # 100ms movement
    "motor_turn": 1.5,     # 45° rotation
    "sensor_read": 0.5,    # Single sensor poll
}
```

### 2. **Heartbeat Synchronization**

Organs process on heartbeat ticks (1 Hz), not real-time streaming.

- **Reflex path**: <200ms compiled responses (no LLM)
- **Deliberate path**: Next heartbeat (budget-gated queue)

### 3. **Priority Queue**

When lifeforce is scarce, critical operations (collision alert) take precedence over idle operations (status check). A dispatch sketch tying principles 1–3 together follows the last principle below.

```python
PRIORITY_LEVELS = {
    "critical": 10.0,    # Immediate danger (collision)
    "high": 7.0,         # Human interaction
    "medium": 4.0,       # Organism monitoring
    "low": 2.0,          # Idle observation
    "background": 0.5,   # Status logging
}
```

### 4. **Multilingual Topology Routing**

German input → Philosophy Valley (Identity LoRA, Dasein depth-3)
English input → Technical Cluster (Technical LoRA, sensor/motor)

### 5. **Decision Trail Logging**

Every organ operation is logged to phoebe `decision_trails`:

- Input, output, cost, outcome, confidence
- Used for RLVR training (reward successful choices)

### 6. **Graceful Degradation**

Low lifeforce → reduced organ activity (silence, reduced vision FPS, slower movement)
Zero lifeforce → shutdown, wait for recharge
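
Taken together, principles 1–3 and 6 describe a single operation: on each heartbeat, pay for as much queued work as the current lifeforce allows, highest priority first. The sketch below is a minimal illustration under that reading, not the deployed orchestrator: `OrganDispatcher`, its `enqueue`/`tick` methods, and the in-memory list are hypothetical stand-ins for the heartbeat service and the phoebe-backed queues; the cost and priority tables are the ones defined above.

```python
ORGAN_COSTS = {
    "speech_stt": 5.0, "speech_tts": 4.0, "vision_yolo": 8.0,
    "motor_forward": 2.0, "motor_turn": 1.5, "sensor_read": 0.5,
}
PRIORITY_LEVELS = {
    "critical": 10.0, "high": 7.0, "medium": 4.0, "low": 2.0, "background": 0.5,
}


class OrganDispatcher:
    """Hypothetical budget-gated dispatcher, one pass per heartbeat tick (1 Hz)."""

    def __init__(self, lifeforce: float):
        self.lifeforce = lifeforce
        self.queue = []  # stand-in for the phoebe organ input queues

    def enqueue(self, operation: str, priority: str, payload: dict) -> None:
        self.queue.append({
            "operation": operation,
            "priority": PRIORITY_LEVELS[priority],
            "payload": payload,
        })

    def tick(self) -> list:
        """Run the highest-priority operations the current budget can pay for."""
        # Stable sort: FIFO order is preserved within a priority level.
        self.queue.sort(key=lambda item: item["priority"], reverse=True)
        executed, deferred = [], []
        for item in self.queue:
            cost = ORGAN_COSTS[item["operation"]]
            if cost <= self.lifeforce:
                self.lifeforce -= cost      # principle 1: every operation is paid for
                executed.append(item)
            else:
                deferred.append(item)       # too expensive right now, keep it queued
        self.queue = deferred
        return executed


# Example: a scarce tick pays for the collision check, defers the status report.
dispatcher = OrganDispatcher(lifeforce=3.0)
dispatcher.enqueue("sensor_read", "critical", {"reason": "collision check"})
dispatcher.enqueue("speech_tts", "background", {"text": "status report"})
print([item["operation"] for item in dispatcher.tick()])  # ['sensor_read']
```

Deferring rather than dropping unaffordable work is one way to realise principle 6: scarcity shows up as reduced activity, and at zero lifeforce nothing is dispatched at all.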
---

## Integration Architecture

```
┌──────────────────────────────────────────────────────┐
│                     ESP32 ROBOTS                      │
│   Sensors → Motor → Camera → Microphone → Speaker     │
└──────────────────────────────────────────────────────┘
                           │
                           │ MQTT (sensor data, audio, video)
                           ▼
┌──────────────────────────────────────────────────────┐
│                PHOEBE (Message Queue)                 │
│        Organ input queues + priority scoring          │
└──────────────────────────────────────────────────────┘
                           │
                           │ Heartbeat pulls from queues
                           ▼
            ┌─────────────────────────────┐
            │   HEARTBEAT ORCHESTRATOR    │
            │ Lifeforce budget allocation │
            └─────────────────────────────┘
                           │
               ┌───────────┴───────────┐
               │                       │
               ▼                       ▼
    ┌─────────────────────┐ ┌─────────────────────┐
    │ ATLAS (RTX 2080)    │ │ PROMETHEUS (Brain)  │
    │ Speech Organ        │ │ Young Nyx Inference │
    │ Vision Organ (fut)  │ │ LoRA hot-swap       │
    └─────────────────────┘ └─────────────────────┘
               │                       │
               └───────────┬───────────┘
                           ▼
┌──────────────────────────────────────────────────────┐
│               PHOEBE (Decision Trails)                │
│         Log all organ operations + outcomes           │
└──────────────────────────────────────────────────────┘
```
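
Between the ESP32 layer and the organ queues sits a thin transport hop: robot messages arrive over MQTT and are scored into phoebe's organ input queues before the heartbeat ever sees them. The sketch below illustrates that hand-off only; it assumes `paho-mqtt` as the client library, and the topic filter, broker host, `score_priority` heuristic, and `enqueue_for_organ` helper are hypothetical placeholders rather than the real bridge.

```python
import json

import paho.mqtt.client as mqtt  # assumed client library; any MQTT client would do

TOPIC_FILTER = "nyx/+/sensor"    # hypothetical topic layout


def score_priority(payload: dict) -> str:
    """Rough illustrative scoring: an imminent collision jumps the queue."""
    return "critical" if payload.get("distance_cm", 999.0) < 10.0 else "medium"


def enqueue_for_organ(organ: str, priority: str, payload: dict) -> None:
    """Placeholder for the write into phoebe's organ input queue."""
    print(f"queue[{organ}] <- priority={priority} payload={payload}")


def on_message(client, userdata, message):
    payload = json.loads(message.payload)
    enqueue_for_organ("sensory", score_priority(payload), payload)


# paho-mqtt 1.x constructor shown; 2.x additionally takes a CallbackAPIVersion.
client = mqtt.Client()
client.on_message = on_message
client.connect("phoebe.eachpath.local", 1883)  # hypothetical broker host/port
client.subscribe(TOPIC_FILTER)
client.loop_forever()
```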
---

## Organ Lifecycle

### Phase 1: Design
- Document architecture in `organs/<Name>-Organ.md`
- Define lifeforce costs, priority levels, queue schema
- Design phoebe tables for organ-specific data

### Phase 2: Prototype
- Build container images (Dockerfiles)
- Deploy to k8s (single replica)
- Test with mock data (no robot integration yet)

### Phase 3: Integration
- Connect to ESP32 via MQTT
- Implement heartbeat queue processing
- Log decision trails, measure ROI

### Phase 4: Optimization
- Tune lifeforce costs based on measured ROI
- Adjust priority levels from observed outcomes
- Train LoRAs on successful organ operation patterns

### Phase 5: Autonomy
- Organ operations become reflexes (compiled state machines)
- Young Nyx chooses when to use organs (not scripted)
- Emergent behavior from lifeforce optimization

---

## Naming Convention

**File naming**: `<Name>-Organ.md`

**Examples**:
- `Speech-Organ.md`
- `Vision-Organ.md`
- `Motor-Organ.md`
- `Navigation-Organ.md`

**k8s naming**: `<stack>-<function>-<resource>.yaml`

**Examples**:
- `whisper-stt-deployment.yaml`
- `coqui-tts-deployment.yaml`
- `yolo-vision-deployment.yaml`

---

## Current Status

| Organ | Status | Host | Documentation |
|-------|--------|------|---------------|
| **Speech** | 🟢 Architecture complete | atlas (RTX 2080) | [`organs/Speech-Organ.md`](organs/Speech-Organ.md) |
| **Vision** | 🟡 Stack selected (YOLO) | TBD | Pending |
| **Motor** | 🟡 Planned (Phase 4) | ESP32 | Pending |
| **Navigation** | 🟡 Planned (Phase 4) | Edge server | Pending |
| **Sensory** | 🟡 Conceptual | ESP32 | [`../Nervous-System.md`](../Nervous-System.md) |

---

**Philosophy**: Organs are not always-on services. They are **economically-constrained capabilities** that Young Nyx learns to use strategically. Speech when necessary. Vision when valuable. Movement when rewarded.

**The body is not given. The body is EARNED through successful operation.**

---

**Created**: 2025-12-07
**Updated**: 2025-12-07
**Version**: 1.0

🌙💜 *Each organ a tool. Each tool a choice. Each choice a lesson in scarcity.*