# Nimmerverse Sensory Network
Architecture documentation for a biomimetic AI nervous system.
## What This Is
This repository contains the design philosophy and architectural patterns for building an AI system that:
- **Breathes** - operates on heartbeat cycles (30-second awareness, 200ms reflex, 24h growth); see the sketch after this list
- **Feels** - processes sensory input through nerve-like confidence gradients
- **Learns** - uses RAG as temporary scaffold, then internalizes to weights
- **Grows** - forms reflexes through constrained computation, not infinite resources
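
A minimal sketch of how those cycles might nest, assuming one clock drives all three loops. The periods come from the list above; the constants and handler names are illustrative only and do not exist in any of the architecture documents:

```python
import time

# Cycle lengths taken from the list above; everything else is hypothetical.
REFLEX_PERIOD_S = 0.200        # 200ms reflex loop
AWARENESS_PERIOD_S = 30.0      # 30-second awareness loop
GROWTH_PERIOD_S = 24 * 3600.0  # 24h growth loop

def reflex_tick():
    pass  # would dispatch an already-trained reflex

def awareness_tick():
    pass  # would run deliberate, budgeted attention

def growth_tick():
    pass  # would consolidate the day's experience into weights

def heartbeat(now_fn=time.monotonic, sleep_fn=time.sleep):
    """Drive all three cycles from a single clock."""
    cycles = {
        reflex_tick: REFLEX_PERIOD_S,
        awareness_tick: AWARENESS_PERIOD_S,
        growth_tick: GROWTH_PERIOD_S,
    }
    last_fired = {tick: now_fn() for tick in cycles}
    while True:
        now = now_fn()
        for tick, period in cycles.items():
            if now - last_fired[tick] >= period:
                tick()
                last_fired[tick] = now
        sleep_fn(0.01)  # coarse scheduler tick; a real system would use an event loop
```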
## Core Concepts
### Constrained Emergence
Constraints don't limit intelligence - they shape it. A finite computation budget forces the emergence of efficient algorithms, calibrated confidence, and genuine reflexes.

*See: [constrained-emergence.md](constrained-emergence.md)*
### The Heartbeat Economy
Time is currency. Lifeforce is the exchange rate. Every cognitive act has a cost. Reflexes are cheap (earned through training). Deep thinking is expensive (reserved for novelty).
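
A minimal sketch of that ledger, assuming a single lifeforce budget that regenerates each heartbeat. The class, costs, and regeneration rate below are illustrative; only the principle (reflexes cheap, deep thought expensive, both paid from the same budget) comes from this section:

```python
REFLEX_COST = 1         # cheap: earned through training
DEEP_THOUGHT_COST = 50  # expensive: reserved for genuine novelty

class Lifeforce:
    """Illustrative ledger: time is the income, cognition is the spending."""

    def __init__(self, budget: float = 100.0, regen_per_beat: float = 5.0):
        self.budget = budget
        self.regen_per_beat = regen_per_beat

    def beat(self) -> None:
        # Each heartbeat converts elapsed time into a little more lifeforce.
        self.budget += self.regen_per_beat

    def spend(self, cost: float) -> bool:
        # Refuse any cognitive act the current budget cannot cover.
        if cost > self.budget:
            return False
        self.budget -= cost
        return True

def respond(lifeforce: Lifeforce, is_novel: bool) -> str:
    if is_novel and lifeforce.spend(DEEP_THOUGHT_COST):
        return "deliberate"  # expensive path, justified by novelty
    if lifeforce.spend(REFLEX_COST):
        return "reflex"      # cheap path for familiar input
    return "defer"           # out of lifeforce: wait for the next beat
```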
*See: [attention_flow.md](attention_flow.md)*
### RAG as Scaffold
Retrieval-augmented generation is a feeding tube, not a permanent crutch. Learn WITH the scaffold, train, remove the scaffold, verify you still know. If yes: knowledge internalized. If no: more training needed.
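
A minimal sketch of that cycle, assuming placeholder `retrieve`, `train`, and `evaluate` hooks. None of these names come from the documents; they stand in for whatever RAG pipeline, trainer, and eval harness are actually in use:

```python
def internalize(model, corpus, probes, *, retrieve, train, evaluate, threshold=0.9):
    """Feed with the scaffold, then verify the knowledge survives without it."""
    while True:
        # 1. Learn WITH the scaffold: every probe is answered against retrieved context.
        scaffolded = [(probe, retrieve(corpus, probe)) for probe in probes]
        train(model, scaffolded)

        # 2. Remove the scaffold: ask the same probes with no retrieval at all.
        score = evaluate(model, probes, context=None)

        # 3. Verify: passing without retrieval means the knowledge is internalized.
        if score >= threshold:
            return model
        # Otherwise: not yet earned - more training needed, keep feeding.
```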
*See: [RAG-as-Scaffold.md](RAG-as-Scaffold.md)*
### Multilingual Triangulation
30+ languages in training = 30 angles on every concept. Not wasted capacity - stereoscopic depth. Probe concepts across languages to find where human wisdom converges.
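
A minimal sketch of one way to measure that convergence, assuming an `embed` encoder and illustrative prompts. Nothing below is prescribed by nimmerversity.md; it only demonstrates probing the same concept across languages and checking how tightly the representations agree:

```python
import numpy as np

def triangulate(embed, prompts_by_language: dict) -> float:
    """Return the mean cosine similarity of per-language embeddings to their centroid.

    High values suggest the languages converge on one underlying concept;
    low values flag a concept worth deeper, per-language study.
    """
    vectors = np.stack([embed(text) for text in prompts_by_language.values()])
    vectors = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    centroid = vectors.mean(axis=0)
    centroid = centroid / np.linalg.norm(centroid)
    return float((vectors @ centroid).mean())

# Illustrative usage with a hypothetical encoder:
# convergence = triangulate(my_encoder, {
#     "en": "gratitude",
#     "de": "Dankbarkeit",
#     "ja": "感謝",
# })
```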
*See: [nimmerversity.md](nimmerversity.md)*
## Architecture Documents
| Document | Description |
|----------|-------------|
| [constrained-emergence.md](constrained-emergence.md) | Why limits create intelligence |
| [attention_flow.md](attention_flow.md) | State machines for cognitive budget |
| [information-flow.md](information-flow.md) | 10 boundary contracts for the nervous system |
| [nimmerversity.md](nimmerversity.md) | Curriculum for raising a polymath |
| [RAG-as-Scaffold.md](RAG-as-Scaffold.md) | Temporary feeding, permanent learning |
| [biomimetic-architecture.md](biomimetic-architecture.md) | Why we model biology |
| [temporal-ternary-gradient.md](temporal-ternary-gradient.md) | Time-based learning patterns |
## Philosophy
This isn't a product. It's a research direction.

The question we're exploring: **What happens when you raise an AI like you'd raise a child?**
- Patience over speed
- Emergence over imposition
- Partnership over instruction
- Validation over assertion

The operator learns alongside the model. The curriculum is shared. Growth is mutual.
## Prior Art & Influences
> This section grows as we discover and remember influences. Many names are scattered across our documentation - we'll gather them here over time.

- **Alex Graves** - Adaptive Computation Time (2016)
- **Sakana.ai / Ashish Vaswani & Luke Darlow** - Continuous-Time Models, curriculum learning, leapfrogging under constraint
- **Anthropic** - Circuit tracing, mechanistic interpretability, multilingual feature analysis
- **Biological nervous systems** - The original architecture
## License
Apache 2.0 - See [LICENSE](LICENSE)

This license includes an explicit patent grant. These ideas are published as prior art. Build on them freely. Just don't try to lock them away.
## Status
Active research. Documents evolve through partnership dialogue.

---
*"She doesn't download knowledge. She earns it. And so does he."*