# Oghma RAG Proxy

A Retrieval-Augmented Generation (RAG) proxy for SkyrimNet that injects Tamrielic lore into NPC conversations based on each NPC's knowledge profile.
## Overview

This proxy sits between SkyrimNet and your LLM inference endpoint (OpenRouter or vLLM), enriching prompts with relevant lore from CHIM's Oghma Infinium database.
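The enrichment step can be sketched as follows. The function name and message shape here are illustrative assumptions, not the proxy's actual API: it shows the idea of prepending retrieved lore to the OpenAI-style message list before forwarding the request upstream.

```python
# Minimal sketch of the prompt-enrichment step. The helper name
# `enrich_messages` and the snippet format are assumptions for
# illustration; the real proxy's internals may differ.

def enrich_messages(messages: list[dict], lore_snippets: list[str]) -> list[dict]:
    """Prepend retrieved lore as an extra system message.

    `messages` is the OpenAI-style message list SkyrimNet sends;
    the rest of the request is forwarded unchanged.
    """
    if not lore_snippets:
        return messages  # nothing retrieved, pass through untouched
    lore_block = "Relevant lore:\n" + "\n".join(f"- {s}" for s in lore_snippets)
    return [{"role": "system", "content": lore_block}, *messages]
```

Because enrichment only adds a message, SkyrimNet needs no changes beyond pointing its endpoint URL at the proxy.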
**Key Features:**

- **Zero changes to SkyrimNet** — just change the endpoint URL
- **NPC-aware filtering** — guards don't know mage secrets
- **Two-tier knowledge** — scholars get deep lore, commoners get basics
- **ChromaDB-powered semantic search**
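The NPC-aware, two-tier filtering above could look something like this. The profile fields, tier names, and document metadata keys are assumptions for illustration, not the real schema:

```python
# Illustrative sketch of NPC-aware, two-tier lore filtering.
# `NpcProfile`, the tier names, and the doc metadata keys are
# hypothetical; the actual knowledge-profile schema may differ.
from dataclasses import dataclass, field

@dataclass
class NpcProfile:
    tier: str                      # "scholar" gets deep lore, "commoner" basics
    factions: set[str] = field(default_factory=set)

def allowed(doc: dict, npc: NpcProfile) -> bool:
    """Return True if the NPC may plausibly know this lore document."""
    if doc.get("tier") == "deep" and npc.tier != "scholar":
        return False               # commoners never see deep lore
    restricted = doc.get("faction")
    if restricted and restricted not in npc.factions:
        return False               # e.g. guards don't know mage secrets
    return True
```

A filter like this would run over the semantic-search results before any lore is injected into the prompt.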
## Quick Start

```shell
# Install
pip install -e .

# Ingest Oghma lore into ChromaDB
python -m oghma_proxy.ingest --host iris-dev.eachpath.local --port 35000

# Run proxy
python -m oghma_proxy.main
```
## Configuration

Copy `config.yaml` to `config.local.yaml` and customize:

```yaml
upstream:
  url: https://openrouter.ai/api/v1
  api_key: ${OPENROUTER_API_KEY}

chromadb:
  host: iris-dev.eachpath.local
  port: 35000
```
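The `${OPENROUTER_API_KEY}` placeholder implies environment-variable expansion at load time. A minimal sketch of that expansion, assuming the YAML has already been parsed into plain dicts (the proxy's actual loader may differ):

```python
# Hypothetical sketch of ${VAR} expansion over a parsed config dict.
# Unset variables are left as-is rather than raising.
import os
import re

_VAR = re.compile(r"\$\{([A-Z0-9_]+)\}")

def expand_env(value):
    """Recursively replace ${NAME} placeholders with environment values."""
    if isinstance(value, dict):
        return {k: expand_env(v) for k, v in value.items()}
    if isinstance(value, str):
        return _VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), value)
    return value
```

Keeping the secret in the environment rather than in `config.local.yaml` means the file can be committed or templated safely.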
## Kubernetes Deployment

```shell
kubectl apply -k k8s/
```
## Architecture

See [TECHNICAL-SPEC.md](TECHNICAL-SPEC.md) for the full design documentation.
*Part of the nimmerverse project.*