Why Local-First Matters
Your code is your most sensitive asset. Here is why CHAOS-AI keeps everything local by default — and why it makes the system faster, not just safer.
The Problem with Cloud-First AI
Most AI coding tools transmit your entire codebase to remote servers. Every file, every function, every secret that slips into a config — all sent over the wire and processed on infrastructure you do not control.
For regulated industries, proprietary codebases, and security-conscious teams, this is a non-starter. But even for teams outside those categories, cloud-first creates a subtler problem: you are trusting a third party with your most valuable asset.
What Local-First Actually Means
CHAOS-AI runs entirely on your machine. The orchestration engine, context hydration layer, agent coordinator, session manager, and all three context databases — local. The only network calls are to the LLM API providers you explicitly configure, and even those are optional if you use Ollama.
Here is what stays local:
- Source code — never transmitted to CHAOS services
- Project context — stored in local SQLite and PostgreSQL databases
- Agent state — persisted on your filesystem in .claude/agent-state/
- Session history — kept in local storage, exported on request
- API keys — stored in .env files, never sent to our services
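On the last point, keys only ever need to reach the process environment on your machine. A minimal sketch of loading a local .env file with the standard library — the parser and the key name shown are illustrative assumptions, not CHAOS internals:

```python
from pathlib import Path

def load_env(path: str = ".env") -> dict[str, str]:
    """Parse KEY=VALUE lines from a local .env file; nothing leaves the machine."""
    env: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env
```

The keys are then passed only to the provider calls you configure; they never touch any intermediary service.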
The Architecture
Your Machine
+------------------------------------------+
|  CHAOS Orchestration Engine              |
|  +------------------------------------+  |
|  | PM Dispatcher + EventBus           |  |
|  | Context Layer (SQLite + Postgres)  |  |
|  | Agent Coordinator (36 agents)      |  |
|  | MCP Server (134 tools)             |  |
|  +------------------------------------+  |
|                    |                     |
|    LLM API calls only (your config)      |
+--------------------+---------------------+
                     |
        Cloud (API only — your choice)
Only LLM inference happens remotely. You choose the provider. You can route different agents to different providers, or route everything through Ollama and stay fully offline.
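The routing idea can be sketched as a simple table mapping agent roles to provider endpoints. The provider names, URLs, and agent roles below are illustrative assumptions, not CHAOS's actual configuration schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Provider:
    name: str
    base_url: str        # where inference requests are sent
    offline: bool = False

# Illustrative registry: Ollama serves from localhost, so routing an
# agent to it keeps that agent's inference entirely off the network.
PROVIDERS = {
    "claude": Provider("claude", "https://api.anthropic.com"),
    "ollama": Provider("ollama", "http://localhost:11434", offline=True),
}

# Hypothetical per-agent routing table.
AGENT_ROUTES = {
    "architect": "claude",   # reasoning-heavy, long-context work
    "refactorer": "ollama",  # sensitive code stays on-box
}

def resolve(agent: str, default: str = "ollama") -> Provider:
    """Pick the provider for an agent, falling back to the local default."""
    return PROVIDERS[AGENT_ROUTES.get(agent, default)]
```

Defaulting unrecognized agents to the local provider means a misconfigured route fails private, not open.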
Provider Flexibility
Because CHAOS is provider-agnostic, you optimize for your constraints:
- Use Claude Code for reasoning-heavy architecture and long-context tasks
- Use GitHub Copilot CLI if you have an existing subscription
- Use Gemini CLI for multimodal and 1M-token context windows
- Use Ollama for sensitive code that must stay fully air-gapped
- Use OpenRouter to route across dozens of models with one config
Local-First is a Performance Feature
Privacy is obvious. What is less obvious: local-first makes CHAOS faster.
No round-trips to a cloud context service. No latency on shared memory lookups. No rate limits on your own context store. The three-database context layer (SQLite FTS5, Praxis Store, PostgreSQL knowledge graph) runs on your hardware at local disk speeds.
Context injection before each agent session takes milliseconds. Token savings are realized before the first API call is made.
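The millisecond claim is easy to sanity-check on your own hardware with Python's bundled sqlite3 and an FTS5 table. The schema here is an assumption for illustration, not the actual Praxis Store layout:

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")  # stands in for the on-disk context DB
con.execute("CREATE VIRTUAL TABLE context USING fts5(path, snippet)")
con.executemany(
    "INSERT INTO context VALUES (?, ?)",
    [(f"src/mod_{i}.py", f"def handler_{i}(): pass") for i in range(10_000)],
)

start = time.perf_counter()
rows = con.execute(
    "SELECT path FROM context WHERE context MATCH ?", ("handler_42",)
).fetchall()
elapsed_ms = (time.perf_counter() - start) * 1000
# Full-text lookup across 10,000 rows, no network round-trip involved.
```

A query like this returns in well under a millisecond on typical hardware; the point is that every context lookup happens at local disk and memory speeds.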
The Bottom Line
Local-first is not a checkbox. It is a design constraint that shapes the entire system architecture. Every feature in CHAOS was built assuming your data belongs to you — not because it is required by regulation, but because it is the right default.
Your code deserves better than being uploaded to someone else's server.