# Orchex
> Describe what you want. Orchex plans, parallelizes, and executes — safely.
Orchex is a multi-LLM orchestration tool. It takes a natural language description of what you want to build, generates an execution plan with parallel streams, and coordinates multiple LLM providers to execute it — with ownership enforcement, self-healing, and learning.
## Key Features
- 6 LLM providers: Claude, OpenAI, Gemini, DeepSeek, Ollama, AWS Bedrock
- Automatic plan generation from natural language
- Parallel stream execution with dependency management
- File ownership enforcement (no merge conflicts)
- Self-healing: automatic error detection and fix stream generation
- Dynamic model registry: auto-discovers available models from providers daily
- Learning system: improves across executions
## MCP Tools
12 tools available via MCP: init, add_stream, status, execute, complete, recover, learn, init-plan, auto, reset-learning, rollback-stream, reload
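
MCP clients call these tools over JSON-RPC once the server is registered (see Quick Setup). A minimal sketch of a `tools/call` request for the `status` tool — the standard MCP request shape, with an empty `arguments` object assumed since Orchex's per-tool schemas are not shown here:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "status",
    "arguments": {}
  }
}
```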
## Quick Setup
Auto-configure Orchex for your IDE:
```bash
npx @wundam/orchex setup
```
Or manually add to your MCP config:
```json
{
  "mcpServers": {
    "orchex": {
      "command": "npx",
      "args": ["-y", "@wundam/orchex"]
    }
  }
}
```
## Links
- Documentation: https://orchex.dev/docs
- Pricing: https://orchex.dev/pricing
- npm: https://www.npmjs.com/package/@wundam/orchex
- MCP Registry: io.github.wundam/orchex