# DeployStack
> DeployStack is an open-source (AGPL-3.0) MCP hosting platform for AI workflows. Deploy any MCP server from GitHub to an HTTP endpoint in 30 seconds. Built for workflow automation platforms like n8n, Dify, Voiceflow, Langflow, Zapier, Make.com, and Activepieces — and for AI development tools like Claude Desktop, Cursor, and VS Code.
## What is DeployStack?
DeployStack turns stdio MCP servers into hosted HTTP endpoints. Most MCP servers on GitHub only run locally via stdio. Workflow automation platforms and cloud-based AI tools need HTTP URLs. DeployStack bridges that gap.
**Two ways to use it:**
1. **Deploy from GitHub** — Point DeployStack at a GitHub repo containing an MCP server. It detects the runtime (Node.js, Python, Docker), builds it, and gives you an HTTP endpoint URL. Auto-redeploys when you push.
2. **Install from Catalog** — Browse a curated catalog of popular MCP servers and install them with one click. No local setup needed.
Every MCP server you deploy or install gets a direct HTTP endpoint with a token. Paste the URL into n8n, Zapier, Make.com, or any MCP client.
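An endpoint like this speaks standard MCP over HTTP, so any JSON-RPC 2.0 client can talk to it. As a minimal sketch, here is the kind of request a client builds to list a hosted server's tools — the URL and token below are placeholders, not real DeployStack values:

```python
import json

# Placeholders -- substitute the endpoint URL and instance token
# that DeployStack gives you for your deployed server.
ENDPOINT = "https://example.deploystack.io/mcp/your-server"
TOKEN = "your-instance-token"

def build_tools_list_request(request_id: int = 1) -> tuple[dict, dict]:
    """Build the headers and JSON-RPC 2.0 body for `tools/list`,
    the standard MCP method for discovering a server's tools."""
    headers = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    }
    body = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }
    return headers, body

headers, body = build_tools_list_request()
print(json.dumps(body))
```

POSTing that body to the endpoint with those headers is all an MCP-aware client (n8n, Zapier, a custom script) needs to do; the same pattern applies to `tools/call` for invoking a specific tool.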
## Core Features
### MCP Deployment
- **GitHub to URL in 30 seconds** — Connect your repo, select a branch, get an HTTP endpoint
- **Auto-redeploy on push** — Push to your repo, DeployStack rebuilds and redeploys automatically
- **Runtime detection** — Supports Node.js, Python, and Docker-based MCP servers
- **Direct MCP endpoints** — Each server gets its own URL and instance token
- Learn more: [MCP Deployment](https://deploystack.io/mcp-deployment)
### MCP Server Catalog
- **Curated catalog** — Browse and install popular MCP servers with one click
- **No local installation** — Servers run on DeployStack satellite infrastructure
- **Instant setup** — Install, add credentials, start using
### Token Optimization
- **Hierarchical Router** — Reduces token consumption from 75,000 to 1,372 tokens (98% reduction)
- **Two Meta-Tools Pattern** — Expose