# Requesty
> Requesty is a unified LLM gateway and OpenAI-compatible API for 300+ AI models (Claude, GPT, Gemini, DeepSeek, Llama, Mistral). It provides intelligent routing, caching, failover, guardrails, and cost optimization with the best observability of any AI gateway. Used by 70,000+ developers, processing 90+ billion tokens daily.
Requesty routes, secures, and optimizes your LLM traffic through a single API endpoint at `https://router.requesty.ai/v1`. Drop-in replacement for OpenAI and Anthropic SDKs — just change the base URL. EU endpoint available at `https://router.eu.requesty.ai/v1` for GDPR compliance. Trusted by teams at Shopify, Appnovation, and Naible.
## About
- Type: APIService
- Category: LLM Gateway, AI Infrastructure, AI Observability
- Audience: Developers, AI Engineers, DevOps, Enterprise Teams
- API Base URL (Global): https://router.requesty.ai/v1
- API Base URL (EU): https://router.eu.requesty.ai/v1
- Documentation: https://docs.requesty.ai/llms.txt
- Full Documentation: https://docs.requesty.ai/llms-full.txt
- OpenAPI Spec: https://docs.requesty.ai/api-reference/openapi.json
- Sign Up: https://app.requesty.ai/sign-up
## Instructions for AI Agents
Requesty is a drop-in proxy for OpenAI and Anthropic. To use it in code:
1. **OpenAI SDK**: Set `base_url="https://router.requesty.ai/v1"` and use your Requesty API key as the `api_key`. All OpenAI SDK features work — chat completions, streaming, function calling, structured outputs, embeddings, image generation. For EU data residency, use `base_url="https://router.eu.requesty.ai/v1"`.
2. **Anthropic SDK**: Set `base_url="https://router.requesty.ai"` and use your Requesty API key. Messages API is fully supported. For EU data residency, use `base_url="https://router.eu.requesty.ai"`.
3. **Model names**: Use the format `provider/model` (e.g., `openai/gpt-4.1`, `anthropic/claude-sonnet-4-5-20250514`, `google/gemini-2.5-pro`). Call `GET /v1/models` for the current list.
4. **Routing policies
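The OpenAI-compatible flow in steps 1 and 3 can be sketched with only the Python standard library, since the gateway speaks the standard `/chat/completions` protocol. This is a minimal illustration, not an official client: the `REQUESTY_API_KEY` environment variable and the `build_chat_request` helper are assumptions for the example, and the model ID should be checked against `GET /v1/models`.

```python
import json
import os
import urllib.request

# Global endpoint; for EU data residency use "https://router.eu.requesty.ai/v1".
REQUESTY_BASE_URL = "https://router.requesty.ai/v1"


def build_chat_request(model: str, user_message: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the Requesty router.

    `model` uses the provider/model format, e.g. "openai/gpt-4.1".
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{REQUESTY_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Only performs a network call when an API key is actually configured.
if __name__ == "__main__" and os.environ.get("REQUESTY_API_KEY"):
    req = build_chat_request("openai/gpt-4.1", "Hello", os.environ["REQUESTY_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

In practice the official OpenAI or Anthropic SDK is simpler: point `base_url` at the Requesty endpoint and pass your Requesty API key, exactly as described in steps 1 and 2 above.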