Agentic Readiness Score: 40/100 (Partial)

Tags: developer, devtools, monitoring, analytics, llms-txt, mcp, ai-friendly, ml

Agentic Signals

- 📄 llms.txt: Found
- 🤖 ai-plugin.json: Not found
- 📖 OpenAPI Spec: Not found
- 🔗 Structured API: Not found
- 🏷 Schema.org Markup: Not found
- MCP Server: Found
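The signals above can be probed with simple HTTP checks. As a rough sketch, assuming the scanner looks for files at their conventional locations (`/llms.txt` and `/.well-known/ai-plugin.json` are the standard paths; the OpenAPI path is a common guess and not documented here):

```python
# Hedged sketch of an agentic-signal probe. The paths below are
# conventional locations, not the actual paths this scanner uses.
from typing import Callable, Dict

SIGNAL_PATHS = {
    "llms.txt": "/llms.txt",
    "ai-plugin.json": "/.well-known/ai-plugin.json",
    "OpenAPI Spec": "/openapi.json",  # assumption: location varies per site
}

def check_signals(base_url: str, fetch: Callable[[str], int]) -> Dict[str, bool]:
    """Return {signal name: present?} using fetch(url) -> HTTP status code."""
    results = {}
    for name, path in SIGNAL_PATHS.items():
        status = fetch(base_url.rstrip("/") + path)
        results[name] = status == 200
    return results

# Usage with a stub fetcher (a real checker would use urllib or requests):
stub = lambda url: 200 if url.endswith("/llms.txt") else 404
print(check_signals("https://vllora.dev", stub))
# → {'llms.txt': True, 'ai-plugin.json': False, 'OpenAPI Spec': False}
```

Passing the fetcher in as a parameter keeps the check testable without network access.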

Embed this badge

Show off your agentic readiness — the badge auto-updates when your score changes.

Agentic Ready 40/100

llms.txt Content

# vLLora - Debug your agents in realtime

> Your AI Agent Debugger

This file contains links to documentation sections following the llmstxt.org standard.

## Table of Contents

- [Clone and Experiment with Requests](https://vllora.dev/docs/clone-and-experiment): Use **Clone Request** to turn any finished trace into an isolated **Experiment**, so you can safely try new prompts, models, and parameters without...
- [Configuration](https://vllora.dev/docs/configuration): vLLora can be configured via a `config.yaml` file or through command-line arguments. CLI arguments take precedence over config file settings.
- [Custom Endpoints](https://vllora.dev/docs/custom-endpoints): Connect your own endpoint to any provider in vLLora. This allows you to use custom API gateways, self-hosted models, or OpenAI-compatible proxies.
- [Custom Providers and Models](https://vllora.dev/docs/custom-providers): vLLora is designed to be agnostic and flexible, allowing you to register **Custom Providers** (your own API endpoints) and **Custom Models** (speci...
- [Debugging LLM Requests](https://vllora.dev/docs/debug-mode): vLLora supports interactive debugging for LLM requests. When Debug Mode is enabled, vLLora pauses requests before they are sent to the model. You c...
- [Installation](https://vllora.dev/docs/installation): vLLora can be installed via Homebrew, the Rust crate, or by building from source.
- [Introduction](https://vllora.dev/docs/introduction): Debug your AI agents with complete visibility into every request. vLLora works out of the box with OpenAI-compatible endpoints, supports 300+ model...
- [License](https://vllora.dev/docs/license): vLLora is [fair-code](https://faircode.io/) distributed under the **Elastic License 2.0 (ELv2)**.
- [Lucy](https://vllora.dev/docs/lucy): Diagnose agent failures and latency issues directly inside your traces using Lucy.
- [MCP Support](https://vllora.dev/docs/mcp-support): vLLora provides full support for **Model Context Protocol (MCP)** s