# NeuroLink Documentation (Summary)
> Enterprise AI Development Platform - Unified provider access, MCP integration, professional CLI
Generated: 2026-04-18T13:10:14.378Z
Full documentation: https://docs.neurolink.ink/llms-full.txt
---
## Project Overview
NeuroLink is an enterprise AI development platform that provides:
- Unified access to 13+ AI providers through a single consistent API
- 58+ MCP (Model Context Protocol) tools and integrations
- TypeScript SDK and professional CLI
- Production-ready features: Redis memory, failover, telemetry
- Multimodal support: text, images, PDFs, CSV, audio, video
## Supported Providers
- **OpenAI** (`openai`)
- **Anthropic Claude** (`anthropic`)
- **Google AI Studio (Gemini)** (`google-ai`)
- **Google Vertex AI** (`vertex`)
- **AWS Bedrock** (`bedrock`)
- **Azure OpenAI** (`azure`)
- **Mistral AI** (`mistral`)
- **LiteLLM (100+ models)** (`litellm`)
- **OpenRouter** (`openrouter`)
- **Ollama (Local)** (`ollama`)
- **Hugging Face** (`huggingface`)
- **AWS SageMaker** (`sagemaker`)
- **OpenAI-Compatible** (`openai-compatible`)
## API Signatures Summary
### Core Methods
- `neurolink.generate(options)` - Generate text response
- `neurolink.stream(options)` - Stream text response
- `neurolink.generateImage(options)` - Generate images (Gemini/Vertex)
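
The core call pattern can be sketched as below. The `NeuroLink` class here is a stand-in mock so the snippet runs without credentials or the package installed; the option shape and the generate/stream result shapes are assumptions inferred from this summary, not verified SDK typings.

```typescript
// Minimal option shape assumed from the "Common Options" list below.
type GenerateOptions = { provider: string; input: { text: string } };

// Mock with the same method names as the real SDK (assumption).
class MockNeuroLink {
  async generate(options: GenerateOptions): Promise<{ content: string }> {
    return { content: `[${options.provider}] echo: ${options.input.text}` };
  }
  // Assumes stream() yields text chunks as they arrive.
  async *stream(options: GenerateOptions): AsyncGenerator<string> {
    for (const word of options.input.text.split(" ")) yield word + " ";
  }
}

async function main() {
  const neurolink = new MockNeuroLink();

  // One-shot generation: await the full response.
  const result = await neurolink.generate({
    provider: "openai",
    input: { text: "Hello NeuroLink" },
  });
  console.log(result.content);

  // Streaming: consume chunks incrementally with for-await.
  let streamed = "";
  for await (const chunk of neurolink.stream({
    provider: "anthropic",
    input: { text: "Hello streaming" },
  })) {
    streamed += chunk;
  }
  console.log(streamed.trim());
}

main();
```

With the real SDK, only the constructor line changes; the call sites stay the same.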
### Configuration
- `new NeuroLink(config)` - Initialize SDK
- `neurolink.addExternalMCPServer(name, config)` - Add MCP server
- `neurolink.registerTool(tool)` - Register custom tool
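
A hedged sketch of wiring up an external MCP server. The config field names (`command`, `args`, `env`) and the example server package are assumptions following common MCP conventions; the summary does not document the exact `addExternalMCPServer` config shape.

```typescript
// Hypothetical MCP server config shape (assumption, not a verified typing).
interface MCPServerConfig {
  command: string;                // executable that launches the server
  args?: string[];                // arguments passed to the command
  env?: Record<string, string>;   // environment variables for the process
}

// Illustrative stdio-launched MCP server; package name is an example only.
const githubServer: MCPServerConfig = {
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-github"],
  env: { GITHUB_TOKEN: "<token>" },
};

// With the SDK installed, registration would look roughly like:
//   const neurolink = new NeuroLink({ /* config */ });
//   await neurolink.addExternalMCPServer("github", githubServer);
console.log(githubServer.command);
```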
### Common Options
- `provider` - AI provider slug (openai, anthropic, google-ai, etc.)
- `model` - Model name
- `input.text` - Prompt text
- `input.images` - Image attachments
- `maxTokens` - Response length limit
- `temperature` - Sampling randomness (0-1); lower values are more deterministic
- `thinkingLevel` - Extended thinking (minimal, low, medium, high)
- `structuredOutput` - Zod schema for typed responses
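
The options above can be assembled as a plain object. This sketch defines a local interface mirroring the list (the interface itself and the model name are illustrative assumptions, not a verified SDK typing):

```typescript
// Option shape assumed from the "Common Options" list above.
interface GenerateOptions {
  provider: string;                          // provider slug, e.g. "google-ai"
  model?: string;                            // model name
  input: { text: string; images?: string[] }; // prompt text and attachments
  maxTokens?: number;                        // response length limit
  temperature?: number;                      // 0-1
}

const options: GenerateOptions = {
  provider: "google-ai",
  model: "gemini-2.0-flash",  // illustrative model name
  input: { text: "Summarize the MCP protocol in one sentence." },
  maxTokens: 256,
  temperature: 0.3,
};

// With the SDK initialized, this would be passed as:
//   const result = await neurolink.generate(options);
console.log(options.provider);
```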
### CLI Commands
- `neurolink generate <prompt>` - Generate text
- `neurolink stream <prompt>` - Stream text
- `neurolink loop` -