# Files and Resources
## Attaching Files
You can include files in a conversation using Paths:
```python
from fast_agent import Prompt
from pathlib import Path
plans = await agent.send(
    Prompt.user(
        "Summarise this PDF",
        Path("secret-plans.pdf"),
    )
)
```
This works for any MIME type that the model can tokenize.
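How a file is interpreted generally follows from its MIME type, which is typically inferred from the extension. A minimal stdlib illustration (this is not fast-agent's own detection logic, which may differ):

```python
import mimetypes

# Guess the MIME type from the file extension alone.
mime_type, _ = mimetypes.guess_type("secret-plans.pdf")
print(mime_type)  # → application/pdf
```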
## MCP Resources
MCP Server resources can be conveniently included in a message with:
```python
description = await agent.with_resource(
    "What is in this image?",
    "resource://images/cat.png",
    "mcp_image_server",
)
```
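Resource identifiers follow standard URI syntax, so their parts can be inspected with the stdlib (illustrative only; MCP servers define their own URI schemes):

```python
from urllib.parse import urlsplit

# Break the resource URI into scheme / host / path components.
parts = urlsplit("resource://images/cat.png")
print(parts.scheme, parts.netloc, parts.path)  # → resource images /cat.png
```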
## Prompt Files
Prompt Files can include Resources:
agent_script.txt
```md
---USER
Please extract the major colours from this CSS file:
---RESOURCE
index.css
```
They can either be loaded with `fast_agent.load_prompt`, or delivered via the built-in `prompt-server`.
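The `---ROLE` delimiter format is simple enough to sketch a parser for (`parse_prompt_file` is a hypothetical helper for illustration, not part of fast-agent; the real loader may behave differently):

```python
def parse_prompt_file(text: str) -> list[tuple[str, str]]:
    """Split a prompt file into (role, content) sections on ---ROLE delimiters."""
    sections: list[tuple[str, str]] = []
    role, lines = None, []
    for line in text.splitlines():
        if line.startswith("---"):
            if role is not None:
                sections.append((role, "\n".join(lines).strip()))
            role, lines = line[3:].strip(), []
        else:
            lines.append(line)
    if role is not None:
        sections.append((role, "\n".join(lines).strip()))
    return sections

script = """---USER
Please extract the major colours from this CSS file:
---RESOURCE
index.css"""

sections = parse_prompt_file(script)
# → [("USER", "Please extract the major colours from this CSS file:"),
#    ("RESOURCE", "index.css")]
```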
# Defining Agents
## Basic Agents
Defining an agent is as simple as:
```python
@fast.agent(
    instruction="Given an object, respond only with an estimate of its size."
)
```
We can then send messages to the Agent:
```python
async with fast.run() as agent:
    moon_size = await agent("the moon")
    print(moon_size)
```
Or start an interactive chat with the Agent:
```python
async with fast.run() as agent:
    await agent.interactive()
```
Here is the complete `sizer.py` Agent application, with boilerplate code:
sizer.py
```python
import asyncio

from fast_agent.core.fastagent import FastAgent

# Create the application
fast = FastAgent("Agent Example")


@fast.agent(
    instruction="Given an object, respond only with an estimate of its size."
)
async def main():
    async with fast.run() as agent:
        await agent()


if __name__ == "__main__":
    asyncio.run(main())
```
The Agent can then be run with `uv run sizer.py`.
Specify a model with the `--model` switch - for example `uv run sizer.py --model sonnet`.
You can also pass a `Path` for the instruction - e.g.
```python
from pathlib import Path

@fast.agent(
    instruction=Path(".