# Ori Mnemos
> Markdown-native persistent memory infrastructure for AI agents. Open-source MCP server that gives any LLM durable, searchable, graph-connected memory across sessions. No cloud. No API keys. Your files, your machine.
Ori Mnemos is an npm package (`ori-memory`) that provides AI agents with persistent memory through the Model Context Protocol (MCP). It stores notes as plain markdown files with YAML frontmatter, builds a local embedding index using Xenova/all-MiniLM-L6-v2 (no API keys needed), and exposes semantic search, graph queries, and note management through MCP tools.
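A note on disk might look like the following sketch. The frontmatter fields shown are illustrative assumptions, not the exact schema (`ori_validate` checks notes against the real one):

```markdown
---
title: Retrieval pipeline notes   # illustrative fields, not the actual schema
tags: [search, architecture]
created: 2025-01-15
---

Vector search alone can miss exact identifiers, so it is paired with
keyword matching. See [[Knowledge graph design]] for how wiki-links
become graph edges.
```

Because each note is a plain `.md` file, any markdown tool can read or edit it directly.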
## Install
- Human: `npm i ori-memory` then `ori init` then `ori serve --mcp`
- Agent: Add `{"ori": {"command": "ori", "args": ["serve", "--mcp"]}}` to your MCP client config
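For MCP clients that read a JSON config file, the agent entry above expands to something like this sketch; the surrounding `mcpServers` key is an assumption based on common MCP client configs and may be named differently in your client:

```json
{
  "mcpServers": {
    "ori": {
      "command": "ori",
      "args": ["serve", "--mcp"]
    }
  }
}
```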
## Key Features
- **MCP server** for AI agent memory (works with Claude Code, Cursor, Windsurf, Cline, or any MCP client)
- **Markdown-native storage** — notes are plain .md files with YAML frontmatter, not locked in a database
- **Local embeddings** — Xenova/all-MiniLM-L6-v2 runs locally, no API keys or cloud dependency
- **Three-signal retrieval** — combines vector search, keyword matching, and graph-based spreading activation into one composite score
- **Knowledge graph** — wiki-links (`[[note title]]`) as edges, PageRank for importance scoring
- **Session persistence** — your agent wakes up knowing who it is, what it was working on, and what it learned
- **Git-versionable** — your AI's memory has full version history: diff it, branch it, merge it
- **No vendor lock-in** — if Ori disappears, your files still work in Obsidian, VS Code, or any text editor
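The three-signal retrieval above can be sketched as a weighted combination. The weights, interface, and function names here are hypothetical, chosen only to illustrate how the signals might compose; Ori's actual scoring may differ:

```typescript
// Hedged sketch of three-signal composite scoring.
// Weights and names are illustrative, not Ori's actual implementation.

interface Signals {
  vector: number;  // cosine similarity from the local embedding index, 0..1
  keyword: number; // keyword-match score, 0..1
  graph: number;   // spreading-activation score over wiki-link edges, 0..1
}

// Illustrative weights that sum to 1.
const WEIGHTS = { vector: 0.6, keyword: 0.25, graph: 0.15 };

function compositeScore(s: Signals): number {
  return (
    WEIGHTS.vector * s.vector +
    WEIGHTS.keyword * s.keyword +
    WEIGHTS.graph * s.graph
  );
}

// A note that also matches keywords and is well-linked outranks one
// with the same vector similarity alone.
const linked = compositeScore({ vector: 0.8, keyword: 0.5, graph: 0.2 });
const bare = compositeScore({ vector: 0.8, keyword: 0.0, graph: 0.0 });
console.log(linked > bare); // true
```

The point of combining signals is that each covers a blind spot: embeddings miss exact identifiers, keywords miss paraphrases, and graph activation surfaces notes connected to the match even when their own text scores poorly.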
## MCP Tools
- `ori_orient` — Session briefing: daily status, reminders, vault health, active goals
- `ori_add` — Create a note in inbox
- `ori_promote` — Promote an inbox note to notes/ with classification, linking, and area assignment
- `ori_validate` — Validate a note against schema
- `ori_query` — Query the vault (orphans, dangling links, backlinks, c