Frequently asked questions about Lore.
Memory systems store processed facts without attribution. Lore preserves original documents so you can cite exactly what was said, by whom, and when. When an AI tool uses Lore, it can say "In the Jan 15 interview, Sarah said..." rather than just "Users want faster exports."
Discovery (Phase 1) is free — Lore computes content hashes and checks which files are new without any LLM calls. Only new files trigger LLM processing (Phase 2) for metadata extraction and embedding generation. Re-syncing existing files costs nothing.
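The two-phase flow can be sketched in a few lines. This is a conceptual illustration of hash-based discovery, not Lore's actual code; `discover` and `known_hashes` are invented names standing in for the hashes already recorded in the index.

```python
import hashlib
from pathlib import Path

def discover(paths, known_hashes):
    """Phase 1: hash every file and keep only the ones not seen before.

    Hashing is cheap and local; only the files returned here would go on
    to Phase 2 (the LLM calls for metadata extraction and embeddings).
    """
    new_files = []
    for path in paths:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        if digest not in known_hashes:
            new_files.append((path, digest))
    return new_files
```

Re-running discovery over unchanged files returns an empty list, which is why re-syncing costs nothing.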
Not yet. Each user's data is isolated via Postgres Row Level Security, so teammates cannot see each other's documents. Team sharing is planned for a future release.
Markdown, JSONL, JSON, plain text, CSV, HTML, XML, PDF, and images (JPEG, PNG, GIF, WebP). Claude extracts metadata automatically during sync — including EXIF data from images and transcripts from structured JSON.
When you add a sync source with lore sync add, the default glob is **/* (all files). Use --glob to restrict to specific types if needed. The ingest MCP tool accepts any text content directly — just tell your AI "save this to lore."
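To get a feel for what `**/*` selects versus a narrower pattern, here is a quick sketch using Python's standard glob semantics (the directory layout is invented for the demo, and this assumes Lore's globs behave like standard recursive globs):

```python
from pathlib import Path
import tempfile

# A made-up sync directory: one Markdown note and one image.
root = Path(tempfile.mkdtemp())
(root / "notes").mkdir()
(root / "notes" / "ideas.md").write_text("# ideas")
(root / "photo.png").write_bytes(b"")

# The default **/* pattern matches everything, recursively.
all_files = sorted(p.name for p in root.glob("**/*") if p.is_file())
# A restricted pattern like **/*.md keeps only Markdown files.
md_only = sorted(p.name for p in root.glob("**/*.md"))
print(all_files)  # every file under root
print(md_only)    # only the Markdown note
```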
No self-hosting is required. The backend (Supabase with pgvector) is fully hosted. Just install, log in, and bring your own OpenAI and Anthropic API keys.
Run lore setup on each machine with the same email. Your data repo URL is saved to your account, so new machines discover and clone it automatically. Lore deduplicates by content hash (SHA256), so nothing gets indexed twice even if the same document is synced from multiple machines.
Lore itself is free. You pay for your own OpenAI and Anthropic API keys. Anthropic (Claude) powers metadata extraction and the research tool; costs vary by usage. Get a key at console.anthropic.com/settings/keys.

Your data stays private. Documents are stored in your local data directory and a Supabase database scoped to your user ID. Row Level Security ensures only you can access your data. API keys are stored locally in ~/.config/lore/config.json with restricted file permissions.
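On Unix, "restricted file permissions" typically means owner-only mode (0600). A minimal sketch of writing a config file that way; the path, key names, and `write_config` helper are illustrative, not Lore's actual implementation:

```python
import json
import stat
from pathlib import Path

def write_config(path: Path, config: dict) -> None:
    """Write config JSON readable and writable only by the owner."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2))
    # 0o600: strip group/other access so only this user can read the keys.
    path.chmod(stat.S_IRUSR | stat.S_IWUSR)
```

Applying the chmod after writing ensures the keys are never left world-readable, even if the process umask is permissive.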
Yes. The CLI works fully standalone:
lore search "query" — search from the terminal
lore browse — interactive TUI browser
lore research "question" — AI-powered research

MCP integration is optional but recommended for AI-assisted workflows.
Any MCP-compatible client: Claude Code, Claude Desktop, Cursor, Windsurf, Gemini CLI, Codex CLI, and more. See the MCP Setup guide for platform-specific configuration.