Knowledge layer for AI tools

Give every tool
your full context

Store documents once, search and cite from any AI tool. Synced across every machine you work on.

npm install -g @getlore/cli

Then run lore setup to sign in

Or paste this prompt into any AI tool:

Set up Lore for me using the instructions at https://getlore.ai/docs/agent-guide

Your AI walks you through it — asks for your email, sends a code, done

Works with
Claude, ChatGPT, Gemini, Cursor, OpenClaw, the CLI, and any MCP client

Context with sources

Memory tools lose the original. Lore keeps it so any tool can cite exactly what was said.

Memory

“Users want faster exports”

Lore

“In the Jan 15 interview, Sarah said ‘The export takes forever, I’ve lost work twice this week’”

You and your agents, in sync

Developers

Decisions, specs, and history persist across every tool and session.

“How has the auth system evolved?”

Researchers

Archive anything, search by meaning, get cited quotes from the original.

“Archive this: https://blog.example.com/post”

AI agents

Persistent knowledge via MCP. Search before asking, cite sources automatically.

search("onboarding decisions")
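On the wire, that is a standard MCP `tools/call` request, so any MCP client can issue it. A sketch of what the request envelope looks like (the envelope follows the MCP spec; Lore's response shape is not shown here):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "onboarding decisions" }
  }
}
```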

Browse your knowledge base

lore browse gives you a full TUI for searching, reading, and researching — no MCP client needed.

lore browse
Lore TUI — browse documents, search, and research from the terminal

What you get

Semantic Search

Find documents by meaning, not just keywords.

Citations

Every result links to the original source.

9 MCP Tools

Works with any MCP client out of the box.

Git Version Controlled

Your context is checked into git. Full history, diffs, rollbacks.

Deep Research

AI cross-references sources and synthesizes findings.

Synced Everywhere

Same context on every machine and every AI tool. Deduplicated automatically.
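Because the data repo mentioned above is plain git, standard git commands work on it. A self-contained sketch of the history/diff/rollback workflow, run against a throwaway repo (file names and commit messages are hypothetical; your real data repo lives wherever lore setup put it):

```shell
# Simulate two syncs of one document in a scratch repo, then roll it back.
cd "$(mktemp -d)" && git init -q
git config user.email demo@example.com && git config user.name demo
printf 'auth: sessions\n' > auth.md
git add auth.md && git commit -qm 'first sync'
printf 'auth: sessions + OAuth\n' > auth.md
git commit -qam 'second sync'
git log --oneline -- auth.md          # full history of the document
git diff HEAD~1 -- auth.md            # what the last sync changed
git checkout HEAD~1 -- auth.md        # roll the document back one version
cat auth.md                           # prints: auth: sessions
```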

Many ways in, one place to search

Drop files in a folder, push content from any AI tool, or pipe it through the CLI. Everything lands in the same searchable knowledge base.

Folder sync

Point Lore at any folder. Drag files in — they're indexed automatically.

lore sync add --path ~/research
MCP ingest

Any AI tool can push content directly. Just ask: “store this in lore.”

“Save this meeting summary to lore”
CLI

Ingest directly from the command line, a file, or a pipe.

lore ingest --file notes.md
Query from anywhere

Search from the CLI, the TUI, or any MCP-connected AI tool. Every result cites the original source.

lore search "what did the team decide about auth?"

Set up in 30 seconds

No account to create, no password to remember. Just your email and two API keys.

1. Install

npm install -g @getlore/cli

2. Run setup

Paste your API keys, enter your email, receive a one-time code — done. That's the whole login.

lore setup

3. Search

Add sources with lore sync add, push content via the ingest MCP tool, or just start searching:

lore search "user pain points"
lore research "What should we prioritize?"

Or let your AI handle setup:

1. Send instructions

Set up Lore for me using the instructions at https://getlore.ai/docs/agent-guide

Your AI reads the guide, installs the package, and asks you for your email and API keys.

2. Paste the code

A 6-digit verification code is sent to your email. Paste it back into the chat when your AI asks.

3. Done

Your AI finishes setup and starts the background daemon. You're ready to search, ingest, and research.

Connect to your tools

1. Add the MCP server

One-click installs are available for Cursor, VS Code, VS Code Insiders, and Goose.

Or configure manually:

Claude Code: add to .mcp.json in your project root, or ~/.claude.json globally:

{
  "mcpServers": {
    "lore": {
      "command": "npx",
      "args": ["-y", "@getlore/cli", "mcp"]
    }
  }
}

Claude Desktop: Settings → Developer → Edit Config. Include API keys, since Desktop doesn't inherit your shell environment:

{
  "mcpServers": {
    "lore": {
      "command": "npx",
      "args": ["-y", "@getlore/cli", "mcp"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "ANTHROPIC_API_KEY": "your-key"
      }
    }
  }
}

Cursor or Windsurf: add to .cursor/mcp.json or ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "lore": {
      "command": "npx",
      "args": ["-y", "@getlore/cli", "mcp"]
    }
  }
}
2. Sign in

Run npx @getlore/cli setup to configure API keys and sign in. This is required before tools will work.

Available tools

search - Find by meaning or keywords
get_source - Full document content
list_sources - Browse by project or type
list_projects - All projects overview
ingest - Add content (docs, insights, decisions)
research_status - Poll async research results
sync - Refresh from source dirs
archive_project - Archive completed work
research - AI-powered deep research
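The research flow is asynchronous: research starts a job, and research_status polls for its result. A sketch of the polling request an MCP client would send (the job_id argument name and value are guesses for illustration, not Lore's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "research_status",
    "arguments": { "job_id": "abc123" }
  }
}
```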

Commands

lore setup - Guided wizard (config, login, data repo)
lore auth login - Sign in with email OTP
lore sync - Sync all configured sources
lore sync add - Add a source directory
lore ingest - Push content into the knowledge base
lore search <query> - Semantic search
lore research <query> - AI-powered deep research
lore browse - Interactive TUI browser
lore update - Check for and install updates
lore mcp - Start MCP server

Common questions

How is Lore different from a memory system?

Memory systems store processed summaries without attribution. Lore preserves original documents so you can cite exactly what was said, by whom, and when.

What does it cost to use?

Lore is free. You bring your own API keys — OpenAI for embeddings, Anthropic for research. Re-syncing existing files costs nothing.

What file formats work?

Markdown, JSON, JSONL, plain text, CSV, HTML, XML, PDF, and images (JPG, PNG, GIF, WebP). Claude extracts metadata automatically.

Do I need to set up infrastructure?

No. The backend is fully hosted. Install, log in, bring your API keys.

How does multi-machine sync work?

Run lore setup on each machine. Your data repo URL is saved to your account, so new machines auto-discover it. Content is deduplicated by hash.
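Hash-based dedup is easy to picture: identical bytes hash identically, so the same file arriving from a second machine adds nothing new. A sketch of the idea (an illustration, not Lore's implementation):

```shell
# Identical content produces one unique hash, hence one stored copy.
cd "$(mktemp -d)"
printf 'same notes\n' > a.md        # ingested on machine A
printf 'same notes\n' > b.md        # the same file ingested on machine B
sha256sum a.md b.md | awk '{print $1}' | sort -u | wc -l   # prints 1
```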