Stop building
amnesiac agents.

Because "similar" isn't the same as "remembered." Reeve is a temporal knowledge graph that understands what you store — not just what's similar.

3-lane Retrieval engine
Neo4j Knowledge graph
MCP Native protocol
Local Ollama compatible

LLMs forget.
Every conversation
starts from zero.

Common workarounds — chat history, vector stores, RAG pipelines — all break down over time. They retrieve similar text. They can't handle contradictions. They have no concept of time or state evolution.

Ask any of them: "What changed about me since last year?"
Silence. Ask Reeve — it knows.

No cross-session memory

Every new conversation resets context entirely. Your agent is a stranger every time it wakes up.

Chat history · Vector stores · RAG

No contradiction handling

"I moved to New York" and "I live in San Francisco" coexist in a vector store. No resolution. No truth.

Pinecone · ChromaDB · Weaviate

No sense of time

Ask "what changed since last year?" — silence. Memory systems store facts, not state evolution.

MemGPT · LangMem · OpenAI memory

Three steps.
Lifetime memory.

01

Store anything

Call store() with any text. Reeve's LLM parses it into structured entities, states, actions, and locations — writing a living temporal knowledge graph to Neo4j. Not chunks. Not embeddings. Structured understanding.

python
from reeve import store

store("I just joined Google as a software engineer")
store("I love playing football")
store("I moved from San Francisco to New York")

02

Graph evolves, history preserved

New facts don't overwrite old ones — they create SUPERSEDES chains. "I moved to New York" marks San Francisco as historical, not deleted. Entity resolution ensures "Google", "my company", and "work" resolve to one canonical node.

graph
(city: New York) ──SUPERSEDES──▶ (city: San Francisco)
active: true                    active: false

"Google" = "my company" = "work"  → one node
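The chain above can be sketched in plain Python. This is an illustration of the supersession logic, not Reeve's Neo4j implementation — the `StateNode` class and helper names here are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class StateNode:
    """One fact about an entity, e.g. (city: San Francisco)."""
    value: str
    active: bool = True
    supersedes: Optional["StateNode"] = None


def supersede(old: StateNode, new_value: str) -> StateNode:
    """Mark the old state historical and link the new state back to it.
    Nothing is deleted — the old node stays reachable via the chain."""
    old.active = False
    return StateNode(value=new_value, supersedes=old)


def history(state: StateNode) -> list[str]:
    """Walk the SUPERSEDES chain, newest first."""
    chain, node = [], state
    while node is not None:
        chain.append(node.value)
        node = node.supersedes
    return chain


sf = StateNode("San Francisco")
ny = supersede(sf, "New York")
print(ny.value, ny.active)   # → New York True
print(history(ny))           # → ['New York', 'San Francisco']
```

The key property: answering "where do I live?" reads only the active node, while "did I ever live in SF?" walks the full chain.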
03

Query in natural language

Ask anything. The 3-lane retrieval engine (semantic + temporal + recency) surfaces the right memory — not just the most similar text, but the most relevant knowledge at this moment in time. Landmark memories bypass recency decay entirely.

python
from reeve import query

query("Where do I live?")
# → "New York."

query("Should I play football with my friend?")
# → "Yes — you love football."

query("Did I ever live in SF?")
# → "Yes, before moving to New York."

Built for permanence,
not prototypes.

Core

3-Lane Retrieval

Most systems rank by vector similarity alone. Reeve combines three parallel lanes — semantic, temporal, and recency-weighted — into a single unified score. Important memories surface regardless of age.

score = 0.65×similarity + 0.30×importance + 0.05×recency
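A minimal sketch of that blend, assuming each lane's output is pre-normalized to [0, 1] (the lane values below are made up for illustration):

```python
def lane_score(similarity: float, importance: float, recency: float) -> float:
    """Blend the three retrieval lanes using the published weights."""
    return 0.65 * similarity + 0.30 * importance + 0.05 * recency


# A very similar but trivial memory vs. a moderately similar landmark:
stale = lane_score(similarity=0.9, importance=0.1, recency=0.0)     # 0.615
landmark = lane_score(similarity=0.6, importance=1.0, recency=0.0)  # 0.69
```

With importance weighted at 0.30, the old landmark outranks the near-duplicate — which is exactly why important memories surface regardless of age.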
Core

State Supersession

Facts evolve. Reeve tracks this with explicit SUPERSEDES chains — current answers are always accurate, history is always preserved.

Design

Landmark Memory

Major life events — promotions, moves, milestones — are protected with an importance floor. They bypass recency decay and surface instantly, no matter how old they are.
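The importance floor can be sketched like this. The floor value of 0.9 is an assumption for illustration — Reeve's actual threshold isn't published:

```python
LANDMARK_FLOOR = 0.9  # assumed value, for illustration only


def effective_importance(importance: float, is_landmark: bool) -> float:
    """Landmarks never drop below the floor, so recency decay can't bury them."""
    return max(importance, LANDMARK_FLOOR) if is_landmark else importance
```

A decade-old promotion with a decayed importance of 0.2 still scores as 0.9 in the retrieval blend.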

Protocol

MCP-Native

Works with any MCP-compatible client — Claude Desktop, LM Studio, AnythingLLM, Cursor. Paste a few lines of JSON. Done.

{"mcpServers": {"reeve": {"url": "..."}}}
Architecture

Temporal Knowledge Graph

Built on Neo4j with typed relationships — Episodes, Entities, Actions, States, Roles, Locations. Not an embedding dump. A living, evolving model of everything you've stored.

Reliability

Entity Resolution

"Google", "my company", "work" — all resolve to one canonical node via 3-layer matching: exact, substring, and embedding similarity. One identity, many names.
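A sketch of the first two layers in plain Python — exact match, then substring containment. The third layer (embedding cosine similarity) is noted but stubbed out here, since it needs a real embedding model; the `resolve` function and alias table are illustrative, not Reeve's API:

```python
from typing import Optional


def resolve(name: str, canonical: dict[str, list[str]]) -> Optional[str]:
    """Resolve a mention to a canonical entity via layered matching."""
    mention = name.strip().lower()
    # Layer 1: exact match against the canonical name or any known alias.
    for entity, aliases in canonical.items():
        if mention == entity.lower() or mention in (a.lower() for a in aliases):
            return entity
    # Layer 2: substring containment in either direction.
    for entity, aliases in canonical.items():
        for alias in [entity, *aliases]:
            a = alias.lower()
            if mention in a or a in mention:
                return entity
    # Layer 3: embedding similarity (cosine over alias vectors) — omitted
    # in this sketch; it catches paraphrases the first two layers miss.
    return None


entities = {"Google": ["my company", "work"]}
print(resolve("work", entities))          # → Google
print(resolve("my workplace", entities))  # → Google (substring: "work")
```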

Scalability

Lifespan-Aware Scaling

Search depth scales dynamically with graph size — 2% of total episodes, clamped between 50 and 500. Efficient at day one. Deep at year ten. Ready for a lifetime.
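That clamp is a one-liner. Whether the 2% is rounded, floored, or ceilinged isn't specified, so the `round` here is an assumption:

```python
def search_depth(total_episodes: int) -> int:
    """Search depth = 2% of stored episodes, clamped to [50, 500]."""
    return max(50, min(500, round(total_episodes * 0.02)))


print(search_depth(1_000))    # → 50  (2% is 20, clamped up to the floor)
print(search_depth(10_000))   # → 200 (2% lands inside the band)
print(search_depth(100_000))  # → 500 (2% is 2000, clamped to the ceiling)
```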

Up in minutes.

Paste into your MCP client config
json
{
  "mcpServers": {
    "reeve": {
      "type": "sse",
      "url": "https://api.reeve.co.in/mcp"
    }
  }
}

Restart your client after saving. Your AI will remember everything from this point forward.

Works with
Claude Desktop LM Studio AnythingLLM Cursor Ollama Any MCP client

Give your agent
a lifetime.

Memory that persists, evolves, and never forgets what matters. Built on a temporal knowledge graph engineered to last decades.

Build with Reeve →