Cipher: The Open-Source Memory Layer That Lets AI Remember Your Code
> “Every time I switch editors, I have to explain my project from scratch. What if the AI just… remembered?”
>
> — almost every developer who uses AI pair-programming tools
Cipher is an open-source memory framework built for exactly this frustration.
In plain English: it gives your AI assistant a long-term memory of your code, your decisions, and your reasoning—no matter which IDE or chat tool you use next.
1. What Problem Does Cipher Solve?
AI coding assistants are stateless: switch IDEs, open a new chat, or come back tomorrow, and you are explaining your architecture, conventions, and past fixes all over again. Cipher sits between your assistant and your project as a persistent memory layer, so that context follows you across editors, chat tools, and sessions.
2. Core Idea: Two-Layer Memory
Cipher borrows Daniel Kahneman’s “System 1 / System 2” idea and applies it to code:
- System 1 Memory (fast, intuitive knowledge):
  - Programming concepts (e.g., “React hooks rules”)
  - Business rules (e.g., “order state machine”)
  - Past interactions (e.g., “we changed `origin` to `true` in `vite.config.ts` to fix local CORS”)
- System 2 Memory (deliberate reasoning):
  - The reasoning chain the model followed to generate the code
  - Makes audits, refactors, and onboarding far easier
3. Quick Start in Three Minutes
3.1 Install globally (recommended)
```bash
npm install -g @byterover/cipher
```
or inside a project:
```bash
npm install @byterover/cipher
```
3.2 Zero-config first run
```bash
cipher                 # interactive mode
cipher "Remember the common CORS pitfalls with Vite + Express"
cipher --mode api      # REST server on http://localhost:3000
```
The first launch creates an `.env.example`; copy it to `.env`, add one API key, and you’re done.
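For example, a minimal sketch of that step (`sk-xxx` is a placeholder; any one supported provider key is enough):

```bash
cp .env.example .env
echo 'OPENAI_API_KEY=sk-xxx' >> .env   # placeholder; paste your real key
cipher "Remember that we deploy from the main branch only"
```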
4. Installation Options
Docker walk-through:
```bash
git clone https://github.com/campfirein/cipher.git
cd cipher
cp .env.example .env
# edit .env with your keys
docker-compose up -d
curl http://localhost:3000/health   # → ok
```
5. Configuration: From One-Line Env to Full YAML
5.1 Absolute minimum
```bash
OPENAI_API_KEY=sk-xxx
# or ANTHROPIC_API_KEY, GEMINI_API_KEY, etc.
```
5.2 Full control with cipher.yml
Create `memAgent/cipher.yml`:
```yaml
llm:
  provider: openai
  model: gpt-4-turbo
  apiKey: $OPENAI_API_KEY

# Expose local files to Claude Desktop
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
```
6. Choosing an Embedding Provider
Cipher defaults to OpenAI for reliability, but you can switch in one YAML block.
Local Ollama three-step:
```bash
# macOS
brew install ollama
ollama serve
ollama pull nomic-embed-text
```
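Then point Cipher at the local model in `cipher.yml`. A hedged sketch only: the `embedding` block and its key names below are assumptions modeled on the `llm` block above, so verify them against the docs:

```yaml
# Assumed config shape: key names mirror the llm block and may differ in practice
embedding:
  provider: ollama
  model: nomic-embed-text
  baseURL: http://localhost:11434
```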
7. LLM Support Matrix
Cipher works with OpenAI, Anthropic, and Gemini in the cloud, Azure OpenAI for enterprise deployments, and local models via Ollama, among others. Example configurations:
```yaml
# Local Ollama
llm:
  provider: ollama
  model: qwen2.5:32b
  baseURL: http://localhost:11434/v1
```

```yaml
# Azure
llm:
  provider: azure
  model: gpt-4o-mini
  apiKey: $AZURE_OPENAI_API_KEY
  azure:
    endpoint: $AZURE_OPENAI_ENDPOINT
```
8. CLI Reference
Handy sub-commands:
```bash
/session list            # show all sessions
/session new refactor    # create a new session
/session switch legacy   # jump to an old context
/config                  # print current YAML
/stats                   # token usage
/help                    # built-in docs
```
9. MCP: Plug Cipher Into Any IDE
9.1 Claude Desktop
Add to your Claude Desktop config file:
```json
{
  "mcpServers": {
    "cipher": {
      "type": "stdio",
      "command": "cipher",
      "args": ["--mode", "mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "ANTHROPIC_API_KEY": "sk-ant-xxx"
      }
    }
  }
}
```
Restart Claude Desktop—you’ll see an “ask_cipher” tool in the sidebar.
9.2 MCP Aggregator Mode (Advanced)
To expose all tools (not just `ask_cipher`), set:
"env": {
"MCP_SERVER_MODE": "aggregator",
"AGGREGATOR_CONFLICT_RESOLUTION": "prefix"
}
Cursor, Windsurf, or any MCP client can now use Cipher memory plus any other MCP server you list in `cipher.yml`.
10. SSE Transport: Connect From a Browser
Start the server:
```bash
cipher --mode mcp --mcp-transport-type sse --mcp-port 4000
```
A browser or front-end client then connects to `http://localhost:4000/mcp` for real-time updates.
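A minimal browser-side sketch using the standard `EventSource` API (the payload handling is an assumption, since the actual event schema isn’t documented here):

```typescript
// Subscribe to Cipher's SSE endpoint from the browser.
// The payload shape is an assumption; log raw events first to see the schema.
const source = new EventSource("http://localhost:4000/mcp");

source.onmessage = (event: MessageEvent) => {
  console.log("Cipher event:", event.data);
};

source.onerror = () => {
  // EventSource reconnects automatically; this just surfaces the drop.
  console.warn("SSE connection interrupted, retrying…");
};
```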
11. Real-World Example: Claude Code + Cipher in Three Steps
1. Install:

   ```bash
   npm i -g @byterover/cipher
   ```

2. Create `.env` in your project root:

   ```bash
   OPENAI_API_KEY=sk-xxx
   ANTHROPIC_API_KEY=sk-ant-xxx
   ```

3. Run:

   ```bash
   cipher --mode mcp
   ```

Open Claude Code and type `@cipher how did we fix Vite CORS last time?`. The exact conversation, plus the patch, appears instantly.
12. FAQ: 15 Questions Developers Ask
Q1: Do I need a GPU?
No. Except for Ollama, all inference is cloud-based.
Q2: My company blocks the internet.
Use Ollama + local embeddings—everything stays on-prem.
Q3: Where is the memory stored?
By default, local JSON and vector files; optionally Neo4j.
Q4: Will my code leak?
All calls use your own API keys; Cipher never uploads anything.
Q5: Does it work with Chinese comments?
Yes, the embedding models handle Chinese equally well.
Q6: How do I delete sensitive memories?
```bash
/session switch <id>
# then delete the session file shown in /config
```
Q7: How to isolate multiple projects?
Each project gets its own session folder and `cipher.yml`.
Q8: Can I use it with GitHub Copilot?
Copilot doesn’t expose MCP, but you can query Cipher through its REST API and feed results back.
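For example, a hedged sketch of that pattern (only the `/health` route is confirmed above; the query endpoint, payload, and response field below are assumptions to adapt to the real API):

```typescript
// Hypothetical query against a local `cipher --mode api` server.
// Endpoint path, request body, and response field name are all assumptions.
async function askCipher(question: string): Promise<string> {
  const res = await fetch("http://localhost:3000/api/message", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: question }),
  });
  if (!res.ok) throw new Error(`Cipher API error: ${res.status}`);
  const data = await res.json();
  return data.response; // field name assumed
}

// Feed the answer back to Copilot (or any other tool) as extra context.
askCipher("How did we fix Vite CORS last time?").then(console.log);
```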
Q9: Memory size limits?
OpenAI’s default embeddings are 1536-dimensional floats; at 4 bytes per float, one million records is roughly 1536 × 4 B × 10⁶ ≈ 6 GB of raw vectors, so disk space is the practical limit.
Q10: How to back up?
Zip the session directory; it contains both JSON and vector index.
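For example (the sessions path below is an assumption; confirm the actual location with `/config`):

```bash
zip -r cipher-backup.zip ./memAgent/sessions   # path assumed; check /config
```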
Q11: How is this different from LangChain Memory?
LangChain focuses on chain composition; Cipher is IDE-centric and plug-and-play via MCP.
Q12: Does it work on Windows?
Yes—NPM, Docker, and source builds are all cross-platform.
Q13: Is there a web UI?
Not yet; CLI + API today. A lightweight Next.js UI is on the roadmap.
Q14: How do I track costs?
`/stats` prints token usage; match it against your cloud bill.
Q15: Can I run read-only mode?
Yes: start with `--read-only` for CI environments.
13. Next Steps
- Full docs & examples: https://docs.byterover.dev/cipher/overview
- Join the Discord: https://discord.com/invite/UMRrpNjh5W
- Star the repo to get release notifications: https://github.com/campfirein/cipher
> Cipher isn’t another “do everything” AI framework. It does one small thing very well: remember every thought you had while coding, so you never have to repeat yourself.

Give it five minutes and it will give your entire project a long-term memory.