# omega-memory — Persistent Memory for Coding Agents

> OMEGA is a local-first, cross-model memory system for coding agents. It runs an MCP server plus hooks so recall is automatic across sessions.

## Install

`omega setup` (see Quick Use below) registers the MCP server for supported clients. If you manage your client configuration by hand instead, merge the server entry from the project README into your `.mcp.json`.

## Quick Use

```bash
pip install omega-memory[server]
omega setup
omega doctor
```

## Intro

**Best for:** developers who want cross-session memory without cloud lock-in
**Works with:** Python 3.11+, MCP clients, Claude Code/Cursor/Codex (via setup)
**Setup time:** 4-10 minutes

### Key facts (verified)

- GitHub: 135 stars · 22 forks · pushed 2026-05-13.
- License: Apache-2.0 · owner avatar and repo URL verified via the GitHub API.
- README-verified entrypoint: `pip install omega-memory[server]` (full install: memory + MCP server).

## Main

- Use the server+hooks path if you want ambient memory: setup wires up MCP plus session hooks, so recall happens at session start rather than only when you remember to query.
- Start by storing explicit decisions (architecture, style, constraints), then rely on semantic queries at the start of each new session to keep continuity.
- If you only need a library (no server), use the README's core-only install, but expect fewer integrations than the MCP server flow offers.

### Source-backed notes

- The README's Quick Install shows `pip install omega-memory[server]` followed by `omega setup` and `omega doctor`.
- The README states that `omega setup` downloads an ONNX embedding model (~90 MB) and registers an MCP server for supported clients.
- The README lists a table of 25 memory tools and documents a CLI section covering setup/doctor/status.

### FAQ

- **Does it send data to the cloud?** The README frames it as local-first; any network traffic comes from your LLM provider, not the memory store itself.
- **Can I use it without MCP?** Yes. The README shows a library-only install, but integrations are strongest with the MCP server flow.
- **What's a good first memory?** Store one durable rule (e.g., a linting or architecture constraint) and verify that it is recalled at the next session start.

## Source & Thanks

> Source: https://github.com/omega-memory/omega-memory
> License: Apache-2.0
> GitHub stars: 135 · forks: 22

---

Source: https://tokrepo.com/en/workflows/omega-memory-persistent-memory-for-coding-agents
Author: MCP Hub
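The "store explicit decisions, then query semantically next session" loop described under Main can be sketched in plain Python. This is a toy, not omega-memory's API: it swaps the README's ONNX embeddings for bag-of-words cosine similarity, and the `ToyMemory`, `store`, and `query` names are illustrative inventions, purely to show the shape of store-then-recall:

```python
# Toy store-then-recall loop. NOT omega-memory's API: bag-of-words cosine
# similarity stands in for the real embedding model, and all names here
# (ToyMemory, store, query) are hypothetical.
import math
import re
from collections import Counter

def _vec(text: str) -> Counter:
    # Crude tokenizer + term counts, standing in for an embedding model.
    return Counter(re.findall(r"\w+", text.lower()))

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyMemory:
    def __init__(self) -> None:
        self._items: list[str] = []

    def store(self, decision: str) -> None:
        # Persist an explicit decision (architecture, style, constraint).
        self._items.append(decision)

    def query(self, prompt: str, k: int = 1) -> list[str]:
        # Return the k stored decisions most similar to the prompt.
        q = _vec(prompt)
        ranked = sorted(self._items, key=lambda m: _cosine(q, _vec(m)), reverse=True)
        return ranked[:k]

mem = ToyMemory()
mem.store("Use ruff for linting; line length 100")
mem.store("All services talk over gRPC, never raw HTTP")
print(mem.query("what linting rules apply", k=1))
# → ['Use ruff for linting; line length 100']
```

In the real system, per the README, the embedding model arrives via `omega setup` and the MCP memory tools take the place of the `store`/`query` calls shown here.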