Main
Use the server+hooks path if you want ambient memory: setup wires MCP + session hooks so recall happens at session start, not only when you remember to query.
Start by storing explicit decisions (architecture, style, constraints), then rely on semantic recall at the start of each new session to keep continuity.
If you only need a library (no server), use the README’s core-only install — but expect fewer integrations than the MCP server flow.
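The store-then-recall loop above can be sketched without omega-memory's real API (which these notes don't show). Everything below — the class name, methods, and word-overlap scoring — is an illustrative stand-in: a real store would compare embedding vectors, not token counts.

```python
import re
from collections import Counter


class ToyMemoryStore:
    """Illustrative stand-in for a semantic memory store (NOT omega-memory's API)."""

    def __init__(self) -> None:
        self.notes: list[str] = []

    def store(self, note: str) -> None:
        # Persist an explicit decision (architecture, style, constraint).
        self.notes.append(note)

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Toy "semantic" match: rank notes by shared words with the query.
        # A real store would use embedding similarity instead.
        q = Counter(re.findall(r"\w+", query.lower()))
        return sorted(
            self.notes,
            key=lambda n: -sum(
                (q & Counter(re.findall(r"\w+", n.lower()))).values()
            ),
        )[:k]


# Session 1: store durable decisions.
mem = ToyMemoryStore()
mem.store("Architecture: all services talk to Postgres through the repo layer")
mem.store("Style: ruff enforces line length 100")

# Session 2 start (what a session hook would do): recall relevant context.
print(mem.recall("what are our style rules?", k=1))
```

The point of the sketch is the shape of the flow: writes happen when a decision is made, and the read happens automatically at session start via the hook, not on demand.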
Source-backed notes
- README Quick Install shows `pip install omega-memory[server]` followed by `omega setup` and `omega doctor`.
- README states `omega setup` downloads an ONNX embedding model (~90MB) and registers an MCP server for supported clients.
- README lists a memory-tool table (25 tools) and documents a CLI section for setup/doctor/status.
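The Quick Install sequence from the README, as a copy-paste block (quoting the extra guards against shell glob expansion):

```shell
pip install "omega-memory[server]"
omega setup    # downloads the ~90MB ONNX embedding model, registers the MCP server
omega doctor   # health check for the installation
```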
FAQ
- Does it send data to the cloud?: README frames it as local-first; any network traffic comes from your LLM provider, not the memory store itself.
- Can I use it without MCP?: Yes — README shows a library-only install, but integrations are strongest with the MCP server flow.
- What’s a good first memory?: Store one durable rule (e.g., linting or architecture constraint) and verify it’s recalled on the next session start.