FAQ
Q: How is this different from a vector database? A: Vector DBs do similarity search on chunks. Cognee builds a knowledge graph — it understands entities, relationships, and can reason across multiple documents. Think "structured memory" vs "fuzzy search."
Q: Can I use it with Claude Code? A: Yes. Use Cognee as a Python library in your agent's tools. Add project docs to Cognee, then query them during coding sessions for context-aware assistance.
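A minimal sketch of that pattern, assuming the `cognee` package is installed and exposes the async `add`, `cognify`, and `search` calls shown in its docs (adjust names to your installed version); the two wrapper functions here are hypothetical tool names, not part of Cognee:

```python
import asyncio

async def remember(text: str) -> None:
    """Agent tool: store a doc or snippet in Cognee's knowledge graph."""
    import cognee  # lazy import so the tool module loads even without cognee
    await cognee.add(text)       # stage the raw content
    await cognee.cognify()       # extract entities/relations into the graph

async def recall(question: str):
    """Agent tool: query the graph for context to inject into the prompt."""
    import cognee
    return await cognee.search(question)
```

Register `remember` and `recall` as tools in your agent framework; during a coding session the agent can call `recall("How does auth work in this repo?")` to pull project-specific context.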
Q: What about privacy? A: Cognee runs locally by default. Your data stays on your machine. You can use local LLMs (Ollama) for fully offline operation.
Q: Does it support real-time updates? A: Yes. Call cognee.add() and cognee.cognify() incrementally; new knowledge is integrated without reprocessing the entire graph.
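The incremental loop can be sketched as follows; this assumes the `cognee` package and its async `add`/`cognify` API, and per the answer above, only the newly added data is processed:

```python
import asyncio

async def sync_updates(new_docs: list[str]) -> None:
    """Fold a batch of new documents into an existing Cognee graph."""
    import cognee  # lazy import; assumes `pip install cognee`
    for doc in new_docs:
        await cognee.add(doc)  # stage only the new documents
    await cognee.cognify()     # integrate them without a full rebuild
```

Call this whenever new material arrives (a saved note, a changed file), e.g. `asyncio.run(sync_updates(["meeting-notes.md"]))`.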
Works With
- Claude Code, Cursor, Codex (via Python tool integration)
- Any LLM: OpenAI, Anthropic, Ollama, local models
- Vector stores: Qdrant, Weaviate, PGVector
- Graph stores: Neo4j, NetworkX
- Python 3.9+