# Stash — Self-Hosted Memory + MCP SSE on Postgres/pgvector

> Stash is a self-hosted memory layer for agents with an MCP SSE endpoint, backed by Postgres + pgvector and a consolidation pipeline for long-lived recall.

## Install

Merge an entry for Stash's SSE endpoint into your `.mcp.json` (see Quick Use below for the URL).

## Quick Use

1. Start the stack:

```bash
git clone https://github.com/alash3al/stash.git
cd stash
cp .env.example .env
docker compose up
```

2. Point your MCP client to the SSE endpoint:
   - URL: `http://localhost:8080/sse`
3. Verify: ask the agent to write one memory, then query it back by tags or keywords.

## Intro

Stash is a self-hosted memory layer for agents with an MCP SSE endpoint, backed by Postgres + pgvector and a consolidation pipeline for long-lived recall.

- **Best for:** teams that want persistent memory for agents without relying on a hosted SaaS, plus an MCP-compatible interface
- **Works with:** Docker Compose, PostgreSQL + pgvector, MCP clients that support SSE (Cursor, Claude Desktop, Windsurf)
- **Setup time:** 15-35 minutes

## Practical Notes

- The README highlights an 8-stage consolidation pipeline; treat it as a controllable knob for memory quality vs. cost.
- Use one SSE endpoint per environment (dev/staging/prod) so memory doesn't cross-contaminate across contexts.

## Why it matters

Stash is interesting when your agent needs long-term memory that stays inside your infra, and you want an MCP endpoint that multiple clients can reuse.

- Quick start is a single `docker compose up`, which makes it easy to pilot in a team.
- SSE transport means multiple MCP-capable clients can point at the same service endpoint.
- Positions memory as a consolidation pipeline (episodes → facts), which maps to how teams debug memory quality.

## Rollout pattern

- Start with one agent and one project; define what counts as "write-worthy" memory (facts, decisions, constraints).
- Add retention and consolidation schedules once you observe retrieval precision in real tasks.
- Create a "memory schema" (tags, owners, environments) so retrieval stays scoped and predictable.

## Watchouts

Memory is also a data-leak vector. Treat the store as sensitive: enforce auth, avoid storing secrets, and keep environments separated.

### FAQ

**Q: Is this only for Cursor?**
A: No. It works with any MCP client that supports SSE endpoints; the README lists multiple clients.

**Q: How do I keep memory useful?**
A: Write only verified facts and decisions, and enforce tags plus a scope per agent or per project.

**Q: Can I run it in production?**
A: Yes, but add auth, backups, and monitoring; don't leave it anonymously accessible on a network.

## Source & Thanks

> Source: https://github.com/alash3al/stash
> License: Apache-2.0
> GitHub stars: 687 · forks: 30
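The Install section above refers to a JSON entry for `.mcp.json`, but the snippet itself is not shown in the source. A minimal sketch, assuming the common MCP client config layout (`mcpServers` keyed by server name) and the default local endpoint from Quick Use:

```json
{
  "mcpServers": {
    "stash": {
      "type": "sse",
      "url": "http://localhost:8080/sse"
    }
  }
}
```

Exact field names vary by client (Cursor, Claude Desktop, and Windsurf each read their own config file), so check your client's MCP documentation before copying this verbatim.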
---

Source: https://tokrepo.com/en/workflows/stash-self-hosted-memory-mcp-sse-on-postgres-pgvector
Author: MCP Hub