Postgres for AI Agents
PostgreSQL MCP servers (two flavors), DBHub universal database MCP, Neon serverless Postgres, Supabase MCP — all the SQL surfaces an agent needs.
What's in this pack
This pack collects the five SQL surfaces an agent realistically needs to read, write, and reason about a Postgres database. Two are stdio MCP servers you point at your own DSN. One is a universal MCP server that handles Postgres, MySQL, SQLite and more. The last two are managed cloud Postgres providers with first-class agent integration.
| # | Asset | Type | Best for |
|---|---|---|---|
| 1 | PostgreSQL MCP (reference) | stdio MCP server | local & self-hosted Postgres |
| 2 | PostgreSQL MCP (community) | stdio MCP server | extra tools (explain, indexes) |
| 3 | DBHub | universal MCP | one server, many DB engines |
| 4 | Neon serverless Postgres | managed cloud + MCP | branching for AI agents |
| 5 | Supabase MCP | managed cloud + MCP | full BaaS (auth, storage, RLS) |
The reference PostgreSQL MCP server lives in the official modelcontextprotocol/servers repo and is the simplest stdio binding: `query`, `list_tables`, `describe_table`. The community variant adds `explain`, `index_advice`, and parameterized writes. DBHub abstracts the connection layer so one MCP install can hit Postgres, MySQL, SQLite, or DuckDB. Neon and Supabase ship their own MCPs that go beyond raw SQL: Neon exposes branch creation as a tool (one DB per agent task), and Supabase exposes auth, storage, and Row Level Security.
Why Postgres for agents matters
Most "agent + database" demos cheat. They use a fresh ephemeral SQLite, hand-rolled fixtures, no concurrency. Real apps run on Postgres with hundreds of tables, RLS policies, JSON columns, and existing connection pooling. Plugging an LLM into that surface naively is how you end up with a DELETE without WHERE in production.
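A minimal sketch of the guardrail idea, not taken from any of these servers: reject anything that isn't plainly a read before it reaches the database. The function name and allowlist are illustrative; real servers enforce this at the database level with a read-only role.

```python
import re

# Coarse allowlist of read statements. Deliberately excludes WITH, because
# Postgres allows data-modifying CTEs (WITH d AS (DELETE ...) SELECT ...).
# This is an illustration, not a SQL parser.
READ_ONLY = re.compile(r"^\s*(SELECT|EXPLAIN|SHOW)\b", re.IGNORECASE)

def is_read_only(sql: str) -> bool:
    """Return True only for statements that start like a plain read."""
    # Strip a leading line comment so "-- note\nDELETE ..." is not waved through.
    stripped = re.sub(r"^\s*--[^\n]*\n", "", sql)
    return bool(READ_ONLY.match(stripped))
```

The point is defense in depth: even with a check like this in front, the connection itself should use a role that cannot write.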
This pack picks the five servers that handle the realities:
- Read-only mode by default. All five servers ship with read-only flags or capability scoping.
- Schema introspection. The agent can ask `describe_table` instead of guessing column names.
- Connection pooling. Each server reuses a pool, so no per-query reconnect storms.
- Branch / sandbox. Neon's branch tool lets every agent task get its own throwaway database in 200ms.
Install in one command
```shell
# Pack install — drops MCP server configs into your tool's config file
tokrepo install pack/postgres-for-agents

# Or pick servers individually
tokrepo install postgres-mcp
tokrepo install dbhub
tokrepo install neon-mcp
tokrepo install supabase-mcp
```
The TokRepo CLI writes config to the right place for each tool: `claude_desktop_config.json` for Claude Desktop, `mcp.json` for Cursor, `AGENTS.md` for Codex CLI. Connection strings stay in your env vars; the configs only carry server names.
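For reference, a hand-written entry for the reference Postgres server in `claude_desktop_config.json` looks roughly like this (the DSN is a placeholder; the package name is the one published in modelcontextprotocol/servers, which takes the DSN as a positional argument):

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/mydb"
      ]
    }
  }
}
```

Tools that support environment-variable expansion in their MCP config can keep the DSN out of the file entirely, which is the pattern the CLI follows.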
Common pitfalls
- Don't expose write access on day one. Start with `--read-only`. Promote to read-write only after you've watched the agent execute a few hundred reads safely.
- Use a dedicated role, not the postgres superuser. Create an `mcp_agent` role with `SELECT` on the schemas you actually want exposed. RLS policies should also apply to this role.
- Watch the schema-dump cost. `list_tables` on a 500-table production schema can blow the agent's context. Filter to a single schema or an allow-list.
- Branching only buys you safety if you actually use it. Neon branches are cheap: create one per agent task, point the MCP at the branch DSN, and discard it after.
- Supabase RLS is not optional. If the agent connects with a service role key, RLS is bypassed. Use anon + JWT for any task touching user data.
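The dedicated-role advice above can be sketched in plain SQL. Role, schema, and password are illustrative; adjust the schema list to what you actually want exposed:

```sql
-- Illustrative only: a least-privilege role for the MCP connection.
CREATE ROLE mcp_agent LOGIN PASSWORD 'change-me' NOSUPERUSER NOCREATEDB;

-- Read-only access to one schema, nothing else.
GRANT USAGE ON SCHEMA app TO mcp_agent;
GRANT SELECT ON ALL TABLES IN SCHEMA app TO mcp_agent;

-- New tables in the schema stay visible without re-granting.
ALTER DEFAULT PRIVILEGES IN SCHEMA app GRANT SELECT ON TABLES TO mcp_agent;

-- RLS applies to this role as long as it lacks BYPASSRLS (the default).
```

Because the role has no write grants at all, a prompt-injected `DELETE` fails at the database even if the MCP layer's read-only flag is misconfigured.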
Relationship to other packs
This pack is the structured-data layer. For semantic / unstructured retrieval (similarity search, RAG over documents) you want the Vector DB Showdown pack — Chroma, Qdrant, Weaviate, Pinecone. For the broader MCP toolbox (filesystem, browser, GitHub, etc) see MCP Server Stack. Production agents typically use one item from each pack: Postgres for transactional state, a vector DB for embeddings, and the MCP stack for tool surfaces.
Frequently asked questions
Is the reference PostgreSQL MCP server free?
Yes. It's open source under the MIT license, shipped in the official modelcontextprotocol/servers repository, with no usage limits beyond your own database. You only pay for the database itself (Neon and Supabase have generous free tiers; self-hosted Postgres is free). The MCP layer itself adds no cost and no telemetry.
How does DBHub compare to running multiple PostgreSQL MCP instances?
DBHub is one MCP process that fronts many DSNs — Postgres, MySQL, SQLite, DuckDB. If your agent needs three databases, DBHub is one server slot with three named connections. Multiple PostgreSQL MCPs work too, but eat one MCP slot each and don't share schema cache. Use DBHub for multi-DB workflows; native PostgreSQL MCP for single-DB precision.
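If DBHub fits, the launch shape is one process per DSN over stdio. The flag names below are taken from DBHub's README at the time of writing and may change, so treat this as a sketch and check the current docs:

```shell
# One DBHub process fronting a Postgres DSN over stdio.
# --readonly matches the safety default recommended above.
npx @bytebase/dbhub --transport stdio \
  --dsn "postgres://mcp_agent:***@localhost:5432/appdb" \
  --readonly
```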
Will this work with Claude Code or Cursor?
Both, plus Codex CLI, Gemini CLI, Cline, Roo Code, Windsurf, and Copilot. MCP is a standard — TokRepo CLI just writes the right config file for each tool. Neon and Supabase MCPs work the same way; Neon also publishes a Cursor extension that does branch-per-task automatically.
Difference between this pack and Vector DB Showdown?
Postgres handles structured rows: orders, users, the data you'd query with `SELECT ... WHERE`. Vector DBs handle similarity over embeddings: "find me documents that mean approximately X." They're complementary. Real RAG apps run both: Postgres for facts, Qdrant or Chroma for semantic recall, joined by a foreign key in metadata.
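The "joined by a foreign key in metadata" pattern looks roughly like this. Everything here is hypothetical stand-in data: the vector hits mimic what a Qdrant or Chroma query returns, and the dict mimics rows fetched from Postgres for those ids.

```python
# Pretend results from a vector DB: each hit carries a Postgres
# foreign key in its metadata, plus a similarity score.
vector_hits = [
    {"score": 0.91, "metadata": {"doc_id": 3}},
    {"score": 0.87, "metadata": {"doc_id": 1}},
]

# Pretend rows fetched from Postgres, e.g. with
#   SELECT id, title FROM documents WHERE id = ANY(%s)
pg_rows = {1: {"title": "Refund policy"}, 3: {"title": "Shipping FAQ"}}

# Join: semantic ranking comes from the vector DB,
# authoritative facts come from Postgres.
answers = [
    {"title": pg_rows[h["metadata"]["doc_id"]]["title"], "score": h["score"]}
    for h in vector_hits
]
# answers[0] → {"title": "Shipping FAQ", "score": 0.91}
```

The vector DB never becomes the source of truth; it only ranks, and Postgres supplies the row the agent actually answers from.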
What's the gotcha with Supabase service role keys?
The service role key bypasses Row Level Security. If you wire an MCP to that key, the agent can read every user's data — not just the current user's. Always prefer the anon key plus a per-session JWT for tasks that touch user data, and reserve the service role for admin tasks behind a human approval step.
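What "RLS applies to the anon + JWT path" means in SQL, sketched with a hypothetical table and column (`auth.uid()` is Supabase's documented helper for the current JWT's subject):

```sql
-- Illustrative Supabase-style policy: each user sees only their own rows.
ALTER TABLE orders ENABLE ROW LEVEL SECURITY;

CREATE POLICY orders_owner_only ON orders
  FOR SELECT
  USING (user_id = auth.uid());

-- The service role bypasses RLS entirely, so this policy is silently
-- skipped for it: that is exactly why agent connections should use the
-- anon key plus a per-session JWT.
```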