## Core Tools (Enabled by Default)
| Tool | Description |
|---|---|
| chat | Send messages to any supported model |
| thinkdeep | Extended reasoning with model selection |
| planner | Multi-step project planning across models |
| consensus | Get multiple model opinions and find consensus |
| codereview | Cross-model code review |
| precommit | Pre-commit checks using multiple models |
| debug | Cross-model debugging assistance |
| apilookup | API documentation lookup via models |
| challenge | Devil's advocate analysis from another model |
| clink | CLI-to-CLI bridge — spawn Codex/Gemini CLI subagents |
## Additional Tools (Disabled by Default)
Enable any of these by removing it from the `DISABLED_TOOLS` environment variable:
| Tool | Description |
|---|---|
| analyze | Deep code analysis |
| refactor | Code refactoring suggestions |
| testgen | Test generation |
| secaudit | Security auditing |
| docgen | Documentation generation |
| tracer | Code flow tracing |
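As an illustration, the server's env block might carry a trimmed `DISABLED_TOOLS` list; the surrounding JSON shape is an assumption, but the principle is that any tool name deleted from the list becomes available. Here `analyze` and `secaudit` have been removed, so those two are enabled:

```json
{
  "env": {
    "DISABLED_TOOLS": "refactor,testgen,docgen,tracer"
  }
}
```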
## Supported Providers
| Provider | Models | API Key Env Var |
|---|---|---|
| Gemini | gemini-2.5-pro, gemini-2.5-flash | GEMINI_API_KEY |
| OpenAI | gpt-4o, o3, o4-mini | OPENAI_API_KEY |
| Azure OpenAI | Any deployed model | AZURE_OPENAI_* |
| X.AI/Grok | grok-3 | XAI_API_KEY |
| OpenRouter | 200+ models | OPENROUTER_API_KEY |
| Ollama | Any local model | (local, no key) |
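Each provider is activated by exporting its key before the server starts; you only need keys for the providers you plan to use. A minimal sketch with placeholder values:

```shell
# Placeholder values - substitute your real credentials.
export GEMINI_API_KEY="your-gemini-key"
export OPENAI_API_KEY="your-openai-key"
export OPENROUTER_API_KEY="your-openrouter-key"
# Ollama runs locally and needs no key.
```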
## Key Feature: clink (CLI-to-CLI Bridge)
The clink tool lets Claude Code spawn Codex or Gemini CLI as isolated subagents for specific tasks — code reviews, bug hunting, research — without polluting Claude's main context window. Results flow back automatically.
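To make the delegation pattern concrete, a clink request might look something like the following. The parameter names (`cli`, `role`, `prompt`) are hypothetical, chosen to illustrate the shape of a subagent handoff rather than the tool's actual schema:

```json
{
  "tool": "clink",
  "arguments": {
    "cli": "codex",
    "role": "codereviewer",
    "prompt": "Review src/auth.py for injection risks and report findings."
  }
}
```

The subagent runs in its own context; only its final report is returned to Claude's conversation.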
## Key Feature: Conversation Continuity
Full context flows across tools and models. When Claude's context resets mid-session, other models retain the conversation history and can bring Claude back up to speed.
## FAQ
Q: What is Pal MCP Server?
A: An MCP server that connects Claude Code to 7+ AI providers (Gemini, OpenAI, Grok, Ollama, etc.), enabling multi-model workflows, cross-model code review, and CLI-to-CLI bridging.
Q: Is Pal MCP Server free?
A: The server itself is free and open source. You need API keys for the model providers you want to use (Gemini has a free tier).
Q: How do I install Pal MCP Server?
A: Add the JSON config to your .mcp.json file and set your API keys. Requires Python 3.10+ and uv.
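A minimal `.mcp.json` entry might look like the sketch below. The server name, repository path, and entry-point script are placeholders (assumptions, not taken from the project docs), so adjust them to match your checkout:

```json
{
  "mcpServers": {
    "pal": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/pal-mcp-server", "server.py"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-key"
      }
    }
  }
}
```

Any provider keys you set in the `env` block become available to the server at startup.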