
Pal Mcp Server

The power of Claude Code / Gemini CLI / Codex CLI + [Gemini / OpenAI / OpenRouter / Azure / Grok / Ollama]

Rating: 0.0 (0 votes)

Downloads: 0 total

Price: Free (API key required)

Works With: Claude Code · Cursor · Windsurf · VS Code · Developer tool

About

PAL MCP: Many Workflows. One Context.

Your AI's PAL – a Provider Abstraction Layer. Formerly known as Zen MCP.

PAL in action

👉 [Watch more examples](#-watch-tools-in-action)

Your CLI + Multiple Models = Your AI Dev Team

Use the 🤖 CLI you love: Claude Code · Gemini CLI · Codex CLI · Qwen Code CLI · Cursor · _and more_

With multiple models within a single prompt: Gemini · OpenAI · Anthropic · Grok · Azure · Ollama · OpenRouter · DIAL · On-Device Model

🆕 Now with CLI-to-CLI Bridge

The new [`clink`](docs/tools/clink.md) (CLI + Link) tool connects external AI CLIs directly into your workflow:

  • Connect External CLIs - plug Gemini CLI, Codex CLI, and Claude Code straight into your workflow
  • CLI Subagents - Launch isolated CLI instances from _within_ your current CLI! Claude Code can spawn Codex subagents, Codex can spawn Gemini CLI subagents, etc. Offload heavy tasks (code reviews, bug hunting) to fresh contexts while your main session's context window remains unpolluted. Each subagent returns only final results.
  • Context Isolation - Run separate investigations without polluting your primary workspace
  • Role Specialization - Spawn planner, codereviewer, or custom role agents with specialized system prompts
  • Full CLI Capabilities - Web search, file inspection, MCP tool access, latest documentation lookups
  • Seamless Continuity - Sub-CLIs participate as first-class members with full conversation context between tools
```bash
# Codex spawns a Codex subagent for an isolated code review in a fresh context
clink with codex codereviewer to audit auth module for security issues
# The subagent reviews in isolation and returns only the final report, so your
# context stays clean while Codex reads each file and walks the directory tree

# Consensus from different AI models → implementation handoff with full context
# preserved between tools
Use consensus with gpt-5 and gemini-pro to decide: dark mode or offline support next
Continue with clink gemini - implement the recommended feature
# Gemini receives the full debate context and starts coding immediately
```

👉 [Learn more about clink](docs/tools/clink.md)

Why PAL MCP?

Why rely on one AI model when you can orchestrate them all?

Don't lose this

Three weeks from now, you'll want Pal Mcp Server again. Will you remember where to find it?

Save it to your library and the next time you need Pal Mcp Server, it's one tap away from any AI app you use. Group it into a bench with the rest of your stack for that kind of task and you can pull it all up at once.

⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or a custom GPT in ChatGPT — one click and your library is right there in the chat. Or, if you’re in an editor, install the a-gnt MCP server and say “use my [bench name]” in Claude Code, Cursor, VS Code, or Windsurf.

🤵🏻‍♂️

a-gnt's Take

Our honest review

This plugs directly into your AI and gives it new abilities it didn't have before. The power of Claude Code / Gemini CLI / Codex CLI + [Gemini / OpenAI / OpenRouter / Azure / Grok / Ollama]. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.

Tips for getting started

1. Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.

2. Heads up: this needs an API key to work. You'll get one from the service's website (usually free). The setup guide tells you exactly where.
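As a rough sketch, registering an MCP server usually means adding an entry to your client's MCP config and passing the API key via an environment variable. The server name, command, and variable names below are illustrative assumptions, not PAL's documented values; the setup guide has the exact details:

```json
{
  "mcpServers": {
    "pal": {
      "command": "uvx",
      "args": ["pal-mcp-server"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-key-here",
        "OPENAI_API_KEY": "your-openai-key-here"
      }
    }
  }
}
```

You only need keys for the providers you actually plan to route to; leave the rest out.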

What's New

Version 1.0.0 · 6 days ago

Imported from GitHub

Ratings & Reviews

0.0 out of 5 (0 ratings)

No reviews yet. Be the first to share your experience.