About
⚡️ mcpo
Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.
mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools "just work" with LLM agents and apps expecting OpenAPI servers.
No custom protocol. No glue code. No hassle.
🤔 Why Use mcpo Instead of Native MCP?
MCP servers usually speak over raw stdio, which is:
- 🔓 Inherently insecure
- ❌ Incompatible with most tools
- 🧩 Missing standard features like docs, auth, error handling, etc.
mcpo solves all of that—without extra effort:
- ✅ Works instantly with OpenAPI tools, SDKs, and UIs
- 🛡 Adds security, stability, and scalability using trusted web standards
- 🧠 Auto-generates interactive docs for every tool, no config needed
- 🔌 Uses pure HTTP—no sockets, no glue code, no surprises
What feels like "one more step" is really fewer steps with better outcomes.
mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.
🚀 Quick Usage
We recommend using uv for lightning-fast startup and zero config.
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
Or, if you’re using Python:
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
To use an SSE-compatible MCP server, simply specify the server type and endpoint:
mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse
You can also provide headers for the SSE connection:
mcpo --port 8000 --api-key "top-secret" --server-type "sse" --header '{"Authorization": "Bearer token", "X-Custom-Header": "value"}' -- http://127.0.0.1:8001/sse
To use a Streamable HTTP-compatible MCP server, specify the server type and endpoint:
mcpo --port 8000 --api-key "top-secret" --server-type "streamable-http" -- http://127.0.0.1:8002/mcp
You can also run mcpo via Docker with no installation:
docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command
Example:
uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York
That’s it. Your MCP tool is now available at http://localhost:8000 with a generated OpenAPI schema — test it live at http://localhost:8000/docs.
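For a quick smoke test, any HTTP client works. Here is a minimal sketch, assuming mcpo mounts each MCP tool as a POST route named after the tool (get_current_time for mcp-server-time) and accepts the API key as a Bearer token; confirm both details in the interactive docs before relying on them:

```bash
# Hypothetical smoke test: the route name and auth scheme are assumptions,
# verify them at http://localhost:8000/docs for your server.
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```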
🤝 To integrate with Open WebUI after launching the server, check our [docs](https://docs.openwebui.com/openapi-servers/open-webui/).
🌐 Serving Under a Subpath (--root-path)
If you need to serve mcpo behind a reverse proxy or under a subpath (e.g., /api/mcpo), use the --root-path argument:
mcpo --port 8000 --root-path "/api/mcpo" --api-key "top-secret" -- your_mcp_server_command
All routes will be served under the specified root path, e.g. http://localhost:8000/api/mcpo/memory.
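If nginx is the proxy in front of mcpo, a matching location block might look like the sketch below; the prefix and upstream address are illustrative assumptions, so adjust them to your deployment:

```nginx
# Hypothetical reverse-proxy block matching --root-path "/api/mcpo";
# the upstream address assumes mcpo listens on 127.0.0.1:8000.
location /api/mcpo/ {
    proxy_pass http://127.0.0.1:8000/api/mcpo/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```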
🔄 Using a Config File
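The body of this section was cut off in the import. As a hedged sketch of what it typically covers: mcpo can serve several MCP servers at once from a single config file in Claude Desktop's "mcpServers" format, passed via a --config flag (check mcpo --help to confirm on your version); the server names, commands, and file path below are illustrative:

```bash
# Sketch: define two MCP servers in Claude Desktop's "mcpServers" format
# (names and commands are illustrative), then point mcpo at the file.
cat > config.json <<'EOF'
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"]
    }
  }
}
EOF
mcpo --port 8000 --api-key "top-secret" --config config.json
```

Each configured server would then get its own route and schema (e.g. /memory and /time), which is where the /api/mcpo/memory example above comes from; when running under Docker, the config file would need to be mounted into the container.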
a-gnt's Take
Our honest review
This plugs directly into your AI and gives it new abilities it didn't have before. A simple, secure MCP-to-OpenAPI proxy server. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog — worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
You'll sign in with your existing account the first time. After that, it just works.
What's New
Imported from GitHub