- hungthai1401/bruno-mcp
Price: Free · No login needed
About
An MCP (Model Context Protocol) server that enables running Bruno collections. This server allows LLMs to execute API tests using Bruno and get detailed results through a standardized interface.
- Run Bruno collections using the Bruno CLI
- Support for environment files
- Support for environment variables
- Detailed test results including:
  - Overall success/failure status
  - Test summary (total, passed, failed)
  - Detailed failure information
  - Execution timings
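The features above map onto the Bruno CLI, which the server shells out to. A minimal sketch of the underlying commands, assuming the official `@usebruno/cli` package; the environment name and variable values are placeholders:

```shell
# Install the Bruno CLI (the MCP server invokes it under the hood)
npm install -g @usebruno/cli

# From a collection's root folder, run it against a named environment
bru run --env staging

# Override individual environment variables at run time
bru run --env staging --env-var API_TOKEN=placeholder

# Emit machine-readable results, the kind of output the server
# parses into pass/fail counts and failure details
bru run --env staging --output results.json
```

When connected through MCP, you do not run these commands yourself; the LLM issues the equivalent calls and reads back the structured results.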
Works with Claude (desktop and mobile), Cursor, Windsurf, VS Code, and any MCP-compatible AI app.
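For clients that read a JSON MCP configuration (such as Claude Desktop's `claude_desktop_config.json`), registration typically looks like the sketch below. The `npx` package name `bruno-mcp` is an assumption here; check the repository README for the actual launch command:

```json
{
  "mcpServers": {
    "bruno": {
      "command": "npx",
      "args": ["-y", "bruno-mcp"]
    }
  }
}
```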
Category: Developer Tools
a-gnt's Take
Our honest review
hungthai1401/bruno-mcp plugs directly into your AI and gives it abilities it didn't have before: running Bruno collections and reporting detailed test results. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
What's New
Imported from awesome:punkpeye/awesome-mcp-servers
Ratings & Reviews
No reviews yet. Be the first to share your experience.