ROS MCP Server 🧠⇄🤖

ROS-MCP-Server connects large language models (such as Claude, GPT, and Gemini) to robots, enabling bidirectional communication with no changes to existing robot source code.

Why ROS-MCP?

  • No robot source code changes → just add the rosbridge node to your existing ROS setup.
  • True two-way communication → LLMs can both control robots and observe everything happening on the robot.
  • Full context → publish & subscribe to topics, call services & actions, set parameters, read sensor data, and monitor robot state in real time.
  • Deep ROS understanding → guides the LLM to discover available topics, services, actions, and their types (including custom ones), enabling it to use them with the right syntax without manual configuration.
  • Works with any MCP client → built on the open MCP standard, supporting Claude Code, Codex CLI, Gemini CLI, Claude Desktop, ChatGPT, Cursor, and more.
  • Works across ROS versions → compatible with ROS 2 (Jazzy, Humble, and others) and ROS 1 distros.
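The rosbridge node mentioned above exposes ROS over a WebSocket using a JSON protocol, which is what lets the server talk to a robot with no source changes. A minimal sketch of the message shapes involved (topic names and field values here are illustrative assumptions; a real client would send these strings over the WebSocket connection):

```python
import json

def rosbridge_advertise(topic, msg_type):
    """Build a rosbridge-protocol 'advertise' message (declares a topic and its type)."""
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def rosbridge_publish(topic, msg):
    """Build a rosbridge-protocol 'publish' message."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

def rosbridge_subscribe(topic):
    """Build a rosbridge-protocol 'subscribe' message."""
    return json.dumps({"op": "subscribe", "topic": topic})

# Illustrative only: command a velocity on /cmd_vel (geometry_msgs/Twist)
# and watch odometry on /odom -- these topic names are assumptions.
print(rosbridge_advertise("/cmd_vel", "geometry_msgs/Twist"))
print(rosbridge_publish("/cmd_vel", {
    "linear": {"x": 0.2, "y": 0.0, "z": 0.0},
    "angular": {"x": 0.0, "y": 0.0, "z": 0.5},
}))
print(rosbridge_subscribe("/odom"))
```

Because everything is plain JSON over a socket, the same mechanism works for ROS 1 and ROS 2 alike.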

🎥 Examples in Action

🖥️ Example - Controlling the MOCA mobile manipulator in NVIDIA Isaac Sim. Commands are entered into Claude Desktop, which uses the MCP server to control the simulated robot.

πŸ• Example - Controlling Unitree Go2 with natural language (video) The MCP server enables Claude to interpret images from the robot's cameras, and then command the robot based on human natural language commands.

🏭 Example - Debugging an industrial robot (video)

  • Connecting to an industrial robot enables the LLM to browse all ROS topics and services to assess the robot state.
  • With no predefined context, the MCP server enables the LLM to query details about custom topic and service types and their syntax (00:28).
  • Using only natural language, the operator calls the custom services to test and debug the robot (01:42).
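The discovery steps above map onto rosbridge's companion rosapi node, which exposes introspection as ordinary ROS services. A sketch of the call_service requests a client might issue (service names are from the standard rosapi node; the queried topic and message type are hypothetical):

```python
import json

def call_service(service, args=None, call_id=None):
    """Build a rosbridge-protocol 'call_service' request."""
    req = {"op": "call_service", "service": service, "args": args or {}}
    if call_id is not None:
        req["id"] = call_id  # lets the client match the response to this call
    return json.dumps(req)

# Introspection sequence: list topics and services, look up one topic's type,
# then fetch the type's field layout so the LLM can form valid messages.
discovery = [
    call_service("/rosapi/topics", call_id="1"),
    call_service("/rosapi/services", call_id="2"),
    call_service("/rosapi/topic_type", {"topic": "/joint_states"}, "3"),
    call_service("/rosapi/message_details", {"type": "sensor_msgs/JointState"}, "4"),
]
for request in discovery:
    print(request)
```

This is why no predefined context is needed: the same request shape that lists topics also retrieves the syntax of custom types the model has never seen.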

🛠 Getting Started

Follow the installation guide to get started.

ROS-MCP works with Claude Code, Codex CLI, Gemini CLI, Claude Desktop, ChatGPT, Cursor, or any MCP-compatible client.
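MCP-compatible clients are typically pointed at a server through a small JSON config entry (for Claude Desktop, claude_desktop_config.json). The sketch below is hypothetical: the server name, command, and arguments are assumptions, and the actual values come from the installation guide.

```python
import json

# Hypothetical MCP client config entry -- "uv" and the entry-point name are
# assumptions; consult the project's installation guide for the real values.
config = {
    "mcpServers": {
        "ros-mcp-server": {
            "command": "uv",                    # assumed launcher
            "args": ["run", "ros-mcp-server"],  # assumed entry point
        }
    }
}
print(json.dumps(config, indent=2))
```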

📚 More Examples & Tutorials

Browse our examples to see the server in action. We welcome community PRs with new examples and integrations!

🤝 Contributing

We love contributions of all kinds:

  • Bug fixes and documentation updates
  • New features (e.g., Action support, permissions)
  • Additional examples and tutorials

Check out the contributing guidelines and see issues tagged good first issue to get started.

📜 License

This project is licensed under the Apache License 2.0.
