maxim-saplin/mcp_safe_local_python_executor
Rating: 0 votes
Downloads: 0 total
Price: Free (no login needed)
About
An MCP server (stdio transport) that wraps Hugging Face's LocalPythonExecutor (from the smolagents framework). LocalPythonExecutor is a custom Python runtime that provides basic isolation/security when running Python code generated by LLMs locally, without requiring Docker or a VM. This package exposes the Python executor via MCP (Model Context Protocol) as a tool for LLM apps such as Claude Desktop, Cursor, or any other MCP-compatible client. For Claude Desktop, this tool is an easy way to add the missing Code Interpreter (available as a plugin in ChatGPT for quite a while already).
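To connect a local stdio MCP server like this one, Claude Desktop reads a `claude_desktop_config.json` entry. A minimal sketch is below; the exact `command`, `args`, and directory path are assumptions here, so check the repository's README for the real invocation:

```json
{
  "mcpServers": {
    "safe-local-python-executor": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/mcp_safe_local_python_executor", "mcp_server.py"]
    }
  }
}
```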
- Exposes a `run_python` tool: safer execution of Python code than direct use of Python's `eval()`
- Runs via uv in a Python venv
- No file I/O ops are allowed
- Restricted list of imports
- collections
- datetime
- itertools
- math
- queue
- random
- re
- stat
- statistics
- time
- unicodedata
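The import allowlist above is the core of this style of sandboxing: code is inspected before execution and any import outside the permitted set is rejected. This is not the smolagents implementation (which interprets the AST itself), but a minimal illustrative sketch of the allowlist check using only the standard library:

```python
import ast

# Allowlist mirroring the permitted imports from the listing above.
ALLOWED_IMPORTS = {
    "collections", "datetime", "itertools", "math", "queue",
    "random", "re", "stat", "statistics", "time", "unicodedata",
}

def check_imports(code: str) -> list[str]:
    """Return the disallowed top-level module names imported by `code`."""
    violations = []
    for node in ast.walk(ast.parse(code)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                root = alias.name.split(".")[0]
                if root not in ALLOWED_IMPORTS:
                    violations.append(root)
        elif isinstance(node, ast.ImportFrom):
            root = (node.module or "").split(".")[0]
            if root not in ALLOWED_IMPORTS:
                violations.append(root)
    return violations

print(check_imports("import math\nimport os"))  # ['os']
```

A static check like this catches `import os` or `from subprocess import run` before any code runs; a real sandbox (like LocalPythonExecutor) goes further and also restricts builtins, attribute access, and file I/O at interpretation time.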
Be careful executing LLM-generated code on your machine, and stay away from MCP servers that run Python via the command line or eval(). The safest option is a VM or a Docker container, though that requires some setup effort, consumes more resources, and is slower. There are also third-party services that provide a Python runtime, though they require registration, API keys, etc.
LocalPythonExecutor strikes a good balance between direct use of the local Python environment (which is easier to set up but unsafe) and remote execution in a Docker container, VM, or third-party service (which is safer but requires more setup and resources).
Works with Claude (desktop and mobile), Cursor, Windsurf, VS Code, and any MCP-compatible AI app.
Category: Developer Tools
Don't lose this
Three weeks from now, you'll want maxim-saplin/mcp_safe_local_python_executor again. Will you remember where to find it?
Save it to your library and the next time you need maxim-saplin/mcp_safe_local_python_executor, it’s one tap away — from any AI app you use. Group it into a bench with the rest of the team for that kind of task and you can pull the whole stack at once.
⚡ Pro tip for geeks: add a-gnt 🤵🏻♂️ as a custom connector in Claude or a custom GPT in ChatGPT — one click and your library is right there in the chat. Or, if you’re in an editor, install the a-gnt MCP server and say “use my [bench name]” in Claude Code, Cursor, VS Code, or Windsurf.
a-gnt's Take
Our honest review
This plugs directly into your AI and gives it new abilities it didn't have before: maxim-saplin/mcp_safe_local_python_executor lets your AI run Python code locally in a restricted environment. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
What's New
Imported from awesome:punkpeye/awesome-mcp-servers
Ratings & Reviews
0.0 out of 5 (0 ratings)
No reviews yet. Be the first to share your experience.