Code2prompt

A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt templating, and token counting.

Works With

Claude Code • Cursor • Windsurf • VS Code

About

Convert your codebase into a single LLM prompt.

Website • Documentation • Discord

[License](https://github.com/mufeedvh/code2prompt/blob/master/LICENSE) • [PyPI](https://pypi.org/project/code2prompt-rs/) • [crates.io](https://crates.io/crates/code2prompt) • [Discord](https://discord.com/invite/ZZyBbsHTwH)

Code2Prompt is a powerful context engineering tool designed to ingest codebases and format them for Large Language Models. Whether you are manually copying context for ChatGPT, building AI agents via Python, or running an MCP server, Code2Prompt streamlines the context preparation process.

⚡ Quick Install

Cargo

bash
cargo install code2prompt 

To enable optional Wayland support (e.g., for clipboard integration on Wayland-based systems), use the wayland feature flag:

bash
cargo install --features wayland code2prompt

Homebrew

bash
brew install code2prompt

SDK with pip 🐍

bash
pip install code2prompt-rs

🚀 Quick Start

Once installed, generating a prompt from your codebase is as simple as pointing the tool to your directory.

Basic Usage: Generate a prompt from the current directory and copy it to the clipboard.

sh
code2prompt .

Save to file:

sh
code2prompt path/to/project --output-file prompt.txt
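Because model context windows are finite, it helps to know roughly how large a generated prompt is before pasting it. A crude back-of-the-envelope estimate (purely illustrative; this is not how code2prompt actually counts tokens, which requires a real tokenizer) is about four characters per token for English text and code:

```python
def rough_token_estimate(text: str) -> int:
    """Ballpark token count: ~4 characters per token for English text and code.
    Purely illustrative; use a real tokenizer for accurate counts."""
    return (len(text) + 3) // 4  # ceiling division by 4

def fits_context(text: str, budget_tokens: int) -> bool:
    """Check whether a prompt plausibly fits within a model's context window."""
    return rough_token_estimate(text) <= budget_tokens
```

This kind of estimate is only useful for triage (e.g., deciding whether to narrow the input directory); real token counts vary by model and tokenizer.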

🌐 Ecosystem

Code2Prompt is more than just a CLI tool. It is a complete ecosystem for codebase context.

- 🧱 Core Library: the internal, high-speed Rust library responsible for secure file traversal, respecting .gitignore rules, and structuring Git metadata.
- 💻 CLI Tool: designed for humans, with both a minimal CLI and an interactive TUI. Generates formatted prompts, tracks token usage, and sends the result to your clipboard or stdout.
- 🐍 Python SDK: fast Python bindings to the Rust core, ideal for AI agents, automation scripts, or deep integration into RAG pipelines. Available on PyPI.
- 🤖 MCP Server: run Code2Prompt as a local service, enabling agentic applications to read your local codebase efficiently without bloating your context window.

📚 Documentation

Check our online documentation for detailed instructions.

✨ Features

Code2Prompt transforms your entire codebase into a well-structured prompt for large language models. Key features include:

- Generates a source-tree overview alongside file contents
- Respects .gitignore rules during traversal
- Customizable prompt templating
- Token usage tracking
- Sends output to your clipboard, stdout, or a file


a-gnt's Take

Our honest review

Code2prompt is a CLI tool rather than a copy-paste prompt: install it, point it at a repository, and it packs your source tree and file contents into a single, well-structured prompt for any LLM. It's completely free, and the output works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.

Tips for getting started

1

Install via Cargo, Homebrew, or pip (see Quick Install above), then run code2prompt . in your project directory. The generated prompt is copied straight to your clipboard.

2

Paste the result into any AI chat and keep the conversation going: ask follow-up questions, request changes, or go deeper on any part of the codebase.

What's New

Version 1.0.0 (6 days ago)

Imported from GitHub
