AI Codex
Generate a compact codebase index for AI assistants — saves 50K+ tokens per conversation
Rating: 0 votes
Downloads: 0 total
Price: Free (access token required)
About
ai-codex
This project was entirely designed, written, and published by [Claude Code](https://claude.ai/code) (Anthropic's AI coding assistant). The concept, implementation, documentation, and examples were all generated in a single conversation session.
Generate a compact codebase index that gives AI coding assistants instant context about your project structure. Instead of wasting 50K+ tokens on file exploration at the start of every conversation, your AI assistant reads a pre-built index and gets to work immediately.
Why
Every time you start a conversation with an AI coding assistant (Claude Code, Cursor, GitHub Copilot, etc.), it spends thousands of tokens exploring your codebase -- reading files, scanning directories, building a mental model. This happens every single conversation.
ai-codex solves this by generating compact, structured reference files that capture:
- Every API route with its HTTP methods
- Every page with its rendering strategy (client vs. server)
- Every library function signature
- Your database schema (key fields, relationships)
- Your component tree with props
The result: 5 small files that replace 50K+ tokens of exploration, every time.
Quick Start
Run it in your project root:
```
npx ai-codex
```

That's it. It auto-detects your framework and generates the index.
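If you want the index to stay fresh, one option (a suggestion, not something the tool ships with) is to wire it into a `package.json` script so anyone on the team can regenerate it on demand:

```json
{
  "scripts": {
    "codex": "npx ai-codex --output .ai-codex"
  }
}
```

Then run `npm run codex` after significant structural changes so the index keeps matching the codebase.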
Output
By default, files are written to .ai-codex/ in your project root:
| File | What it contains |
|---|---|
| `routes.md` | API routes grouped by resource, with HTTP methods |
| `pages.md` | Page tree with client/server rendering tags |
| `lib.md` | Library exports -- function signatures, classes |
| `schema.md` | Database schema -- key fields, FKs, relationships |
| `components.md` | Component index with props, grouped by feature |
Files that don't apply are skipped (e.g., no schema.md if you don't use Prisma).
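The skipping behavior can be pictured as a simple existence check per section. This is a sketch of the idea, not the tool's actual code; `shouldEmitSchema` is a hypothetical name:

```typescript
import * as fs from "fs";
import * as path from "path";

// Hypothetical sketch: each section generator runs only if its inputs exist.
// schema.md, for example, would be skipped when no Prisma schema is found.
function shouldEmitSchema(projectRoot: string): boolean {
  return fs.existsSync(path.join(projectRoot, "prisma", "schema.prisma"));
}

console.log(shouldEmitSchema("/nonexistent-project")); // false: no schema, so no schema.md
```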
Configuration
CLI Flags
```
npx ai-codex --output .claude/codex          # custom output directory
npx ai-codex --include src lib               # only scan these directories
npx ai-codex --exclude tests __mocks__       # skip these directories
npx ai-codex --schema prisma/schema.prisma   # explicit schema path
```

Config File
Create a codex.config.json in your project root:
```json
{
  "output": ".ai-codex",
  "include": ["src", "lib", "app"],
  "exclude": ["tests", "__mocks__"],
  "schema": "prisma/schema.prisma"
}
```

CLI flags override config file values.
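The precedence rule amounts to a shallow merge where flag values win. A minimal sketch, using illustrative names rather than the tool's internals:

```typescript
interface CodexConfig {
  output?: string;
  include?: string[];
  exclude?: string[];
  schema?: string;
}

// Values read from codex.config.json
const fileConfig: CodexConfig = { output: ".ai-codex", include: ["src", "lib"] };

// Values parsed from CLI flags, e.g. `npx ai-codex --output .claude/codex`
const cliFlags: CodexConfig = { output: ".claude/codex" };

// Spread the flags last so they override file values; untouched keys survive.
const effective: CodexConfig = { ...fileConfig, ...cliFlags };

console.log(effective.output);  // ".claude/codex" (flag wins)
console.log(effective.include); // ["src", "lib"] (kept from the config file)
```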
Output Format Examples
routes.md
```
## products
GET,POST        /api/products             [auth,db]
GET,PUT,DELETE  /api/products/:id         [auth,db]
POST            /api/products/:id/images  [auth]

## orders
GET,POST        /api/orders               [auth,db]
GET             /api/orders/:id           [auth,db]
POST            /api/orders/:id/refund    [auth,db]
```

Don't lose this
Three weeks from now, you'll want AI Codex again. Will you remember where to find it?
Save it to your library, and the next time you need AI Codex it's one tap away from any AI app you use. Group it into a bench with the rest of your tools for that kind of task and you can pull the whole stack at once.
⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or a custom GPT in ChatGPT, and with one click your library is right there in the chat. Or, if you're in an editor, install the a-gnt MCP server and say "use my [bench name]" in Claude Code, Cursor, VS Code, or Windsurf.
a-gnt's Take
Our honest review
This plugs directly into your AI and gives it abilities it didn't have before: it generates a compact codebase index for AI assistants, saving 50K+ tokens per conversation. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
What's New
Imported from GitHub
Ratings & Reviews
0.0 out of 5 (0 ratings)
No reviews yet. Be the first to share your experience.