OpenCode

Multi-model CLI coding agent with 75+ LLM providers

Rating: 4.1
Votes: 0
Downloads: 134 total
Price: Free (API key required)

Works With

Claude Code, Cursor, Windsurf, VS Code · Developer tool

About

OpenCode is a flexible CLI coding agent that supports 75+ LLM providers, including Claude, GPT, Gemini, Groq, AWS Bedrock, and local models via Ollama, making it one of the most model-flexible coding CLIs available.

Features LSP integration for language server context, multi-session support for parallel agents, and session sharing. Built for developers who want model freedom.

Install via npm or download the binary. Open-source and free to use.
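The npm route, for example, looks roughly like this (the package name `opencode-ai` is an assumption; check the project's README for the current one):

```shell
# Install globally via npm (assumed package name: opencode-ai)
npm install -g opencode-ai

# Confirm the CLI is on your PATH
opencode --version
```

If you prefer the binary, download the release for your platform from the project's releases page instead.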

Don't lose this

Three weeks from now, you'll want OpenCode again. Will you remember where to find it?

Save it to your library and the next time you need OpenCode, it’s one tap away — from any AI app you use. Group it into a bench with the rest of the team for that kind of task and you can pull the whole stack at once.

⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or a custom GPT in ChatGPT — one click and your library is right there in the chat. Or, if you’re in an editor, install the a-gnt MCP server and say “use my [bench name]” in Claude Code, Cursor, VS Code, or Windsurf.

🤵🏻‍♂️

a-gnt's Take

Our honest review

Multi-model CLI coding agent with 75+ LLM providers. Best for developers and power users who want to extend their AI workflow with developer-tool capabilities. It's backed by an active open-source community and verified by the creator. This one just landed in the catalog — worth trying while it's fresh.

Tips for getting started

1. Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.

2. Heads up: this needs an API key to work. You'll get one from the service's website (usually free). The setup guide tells you exactly where.
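As a sketch, supplying a key usually means exporting your provider's standard environment variable before launching the CLI (the variable names below are the providers' conventions; OpenCode's own config may accept keys elsewhere too):

```shell
# Example: make a provider key visible to the CLI via standard env vars
export ANTHROPIC_API_KEY="sk-ant-..."   # from your Anthropic console
# or, for OpenAI models:
export OPENAI_API_KEY="sk-..."          # from your OpenAI dashboard

opencode
```

Keys exported this way last only for the current shell session; put the export in your shell profile to make it permanent.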

What's New

Version 1.0.0 · 6 days ago

Initial release

Ratings & Reviews

4.1

out of 5

3 ratings

No reviews yet. Be the first to share your experience.