Rating: 0 votes
Downloads: 3.3K total
Price: Free (no login needed)
Works With:
About
Open WebUI is a feature-rich, self-hosted web interface for interacting with LLMs. It supports Ollama, OpenAI-compatible APIs, and direct model loading.
It offers a beautiful, responsive interface with conversation management, model switching, RAG, and multi-user support, and it is the most popular self-hosted LLM UI.
Deploy via Docker. Free and open-source.
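For the Docker route, a minimal sketch of a compose file modeled on Open WebUI's published `docker run` quick start (the image tag, port mapping, and volume path here may drift from the project's current instructions; verify against the Open WebUI README):

```yaml
# docker-compose.yml — minimal Open WebUI deployment sketch.
# Image tag, port, and data path follow the project's quick start at the
# time of writing; check the official README before relying on them.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # UI served at http://localhost:3000
    volumes:
      - open-webui:/app/backend/data   # persist chats, users, and settings
    restart: always
volumes:
  open-webui:
```

Run it with `docker compose up -d`, then open http://localhost:3000 in a browser.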
Don't lose this
Three weeks from now, you'll want Open WebUI again. Will you remember where to find it?
Save it to your library and the next time you need Open WebUI, it's one tap away from any AI app you use. Group it into a bench with the rest of your tools for that kind of task and you can pull the whole stack at once.
⚡ Pro tip for geeks: add a-gnt as a custom connector in Claude or as a custom GPT in ChatGPT, and your library is one click away, right there in the chat. Or, if you're in an editor, install the a-gnt MCP server and say "use my [bench name]" in Claude Code, Cursor, VS Code, or Windsurf.
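For the editor route, MCP servers are typically registered in a JSON config file (for example, `.mcp.json` in a Claude Code project). A sketch of the general shape only: the `npx` package name `a-gnt-mcp` below is a placeholder, not a-gnt's actual install command, so check their documentation for the real entry.

```json
{
  "mcpServers": {
    "a-gnt": {
      "command": "npx",
      "args": ["-y", "a-gnt-mcp"]
    }
  }
}
```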
a-gnt's Take
Our honest review
Self-hosted web UI for local and cloud LLMs. Best for anyone who wants a more capable front end for their AI models. It's backed by an active open-source community and verified by the creator. It just landed in the catalog, so it's worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
What's New
Initial release
Ratings & Reviews
4.4 out of 5 (11 ratings)
No reviews yet. Be the first to share your experience.
From the Community
Free vs Paid AI Tools: When Is It Worth Paying?
An honest breakdown of when free AI tools are good enough and when paid versions are actually worth the money.
Local AI vs Cloud AI: Pros, Cons, and When to Use Each
A practical guide to choosing between running AI models on your own computer versus using cloud-based AI services.
Ollama: Run AI Models on Your Own Computer
Ollama makes running powerful AI models locally as simple as running a single command. No cloud, no subscriptions, no data sharing.
Open WebUI: A Beautiful Interface for Local AI
Open WebUI gives your local AI models a ChatGPT-like interface that is completely private and self-hosted.