
Comet Opik MCP

Query LLM logs, traces, and telemetry data

Rating: 4.1 score
Votes: 0
Downloads: 740 total
Price: Free

API key required

Works With

Claude Code, Cursor, Windsurf, VS Code · Developer tool

About

Comet's Opik MCP server for LLM observability. Query traces, analyze model performance, review prompts and completions, and debug LLM behavior through AI conversations.

Full LLM observability with trace-level granularity. Track costs, latency, and output quality.

Requires Comet Opik API key.

Don't lose this

Three weeks from now, you'll want Comet Opik MCP again. Will you remember where to find it?

Save it to your library and the next time you need Comet Opik MCP, it's one tap away from any AI app you use. Group it into a bench with the other tools you rely on for that kind of task, and you can pull in the whole stack at once.

⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or a custom GPT in ChatGPT — one click and your library is right there in the chat. Or, if you’re in an editor, install the a-gnt MCP server and say “use my [bench name]” in Claude Code, Cursor, VS Code, or Windsurf.

🤵🏻‍♂️

a-gnt's Take

Our honest review

This plugs directly into your AI and gives it new abilities it didn't have before. Query LLM logs, traces, and telemetry data. Once connected, just ask your AI to use it. It's backed by an active open-source community and verified by the creator. This one just landed in the catalog — worth trying while it's fresh.

Tips for getting started

1

Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.

2

Heads up: this needs an API key to work. You'll get one from the service's website (usually free). The setup guide tells you exactly where.

What's New

Version 1.0.0 · 6 days ago

Initial release

Ratings & Reviews

4.1 out of 5 (26 ratings)

No reviews yet. Be the first to share your experience.