
Kong

🦍 The API and AI Gateway

by Kong

Rating: 0.0 (0 votes)

Downloads: 0 total

Price: Free


Works With

Claude Code, Cursor, Windsurf, VS Code · Developer tool

About

[![][kong-logo]][kong-url]

[![Build Status][badge-action-image]][badge-action-url]

Kong or Kong Gateway is a cloud-native, platform-agnostic, scalable API, LLM, and MCP Gateway distinguished for its high performance and extensibility via plugins. It also provides advanced AI traffic capabilities with multi-LLM support, semantic security, MCP traffic security and analytics, and more.

By providing functionality for proxying, routing, load balancing, health checking, authentication, and more, Kong serves as the central layer for orchestrating microservices or conventional API traffic, as well as agentic LLM and MCP traffic, with ease.

Kong runs natively on Kubernetes thanks to its official Kubernetes Ingress Controller.
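On Kubernetes, the Ingress Controller is typically installed via Helm. A minimal sketch, assuming the chart repository and chart name from Kong's public Helm charts (release name and namespace are illustrative; check the official installation docs for current values):

```shell
# Add Kong's official Helm chart repository (assumed: charts.konghq.com)
helm repo add kong https://charts.konghq.com
helm repo update

# Install the Kong Ingress Controller into its own namespace
helm install kong kong/ingress -n kong --create-namespace
```

This requires a running Kubernetes cluster and `helm` on your PATH; it is a starting point, not a production deployment.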

[![][kong-diagram]][kong-url]

Installation | Documentation | Discussions | Forum | Blog | [Builds][kong-master-builds] | AI Gateway | Cloud Hosted Kong

Getting Started

If you prefer to use a cloud-hosted Kong, you can sign up for a free trial of Kong Konnect and get started in minutes. If not, you can follow the instructions below to get started with Kong on your own infrastructure.

Let’s test drive Kong by adding authentication to an API in under 5 minutes.

We suggest using the docker-compose distribution via the instructions below, but there is also a docker installation procedure if you’d prefer to run the Kong Gateway in DB-less mode.

Whether you’re running in the cloud, on bare metal, or using containers, you can find every supported distribution on our official installation page.

  1. To start, clone the Docker repository and navigate to the compose folder:

     ```cmd
     $ git clone https://github.com/Kong/docker-kong
     $ cd docker-kong/compose/
     ```

  2. Start the Gateway stack using:

     ```cmd
     $ KONG_DATABASE=postgres docker-compose --profile database up
     ```
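Once the stack is up, you can sanity-check it from another terminal. A quick sketch, assuming the default ports and that the containers from the previous step are running:

```shell
# The Admin API on :8001 returns gateway metadata as JSON
curl -i http://localhost:8001/

# The proxy on :8000 answers 404 ("no Route matched") until a route is configured
curl -i http://localhost:8000/
```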

The Gateway is now available on the following ports on localhost:

  • :8000 - send traffic to your service via Kong
  • :8001 - configure Kong using the Admin API or via decK
  • :8002 - access Kong's management Web UI (Kong Manager)
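The "authentication in under 5 minutes" test drive can be sketched against the Admin API like this. The service name, upstream URL, consumer, and key below are all illustrative, and the commands assume the gateway from the steps above is running:

```shell
# Register an upstream service and a route that proxies /mock to it
curl -i -X POST http://localhost:8001/services \
  --data name=example --data url=http://httpbin.org
curl -i -X POST http://localhost:8001/services/example/routes \
  --data 'paths[]=/mock'

# Enable the key-auth plugin on the service
curl -i -X POST http://localhost:8001/services/example/plugins \
  --data name=key-auth

# Create a consumer and provision an API key for it
curl -i -X POST http://localhost:8001/consumers --data username=demo
curl -i -X POST http://localhost:8001/consumers/demo/key-auth \
  --data key=my-secret-key

# Unauthenticated requests are now rejected (401); keyed requests pass through
curl -i http://localhost:8000/mock/anything
curl -i http://localhost:8000/mock/anything --header 'apikey: my-secret-key'
```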

Don't lose this

Three weeks from now, you'll want Kong again. Will you remember where to find it?

Save it to your library and the next time you need Kong, it's one tap away from any AI app you use. Group it into a bench with the rest of the team for that kind of task, and you can pull the whole stack at once.

⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or a custom GPT in ChatGPT, and with one click your library is right there in the chat. Or, if you're in an editor, install the a-gnt MCP server and say "use my [bench name]" in Claude Code, Cursor, VS Code, or Windsurf.

🤵🏻‍♂️

a-gnt's Take

Our honest review

This plugs directly into your AI and gives it new abilities it didn't have before: Kong, 🦍 the API and AI Gateway. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.

Tips for getting started

1. Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.

2. You'll sign in with your existing account the first time. After that, it just works.

What's New

Version 1.0.0 · 6 days ago

Imported from GitHub

Ratings & Reviews

0.0 out of 5 (0 ratings)

No reviews yet. Be the first to share your experience.