LLM proxy to observe and debug what your AI agents are doing.
Documentation | Quickstart for Users | Quickstart for Developers | Run Locally
Invariant Gateway is a lightweight zero-configuration service that acts as an intermediary between AI Agents and LLM providers (such as OpenAI and Anthropic).
Gateway automatically traces agent interactions and stores them in Invariant Explorer, giving you the visibility to observe and debug what your agents are doing.
- Single Line Setup: Just change the base URL of your LLM provider to the Invariant Gateway.
- Intercepts agents at the LLM level for better debugging and analysis.
- Tool Calling and Computer Use Support to capture all forms of agentic interactions.
- Seamless forwarding and LLM streaming to OpenAI, Anthropic, and other LLM providers.
- Store and organize runtime traces in the Invariant Explorer.
Looking to observe and secure AI agents in your organization? See our no-code quickstart guide for users to get started.
To add Gateway to your agentic system, follow one of the integration guides below, depending on the LLM provider.
Gateway supports the OpenAI Chat Completions API (`/v1/chat/completions` endpoint).
- Follow these steps to obtain an OpenAI API key.
- Modify your OpenAI client setup. Instead of connecting directly to OpenAI, configure your OpenAI client to use Gateway:

```python
from httpx import Client
from openai import OpenAI

client = OpenAI(
    http_client=Client(
        headers={
            "Invariant-Authorization": "Bearer your-invariant-api-key"
        },
    ),
    base_url="https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/openai",
)
```
Note: Do not include the curly braces `{}`. If the dataset does not exist in Invariant Explorer, it will be created before adding traces.
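With the client pointed at Gateway, requests are made exactly as with the standard OpenAI SDK. As a minimal sketch (the model name and prompt are illustrative placeholders):

```python
# A regular Chat Completions call; Gateway forwards it to OpenAI
# and records the interaction as a trace in your dataset.
response = client.chat.completions.create(
    model="gpt-4o",  # placeholder: any model your OpenAI key can access
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```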
Gateway supports the Anthropic Messages API (`/v1/messages` endpoint).
- Follow these steps to obtain an Anthropic API key.
- Modify your Anthropic client setup. Instead of connecting directly to Anthropic, configure your Anthropic client to use Gateway:

```python
from httpx import Client
from anthropic import Anthropic

client = Anthropic(
    http_client=Client(
        headers={
            "Invariant-Authorization": "Bearer your-invariant-api-key"
        },
    ),
    base_url="https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/anthropic",
)
```
Note: Do not include the curly braces `{}`. If the dataset does not exist in Invariant Explorer, it will be created before adding traces.
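The client is then used like the standard Anthropic SDK. As a minimal sketch (the model name and prompt are illustrative placeholders):

```python
# A regular Messages API call, proxied and traced by Gateway.
response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.content[0].text)
```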
Gateway supports the Gemini `generateContent` and `streamGenerateContent` methods.
- Follow these steps to obtain a Gemini API key.
- Modify your Gemini client setup. Instead of connecting directly to Gemini, configure your client to use Gateway:

```python
import os

from google import genai

client = genai.Client(
    api_key=os.environ["GEMINI_API_KEY"],
    http_options={
        "base_url": "https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/gemini",
        "headers": {
            "Invariant-Authorization": "Bearer your-invariant-api-key"
        },
    },
)
```
Note: Do not include the curly braces `{}`. If the dataset does not exist in Invariant Explorer, it will be created before adding traces.
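The client is then used like the standard google-genai SDK. As a minimal sketch (the model name and prompt are illustrative placeholders):

```python
# A regular generateContent call, proxied and traced by Gateway.
response = client.models.generate_content(
    model="gemini-2.0-flash",  # placeholder model name
    contents="Say hello.",
)
print(response.text)
```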
Integrating directly with a specific agent framework is also supported: simply configure the underlying LLM client. For instance, OpenAI Swarm relies on OpenAI's Python client, so the setup is very similar to the standard OpenAI integration:
```python
import os

from httpx import Client
from openai import OpenAI
from swarm import Swarm, Agent

# Route Swarm's underlying OpenAI client through the Gateway.
client = Swarm(
    client=OpenAI(
        http_client=Client(
            headers={"Invariant-Authorization": "Bearer " + os.getenv("INVARIANT_API_KEY", "")}
        ),
        base_url="https://explorer.invariantlabs.ai/api/v1/gateway/weather-swarm-agent/openai",
    )
)

def get_weather():
    return "It's sunny."

agent = Agent(
    name="Agent A",
    instructions="You are a helpful agent.",
    functions=[get_weather],
)

response = client.run(
    agent=agent,
    messages=[{"role": "user", "content": "What's the weather?"}],
)
print(response.messages[-1]["content"])
# Output: "It seems to be sunny."
```
LiteLLM is a Python library that provides a unified interface for calling multiple LLM providers. If you are using it, connecting to the Gateway proxy is straightforward: you just need to pass the correct `base_url`.
```python
import random

from litellm import completion

base_url = "https://explorer.invariantlabs.ai/api/v1/gateway/litellm/{add-your-dataset-name-here}"

EXAMPLE_MODELS = ["openai/gpt-4o", "gemini/gemini-2.0-flash", "anthropic/claude-3-5-haiku-20241022"]
model = random.choice(EXAMPLE_MODELS)

base_url += "/" + model.split("/")[0]  # append /openai, /gemini, or /anthropic
if model.split("/")[0] == "gemini":
    # Gemini expects the model name in the URL.
    base_url += f"/v1beta/models/{model.split('/')[1]}"

chat_response = completion(
    model=model,
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    extra_headers={"Invariant-Authorization": "Bearer <some-key>"},
    stream=True,
    base_url=base_url,
)

# With stream=True, the response is an iterator of chunks.
for chunk in chat_response:
    print(chunk.choices[0].delta.content or "", end="")
# Output: "Paris."
```
You can also easily integrate the Gateway with Microsoft AutoGen as follows:

```python
import asyncio
import os

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient
from httpx import AsyncClient

async def main() -> None:
    # Route AutoGen's OpenAI client through the Gateway.
    client = OpenAIChatCompletionClient(
        model="gpt-4o",
        http_client=AsyncClient(
            headers={"Invariant-Authorization": "Bearer " + os.getenv("INVARIANT_API_KEY", "")}
        ),
        base_url="https://explorer.invariantlabs.ai/api/v1/gateway/weather-swarm-agent/openai",
    )
    agent = AssistantAgent("assistant", client)
    print(await agent.run(task="Say 'Hello World!'"))

asyncio.run(main())
# Output: "Hello World!"
```
This will automatically trace your agent interactions in Invariant Explorer.
If you are not building an agent yourself but would like to observe and secure AI agents in your organization, you can do so by configuring the agents to use the Gateway.
See below for example integrations with popular agents.
OpenHands (formerly OpenDevin) is a platform for software development agents powered by AI.
Enable the `Advanced Options` toggle under Settings and update the `Base URL` to the following:

```
https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/openai
```
Set the API key using the following format:

```
{your-llm-api-key};invariant-auth={your-invariant-api-key}
```

Note: Do not include the curly braces `{}`.
The Invariant Gateway extracts the `invariant-auth` field from the API key and forwards it to Invariant Explorer, while sending the actual LLM API key to OpenAI or Anthropic.
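To illustrate the format (this is a sketch of the idea, not Gateway's actual implementation), the combined key splits at the `;invariant-auth=` separator:

```python
# Hypothetical illustration; Gateway's real parsing may differ.
combined_key = "your-llm-api-key;invariant-auth=your-invariant-api-key"
llm_api_key, _, invariant_api_key = combined_key.partition(";invariant-auth=")
# llm_api_key       -> forwarded to the LLM provider (e.g., OpenAI)
# invariant_api_key -> used to push traces to Invariant Explorer
```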
SWE-agent lets your preferred language model (e.g., GPT-4o or Claude 3.5 Sonnet) autonomously use tools for various tasks, such as fixing issues in real GitHub repositories.
SWE-agent does not support custom headers, so you cannot pass the Invariant API key via `Invariant-Authorization`. However, there is a workaround using the Invariant Gateway.
Run `sweagent` with the following flag:

```
--agent.model.api_base=https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/openai
```
Note: Do not include the curly braces `{}`.
Instead of setting your API key normally, modify the environment variable as follows:

```
export OPENAI_API_KEY={your-openai-api-key};invariant-auth={your-invariant-api-key}
export ANTHROPIC_API_KEY={your-anthropic-api-key};invariant-auth={your-invariant-api-key}
```

Note: Do not include the curly braces `{}`.
This setup ensures that SWE-agent works seamlessly with Invariant Gateway, maintaining compatibility while enabling full functionality.
You can also operate your own instance of the Gateway to ensure privacy and security.
To run Gateway locally, you have two options:
- Clone this repository.
- Start the Invariant Gateway by running the following commands (Docker must be installed):

```
cd invariant-gateway
bash run.sh build && bash run.sh up
```
This will launch Gateway at http://localhost:8005/api/v1/gateway/.
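To send traffic through your local instance, point your client's base URL at it instead of the hosted endpoint. As a minimal sketch, assuming the local OpenAI route mirrors the hosted one:

```python
from httpx import Client
from openai import OpenAI

# Same setup as the hosted Gateway, but targeting the local instance.
client = OpenAI(
    http_client=Client(
        headers={"Invariant-Authorization": "Bearer your-invariant-api-key"},
    ),
    base_url="http://localhost:8005/api/v1/gateway/{add-your-dataset-name-here}/openai",
)
```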
You can also run the Gateway using the published Docker image. This is a good option if you want to run the Gateway in a cloud environment.
```
# pull the latest image
docker pull --platform linux/amd64 ghcr.io/invariantlabs-ai/invariant-gateway/gateway:latest

# run Gateway on localhost:8005
docker run -p 8005:8005 -e PORT=8005 --platform linux/amd64 ghcr.io/invariantlabs-ai/invariant-gateway/gateway:latest
```

This will launch Gateway at http://localhost:8005/api/v1/gateway/. This instance will automatically push your traces to https://explorer.invariantlabs.ai.
- Follow the instructions here to obtain an API key. This allows Gateway to push traces to Invariant Explorer.
By default, Gateway points to the public Explorer instance at `explorer.invariantlabs.ai`. To point it to your local Explorer instance, modify the `INVARIANT_API_URL` value inside `.env`; the file includes instructions for pointing to a local instance.
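For illustration, assuming a local Explorer instance listening on `http://localhost:80` (the exact URL depends on your setup), the relevant `.env` line might look like:

```
# Hypothetical value; match it to your local Explorer's address.
INVARIANT_API_URL=http://localhost:80
```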
To run the unit tests, execute:

```
bash run.sh unit-tests
```

To run the integration tests, execute:

```
bash run.sh integration-tests
```

To run a subset of the integration tests, execute:

```
bash run.sh integration-tests open_ai/test_chat_with_tool_call.py
```