
Welcome to the open-responses-server wiki!

Open Responses Server is an open-source (MIT License) project for serving the OpenAI Responses API endpoint.

This means you can work with the stateful, agentic interface to chat with AI and get File Search, Web Search, Code Interpreter, Computer Use, and MCP integrations out of the box.
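A minimal sketch of what such a call could look like, using the official OpenAI Python SDK pointed at a locally running open-responses-server. The host, port, model name, and API key here are assumptions for illustration; check the project README for the actual defaults.

```python
from openai import OpenAI

# Point the SDK at the local server instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed address of open-responses-server
    api_key="not-needed-locally",         # placeholder; a local server may ignore it
)

# Stateful, agentic Responses call: the server translates this into
# Chat Completions requests against the backing provider.
response = client.responses.create(
    model="llama3",  # whatever model the wrapped provider serves
    input="Summarize the latest MCP spec changes.",
)

print(response.output_text)
```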

The server wraps any AI provider that exposes an OpenAI-compatible Chat Completions API.

This means you can work locally with vLLM or Ollama (or even LiteLLM) and get MCP integrations as a simple add-on component; a sketch of what such a backend looks like follows below.
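To illustrate the kind of backend the server can wrap, here is a hedged sketch of a direct Chat Completions call against Ollama's OpenAI-compatible endpoint (by default on http://localhost:11434/v1). This is the shape of request the wrapper issues under the hood; the model name and key are placeholders.

```python
from openai import OpenAI

backend = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; local providers often ignore the key
)

# Plain Chat Completions request: any provider that accepts this
# can sit behind open-responses-server.
completion = backend.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hello from a local provider."}],
)

print(completion.choices[0].message.content)
```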

This also enables working with Codex and MCP integrations locally, either interactively in chat or as a CLI for pipelines.

For other providers, such as Bedrock, we are working on adding an amazing-chat-completions-adapter.

This project is part of a larger project: the K8s Native AI Chat (KNAC) app.
