Commit 5e5f9e4

Amnah199 and bilgeyucel authored
Add OpenRouter integration (#325)
* Add openrouter
* Update openrouter.md

Co-authored-by: Bilge Yücel <bilge.yucel@deepset.ai>
1 parent b37c52e commit 5e5f9e4

File tree

2 files changed: +100 −0 lines changed


integrations/openrouter.md

Lines changed: 100 additions & 0 deletions
---
layout: integration
name: OpenRouter
description: Use the OpenRouter API for text generation models.
authors:
  - name: deepset
    socials:
      github: deepset-ai
      twitter: deepset_ai
      linkedin: https://www.linkedin.com/company/deepset-ai/
pypi: https://pypi.org/project/openrouter-haystack
repo: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/openrouter
type: Model Provider
report_issue: https://github.com/deepset-ai/haystack-core-integrations/issues
logo: /logos/openrouter.png
version: Haystack 2.0
toc: true
---

### **Table of Contents**

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)
- [License](#license)

## Overview

`OpenRouterChatGenerator` lets you call any LLMs available on [OpenRouter](https://openrouter.ai), including:

- OpenAI variants such as `openai/gpt-4o`
- Anthropic’s `claude-3.5-sonnet`
- Community-hosted open-source models (Llama 2, Mixtral, etc.)

For more information on models available via the OpenRouter API, see [the OpenRouter docs](https://openrouter.ai/models).

In addition to basic chat completion, the component exposes OpenRouter-specific features:

* **Provider / model routing** – choose fallback models or provider ordering with the `generation_kwargs` parameter.
* **Extra HTTP headers** – add attribution or tracing headers via `extra_headers`.
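
The shapes of these two options can be sketched as plain dictionaries. This is an illustration only: the `models` fallback list and the `provider` object follow the OpenRouter request schema, and the concrete values below are hypothetical.

```python
# Illustrative option shapes, assuming the OpenRouter request schema;
# model names and header values here are hypothetical examples.
generation_kwargs = {
    "models": ["openai/gpt-4o", "anthropic/claude-3.5-sonnet"],  # fallback order
    "provider": {"sort": "throughput"},  # route to the fastest provider
}

extra_headers = {
    "HTTP-Referer": "https://example.com",  # attribution (hypothetical value)
    "X-Title": "My App",                    # app name shown to OpenRouter
}

print(generation_kwargs["models"][0])
print(extra_headers["X-Title"])
```

Both dictionaries would be passed to the component's constructor; see the streaming example below for `generation_kwargs` in use.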

To follow along with this guide, you'll need an OpenRouter API key. Add it as an environment variable named `OPENROUTER_API_KEY`.

## Installation

```bash
pip install openrouter-haystack
```

## Usage

You can use [OpenRouterChatGenerator](https://docs.haystack.deepset.ai/docs/openrouterchatgenerator) on its own, within a [pipeline](https://docs.haystack.deepset.ai/docs/pipelines), or with the [Agent component](https://docs.haystack.deepset.ai/docs/agent).

Here's an example of using it as a standalone component:

```python
import os

from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

os.environ["OPENROUTER_API_KEY"] = "YOUR_OPENROUTER_API_KEY"

client = OpenRouterChatGenerator()  # defaults to openai/gpt-4o-mini
response = client.run(
    [ChatMessage.from_user("What is the capital of Vietnam?")]
)
print(response["replies"])
```

```bash
{'replies': [ChatMessage(_role=<ChatRole.ASSISTANT: 'assistant'>, _content=[TextContent(text='The capital of Vietnam is Hanoi.')], _name=None, _meta={'model': 'openai/gpt-4o-mini', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 8, 'prompt_tokens': 13, 'total_tokens': 21, 'completion_tokens_details': CompletionTokensDetails(accepted_prediction_tokens=None, audio_tokens=None, reasoning_tokens=0, rejected_prediction_tokens=None), 'prompt_tokens_details': PromptTokensDetails(audio_tokens=None, cached_tokens=0)}})]}
```
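
Each reply carries both the generated text and usage metadata. As a rough sketch of pulling fields out of a result like the one above, here the reply is mocked as a plain dictionary; the real `replies` entries are Haystack `ChatMessage` objects, accessed through attributes rather than keys.

```python
# Mocked response mirroring the structure printed above; the real entries
# are ChatMessage objects, not dicts, so this is an illustration only.
response = {
    "replies": [
        {
            "text": "The capital of Vietnam is Hanoi.",
            "meta": {
                "model": "openai/gpt-4o-mini",
                "finish_reason": "stop",
                "usage": {"prompt_tokens": 13, "completion_tokens": 8, "total_tokens": 21},
            },
        }
    ]
}

reply = response["replies"][0]
print(reply["text"])                           # the generated answer
print(reply["meta"]["usage"]["total_tokens"])  # token accounting for billing
```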

`OpenRouterChatGenerator` also supports streaming responses if you pass a streaming callback:

```python
import os

from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

os.environ["OPENROUTER_API_KEY"] = "YOUR_OPENROUTER_API_KEY"

def show(chunk):  # simple streaming callback
    print(chunk.content, end="", flush=True)

client = OpenRouterChatGenerator(
    model="openrouter/auto",  # let OpenRouter pick a model
    streaming_callback=show,
    generation_kwargs={
        "provider": {"sort": "throughput"},  # pick the fastest provider
    },
)

response = client.run([ChatMessage.from_user("Summarize RAG in two lines.")])

print(response)
```
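
The callback above simply prints each chunk as it arrives. A self-contained sketch of the same pattern, using a stand-in `Chunk` class in place of Haystack's streaming chunk objects (which expose a `.content` attribute):

```python
# Stand-in for the streaming chunk objects the callback receives;
# the real objects come from Haystack and also expose .content.
class Chunk:
    def __init__(self, content):
        self.content = content

collected = []

def show(chunk):
    collected.append(chunk.content)           # keep chunks for later use
    print(chunk.content, end="", flush=True)  # stream to the terminal

# Simulate a streamed reply arriving in pieces.
for piece in ["RAG retrieves relevant documents, ", "then grounds generation in them."]:
    show(Chunk(piece))

full_text = "".join(collected)
```

Accumulating chunks like this lets you both display tokens as they stream and keep the assembled text afterwards.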

### License

`openrouter-haystack` is distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license.

logos/openrouter.png

4.18 KB