Commit b37c52e

Add Meta Llama integration (#326)
* init: meta_llama doc
* Update integrations/meta_llama.md

Co-authored-by: Julian Risch <julian.risch@deepset.ai>
1 parent ba4deda commit b37c52e

File tree

2 files changed

+132
-0
lines changed


integrations/meta_llama.md

Lines changed: 132 additions & 0 deletions
@@ -0,0 +1,132 @@
---
layout: integration
name: Meta Llama API
description: Use Llama Models with Haystack
authors:
  - name: Young Han
    socials:
      github: https://github.com/seyeong-han
      twitter: deepset_ai
      linkedin: https://www.linkedin.com/company/deepset-ai/
pypi: https://pypi.org/project/meta-llama-haystack/
repo: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/meta_llama
type: Model Provider
report_issue: https://github.com/deepset-ai/haystack-core-integrations/issues
logo: /logos/meta_llama.png
version: Haystack 2.0
toc: true
---

### **Table of Contents**

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)

## Overview

This integration supports Meta Llama models served through Meta’s own inference infrastructure. To get the `LLAMA_API_KEY`, check out [the Llama API website](https://llama.developer.meta.com?utm_source=partner-haystack&utm_medium=website).

You can use Llama models with `MetaLlamaChatGenerator`.

The currently available models are:

| Model ID | Input context length | Output context length | Input Modalities | Output Modalities |
| --- | --- | --- | --- | --- |
| `Llama-4-Scout-17B-16E-Instruct-FP8` | 128k | 4028 | Text, Image | Text |
| `Llama-4-Maverick-17B-128E-Instruct-FP8` | 128k | 4028 | Text, Image | Text |
| `Llama-3.3-70B-Instruct` | 128k | 4028 | Text | Text |
| `Llama-3.3-8B-Instruct` | 128k | 4028 | Text | Text |
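When picking a model programmatically, it can help to keep the table above available in code. The snippet below is an illustrative helper, not part of the integration itself; only the model IDs and limits come from the table, while the dict layout and the `supports_images` helper are assumptions for this sketch.

```python
# Plain-Python reference of the model table above (illustrative helper,
# not part of meta-llama-haystack). Useful for validating a model ID
# before constructing a generator.
LLAMA_MODELS = {
    "Llama-4-Scout-17B-16E-Instruct-FP8": {"input_ctx": 128_000, "output_ctx": 4028, "image_input": True},
    "Llama-4-Maverick-17B-128E-Instruct-FP8": {"input_ctx": 128_000, "output_ctx": 4028, "image_input": True},
    "Llama-3.3-70B-Instruct": {"input_ctx": 128_000, "output_ctx": 4028, "image_input": False},
    "Llama-3.3-8B-Instruct": {"input_ctx": 128_000, "output_ctx": 4028, "image_input": False},
}


def supports_images(model_id: str) -> bool:
    """Return True if the given model accepts image input (per the table above)."""
    return LLAMA_MODELS[model_id]["image_input"]
```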
## Installation

```bash
pip install meta-llama-haystack
```

## Usage

This integration provides `MetaLlamaChatGenerator` for chat-based use cases.
Before using it, make sure to set the `LLAMA_API_KEY` environment variable.
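For example, in a Unix shell the key can be exported before running your script (the key value below is a placeholder; use your real key from the Llama API website):

```shell
# Placeholder value for illustration only -- substitute your real API key.
export LLAMA_API_KEY="your-llama-api-key"
```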

### Using `MetaLlamaChatGenerator` in a RAG pipeline

This example shows how to build a complete RAG system that answers questions based on the information in your document store using Meta's Llama models.

```python
# To run this example, you will need to set a `LLAMA_API_KEY` environment variable.

from haystack import Document, Pipeline
from haystack.components.builders.chat_prompt_builder import ChatPromptBuilder
from haystack.components.generators.utils import print_streaming_chunk
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.dataclasses import ChatMessage
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.utils import Secret

from haystack_integrations.components.generators.meta_llama import MetaLlamaChatGenerator

# Write documents to InMemoryDocumentStore
document_store = InMemoryDocumentStore()
document_store.write_documents(
    [
        Document(content="My name is Jean and I live in Paris."),
        Document(content="My name is Mark and I live in Berlin."),
        Document(content="My name is Giorgio and I live in Rome."),
    ]
)

# Build a RAG pipeline
prompt_template = [
    ChatMessage.from_user(
        "Given these documents, answer the question.\n"
        "Documents:\n{% for doc in documents %}{{ doc.content }}{% endfor %}\n"
        "Question: {{question}}\n"
        "Answer:"
    )
]

# Define required variables explicitly
prompt_builder = ChatPromptBuilder(template=prompt_template, required_variables={"question", "documents"})

retriever = InMemoryBM25Retriever(document_store=document_store)
llm = MetaLlamaChatGenerator(
    api_key=Secret.from_env_var("LLAMA_API_KEY"),
    streaming_callback=print_streaming_chunk,
)

rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)
rag_pipeline.connect("retriever", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder", "llm.messages")

# Ask a question
question = "Who lives in Paris?"
rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    }
)
```

### Using `MetaLlamaChatGenerator` on its own

Below is a minimal example of calling `MetaLlamaChatGenerator` directly, without a pipeline:

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.meta_llama import (
    MetaLlamaChatGenerator,
)

client = MetaLlamaChatGenerator()
response = client.run(
    messages=[ChatMessage.from_user("What is the best French cheese?")]
)
print(response)

>> {'replies': [ChatMessage(_role=<ChatRole.ASSISTANT: 'assistant'>, _content=[TextContent(text='The best French cheese is a matter of personal preference, but some of the most popular and highly-regarded French cheeses include:\n\n1. **Camembert**: A soft, creamy, and earthy cheese from Normandy, often served with bread and fruit.\n2. **Brie**: A soft, white, and mild cheese from the Île-de-France region, often baked or served with crackers.\n3. **Roquefort**: A pungent, blue-veined cheese from the Roquefort-sur-Soulzon region, often served as a dessert or used in salad dressings.\n4. **Époisses**: A strong, golden, and washed-rind cheese from Burgundy, often served with fruit and bread.\n5. **Pont l\'Évêque**: A semi-soft, golden, and washed-rind cheese from Normandy, often served with crackers or bread.\n\nOf course, there are many other excellent French cheeses, and the "best" one will depend on your personal taste preferences. Some other notable mentions include:\n\n* **Comté**: A firm, nutty, and golden cheese from Franche-Comté.\n* **Gruyère**: A nutty, creamy, and firm cheese from the Savoie region.\n* **Bucheron**: A semi-soft, white, and mild cheese from the Loire Valley.\n* **Bleu d\'Auvergne**: A creamy, blue-veined cheese from the Auvergne region.\n\nFrance is home to over 400 different types of cheese, each with its own unique characteristics and flavor profiles. So, feel free to explore and find your own favorite French cheese!')], _name=None, _meta={'model': 'Llama-4-Scout-17B-16E-Instruct-FP8', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 335, 'prompt_tokens': 17, 'total_tokens': 352, 'completion_tokens_details': None, 'prompt_tokens_details': None}})]}
```

logos/meta_llama.png

2.94 KB