Commit 54b4856

Update anthropic.md (#254)
- update examples to Haystack 2.3+
1 parent 58de760 commit 54b4856

File tree

1 file changed (+13 -6 lines)


integrations/anthropic.md

Lines changed: 13 additions & 6 deletions
````diff
@@ -57,6 +57,8 @@ Before using, make sure to set the `ANTHROPIC_API_KEY` environment variable.
 Below is an example RAG Pipeline where we answer a predefined question using the contents from the below mentioned URL pointing to Anthropic prompt engineering guide. We fetch the contents of the URL and generate an answer with the `AnthropicChatGenerator`.
 
 ```python
+# To run this example, you will need to set a `ANTHROPIC_API_KEY` environment variable.
+
 from haystack import Pipeline
 from haystack.components.builders import ChatPromptBuilder
 from haystack.components.converters import HTMLToDocument
@@ -69,18 +71,23 @@ from haystack_integrations.components.generators.anthropic import AnthropicChatG
 
 messages = [
     ChatMessage.from_system("You are a prompt expert who answers questions based on the given documents."),
-    ChatMessage.from_user("Here are the documents: {{documents}} \\n Answer: {{query}}"),
+    ChatMessage.from_user(
+        "Here are the documents:\n"
+        "{% for d in documents %} \n"
+        "    {{d.content}} \n"
+        "{% endfor %}"
+        "\nAnswer: {{query}}"
+    ),
 ]
 
 rag_pipeline = Pipeline()
 rag_pipeline.add_component("fetcher", LinkContentFetcher())
 rag_pipeline.add_component("converter", HTMLToDocument())
-rag_pipeline.add_component("prompt_builder", ChatPromptBuilder())
+rag_pipeline.add_component("prompt_builder", ChatPromptBuilder(variables=["documents"]))
 rag_pipeline.add_component(
     "llm",
     AnthropicChatGenerator(
         api_key=Secret.from_env_var("ANTHROPIC_API_KEY"),
-        model="claude-3-sonnet-20240229",
         streaming_callback=print_streaming_chunk,
     ),
 )
@@ -90,10 +97,10 @@ rag_pipeline.connect("fetcher", "converter")
 rag_pipeline.connect("converter", "prompt_builder")
 rag_pipeline.connect("prompt_builder.prompt", "llm.messages")
 
-question = "What are the best practices in prompt engineering?"
+question = "When should we use prompt engineering and when should we fine-tune?"
 rag_pipeline.run(
     data={
-        "fetcher": {"urls": ["https://docs.anthropic.com/claude/docs/prompt-engineering"]},
+        "fetcher": {"urls": ["https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview"]},
         "prompt_builder": {"template_variables": {"query": question}, "template": messages},
     }
 )
@@ -106,7 +113,7 @@ Below is an example of using `AnthropicGenerator`:
 ```python
 from haystack_integrations.components.generators.anthropic import AnthropicGenerator
 
-client = AnthropicGenerator(model="claude-2.1")
+client = AnthropicGenerator()
 response = client.run("What's Natural Language Processing? Be brief.")
 print(response)
 
````
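The main change in this diff is the user-message template: instead of interpolating the whole `documents` list as a single variable, it now loops over each document and inserts its `content`. Below is a minimal sketch of what the new template renders to, using Jinja2 directly rather than the full `ChatPromptBuilder` pipeline; the `Doc` class and sample documents are hypothetical stand-ins for Haystack Documents.

```python
# Render the updated user-message template with Jinja2 directly
# (ChatPromptBuilder uses Jinja2 under the hood). The Doc class and the
# sample documents below are hypothetical stand-ins for Haystack Documents.
from jinja2 import Template

template = Template(
    "Here are the documents:\n"
    "{% for d in documents %} \n"
    "    {{d.content}} \n"
    "{% endfor %}"
    "\nAnswer: {{query}}"
)

class Doc:
    """Stand-in for a Haystack Document; only .content is used here."""
    def __init__(self, content: str):
        self.content = content

docs = [Doc("Be specific in instructions."), Doc("Give the model examples.")]
rendered = template.render(documents=docs, query="What are prompt basics?")
print(rendered)
```

The companion change, passing `variables=["documents"]` to `ChatPromptBuilder`, is what exposes the converter's documents as an input to this template when the components are wired together in the pipeline.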
0 commit comments
