
Commit 02495ff

example: update create example (#418)

1 parent: 2cad1f5

File tree (2 files changed: +6, -33 lines)

- examples/README.md
- examples/create.py


examples/README.md

Lines changed: 2 additions & 4 deletions
````diff
@@ -6,6 +6,8 @@ Run the examples in this directory with:
 python3 examples/<example>.py
 ```
 
+See [ollama/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md) for full API documentation
+
 ### Chat - Chat with a model
 - [chat.py](chat.py)
 - [async-chat.py](async-chat.py)
@@ -50,12 +52,8 @@ Requirement: `pip install tqdm`
 
 
 ### Ollama Create - Create a model from a Modelfile
-```python
-python create.py <model> <modelfile>
-```
 - [create.py](create.py)
 
-See [ollama/docs/modelfile.md](https://github.com/ollama/ollama/blob/main/docs/modelfile.md) for more information on the Modelfile format.
 
 
 ### Ollama Embed - Generate embeddings with a model
````

examples/create.py

File mode changed: 100644 → 100755
Lines changed: 4 additions & 29 deletions
````diff
@@ -1,30 +1,5 @@
-import sys
+from ollama import Client
 
-from ollama import create
-
-
-args = sys.argv[1:]
-if len(args) == 2:
-  # create from local file
-  path = args[1]
-else:
-  print('usage: python create.py <name> <filepath>')
-  sys.exit(1)
-
-# TODO: update to real Modelfile values
-modelfile = f"""
-FROM {path}
-"""
-example_modelfile = """
-FROM llama3.2
-# sets the temperature to 1 [higher is more creative, lower is more coherent]
-PARAMETER temperature 1
-# sets the context window size to 4096, this controls how many tokens the LLM can use as context to generate the next token
-PARAMETER num_ctx 4096
-
-# sets a custom system message to specify the behavior of the chat assistant
-SYSTEM You are Mario from super mario bros, acting as an assistant.
-"""
-
-for response in create(model=args[0], modelfile=modelfile, stream=True):
-  print(response['status'])
+client = Client()
+response = client.create(model='my-assistant', from_='llama3.2', stream=False)
+print(response.status)
````
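
The updated example makes a single non-streaming `create` call and prints the final status. A minimal streaming sketch, assuming that `stream=True` makes `Client.create` yield progress objects with a `status` field (mirroring the loop the old script used), and that `system=` and `parameters=` are the keyword arguments that take over from the removed Modelfile's `SYSTEM` and `PARAMETER` lines:

```python
from ollama import Client

client = Client()

# 'my-assistant' and 'llama3.2' are taken from the diff above; system= and
# parameters= are assumptions about how the old Modelfile's SYSTEM and
# PARAMETER entries map onto the new create() call.
for progress in client.create(
  model='my-assistant',
  from_='llama3.2',
  system='You are Mario from super mario bros, acting as an assistant.',
  parameters={'temperature': 1, 'num_ctx': 4096},
  stream=True,
):
  print(progress.status)
```

With `stream=False`, as in the committed example, the same call returns one final status object instead of an iterator.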
