
Commit 3b91399

int fields in ChatData fixed as float in readme
1 parent a17a64e commit 3b91399

1 file changed: +6 -6 lines

README.md

Lines changed: 6 additions & 6 deletions
@@ -89,12 +89,12 @@ The `ChatData` class is used to encapsulate the data required for making chat requests
 #### LLM Parameters
 These properties control various aspects of the generated response (more [info](https://openrouter.ai/docs#parameters)):
 - **max_tokens** (int|null): The maximum number of tokens that can be generated in the completion. Default is 1024.
-- **temperature** (int|null): A value between 0 and 2 controlling the randomness of the output.
-- **top_p** (int|null): A value between 0 and 1 for nucleus sampling, an alternative to temperature sampling.
-- **top_k** (int|null): A value between 1 and infinity for top-k sampling (not available for OpenAI models).
-- **frequency_penalty** (int|null): A value between -2 and 2 for penalizing new tokens based on their existing frequency.
-- **presence_penalty** (int|null): A value between -2 and 2 for penalizing new tokens based on whether they appear in the text so far.
-- **repetition_penalty** (int|null): A value between 0 and 2 for penalizing repetitive tokens.
+- **temperature** (float|null): A value between 0 and 2 controlling the randomness of the output.
+- **top_p** (float|null): A value between 0 and 1 for nucleus sampling, an alternative to temperature sampling.
+- **top_k** (float|null): A value between 1 and infinity for top-k sampling (not available for OpenAI models).
+- **frequency_penalty** (float|null): A value between -2 and 2 for penalizing new tokens based on their existing frequency.
+- **presence_penalty** (float|null): A value between -2 and 2 for penalizing new tokens based on whether they appear in the text so far.
+- **repetition_penalty** (float|null): A value between 0 and 2 for penalizing repetitive tokens.
 - **seed** (int|null): A value for deterministic sampling (OpenAI models only, in beta).
 #### Function-calling
 Only natively supported by OpenAI models. For others, we submit a YAML-formatted string with these tools at the end of the prompt.
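
For context, a minimal usage sketch with the corrected types. The property names and the `float|null`/`int|null` types come from the parameter list above; the named-argument `ChatData` constructor and the concrete values shown here are assumptions for illustration, not part of this commit.

```php
<?php

// Hypothetical sketch: assumes ChatData accepts these properties as
// named constructor arguments. Names and types follow the README list;
// the values are illustrative only.
$chatData = new ChatData(
    max_tokens: 1024,          // int|null, maximum completion length (default 1024)
    temperature: 0.7,          // float|null, 0 to 2, higher means more random output
    top_p: 0.9,                // float|null, 0 to 1, nucleus sampling
    top_k: 40,                 // float|null, top-k sampling (not available for OpenAI models)
    frequency_penalty: 0.0,    // float|null, -2 to 2
    presence_penalty: 0.0,     // float|null, -2 to 2
    repetition_penalty: 1.0,   // float|null, 0 to 2
    seed: 42,                  // int|null, deterministic sampling (OpenAI only, beta)
);
```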
