The `ChatData` class is used to encapsulate the data required for making chat requests.
#### LLM Parameters
These properties control various aspects of the generated response (more [info](https://openrouter.ai/docs#parameters)):
- **max_tokens** (int|null): The maximum number of tokens that can be generated in the completion. Default is 1024.
- **temperature** (float|null): A value between 0 and 2 controlling the randomness of the output.
- **top_p** (float|null): A value between 0 and 1 for nucleus sampling, an alternative to temperature sampling.
- **top_k** (float|null): A value between 1 and infinity for top-k sampling (not available for OpenAI models).
- **frequency_penalty** (float|null): A value between -2 and 2 for penalizing new tokens based on their existing frequency.
- **presence_penalty** (float|null): A value between -2 and 2 for penalizing new tokens based on whether they appear in the text so far.
- **repetition_penalty** (float|null): A value between 0 and 2 for penalizing repetitive tokens.
- **seed** (int|null): A value for deterministic sampling (OpenAI models only, in beta).
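As a rough illustration of how these properties map onto a chat request body, here is a minimal Python sketch. The function name `build_chat_payload` and the omit-when-null behavior are assumptions for this example; the parameter names themselves follow the OpenRouter API.

```python
# Hypothetical sketch: assemble a chat-completions request body from the
# properties listed above, omitting any parameter left as None (null).
def build_chat_payload(model, messages, max_tokens=1024, temperature=None,
                       top_p=None, top_k=None, frequency_penalty=None,
                       presence_penalty=None, repetition_penalty=None,
                       seed=None):
    payload = {"model": model, "messages": messages, "max_tokens": max_tokens}
    optional = {
        "temperature": temperature,                # 0 to 2
        "top_p": top_p,                            # 0 to 1
        "top_k": top_k,                            # >= 1, non-OpenAI models only
        "frequency_penalty": frequency_penalty,    # -2 to 2
        "presence_penalty": presence_penalty,      # -2 to 2
        "repetition_penalty": repetition_penalty,  # 0 to 2
        "seed": seed,                              # OpenAI models only (beta)
    }
    # Only include parameters the caller actually set.
    payload.update({k: v for k, v in optional.items() if v is not None})
    return payload

payload = build_chat_payload(
    model="mistralai/mistral-7b-instruct",
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.7,
    top_p=0.9,
)
```

Leaving a parameter as null lets the provider apply its own default, which is usually preferable to pinning every value.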
#### Function-calling
Only natively supported by OpenAI models. For others, we submit a YAML-formatted string with these tools at the end of the prompt.
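The fallback described above can be sketched as follows. This is an assumption-laden illustration: the helper names (`tools_to_yaml`, `append_tools`), the tool fields, and the exact YAML layout are hypothetical and not taken from the package.

```python
# Hypothetical sketch of the non-OpenAI fallback: render tool definitions
# as a simple YAML-style block and append it to the prompt text.
def tools_to_yaml(tools):
    lines = []
    for tool in tools:
        lines.append(f"- name: {tool['name']}")
        lines.append(f"  description: {tool['description']}")
        lines.append("  parameters:")
        for pname, ptype in tool["parameters"].items():
            lines.append(f"    {pname}: {ptype}")
    return "\n".join(lines)

def append_tools(prompt, tools):
    return prompt + "\n\nAvailable tools:\n" + tools_to_yaml(tools)

tools = [{
    "name": "get_weather",
    "description": "Look up current weather for a city",
    "parameters": {"city": "string"},
}]
prompt_with_tools = append_tools("What's the weather in Paris?", tools)
```

The idea is simply that models without native function-calling can still be told, in plain text, which tools exist and what arguments they take.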