
Commit 2fc4c12

Add notes about 384 default max_prompt_length

Signed-off-by: Andrea Fasoli <andrea.fasoli@ibm.com>

1 parent 18f4230

File tree

2 files changed (+2 −2 lines)

aiu_fms_testing_utils/utils/args_parsing.py

Lines changed: 1 addition & 1 deletion

@@ -176,7 +176,7 @@ def get_args(parser: argparse.ArgumentParser) -> argparse.Namespace:
         help=(
             "Cap the number of tokens per prompt to a maximum length prior to padding. "
             "If None, prompts to decoder models will have no cap, while prompts to "
-            "encoder models will be capped to a default of 384 tokens."
+            "encoder models will be capped to a default of 384 tokens (for QA task)."
         ),
     )

aiu_fms_testing_utils/utils/encoders_utils.py

Lines changed: 1 addition & 1 deletion

@@ -102,7 +102,7 @@ def prepare_validation_features(
     max_prompt_length = (
         args.max_prompt_length
         if args.max_prompt_length is not None
-        else 384
+        else 384  # this default is targeted at QA task (not a model limitation)
     )

     # Some of the questions have lots of whitespace on the left, which is not useful
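
For reference, here is a minimal standalone sketch of the behavior the two hunks above document, assuming nothing beyond what the diffs show: --max_prompt_length defaults to None, and the encoder QA path falls back to 384 tokens. The parser below is a hypothetical reduction for illustration, not the repository's actual get_args.

import argparse

# Hypothetical standalone reduction of the behavior documented in this commit;
# the real flag lives in aiu_fms_testing_utils/utils/args_parsing.py.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--max_prompt_length",
    type=int,
    default=None,
    help=(
        "Cap the number of tokens per prompt to a maximum length prior to padding. "
        "If None, prompts to decoder models will have no cap, while prompts to "
        "encoder models will be capped to a default of 384 tokens (for QA task)."
    ),
)
args = parser.parse_args([])  # no flag given, so args.max_prompt_length is None

# Encoder-side fallback: 384 is a QA-task convention, not a model limitation.
max_prompt_length = (
    args.max_prompt_length if args.max_prompt_length is not None else 384
)
print(max_prompt_length)  # -> 384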
