This repository was archived by the owner on Oct 9, 2024. It is now read-only.

ValueError: Couldn't instantiate the backend tokenizer from one of: #101

Open
SeekPoint opened this issue Jun 30, 2023 · 0 comments
Comments

@SeekPoint

```
(gh_transformers-bloom-inference) amd00@MZ32-00:~/llm_dev/transformers-bloom-inference$ python bloom-inference-scripts/bloom-accelerate-inference.py --name /hf_model/bloom --batch_size 1 --benchmark
Using 0 gpus
Loading model /home/amd00/hf_model/bloom
Traceback (most recent call last):
  File "/home/amd00/llm_dev/transformers-bloom-inference/bloom-inference-scripts/bloom-accelerate-inference.py", line 49, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_name)
  File "/home/amd00/anaconda3/envs/gh_transformers-bloom-inference/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 591, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/amd00/anaconda3/envs/gh_transformers-bloom-inference/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1805, in from_pretrained
    return cls._from_pretrained(
  File "/home/amd00/anaconda3/envs/gh_transformers-bloom-inference/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1950, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/amd00/anaconda3/envs/gh_transformers-bloom-inference/lib/python3.10/site-packages/transformers/models/bloom/tokenization_bloom_fast.py", line 118, in __init__
    super().__init__(
  File "/home/amd00/anaconda3/envs/gh_transformers-bloom-inference/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 119, in __init__
    raise ValueError(
ValueError: Couldn't instantiate the backend tokenizer from one of:
(1) a tokenizers library serialization file,
(2) a slow tokenizer instance to convert or
(3) an equivalent slow tokenizer class to instantiate and convert.
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
(gh_transformers-bloom-inference) amd00@MZ32-00:~/llm_dev/transformers-bloom-inference$
```
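The last line of the traceback names the two usual causes: the model directory has no `tokenizer.json` (the tokenizers-library serialization file), and `sentencepiece` is not installed, so transformers cannot convert a slow tokenizer to a fast one. A minimal diagnostic sketch, run before `AutoTokenizer.from_pretrained`, that checks both conditions (the path at the bottom is just the one from this report; substitute whatever you pass to `--name`):

```python
import importlib.util
from pathlib import Path

def diagnose_tokenizer_dir(model_dir: str) -> list[str]:
    """Return a list of missing prerequisites for loading a fast tokenizer."""
    problems = []
    d = Path(model_dir)
    if not (d / "tokenizer.json").exists():
        problems.append("no tokenizers serialization file (tokenizer.json)")
        # Without tokenizer.json, transformers falls back to converting a
        # slow tokenizer, which requires the sentencepiece package.
        if importlib.util.find_spec("sentencepiece") is None:
            problems.append("sentencepiece not installed (pip install sentencepiece)")
    return problems

# Path from this issue; adjust to your own model directory.
for problem in diagnose_tokenizer_dir("/home/amd00/hf_model/bloom"):
    print("missing:", problem)
```

If `tokenizer.json` is absent, re-downloading the tokenizer files into the model directory or installing `sentencepiece` should resolve the ValueError.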
