Loading a checkpoint with tied weights results in ValueError: This checkpoint seem corrupted. #42460

Description

@tomaarsen

System Info

  • transformers version: 5.0.0.dev0
  • Platform: Windows-10-10.0.26100-SP0
  • Python version: 3.11.13
  • Huggingface_hub version: 1.1.5
  • Safetensors version: 0.6.2
  • Accelerate version: 1.11.0
  • Accelerate config: not found
  • DeepSpeed version: not installed
  • PyTorch version (accelerator?): 2.9.0+cu126 (CUDA)
  • Using distributed or parallel set-up in script?: No
  • Using GPU in script?: No
  • GPU type: NVIDIA GeForce RTX 3090

Who can help?

@Cyrilvallez @molbap

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers import AutoModel

model_name = "sentence-transformers/sentence-t5-base"
model = AutoModel.from_pretrained(model_name)
print(type(model))

Currently, the behaviour on main is:

Traceback (most recent call last):
  File "c:\code\transformers\demo_sentence_t5.py", line 5, in <module>
    model = AutoModel.from_pretrained("sentence-transformers/sentence-t5-base")
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\code\transformers\src\transformers\models\auto\auto_factory.py", line 373, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\code\transformers\src\transformers\modeling_utils.py", line 275, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\code\transformers\src\transformers\modeling_utils.py", line 3977, in from_pretrained
    model, missing_keys, unexpected_keys, mismatched_keys, offload_index, error_msgs = cls._load_pretrained_model(
                                                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\code\transformers\src\transformers\modeling_utils.py", line 4163, in _load_pretrained_model
    model.tie_weights(missing_keys=missing_keys, recompute_mapping=False)
  File "C:\code\transformers\src\transformers\modeling_utils.py", line 2337, in tie_weights
    raise ValueError(
ValueError: This checkpoint seem corrupted. The tied weights mapping for this model specifies to tie shared.weight (which should be present and is not), to encoder.embed_tokens.weight (which is present).
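
For reference, the two keys named in the error can be checked directly against the checkpoint. The sketch below is only illustrative and assumes the repository ships a single model.safetensors file (the filename is an assumption, not something reported in the traceback):

from huggingface_hub import hf_hub_download
from safetensors import safe_open

# "model.safetensors" is an assumed single-shard filename for this repository.
path = hf_hub_download("sentence-transformers/sentence-t5-base", "model.safetensors")

with safe_open(path, framework="pt") as f:
    keys = set(f.keys())

# The error reports shared.weight as missing and encoder.embed_tokens.weight as present.
print("shared.weight present:", "shared.weight" in keys)
print("encoder.embed_tokens.weight present:", "encoder.embed_tokens.weight" in keys)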

Expected behavior

I expect the model to load normally, as it did on transformers <5.0.
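
As a stopgap, the tied tensor can be aliased manually before loading the state dict. This is a rough sketch, not the intended fix, and it assumes the checkpoint stores the embedding under encoder.embed_tokens.weight (per the error message) and ships as a single model.safetensors file (an assumption):

from huggingface_hub import hf_hub_download
from safetensors.torch import load_file
from transformers import AutoConfig, AutoModel

repo = "sentence-transformers/sentence-t5-base"

# Build the model skeleton from the config instead of from_pretrained,
# so the failing tie_weights check during checkpoint loading is not hit.
config = AutoConfig.from_pretrained(repo)
model = AutoModel.from_config(config)

# "model.safetensors" is an assumed filename for the single-shard checkpoint.
path = hf_hub_download(repo, "model.safetensors")
state_dict = load_file(path)

# Alias the missing tied weight to its counterpart named in the error message.
if "shared.weight" not in state_dict and "encoder.embed_tokens.weight" in state_dict:
    state_dict["shared.weight"] = state_dict["encoder.embed_tokens.weight"]

# strict=False because an encoder-only checkpoint will not cover every
# parameter of the class AutoModel resolves to.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"{len(missing)} missing keys, {len(unexpected)} unexpected keys")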

Tom Aarsen
