Stop autoconverting custom code checkpoints (#37751)

* Stop autoconverting custom code checkpoints

* make fixup

* Better auto class detection

* Match the kwarg ordering
Matt, 2025-05-26 19:15:28 +01:00 (committed by GitHub)
parent 07848a8405
commit 706b00928f

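What changed: when a checkpoint only ships `.bin` weights, `from_pretrained` normally spawns a background `auto_conversion` thread that converts the weights to safetensors on the Hub. With this commit, that thread is no longer started for custom code (remote code) checkpoints. A hedged sketch of the affected call path, where "some-user/custom-model" is a hypothetical repo that ships custom modeling code and only .bin weights:

from transformers import AutoModel

# Before this commit, this load could also kick off a background safetensors
# auto-conversion thread for the repo; custom code checkpoints are now skipped.
model = AutoModel.from_pretrained("some-user/custom-model", trust_remote_code=True)
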
@@ -1005,6 +1005,7 @@ def _get_resolved_checkpoint_files(
     user_agent: dict,
     revision: str,
     commit_hash: Optional[str],
+    is_remote_code: bool,  # Because we can't determine this inside this function, we need it to be passed in
     transformers_explicit_filename: Optional[str] = None,
 ) -> Tuple[Optional[List[str]], Optional[Dict]]:
     """Get all the checkpoint filenames based on `pretrained_model_name_or_path`, and optional metadata if the
@@ -1201,7 +1202,10 @@ def _get_resolved_checkpoint_files(
                 "_commit_hash": commit_hash,
                 **has_file_kwargs,
             }
-            if not has_file(pretrained_model_name_or_path, safe_weights_name, **has_file_kwargs):
+            if (
+                not has_file(pretrained_model_name_or_path, safe_weights_name, **has_file_kwargs)
+                and not is_remote_code
+            ):
                 Thread(
                     target=auto_conversion,
                     args=(pretrained_model_name_or_path,),
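
For reference, `has_file` is the Hub helper the guard relies on to detect existing safetensors weights. A minimal hedged sketch of it in isolation, assuming it is importable from `transformers.utils` as in current releases (the repo id is just an example):

from transformers.utils import has_file

# True iff the repo already ships safetensors weights, in which case there is
# nothing to auto-convert regardless of is_remote_code.
already_safetensors = has_file("bert-base-uncased", "model.safetensors")
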
@@ -4550,6 +4554,7 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, PushToHubMixin, PeftAdapterMixin):
                 user_agent=user_agent,
                 revision=revision,
                 commit_hash=commit_hash,
+                is_remote_code=cls._auto_class is not None,
                 transformers_explicit_filename=transformers_explicit_filename,
             )
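
The `cls._auto_class is not None` check is the "better auto class detection" from the commit message: `_auto_class` defaults to `None` on stock classes and is set when a class is registered for an auto class, which is the path custom code models go through. A hedged sketch with hypothetical class names:

from transformers import PretrainedConfig, PreTrainedModel

class MyConfig(PretrainedConfig):
    model_type = "my-model"

class MyModel(PreTrainedModel):
    config_class = MyConfig

# Stock classes have no auto class registered...
assert PreTrainedModel._auto_class is None

# ...while registering for an auto class (as custom code models do) sets it,
# which is what the new is_remote_code argument keys on.
MyModel.register_for_auto_class("AutoModel")
assert MyModel._auto_class == "AutoModel"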