Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-03 21:00:08 +06:00
fix: prevent model access error during Optuna hyperparameter tuning

The `transformers.integrations.integration_utils.run_hp_search_optuna` function releases model memory and sets `trainer.model` to `None` after each trial. This causes an `AttributeError` when a subsequent `Trainer.train` call tries to access the model before it has been reinitialized. The issue only occurs when the `fp16_full_eval` or `bf16_full_eval` flag is enabled.

* Update src/transformers/trainer.py

Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
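For context, here is a minimal sketch of the scenario the commit message describes: an Optuna-backed `Trainer.hyperparameter_search` run with `fp16_full_eval` enabled. The checkpoint name, toy dataset, and hyperparameters are illustrative assumptions, not taken from the commit, and the snippet assumes a CUDA device (plus the `optuna` package), since `fp16_full_eval` is rejected on CPU.

```python
# Minimal sketch (not from the commit) of the failing scenario: Optuna-backed
# hyperparameter search with fp16_full_eval enabled. Assumes a CUDA device and
# the `optuna` package; checkpoint and data are illustrative placeholders.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-uncased"  # assumed toy checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Tiny toy dataset, just enough to drive a couple of trials.
texts = ["a good example", "a bad example"] * 8
labels = [1, 0] * 8
ds = Dataset.from_dict(dict(tokenizer(texts, truncation=True, padding=True)))
ds = ds.add_column("label", labels)

def model_init():
    # Called once per trial; between trials run_hp_search_optuna frees the
    # previous model and sets trainer.model to None.
    return AutoModelForSequenceClassification.from_pretrained(model_name)

args = TrainingArguments(
    output_dir="hp_search",
    per_device_train_batch_size=4,
    # With this flag (or bf16_full_eval), Trainer.train touched self.model
    # before it was reinitialized, raising the AttributeError the fix prevents.
    fp16_full_eval=True,
)

trainer = Trainer(
    model=None,            # the model is built per trial via model_init
    args=args,
    model_init=model_init,
    train_dataset=ds,
    eval_dataset=ds,
)

best_run = trainer.hyperparameter_search(backend="optuna", n_trials=2)
print(best_run)
```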
__init__.py
test_data_collator.py
test_trainer_callback.py
test_trainer_distributed_loss.py
test_trainer_distributed.py
test_trainer_fsdp.py
test_trainer_seq2seq.py
test_trainer_tpu.py
test_trainer_utils.py
test_trainer.py