Mirror of https://github.com/huggingface/transformers.git
Set best_model_checkpoint only when the checkpoint exists. Previously, best_model_checkpoint was set without checking whether the checkpoint directory actually existed; the setting logic has been moved inside _save_checkpoint and now runs only when the directory exists.

* Added best_global_step to TrainerState.
* Added tests for best_model_checkpoint.
* Fixed hard-coded values in the tests to prevent failures.
* Added a helper function and removed the hard-coded best step.
* Added a side-effect patch generator for _eval (a sketch follows the file list below).
* Added an evaluate side-effect function.
* Removed erroneous patching.
* Fixed a minor bug.
* Applied Ruff.
* Fixed the Ruff problem raised by make style.
* Used Trainer.set_initial_training_values.
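The guarded logic might look roughly like the sketch below. This is a simplified stand-in, not the actual transformers implementation: the attribute names best_model_checkpoint, best_global_step, and the method name _save_checkpoint come from the commit message, while everything else (the metric comparison, the constructor shapes) is assumed for illustration.

```python
import os


class TrainerState:
    """Simplified stand-in for transformers' TrainerState."""

    def __init__(self):
        self.best_metric = None
        self.best_model_checkpoint = None
        self.best_global_step = None  # field added by this change


class Trainer:
    """Simplified stand-in showing only the checkpoint-tracking logic."""

    def __init__(self, greater_is_better=False):
        self.state = TrainerState()
        self.greater_is_better = greater_is_better

    def _save_checkpoint(self, output_dir, metric, step):
        # ... checkpoint files would be written to output_dir here ...
        is_better = self.state.best_metric is None or (
            (metric > self.state.best_metric) == self.greater_is_better
        )
        # Guard: only record the path if the checkpoint directory really
        # exists on disk, instead of setting it unconditionally as before.
        if is_better and os.path.isdir(output_dir):
            self.state.best_metric = metric
            self.state.best_model_checkpoint = output_dir
            self.state.best_global_step = step
```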
__init__.py
test_data_collator.py
test_trainer_callback.py
test_trainer_distributed_loss.py
test_trainer_distributed.py
test_trainer_fsdp.py
test_trainer_seq2seq.py
test_trainer_tpu.py
test_trainer_utils.py
test_trainer.py
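The side-effect patch generator mentioned in the commit message might look something like the following sketch. The helper name make_evaluate_side_effect and the metric key are hypothetical; only the idea (scripting per-call evaluation results so tests avoid hard-coded values) comes from the commit message.

```python
def make_evaluate_side_effect(metric_values, metric_name="eval_loss"):
    """Hypothetical helper: returns a side-effect function that reports a
    different metrics dict on each call to the patched evaluation method,
    so tests can script the metric sequence instead of hard-coding values."""
    values = iter(metric_values)

    def side_effect(*args, **kwargs):
        return {metric_name: next(values)}

    return side_effect


# Usage sketch (assumes a Trainer with an _evaluate method, as in the commit):
# from unittest.mock import patch
# with patch.object(Trainer, "_evaluate",
#                   side_effect=make_evaluate_side_effect([0.9, 0.5, 0.7])):
#     trainer.train()
#     # The second (lowest-loss) evaluation should be recorded as best.
#     assert trainer.state.best_global_step is not None
```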