mirror of
https://github.com/huggingface/transformers.git
synced 2025-07-31 02:02:21 +06:00
* Set best_model_checkpoint only when the checkpoint exists. Rather than setting it unconditionally as before, the setting logic has moved inside `_save_checkpoint` and only runs when the checkpoint directory actually exists.
* Added best_global_step to TrainerState.
* Added tests for best_model_checkpoint.
* Fixed hard-coded values in test to prevent failures.
* Added a helper function and removed the hard-coded best_step.
* Added a side-effect patch generator for _eval.
* Added an evaluate side-effect function.
* Removed erroneous patching.
* Fixed a minor bug.
* Applied Ruff.
* Fixed Ruff problem in make style.
* Used Trainer.set_initial_training_values.
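The core fix can be sketched as a guard on the filesystem before recording a new best checkpoint. The sketch below is a minimal, hypothetical stand-in, not the actual `transformers` implementation: the dataclass mirrors only the relevant `TrainerState` fields, and `maybe_record_best_checkpoint` is an illustrative helper name, assuming a metric where higher is better by default.

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainerState:
    # Hypothetical minimal stand-in for transformers.TrainerState,
    # keeping only the fields relevant to best-checkpoint tracking.
    best_metric: Optional[float] = None
    best_model_checkpoint: Optional[str] = None
    best_global_step: Optional[int] = None

def maybe_record_best_checkpoint(
    state: TrainerState,
    checkpoint_dir: str,
    metric: float,
    step: int,
    greater_is_better: bool = True,
) -> TrainerState:
    """Record a new best checkpoint only if its directory exists on disk."""
    is_better = state.best_metric is None or (
        metric > state.best_metric if greater_is_better else metric < state.best_metric
    )
    # The guard: never point best_model_checkpoint at a directory
    # that was not actually written.
    if is_better and os.path.isdir(checkpoint_dir):
        state.best_metric = metric
        state.best_model_checkpoint = checkpoint_dir
        state.best_global_step = step
    return state
```

With this shape, a better metric paired with a missing checkpoint directory leaves the state untouched, which is the failure mode the PR closes.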
agents
bettertransformer
deepspeed
extended
fixtures
fsdp
generation
models
optimization
peft_integration
pipelines
quantization
repo_utils
sagemaker
tensor_parallel
tokenization
trainer
utils
__init__.py
test_backbone_common.py
test_configuration_common.py
test_feature_extraction_common.py
test_image_processing_common.py
test_image_transforms.py
test_modeling_common.py
test_modeling_flax_common.py
test_modeling_tf_common.py
test_pipeline_mixin.py
test_processing_common.py
test_sequence_feature_extraction_common.py
test_tokenization_common.py
test_training_args.py