transformers/tests/trainer
Latest commit: 2cbcc5877d by kang sheng, 2025-01-16 13:59:53 +01:00
Fix condition when GA loss bug fix is not performed (#35651)
* fix condition when GA loss bug fix is not performed
* max loss diff is 2.29
* fix typo
* add an extra validation that loss should not vary too much
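The #35651 commit touches the gradient accumulation (GA) loss handling and, per its notes, adds a validation that training loss should not vary too much between runs (the commit records a max loss diff of 2.29). Below is a minimal, hypothetical sketch of such a loss-consistency check; the helper name and tolerance are illustrative assumptions, not the actual code in test_trainer.py.

```python
# Hypothetical sketch: compare per-step losses from a baseline run and a
# gradient-accumulation run, and fail if they diverge beyond a tolerance.
# All names and values here are illustrative, not the real test_trainer.py code.

def assert_losses_close(baseline_losses, ga_losses, max_diff=2.3):
    """Fail if per-step losses diverge by more than `max_diff` at any step."""
    assert len(baseline_losses) == len(ga_losses), "runs must log the same number of steps"
    worst = max(abs(a - b) for a, b in zip(baseline_losses, ga_losses))
    assert worst <= max_diff, f"loss diverged by {worst:.2f} (tolerance {max_diff})"

# Example usage with made-up logged losses from two runs:
baseline = [10.1, 8.7, 7.9, 7.2]
with_ga = [10.3, 8.5, 8.0, 7.4]
assert_losses_close(baseline, with_ga)
```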
File | Last commit | Date
__init__.py | [Test refactor 1/5] Per-folder tests reorganization (#15725) | 2022-02-23 15:46:28 -05:00
test_data_collator.py | Enhance DataCollatorForLanguageModeling with Configurable Token Replacement Probabilities (#35251) | 2025-01-14 17:01:10 +00:00
test_trainer_callback.py | add a callback hook right before the optimizer step (#33444) | 2024-09-13 10:43:45 +02:00
test_trainer_distributed.py | CI: update to ROCm 6.0.2 and test MI300 (#30266) | 2024-05-13 18:14:36 +02:00
test_trainer_fsdp.py | Remove FSDP wrapping from sub-models. (#34452) | 2024-11-15 23:00:03 +01:00
test_trainer_seq2seq.py | Trainer - deprecate tokenizer for processing_class (#32385) | 2024-10-02 14:08:46 +01:00
test_trainer_tpu.py | [Test refactor 1/5] Per-folder tests reorganization (#15725) | 2022-02-23 15:46:28 -05:00
test_trainer_utils.py | Add strategy to store results in evaluation loop (#30267) | 2024-04-17 12:42:27 +01:00
test_trainer.py | Fix condition when GA loss bug fix is not performed (#35651) | 2025-01-16 13:59:53 +01:00