transformers/tests/trainer
Joaquin Caballero 031ef8802c
fix FSDP + torch.compile bug when saving pretrained model (#37725)
* add keep_torch_compile=False argument in _save and _wwrap_method

* Fix FSDP execution during evaluation in torch_compile mode

* add test trainer FSDP + Torch Compile

* fix code quality

* make style

* Revert " make style"

This reverts commit 77e797f8829c50992cc21496be3d9a3e480e1c97.

* make style
2025-05-06 17:51:28 +02:00
__init__.py [Test refactor 1/5] Per-folder tests reorganization (#15725) 2022-02-23 15:46:28 -05:00
test_data_collator.py add FlashAttentionKwargs and seq_idx to flat collator (#36456) 2025-04-16 15:45:03 +02:00
test_trainer_callback.py fix: prevent second save in the end of training if last step was saved already (#36219) 2025-02-20 17:38:52 +01:00
test_trainer_distributed_loss.py Fix multi gpu loss sync condition, add doc and test (#35743) 2025-02-12 15:41:31 +01:00
test_trainer_distributed.py Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
test_trainer_fsdp.py fix FSDP + torch.compile bug when saving pretrained model (#37725) 2025-05-06 17:51:28 +02:00
test_trainer_seq2seq.py Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
test_trainer_tpu.py Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
test_trainer_utils.py Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
test_trainer.py enable xpu in test_trainer (#37774) 2025-05-06 17:13:35 +02:00