Mirror of https://github.com/huggingface/transformers.git
* Add XLA torchrun support
* Clarify that DDP doesn't work with the torch.distributed XLA backend yet
* Enable DDP with torchrun and XLA (now available in PT-XLA 1.13)
* Add a check for AWS Neuron availability and an AWS Neuron specific compiler flag
* Rename the new test to TestTrainerDistributedNeuronCore
* Replace `assert` statements with raised exceptions
* Remove the compiler flag, as it is optional; if needed, it can come in another PR
* Use TORCHELASTIC_RUN_ID to determine whether torchrun is used (a sketch of this follows below)
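A minimal sketch of the detection logic these notes describe, not the PR's actual code: only the `TORCHELASTIC_RUN_ID` environment variable comes from the commit message itself; the helper names and the `torch_neuronx` package probe are assumptions.

```python
import importlib.util
import os


def is_torchrun_launched() -> bool:
    # torchrun (TorchElastic) exports TORCHELASTIC_RUN_ID to every worker
    # process, so its presence in the environment signals a torchrun launch.
    return "TORCHELASTIC_RUN_ID" in os.environ


def is_neuroncore_available() -> bool:
    # Probe for AWS Neuron's PyTorch package without importing it eagerly.
    # The package name `torch_neuronx` is an assumption for this sketch.
    return importlib.util.find_spec("torch_neuronx") is not None
```

On the DDP side, with PT-XLA 1.13 the usual pattern is to import `torch_xla.distributed.xla_backend` (which registers an `"xla"` backend with `torch.distributed`) and then call `dist.init_process_group(backend="xla")` before wrapping the model in `DistributedDataParallel`.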
The directory contains the following test files:

* __init__.py
* test_data_collator.py
* test_trainer_callback.py
* test_trainer_distributed.py
* test_trainer_seq2seq.py
* test_trainer_tpu.py
* test_trainer_utils.py
* test_trainer.py