Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-31 02:02:21 +06:00
Latest commit (squash merge), which appears to add support for passing a custom loss function (`compute_loss_fn`) into `Trainer` (a usage sketch follows the file listing below):

* bookmark
* Bookmark
* Bookmark
* Actually implement
* Pass in kwarg explicitly
* Adjust for if we do or don't have labels
* Bookmark fix for od
* bookmark
* Fin
* closer
* Negate accelerate grad accum div
* Fixup not training long enough
* Add in compute_loss to take full model output
* Document
* compute_loss -> compute_loss_fn
* Add a test
* Refactor
* Refactor
* Uncomment tests
* Update tests/trainer/test_trainer.py

Co-authored-by: Daniel Han <danielhanchen@gmail.com>
Files in tests/trainer/:

* __init__.py
* test_data_collator.py
* test_trainer_callback.py
* test_trainer_distributed.py
* test_trainer_fsdp.py
* test_trainer_seq2seq.py
* test_trainer_tpu.py
* test_trainer_utils.py
* test_trainer.py
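
The commit summary above points at customizing the training loss without subclassing `Trainer`. The sketch below is a minimal, unverified illustration of that usage, not the commit's own code: it assumes the keyword argument is named `compute_loss_func` (the bullets mention `compute_loss_fn`; recent transformers releases expose `compute_loss_func`), that the callable receives the raw model outputs, the labels, and `num_items_in_batch`, and that the tiny Hub model named below exists (used purely for illustration).

```python
# Minimal sketch: pass a custom loss function to Trainer via a keyword
# argument instead of overriding Trainer.compute_loss in a subclass.
# Assumptions: the kwarg is named `compute_loss_func`, it is called as
# compute_loss_func(outputs, labels, num_items_in_batch=...), and the tiny
# test model below is available on the Hub (illustrative only).
import torch
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "hf-internal-testing/tiny-random-BertForSequenceClassification"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Tiny toy dataset; pre-padded so the default collator can stack the rows.
texts = ["a good example", "a bad example"] * 8
encodings = tokenizer(texts, padding=True, truncation=True)
dataset = Dataset.from_dict(dict(encodings)).add_column("labels", [1, 0] * 8)

def custom_loss(outputs, labels, num_items_in_batch=None):
    # Works on the full model output (logits), not a precomputed loss.
    logits = outputs.logits
    loss = torch.nn.functional.cross_entropy(
        logits.view(-1, logits.size(-1)), labels.view(-1), reduction="sum"
    )
    # Dividing by the number of items in the whole accumulated batch keeps the
    # loss scale consistent under gradient accumulation.
    if num_items_in_batch is not None:
        loss = loss / num_items_in_batch
    return loss

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="out",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        report_to="none",
    ),
    train_dataset=dataset,
    compute_loss_func=custom_loss,  # kwarg name assumed, see lead-in above
)
trainer.train()
```

The tests listed above (in particular test_trainer.py, which the commit updates) are where this behavior would be exercised; the sketch is only meant to show the shape of the API the commit message describes.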