Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-04 05:10:06 +06:00)
* Add convenience flags for better control over evaluation
* Fix dependency issue with outdated requirement
* Add additional flag to example to ensure eval is done
* Wrap code into main function for accelerate launcher to find (see the sketch below)
* Fix valid batch size flag in README
* Add note to install git-lfs when initializing/training the model
* Update examples/research_projects/codeparrot/scripts/arguments.py
Co-authored-by: Leandro von Werra <lvwerra@users.noreply.github.com>
* Update examples/research_projects/codeparrot/README.md
Co-authored-by: Leandro von Werra <lvwerra@users.noreply.github.com>
* Revert "Wrap code into main function for accelerate launcher to find"
This reverts commit
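
The "Wrap code into main function for accelerate launcher to find" change above (later reverted) follows a common Accelerate pattern. The minimal sketch below is a hypothetical illustration of that pattern with placeholder comments, not the actual CodeParrot training script:

```python
# Hypothetical sketch: wrapping training code in a main() entry point for use
# with `accelerate launch`. Illustrative only; not the CodeParrot script itself.
from accelerate import Accelerator


def main():
    accelerator = Accelerator()
    accelerator.print(f"Launching on {accelerator.num_processes} process(es)")
    # ... build the model, dataloaders, and optimizer here, then pass them
    # through accelerator.prepare(...) before entering the training loop.


if __name__ == "__main__":
    main()
```

A script structured this way would typically be started with `accelerate launch scripts/codeparrot_training.py`; the revert above undid this restructuring.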
Files in examples/research_projects/codeparrot/scripts/:

* arguments.py
* bpe_training.py
* codeparrot_training.py
* human_eval.py
* initialize_model.py
* preprocessing.py
* validation_loss.py