Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-27 16:22:23 +06:00)
Currently the L2 regularization is hard-coded to 0.01, even though there is a `--weight_decay` flag implemented (that is unused). I'm making this flag control the weight decay used for fine-tuning in this script.
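A minimal sketch of the kind of change described, assuming the script builds grouped optimizer parameters in the usual way (no decay for biases and LayerNorm weights); the `build_optimizer` helper, the defaults, and the use of `torch.optim.AdamW` are illustrative, not the script's actual code:

```python
import argparse

import torch
from torch.optim import AdamW


def build_optimizer(model: torch.nn.Module, args: argparse.Namespace) -> AdamW:
    """Group parameters so biases/LayerNorm weights get no decay, and take the
    decay value from the --weight_decay flag instead of a hard-coded 0.01."""
    no_decay = ["bias", "LayerNorm.weight"]
    grouped_parameters = [
        {
            "params": [p for n, p in model.named_parameters()
                       if not any(nd in n for nd in no_decay)],
            # Previously hard-coded to 0.01; now controlled by the CLI flag.
            "weight_decay": args.weight_decay,
        },
        {
            "params": [p for n, p in model.named_parameters()
                       if any(nd in n for nd in no_decay)],
            "weight_decay": 0.0,
        },
    ]
    return AdamW(grouped_parameters, lr=args.learning_rate)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--weight_decay", type=float, default=0.01,
                        help="L2 weight decay applied during fine-tuning")
    parser.add_argument("--learning_rate", type=float, default=6.25e-5)
    args = parser.parse_args()

    model = torch.nn.Linear(4, 2)  # stand-in for the fine-tuned model
    optimizer = build_optimizer(model, args)
```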
- run_openai_gpt.py
- run_swag.py
- run_transfo_xl.py