mirror of https://github.com/huggingface/transformers.git (synced 2025-08-02 19:21:31 +06:00)
Reason for the issue was that the number of optimization steps was computed from the example count, which differs from the actual size of the dataloader when an example is chunked into multiple instances. The fix in this pull request is to compute num_optimization_steps directly from len(data_loader).
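A minimal sketch of the difference, using made-up numbers (example count, chunked instance count, batch size, accumulation steps, and epochs are all hypothetical): when chunking produces more instances than raw examples, a step count derived from the example count undershoots the one derived from `len(data_loader)`.

```python
import math

# Hypothetical settings for illustration only.
num_examples = 100               # raw examples in the dataset
num_instances = 130              # training instances after chunking (assumed)
batch_size = 8
gradient_accumulation_steps = 2
num_train_epochs = 3

# Buggy: steps derived from the raw example count.
steps_buggy = (
    math.ceil(num_examples / batch_size)
    // gradient_accumulation_steps
    * num_train_epochs
)

# Fixed: steps derived from the dataloader length, i.e. the real batch count.
len_data_loader = math.ceil(num_instances / batch_size)
steps_fixed = (
    len_data_loader // gradient_accumulation_steps * num_train_epochs
)

print(steps_buggy, steps_fixed)  # → 18 24
```

With these numbers the buggy computation schedules 18 optimization steps while training actually performs 24, so a learning-rate schedule keyed to the buggy count would decay too early.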
Files in the examples directory:

- lm_finetuning/
- extract_features.py
- run_classifier.py
- run_gpt2.py
- run_openai_gpt.py
- run_squad.py
- run_swag.py
- run_transfo_xl.py