transformers/docs/source/en/internal
hasan salim kanmaz c33f6046c3
[WIP] Enable reproducibility for distributed trainings (#16907)
* add seed worker and set_deterministic_seed_for_cuda function to enforce reproducibility

* change function name to enable determinism, add docstrings, reproducibility support for TF

* change function name to enable_determinism_for_distributed_training

* revert changes in set_seed and call set_seed within enable_full_determinism

* add one position argument for seed_worker function

* add full_determinism flag in training args and call enable_full_determinism when it is true

* add enable_full_determinism to documentation

* apply make fixup after the last commit

* Update src/transformers/training_args.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
2022-05-11 09:37:13 -04:00
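The commit message above outlines the approach: one entry point seeds every source of randomness, and a `seed_worker` callback makes DataLoader workers reproducible. Below is a minimal standard-library sketch of that idea — the function names mirror the PR, but this is an illustration, not the library code; the actual implementation in `trainer_utils.py` also seeds NumPy, PyTorch/TensorFlow and enables torch's deterministic algorithms.

```python
import os
import random


def enable_full_determinism(seed: int) -> None:
    """Illustrative sketch: seed the RNG and pin determinism-related env vars.

    The real transformers helper additionally calls set_seed() for NumPy,
    PyTorch and TensorFlow and turns on deterministic torch algorithms.
    """
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)
    # CUDA >= 10.2 needs this for deterministic cuBLAS kernels.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"


def seed_worker(worker_id: int) -> None:
    """Illustrative DataLoader worker_init_fn: derive a per-worker seed so
    each worker's RNG stream is reproducible across runs.  The single
    positional argument matches the signature PyTorch expects."""
    worker_seed = (random.getrandbits(32) + worker_id) % 2**32
    random.seed(worker_seed)


# Re-seeding with the same value replays the same random stream.
enable_full_determinism(42)
first = [random.random() for _ in range(3)]
enable_full_determinism(42)
second = [random.random() for _ in range(3)]
print(first == second)  # → True
```

With the `full_determinism=True` training argument added in this PR, the Trainer invokes the real `enable_full_determinism` instead of plain `set_seed`, trading some throughput for bit-wise reproducible distributed runs.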
file_utils.mdx Enable doc in Spanish (#16518) 2022-04-04 10:25:46 -04:00
generation_utils.mdx TF generate refactor - Beam Search (#16374) 2022-04-06 18:19:34 +01:00
modeling_utils.mdx Moved functions to pytorch_utils.py (#16625) 2022-04-12 12:38:50 -04:00
pipelines_utils.mdx Enable doc in Spanish (#16518) 2022-04-04 10:25:46 -04:00
tokenization_utils.mdx Enable doc in Spanish (#16518) 2022-04-04 10:25:46 -04:00
trainer_utils.mdx [WIP] Enable reproducibility for distributed trainings (#16907) 2022-05-11 09:37:13 -04:00