transformers/.circleci
Aymeric Augustin 80caf79d07 Prevent excessive parallelism in PyTorch.
We're already using as many processes in parallel as we have CPU cores.
Furthermore, the number of cores may be incorrectly calculated as 36
(we've seen this with pytest-xdist), which compounds the problem.

PyTorch performance craters without this.
2019-12-21 08:43:19 +01:00
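The diff itself isn't shown on this page, but here is a minimal sketch of the fix the message describes, assuming the cap lives in a pytest conftest.py and uses PyTorch's thread-pool APIs (both the file name and the exact mechanism are assumptions, not taken from the commit):

    # conftest.py -- hypothetical sketch, not the actual contents of the commit.
    # pytest-xdist already runs one worker process per detected CPU core; if
    # each PyTorch process also spawns one intra-op thread per core, the
    # machine is oversubscribed (cores * cores threads), and a misdetected
    # core count of 36 multiplies the damage. Capping each worker at one
    # thread keeps total parallelism equal to the worker count.
    import torch

    torch.set_num_threads(1)          # intra-op parallelism (e.g. within a matmul)
    torch.set_num_interop_threads(1)  # inter-op parallelism across operators

Setting OMP_NUM_THREADS=1 in the CI environment (for instance in the config.yml listed below) should cap the same OpenMP-backed pool without touching test code.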
config.yml Prevent excessive parallelism in PyTorch. 2019-12-21 08:43:19 +01:00
deploy.sh Updated v2.2.0 doc 2019-11-27 10:12:20 -05:00