transformers/examples/hans
Latest commit: 6b1ff25084 by Victor SANH, 2020-03-02 10:20:21 -05:00
fix n_gpu count when no_cuda flag is activated (#3077)
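For context, the fix referenced in this commit follows the usual device-setup pattern in the transformers example scripts: when the --no_cuda flag is passed, the script should run on CPU and report zero GPUs rather than counting devices it will not use. Below is a minimal sketch of that pattern; the setup_device helper and the exact argument wiring are assumptions for illustration, not code taken from this directory.

```python
import argparse

import torch


def setup_device(args):
    # Respect --no_cuda: fall back to CPU and report zero GPUs,
    # instead of counting CUDA devices that will not be used.
    if args.no_cuda or not torch.cuda.is_available():
        device = torch.device("cpu")
        n_gpu = 0
    else:
        device = torch.device("cuda")
        n_gpu = torch.cuda.device_count()
    return device, n_gpu


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--no_cuda", action="store_true", help="Avoid using CUDA even when it is available"
    )
    args = parser.parse_args()
    device, n_gpu = setup_device(args)
    print(f"device: {device}, n_gpu: {n_gpu}")
```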
Files (with last commit message and date):
hans_processors.py    formating                                                 2020-01-16 13:21:30 +01:00
test_hans.py          fix n_gpu count when no_cuda flag is activated (#3077)    2020-03-02 10:20:21 -05:00
utils_hans.py         update formating - make flake8 happy                      2020-01-16 13:21:30 +01:00