Commit Graph

86 Commits

Author SHA1 Message Date
dependabot[bot]
afea70c01c Bump psutil from 5.6.3 to 5.6.6 in /examples/distillation
Bumps [psutil](https://github.com/giampaolo/psutil) from 5.6.3 to 5.6.6.
- [Release notes](https://github.com/giampaolo/psutil/releases)
- [Changelog](https://github.com/giampaolo/psutil/blob/master/HISTORY.rst)
- [Commits](https://github.com/giampaolo/psutil/compare/release-5.6.3...release-5.6.6)

Signed-off-by: dependabot[bot] <support@github.com>
2020-03-12 21:14:56 -04:00
Victor SANH
6b1ff25084
fix n_gpu count when no_cuda flag is activated (#3077)
* fix n_gpu count when no_cuda flag is activated

* someone was left behind
2020-03-02 10:20:21 -05:00
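A minimal sketch of the behavior this fix targets (the function and argument names below are assumptions for illustration, not the script's exact code): when --no_cuda is passed, the script should report zero GPUs instead of counting the visible CUDA devices.

    import torch

    def setup_device(no_cuda: bool):
        # With --no_cuda, force CPU and report zero GPUs; otherwise count
        # the visible CUDA devices (0 if CUDA is unavailable).
        if no_cuda or not torch.cuda.is_available():
            return torch.device("cpu"), 0
        return torch.device("cuda"), torch.cuda.device_count()

    device, n_gpu = setup_device(no_cuda=True)
    assert n_gpu == 0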
Julien Chaumond
298bed16a8 make style 2020-03-01 14:08:01 -05:00
VictorSanh
852e032ca6 include roberta in run_squad_w_distillation - cc @graviraja 2020-03-01 01:56:50 +00:00
VictorSanh
b5509abb36 --do_lower_case will always trick me... 2020-03-01 01:39:24 +00:00
Andrew Walker
5bc99e7f33
fix several typos in Distil* readme (#3034) 2020-02-26 12:39:54 -05:00
VictorSanh
2ae98336d1 fix vocab size in binarized_data (distil): int16 vs int32 2020-02-18 16:17:35 +00:00
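A short illustration of the dtype issue this commit refers to (the token ids below are placeholders, assuming a vocabulary larger than 32,767 entries): ids above the int16 range do not round-trip, so binarized data has to be stored as int32.

    import numpy as np

    token_ids = [0, 101, 119_546]                # placeholder ids from a large vocab
    ids32 = np.array(token_ids, dtype=np.int32)  # safe: int32 covers the full range
    ids16 = ids32.astype(np.int16)               # wraps around: 119546 is lost
    print(ids32[-1], ids16[-1])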
VictorSanh
ee5a6856ca distilbert-base-cased weights + Readmes + omissions 2020-02-07 15:28:13 -05:00
Julien Chaumond
42f08e596f [examples] rename run_lm_finetuning to run_language_modeling 2020-02-07 09:15:28 -05:00
Lysandre
3bf5417258 Revert erroneous fix 2020-02-04 16:31:07 -05:00
Lysandre
239dd23f64 [Follow up 213]
Masked indices should have -1 and not -100. Updating documentation + scripts that were forgotten
2020-02-03 16:08:05 -05:00
VictorSanh
1ce3fb5cc7 update correct eval metrics (distilbert & co) 2020-01-24 11:45:22 -05:00
VictorSanh
e83d9f1c1d cleaning - change ' to " (black requirements) 2020-01-10 19:34:25 -05:00
VictorSanh
ebba9e929d minor spring cleaning - missing configs + processing 2020-01-10 19:14:58 -05:00
Victor SANH
331065e62d missing import 2020-01-10 11:42:53 +01:00
Victor SANH
414e9e7122 indents test 2020-01-10 11:42:53 +01:00
Victor SANH
3cdb38a7c0 indents 2020-01-10 11:42:53 +01:00
Victor SANH
ebd45980a0 Align with run_squad + fix some errors 2020-01-10 11:42:53 +01:00
Victor SANH
45634f87f8 fix Sampler in distributed training - evaluation 2020-01-10 11:42:53 +01:00
Victor SANH
af1ee9e648 Move torch.nn.utils.clip_grad_norm_ 2020-01-10 11:42:53 +01:00
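For context, a self-contained sketch of the usual placement of gradient clipping (the model, optimizer, and max_grad_norm value are assumptions for illustration): clip after backward() and before optimizer.step().

    import torch

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    max_grad_norm = 1.0  # assumed value

    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)  # after backward, before step
    optimizer.step()
    optimizer.zero_grad()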
Lysandre
164c794eb3 New SQuAD API for distillation script 2020-01-10 11:42:53 +01:00
alberduris
81d6841b4b GPU text generation: Moved the encoded_prompt to correct device 2020-01-06 15:11:12 +01:00
alberduris
dd4df80f0b Moved the encoded_prompts to correct device 2020-01-06 15:11:12 +01:00
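The two commits above describe moving the encoded prompt onto the model's device before generation. A minimal sketch of that pattern (the token ids below are placeholders, not from the actual script):

    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    encoded_prompt = torch.tensor([[101, 2023, 2003, 102]])  # placeholder token ids
    encoded_prompt = encoded_prompt.to(device)  # must live on the same device as the model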
Aymeric Augustin
c3783399db Remove redundant requirements with transformers. 2019-12-23 19:17:27 +01:00
Aymeric Augustin
d6eaf4e6d2 Update comments mentioning Python 2. 2019-12-22 18:38:56 +01:00
Aymeric Augustin
c824d15aa1 Remove __future__ imports. 2019-12-22 17:47:54 +01:00
Aymeric Augustin
c11b3e2926 Sort imports for optional third-party libraries.
These libraries aren't always installed in the virtual environment where
isort is running. Declaring them properly avoids mixing these
third-party imports with local imports.
2019-12-22 11:19:13 +01:00
Aymeric Augustin
783a616999 Fix F401 flake8 warning (x88 / 116).
This change is mostly autogenerated with:

    $ python -m autoflake --in-place --recursive --remove-all-unused-imports --ignore-init-module-imports examples templates transformers utils hubconf.py setup.py

I made minor changes in the generated diff.
2019-12-22 10:59:08 +01:00
Aymeric Augustin
fa2ccbc081 Fix E266 flake8 warning (x90). 2019-12-22 10:59:08 +01:00
Aymeric Augustin
631be27078 Fix E722 flake8 warnings (x26). 2019-12-22 10:59:07 +01:00
Aymeric Augustin
158e82e061 Sort imports with isort.
This is the result of:

    $ isort --recursive examples templates transformers utils hubconf.py setup.py
2019-12-22 10:57:46 +01:00
Aymeric Augustin
fa84ae26d6 Reformat source code with black.
This is the result of:

    $ black --line-length 119 examples templates transformers utils hubconf.py setup.py

There are a lot of fairly long lines in the project. As a consequence, I'm
picking the longest widely accepted line length, 119 characters.

This is also Thomas' preference, because it allows for explicit variable
names, to make the code easier to understand.
2019-12-21 17:52:29 +01:00
thomwolf
dc667ce1a7 double check cc @LysandreJik 2019-12-14 09:56:27 +01:00
LysandreJik
3fd71c4431 Update example scripts 2019-12-12 12:08:54 -05:00
VictorSanh
35ff345fc9 update requirements 2019-12-05 12:07:04 -05:00
VictorSanh
552c44a9b1 release distilm-bert 2019-12-05 10:14:58 -05:00
Stefan Schweter
e7cf2ccd15 distillation: add German distilbert model 2019-11-19 19:55:19 +01:00
Rémi Louf
2276bf69b7 update the examples, docs and template 2019-11-14 20:38:02 +01:00
thomwolf
89d6272898 Fix #1623 2019-11-04 16:21:12 +01:00
Victor SANH
fa735208c9
update readme - fix example command distil* 2019-10-30 14:27:28 -04:00
Thomas Wolf
36174696cc
Merge branch 'master' into clean-roberta 2019-10-30 16:51:06 +01:00
VictorSanh
5b6cafb11b [release] fix table weirdness 2019-10-23 10:35:16 -04:00
VictorSanh
8ad5c591cd [RELEASE] DistilRoBERTa 2019-10-23 10:29:47 -04:00
Lysandre
7d709e55ed Remove 2019-10-22 14:12:33 -04:00
VictorSanh
d844db4005 Add citation bibtex 2019-10-11 16:55:42 -04:00
Bilal Khan
5ce8d29abe Change tensorboard imports to use built-in tensorboard if available 2019-10-08 16:29:43 -05:00
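A sketch of the import fallback this commit describes, assuming the commonly used pattern (prefer the SummaryWriter bundled with PyTorch, fall back to tensorboardX):

    try:
        from torch.utils.tensorboard import SummaryWriter  # built-in writer (torch >= 1.1)
    except ImportError:
        from tensorboardX import SummaryWriter  # fallback for older setups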
VictorSanh
7ce83b4931 update weights for distilgpt2 2019-10-07 12:30:27 -04:00
VictorSanh
f5891c3821 run_squad --> run_squad_w_distillation 2019-10-04 17:23:15 -04:00
VictorSanh
5f07d8f11a prepare release 2019-10-03 10:27:11 -04:00
VictorSanh
35071007cb incoming release 🔥 update links to arxiv preprint 2019-10-03 10:27:11 -04:00