Author | Commit | Message | Date
LysandreJik | 25e8389439 | Tests for added AutoModels | 2019-08-30 12:48:55 -04:00
LysandreJik | dc43215c01 | Added multiple AutoModel classes: AutoModelWithLMHead, AutoModelForQuestionAnswering and AutoModelForSequenceClassification | 2019-08-30 12:48:55 -04:00
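The two commits above add the task-specific AutoModel classes. A minimal usage sketch; the `bert-base-uncased` checkpoint is only an illustrative choice, and the import assumes the package's current `transformers` name rather than the `pytorch_transformers` name in use at the time:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The Auto* classes pick the right architecture from the checkpoint's config.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

inputs = tokenizer.encode("Hello, world!", return_tensors="pt")
outputs = model(inputs)
```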
VictorSanh | 282c276e09 | typos + file name coherence in distillation README | 2019-08-30 12:02:29 -04:00
VictorSanh | 803c1cc4ea | fix relative import bug cf Issue #1140 | 2019-08-30 12:01:27 -04:00
thomwolf | 7044ed6b05 | fix tokenizers serialization | 2019-08-30 17:36:11 +02:00
Thomas Wolf | cd65c41a83 | Merge branch 'master' into xlm-tokenization | 2019-08-30 17:15:16 +02:00
thomwolf | 69da972ace | added test and debug tokenizer configuration serialization | 2019-08-30 17:09:36 +02:00
thomwolf | 88111de07c | saving and reloading tokenizer configurations | 2019-08-30 16:55:48 +02:00
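The tokenizer-serialization commits above cover the save/reload round-trip. A minimal sketch of that round-trip; the directory name and the use of the current `transformers` package name are illustrative assumptions:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
tokenizer.save_pretrained("./my_tokenizer")                 # writes vocab files plus the tokenizer configuration
reloaded = BertTokenizer.from_pretrained("./my_tokenizer")  # restores the same settings from disk
```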
Thomas Wolf | b66e9b4433 | Merge pull request #1158 from rabeehk/master (regarding #1026 pull request) | 2019-08-30 16:30:33 +02:00
Thomas Wolf | 0a2fecdf90 | Merge branch 'master' into master | 2019-08-30 16:30:08 +02:00
thomwolf | 3871b8a107 | adding xlm 17 and 100 models and config on aws | 2019-08-30 16:28:42 +02:00
thomwolf | 8678ff8df5 | adding 17 and 100 xlm models | 2019-08-30 16:26:04 +02:00
LysandreJik | e0caab0cf0 | fix link | 2019-08-30 10:09:17 -04:00
LysandreJik | a600b30cc3 | Fix index number in documentation | 2019-08-30 10:08:14 -04:00
LysandreJik | 20c06fa37d | Added DistilBERT to documentation index | 2019-08-30 10:06:51 -04:00
Rabeeh KARIMI | 39eb31e11e | remove reloading tokenizer in the training, adding it to the evaluation part | 2019-08-30 15:44:41 +02:00
Rabeeh KARIMI | 350bb6bffa | updated tokenizer loading for addressing reproducibility issues | 2019-08-30 15:34:28 +02:00
thomwolf | 82462c5cba | Added option to setup pretrained tokenizer arguments | 2019-08-30 15:30:41 +02:00
Thomas Wolf | 41f35d0b3d | Merge pull request #1089 from dhpollack/dhp/use_pytorch_layernorm (change layernorm code to pytorch's native layer norm) | 2019-08-30 14:49:08 +02:00
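PR #1089, merged above, swaps the hand-written layer-normalization code for PyTorch's built-in module. A minimal sketch of the native API it switches to; the hidden size and epsilon are BERT-style example values, not taken from the PR itself:

```python
import torch

hidden_size, eps = 768, 1e-12
layer_norm = torch.nn.LayerNorm(hidden_size, eps=eps)  # native replacement for a custom layer-norm module

x = torch.randn(2, 5, hidden_size)
out = layer_norm(x)  # normalizes over the last dimension, then applies the learned scale and bias
```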
Thomas Wolf | 01ad55f8cf | Merge pull request #1026 from rabeehk/master (loads the tokenizer for each checkpoint, to solve the reproducability…) | 2019-08-30 14:15:36 +02:00
Thomas Wolf | 50e615f43d | Merge branch 'master' into improved_testing | 2019-08-30 13:40:35 +02:00
thomwolf | f8aace6bcd | update tokenizers to use self.XX_token_id instead of converting self.XX_token | 2019-08-30 13:39:52 +02:00
thomwolf | 8faf2e086b | more doc on special tokens | 2019-08-30 13:36:22 +02:00
Thomas Wolf | f7978490b2 | Merge pull request #1148 from huggingface/circleci (Documentation auto-deploy) | 2019-08-30 13:28:16 +02:00
thomwolf | ce5ef4b35d | python2 doesn't spark joy | 2019-08-30 13:22:43 +02:00
thomwolf | 5dd7b677ad | clean up all byte-level bpe tests | 2019-08-30 12:43:08 +02:00
thomwolf | ca1a00a302 | fix for python2 | 2019-08-30 12:29:31 +02:00
thomwolf | 4e6a3172ce | update roberta docstring as well | 2019-08-30 12:23:37 +02:00
thomwolf | fd10d79b55 | update GPT2 docstring | 2019-08-30 12:23:12 +02:00
thomwolf | abe734ca1f | fix GPT-2 and RoBERTa tests to be clean now | 2019-08-30 12:20:18 +02:00
thomwolf | 0f5a799456 | fix GPT2DoubleHeadModel docstring | 2019-08-30 11:49:23 +02:00
thomwolf | d51f72d5de | adding shortcut to the ids of all the special tokens | 2019-08-30 11:41:11 +02:00
thomwolf | 306af132d7 | update readme to mention add_special_tokens more clearly in example | 2019-08-30 11:30:51 +02:00
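Commits d51f72d5de and 306af132d7 above concern the special-token id shortcuts and the `add_special_tokens` flag. A minimal sketch of both; the current `transformers` package name and the `bert-base-uncased` checkpoint are illustrative assumptions:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Shortcut ids, instead of tokenizer.convert_tokens_to_ids(tokenizer.cls_token):
print(tokenizer.cls_token_id, tokenizer.sep_token_id, tokenizer.pad_token_id)

# add_special_tokens wraps the sequence with the model's [CLS]/[SEP] markers.
ids = tokenizer.encode("Hello, world!", add_special_tokens=True)
```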
thomwolf | 50e6daf83a | fix Roberta tokenizer __init__ | 2019-08-30 11:27:43 +02:00
thomwolf | 0517e7a1cb | Fix GPT2 and RoBERTa tokenizer to begin with a space - update Roberta tokenizer | 2019-08-30 11:23:49 +02:00
erenup | 6e1ac34e2b | Merge remote-tracking branch 'huggingface/master' | 2019-08-30 15:50:11 +08:00
jamin | 2fb9a934b4 | re-format | 2019-08-30 14:05:28 +09:00
jamin | c8731b9583 | update apex fp16 implementation | 2019-08-30 13:54:00 +09:00
ziliwang | 6060b2f89b | fix: hard coding for max number (the fp16 maximum is 65504, so the original 1e30 causes NaN in fp16) | 2019-08-30 12:13:47 +08:00
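The commit above spells out the bug: fp16's largest finite value is 65504, so a mask value hard-coded to -1e30 overflows and yields NaN. A minimal sketch of one common way to make the fill value dtype-safe; illustrative code, not necessarily the exact change in 6060b2f89b:

```python
import torch

def mask_attention_scores(scores: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Use the most negative finite value of the scores' own dtype instead of a
    # hard-coded -1e30, which overflows under fp16 (finite range is +/-65504).
    min_value = torch.finfo(scores.dtype).min
    return scores.masked_fill(mask == 0, min_value)

scores = torch.randn(1, 4, 4, dtype=torch.float16)
mask = torch.tensor([[1, 1, 1, 0]])
masked = mask_attention_scores(scores, mask)
```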
epwalsh | 07e21307b6 | fix adding special tokens | 2019-08-29 13:44:50 -07:00
LysandreJik | caf1d116a6 | Closing bracket in DistilBERT's token count. | 2019-08-29 15:30:10 -04:00
LysandreJik | e7fba4bef5 | Documentation auto-deploy | 2019-08-29 12:14:29 -04:00
Luis | fe8fb10b44 | Small modification of a comment in the run_glue.py example (add RoBERTa to the comment, as it was not explicit that RoBERTa doesn't use token_type_ids) | 2019-08-29 14:43:30 +02:00
erenup | 2a2832ce73 | Merge pull request #1 from erenup/run_multiple_choice (roberta, xlnet for multiple choice) | 2019-08-29 16:27:44 +08:00
erenup | 942d3f4b20 | modify code of arc label insurance | 2019-08-29 10:21:17 +08:00
LysandreJik | bf3dc778b8 | Changed learning rate for run_squad test | 2019-08-28 18:24:43 -04:00
thomwolf | 0a74c88ac6 | fix #1131 | 2019-08-28 22:41:42 +02:00
Thomas Wolf | 5f297c7be3 | Merge pull request #1087 from huggingface/fix-warnings (Decode now calls private property instead of public method) | 2019-08-28 22:22:11 +02:00
Thomas Wolf | d9847678b3 | Merge pull request #1136 from adai183/update_SQuAD_script (swap order of optimizer.step() and scheduler.step()) | 2019-08-28 22:00:52 +02:00
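PR #1136, merged above, swaps the two step() calls: since PyTorch 1.1, the scheduler must be stepped after the optimizer, otherwise the first value of the learning-rate schedule is skipped. A minimal, self-contained sketch of the corrected ordering (dummy model and optimizer, not the actual run_squad.py code):

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=3e-5)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

for _ in range(3):
    loss = model(torch.randn(8, 4)).pow(2).mean()
    loss.backward()
    optimizer.step()        # update the weights first...
    scheduler.step()        # ...then advance the learning-rate schedule
    optimizer.zero_grad()
```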
Thomas Wolf | 0f8ad89206 | Merge pull request #1135 from stefan-it/master (distilbert: fix number of hidden_size) | 2019-08-28 22:00:12 +02:00