Commit Graph

1212 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| wangfei | 6ec1ee9ec2 | Fix examples in docstring | 2019-08-06 11:32:54 +08:00 |
| wangfei | 72622926e5 | Fix examples in docstring | 2019-08-06 11:32:41 +08:00 |
| wangfei | f889e77b9c | Fix examples of loading pretrained models in docstring | 2019-08-06 11:30:35 +08:00 |
| wangfei | beb03ec6c5 | Fix examples of loading pretrained models in docstring | 2019-08-06 11:24:46 +08:00 |
| Thomas Wolf | 4fc9f9ef54 | Merge pull request #910 from huggingface/auto_models: Adding AutoTokenizer and AutoModel classes that automatically detect architecture - Clean up tokenizers | 2019-08-05 19:17:47 +02:00 |
| Thomas Wolf | d43dc48b34 | Merge branch 'master' into auto_models | 2019-08-05 19:17:35 +02:00 |
| thomwolf | 0b524b0848 | remove derived classes for now | 2019-08-05 19:08:19 +02:00 |
| thomwolf | 13936a9621 | update doc and tests | 2019-08-05 18:48:16 +02:00 |
| thomwolf | ed4e542260 | adding tests | 2019-08-05 18:14:07 +02:00 |
| thomwolf | 3a126e73dd | fix #950 | 2019-08-05 17:26:29 +02:00 |
| thomwolf | 7223886dc9 | fix #944 | 2019-08-05 17:16:56 +02:00 |
| thomwolf | 70c10caa06 | add option mentioned in #940 | 2019-08-05 17:09:37 +02:00 |
| thomwolf | 077ad693e9 | tweak issue templates wordings | 2019-08-05 16:46:29 +02:00 |
| thomwolf | 02d4087cb8 | Merge branch 'master' of https://github.com/huggingface/pytorch-pretrained-BERT | 2019-08-05 16:26:01 +02:00 |
| thomwolf | 7c524d631e | add issue templates | 2019-08-05 16:25:54 +02:00 |
| Lysandre Debut | 6f05ad72b4 | Merge pull request #791 from huggingface/doc: RestructuredText table for pretrained models. | 2019-08-05 10:18:00 -04:00 |
| thomwolf | b90e29d52c | working on automodels | 2019-08-05 16:06:34 +02:00 |
| thomwolf | 58830807d1 | inidicate we only support pytorch 1.0.0+ now | 2019-08-05 14:38:59 +02:00 |
| thomwolf | 328afb7097 | cleaning up tokenizer tests structure (at last) - last remaining ppb refs | 2019-08-05 14:08:56 +02:00 |
| Thomas Wolf | 0e918707dc | Merge pull request #907 from dhpollack/fix_convert_to_tf: Fix convert to tf | 2019-08-05 12:55:04 +02:00 |
| thomwolf | 00132b7a7a | updating docs - adding few tests to tokenizers | 2019-08-04 22:42:55 +02:00 |
| thomwolf | 009273dbdd | big doc update [WIP] | 2019-08-04 12:14:57 +02:00 |
| Julien Chaumond | 44dd941efb | link to swift-coreml-transformers | 2019-08-01 09:50:30 -04:00 |
| Anthony MOI | f2a3eb987e | Fix small typos | 2019-07-31 11:05:06 -04:00 |
| Pierric Cistac | 97091acb8c | Small spelling fix | 2019-07-31 10:37:56 -04:00 |
| Grégory Châtel | 769bb643ce | Fixing a broken link. | 2019-07-31 10:22:41 -04:00 |
| David Pollack | c90119e543 | spelling mistake | 2019-07-29 16:56:02 +02:00 |
| thomwolf | bfbe52ec39 | cleaning up example docstrings | 2019-07-27 20:25:39 +02:00 |
| thomwolf | 4cc1bf81ee | typos | 2019-07-27 12:08:21 +02:00 |
| thomwolf | ac27548b25 | fix unk_token test | 2019-07-27 11:50:47 +02:00 |
| thomwolf | c717d38573 | dictionnary => dictionary | 2019-07-26 23:30:48 +02:00 |
| Thomas Wolf | 6b763d04a9 | Merge pull request #911 from huggingface/small_fixes: Small fixes | 2019-07-26 21:36:21 +02:00 |
| thomwolf | 7b6e474c9a | fix #901 | 2019-07-26 21:26:44 +02:00 |
| thomwolf | 632d711411 | fix #908 | 2019-07-26 21:14:37 +02:00 |
| Thomas Wolf | c054b5ee64 | Merge pull request #896 from zijunsun/master: fix multi-gpu training bug when using fp16 | 2019-07-26 19:31:02 +02:00 |
| thomwolf | 27b0f86d36 | clean up pretrained | 2019-07-26 17:09:21 +02:00 |
| thomwolf | 57e54ec070 | add unk_token to gpt2 | 2019-07-26 17:09:07 +02:00 |
| thomwolf | ac42049c08 | add auto models and auto tokenizer | 2019-07-26 17:08:59 +02:00 |
| David Pollack | 09ecf225e9 | fixed the fix. tf session madness. | 2019-07-26 15:20:44 +02:00 |
| David Pollack | edfd965ac8 | fix convert_to_tf | 2019-07-26 14:13:46 +02:00 |
| zijunsun | f0aeb7a814 | multi-gpu training also should be after apex fp16 (squad) | 2019-07-26 15:23:29 +08:00 |
| Thomas Wolf | 46cc9dd2b5 | Merge pull request #899 from sukuya/master: Fixed import to use torchscript flag. | 2019-07-25 15:03:21 +02:00 |
| Thomas Wolf | 6219ad7216 | Merge pull request #888 from rococode/patch-1: Update docs for parameter rename | 2019-07-25 15:01:22 +02:00 |
| Thomas Wolf | 0b6122e96a | Merge pull request #882 from Liangtaiwan/squad_v1_bug: fix squad v1 error (na_prob_file should be None) | 2019-07-25 14:59:59 +02:00 |
| Thomas Wolf | c244562cae | Merge pull request #893 from joelgrus/patch-2: make save_pretrained do the right thing with added tokens | 2019-07-25 14:58:48 +02:00 |
| Sukuya | e1e2ab3482 | Merge pull request #1 from sukuya/sukuya-patch-1: Update torchscript.rst | 2019-07-25 16:53:11 +08:00 |
| Sukuya | 35c52f2f3c | Update torchscript.rst (import fixed to pytorch_transformers, else torchscript flag can't be used) | 2019-07-25 16:51:11 +08:00 |
| zijunsun | adb3ef6368 | multi-gpu training also should be after apex fp16 | 2019-07-25 13:09:10 +08:00 |
| Joel Grus | ae152cec09 | make save_pretrained work with added tokens (right now it's dumping the *decoder* when it should be dumping the *encoder*; this fixes that) | 2019-07-24 16:54:48 -07:00 |
| rococo // Ron | 66b15f73f0 | Update docs for parameter rename (OpenAIGPTLMHeadModel now accepts `labels` instead of `lm_labels`) | 2019-07-24 11:27:08 -07:00 |
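
For context on the API work referenced in the log above (the AutoModel / AutoTokenizer classes from #910 and the added-token handling in save_pretrained from #893), here is a minimal usage sketch. It assumes the pytorch-transformers package as of these commits; the checkpoint name, the added token, and the output directory are illustrative choices, not values taken from the commit log.

```python
# Minimal sketch of the APIs touched by the commits above.
# "bert-base-uncased", "<new_token>", and "./saved_model" are illustrative assumptions.
import torch
from pytorch_transformers import AutoModel, AutoTokenizer

# AutoTokenizer / AutoModel (PR #910) pick the architecture from the checkpoint name.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Added tokens should round-trip through save_pretrained() after PR #893.
tokenizer.add_tokens(["<new_token>"])
model.resize_token_embeddings(len(tokenizer))
tokenizer.save_pretrained("./saved_model")
model.save_pretrained("./saved_model")

# Basic forward pass; BERT-style models return a tuple whose first element
# is the sequence of hidden states.
input_ids = torch.tensor([tokenizer.encode("Hello, world!")])
with torch.no_grad():
    hidden_states = model(input_ids)[0]
```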