Commit Graph

19383 Commits

Author SHA1 Message Date
Thomas Wolf
08bd8f9f39
Merge pull request #1505 from e-budur/master
Fixed the sample code in the 'Quick tour' section.
2019-10-15 09:50:36 +02:00
Thomas Wolf
8aa3b753bd
Merge pull request #1434 from bryant1410/patch-1
Remove unnecessary use of FusedLayerNorm in XLNet
2019-10-15 09:44:19 +02:00
Thomas Wolf
621e7a2529
Merge pull request #1275 from stecklin/ner-fine-tuning
Implement fine-tuning BERT on CoNLL-2003 named entity recognition task
2019-10-15 09:35:24 +02:00
thomwolf
c55badcee0 Add NER finetuning details by @stefan-it in example readme 2019-10-15 09:33:52 +02:00
Julien Chaumond
788e632622 [ner] Honor args.overwrite_cache 2019-10-15 09:17:31 +02:00
thomwolf
0f9ebb0b43 add seqeval as requirement for examples 2019-10-15 09:17:31 +02:00
thomwolf
66adb71734 update to transformers 2019-10-15 09:17:31 +02:00
Marianne Stecklina
5ff9cd158a Add option to predict on test set 2019-10-15 09:17:31 +02:00
Marianne Stecklina
7f5367e0b1 Add cli argument for configuring labels 2019-10-15 09:17:31 +02:00
Marianne Stecklina
e1d4179b64 Make file reading more robust 2019-10-15 09:17:31 +02:00
Marianne Stecklina
383ef96747 Implement fine-tuning BERT on CoNLL-2003 named entity recognition task 2019-10-15 09:17:31 +02:00
Marianne Stecklina
5adb39e757 Add option to predict on test set 2019-10-15 09:14:53 +02:00
Marianne Stecklina
99b189df6d Add cli argument for configuring labels 2019-10-15 09:14:53 +02:00
Marianne Stecklina
3e9420add1 Make file reading more robust 2019-10-15 09:14:53 +02:00
Marianne Stecklina
cde42c4354 Implement fine-tuning BERT on CoNLL-2003 named entity recognition task 2019-10-15 09:14:53 +02:00
hlums
74c5035808 Fix token order in xlnet preprocessing. 2019-10-14 21:27:11 +00:00
Rémi Louf
fe25eefc15 add instructions to fetch the dataset 2019-10-14 20:45:39 +02:00
Rémi Louf
412793275d delegate the padding with special tokens to the tokenizer 2019-10-14 20:45:16 +02:00
Rémi Louf
447fffb21f process the raw CNN/Daily Mail dataset
The data provided by Li Dong et al. were already tokenized, which means
they are not compatible with all the models in the library. We therefore
process the raw data directly and tokenize it with the models' own
tokenizers.
2019-10-14 18:12:20 +02:00
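As a rough illustration of the approach described above (a minimal sketch, not the commit's actual code; the model name and the encode_story helper are assumptions), raw stories could be encoded with a model's own tokenizer:

```python
# Minimal sketch (assumed names, not the commit's code): encode a raw,
# untokenized CNN/Daily Mail story with the model's own tokenizer so the
# data stays compatible with any model in the library.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")

def encode_story(raw_story):
    # The tokenizer does its own word-piece tokenization and adds the
    # model's special tokens, so no pre-tokenized input is needed.
    return tokenizer.encode(raw_story, add_special_tokens=True)

token_ids = encode_story("New York (CNN) -- A raw, untokenized story ...")
```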
Thomas Wolf
80889a0226
Merge pull request #1512 from louismartin/fix-roberta-convert
Fix import error in script to convert fairseq roberta checkpoints
2019-10-14 17:40:32 +02:00
Simon Layton
4e6a55751a Force einsum to fp16 2019-10-14 11:12:41 -04:00
Thomas Wolf
f62f992cf7
Merge pull request #1502 from jeffxtang/master
the working example code to use BertForQuestionAnswering
2019-10-14 16:14:52 +02:00
Rémi Louf
67d10960ae load and prepare CNN/Daily Mail data
We write a function to load and preprocess the CNN/Daily Mail dataset as
provided by Li Dong et al. The issue is that this dataset has already
been tokenized by the authors, so we actually need to find the original,
plain-text dataset if we want to apply it to all models.
2019-10-14 14:11:20 +02:00
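A minimal sketch of what such a loading function might look like (the file names, layout, and one-example-per-line format are assumptions, not the commit's code):

```python
# Minimal sketch with an assumed layout: paired plain-text files holding
# one article and one reference summary per line.
def load_examples(article_path, summary_path):
    with open(article_path, encoding="utf-8") as src, \
         open(summary_path, encoding="utf-8") as tgt:
        return [(article.strip(), summary.strip())
                for article, summary in zip(src, tgt)]

# Hypothetical paths; the real dataset files may be organized differently.
examples = load_examples("cnndm/train.source", "cnndm/train.target")
```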
thomwolf
d9d387afce clean up 2019-10-14 12:14:40 +02:00
thomwolf
b7141a1bc6 maxi simplification 2019-10-14 12:14:08 +02:00
thomwolf
bfbe68f035 update forward pass 2019-10-14 12:04:23 +02:00
thomwolf
0ef9bc923a Cleaning up seq2seq [WIP] 2019-10-14 11:58:13 +02:00
Louis MARTIN
49cba6e543 Fix import error in script to convert fairseq roberta checkpoints 2019-10-14 01:38:57 -07:00
JulianPani
0993586758
remove usage of DUMMY_INPUTS
Hey @thomwolf  
This change da26bae61b (diff-8ddce309e88e8eb5b4d02228fd8881daL28-L29) removed the constant, but one usage of that constant remains in the code.
2019-10-14 02:09:53 +03:00
Timothy Liu
376e65a674 Added automatic mixed precision and XLA options to run_tf_glue.py 2019-10-13 13:19:06 +00:00
Timothy Liu
86f23a1944 Minor enhancements to run_tf_glue.py 2019-10-13 10:21:35 +00:00
Emrah Budur
5a8c6e771a Fixed the sample code in the 'Quick tour' section. 2019-10-12 14:17:17 +03:00
jeffxtang
e76d71521c the working example code to use BertForQuestionAnswering and get an answer from a text and a question 2019-10-11 17:04:02 -07:00
VictorSanh
d844db4005 Add citation bibtex 2019-10-11 16:55:42 -04:00
Lysandre
a701c9b321 CTRL to tf automodels 2019-10-11 16:05:30 -04:00
Rémi Louf
b3261e7ace read parameters from CLI, load model & tokenizer 2019-10-11 18:40:38 +02:00
Rémi Louf
d889e0b71b add base for seq2seq finetuning 2019-10-11 17:36:12 +02:00
Rémi Louf
f8e98d6779 load pretrained embeddings in Bert decoder
In Rothe et al.'s "Leveraging Pre-trained Checkpoints for Sequence
Generation Tasks", Bert2Bert is initialized with pre-trained weights for
the encoder, and only pre-trained embeddings for the decoder. The
current version of the code completely randomizes the weights of the
decoder.

We write a custom function to initialize the weights of the decoder: we
first initialize the decoder with the pre-trained weights, then randomize
everything but the embeddings.
2019-10-11 16:48:11 +02:00
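A minimal sketch of the idea (assumptions only, not the commit's implementation; the checkpoint name is arbitrary): keep the pre-trained embeddings and leave the rest of the decoder randomly initialized.

```python
from transformers import BertModel

# Pre-trained checkpoint used here only as a source of embeddings.
pretrained = BertModel.from_pretrained("bert-base-uncased")

# A freshly constructed BertModel starts with randomly initialized weights.
decoder = BertModel(pretrained.config)

# Copy only the embedding matrices from the pre-trained checkpoint; every
# other decoder parameter keeps its random initialization.
decoder.embeddings.load_state_dict(pretrained.embeddings.state_dict())
```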
Lysandre
3ddce1d74c Release: 2.1.1 2019-10-11 06:37:49 -04:00
Thomas Wolf
4428aefc63
Merge pull request #1488 from huggingface/pytorch-tpu
GLUE on TPU
2019-10-11 16:33:00 +02:00
Thomas Wolf
3b43b01872
Merge pull request #1482 from huggingface/tf2_integration_tests
Integration of TF 2.0 models with other Keras modules
2019-10-11 16:25:43 +02:00
thomwolf
4b8f3e8f32 adding citation 2019-10-11 16:18:16 +02:00
thomwolf
18a3cef7d5 no nans 2019-10-11 16:09:42 +02:00
thomwolf
1f5d9513d8 fix test 2019-10-11 15:55:01 +02:00
thomwolf
0f9fc4fbde adding option to deactivate past/memory outputs 2019-10-11 15:47:08 +02:00
Thomas Wolf
700331b5ec
Merge pull request #1492 from stefan-it/bert-german-dbmdz-models
Add new BERT models for German (cased and uncased)
2019-10-11 13:01:52 +02:00
Thomas Wolf
573dde9b44
Merge pull request #1405 from slayton58/xlnet_layer_reorder
Re-order XLNet attention head outputs for better perf
2019-10-11 12:10:58 +02:00
Stefan Schweter
5f25a5f367 model: add support for new German BERT models (cased and uncased) from @dbmdz 2019-10-11 10:20:33 +02:00
Luran He
f382a8decd convert int to str before adding to a str 2019-10-10 19:20:39 -04:00
Lysandre
639f4b7190 Don't save/load when on TPU 2019-10-10 19:17:25 +00:00