Stefan Schweter | 64a971a915 | auto: add XLM-RoBERTa to auto tokenization | 2019-12-18 18:24:32 +01:00
Stefan Schweter | 036831e279 | auto: add XLM-RoBERTa to auto modeling | 2019-12-18 18:23:42 +01:00
Stefan Schweter | 41a13a6375 | auto: add XLM-RoBERTa to auto configuration | 2019-12-18 18:20:27 +01:00
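
The three commits above register XLM-RoBERTa with the Auto classes. As a minimal sketch (not taken from these commits), assuming the standard AutoConfig/AutoTokenizer/AutoModel entry points and the public xlm-roberta-base checkpoint, the registration lets users load the model without naming its concrete classes:

    # Minimal sketch, illustrative only: the checkpoint name "xlm-roberta-base"
    # and the resolved class names are assumptions, not taken from the commits above.
    from transformers import AutoConfig, AutoModel, AutoTokenizer

    # With XLM-RoBERTa registered in the auto mappings, the generic entry points
    # should resolve to XLMRobertaConfig / XLMRobertaTokenizer / XLMRobertaModel.
    config = AutoConfig.from_pretrained("xlm-roberta-base")
    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = AutoModel.from_pretrained("xlm-roberta-base")

    input_ids = tokenizer.encode("Hello world!", return_tensors="pt")
    outputs = model(input_ids)  # hidden states (plus extras, depending on the config)
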
Morgan Funtowicz | 0c88c856d5 | Unnest QuestionAnsweringArgumentHandler | 2019-12-18 18:18:16 +01:00
Lysandre | 8efc6dd544 | fix #2214 | 2019-12-18 10:47:59 -05:00
Gunnlaugur Thor Briem | a2978465a2 | Merge branch 'master' into patch-1 | 2019-12-18 14:54:46 +00:00
Stefan Schweter | 01b68be34f | converter: remove XLM-RoBERTa specific script (can be done with the script for RoBERTa now) | 2019-12-18 12:24:46 +01:00
thomwolf | 3d2096f516 | further cleanup | 2019-12-18 11:50:54 +01:00
Stefan Schweter | ca31abc6d6 | tokenization: *align* fairseq and spm vocab to fix some tokenization errors | 2019-12-18 11:36:54 +01:00
thomwolf | 8e5587fb79 | few fixes on sampling | 2019-12-18 11:32:37 +01:00
Stefan Schweter | cce3089b65 | Merge remote-tracking branch 'upstream/master' into xlmr | 2019-12-18 11:05:16 +01:00
thomwolf | 641a8decdc | clean up code and add arbitrary number of return sequences | 2019-12-18 10:43:48 +01:00
Morgan Funtowicz | e347725d8c | More fine-grained control over pipeline creation with config argument. | 2019-12-18 10:41:24 +01:00
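
The pipeline commit above exposes a config argument when building pipelines. A hedged sketch of how such an argument can be used, assuming the pipeline() factory accepts config as described and using the public distilbert-base-cased-distilled-squad checkpoint as an illustrative model; the exact signature at this commit may differ from later releases:

    # Sketch only: the checkpoint name is an illustrative assumption, and the
    # pipeline() signature at this commit may differ from later releases.
    from transformers import AutoConfig, pipeline

    # Load a configuration explicitly; its attributes could be adjusted here
    # before the pipeline is built, which is the "fine-grained control" the
    # commit message refers to.
    config = AutoConfig.from_pretrained("distilbert-base-cased-distilled-squad")

    nlp = pipeline(
        "question-answering",
        model="distilbert-base-cased-distilled-squad",
        config=config,  # pass the explicit config instead of relying on the default
    )

    print(nlp(question="Where is the Eiffel Tower?", context="The Eiffel Tower is in Paris."))
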
Julien Chaumond | 94c99db34c | [FinBERT] fix incorrect url | 2019-12-17 20:35:25 -05:00
Julien Chaumond | 7ffa817390 | [s3] mv files and update links | 2019-12-17 20:35:25 -05:00
Antti Virtanen | c5f35e61db | Uploaded files to AWS. | 2019-12-17 20:35:25 -05:00
Antti Virtanen | abc43ffbff | Add pretrained model documentation for FinBERT. | 2019-12-17 20:35:25 -05:00
Antti Virtanen | 8ac840ff87 | Adding Finnish BERT. | 2019-12-17 20:35:25 -05:00
Julien Chaumond | a0d386455b | Fix outdated tokenizer doc | 2019-12-17 20:07:39 -05:00
Julien Chaumond | ea636440d1 | [roberta.conversion] Do not hardcode vocab size; add support for fairseq 0.9+ | 2019-12-17 18:12:22 -05:00
Arman Cohan | a4df2e0113 | update roberta conversion: fix conversion for the updated fairseq model; create save directory if it does not exist | 2019-12-17 18:12:22 -05:00
thomwolf | 77d397202b | clean up dead code | 2019-12-17 23:28:46 +01:00
thomwolf | bbc0c86f9b | beam search + single beam decoding | 2019-12-17 23:27:02 +01:00
Lysandre | 5e289f69bc | regex 2019.12.17 install fails with Python 2 | 2019-12-17 15:54:05 -05:00
Lysandre | 2cff4bd8f3 | Fix segmentation fault | 2019-12-17 15:54:05 -05:00
Julien Chaumond | 55397dfb9b | CsvPipelineDataFormat: Fix for single-column | 2019-12-17 13:10:51 -05:00
thomwolf | b6938916ac | adding beam search | 2019-12-17 17:23:36 +01:00
Gunnlaugur Thor Briem | d303f84e7b | fix: wrong architecture count in README. Just say “the following” so that this intro doesn't so easily fall out of date :) | 2019-12-17 16:18:00 +00:00
Morgan Funtowicz | 2fde5a2489 | Initial bunch of documentation. | 2019-12-17 12:16:07 +01:00
thomwolf | 2f1c745cde | update conversion script | 2019-12-17 11:47:54 +01:00
thomwolf | 83bc5235cf | Merge branch 'master' into pr/2189 | 2019-12-17 11:47:32 +01:00
Morgan Funtowicz | d7c62661a3 | Provide serving dependencies for tensorflow and pytorch (serving-tf, serving-torch) | 2019-12-17 11:23:39 +01:00
Stefan Schweter | f349826a57 | model: fix cls and sep token for XLM-RoBERTa documentation | 2019-12-17 10:36:04 +01:00
Thomas Wolf | f061606277 | Merge pull request #2164 from huggingface/cleanup-configs: [SMALL BREAKING CHANGE] Cleaning up configuration classes - Adding Model Cards | 2019-12-17 09:10:16 +01:00
erenup | 805c21aeba | tried to fix the failed checks | 2019-12-17 11:36:00 +08:00
erenup | d000195ee6 | add comment for example_index and unique_id in single process | 2019-12-17 11:28:34 +08:00
erenup | 3c6efd0ca3 | updated usage example in modeling_roberta for question answering | 2019-12-17 11:18:12 +08:00
Julien Chaumond | 3f5ccb183e | [doc] Clarify uploads (cf. 855ff0e91d, commitcomment-36452545) | 2019-12-16 18:20:29 -05:00
thomwolf | 3cb51299c3 | Fix #2109 | 2019-12-16 16:58:44 -05:00
Lysandre | 18a879f475 | fix #2180 | 2019-12-16 16:44:29 -05:00
Lysandre | d803409215 | Fix run squad evaluate during training | 2019-12-16 16:31:38 -05:00
thomwolf | a468870fd2 | refactoring generation | 2019-12-16 22:22:30 +01:00
Julien Chaumond | 855ff0e91d | [doc] Model upload and sharing (ping @lysandrejik @thomwolf: is this clear enough? Anything we should add?) | 2019-12-16 12:42:22 -05:00
Stefan Schweter | d064009b72 | converter: fix vocab size | 2019-12-16 17:23:25 +01:00
Stefan Schweter | a701a0cee1 | configuration: fix model name for large XLM-RoBERTa model | 2019-12-16 17:17:56 +01:00
Stefan Schweter | 59a1aefb1c | tokenization: add support for new XLM-RoBERTa model. Add wrapper around fairseq tokenization logic | 2019-12-16 17:00:55 +01:00
Stefan Schweter | 69f4f058fa | model: add support for new XLM-RoBERTa model | 2019-12-16 17:00:12 +01:00
Stefan Schweter | a648ff738c | configuration: add support for XLM-RoBERTa model | 2019-12-16 16:47:39 +01:00
Stefan Schweter | 9ed09cb4a3 | converter: add conversion script for original XLM-RoBERTa weights to Transformers-compatible weights | 2019-12-16 16:46:58 +01:00
Stefan Schweter | d3549b66af | module: add support for XLM-RoBERTa (__init__) | 2019-12-16 16:38:39 +01:00