a1fe4ba9c9  yzy5630           2019-07-18 15:45:23 +08:00  use new API for save and load
a7ba27b1b4  yzy5630           2019-07-18 08:52:51 +08:00  add parser for adam
d6522e2873  yzy5630           2019-07-17 21:22:34 +08:00  change loss and optimizer to new API
60a1bdcdac  yzy5630           2019-07-17 09:16:20 +08:00  fix some errors for distributed lm_finetuning
0bab55d5d5  thomwolf          2019-07-05 11:55:36 +02:00  [BIG] name change
c41f2bad69  thomwolf          2019-07-03 22:54:39 +02:00  WIP XLM + refactoring
5c08c8c273  Oliver Guhr       2019-06-11 13:46:33 +02:00  adds the tokenizer + model config to the output
00c7fd2b79  burcturkoglu      2019-05-09 10:57:03 +03:00  Division to num_train_optimizer of global_step in lr_this_step is removed.
fa37b4da77  burcturkoglu      2019-05-09 10:55:24 +03:00  Merge branch 'master' of https://github.com/huggingface/pytorch-pretrained-BERT
5289b4b9e0  burcturkoglu      2019-05-09 10:51:38 +03:00  Division to num_train_optimizer of global_step in lr_this_step is removed.
74dbba64bc  MottoX            2019-05-02 19:09:29 +08:00  Prepare optimizer only when args.do_train is True
d94c6b0144  thomwolf          2019-04-23 11:17:06 +02:00  fix training schedules in examples to match new API
934d3f4d2f  Matthew Carrigan  2019-03-20 17:23:23 +00:00  Syncing up argument names between the scripts
f19ba35b2b  Matthew Carrigan  2019-03-20 16:47:06 +00:00  Move old finetuning script into the new folder