transformers/examples
Latest commit a3a604cefb by jeonsworld: Update pregenerate_training_data.py (2019-06-10 12:17:23 +09:00)
Applies the Whole Word Masking technique, following [create_pretraining_data.py](https://github.com/google-research/bert/blob/master/create_pretraining_data.py).
| Name | Last commit message | Date |
| --- | --- | --- |
| lm_finetuning | Update pregenerate_training_data.py | 2019-06-10 12:17:23 +09:00 |
| extract_features.py | Fixes to the TensorFlow conversion tool | 2019-04-01 13:17:54 -06:00 |
| run_classifier.py | Division to num_train_optimizer of global_step in lr_this_step is removed. | 2019-05-09 10:57:03 +03:00 |
| run_gpt2.py | Fix indentation weirdness in GPT-2 example. | 2019-04-22 02:20:22 +09:00 |
| run_openai_gpt.py | Prepare optimizer only when args.do_train is True | 2019-05-02 19:09:29 +08:00 |
| run_squad.py | Division to num_train_optimizer of global_step in lr_this_step is removed. | 2019-05-09 10:57:03 +03:00 |
| run_swag.py | Division to num_train_optimizer of global_step in lr_this_step is removed. | 2019-05-09 10:57:03 +03:00 |
| run_transfo_xl.py | add serialization semantics to tokenizers - fix transfo-xl tokenizer | 2019-04-15 11:47:25 +02:00 |
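The Whole Word Masking change referenced in the latest commit can be sketched roughly as follows. This is a minimal illustration, not the repository's actual implementation: it assumes WordPiece tokens where continuation pieces start with `##`, and the function name `whole_word_mask` and its parameters are hypothetical. The core idea matches Google's `create_pretraining_data.py`: when a word is selected for masking, all of its sub-tokens are masked together.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words at once: WordPiece continuation pieces ("##...")
    are grouped with the preceding piece, so a selected word is masked
    across all of its sub-tokens, never partially. (Illustrative sketch.)"""
    rng = random.Random(seed)

    # Group token indices into candidate words; skip special tokens.
    cand_indexes = []
    for i, tok in enumerate(tokens):
        if tok in ("[CLS]", "[SEP]"):
            continue
        if cand_indexes and tok.startswith("##"):
            cand_indexes[-1].append(i)   # continuation of the previous word
        else:
            cand_indexes.append([i])     # start of a new word

    rng.shuffle(cand_indexes)
    total = sum(len(word) for word in cand_indexes)
    num_to_mask = max(1, int(round(total * mask_prob)))

    output = list(tokens)
    masked_positions = []
    for word in cand_indexes:
        if len(masked_positions) >= num_to_mask:
            break
        # Skip a word if masking it entirely would exceed the budget.
        if len(masked_positions) + len(word) > num_to_mask:
            continue
        for i in word:                   # mask every sub-token of the word
            output[i] = mask_token
            masked_positions.append(i)
    return output, sorted(masked_positions)
```

Unlike per-token masking, a word tokenized as `phil`, `##har`, `##monic` is either fully masked or left intact, which is the distinction this commit imports from the BERT reference script.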