transformers/examples
dhanajitb f872eb98c2
making unconditional generation work
Unconditional generation now works, but if the seed is fixed, the sample is the same on every run.
Setting n_samples > 1 will still give different samples within a single run.
I am using '<|endoftext|>' as the start token for unconditional generation.
2019-03-28 22:46:15 +05:30
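The idea in this commit can be illustrated outside of run_gpt2.py. Below is a minimal sketch, assuming the current Hugging Face transformers API (GPT2LMHeadModel/GPT2Tokenizer) rather than the original pytorch-pretrained-BERT script and its CLI flags: the context is seeded with only the '<|endoftext|>' token, a fixed seed makes the whole run reproducible, and num_return_sequences > 1 (the analogue of n_samples here) still produces distinct samples within that run.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer  # assumption: modern transformers API

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Fixing the seed makes each run reproduce the same set of samples.
torch.manual_seed(42)

# Unconditional generation: the context is just the '<|endoftext|>' start token.
context = tokenizer.encode("<|endoftext|>", return_tensors="pt")

# Multiple return sequences are sampled independently, so they differ
# from each other even though the run as a whole is reproducible.
outputs = model.generate(
    context,
    do_sample=True,
    max_length=60,
    top_k=40,
    num_return_sequences=3,
    pad_token_id=tokenizer.eos_token_id,
)

for i, out in enumerate(outputs):
    print(f"=== sample {i} ===")
    print(tokenizer.decode(out[context.shape[1]:], skip_special_tokens=True))
```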
lm_finetuning typos 2019-03-27 11:54:59 +01:00
extract_features.py fix typo - logger info 2019-03-06 10:05:21 +01:00
run_classifier.py Merge branch 'master' of https://github.com/ananyahjha93/pytorch-pretrained-BERT 2019-03-17 08:30:13 -04:00
run_gpt2.py making unconditional generation work 2019-03-28 22:46:15 +05:30
run_openai_gpt.py fix openai gpt example and updating readme 2019-03-06 11:43:21 +01:00
run_squad.py typo in annotation 2019-03-14 17:32:15 +08:00
run_swag.py add tqdm to the process of eval 2019-03-21 20:59:33 +08:00
run_transfo_xl.py fix typo - logger info 2019-03-06 10:05:21 +01:00