
15 lines · 500 B · Bash · Executable File
#!/usr/bin/env bash
export PYTHONPATH="../":"${PYTHONPATH}"

# From appendix C of the PEGASUS paper: https://arxiv.org/abs/1912.08777
# Set --gradient_accumulation_steps so that the effective batch size is 256 (2*128, 4*64, 8*32, 16*16).
python finetune.py \
    --learning_rate=1e-4 \
    --do_train \
    --do_predict \
    --n_val 1000 \
    --val_check_interval 0.25 \
    --max_source_length 512 --max_target_length 56 \
    --freeze_embeds --label_smoothing 0.1 --adafactor --task summarization_xsum \
    "$@"
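Everything after the script name is forwarded to finetune.py via "$@", so the model, data paths, and the batch-size/accumulation split are supplied at invocation time. Below is a minimal sketch of such a call; the script filename, model name, directories, and the 2*128 split (one way to reach the effective batch size of 256 noted in the comment above) are illustrative assumptions, not part of the original script.

    # Hypothetical invocation: filename, paths, and batch settings are
    # assumptions; only the flags baked into the script above are fixed.
    ./finetune_pegasus_xsum.sh \
        --model_name_or_path google/pegasus-large \
        --data_dir ./xsum \
        --output_dir ./pegasus_xsum_output \
        --train_batch_size 2 \
        --gradient_accumulation_steps 128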