Summarization example
This script shows an example of training a summarization model with the 🤗 Transformers library. For straightforward use cases you may be able to use these scripts without modification, although we have also included comments in the code to indicate areas that you may need to adapt to your own projects.
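At a high level, the script follows the standard Transformers fine-tuning flow: load a dataset, tokenize inputs and target summaries, then train. The sketch below is illustrative rather than the script's exact code; it assumes the same model and dataset used in the example command further down, and the max_length values are placeholder choices.

from datasets import load_dataset
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Same model/dataset pair as in the example command below
dataset = load_dataset("cnn_dailymail", "3.0.0")
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = TFAutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")

def preprocess(examples):
    # Tokenize articles as inputs and highlights (reference summaries) as labels
    model_inputs = tokenizer(examples["article"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=examples["highlights"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True)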
Multi-GPU and TPU usage
By default, these scripts use a MirroredStrategy and will use multiple GPUs effectively if they are available. TPUs can also be used by passing the name of the TPU resource with the --tpu argument.
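As a rough sketch of how this strategy selection typically looks in TensorFlow (the function name here is illustrative, not the script's actual code):

import tensorflow as tf

def create_strategy(tpu_name=None):
    # If a TPU resource name was passed (e.g. via --tpu), connect to it
    if tpu_name is not None:
        resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu=tpu_name)
        tf.config.experimental_connect_to_cluster(resolver)
        tf.tpu.experimental.initialize_tpu_system(resolver)
        return tf.distribute.TPUStrategy(resolver)
    # Otherwise MirroredStrategy replicates the model across all visible GPUs
    return tf.distribute.MirroredStrategy()

strategy = create_strategy()
with strategy.scope():
    # Model creation and compilation should happen inside the strategy scope
    ...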
Example command
python run_summarization.py \
--model_name_or_path facebook/bart-base \
--dataset_name cnn_dailymail \
--dataset_config "3.0.0" \
--output_dir /tmp/tst-summarization \
--per_device_train_batch_size 8 \
--per_device_eval_batch_size 16 \
--num_train_epochs 3 \
--do_train \
--do_eval
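Once training finishes, the model saved in --output_dir can be reloaded for inference. A minimal sketch, assuming the script saved the model and tokenizer in the standard Transformers format (the exact save layout depends on the script, and the article text is just a placeholder):

from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("/tmp/tst-summarization")
model = TFAutoModelForSeq2SeqLM.from_pretrained("/tmp/tst-summarization")

article = "Your article text here."
inputs = tokenizer(article, return_tensors="tf", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))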