mirror of https://github.com/huggingface/transformers.git
synced 2025-07-31 10:12:23 +06:00

Upload DistilBART artwork (#5394)

parent 09e841490c
commit 331d8d2936
@@ -1,3 +1,5 @@
## Sequence to Sequence
This directory contains examples for finetuning and evaluating transformers on summarization and translation tasks.
Summarization support is more mature than translation support.
Please tag @sshleifer with any issues/unexpected behaviors, or send a PR!
@@ -168,6 +170,7 @@ python run_eval.py sshleifer/distilbart-cnn-12-6 $DATA_DIR/val.source dbart_val_
### DistilBART
[DistilBART artwork]
For the CNN/DailyMail dataset (relatively longer, more extractive summaries), we found a simple technique that works: copy alternating layers from `bart-large-cnn` and finetune further on the same data.
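The layer-copying initialization can be sketched as follows. This is a minimal illustration with toy PyTorch layers, not the actual distillation script: the `copy_alternating_layers` helper and the linear-layer stand-ins are assumptions, and real DistilBART copies full transformer blocks from the teacher's encoder and decoder stacks.

```python
import torch
import torch.nn as nn

def copy_alternating_layers(teacher_layers, student_layers):
    """Initialize each student layer from every other teacher layer.

    With a 12-layer teacher and a 6-layer student, the student receives
    teacher layers 0, 2, 4, 6, 8, 10 (illustrative helper, not the
    library's API).
    """
    step = len(teacher_layers) // len(student_layers)
    for i, student_layer in enumerate(student_layers):
        student_layer.load_state_dict(teacher_layers[i * step].state_dict())

# Toy stand-ins for a 12-layer teacher (e.g. bart-large-cnn's decoder)
# and a 6-layer student; real layers would be transformer blocks.
teacher = nn.ModuleList([nn.Linear(4, 4) for _ in range(12)])
student = nn.ModuleList([nn.Linear(4, 4) for _ in range(6)])

copy_alternating_layers(teacher, student)

# Student layer i now holds a copy of teacher layer 2*i.
assert torch.equal(student[3].weight, teacher[6].weight)
```

After this initialization the student is finetuned on the same summarization data, so it starts close to the teacher rather than from scratch.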