Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

To use any of them, just run the command

pip install -r requirements.txt

inside the folder of your choice.
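
For example, a minimal setup for one of these projects (using the bertology folder purely as an illustration, with an optional virtual environment to keep its pinned dependencies isolated) might look like:

# pick any project folder; bertology is only an example
cd examples/research_projects/bertology

# optional: isolate the pinned dependencies in a virtual environment
python -m venv .venv
source .venv/bin/activate

# install the exact versions this project was written against
pip install -r requirements.txt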

If you need help with any of these projects, contact the author(s) indicated at the top of the README in the corresponding folder.