# Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

The projects currently included are:

- `adversarial`
- `bert-loses-patience`
- `bertabs`
- `bertology`
- `deebert`
- `distillation`
- `longform-qa`
- `lxmert`
- `mlm_wwm`
- `mm-imdb`
- `movement-pruning`
- `performer`
- `pplm`
- `rag`
- `seq2seq-distillation`
- `zero-shot-distillation`
To use any of them, just run the command

```bash
pip install -r requirements.txt
```

inside the folder of your choice.
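For example, a minimal setup for the `distillation` project might look like this (the same steps apply to any folder listed above; the path assumes the standard `examples/research_projects` layout of the repository):

```bash
# Clone the repository and move into the project folder of your choice
git clone https://github.com/huggingface/transformers.git
cd transformers/examples/research_projects/distillation

# Install the exact dependency versions this project was written against
pip install -r requirements.txt
```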
If you need help with any of those, contact the author(s), indicated at the top of the README
of each folder.