Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

The projects currently in this folder are:

- adversarial
- bert-loses-patience
- bertabs
- bertology
- deebert
- distillation
- longform-qa
- lxmert
- mlm_wwm
- mm-imdb
- movement-pruning
- performer
- pplm
- rag
- rag-end2end-retriever
- seq2seq-distillation
- wav2vec2
- zero-shot-distillation
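Each of these folders pins its own library version in its `requirements.txt`. As a purely hypothetical illustration (the actual file and versions differ for every project, so always check the folder itself), such a file might look like:

```text
# Hypothetical requirements.txt of a research project -- versions shown are illustrative only
transformers==4.5.1
torch>=1.6.0
datasets>=1.5.0
```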
To use any of them, run `pip install -r requirements.txt` inside the folder of your choice.
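For example, a typical setup might look like the following sketch. The `wav2vec2` folder is used only as an example, the paths assume the repository's standard `examples/research_projects/<project>` layout, and the virtual environment step is optional but helps keep each project's pinned versions isolated:

```bash
# From the root of the transformers repository; wav2vec2 is just one of the folders listed above.
cd examples/research_projects/wav2vec2

# Optional: create a dedicated environment so the pinned versions don't clash with other installs.
python -m venv .venv
source .venv/bin/activate

# Install the exact dependency versions this project was written against.
pip install -r requirements.txt
```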
If you need help with any of them, contact the author(s) indicated at the top of each folder's README.