Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-30 17:52:35 +06:00)
Research projects
This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.
To use any of them, run `pip install -r requirements.txt` inside the folder of your choice.
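Since each project pins its own 🤗 Transformers version in its `requirements.txt`, it can be useful to check which version a folder pins before installing. A minimal sketch of such a check — `pinned_transformers_version` is a hypothetical helper, not part of the library, and the sample file contents are illustrative only:

```python
import tempfile
from pathlib import Path

def pinned_transformers_version(requirements_path):
    """Return the version spec pinned for transformers, or None if unpinned."""
    for line in Path(requirements_path).read_text().splitlines():
        entry = line.strip()
        if entry.startswith("transformers"):
            # e.g. "transformers==3.5.1" -> "3.5.1"
            for op in ("==", ">=", "<="):
                if op in entry:
                    return entry.split(op, 1)[1].strip()
    return None

# Usage with a sample requirements file (version numbers are made up)
with tempfile.TemporaryDirectory() as d:
    req = Path(d) / "requirements.txt"
    req.write_text("torch>=1.3\ntransformers==3.5.1\n")
    print(pinned_transformers_version(req))  # -> 3.5.1
```

Installing into a fresh virtual environment per project avoids the pinned version clobbering a newer Transformers install elsewhere.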
If you need help with any of these, contact the author(s) indicated at the top of each folder's README.