Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

The projects currently in this folder are:

- adversarial
- bert-loses-patience
- bertabs
- bertology
- codeparrot
- decision_transformer
- deebert
- distillation
- fsner
- jax-projects
- longform-qa
- luke
- lxmert
- mlm_wwm
- mm-imdb
- movement-pruning
- onnx/summarization
- performer
- pplm
- quantization-qdqbert
- rag
- rag-end2end-retriever
- robust-speech-event
- self-training-text-classification
- seq2seq-distillation
- tapex
- visual_bert
- wav2vec2
- xtreme-s
- zero-shot-distillation

To use any of them, just run the command
pip install -r requirements.txt
inside the folder of your choice.
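
As a minimal sketch of that workflow (the `wav2vec2` folder is picked purely as an example; any project listed above works the same way), the setup typically looks like this:

```bash
# Clone the repository and enter the project of your choice
# (wav2vec2 is only an illustrative pick).
git clone https://github.com/huggingface/transformers.git
cd transformers/examples/research_projects/wav2vec2

# Each project's requirements file pins its own dependencies,
# including the specific 🤗 Transformers version it was written against.
pip install -r requirements.txt
```

Installing inside a fresh virtual environment per project is a sensible precaution, since the pinned 🤗 Transformers version may conflict with one already installed system-wide.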
If you need help with any of these projects, contact the author(s) indicated at the top of each folder's README.