# Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

The projects currently included are:

- adversarial
- bert-loses-patience
- bertabs
- bertology
- codeparrot
- decision_transformer
- deebert
- distillation
- fsner
- information-gain-filtration
- jax-projects
- layoutlmv3
- longform-qa
- luke
- lxmert
- mlm_wwm
- mm-imdb
- movement-pruning
- onnx/summarization
- performer
- pplm
- quantization-qdqbert
- rag
- rag-end2end-retriever
- robust-speech-event
- self-training-text-classification
- seq2seq-distillation
- tapex
- visual_bert
- wav2vec2
- xtreme-s
- zero-shot-distillation
To use any of them, run

```bash
pip install -r requirements.txt
```

inside the folder of your choice.
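For example, a full setup for one of these projects might look like the sketch below. It assumes the layout of the `huggingface/transformers` repository (the research projects live under `examples/research_projects/`) and uses the `wav2vec2` folder purely as an illustration; any other folder listed above works the same way.

```bash
# Clone the library and move into the research project of your choice
git clone https://github.com/huggingface/transformers.git
cd transformers/examples/research_projects/wav2vec2

# A dedicated virtual environment is a good idea, since the pinned
# 🤗 Transformers version may conflict with one already installed
python -m venv .venv
source .venv/bin/activate

# Install the project's dependencies, including the specific
# 🤗 Transformers version it was written against
pip install -r requirements.txt
```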
If you need help with any of these projects, contact the author(s) indicated at the top of each folder's README.