transformers/examples/research_projects

Latest commit: Add (M)Luke model training for Token Classification in the examples (#14880), by Julien Plu, 2022-01-31 07:58:18 -05:00

Commit message:

* Add Luke training
* Fix true label tags
* Update the data collator for Luke
* Some training refactor for Luke
* Improve data collator for Luke
* Fix import
* Fix datasets concatenation
* Add the --max_entity_length argument for Luke models
* Remove unused code
* Fix style issues
* Move the Luke training into a separate folder
* Fix style
* Fix naming
* Fix filtering
* Fix filter
* Update some preprocessing
* Move luke to research_projects
* Checkstyle
* Address comments
| Folder | Last commit | Date |
| --- | --- | --- |
| adversarial | Update namespaces inside torch.utils.data to the latest. (#13167) | 2021-08-19 14:29:51 +02:00 |
| bert-loses-patience | remove extra white space from log format (#12360) | 2021-06-25 13:20:14 -07:00 |
| bertabs | make style (#11442) | 2021-04-26 13:50:34 +02:00 |
| bertology | [style] consistent nn. and nn.functional: part 4 examples (#12156) | 2021-06-14 12:28:24 -07:00 |
| codeparrot | fix: switch from slow to generic tokenizer class (#15122) | 2022-01-12 09:12:43 -05:00 |
| deebert | remove extra white space from log format (#12360) | 2021-06-25 13:20:14 -07:00 |
| distillation | Remove n_ctx from configs (#14165) | 2021-10-29 11:50:25 +02:00 |
| fsner | Update FSNER code in examples->research_projects->fsner (#13864) | 2021-10-05 22:47:11 -04:00 |
| jax-projects | [urls to hub] Replace outdated model tags with their now-canonical pipeline types (#14617) | 2021-12-06 04:35:01 -05:00 |
| longform-qa | [style] consistent nn. and nn.functional: part 4 examples (#12156) | 2021-06-14 12:28:24 -07:00 |
| luke | Add (M)Luke model training for Token Classification in the examples (#14880) | 2022-01-31 07:58:18 -05:00 |
| lxmert | Bump numpy from 1.19.2 to 1.21.0 in /examples/research_projects/lxmert (#15369) | 2022-01-27 14:46:15 -05:00 |
| mlm_wwm | [urls to hub] Replace outdated model tags with their now-canonical pipeline types (#14617) | 2021-12-06 04:35:01 -05:00 |
| mm-imdb | remove extra white space from log format (#12360) | 2021-06-25 13:20:14 -07:00 |
| movement-pruning | use functional interface for softmax in attention (#14198) | 2021-11-30 11:47:33 -05:00 |
| onnx/summarization | Move BART + ONNX example to research_projects (#15271) | 2022-01-21 14:47:34 +01:00 |
| performer | [urls to hub] Replace outdated model tags with their now-canonical pipeline types (#14617) | 2021-12-06 04:35:01 -05:00 |
| pplm | Fix execution PATH for PPLM Example (#14287) | 2021-11-06 10:33:47 -04:00 |
| quantization-qdqbert | Add QDQBert model and quantization examples of SQUAD task (#14066) | 2021-11-19 13:33:39 -05:00 |
| rag | minor fixes in original RAG training (#12395) | 2021-06-29 13:39:48 +01:00 |
| rag-end2end-retriever | rm require_version_examples (#12088) | 2021-06-09 11:02:52 -07:00 |
| robust-speech-event | Add a device argument to the eval script (#15371) | 2022-01-27 15:58:55 +01:00 |
| seq2seq-distillation | Update Transformers to huggingface_hub >= 0.1.0 (#14251) | 2021-11-02 18:58:42 -04:00 |
| visual_bert | Bump notebook in /examples/research_projects/visual_bert (#15368) | 2022-01-27 14:45:58 -05:00 |
| wav2vec2 | [Wav2Vec2 Speech Event] Add speech event v2 (#15083) | 2022-01-10 10:46:21 +01:00 |
| zero-shot-distillation | remove extra white space from log format (#12360) | 2021-06-25 13:20:14 -07:00 |
| README.md | Reorganize examples (#9010) | 2020-12-11 10:07:02 -05:00 |

# Research projects

This folder contains various research projects using 🤗 Transformers. These projects are not actively maintained: each one requires the specific version of 🤗 Transformers pinned in its own requirements file, and updating a project to the most recent version of the library will require some work.

To use any of them, run the following command inside the folder of your choice:

```shell
pip install -r requirements.txt
```
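For example, a full setup for one project might look like the following sketch. The folder name (`luke`) and the virtual-environment layout are illustrative choices, not requirements:

```shell
# Enter the project of your choice (here: the luke example).
cd examples/research_projects/luke

# Each project pins its own Transformers version, so an isolated
# virtual environment keeps it from clashing with a globally
# installed copy of the library.
python3 -m venv .venv
source .venv/bin/activate

# Install the project's pinned dependencies.
pip install -r requirements.txt
```

Using one environment per project is the simplest way to satisfy each project's pinned dependency versions side by side.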

If you need help with any of these projects, contact the author(s) indicated at the top of each folder's README.