
Examples

This section gathers a number of examples. Each of them works with several models, taking advantage of the very similar API shared across the different models.

Important: to run the latest versions of the examples, you must install the library from source and install the example-specific requirements. Run the following steps in a new virtual environment:

git clone https://github.com/huggingface/transformers
cd transformers
pip install .
pip install -r ./examples/requirements.txt
| Section | Description |
|---|---|
| TensorFlow 2.0 models on GLUE | Examples running the BERT TensorFlow 2.0 model on the GLUE tasks. |
| Running on TPUs | Examples of running fine-tuning tasks on Google TPUs to accelerate workloads. |
| Language Model training | Fine-tuning (or training from scratch) the library models for language modeling on a text dataset. Causal language modeling for GPT/GPT-2, masked language modeling for BERT/RoBERTa. |
| Language Generation | Conditional text generation using the auto-regressive models of the library: GPT, GPT-2, Transformer-XL and XLNet. |
| GLUE | Examples running BERT/XLM/XLNet/RoBERTa on the 9 GLUE tasks. Examples feature distributed training as well as half-precision. |
| SQuAD | Using BERT/RoBERTa/XLNet/XLM for question answering, with examples of distributed training. |
| Multiple Choice | Examples running BERT/XLNet/RoBERTa on the SWAG/RACE/ARC tasks. |
| Named Entity Recognition | Using BERT for Named Entity Recognition (NER) on the CoNLL-2003 dataset, with examples of distributed training. |
| XNLI | Examples running BERT/XLM on the XNLI benchmark. |
| Adversarial evaluation of model performances | Testing a model with adversarial evaluation of natural language inference on the Heuristic Analysis for NLI Systems (HANS) dataset (McCoy et al., 2019). |
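As a concrete illustration of how these example scripts are launched, the invocation below sketches fine-tuning on one GLUE task from the command line. It is a sketch, not an exact recipe: the data directory `./glue_data/MRPC` and the output path are assumptions, and flag names can change between releases, so check the script's `--help` for the arguments in your checkout.

```shell
# Sketch: fine-tune bert-base-cased on the MRPC task of GLUE.
# Assumes GLUE data has already been downloaded to ./glue_data/MRPC;
# paths and hyperparameters here are illustrative assumptions.
python ./examples/text-classification/run_glue.py \
  --model_name_or_path bert-base-cased \
  --task_name MRPC \
  --data_dir ./glue_data/MRPC \
  --do_train \
  --do_eval \
  --max_seq_length 128 \
  --per_gpu_train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3.0 \
  --output_dir ./output/mrpc
```

The other task directories (question-answering, token-classification, and so on) follow the same pattern: a single entry-point script driven by command-line flags for the model, data location, and training hyperparameters.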