# Transformers Notebooks
Here you can find a list of the official notebooks provided by Hugging Face.

We would also like to list interesting content created by the community here. If you wrote a notebook leveraging transformers and would like it to be listed, please open a Pull Request so it can be included under the Community notebooks.
## Hugging Face's notebooks 🤗
| Notebook | Description |
|---|---|
| Getting Started Tokenizers | How to train and use your very own tokenizer |
| Getting Started Transformers | How to easily start using transformers |
| How to use Pipelines | A simple and efficient way to use state-of-the-art models on downstream tasks through transformers (see the sketch after this table) |
| How to train a language model | Highlights all the steps to effectively train a Transformer model on custom data |
| How to generate text | How to use different decoding methods for language generation with transformers |
| How to export model to ONNX | Highlights how to export and run inference workloads through ONNX |
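
As a quick taste of the pipelines covered above, here is a minimal sketch. It assumes transformers is installed; the input sentence is just an illustrative example, and with no model specified, a default pretrained checkpoint is downloaded on first use.

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; with no model argument,
# a default pretrained model is downloaded on first use.
classifier = pipeline("sentiment-analysis")

# Run inference on a single sentence.
result = classifier("Transformers notebooks make it easy to get started.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same one-liner pattern works for other tasks such as `"question-answering"` or `"ner"`; the pipelines notebook walks through these in detail.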
## Community notebooks:
| Notebook | Description | Author |
|---|---|---|
| Train T5 on TPU | How to train T5 on SQuAD with Transformers and nlp | Suraj Patil |
| Fine-tune T5 for Classification and Multiple Choice | How to fine-tune T5 for classification and multiple choice tasks using a text-to-text format with PyTorch Lightning (see the text-to-text sketch after this table) | Suraj Patil |
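
To illustrate the text-to-text format the T5 notebooks above rely on, here is a minimal inference sketch. The `t5-small` checkpoint and the translation prompt are illustrative choices only, not necessarily what the notebooks use.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load a small pretrained T5 checkpoint (illustrative choice).
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 casts every task as text-to-text: the task is named in the
# input string, and the answer is decoded from the generated output.
input_ids = tokenizer.encode(
    "translate English to German: The notebook is short.",
    return_tensors="pt",
)
output_ids = model.generate(input_ids, max_length=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Classification in this format works the same way: the label is simply generated as text, which is what makes a single seq2seq setup cover both tasks in the notebook above.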