# Transformers Notebooks
You can find here a list of the official notebooks provided by Hugging Face.
Also, we would like to list here interesting content created by the community.
If you have written a notebook leveraging transformers and would like it to be listed here, please open a Pull Request and we will review it for inclusion.
## Hugging Face's notebooks :hugs:
| Notebook | Description | Colab |
|:----------|:-------------:|------:|
| [Getting Started Tokenizers](https://github.com/huggingface/transformers/blob/master/notebooks/01-training-tokenizers.ipynb) | How to train and use your very own tokenizer | [](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/01-training-tokenizers.ipynb) |
| [Getting Started Transformers](https://github.com/huggingface/transformers/blob/master/notebooks/02-transformers.ipynb) | How to easily start using transformers | [](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/02-transformers.ipynb) |
| [How to use Pipelines](https://github.com/huggingface/transformers/blob/master/notebooks/03-pipelines.ipynb) | Simple and efficient way to use State-of-the-Art models on downstream tasks through transformers | [](https://colab.research.google.com/github/huggingface/transformers/blob/master/notebooks/03-pipelines.ipynb) |
| [How to train a language model](https://github.com/huggingface/blog/blob/master/notebooks/01_how_to_train.ipynb) | Highlights all the steps needed to effectively train a Transformer model on custom data | [](https://colab.research.google.com/github/huggingface/blog/blob/master/notebooks/01_how_to_train.ipynb) |
| [How to generate text](https://github.com/huggingface/blog/blob/master/notebooks/02_how_to_generate.ipynb) | How to use different decoding methods for language generation with transformers | [](https://colab.research.google.com/github/huggingface/blog/blob/master/notebooks/02_how_to_generate.ipynb) |
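As a quick taste of what the pipelines notebook covers, here is a minimal sketch; the default sentiment-analysis model is downloaded automatically on first use, so the exact checkpoint may vary with your installed version:

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline with the library's default
# English model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

# The pipeline returns a list of {"label", "score"} dicts, one per input.
result = classifier("We are very happy to show you the Transformers library.")
print(result)
```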
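The "How to generate text" notebook compares decoding strategies through `model.generate()`. A minimal sketch, assuming GPT-2 as the model (the one used in that post); the specific prompt and length settings are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load GPT-2 and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

input_ids = tokenizer("The weather today is", return_tensors="pt").input_ids

# Greedy decoding: deterministic, picks the highest-probability token each step.
greedy = model.generate(input_ids, max_length=20)

# Top-k sampling: draws each token from the 50 most likely candidates.
sampled = model.generate(input_ids, max_length=20, do_sample=True, top_k=50)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))
print(tokenizer.decode(sampled[0], skip_special_tokens=True))
```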