Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-14 18:18:24 +06:00)
Remove references to Python 2 in documentation.
This commit is contained in:
parent 0dddc1494d, commit 45841eaf7b
@@ -64,7 +64,7 @@ Choose the right framework for every part of a model's lifetime
 
 ## Installation
 
-This repo is tested on Python 2.7 and 3.5+ (examples are tested only on python 3.5+), PyTorch 1.0.0+ and TensorFlow 2.0.0-rc1
+This repo is tested on Python 3.5+, PyTorch 1.0.0+ and TensorFlow 2.0.0-rc1
 
 ### With pip
 
@@ -1,6 +1,6 @@
 
 # Installation
 
-Transformers is tested on Python 2.7 and 3.5+ (examples are tested only on python 3.5+) and PyTorch 1.1.0
+Transformers is tested on Python 3.5+ and PyTorch 1.1.0
 
 ## With pip
@@ -44,7 +44,7 @@ By default, slow tests are skipped. Set the `RUN_SLOW` environment variable to `
 
 ## OpenAI GPT original tokenization workflow
 
-If you want to reproduce the original tokenization process of the `OpenAI GPT` paper, you will need to install `ftfy` (use version 4.4.3 if you are using Python 2) and `SpaCy`:
+If you want to reproduce the original tokenization process of the `OpenAI GPT` paper, you will need to install `ftfy` and `SpaCy`:
 
 ``` bash
 pip install spacy ftfy==4.4.3
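The changes above drop Python 2 support, leaving Python 3.5+ as the tested baseline. As an illustration only (this guard is not part of the commit), a package that makes such a cut can fail fast at import time rather than break later with a Python 2 syntax error:

```python
import sys

# Hypothetical guard reflecting the new requirement: the library is
# tested on Python 3.5+ only, so reject anything older (incl. Python 2).
if sys.version_info < (3, 5):
    raise RuntimeError(
        "This package requires Python 3.5 or later; "
        "found %d.%d" % sys.version_info[:2]
    )

print("Python %d.%d is supported" % sys.version_info[:2])
```

In practice such a check usually lives in `setup.py` (alongside `python_requires=">=3.5"`) or at the top of the package's `__init__.py`.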