diff --git a/src/transformers/tokenization_utils_base.py b/src/transformers/tokenization_utils_base.py
index e77ed640932..7dc1ac12445 100644
--- a/src/transformers/tokenization_utils_base.py
+++ b/src/transformers/tokenization_utils_base.py
@@ -911,7 +911,7 @@ class SpecialTokensMixin:
           makes it easy to develop model-agnostic training and fine-tuning scripts.
 
         When possible, special tokens are already registered for provided pretrained models (for instance
-        [`BertTokenizer`] `cls_token` is already registered to be :obj*'[CLS]'* and XLM's one is also registered to be
+        [`BertTokenizer`] `cls_token` is already registered to be `'[CLS]'` and XLM's one is also registered to be
         `'</s>'`).
 
         Args:
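
For context only (not part of the patch), a minimal sketch of the behavior the docstring describes: pretrained tokenizers come with their special tokens already registered, so scripts can read them as attributes instead of hard-coding the literal strings. The checkpoint name here is just an example.

```python
from transformers import BertTokenizer

# The pretrained BERT tokenizer ships with its special tokens pre-registered,
# so model-agnostic code can refer to them by attribute rather than by value.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
print(tokenizer.cls_token)     # '[CLS]'
print(tokenizer.cls_token_id)  # 101
```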