mirror of
https://github.com/huggingface/transformers.git
synced 2025-08-01 02:31:11 +06:00
* save total_vocab_size = vocab_size + user-added tokens to speed up the operation
* update length when added_tokens_decoder is set
* add test for len(tokenizer)
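The change described above can be sketched as follows. This is a minimal, hypothetical illustration (a toy class, not the actual `transformers` implementation): the total vocabulary size is cached as `total_vocab_size` and refreshed whenever `added_tokens_decoder` is assigned, so `len(tokenizer)` becomes an O(1) lookup instead of recomputing the sum on every call.

```python
class ToyTokenizer:
    """Hypothetical tokenizer sketch caching the total vocab size."""

    def __init__(self, vocab_size):
        self.vocab_size = vocab_size        # size of the base vocabulary
        self._added_tokens_decoder = {}     # id -> user-added token string
        # cache: base vocab plus user-added tokens
        self.total_vocab_size = vocab_size

    @property
    def added_tokens_decoder(self):
        return self._added_tokens_decoder

    @added_tokens_decoder.setter
    def added_tokens_decoder(self, value):
        self._added_tokens_decoder = dict(value)
        # keep the cached total in sync when added tokens change
        self._update_total_vocab_size()

    def _update_total_vocab_size(self):
        self.total_vocab_size = self.vocab_size + len(self._added_tokens_decoder)

    def __len__(self):
        # O(1): return the cached total instead of recomputing it
        return self.total_vocab_size


tok = ToyTokenizer(vocab_size=100)
tok.added_tokens_decoder = {100: "<pad>", 101: "<cls>"}
print(len(tok))  # 102
```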
__init__.py
test_tokenization_fast.py
test_tokenization_utils.py