mirror of https://github.com/huggingface/transformers.git, synced 2025-07-27 00:09:00 +06:00
Latest commit:
* save total_vocab_size = vocab_size + user-added tokens to speed up the operation
* update the length when added_tokens_decoder is set
* add a test for len(tokenizer)
Files:
- __init__.py
- test_tokenization_fast.py
- test_tokenization_utils.py
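The commit above caches the combined vocabulary size so that `len(tokenizer)` does not have to recount the base vocabulary plus user-added tokens on every call. Below is a minimal sketch of that caching pattern; the class `SimpleTokenizer` and the helper `_update_total_vocab_size` are hypothetical names for illustration, not the actual transformers implementation.

```python
# Minimal sketch (hypothetical names, not the real transformers code) of caching
# total_vocab_size so that len(tokenizer) is O(1).
class SimpleTokenizer:
    def __init__(self, vocab):
        self._vocab = dict(vocab)            # base vocabulary: token -> id
        self._added_tokens_decoder = {}      # id -> user-added token
        self._update_total_vocab_size()

    @property
    def vocab_size(self):
        return len(self._vocab)

    @property
    def added_tokens_decoder(self):
        return self._added_tokens_decoder

    @added_tokens_decoder.setter
    def added_tokens_decoder(self, value):
        # Refresh the cached total whenever user-added tokens change.
        self._added_tokens_decoder = dict(value)
        self._update_total_vocab_size()

    def _update_total_vocab_size(self):
        # total_vocab_size = vocab_size + number of user-added tokens,
        # as described in the commit message.
        self.total_vocab_size = self.vocab_size + len(self._added_tokens_decoder)

    def __len__(self):
        # Return the cached value instead of rebuilding the merged vocabulary.
        return self.total_vocab_size


# Usage in the spirit of the added test: len(tokenizer) tracks added tokens.
tok = SimpleTokenizer({"hello": 0, "world": 1})
assert len(tok) == 2
tok.added_tokens_decoder = {2: "<new_token>"}
assert len(tok) == 3
```

The key design point is that the cached value is recomputed only when the added-tokens mapping is set, rather than on every length query.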