Mirror of https://github.com/huggingface/transformers.git
Synced 2025-07-03 21:00:08 +06:00
Latest commit:
* save total_vocab_size = vocab_size + user-added tokens to speed up len(tokenizer)
* update the cached length when added_tokens_decoder is set
* add a test for len(tokenizer)
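The commit caches the combined vocabulary size so that len(tokenizer) returns a stored value instead of recomputing the base vocabulary plus added tokens on every call, and refreshes that cache whenever added_tokens_decoder changes. Below is a minimal sketch of that caching pattern; the class and helper names (ToyTokenizer, _update_total_vocab_size, set_added_tokens_decoder) are illustrative assumptions, not the verbatim transformers implementation.

```python
class ToyTokenizer:
    def __init__(self, vocab):
        self.vocab = vocab              # base vocabulary: token -> id
        self.added_tokens_decoder = {}  # id -> user-added token
        self._update_total_vocab_size()

    def _update_total_vocab_size(self):
        # Cache base vocab size plus user-added tokens so __len__ is O(1).
        self.total_vocab_size = len(self.vocab) + len(self.added_tokens_decoder)

    def add_tokens(self, tokens):
        for token in tokens:
            if token not in self.vocab and token not in self.added_tokens_decoder.values():
                new_id = len(self.vocab) + len(self.added_tokens_decoder)
                self.added_tokens_decoder[new_id] = token
        # Refresh the cache whenever the added tokens change.
        self._update_total_vocab_size()

    def set_added_tokens_decoder(self, decoder):
        # Per the commit message, the cached length is also updated
        # when added_tokens_decoder is assigned directly.
        self.added_tokens_decoder = dict(decoder)
        self._update_total_vocab_size()

    def __len__(self):
        # Return the cached total instead of recomputing it each call.
        return self.total_vocab_size


tokenizer = ToyTokenizer({"hello": 0, "world": 1})
tokenizer.add_tokens(["<special>"])
assert len(tokenizer) == 3  # mirrors the new len(tokenizer) test
```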
Files:
* __init__.py
* test_tokenization_fast.py
* test_tokenization_utils.py