transformers/tests/models/bloom
Latest commit: Use lru_cache for tokenization tests (#36818) by Yih-Dar (1fcaad6df9), 2025-03-28 15:09:35 +01:00
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
File                           Last commit                                                             Date
__init__.py                    BLOOM (#17474)                                                          2022-06-09 12:00:40 +02:00
test_modeling_bloom.py         CI: avoid human error, automatically infer generative models (#33212)   2025-02-13 16:27:11 +01:00
test_modeling_flax_bloom.py    [tests] remove tf/flax tests in /generation (#36235)                    2025-02-17 14:59:22 +00:00
test_tokenization_bloom.py     Use lru_cache for tokenization tests (#36818)                           2025-03-28 15:09:35 +01:00
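The latest commit title names a caching technique: wrapping an expensive tokenizer-loading helper in `functools.lru_cache` so repeated test cases reuse one instance instead of rebuilding it each time. A minimal sketch of that pattern, assuming a hypothetical `get_tokenizer` helper (the real test suite would load something like `BloomTokenizerFast.from_pretrained` instead of the stand-in object here):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def get_tokenizer(name: str) -> object:
    # Stand-in for an expensive load such as
    # BloomTokenizerFast.from_pretrained(name); hypothetical helper,
    # not taken verbatim from the test file above.
    return object()

t1 = get_tokenizer("bigscience/bloom")
t2 = get_tokenizer("bigscience/bloom")
assert t1 is t2  # second call is a cache hit: the same instance is reused
assert get_tokenizer.cache_info().hits == 1
assert get_tokenizer.cache_info().misses == 1
```

The speedup comes from turning N identical loads across a test class into one; `cache_info()` makes the hit/miss behavior observable, and `get_tokenizer.cache_clear()` can reset state between unrelated test runs.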