transformers/tests/models/llama
Latest commit: f974214353 "Fix some GPU OOM after #37553 (#37591)" by Yih-Dar, 2025-04-18 10:09:19 +02:00
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
__init__.py                    LLaMA Implementation (#21955)              2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py    Use Python 3.9 syntax in tests (#37343)    2025-04-08 14:12:08 +02:00
test_modeling_llama.py         Fix some GPU OOM after #37553 (#37591)     2025-04-18 10:09:19 +02:00
test_tokenization_llama.py     Use Python 3.9 syntax in tests (#37343)    2025-04-08 14:12:08 +02:00
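
The modules above are standard pytest test files. A minimal sketch of how they are typically invoked from the repository root (the RUN_SLOW gate and the -v flag are assumptions about the usual transformers test workflow, not taken from this listing):

# Minimal sketch, assuming the usual transformers test workflow.
# RUN_SLOW=1 is assumed to enable the @slow-marked (GPU-heavy) test cases.
import os
import pytest

os.environ.setdefault("RUN_SLOW", "1")  # assumption: gates the slow/GPU tests
pytest.main(["tests/models/llama/test_modeling_llama.py", "-v"])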