transformers/tests/models/llama
Latest commit: df848acc5d by Yih-Dar
Fix test_compile_static_cache (#30991)
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
Committed: 2024-06-03 15:16:28 +02:00
File                          Last commit                                          Date
__init__.py                   LLaMA Implementation (#21955)                        2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py   Add Llama Flax Implementation (#24587)               2023-12-07 07:05:00 +01:00
test_modeling_llama.py        Fix test_compile_static_cache (#30991)               2024-06-03 15:16:28 +02:00
test_tokenization_llama.py    add prefix space ignored in llama #29625 (#30964)    2024-05-24 01:03:00 -07:00