transformers/tests/models/llama
Latest commit: e85d86398a by Fanli Lin, 2024-08-06 11:18:58 +01:00
add the missing flash attention test marker (#32419)

* add flash attention check
* fix
* fix
* add the missing marker
* bug fix
* add one more
* remove order
* add one more
File                         Last commit                                            Date
__init__.py                  LLaMA Implementation (#21955)                          2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py  Add Llama Flax Implementation (#24587)                 2023-12-07 07:05:00 +01:00
test_modeling_llama.py       add the missing flash attention test marker (#32419)   2024-08-06 11:18:58 +01:00
test_tokenization_llama.py   Skip tests properly (#31308)                           2024-06-26 21:59:08 +01:00
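For context on the commit title above: flash-attention tests in transformers are normally gated by decorators from `transformers.testing_utils` plus a pytest marker, so they are skipped on machines without `flash_attn` or a GPU and can be selected with `-m flash_attn_test`. The sketch below shows that conventional marker pattern; the class and test names are hypothetical placeholders and this is an assumption about the kind of marker #32419 added to `test_modeling_llama.py`, not a copy of the actual diff.

```python
# Minimal sketch of the flash-attention test marker pattern, assuming the
# conventional require_flash_attn / require_torch_gpu / mark.flash_attn_test
# combination from transformers.testing_utils. Names below are illustrative.
import unittest

from pytest import mark

from transformers.testing_utils import require_flash_attn, require_torch_gpu, slow


class LlamaFlashAttentionMarkerExample(unittest.TestCase):  # hypothetical class
    @require_flash_attn       # skip unless the flash_attn package is installed
    @require_torch_gpu        # flash attention kernels need a CUDA device
    @mark.flash_attn_test     # pytest marker so `-m flash_attn_test` selects it
    @slow
    def test_flash_attn_2_placeholder(self):  # hypothetical test name
        # Placeholder body: a real test would load a Llama checkpoint with
        # attn_implementation="flash_attention_2" and compare its outputs
        # against the eager attention path.
        self.assertTrue(True)
```

Tests decorated this way can then be run selectively, e.g. `pytest tests/models/llama -m flash_attn_test` on a machine that has both a GPU and `flash_attn` available.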