transformers/tests/models/llama

Latest commit: 8bd1f2f338 by Fanli Lin, 2024-09-20 10:16:43 +01:00
[tests] make more tests device-agnostic (#33580)

Commit message:
* enable
* fix
* add xpu skip
* add marker
* skip for xpu
* add more
* enable on accelerator
* add more cases
* add more tests
* add more
File                          Last commit                                        Date
__init__.py                   LLaMA Implementation (#21955)                      2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py   Add Llama Flax Implementation (#24587)             2023-12-07 07:05:00 +01:00
test_modeling_llama.py        [tests] make more tests device-agnostic (#33580)   2024-09-20 10:16:43 +01:00
test_tokenization_llama.py    use diff internal model in tests (#33387)          2024-09-11 11:27:00 +02:00