transformers/tests/models/llama
Latest commit: 999981daf4 by Joao Gante, "Tests: remove cuda versions when the result is the same 🧹🧹 (#31955)", 2024-07-16 16:49:54 +01:00
__init__.py                     LLaMA Implementation (#21955)                                          2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py     Add Llama Flax Implementation (#24587)                                 2023-12-07 07:05:00 +01:00
test_modeling_llama.py          Tests: remove cuda versions when the result is the same 🧹🧹 (#31955)   2024-07-16 16:49:54 +01:00
test_tokenization_llama.py      Skip tests properly (#31308)                                           2024-06-26 21:59:08 +01:00