transformers/tests/models/llama
Latest commit: 0d1692a49b by Longjie Zheng
Fix attn mask ignore logic in training-time trace (#32613)
* fix attn mask logic for training-time trace
* add test
* fix (repeated across several follow-up commits)
* format
* [run-slow] llama
* avoid accelerate
* [run-slow] llama
2024-10-04 19:00:45 +02:00
__init__.py LLaMA Implementation (#21955) 2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py Add Llama Flax Implementation (#24587) 2023-12-07 07:05:00 +01:00
test_modeling_llama.py Fix attn mask ignore logic in training-time trace (#32613) 2024-10-04 19:00:45 +02:00
test_tokenization_llama.py use diff internal model in tests (#33387) 2024-09-11 11:27:00 +02:00
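
The headline fix (#32613) concerns how the attention mask is handled when a Llama model is traced while still in training mode. The sketch below is a rough illustration only, not the regression test added in test_modeling_llama.py: it traces a tiny, randomly initialized Llama model in train() mode with a left-padded attention mask, so that silently ignoring the mask would change the logits. The config sizes, padding pattern, and lambda wrapper are all assumptions made for this example.

```python
# A minimal sketch of a "training-time trace" of a Llama model.
# Hypothetical example settings; not the repository's actual test.
import torch
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    vocab_size=128,
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
)
model = LlamaForCausalLM(config)
model.train()  # training mode: the attention mask must not be dropped by any fast path

input_ids = torch.randint(0, config.vocab_size, (1, 8))
attention_mask = torch.ones(1, 8, dtype=torch.long)
attention_mask[:, :3] = 0  # left padding, so ignoring the mask would alter the output

# Trace while the model is in training mode; torch.jit.is_tracing() is True
# inside the forward pass, which is the code path the fix targets.
traced = torch.jit.trace(
    lambda ids, mask: model(input_ids=ids, attention_mask=mask, use_cache=False).logits,
    (input_ids, attention_mask),
)
print(traced(input_ids, attention_mask).shape)  # torch.Size([1, 8, 128])
```

If the mask-ignore shortcut were taken during such a trace, the traced graph would bake in a mask-free attention call, so padded and unpadded inputs would produce identical logits; comparing traced outputs against the eager model with and without padding is one way to check the behavior.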