transformers/tests/models/llama
fxmarty 82cc0a79ac
Fix flash attention bugs with Mistral and Falcon (#27625)
* fix various bugs with flash attention
* bump
* fix test
* fix Mistral
* use skipTest instead of a bare return, which can be misleading (see the sketch below)
* fix on review
2023-11-21 23:20:44 +09:00
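The skipTest bullet refers to a general unittest pattern rather than anything specific to this directory: returning early from a test method makes it count as passed even though nothing was checked, while `self.skipTest(...)` records it as skipped. A minimal, hypothetical sketch of the difference (not the PR's actual code; the class name and the `_flash_attention_available` flag are illustrative stand-ins for a real capability check):

```python
import unittest


class FlashAttentionTestSketch(unittest.TestCase):
    """Illustrative only; not the actual test changed in #27625."""

    _flash_attention_available = False  # stand-in for a real capability check

    def test_with_misleading_return(self):
        if not self._flash_attention_available:
            return  # reported as PASSED even though nothing was verified
        self.assertTrue(self._flash_attention_available)

    def test_with_skiptest(self):
        if not self._flash_attention_available:
            self.skipTest("flash attention not available")  # reported as SKIPPED
        self.assertTrue(self._flash_attention_available)


if __name__ == "__main__":
    unittest.main()
```

Running this with the flag set to False shows one test passing silently and one reported as skipped, which is why the skipped form is less misleading in CI results.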
File                         Last commit                                                  Date
__init__.py                  LLaMA Implementation (#21955)                                2023-03-16 09:00:53 -04:00
test_modeling_llama.py       Fix flash attention bugs with Mistral and Falcon (#27625)   2023-11-21 23:20:44 +09:00
test_tokenization_llama.py   [Styling] stylify using ruff (#27144)                        2023-11-16 17:43:19 +01:00