transformers/tests/models/qwen2
Fanli Lin e85d86398a
add the missing flash attention test marker (#32419)
* add flash attention check
* fix
* fix
* add the missing marker
* bug fix
* add one more
* remove order
* add one more
2024-08-06 11:18:58 +01:00
__init__.py Add qwen2 (#28436) 2024-01-17 16:02:22 +01:00
test_modeling_qwen2.py add the missing flash attention test marker (#32419) 2024-08-06 11:18:58 +01:00
test_tokenization_qwen2.py Skip tests properly (#31308) 2024-06-26 21:59:08 +01:00
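For context on the commit above: "adding the missing flash attention test marker" refers to the decorator/marker stack that gates flash-attention tests in the transformers test suite. Below is a minimal sketch, assuming the usual gating pattern (`require_flash_attn`, `require_torch_gpu` from `transformers.testing_utils` plus the `flash_attn_test` pytest marker); it is not the actual diff from PR #32419, and the test class and method names are hypothetical.

```python
# Minimal sketch of a flash-attention-gated test, assuming transformers'
# standard test utilities. Not the actual change from PR #32419.
import unittest

import pytest

from transformers.testing_utils import require_flash_attn, require_torch_gpu


class Qwen2FlashAttentionMarkerSketch(unittest.TestCase):
    @require_flash_attn           # skip unless the flash-attn package is installed
    @require_torch_gpu            # skip unless a CUDA device is available
    @pytest.mark.flash_attn_test  # lets `pytest -m flash_attn_test` select these tests
    def test_flash_attn_2_inference_padding_right(self):
        # Body elided: the point of a marker-only change is the decorator
        # stack above, not the assertions inside the test.
        self.skipTest("illustrative sketch only")
```

With the marker in place, CI jobs that run `pytest -m flash_attn_test` pick up the test, while environments without flash-attn or a GPU skip it cleanly instead of failing.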