transformers/tests/models/qwen2_moe
Fanli Lin e85d86398a
add the missing flash attention test marker (#32419)
* add flash attention check
* fix
* fix
* add the missing marker
* bug fix
* add one more
* remove order
* add one more
2024-08-06 11:18:58 +01:00
__init__.py                 Add Qwen2MoE (#29377)                                   2024-03-27 02:11:55 +01:00
test_modeling_qwen2_moe.py  add the missing flash attention test marker (#32419)    2024-08-06 11:18:58 +01:00
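Below is a minimal sketch of what the "missing flash attention test marker" change pattern looks like in a transformers model test file. The class name, test name, and test body here are illustrative assumptions, not the actual contents of test_modeling_qwen2_moe.py; the decorators shown (`require_flash_attn`, `require_torch_gpu`, `slow` from `transformers.testing_utils`, plus the `flash_attn_test` pytest marker) are the ones the transformers test suite uses to gate and select flash-attention tests.

```python
import unittest

import pytest

from transformers.testing_utils import require_flash_attn, require_torch_gpu, slow


class Qwen2MoeFlashAttentionMarkerExample(unittest.TestCase):
    # Hypothetical test illustrating the decorator/marker pattern only.
    @require_flash_attn          # skip unless the flash-attn package is installed
    @require_torch_gpu           # flash attention requires a CUDA device
    @pytest.mark.flash_attn_test # marker so `pytest -m flash_attn_test` selects this test
    @slow
    def test_flash_attn_2_inference(self):
        # Illustrative body: load the model with flash attention enabled and
        # compare its outputs against the eager attention implementation.
        ...
```

With the marker in place, the flash-attention tests can be run (or excluded) as a group, e.g. `pytest -m flash_attn_test tests/models/qwen2_moe`.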