transformers/tests/models/starcoder2
Fanli Lin e85d86398a
add the missing flash attention test marker (#32419)
* add flash attention check
* fix
* fix
* add the missing marker
* bug fix
* add one more
* remove order
* add one more
2024-08-06 11:18:58 +01:00
__init__.py                   Starcoder2 model - bis (#29215)                        2024-02-28 01:24:34 +01:00
test_modeling_starcoder2.py   add the missing flash attention test marker (#32419)   2024-08-06 11:18:58 +01:00
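For context on the commit subject above, here is a minimal sketch of the kind of flash-attention test marker the commit refers to. The decorators `require_flash_attn` and `require_torch_gpu` come from `transformers.testing_utils`; the test class and body below are illustrative placeholders, not the actual contents of `test_modeling_starcoder2.py`.

```python
# Illustrative sketch only: shows the kind of marker the commit adds.
# The decorators exist in transformers.testing_utils; the class name and
# test body are hypothetical placeholders, not the real Starcoder2 tests.
import unittest

from transformers.testing_utils import require_flash_attn, require_torch_gpu


class Starcoder2FlashAttentionMarkerExample(unittest.TestCase):
    @require_flash_attn   # skips the test when flash-attn is not installed
    @require_torch_gpu    # flash attention kernels require a CUDA device
    def test_flash_attn_2_placeholder(self):
        # In the real suite, a model would be loaded with
        # attn_implementation="flash_attention_2" and its outputs compared
        # against the eager attention implementation.
        self.assertTrue(True)
```

Without such markers, flash-attention tests fail on machines where flash-attn or a GPU is unavailable instead of being skipped, which is the gap the commit message describes closing.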