transformers/tests/models/stablelm
Fanli Lin e85d86398a
add the missing flash attention test marker (#32419)
* add flash attention check
* fix
* fix
* add the missing marker
* bug fix
* add one more
* remove order
* add one more
2024-08-06 11:18:58 +01:00
__init__.py Add StableLM (#28810) 2024-02-14 07:15:18 +01:00
test_modeling_stablelm.py add the missing flash attention test marker (#32419) 2024-08-06 11:18:58 +01:00
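The commit above adds a missing flash-attention test marker so that flash-attention tests are skipped when the optional `flash_attn` package is unavailable. A minimal, self-contained sketch of that gating pattern is below; the helper name `require_flash_attn` here is a hypothetical stand-in written for illustration (in the real test suite the decorator is provided by transformers' testing utilities):

```python
import importlib.util
import unittest


def require_flash_attn(test_case):
    # Hypothetical stand-in for the marker used in the transformers test
    # suite: skip the decorated test unless flash_attn is importable.
    has_flash_attn = importlib.util.find_spec("flash_attn") is not None
    return unittest.skipUnless(has_flash_attn, "test requires flash_attn")(test_case)


class StableLmFlashAttentionTest(unittest.TestCase):
    @require_flash_attn
    def test_flash_attn_inference(self):
        # A real test would load the model with
        # attn_implementation="flash_attention_2" and compare outputs;
        # this placeholder only demonstrates the skip gating.
        self.assertTrue(True)
```

Without such a marker, the test fails with an import error on machines that lack `flash_attn`; with it, the test is reported as skipped instead.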