transformers/tests/models/esm
Peter St. John d69945e5fc
[ESM] Add flash-attention-2 backend for ESM-2 (#38023)
* Add flash-attention-2 backend for ESM-2
* update extended_attention_mask for fa2
* add test_flash_attn_2_equivalence test

---------

Signed-off-by: Peter St. John <pstjohn@nvidia.com>
2025-05-16 14:11:56 +01:00
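The commit above adds a `flash_attention_2` attention backend to ESM-2. As a minimal sketch of how that backend would be selected through the standard `attn_implementation` argument (the checkpoint name `facebook/esm2_t33_650M_UR50D`, the half-precision dtype, and the example sequence are assumptions for illustration; the PR itself only states that the backend and the `extended_attention_mask` handling were added):

```python
# Sketch: loading an ESM-2 checkpoint with the flash-attention-2 backend.
# Assumes a CUDA device, the flash-attn package installed, and a half-precision
# dtype (required by Flash Attention 2); the checkpoint name is illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "facebook/esm2_t33_650M_UR50D"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(
    model_name,
    attn_implementation="flash_attention_2",  # backend added by this PR
    torch_dtype=torch.float16,
).to("cuda")

inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQ", return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```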
__init__.py               Rebase ESM PR and update all file formats (#19055)       2022-09-30 14:16:25 +01:00
test_modeling_esm.py      [ESM] Add flash-attention-2 backend for ESM-2 (#38023)   2025-05-16 14:11:56 +01:00
test_modeling_esmfold.py  Use Python 3.9 syntax in tests (#37343)                  2025-04-08 14:12:08 +02:00
test_modeling_tf_esm.py   Fix typos in strings and comments (#37784)                2025-04-25 13:47:25 +01:00
test_tokenization_esm.py  Use Python 3.9 syntax in tests (#37343)                  2025-04-08 14:12:08 +02:00
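The `test_modeling_esm.py` entry carries the new `test_flash_attn_2_equivalence` test mentioned in the commit message. As a sketch of what such an equivalence check typically does, comparing eager and flash-attention-2 outputs on the same inputs (the small checkpoint and the tolerances are assumptions, not the actual test body):

```python
# Sketch in the spirit of test_flash_attn_2_equivalence: run the same inputs
# through the eager and flash_attention_2 backends and compare hidden states.
# Assumes a CUDA device and flash-attn installed; checkpoint and tolerances
# are illustrative.
import torch
from transformers import AutoTokenizer, EsmModel

checkpoint = "facebook/esm2_t6_8M_UR50D"  # small checkpoint, assumed
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQ", return_tensors="pt").to("cuda")

eager = EsmModel.from_pretrained(
    checkpoint, attn_implementation="eager", torch_dtype=torch.float16
).to("cuda").eval()
fa2 = EsmModel.from_pretrained(
    checkpoint, attn_implementation="flash_attention_2", torch_dtype=torch.float16
).to("cuda").eval()

with torch.no_grad():
    out_eager = eager(**inputs).last_hidden_state
    out_fa2 = fa2(**inputs).last_hidden_state

# Half precision plus a different attention kernel means bitwise equality is
# not expected; a loose tolerance is used instead.
torch.testing.assert_close(out_eager, out_fa2, atol=4e-2, rtol=4e-2)
```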