Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-24 14:58:56 +06:00)
* Add flash-attention-2 backend for ESM-2
* update extended_attention_mask for fa2
* add test_flash_attn_2_equivalence test

Signed-off-by: Peter St. John <pstjohn@nvidia.com>
Files:

- __init__.py
- test_modeling_esm.py
- test_modeling_esmfold.py
- test_modeling_tf_esm.py
- test_tokenization_esm.py
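
The commit above adds a Flash Attention 2 backend for ESM-2, updates `extended_attention_mask` for that path, and adds a `test_flash_attn_2_equivalence` test against the default attention implementation. As a minimal sketch of how such a backend is typically enabled in transformers (the checkpoint name, example sequence, and hardware assumptions below are illustrative and not taken from the commit):

```python
# Sketch: load an ESM-2 checkpoint with the Flash Attention 2 backend.
# Assumes a CUDA GPU and the flash-attn package are available; the checkpoint
# and protein sequence are illustrative choices, not part of the commit.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

checkpoint = "facebook/esm2_t33_650M_UR50D"  # a public ESM-2 checkpoint

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,                # FA2 requires fp16 or bf16 weights
    attn_implementation="flash_attention_2",  # select the FA2 attention backend
).to("cuda")

# Run a forward pass on a short example protein sequence.
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt").to("cuda")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # (batch, sequence_length, vocab_size)
```

Flash Attention 2 only runs on CUDA devices with half-precision activations, which is why the sketch loads the weights in fp16 and moves the model to the GPU before the forward pass.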