transformers/tests/models/siglip
Pavel Iakubovskii a177821b24
Add FA2 and sdpa support for SigLIP (#31499)
* Rebase to main

* Fix attention implementation autoset for text and vision configs

* Fixup

* Minor fixes

* Fix copies

* Fix attention_mask for FA2

* Add equivalence tests for siglip

* Remove right padding test

* Uncomment flaky

* Fix import

* Add to docs

* Fix test message

* Add sdpa

* Add sdpa equivalence test

* Add siglip sdpa to docs

* Fix typing for attention output

* Add sdpa tests

* Fix signature of FA2

* Autoset attn_implementation in config

* Rename bsz -> batch_size

* Move back autoset attn method

* Mark as flaky

* Correct attention mask padding

* [run-slow] siglip

* Add FA2 and sdpa docs

* Style fix

* Remove flaky for FA2 test

* Change attention implementation set

* Change attn_implementation propagation

* Fix typos

* Add modality to assert message

* Add more sdpa backends in test

* [run-slow] siglip

* Add math sdpa backend for all options

* [run-slow] siglip
2024-07-08 11:10:02 +01:00
__init__.py Add SigLIP (#26522) 2024-01-08 18:17:16 +01:00
test_image_processing_siglip.py Skip tests properly (#31308) 2024-06-26 21:59:08 +01:00
test_modeling_siglip.py Add FA2 and sdpa support for SigLIP (#31499) 2024-07-08 11:10:02 +01:00
test_tokenization_siglip.py Skip tests properly (#31308) 2024-06-26 21:59:08 +01:00