transformers/tests/models/siglip2

Latest commit 3249c5dc15 by Pavel Iakubovskii
Refactor attention for SigLIP based models (#36981)

* Update Siglip attention implementation

* Update tests for Siglip

* Remove one level of indentation

* Update test to be more specific

* Fixup

* Idefics2

* Idefics3

* Emu3

* SmolVLM

* Phi4 (just init small update)

* Idefics2 (test fix)

* Update siglip2 tests

* Update eager

* trigger

* Clean up

* Transfer inputs to device in test

* Fixing test

* Fixing test

* Revert contiguous

* Remove unused is_flash_attn_2_available

* Move flaky to specific models
Committed 2025-04-01 15:37:25 +02:00
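
The log entries "Update Siglip attention implementation" and "Update eager" refer to moving the SigLIP family onto a single, swappable attention path. The sketch below is a minimal illustration of that pattern, not the PR's actual diff: eager_attention_forward mirrors a plain-softmax fallback, and SimpleVisionAttention is a hypothetical module showing where one backend callable gets dispatched; in the library itself the backend is selected from the model config's attention-implementation setting rather than a function argument.

    # Minimal sketch (assumed pattern, not the PR's exact code): one eager softmax
    # attention function that alternative backends (SDPA, FlashAttention) can replace.
    from typing import Callable, Optional

    import torch
    from torch import nn


    def eager_attention_forward(
        module: nn.Module,
        query: torch.Tensor,  # (batch, num_heads, seq_len, head_dim)
        key: torch.Tensor,
        value: torch.Tensor,
        attention_mask: Optional[torch.Tensor],
        scaling: float,
        dropout: float = 0.0,
    ):
        attn_weights = torch.matmul(query, key.transpose(2, 3)) * scaling
        if attention_mask is not None:
            attn_weights = attn_weights + attention_mask
        attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
        attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
        attn_output = torch.matmul(attn_weights, value).transpose(1, 2).contiguous()
        return attn_output, attn_weights  # output is (batch, seq_len, num_heads, head_dim)


    class SimpleVisionAttention(nn.Module):
        # Hypothetical module name, used only to show the dispatch point.
        def __init__(self, hidden_size: int, num_heads: int):
            super().__init__()
            self.num_heads = num_heads
            self.head_dim = hidden_size // num_heads
            self.scale = self.head_dim**-0.5
            self.qkv = nn.Linear(hidden_size, 3 * hidden_size)
            self.out_proj = nn.Linear(hidden_size, hidden_size)

        def forward(
            self,
            hidden_states: torch.Tensor,
            attention_mask: Optional[torch.Tensor] = None,
            attention_fn: Callable = eager_attention_forward,  # swap in another backend here
        ):
            batch, seq_len, _ = hidden_states.shape
            query, key, value = self.qkv(hidden_states).chunk(3, dim=-1)
            split = (batch, seq_len, self.num_heads, self.head_dim)
            query, key, value = (t.view(split).transpose(1, 2) for t in (query, key, value))
            attn_output, attn_weights = attention_fn(
                self, query, key, value, attention_mask, scaling=self.scale
            )
            attn_output = attn_output.reshape(batch, seq_len, -1)
            return self.out_proj(attn_output), attn_weights

With this shape, switching backends only means passing a different callable; the projections and reshapes stay in one place, which is presumably why the later bullets (Idefics2, Idefics3, Emu3, SmolVLM) appear in the same refactor.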
File                               Last commit                                            Last modified
__init__.py                        Add SigLIP 2 (#36323)                                  2025-02-21 09:04:19 +00:00
test_image_processing_siglip2.py   Add SigLIP 2 (#36323)                                  2025-02-21 09:04:19 +00:00
test_modeling_siglip2.py           Refactor attention for SigLIP based models (#36981)    2025-04-01 15:37:25 +02:00
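
On the test side, the bullets "Transfer inputs to device in test" and "Move flaky to specific models" describe small but easy-to-miss patterns. The snippet below is a hedged sketch of both, assuming the torch_device, require_torch, and is_flaky helpers from transformers.testing_utils; the test class and tensors are hypothetical and not taken from test_modeling_siglip2.py.

    import unittest

    import torch
    from transformers.testing_utils import is_flaky, require_torch, torch_device


    @require_torch
    class AttentionBackendTestSketch(unittest.TestCase):
        # Hypothetical test class illustrating the two patterns named in the log.

        @is_flaky(max_attempts=3, description="backend numerics can differ slightly between runs")
        def test_eager_matches_alternative_backend(self):
            # "Transfer inputs to device in test": move every prepared tensor onto the
            # device the model runs on (CPU, CUDA, ...) before calling the model.
            inputs = {"pixel_values": torch.randn(1, 3, 224, 224)}
            inputs = {name: tensor.to(torch_device) for name, tensor in inputs.items()}
            self.assertEqual(inputs["pixel_values"].device.type, torch_device)

            # ... run the model once per attention implementation and compare outputs,
            # e.g. torch.testing.assert_close(out_eager, out_sdpa, atol=1e-4, rtol=1e-4)

Reading the last bullet, the flaky marker presumably moved out of shared test code and onto the individual model tests that need it, so other models keep strict single-run checks.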