transformers/tests/models/siglip

Latest commit: Pavel Iakubovskii (3249c5dc15)
Refactor attention for SigLIP based models (#36981)
* Update Siglip attention implementation
* Update tests for Siglip
* Remove one level of indentation
* Update test to be more specific
* Fixup
* Idefics2
* Idefics3
* Emu3
* SmolVLM
* Phi4 (just init small update)
* Idefics2 (test fix)
* Update siglip2 tests
* Update eager
* trigger
* Clean up
* Transfer inputs to device in test
* Fixing test
* Fixing test
* Revert contiguous
* Remove unused is_flash_attn_2_available
* Move flaky to specific models

Committed: 2025-04-01 15:37:25 +02:00
__init__.py                        Add SigLIP (#26522)                                    2024-01-08 18:17:16 +01:00
test_image_processing_siglip.py    Refactoring of ImageProcessorFast (#35069)             2025-02-04 17:52:31 -05:00
test_modeling_siglip.py            Refactor attention for SigLIP based models (#36981)    2025-04-01 15:37:25 +02:00
test_tokenization_siglip.py        Use lru_cache for tokenization tests (#36818)          2025-03-28 15:09:35 +01:00