transformers/tests/models/idefics3
Latest commit: 1cfcbfcab8 by Raushan Turganbay, 2025-04-24 11:48:11 +02:00
[VLMs] fix flash-attention tests (#37603)

Squashed commits:
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
File                                Last commit                                          Date
__init__.py                         Add Idefics 3! (#32473)                              2024-09-25 21:28:49 +02:00
test_image_processing_idefics3.py   Use Python 3.9 syntax in tests (#37343)              2025-04-08 14:12:08 +02:00
test_modeling_idefics3.py           [VLMs] fix flash-attention tests (#37603)            2025-04-24 11:48:11 +02:00
test_processor_idefics3.py          [vlm] adjust max length for special tokens (#37342)  2025-04-16 20:49:20 +02:00