transformers/tests/models/idefics2
Latest commit: 1cfcbfcab8 "[VLMs] fix flash-attention tests (#37603)" by Raushan Turganbay, 2025-04-24 11:48:11 +02:00
Commit body:
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
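One of the commit-body bullets, "remove keys from config recursively", describes a small utility pattern. Below is a minimal sketch of such a helper, assuming a plain nested dict; the name remove_keys_recursively and the _attn_implementation example key are illustrative, not the actual code from #37603.

    def remove_keys_recursively(config: dict, keys_to_remove: set) -> dict:
        # Walk the config and drop matching keys, descending into nested
        # dicts (e.g. the sub-configs of a composite VLM config).
        cleaned = {}
        for key, value in config.items():
            if key in keys_to_remove:
                continue  # skip keys slated for removal
            if isinstance(value, dict):
                value = remove_keys_recursively(value, keys_to_remove)
            cleaned[key] = value
        return cleaned

    # Example: strip attention-implementation keys before comparing configs.
    cfg = {
        "_attn_implementation": "flash_attention_2",
        "text_config": {"_attn_implementation": "flash_attention_2", "hidden_size": 4096},
    }
    print(remove_keys_recursively(cfg, {"_attn_implementation"}))
    # -> {'text_config': {'hidden_size': 4096}}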
File                               Last commit                                           Date
__init__.py                        Add Idefics2 (#30253)                                 2024-04-15 17:03:03 +01:00
test_image_processing_idefics2.py  Use Python 3.9 syntax in tests (#37343)               2025-04-08 14:12:08 +02:00
test_modeling_idefics2.py          [VLMs] fix flash-attention tests (#37603)             2025-04-24 11:48:11 +02:00
test_processor_idefics2.py         [vlm] adjust max length for special tokens (#37342)   2025-04-16 20:49:20 +02:00
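To exercise the flash-attention tests that #37603 touched, the modeling suite can be run directly with pytest; a sketch is below, where the -k filter string is an assumption about the test names, not taken from the PR.

    import pytest

    # Select only flash-attention-related tests from the Idefics2 modeling suite.
    # The "flash_attn" keyword filter is a guess; adjust it to the actual test names.
    pytest.main([
        "tests/models/idefics2/test_modeling_idefics2.py",
        "-k", "flash_attn",
        "-v",
    ])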