transformers/tests/models/aya_vision
Latest commit: 1cfcbfcab8 by Raushan Turganbay, "[VLMs] fix flash-attention tests (#37603)", 2025-04-24 11:48:11 +02:00

Commit message:
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
__init__.py                     Add aya (#36521)                                 2025-03-04 12:24:33 +01:00
test_modeling_aya_vision.py     [VLMs] fix flash-attention tests (#37603)        2025-04-24 11:48:11 +02:00
test_processor_aya_vision.py    [processor] clean up mulitmodal tests (#37362)   2025-04-11 13:32:19 +02:00