transformers/tests/models/llava_onevision
Latest commit: 1cfcbfcab8 by Raushan Turganbay, 2025-04-24 11:48:11 +02:00
[VLMs] fix flash-attention tests (#37603)
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
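The commit bullet "remove keys from config recursively" suggests stripping a key out of a nested model config at every level, since composite VLM configs carry text and vision sub-configs. A minimal sketch of such a helper over plain dicts, assuming that reading; the function name and the choice of `_attn_implementation` as the stripped key are illustrative, not the PR's actual code:

```python
# Hypothetical helper: drop keys at every nesting level of a dict-shaped config.
def remove_keys_recursively(config: dict, keys_to_remove: set) -> dict:
    """Return a copy of `config` with the given keys dropped at every nesting level."""
    cleaned = {}
    for key, value in config.items():
        if key in keys_to_remove:
            continue  # drop this key wherever it appears
        if isinstance(value, dict):
            value = remove_keys_recursively(value, keys_to_remove)  # recurse into sub-configs
        cleaned[key] = value
    return cleaned

# Composite VLM configs nest sub-configs, so a single top-level `del` misses them.
config = {
    "_attn_implementation": "flash_attention_2",
    "text_config": {"_attn_implementation": "flash_attention_2", "hidden_size": 4096},
    "vision_config": {"_attn_implementation": "flash_attention_2", "hidden_size": 1024},
}
print(remove_keys_recursively(config, {"_attn_implementation"}))
```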
File                                        Last commit                                           Last updated
__init__.py                                 Llava Onevision: add model (#32673)                   2024-09-05 14:43:20 +05:00
test_image_processing_llava_onevision.py    Use Python 3.9 syntax in tests (#37343)               2025-04-08 14:12:08 +02:00
test_modeling_llava_onevision.py            [VLMs] fix flash-attention tests (#37603)             2025-04-24 11:48:11 +02:00
test_processor_llava_onevision.py           [vlm] adjust max length for special tokens (#37342)   2025-04-16 20:49:20 +02:00
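To exercise these files locally, a minimal sketch assuming a transformers checkout with pytest installed; `RUN_SLOW=1` is the library's opt-in for `@slow`-marked tests, and the flash-attention ones additionally need a GPU with FlashAttention 2 available:

```python
# Run this directory's test suite from the repository root.
import os
import subprocess

env = dict(os.environ, RUN_SLOW="1")  # opt into @slow-marked tests
subprocess.run(
    ["python", "-m", "pytest", "tests/models/llava_onevision", "-v"],
    env=env,
    check=True,  # raise if any test fails
)
```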