transformers/tests/models/video_llava
Latest commit: 1cfcbfcab8 — [VLMs] fix flash-attention tests (#37603)
Author: Raushan Turganbay
Date: 2025-04-24 11:48:11 +02:00

Commit message body:
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
File                                   Last commit                                  Date
__init__.py                            Add Video Llava (#29733)                     2024-05-15 16:42:29 +05:00
test_image_processing_video_llava.py   Use Python 3.9 syntax in tests (#37343)      2025-04-08 14:12:08 +02:00
test_modeling_video_llava.py           [VLMs] fix flash-attention tests (#37603)    2025-04-24 11:48:11 +02:00
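
For context on what test_modeling_video_llava.py exercises after PR #37603, below is a minimal sketch of loading Video-LLaVA with the flash-attention-2 backend. It is not code from this directory; it assumes a CUDA GPU, an installed flash-attn package, and the "LanguageBind/Video-LLaVA-7B-hf" Hub checkpoint (the checkpoint name is an assumption, not taken from this listing).

    # Minimal sketch, not part of this repo's test suite: loads Video-LLaVA
    # with the flash-attention-2 backend that PR #37603's test fixes cover.
    # Assumes CUDA, flash-attn, and the checkpoint named below.
    import torch
    from transformers import VideoLlavaForConditionalGeneration

    model = VideoLlavaForConditionalGeneration.from_pretrained(
        "LanguageBind/Video-LLaVA-7B-hf",         # assumed public checkpoint
        torch_dtype=torch.float16,                # flash-attention needs fp16/bf16
        attn_implementation="flash_attention_2",  # backend exercised by these tests
    ).to("cuda")

The test module itself can be run with pytest from the repository root, e.g. `python -m pytest tests/models/video_llava/test_modeling_video_llava.py`.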