transformers/tests/models/vipllava
Latest commit: [VLMs] fix flash-attention tests (#37603) by Raushan Turganbay (1cfcbfcab8), 2025-04-24 11:48:11 +02:00
Commit message:
* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
__init__.py Adds VIP-llava to transformers (#27932) 2023-12-13 10:42:24 +01:00
test_modeling_vipllava.py [VLMs] fix flash-attention tests (#37603) 2025-04-24 11:48:11 +02:00
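
As a minimal sketch of how the modeling tests listed above might be invoked (assuming a development install of transformers with pytest available, run from the repository root; the file name run_vipllava_tests.py is only illustrative):

# run_vipllava_tests.py: minimal sketch, assuming pytest and the transformers
# dev dependencies are installed and this script is executed from the repo root.
import pytest

# Collect and run only the VipLlava modeling tests from this directory.
exit_code = pytest.main(["tests/models/vipllava/test_modeling_vipllava.py", "-v"])
raise SystemExit(exit_code)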