transformers/tests/models/llava
Latest commit 8571bb145a by Raushan Turganbay, 2025-01-20 11:15:39 +01:00
Fix CI for VLMs (#35690)
* fix some easy test
* more tests
* remove logit check here also
* add require_torch_large_gpu in Emu3 (see the sketch below)
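
For context on that last commit item, here is a minimal sketch of how such a GPU-gating decorator is typically applied to a test class. It assumes `require_torch_large_gpu` is importable from `transformers.testing_utils`; the test class and method names below are illustrative, not taken from the actual PR diff.

```python
# Minimal sketch, not the actual #35690 diff: skipping a
# memory-hungry integration test on machines without a large GPU.
# Assumes require_torch_large_gpu exists in transformers.testing_utils;
# the class and test names are illustrative.
import unittest

from transformers.testing_utils import require_torch_large_gpu, slow


@require_torch_large_gpu
class Emu3IntegrationTest(unittest.TestCase):
    @slow
    def test_generation(self):
        # Model loading and generation would go here; the class-level
        # decorator above skips every test in this class when no
        # sufficiently large GPU is available.
        ...
```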
__init__.py
test_modeling_llava.py Fix CI for VLMs (#35690) 2025-01-20 11:15:39 +01:00
test_processor_llava.py Chat template: return vectorized output in processors (#34275) 2025-01-10 11:05:29 +01:00
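
The #34275 change referenced on `test_processor_llava.py` made processors return vectorized, ready-to-use model inputs from their chat templates rather than only a prompt string. A minimal sketch of that call path, assuming the `llava-hf/llava-1.5-7b-hf` hub checkpoint and a placeholder image URL:

```python
# Minimal sketch of vectorized chat-template output from a LLaVA
# processor. The checkpoint is a real hub id used here as an example;
# the image URL is a placeholder.
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("llava-hf/llava-1.5-7b-hf")

conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/cat.png"},
            {"type": "text", "text": "What is in this image?"},
        ],
    }
]

# tokenize=True with return_dict=True makes the processor load the
# image and return tensors directly, instead of a formatted string.
inputs = processor.apply_chat_template(
    conversation,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
)
print(inputs.keys())  # e.g. input_ids, attention_mask, pixel_values
```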