transformers/tests/models/qwen2_vl
Latest commit: Joao Gante (62c7ea0201), 2025-02-13 16:27:11 +01:00
CI: avoid human error, automatically infer generative models (#33212)

Squashed commit messages:

* tmp commit
* move tests to the right class
* remove ALL all_generative_model_classes = ...
* skip tf roberta
* skip InstructBlipForConditionalGenerationDecoderOnlyTest
* videollava
* reduce diff
* reduce diff
* remove on vlms
* fix a few more
* manual rebase bits
* more manual rebase
* remove all manual generative model class test entries
* fix up to ernie
* a few more removals
* handle remaining cases
* recurrent gemma
* it's better here
* make fixup
* tf idefics is broken
* tf bert + generate is broken
* don't touch tf :(
* make fixup
* better comments for test skips
* revert tf changes
* remove empty line removal
* one more
* missing one
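The commit above replaces manually maintained `all_generative_model_classes` lists in the tests with automatic inference. A minimal sketch of the idea, using hypothetical stand-in classes rather than real `transformers` models (the real library exposes a `can_generate()` classmethod on `PreTrainedModel`; the helper name `infer_generative_classes` below is illustrative, not the actual test-suite code):

```python
# Sketch: infer which model classes are generative instead of listing
# them by hand. The classes here are hypothetical stand-ins.

class BaseModel:
    @classmethod
    def can_generate(cls):
        # The real PreTrainedModel.can_generate() inspects the class for
        # generation support; here a simple flag stands in for that check.
        return getattr(cls, "_supports_generation", False)

class EncoderModel(BaseModel):
    _supports_generation = False

class DecoderModel(BaseModel):
    _supports_generation = True

def infer_generative_classes(all_model_classes):
    """Keep only the classes that report generation support."""
    return [cls for cls in all_model_classes if cls.can_generate()]

all_model_classes = [EncoderModel, DecoderModel]
generative = infer_generative_classes(all_model_classes)
print([cls.__name__ for cls in generative])  # prints ['DecoderModel']
```

Deriving the generative set from the classes themselves removes the human error the commit title refers to: a newly added generative model can no longer be silently left out of the generation tests.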
Files:

__init__.py | support qwen2-vl (#32318) | 2024-08-26 15:16:44 +02:00
test_image_processing_qwen2_vl.py | Use torch.testing.assert_close instead to get more details about errors in CIs (#35659) | 2025-01-24 16:55:28 +01:00
test_modeling_qwen2_vl.py | CI: avoid human error, automatically infer generative models (#33212) | 2025-02-13 16:27:11 +01:00
test_processor_qwen2_vl.py | Chat template: update for processor (#35953) | 2025-02-10 09:52:19 +01:00
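The image-processing test's latest commit switched the suite to `torch.testing.assert_close`. A short illustration of why (the tensors and tolerances below are made up for the example): unlike a bare `allclose` boolean check, `assert_close` raises an `AssertionError` whose message reports how many elements mismatched and the greatest absolute/relative difference, which is much easier to debug from CI logs.

```python
import torch

actual = torch.tensor([1.0, 2.0, 3.0001])
expected = torch.tensor([1.0, 2.0, 3.0])

try:
    # Fails: the last element differs by 1e-4, above the 1e-6 tolerance.
    torch.testing.assert_close(actual, expected, rtol=0.0, atol=1e-6)
except AssertionError as err:
    # The message includes per-element mismatch details, unlike the
    # plain True/False returned by torch.allclose.
    detailed_report = str(err)
    print("mismatch detected")
```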