transformers/tests/models/flava
Arthur f5d45d89c4
🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288)
* Protect ParallelInterface

* early error out on output_attentions setting to avoid warning in modeling

* modular update

* fixup

* update model tests

* update

* oops

* set model's config

* more cases

* ??

* properly fix

* fixup

* update

* last ones

* update

* fix?

* fix wrong merge commit

* fix hub test

* nits

* wow I am tired

* updates

* fix pipeline!

---------

Co-authored-by: Lysandre <hi@lysand.re>
2025-05-23 17:17:38 +02:00
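The commit above moves the `output_attentions` check from a runtime warning inside the model to an early error at configuration time. A minimal sketch of that validation logic (hypothetical names, not the actual transformers code): only the "eager" attention implementation materializes attention weights, so requesting `output_attentions=True` with an implementation such as "sdpa" or "flash_attention_2" fails immediately instead of warning later in the forward pass.

```python
# Hypothetical sketch of the early-error validation described in the commit.
# Assumption: only the "eager" attention implementation can return attention
# weights; other implementations (sdpa, flash_attention_2) never compute them.

class ModelConfig:
    def __init__(self, attn_implementation="eager", output_attentions=False):
        if output_attentions and attn_implementation != "eager":
            # Fail at config creation, not inside the model's forward pass.
            raise ValueError(
                "output_attentions=True is only supported by the 'eager' "
                f"attention implementation, got {attn_implementation!r}."
            )
        self.attn_implementation = attn_implementation
        self.output_attentions = output_attentions


# Valid: the eager implementation can return attention weights.
ok = ModelConfig(attn_implementation="eager", output_attentions=True)

# Invalid combination errors out early, before any model is built.
try:
    ModelConfig(attn_implementation="sdpa", output_attentions=True)
    raised = False
except ValueError:
    raised = True
```

The benefit is that the incompatible combination surfaces as a hard failure the user must address, rather than a warning that is easy to miss while attentions silently come back as `None`.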
__init__.py [feat] Add FLAVA model (#16654) 2022-05-11 14:56:48 -07:00
test_image_processing_flava.py Add Fast Image Processor for Flava (#37135) 2025-04-14 15:05:31 +02:00
test_modeling_flava.py 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
test_processor_flava.py Remove repeated prepare_images in processor tests (#33163) 2024-09-09 13:20:27 +01:00