transformers/tests/models/opt
Raushan Turganbay d23aae2b8c
[VLMs] support attention backends (#37576)
* update models

* why rename

* return attn weights when sdpa

* fixes

* fix attn implementation composite

* fix moshi

* add message

* add typings

* explicitly use all flags for each attn type

* fix some tests

* import what is needed

* kosmos on main has new attention already, yay

* new models in main, run fixup

* won't fix kosmos yet

* fix-copies

* clean up after rebasing

* fix tests

* style

* don't cast attns to fp32

* did we update ruff? OK, let's just do what it asks

* fix pixtral after rebase
2025-05-08 18:18:54 +02:00
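
For context, #37576 standardizes how these models select an attention backend. A minimal usage sketch, assuming the standard `attn_implementation` argument to `from_pretrained`; the checkpoint name is illustrative, and the backend names shown are the ones transformers documents, not specifics of this PR:

    import torch
    from transformers import AutoModelForCausalLM

    # Load OPT with the PyTorch SDPA backend; "eager" and "flash_attention_2"
    # (when the flash-attn package is installed) are the other documented options.
    model = AutoModelForCausalLM.from_pretrained(
        "facebook/opt-350m",
        torch_dtype=torch.float16,
        attn_implementation="sdpa",
    )
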
__init__.py Add OPT (#17088) 2022-05-12 12:24:35 +02:00
test_modeling_flax_opt.py Fix typos (#37978) 2025-05-06 14:45:20 +01:00
test_modeling_opt.py [VLMs] support attention backends (#37576) 2025-05-08 18:18:54 +02:00
test_modeling_tf_opt.py Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00