transformers/tests/models
David Klank fa921ad854 fix spelling errors (#38608) 2025-06-05 13:57:23 +01:00
* fix errors test_modeling_mllama.py
* fix error test_modeling_video_llava.py
* fix errors test_processing_common.py
albert [Tests] Reduced model size for albert-test model (#38480) 2025-05-30 14:22:32 +00:00
align Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
altclip 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
aria Disable torchscript tests for AriaForConditionalGenerationModelTest (#38225) 2025-05-20 14:37:55 +02:00
audio_spectrogram_transformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
auto 🚨 🚨 Fix custom code saving (#37716) 2025-05-26 17:37:30 +01:00
autoformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
aya_vision [tests] remove test_sdpa_equivalence (redundant) (#37911) 2025-05-16 18:37:27 +01:00
bamba switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
bark enable cpu offloading for Bark on xpu (#37599) 2025-04-23 11:37:15 +02:00
bart 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
barthez Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
bartpho Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
beit [Fast Processor] BEiT (#37005) 2025-05-06 17:40:28 -04:00
bert Fix typos in strings and comments (#37784) 2025-04-25 13:47:25 +01:00
bert_generation Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
bert_japanese Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
bertweet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
big_bird Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
bigbird_pegasus Bart: new cache format (#35314) 2025-05-16 13:26:54 +02:00
biogpt Bart: new cache format (#35314) 2025-05-16 13:26:54 +02:00
bit Add ImageProcessorFast to BiT processor (#37180) 2025-04-14 17:07:48 +02:00
bitnet Add Bitnet model (#37742) 2025-04-28 15:08:46 +02:00
blenderbot 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
blenderbot_small 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
blip 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
blip_2 Fix blip2 tests (#38510) 2025-06-02 22:46:35 +02:00
bloom switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
bridgetower Add Optional to remaining types (#37808) 2025-04-28 14:20:45 +01:00
bros Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
byt5 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
camembert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
canine 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
chameleon Fix chameleon tests (#38565) 2025-06-04 10:13:35 +02:00
chinese_clip Add Fast Chinese-CLIP Processor (#37012) 2025-04-15 18:31:20 +02:00
clap Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
clip [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
clipseg Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
clvp Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
code_llama remove unhandled parameter (#38145) 2025-06-02 15:57:32 +02:00
codegen Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
cohere Fix some tests (especially compile with fullgraph=True on Python<3.11) (#38319) 2025-05-23 17:11:40 +02:00
cohere2 Fix Gemma2IntegrationTest (#38492) 2025-06-02 22:45:09 +02:00
colpali Add ColQwen2 to 🤗 transformers (#35778) 2025-06-02 12:58:01 +00:00
colqwen2 Add ColQwen2 to 🤗 transformers (#35778) 2025-06-02 12:58:01 +00:00
conditional_detr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
convbert 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
convnext Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
convnextv2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
cpm Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
cpmant Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
csm Update CsmForConditionalGenerationIntegrationTest (#38424) 2025-05-28 10:20:43 +02:00
ctrl Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
cvt Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
d_fine 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
dab_detr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
dac Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
data2vec 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
dbrx 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
deberta Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
deberta_v2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
decision_transformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
deepseek_v3 [FlexAttn] Fix models with unique characteristics (#38433) 2025-06-04 13:37:28 +02:00
deformable_detr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
deit Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
depth_anything Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
depth_pro Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
detr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
diffllama Remove deprecated use_flash_attention_2 parameter (#37131) 2025-06-02 11:06:25 +02:00
dinat Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
dinov2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
dinov2_with_registers Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
distilbert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
dit Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
donut 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
dpr Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
dpt Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
efficientnet Add EfficientNet Image PreProcessor (#37055) 2025-04-16 21:59:24 +02:00
electra Fix typos in strings and comments (#37784) 2025-04-25 13:47:25 +01:00
emu3 update emu3 test (#38543) 2025-06-03 11:02:01 +02:00
encodec Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
encoder_decoder 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
ernie Remove old code for PyTorch, Accelerator and tokenizers (#37234) 2025-04-10 20:54:21 +02:00
esm [ESM] Add flash-attention-2 backend for ESM-2 (#38023) 2025-05-16 14:11:56 +01:00
falcon 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
falcon_h1 [Falcon H1] Fix slow path forward pass (#38320) 2025-05-26 15:30:35 +02:00
falcon_mamba enable several cases on XPU (#37516) 2025-04-16 11:01:04 +02:00
fastspeech2_conformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
flaubert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
flava 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
fnet 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
focalnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
fsmt Fix typos in strings and comments (#37910) 2025-05-01 14:58:58 +01:00
funnel Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
fuyu 🔴 [VLM] Add base model without head (#37033) 2025-05-07 17:47:51 +02:00
gemma remove unhandled parameter (#38145) 2025-06-02 15:57:32 +02:00
gemma2 Fix Gemma2IntegrationTest (#38492) 2025-06-02 22:45:09 +02:00
gemma3 [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
git 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
glm switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
glm4 Fix GLM4 checkpoints (#38412) 2025-05-28 16:40:08 +00:00
glpn 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
got_ocr2 [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
gpt_bigcode Remove head mask in generative models (#35786) 2025-05-15 10:44:19 +02:00
gpt_neo Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
gpt_neox 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
gpt_neox_japanese Remove old code for PyTorch, Accelerator and tokenizers (#37234) 2025-04-10 20:54:21 +02:00
gpt_sw3 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
gpt2 Add AMD expectation to test_gpt2_sample (#38079) 2025-05-12 16:51:21 +02:00
gptj Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
granite switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
granite_speech enable finegrained_fp8 and granite_speech cases on XPU (#38036) 2025-05-14 08:58:40 +00:00
granitemoe switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
granitemoehybrid switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
granitemoeshared switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
grounding_dino 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
groupvit 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
helium switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
herbert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
hgnet_v2 Add D-FINE Model into Transformers (#36261) 2025-04-29 12:17:55 +01:00
hiera 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
hubert 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
ibert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
idefics 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
idefics2 enable misc cases on XPU & use device agnostic APIs for cases in tests (#38192) 2025-05-20 10:09:01 +02:00
idefics3 [VLMs] fix flash-attention tests (#37603) 2025-04-24 11:48:11 +02:00
ijepa Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
imagegpt [test] update test_past_key_values_format (#37614) 2025-04-22 11:07:34 +01:00
informer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
instructblip [VLMs] support attention backends (#37576) 2025-05-08 18:18:54 +02:00
instructblipvideo 🔴 Video processors as a separate class (#35206) 2025-05-12 11:55:51 +02:00
internvl 🔴 Video processors as a separate class (#35206) 2025-05-12 11:55:51 +02:00
jamba switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
janus [janus] Fix failing tests on mi3XX (#38426) 2025-06-04 09:38:10 +02:00
jetmoe 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
kosmos2 [VLMs] support attention backends (#37576) 2025-05-08 18:18:54 +02:00
layoutlm Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
layoutlmv2 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
layoutlmv3 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
layoutxlm 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
led 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
levit Add Fast LeViT Processor (#37154) 2025-04-14 17:07:36 +02:00
lilt Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
llama remove unhandled parameter (#38145) 2025-06-02 15:57:32 +02:00
llama4 switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
llava 🔴 [VLM] Add base model without head (#37033) 2025-05-07 17:47:51 +02:00
llava_next [bug] fix llava processor to calculate unpadding size correctly (#37988) 2025-05-13 13:49:09 +00:00
llava_next_video [video processor] fix tests (#38104) 2025-05-14 10:24:07 +00:00
llava_onevision fix multi-image case for llava-onevision (#38084) 2025-05-21 11:50:46 +02:00
longformer Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
longt5 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
luke 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
lxmert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
m2m_100 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
mamba Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mamba2 Mamba2 remove unnecessary test parameterization (#38227) 2025-05-20 13:54:04 +00:00
marian 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
markuplm 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
mask2former Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
maskformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
mbart 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
mbart50 Use lru_cache for tokenization tests (#36818) 2025-03-28 15:09:35 +01:00
megatron_bert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
megatron_gpt2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mgp_str Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mimi Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
minimax Add support for MiniMax's MiniMax-Text-01 (#35831) 2025-06-04 09:38:40 +02:00
mistral switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
mistral3 🔴 Video processors as a separate class (#35206) 2025-05-12 11:55:51 +02:00
mixtral switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
mlcd Add MLCD model (#36182) 2025-04-15 11:33:09 +01:00
mllama fix spelling errors (#38608) 2025-06-05 13:57:23 +01:00
mluke Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
mobilebert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mobilenet_v1 Add Fast Image Processor for MobileNetV1 (#37111) 2025-04-23 15:55:41 -04:00
mobilenet_v2 Add Fast Mobilenet-V2 Processor (#37113) 2025-04-14 17:08:47 +02:00
mobilevit Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mobilevitv2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
modernbert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
moonshine 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
moshi Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mpnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mpt fix mpt test of different outputs from cuda (#37691) 2025-04-25 18:04:56 +02:00
mra Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
mt5 Fix mt5 test on AMD devices (#38081) 2025-05-12 16:59:00 +02:00
musicgen [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
musicgen_melody [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
mvp Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
myt5 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
nemotron switch to device agnostic device calling for test cases (#38247) 2025-05-26 10:18:53 +02:00
nllb Use lru_cache for tokenization tests (#36818) 2025-03-28 15:09:35 +01:00
nllb_moe Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
nougat Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
nystromformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
olmo Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
olmo2 Make HF implementation match original OLMo 2 models for lower precisions (#38131) 2025-05-19 15:35:23 +02:00
olmoe Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
omdet_turbo 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
oneformer Fix OneFormer integration test (#38016) 2025-05-12 16:02:41 +02:00
openai Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
opt Remove head mask in generative models (#35786) 2025-05-15 10:44:19 +02:00
owlv2 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
owlvit Add Fast owlvit Processor (#37164) 2025-04-14 17:58:09 +02:00
paligemma [paligemma] fix processor with suffix (#38365) 2025-05-27 11:31:56 +02:00
paligemma2 🚨🚨[core] Completely rewrite the masking logic for all attentions (#37866) 2025-05-22 11:38:26 +02:00
patchtsmixer 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
patchtst Force torch>=2.6 with torch.load to avoid vulnerability issue (#37785) 2025-04-25 16:57:09 +02:00
pegasus 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
pegasus_x 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
perceiver 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
persimmon 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
phi 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
phi3 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
phi4_multimodal Fix incorrect batching audio index calculation for Phi-4-Multimodal (#38103) 2025-05-26 12:41:31 +00:00
phimoe Fix MoE gradient test (#38438) 2025-05-28 16:44:20 +01:00
phobert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
pix2struct Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
pixtral Add args support for fast image processors (#37018) 2025-05-16 12:01:46 -04:00
plbart 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
poolformer Add Fast Image Processor for PoolFormer (#37182) 2025-04-23 15:55:33 -04:00
pop2piano Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
prompt_depth_anything Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
prophetnet Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
pvt Add Fast PVT Processor (#37204) 2025-04-23 15:55:20 -04:00
pvt_v2 Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
qwen2 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
qwen2_5_omni [Tests] Clean up test cases for few models (#38315) 2025-05-29 08:21:28 +00:00
qwen2_5_vl [Tests] Clean up test cases for few models (#38315) 2025-05-29 08:21:28 +00:00
qwen2_audio [Tests] Clean up test cases for few models (#38315) 2025-05-29 08:21:28 +00:00
qwen2_moe Fix MoE gradient test (#38438) 2025-05-28 16:44:20 +01:00
qwen2_vl [Tests] Clean up test cases for few models (#38315) 2025-05-29 08:21:28 +00:00
qwen3 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
qwen3_moe Fix MoE gradient test (#38438) 2025-05-28 16:44:20 +01:00
rag Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
recurrent_gemma 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
reformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
regnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
rembert Fix typos in strings and comments (#37784) 2025-04-25 13:47:25 +01:00
resnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
roberta Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
roberta_prelayernorm Fix typos in strings and comments (#37784) 2025-04-25 13:47:25 +01:00
roc_bert Remove old code for PyTorch, Accelerator and tokenizers (#37234) 2025-04-10 20:54:21 +02:00
roformer tests/roformer: fix couple roformer tests on gpus (#38570) 2025-06-04 18:45:56 +02:00
rt_detr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
rt_detr_v2 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
rwkv 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
sam 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
sam_hq 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
seamless_m4t [seamless_m4t] Skip some tests when speech is not available (#38430) 2025-06-02 09:17:28 +00:00
seamless_m4t_v2 [seamless_m4t] Skip some tests when speech is not available (#38430) 2025-06-02 09:17:28 +00:00
segformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
seggpt Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
sew 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
sew_d Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
shieldgemma2 [chat-template] Unify tests and clean up 🧼 (#37275) 2025-04-10 14:42:32 +02:00
siglip [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
siglip2 [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
smolvlm [smolvlm] skip the test (#38099) 2025-05-13 12:50:43 +00:00
speech_encoder_decoder 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
speech_to_text 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
speecht5 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
splinter Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
squeezebert Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
stablelm 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
starcoder2 🚨 🚨 Inherited CausalLM Tests (#37590) 2025-05-23 18:29:31 +01:00
superglue Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
superpoint Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
swiftformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
swin 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
swin2sr 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
swinv2 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
switch_transformers Correctly drop tokens in SwitchTransformer (#37123) 2025-04-10 16:58:57 +02:00
t5 Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
table_transformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
tapas 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
textnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
time_series_transformer 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
timesfm [TimesFM] use the main revision instead of revision for integration test (#37558) 2025-04-17 11:26:03 +02:00
timesformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
timm_backbone Small fix on context manager detection (#37562) 2025-04-17 15:39:44 +02:00
timm_wrapper Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
trocr 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
tvp Add Optional to remaining types (#37808) 2025-04-28 14:20:45 +01:00
udop 🚨 rm already deprecated pad_to_max_length arg (#37617) 2025-05-01 15:21:55 +02:00
umt5 Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
unispeech 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
unispeech_sat 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
univnet chore: fix typos in the tests directory (#36813) 2025-03-21 10:20:05 +01:00
upernet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
video_llava fix spelling errors (#38608) 2025-06-05 13:57:23 +01:00
videomae [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
vilt 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
vipllava [tests] expand flex-attn test for vision models (#38434) 2025-06-03 07:40:44 +00:00
vision_encoder_decoder 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
vision_text_dual_encoder 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
visual_bert 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
vit Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vit_mae Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vit_msn Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vitdet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vitmatte Add args support for fast image processors (#37018) 2025-05-16 12:01:46 -04:00
vitpose Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vitpose_backbone Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vits Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
vivit 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
wav2vec2 🔴🔴🔴 [Attention] Refactor Attention Interface for Bart-based Models (#38108) 2025-05-22 17:12:58 +02:00
wav2vec2_bert 🚨 🚨 Setup -> setupclass conversion (#37282) 2025-04-08 17:15:37 +01:00
wav2vec2_conformer Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
wav2vec2_phoneme Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
wav2vec2_with_lm use torch.testing.assert_close instead to get more details about errors in CIs (#35659) 2025-01-24 16:55:28 +01:00
wavlm Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
whisper 🔴[Attention] Attention refactor for Whisper-based models (#38235) 2025-05-28 13:32:38 +02:00
x_clip 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
xglm Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
xlm Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
xlm_roberta Fix typos in strings and comments (#37799) 2025-04-28 11:39:11 +01:00
xlm_roberta_xl Remove old code for PyTorch, Accelerator and tokenizers (#37234) 2025-04-10 20:54:21 +02:00
xlnet Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
xmod Remove old code for PyTorch, Accelerator and tokenizers (#37234) 2025-04-10 20:54:21 +02:00
yolos 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
yoso Use Python 3.9 syntax in tests (#37343) 2025-04-08 14:12:08 +02:00
zamba 🚨Early-error🚨 config will error out if output_attentions=True and the attn implementation is wrong (#38288) 2025-05-23 17:17:38 +02:00
zamba2 [FlexAttn] Fix models with unique characteristics (#38433) 2025-06-04 13:37:28 +02:00
zoedepth added fast image processor for ZoeDepth and expanded tests accordingly (#38515) 2025-06-04 22:59:17 +00:00
__init__.py Move test model folders (#17034) 2022-05-03 14:42:02 +02:00