transformers/tests/models/whisper
benniekiss 5c6257d1fc
[whisper] Clarify error message when setting max_new_tokens (#33324)
* clarify error message when setting max_new_tokens

* sync error message in test_generate_with_prompt_ids_max_length

* there is no self
2024-09-12 18:48:36 +02:00
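The commit shown above concerns the error Whisper's generation raises when `max_new_tokens` would push the decoder past its fixed length limit (`config.max_target_positions`). The snippet below is a minimal sketch of that situation, not code from this test directory; the checkpoint name and the dummy input shape are illustrative assumptions, and the exact error wording is what the linked PR adjusts.

```python
# Sketch: requesting more new tokens than Whisper's decoder can hold
# should be rejected with a ValueError before decoding starts.
import torch
from transformers import WhisperForConditionalGeneration

# "openai/whisper-tiny" used here only as a small, convenient checkpoint.
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Dummy log-mel features: (batch, num_mel_bins, frames); 3000 frames ~ 30 s of audio.
input_features = torch.zeros(1, model.config.num_mel_bins, 3000)

try:
    # 1000 new tokens plus the forced start tokens exceed max_target_positions (448),
    # so generate() raises the (now clarified) ValueError instead of silently truncating.
    model.generate(input_features, max_new_tokens=1000)
except ValueError as err:
    print(err)
```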
__init__.py Add WhisperModel to transformers (#19166) 2022-10-05 22:28:31 +02:00
test_feature_extraction_whisper.py Remove trust_remote_code when loading Libri Dummy (#31748) 2024-07-23 14:54:38 +08:00
test_modeling_flax_whisper.py Forbid PretrainedConfig from saving generate parameters; Update deprecations in generate-related code 🧹 (#32659) 2024-08-23 11:12:53 +01:00
test_modeling_tf_whisper.py Forbid PretrainedConfig from saving generate parameters; Update deprecations in generate-related code 🧹 (#32659) 2024-08-23 11:12:53 +01:00
test_modeling_whisper.py [whisper] Clarify error message when setting max_new_tokens (#33324) 2024-09-12 18:48:36 +02:00
test_processor_whisper.py feat: Whisper prompting (#22496) 2023-05-19 09:33:11 +01:00
test_tokenization_whisper.py Fix flax whisper tokenizer bug (#33151) 2024-09-12 12:21:59 +01:00