Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-05 22:00:09 +06:00)
Latest commit (squashed): add the Whisper speech-recognition model.

* Feature extractor with on-the-fly mel filter bank computation and correct padding, with the torch dependency removed from feature extraction.
* Tokenizer with suppress-token handling, an English text normalizer, and a `get_decoder_prompt_ids` helper.
* PyTorch and TensorFlow modeling code (tied embeddings, XLA-enabled TF generation) plus a conversion script for the original OpenAI checkpoints.
* Suppress-tokens and begin-suppress-tokens logits processors and forced-token handling wired into the configuration and `generate` (RAG generation updated to match).
* Configuration options (`ffn_dim`, suppress tokens), documentation (`whisper.mdx`, README, toctree entry), and tests: tokenization, feature extraction, PT/TF modeling, and slow integration tests that match logits against the original checkpoints, including multilingual and translation cases.

Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: NielsRogge <niels.rogge1@gmail.com>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
Co-authored-by: Joao Gante <joao@huggingface.co>
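The commit wires the Whisper feature extractor, tokenizer, suppress-token logits processors, and the `get_decoder_prompt_ids` helper into `generate`. Below is a minimal sketch of how these pieces are typically used together for transcription; the `openai/whisper-tiny` checkpoint name and the silent dummy audio are illustrative assumptions, not part of this commit.

```python
# Minimal sketch, assuming the openai/whisper-tiny checkpoint and a dummy
# 1-second silent clip; replace `audio` with real 16 kHz mono samples.
import numpy as np
from transformers import WhisperForConditionalGeneration, WhisperProcessor

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

audio = np.zeros(16_000, dtype=np.float32)  # stand-in for real audio

# The feature extractor pads/truncates to 30 s and computes log-mel features.
inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")

# get_decoder_prompt_ids builds the forced decoder prompt (language + task);
# the suppress-token lists come from the model config.
model.config.forced_decoder_ids = processor.get_decoder_prompt_ids(
    language="english", task="transcribe"
)

generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True))
```

The forced decoder prompt selects the language and task tokens, while the suppress-token lists stored in the configuration keep non-speech tokens out of the generated text.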
Files:

albert.mdx
auto.mdx
bart.mdx
barthez.mdx
bartpho.mdx
beit.mdx
bert-generation.mdx
bert-japanese.mdx
bert.mdx
bertweet.mdx
big_bird.mdx
bigbird_pegasus.mdx
blenderbot-small.mdx
blenderbot.mdx
bloom.mdx
bort.mdx
byt5.mdx
camembert.mdx
canine.mdx
clip.mdx
codegen.mdx
conditional_detr.mdx
convbert.mdx
convnext.mdx
cpm.mdx
ctrl.mdx
cvt.mdx
data2vec.mdx
deberta-v2.mdx
deberta.mdx
decision_transformer.mdx
deformable_detr.mdx
deit.mdx
detr.mdx
dialogpt.mdx
distilbert.mdx
dit.mdx
donut.mdx
dpr.mdx
dpt.mdx
electra.mdx
encoder-decoder.mdx
ernie.mdx
esm.mdx
flaubert.mdx
flava.mdx
fnet.mdx
fsmt.mdx
funnel.mdx
glpn.mdx
gpt_neo.mdx
gpt_neox_japanese.mdx
gpt_neox.mdx
gpt2.mdx
gptj.mdx
groupvit.mdx
herbert.mdx
hubert.mdx
ibert.mdx
imagegpt.mdx
layoutlm.mdx
layoutlmv2.mdx
layoutlmv3.mdx
layoutxlm.mdx
led.mdx
levit.mdx
longformer.mdx
longt5.mdx
luke.mdx
lxmert.mdx
m2m_100.mdx
marian.mdx
markuplm.mdx
maskformer.mdx
mbart.mdx
mctct.mdx
megatron_gpt2.mdx
megatron-bert.mdx
mluke.mdx
mobilebert.mdx
mobilevit.mdx
mpnet.mdx
mt5.mdx
mvp.mdx
nezha.mdx
nllb.mdx
nystromformer.mdx
openai-gpt.mdx
opt.mdx
owlvit.mdx
pegasus_x.mdx
pegasus.mdx
perceiver.mdx
phobert.mdx
plbart.mdx
poolformer.mdx
prophetnet.mdx
qdqbert.mdx
rag.mdx
realm.mdx
reformer.mdx
regnet.mdx
rembert.mdx
resnet.mdx
retribert.mdx
roberta.mdx
roformer.mdx
segformer.mdx
sew-d.mdx
sew.mdx
speech_to_text_2.mdx
speech_to_text.mdx
speech-encoder-decoder.mdx
splinter.mdx
squeezebert.mdx
swin.mdx
swinv2.mdx
t5.mdx
t5v1.1.mdx
tapas.mdx
tapex.mdx
time_series_transformer.mdx
trajectory_transformer.mdx
transfo-xl.mdx
trocr.mdx
ul2.mdx
unispeech-sat.mdx
unispeech.mdx
van.mdx
videomae.mdx
vilt.mdx
vision-encoder-decoder.mdx
vision-text-dual-encoder.mdx
visual_bert.mdx
vit_mae.mdx
vit_msn.mdx
vit.mdx
wav2vec2_phoneme.mdx
wav2vec2-conformer.mdx
wav2vec2.mdx
wavlm.mdx
whisper.mdx
xclip.mdx
xglm.mdx
xlm-prophetnet.mdx
xlm-roberta-xl.mdx
xlm-roberta.mdx
xlm.mdx
xlnet.mdx
xls_r.mdx
xlsr_wav2vec2.mdx
yolos.mdx
yoso.mdx