Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-18 12:08:22 +06:00)
* TFDeberta: moved weights to build and fixed name scope
* added missing ,
* bug fixes to enable graph mode execution
* updated setup.py, fixing typo
* fix imports
* embedding mask fix
* added layer names to avoid automatic incremental names, +XSoftmax
* cleanup
* added names to layers
* disable keras_serializable
* disentangled attention output shape with hidden_size==None using symbolic inputs
* test for TF Deberta
* make style
* Update src/transformers/models/deberta/modeling_tf_deberta.py (Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>)
* removed tensorflow-probability
* removed blank line
* removed tf experimental api; +torch_gather tf implementation from @Rocketknight1
* layer name DeBERTa --> deberta
* copyright fix
* added docs for TFDeberta & make style
* layer_name change to fix loading from the pt model
* layer_name change to match the pt model
* SequenceClassification layer name changed to match the pt model
* switched to Keras built-in LayerNormalization
* added `TFDeberta` prefix to most layer classes
* updated to tf.Tensor in the docstrings
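One of the items above replaces TensorFlow's experimental APIs with a plain-TF re-implementation of `torch.gather`. As an illustration of the general technique (a minimal sketch, not necessarily the exact code that landed in `modeling_tf_deberta.py`), torch.gather semantics along an arbitrary axis can be reproduced with `tf.gather_nd` by building a full coordinate tuple for every output element:

```python
import tensorflow as tf


def torch_gather(x, indices, gather_axis):
    """Sketch of torch.gather semantics in TensorFlow (illustrative only)."""
    # Coordinates of every element of `indices`, in row-major order: shape [N, rank], int64.
    all_indices = tf.where(tf.fill(tf.shape(indices), True))
    # The values of `indices`, flattened in the same row-major order.
    gather_locations = tf.reshape(indices, [-1])

    # Along the gather axis use the values from `indices`; along every other
    # axis keep the element's own coordinate.
    gather_indices = []
    for axis in range(len(indices.shape)):
        if axis == gather_axis:
            gather_indices.append(tf.cast(gather_locations, tf.int64))
        else:
            gather_indices.append(all_indices[:, axis])

    gather_indices = tf.stack(gather_indices, axis=-1)
    gathered = tf.gather_nd(x, gather_indices)
    return tf.reshape(gathered, tf.shape(indices))


# Example: equivalent to torch.gather(x, dim=1, index=idx)
x = tf.constant([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
idx = tf.constant([[2, 0, 1], [0, 0, 2]])
print(torch_gather(x, idx, gather_axis=1))  # [[3. 1. 2.], [4. 4. 6.]]
```

Building explicit `tf.gather_nd` indices like this avoids any dependency on `tensorflow-probability` or experimental ops and works in graph mode, which matches the goals stated in the commit messages.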
albert.rst
auto.rst
bart.rst
barthez.rst
beit.rst
bert_japanese.rst
bert.rst
bertgeneration.rst
bertweet.rst
bigbird_pegasus.rst
bigbird.rst
blenderbot_small.rst
blenderbot.rst
bort.rst
byt5.rst
camembert.rst
canine.rst
clip.rst
convbert.rst
cpm.rst
ctrl.rst
deberta_v2.rst
deberta.rst
deit.rst
detr.rst
dialogpt.rst
distilbert.rst
dpr.rst
electra.rst
encoderdecoder.rst
flaubert.rst
fsmt.rst
funnel.rst
gpt_neo.rst
gpt.rst
gpt2.rst
herbert.rst
hubert.rst
ibert.rst
layoutlm.rst
led.rst
longformer.rst
luke.rst
lxmert.rst
m2m_100.rst
marian.rst
mbart.rst
megatron_bert.rst
megatron_gpt2.rst
mobilebert.rst
mpnet.rst
mt5.rst
pegasus.rst
phobert.rst
prophetnet.rst
rag.rst
reformer.rst
rembert.rst
retribert.rst
roberta.rst
roformer.rst
speech_to_text.rst
squeezebert.rst
t5.rst
tapas.rst
transformerxl.rst
visual_bert.rst
vit.rst
wav2vec2.rst
xlm.rst
xlmprophetnet.rst
xlmroberta.rst
xlnet.rst
xlsr_wav2vec2.rst