transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}
Daniel Stancl 38a716cd41 2021-04-26 14:16:21 +02:00
TF BART models - Add cross_attentions to model output and fix cross-attention head masking (#10699)

* Add cross_attn_head_mask to BART
* Fix cross_attentions in TFBart-like models
* This commit enables returning of `cross_attentions` for TFBart-like models
* It also fixes attention head masking in the cross-attention module
* Update TF model templates
* Fix missing `,` in TF model templates
* Fix typo: congig -> config
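A minimal sketch of what this change exposes, assuming a transformers release that includes #10699, an installed TensorFlow, and the public facebook/bart-base checkpoint: the decoder's cross-attention weights can be returned via `output_attentions=True`, and individual cross-attention heads can be masked with the new `cross_attn_head_mask` argument.

import tensorflow as tf
from transformers import BartTokenizer, TFBartModel

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = TFBartModel.from_pretrained("facebook/bart-base")

inputs = tokenizer("Hello world", return_tensors="tf")

num_layers = model.config.decoder_layers
num_heads = model.config.decoder_attention_heads

# Shape (decoder_layers, decoder_attention_heads); 1.0 keeps a head, 0.0 masks it.
# Here head 0 of every decoder layer's cross-attention is masked.
cross_attn_head_mask = tf.concat(
    [tf.zeros((num_layers, 1)), tf.ones((num_layers, num_heads - 1))], axis=-1
)

outputs = model(
    **inputs,
    cross_attn_head_mask=cross_attn_head_mask,
    output_attentions=True,
)

# One tensor per decoder layer, shape (batch, num_heads, tgt_len, src_len).
print(len(outputs.cross_attentions), outputs.cross_attentions[0].shape)

The same pattern applies to the generated model templates below, with TFBartModel replaced by the cookiecutter model class.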
File | Last commit | Date
__init__.py | Fix model templates (#9999) | 2021-02-04 07:47:26 -05:00
{{cookiecutter.lowercase_modelname}}.rst | Honor contributors to models (#11329) | 2021-04-21 09:47:27 -04:00
configuration_{{cookiecutter.lowercase_modelname}}.py | Fixes in the templates (#10951) | 2021-03-29 17:36:13 -04:00
configuration.json | Model Templates for Seq2Seq (#9251) | 2020-12-22 23:41:20 +01:00
modeling_{{cookiecutter.lowercase_modelname}}.py | Fix cross-attention head mask for Torch encoder-decoder models (#10605) | 2021-04-23 18:58:06 +02:00
modeling_tf_{{cookiecutter.lowercase_modelname}}.py | TF BART models - Add cross_attentions to model output and fix cross-attention head masking (#10699) | 2021-04-26 14:16:21 +02:00
test_modeling_{{cookiecutter.lowercase_modelname}}.py | Fix model templates (#9999) | 2021-02-04 07:47:26 -05:00
test_modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Enforce string-formatting with f-strings (#10980) | 2021-03-31 10:00:27 -04:00
to_replace_{{cookiecutter.lowercase_modelname}}.py | Fix model templates (#9999) | 2021-02-04 07:47:26 -05:00
tokenization_{{cookiecutter.lowercase_modelname}}.py | Make get_special_tokens_mask consider all tokens (#11163) | 2021-04-09 11:57:44 -04:00
tokenization_fast_{{cookiecutter.lowercase_modelname}}.py | Copy tokenizer files in each of their repo (#10624) | 2021-03-10 11:26:23 -05:00