transformers/templates
Daniel Stancl 38a716cd41
TF BART models - Add cross_attentions to model output and fix cross-attention head masking (#10699)
* Add cross_attn_head_mask to BART

* Fix cross_attentions in TFBart-like models

* This commit enables returning `cross_attentions` for TFBart-like models

* It also fixes attention head masking in the cross-attention module

* Update TF model templates

* Fix missing comma in TF model templates

* Fix typo: congig -> config
2021-04-26 14:16:21 +02:00
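
For context, a minimal sketch of what this change enables, assuming the facebook/bart-base checkpoint and a toy input (both illustrative choices, not taken from the PR): with `output_attentions=True` the model now returns `cross_attentions`, and `cross_attn_head_mask` masks individual heads of the encoder-decoder attention.

```python
import tensorflow as tf
from transformers import BartTokenizer, TFBartForConditionalGeneration

# Illustrative checkpoint; any BART checkpoint with TF weights should work.
model = TFBartForConditionalGeneration.from_pretrained("facebook/bart-base")
tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")

inputs = tokenizer("Hello, world!", return_tensors="tf")

# Keep only head 0 in each cross-attention layer; the mask has shape
# (decoder_layers, decoder_attention_heads), with 1.0 = keep, 0.0 = mask.
keep_head_0 = tf.one_hot(0, model.config.decoder_attention_heads)
cross_attn_head_mask = tf.tile(
    keep_head_0[None, :], (model.config.decoder_layers, 1)
)

outputs = model(
    **inputs,
    decoder_input_ids=inputs["input_ids"],
    cross_attn_head_mask=cross_attn_head_mask,
    output_attentions=True,
)

# cross_attentions: one tensor per decoder layer, each of shape
# (batch, num_heads, target_seq_len, source_seq_len).
print(len(outputs.cross_attentions), outputs.cross_attentions[0].shape)
```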
adding_a_new_example_script    Enforce string-formatting with f-strings (#10980)    2021-03-31 10:00:27 -04:00
adding_a_new_model             TF BART models - Add cross_attentions to model output and fix cross-attention head masking (#10699)    2021-04-26 14:16:21 +02:00