Mirror of https://github.com/huggingface/transformers.git
Latest commit: Output cross-attention with decoder attention output

* Output cross-attention with decoder attention output
* Update src/transformers/modeling_bert.py
* Add cross-attention for T5 and BART as well
* Fix tests
* Correct typo in docs
* Address Sylvain's and Sam's comments
* Correct typo

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
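The commit summarized above makes encoder-decoder models return the decoder's cross-attention weights alongside its self-attention weights when attentions are requested. A minimal sketch of retrieving them, assuming a transformers version that includes this change; the BART checkpoint name is only illustrative:

```python
from transformers import BartModel, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartModel.from_pretrained("facebook/bart-base")

inputs = tokenizer("Hello world", return_tensors="pt")

# With output_attentions=True, the seq2seq output exposes `cross_attentions`:
# one tensor per decoder layer, each of shape
# (batch_size, num_heads, target_seq_len, source_seq_len).
outputs = model(**inputs, output_attentions=True)
print(len(outputs.cross_attentions), outputs.cross_attentions[0].shape)
```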
Files:

* callback.rst
* configuration.rst
* logging.rst
* model.rst
* optimizer_schedules.rst
* output.rst
* pipelines.rst
* processors.rst
* tokenizer.rst
* trainer.rst