Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-12 17:20:03 +06:00)
Latest commit: **Adding SDPA support for RoBERTa-based models**

* add not is_cross_attention
* fix copies
* fix test
* add minimal test for camembert and xlm_roberta as their test class does not inherit from ModelTesterMixin
* address some review comments
* use copied from
* style
* consistency
* fix lists

Co-authored-by: fxmarty <9808326+fxmarty@users.noreply.github.com>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
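Since the commit above adds SDPA (scaled dot-product attention) support to RoBERTa-based models, here is a minimal usage sketch. It assumes a transformers release in which `attn_implementation="sdpa"` is accepted for the RoBERTa family (RoBERTa, CamemBERT, XLM-RoBERTa); the `roberta-base` checkpoint is only illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative checkpoint; any RoBERTa-based checkpoint (e.g. camembert or
# xlm-roberta) should work the same way once SDPA support is available.
model_name = "roberta-base"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(
    model_name,
    # Route attention through torch.nn.functional.scaled_dot_product_attention
    attn_implementation="sdpa",
)

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)
```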
| Name |
|------|
| de |
| en |
| es |
| fr |
| hi |
| it |
| ja |
| ko |
| ms |
| pt |
| te |
| tr |
| zh |
| _config.py |