Mirror of https://github.com/huggingface/transformers.git (synced 2025-07-30 01:32:23 +06:00)
* `gpt2` sdpa support
* fix (at least) one test, style, repo consistency
* fix sdpa mask in forward --> fixes generation
* test
* test2
* test3
* test4
* simplify shapes for attn mask creation and small comments
* hub fail test
* benchmarks
* flash attn 2 mask should not be inverted on enc-dec setup
* fix comment
* apply some suggestions from code review
  - only save `_attn_implementation` once
  - remove unnecessary comment
* change elif logic
* [run-slow] gpt2
* modify `test_gpt2_sample_max_time` to follow previous assertion patterns
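With this change, GPT-2 can opt in to the SDPA attention backend the same way other models do. A minimal sketch, assuming a transformers release that includes this PR; `attn_implementation` is the standard `from_pretrained` argument for selecting the backend, and everything else here is stock transformers API:

```python
# Minimal sketch: load GPT-2 with the SDPA attention backend added by this PR.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained(
    "gpt2",
    # Route attention through torch.nn.functional.scaled_dot_product_attention.
    attn_implementation="sdpa",
)

inputs = tokenizer("Hello, my dog is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The selected backend is recorded once on the config as `model.config._attn_implementation`, which is the attribute the "only save `_attn_implementation` once" review item above refers to.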
Directory contents:
- de
- en
- es
- fr
- hi
- it
- ja
- ko
- ms
- pt
- te
- tr
- zh
- _config.py