Mirror of https://github.com/huggingface/transformers.git
synced 2025-07-19 12:38:23 +06:00
Latest commit:

* Fix default attention mask size
* fixup
* add a test to make sure the model works even when no attention mask is provided
* style
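The behavior this commit describes can be illustrated with a short sketch. This is not the repository's actual test, just a minimal illustration assuming the standard `transformers` API and a hypothetical tiny config: when `attention_mask` is omitted, OPT is expected to build a default all-ones mask of the correct size, so the output should match a call with an explicit all-ones mask.

```python
# Minimal sketch (not the repository's test): calling OPT without an
# attention mask should match calling it with an explicit all-ones mask.
import torch
from transformers import OPTConfig, OPTModel

# Tiny randomly initialized model so the sketch runs quickly
# (the sizes here are illustrative, not from the commit).
config = OPTConfig(
    vocab_size=99,
    hidden_size=16,
    num_hidden_layers=2,
    num_attention_heads=4,
    ffn_dim=32,
    max_position_embeddings=64,
)
model = OPTModel(config).eval()

input_ids = torch.tensor([[2, 5, 7, 11, 13]])  # one sequence, no padding

with torch.no_grad():
    no_mask = model(input_ids).last_hidden_state
    with_mask = model(
        input_ids, attention_mask=torch.ones_like(input_ids)
    ).last_hidden_state

# Both calls should agree if the default mask is built with the right size.
assert torch.allclose(no_mask, with_mask, atol=1e-5)
```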
Files in this directory:

* __init__.py
* test_modeling_flax_opt.py
* test_modeling_opt.py
* test_modeling_tf_opt.py