transformers/tests/models/t5
Younes Belkada 1af4bee896
Add keep_in_fp32_modules support (#20683)
* add `keep_in_fp32_modules` support

* pass it as class attribute

* a few modifications

- make tests `slow`
- fix logic

* better logic

* fix failing test

* `bfloat16` support

* Update src/transformers/modeling_utils.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

* fix

* simplify tests

* simplify tests

* fix test

* modify message

* more checks

* fix failing tests

* add more conditions

- add `is_accelerate_available`
- fix pipeline tests that failed

* add suggestions

* Update src/transformers/modeling_utils.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

* fix failing `bnb` test

* add last safety checker

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
2022-12-13 11:59:57 +01:00
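As context for the change above: the PR lets a model class mark specific submodules to be kept in `float32` even when the rest of the checkpoint is loaded in a lower precision (T5 uses this for its feed-forward `wo` projections, which are numerically unstable in fp16). A minimal usage sketch, assuming a `transformers` version that includes #20683 and `torch` installed; the module path shown is for `t5-small` and is illustrative only:

```python
import torch
from transformers import T5ForConditionalGeneration

# Load the checkpoint in half precision; modules listed in the class attribute
# `_keep_in_fp32_modules` (e.g. "wo" for T5) should remain in float32.
model = T5ForConditionalGeneration.from_pretrained(
    "t5-small", torch_dtype=torch.float16
)

# Expected to print torch.float32 if the keep-in-fp32 path is active
# (behavior can depend on the installed transformers/accelerate versions).
print(model.encoder.block[0].layer[1].DenseReluDense.wo.weight.dtype)

# The shared embedding is not listed, so it loads in the requested dtype.
print(model.shared.weight.dtype)  # torch.float16
```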
__init__.py Move test model folders (#17034) 2022-05-03 14:42:02 +02:00
test_modeling_flax_t5.py [FLAX] Add dtype to embedding for bert/bart/opt/t5 (#20340) 2022-11-28 10:21:42 -05:00
test_modeling_t5.py Add keep_in_fp32_modules support (#20683) 2022-12-13 11:59:57 +01:00
test_modeling_tf_t5.py 🚨🚨🚨 TF: Remove TFWrappedEmbeddings (breaking: TF embedding initialization updated for encoder-decoder models) (#19263) 2022-10-11 16:48:03 +01:00
test_tokenization_t5.py change the way sentinel tokens can be retrieved (#20373) 2022-11-23 09:35:44 -05:00