transformers/tests/models/t5
Latest commit da971b2271 by fxmarty (2023-09-21 19:00:03 +09:00):

Keep relevant weights in fp32 when model._keep_in_fp32_modules is set even when accelerate is not installed (#26225)

* fix bug where weight would not be kept in fp32
* nit
* address review comments
* fix test
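The commit above concerns the `_keep_in_fp32_modules` mechanism: when a model is loaded in half precision, parameters belonging to numerically sensitive submodules (for T5, the `wo` feed-forward projection) must be upcast back to fp32, and the fix ensures this happens even without accelerate installed. A minimal sketch of that mechanism, using a hypothetical toy module and helper (`ToyBlock`, `cast_with_fp32_exceptions` are illustrative names, not the library's internals):

```python
import torch
from torch import nn

# Hypothetical stand-in for a T5 feed-forward block: "wo" mirrors the entry
# in T5ForConditionalGeneration._keep_in_fp32_modules and should stay fp32.
class ToyBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.wi = nn.Linear(8, 8)
        self.wo = nn.Linear(8, 8)

_keep_in_fp32_modules = ["wo"]

def cast_with_fp32_exceptions(model, dtype=torch.float16):
    # Cast the whole model to the requested low-precision dtype first ...
    model.to(dtype)
    # ... then restore any parameter whose qualified name contains a module
    # listed in _keep_in_fp32_modules. The gist of #26225 is that this
    # upcasting path must run even when accelerate is not installed.
    for name, param in model.named_parameters():
        if any(m in name.split(".") for m in _keep_in_fp32_modules):
            param.data = param.data.to(torch.float32)
    return model

model = cast_with_fp32_exceptions(ToyBlock())
```

After the call, `model.wi.weight` is fp16 while `model.wo.weight` remains fp32, which is what the regression test in test_modeling_t5.py guards.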
| File | Last commit | Date |
|---|---|---|
| __init__.py | Move test model folders (#17034) | 2022-05-03 14:42:02 +02:00 |
| test_modeling_flax_t5.py | CI with num_hidden_layers=2 🚀🚀🚀 (#25266) | 2023-08-02 20:22:36 +02:00 |
| test_modeling_t5.py | Keep relevant weights in fp32 when model._keep_in_fp32_modules is set even when accelerate is not installed (#26225) | 2023-09-21 19:00:03 +09:00 |
| test_modeling_tf_t5.py | Skip test_beam_search_xla_generate_simple for T5 (#25566) | 2023-08-17 15:30:46 +02:00 |
| test_tokenization_t5.py | 🚨🚨 🚨🚨 [Tokenizer] attemp to fix add_token issues🚨🚨 🚨🚨 (#23909) | 2023-09-18 20:28:36 +02:00 |