transformers/tests/models/gpt2
Latest commit: Tianlin Liu, 0040469bb8, Correct attention mask dtype for Flax GPT2 (#25636)
* Correct attention mask dtype
* reformat code
* add a test for boolean mask
* convert test to fast test
* delete unwanted print
* use assertTrue for testing
2023-08-25 17:36:37 +02:00
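The commit notes above describe correcting the attention mask dtype and adding a fast test for boolean masks. Below is a minimal sketch of such a check, not the PR's actual test code: it assumes a tiny hypothetical GPT2Config (n_layer=2, n_head=2, n_embd=32) and simply verifies that a boolean attention mask produces the same output as its integer equivalent.

import jax.numpy as jnp
from transformers import FlaxGPT2Model, GPT2Config

# Tiny, illustrative config so the check runs quickly; these sizes are assumptions,
# not the values used by the actual test in test_modeling_flax_gpt2.py.
config = GPT2Config(vocab_size=99, n_positions=64, n_embd=32, n_layer=2, n_head=2)
model = FlaxGPT2Model(config)

input_ids = jnp.array([[1, 2, 3, 4, 5]], dtype=jnp.int32)
# Boolean mask: after the fix, this dtype should be handled like an integer mask.
bool_mask = jnp.array([[True, True, True, False, False]])

out_bool = model(input_ids, attention_mask=bool_mask)
out_int = model(input_ids, attention_mask=bool_mask.astype(jnp.int32))
assert jnp.allclose(out_bool.last_hidden_state, out_int.last_hidden_state)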
__init__.py Move test model folders (#17034) 2022-05-03 14:42:02 +02:00
test_modeling_flax_gpt2.py Correct attention mask dtype for Flax GPT2 (#25636) 2023-08-25 17:36:37 +02:00
test_modeling_gpt2.py CI with num_hidden_layers=2 🚀🚀🚀 (#25266) 2023-08-02 20:22:36 +02:00
test_modeling_tf_gpt2.py Speed up TF tests by reducing hidden layer counts (#24595) 2023-06-30 16:30:33 +01:00
test_tokenization_gpt2_tf.py Update quality tooling for formatting (#21480) 2023-02-06 18:10:56 -05:00
test_tokenization_gpt2.py [OPT] Adds GPT2TokenizerFast to the list of tokenizer to use for OPT. (#20823) 2023-02-07 17:35:28 +01:00