transformers/templates
JB (Don) 5ea2595ecd
Add warning for missing attention mask when pad tokens are detected (#25345)
* Add attention mask and pad token warning to many of the models

* Remove changes under examples/research_projects

These files are not maintained by HF.

* Skip the warning check during torch.fx or JIT tracing

* Switch ordering for the warning and input shape assignment

This ordering is a little cleaner for some of the cases.

* Add missing line break in one of the files
2023-08-08 10:49:21 +02:00
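A minimal sketch of the kind of check this commit describes: warn when pad tokens appear in `input_ids` without an `attention_mask`, and skip the check under torch.fx or JIT tracing, where data-dependent Python branching on tensor values is not supported. The helper name, signature, and warning text below are assumptions for illustration, not the exact transformers implementation.

```python
import warnings

import torch
import torch.fx


def warn_if_padding_and_no_attention_mask(input_ids, attention_mask, pad_token_id):
    """Warn when pad tokens are present but no attention mask was passed."""
    # Skip the check during torch.fx symbolic tracing or TorchScript/JIT
    # tracing, where branching on tensor values is unsupported.
    if torch.jit.is_tracing() or isinstance(input_ids, torch.fx.Proxy):
        return
    # Nothing to check if a mask was supplied or no pad token is defined.
    if attention_mask is not None or pad_token_id is None:
        return
    # Only warn when pad tokens actually occur in the batch.
    if (input_ids == pad_token_id).any():
        warnings.warn(
            "input_ids contains pad tokens but no attention_mask was passed; "
            "results may be unexpected. Pass an attention_mask so padding "
            "tokens are ignored."
        )
```

Per the commit message, such a check would run before the input shape assignment in each model's forward pass, which keeps the early-return paths ahead of any shape bookkeeping.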
adding_a_missing_tokenization_test Move test model folders (#17034) 2022-05-03 14:42:02 +02:00
adding_a_new_example_script Allow trust_remote_code in example scripts (#25248) 2023-08-07 16:32:25 +02:00
adding_a_new_model Add warning for missing attention mask when pad tokens are detected (#25345) 2023-08-08 10:49:21 +02:00