transformers/templates
Latest commit 6ed9882ddb by Thomas Viehmann, 2021-11-30 11:47:33 -05:00:
use functional interface for softmax in attention (#14198)

* use functional interface instead of instantiating module and immediately calling it
* fix torch.nn.functional to nn.functional. Thank you Stas!
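The commit above swaps a module-style softmax call for the functional interface: in PyTorch terms, `nn.Softmax(dim=-1)(scores)` becomes `nn.functional.softmax(scores, dim=-1)`, avoiding the construction of a throwaway module object on every forward pass. A minimal stdlib-only sketch of the same pattern (the class and function names here are illustrative, not the actual transformers template code):

```python
import math

def softmax(xs):
    """Functional style: a plain function, no throwaway object per call."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

class Softmax:
    """Module style: an object is constructed, then immediately called."""
    def __call__(self, xs):
        return softmax(xs)

scores = [1.0, 2.0, 3.0]
module_style = Softmax()(scores)    # allocates an object just to call it once
functional_style = softmax(scores)  # same result, no intermediate object
assert module_style == functional_style
```

Both styles compute the same values; the functional form is simply the idiomatic choice when the operation carries no learnable state.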
adding_a_new_example_script: Update namespaces inside torch.utils.data to the latest. (#13167), 2021-08-19 14:29:51 +02:00
adding_a_new_model: use functional interface for softmax in attention (#14198), 2021-11-30 11:47:33 -05:00