transformers/examples/research_projects/movement-pruning/emmental

Latest commit: 6ed9882ddb by Thomas Viehmann, 2021-11-30 11:47:33 -05:00
use functional interface for softmax in attention (#14198)

* use functional interface instead of instantiating module and immediately calling it
* fix torch.nn.functional to nn.functional. Thank you Stas!
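The commit above swaps an instantiate-and-call pattern for PyTorch's stateless functional API. A minimal sketch of that change, with illustrative tensor shapes (the variable names below are hypothetical, not the actual `modeling_bert_masked.py` code):

```python
import torch
from torch import nn

# Dummy attention scores: (batch, heads, query_len, key_len) -- illustrative shape
scores = torch.randn(2, 4, 8, 8)

# Before: a nn.Softmax module is constructed only to be called once and discarded
probs_module = nn.Softmax(dim=-1)(scores)

# After: the functional form avoids the throwaway module object
probs_functional = nn.functional.softmax(scores, dim=-1)

# Both forms compute the same probabilities
assert torch.allclose(probs_module, probs_functional)
```

The functional form is idiomatic when the operation has no parameters or state, as is the case for softmax inside an attention layer.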
modules/                      [style] consistent nn. and nn.functional: part 4 examples (#12156)  2021-06-14 12:28:24 -07:00
__init__.py                   Reorganize examples (#9010)                                         2020-12-11 10:07:02 -05:00
configuration_bert_masked.py  Reorganize examples (#9010)                                         2020-12-11 10:07:02 -05:00
modeling_bert_masked.py       use functional interface for softmax in attention (#14198)          2021-11-30 11:47:33 -05:00