Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-30 17:52:35 +06:00
Squashed commits:

* decoder forward pass is working
* no model has forward pass returning attentions
* decoder ngram changed to not mix batch size
* current basic forward pass returns identical result
* passed test_model attentions
* passed test_encoder_decoder_model_generate
* passed test_headmasking
* removed old block
* removed comments bug/fixme
* removed bug comments
* applied styling
* applied fix-copies
* applied ngram forward comments
* corrected dimension notation
* applied styling and comment fixes
* changed asserts for raise ValueError
* changed question gen test
* updated hidden_states integration test
* applied styling
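One of the commits above swaps `assert` statements for explicit `ValueError`s. A minimal sketch of that pattern follows; the function name, argument, and message are illustrative, not taken from the actual ProphetNet code:

```python
def check_ngram(ngram: int) -> int:
    """Validate an n-gram size, raising instead of asserting.

    Hypothetical example of the assert-to-ValueError change: unlike
    `assert`, the check is not stripped under `python -O`, and callers
    get a standard, catchable exception type.
    """
    # Before (commit description): assert ngram > 0
    if ngram <= 0:
        raise ValueError(f"`ngram` has to be a strictly positive integer, got {ngram}")
    return ngram
```

Raising `ValueError` also lets library users handle bad inputs gracefully rather than crashing on an `AssertionError`.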
Files changed:

- __init__.py
- test_modeling_prophetnet.py
- test_tokenization_prophetnet.py