Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-31 10:12:23 +06:00
Fix tiny typo (#15884)

parent 2eb7bb15e7
commit e535c389aa
@@ -56,7 +56,7 @@ class BertModelWithPabee(BertModel):
     the self-attention layers, following the architecture described in `Attention is all you need`_ by Ashish Vaswani,
     Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin.

-    To behave as an decoder the model needs to be initialized with the
+    To behave as a decoder the model needs to be initialized with the
     :obj:`is_decoder` argument of the configuration set to :obj:`True`; an
     :obj:`encoder_hidden_states` is expected as an input to the forward pass.
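The docstring being edited describes how to run BERT as a decoder: set `is_decoder` to `True` in the configuration and pass `encoder_hidden_states` to the forward call. A minimal sketch of that usage with `transformers` and `torch` (the tiny config sizes here are arbitrary, chosen only for illustration; in current versions of the library, `add_cross_attention=True` is also needed for the model to attend to the encoder states):

```python
import torch
from transformers import BertConfig, BertModel

# Hypothetical tiny config, just for illustration: is_decoder=True enables
# causal masking, add_cross_attention=True adds cross-attention layers so
# encoder_hidden_states can be consumed in the forward pass.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    is_decoder=True,
    add_cross_attention=True,
)
decoder = BertModel(config)

input_ids = torch.randint(0, config.vocab_size, (1, 5))
# In practice these states come from a separate encoder's forward pass;
# random tensors stand in for them here.
encoder_hidden_states = torch.randn(1, 7, config.hidden_size)

outputs = decoder(input_ids, encoder_hidden_states=encoder_hidden_states)
print(outputs.last_hidden_state.shape)  # torch.Size([1, 5, 32])
```

Without `is_decoder=True` the model behaves as a bidirectional encoder and the decoder-specific arguments have no effect.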