[docstring] missing arg (#6933)

* [docstring] missing arg

add the missing `tie_word_embeddings` entry

* cleanup

* Update src/transformers/configuration_reformer.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Stas Bekman 2020-09-07 02:36:16 -07:00 committed by GitHub
parent c3317e1f80
commit acfaad74ab


@ -115,6 +115,8 @@ class ReformerConfig(PretrainedConfig):
vocab_size (:obj:`int`, `optional`, defaults to 320):
    Vocabulary size of the Reformer model. Defines the different tokens that
    can be represented by the `inputs_ids` passed to the forward method of :class:`~transformers.ReformerModel`.
tie_word_embeddings (:obj:`bool`, `optional`, defaults to :obj:`False`):
    Whether to tie the input and output word embeddings.
Example::
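The `tie_word_embeddings` flag documented above can be illustrated with a framework-free sketch. The class and names below are hypothetical, not part of `transformers`; they only show the semantics of tying: when the flag is set, the output projection reuses the input embedding matrix instead of learning a separate one.

```python
class TinyLM:
    """Hypothetical toy model illustrating `tie_word_embeddings` semantics."""

    def __init__(self, vocab_size, hidden_size, tie_word_embeddings=False):
        # Input embedding: one row of `hidden_size` floats per vocabulary token.
        self.input_embeddings = [[0.0] * hidden_size for _ in range(vocab_size)]
        if tie_word_embeddings:
            # Tied: the very same matrix object serves as the output projection,
            # so updates to one are visible in the other (shared parameters).
            self.output_embeddings = self.input_embeddings
        else:
            # Untied: an independent matrix of the same shape is allocated.
            self.output_embeddings = [[0.0] * hidden_size for _ in range(vocab_size)]


tied = TinyLM(vocab_size=320, hidden_size=8, tie_word_embeddings=True)
untied = TinyLM(vocab_size=320, hidden_size=8, tie_word_embeddings=False)

assert tied.output_embeddings is tied.input_embeddings
assert untied.output_embeddings is not untied.input_embeddings
```

Tying halves the number of embedding parameters and is common when input and output vocabularies are identical, which is why it appears as a configuration flag rather than a fixed architectural choice.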