Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-31 02:02:21 +06:00
[docstring] missing arg (#6933)
* [docstring] missing arg

  add the missing `tie_word_embeddings` entry

* cleanup

* Update src/transformers/configuration_reformer.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
This commit is contained in:
Parent: c3317e1f80
Commit: acfaad74ab
@@ -115,6 +115,8 @@ class ReformerConfig(PretrainedConfig):
         vocab_size (:obj:`int`, optional, defaults to 320):
             Vocabulary size of the Reformer model. Defines the different tokens that
             can be represented by the `inputs_ids` passed to the forward method of :class:`~transformers.ReformerModel`.
+        tie_word_embeddings (:obj:`bool`, `optional`, defaults to :obj:`False`):
+            Whether to tie input and output embeddings.

     Example::
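The documented `tie_word_embeddings` flag controls whether the input embedding matrix and the output projection share the same weights. As a minimal sketch of what tying means (a hypothetical `TinyLM` class in pure Python, not the transformers implementation, which ties actual `torch.nn` parameters):

```python
class TinyLM:
    """Toy model illustrating tied vs. untied word embeddings."""

    def __init__(self, vocab_size, hidden, tie_word_embeddings=False):
        # Input embedding: one row of `hidden` floats per vocabulary entry.
        self.input_embeddings = [[0.0] * hidden for _ in range(vocab_size)]
        if tie_word_embeddings:
            # Tied: the output projection is the very same matrix object,
            # so vocabulary-related parameters are stored only once.
            self.output_projection = self.input_embeddings
        else:
            # Untied (the documented default): an independent matrix
            # of the same shape.
            self.output_projection = [[0.0] * hidden for _ in range(vocab_size)]


tied = TinyLM(vocab_size=320, hidden=8, tie_word_embeddings=True)
untied = TinyLM(vocab_size=320, hidden=8)
print(tied.output_projection is tied.input_embeddings)      # True
print(untied.output_projection is untied.input_embeddings)  # False
```

Tying roughly halves the vocabulary-dependent parameter count, which is why many configs expose it as a switch rather than hard-coding either choice.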