mirror of
https://github.com/huggingface/transformers.git
synced 2025-08-02 03:01:07 +06:00
[docstring] missing arg (#6933)
* [docstring] missing arg

add the missing `tie_word_embeddings` entry

* cleanup

* Update src/transformers/configuration_reformer.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
This commit is contained in:
parent
c3317e1f80
commit
acfaad74ab
@@ -115,6 +115,8 @@ class ReformerConfig(PretrainedConfig):
         vocab_size (:obj:`int`, optional, defaults to 320):
             Vocabulary size of the Reformer model. Defines the different tokens that
             can be represented by the `inputs_ids` passed to the forward method of :class:`~transformers.ReformerModel`.
+        tie_word_embeddings (:obj:`bool`, `optional`, defaults to :obj:`False`):
+            Whether to tie input and output embeddings.
 
     Example::
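The new `tie_word_embeddings` flag documented above controls weight tying: when enabled, the output projection reuses the input embedding matrix instead of learning a separate one. A minimal sketch of the idea, using a hypothetical `TinyLM` class that is not the transformers implementation:

```python
# Hypothetical illustration of weight tying (not the actual
# transformers/Reformer code): with tie_word_embeddings=True the
# output head shares the same matrix object as the input embedding.

class TinyLM:
    def __init__(self, vocab_size, hidden_size, tie_word_embeddings=False):
        # Input embedding: one hidden-size vector per vocabulary id.
        self.embedding = [[0.0] * hidden_size for _ in range(vocab_size)]
        if tie_word_embeddings:
            # Tied: the output projection is the very same object,
            # so the model has fewer independent parameters.
            self.output_weight = self.embedding
        else:
            # Untied: an independent matrix of the same shape.
            self.output_weight = [[0.0] * hidden_size
                                  for _ in range(vocab_size)]

tied = TinyLM(vocab_size=320, hidden_size=64, tie_word_embeddings=True)
untied = TinyLM(vocab_size=320, hidden_size=64)

assert tied.output_weight is tied.embedding        # shared storage
assert untied.output_weight is not untied.embedding
```

The vocab size of 320 mirrors the default documented in the diff; the hidden size is an arbitrary illustrative value.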