Mirror of https://github.com/huggingface/transformers.git
Update modeling_utils.py (#28127)
In the docstring for `PreTrainedModel.resize_token_embeddings`, correct the definition of the `new_num_tokens` parameter to read "the new number of tokens" (i.e. the new size of the vocabulary) rather than "the number of new tokens" (the count of newly added tokens only).
parent 4a04b4ccca
commit 23f8e4db77
```diff
@@ -1706,7 +1706,7 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMixin):
 
         Arguments:
             new_num_tokens (`int`, *optional*):
-                The number of new tokens in the embedding matrix. Increasing the size will add newly initialized
+                The new number of tokens in the embedding matrix. Increasing the size will add newly initialized
                 vectors at the end. Reducing the size will remove vectors from the end. If not provided or `None`, just
                 returns a pointer to the input tokens `torch.nn.Embedding` module of the model without doing anything.
             pad_to_multiple_of (`int`, *optional*):
```
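The distinction the fix draws matters in practice: `new_num_tokens` is the resulting total vocabulary size, typically `len(tokenizer)` after adding tokens, not the number of tokens just added. A minimal sketch of that usage follows; the `gpt2` checkpoint and the custom token strings are illustrative assumptions, not part of this commit.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example checkpoint only; any model with an input embedding matrix works.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Add two hypothetical tokens to the tokenizer's vocabulary.
tokenizer.add_tokens(["<custom_token_1>", "<custom_token_2>"])

# new_num_tokens is the NEW TOTAL vocabulary size (here 50257 + 2 = 50259),
# not the number of newly added tokens (2).
model.resize_token_embeddings(new_num_tokens=len(tokenizer))

# Optionally, pad_to_multiple_of rounds the embedding matrix size up
# (e.g. to a multiple of 64), which can improve throughput on some hardware.
model.resize_token_embeddings(len(tokenizer), pad_to_multiple_of=64)
```

Passing the count of added tokens (`2`) here would instead shrink the embedding matrix to two rows, which is exactly the misreading the corrected wording prevents.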