top-k instead of top-p in MixtralConfig docstring (#30687)

top-k instead of top-p in docstring
This commit is contained in:
Simon 2024-05-07 10:19:24 +02:00 committed by GitHub
parent 835de4c833
commit 4980d62af3
GPG Key ID: B5690EEEBB952194

@@ -83,7 +83,7 @@ class MixtralConfig(PretrainedConfig):
attention_dropout (`float`, *optional*, defaults to 0.0):
The dropout ratio for the attention probabilities.
num_experts_per_tok (`int`, *optional*, defaults to 2):
-        The number of experts to root per-token, can be also interpreted as the `top-p` routing
+        The number of experts to route per-token, can be also interpreted as the `top-k` routing
parameter
num_local_experts (`int`, *optional*, defaults to 8):
Number of experts per Sparse MLP layer.
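
To illustrate why `num_experts_per_tok` is a `top-k` (not `top-p`) parameter, here is a minimal NumPy sketch of top-k expert routing. The function name `route_top_k` and the toy logits are hypothetical; this is not the actual Mixtral implementation, which operates on PyTorch tensors inside the model's MoE layer.

```python
import numpy as np

def route_top_k(router_logits, num_experts_per_tok=2):
    """Illustrative top-k routing sketch (not the real Mixtral code).

    router_logits: array of shape (num_tokens, num_local_experts).
    Returns the indices of the k selected experts per token and their
    softmax-normalized routing weights.
    """
    # Indices of the k highest-scoring experts for each token: a fixed
    # *count* k, which is why this is top-k rather than top-p routing.
    topk_idx = np.argsort(router_logits, axis=-1)[:, ::-1][:, :num_experts_per_tok]
    # Softmax over only the selected experts' logits gives the weights
    # used to combine the experts' outputs.
    topk_logits = np.take_along_axis(router_logits, topk_idx, axis=-1)
    exp = np.exp(topk_logits - topk_logits.max(axis=-1, keepdims=True))
    weights = exp / exp.sum(axis=-1, keepdims=True)
    return topk_idx, weights

logits = np.array([[0.1, 2.0, 0.3, 1.5]])  # one token, 4 experts
idx, w = route_top_k(logits, num_experts_per_tok=2)
print(idx)  # the two highest-scoring experts (1 and 3) are selected
```

With the defaults in `MixtralConfig` (`num_experts_per_tok=2`, `num_local_experts=8`), each token is dispatched to exactly 2 of the 8 experts per layer.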