link for optimizer names (#32400)

* link for optimizer names

Add a note and a link so users can easily find more optimizer names, since there are many more optimizers than the docstring mentions.

* make fixup
Nicholas Broad 2024-08-20 15:28:24 -07:00 committed by GitHub
parent 078d5a88cd
commit 1dde50c7d2

src/transformers/training_args.py

@@ -611,8 +611,9 @@ class TrainingArguments:
             The options should be separated by whitespaces.
         optim (`str` or [`training_args.OptimizerNames`], *optional*, defaults to `"adamw_torch"`):
-            The optimizer to use: adamw_hf, adamw_torch, adamw_torch_fused, adamw_apex_fused, adamw_anyprecision or
-            adafactor.
+            The optimizer to use, such as "adamw_hf", "adamw_torch", "adamw_torch_fused", "adamw_apex_fused", "adamw_anyprecision",
+            "adafactor". See `OptimizerNames` in [training_args.py](https://github.com/huggingface/transformers/blob/main/src/transformers/training_args.py)
+            for a full list of optimizers.
         optim_args (`str`, *optional*):
             Optional arguments that are supplied to AnyPrecisionAdamW.
         group_by_length (`bool`, *optional*, defaults to `False`):
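
For context, here is a minimal sketch (not part of this commit) of how the linked `OptimizerNames` enum relates to the `optim` argument. It assumes `transformers` and its training dependencies are installed; `output_dir="out"` is an arbitrary placeholder.

```python
# Minimal sketch: enumerate the optimizer names the docstring now links to,
# then pass one of them to TrainingArguments.
from transformers import TrainingArguments
from transformers.training_args import OptimizerNames

# The enum values are the strings accepted by the `optim` argument.
print([name.value for name in OptimizerNames])

# "out" is a placeholder output directory for this example.
args = TrainingArguments(output_dir="out", optim="adafactor")

# After __post_init__, `optim` is stored as an OptimizerNames enum member.
print(args.optim == OptimizerNames.ADAFACTOR)  # True
```

Pointing readers at the enum rather than maintaining an inline list keeps the docstring short while remaining complete as new optimizers are added.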