
# Optimization

The `.optimization` module provides:

- an optimizer with fixed weight decay, usable for fine-tuning models
- several schedule objects that inherit from `_LRSchedule`
- a gradient accumulation class for accumulating gradients over multiple batches
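As a minimal sketch of how these pieces fit together (assuming `torch` and `transformers` are installed; the model and hyperparameter values are illustrative), an optimizer with decoupled weight decay can be paired with a warmup schedule like this:

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Toy model; any nn.Module works the same way.
model = torch.nn.Linear(2, 2)

# Optimizer with decoupled (fixed) weight decay, as described above.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5, weight_decay=0.01)

num_training_steps = 10
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=2, num_training_steps=num_training_steps
)

lrs = []
for _ in range(num_training_steps):
    # (forward/backward pass would go here)
    optimizer.step()
    lrs.append(scheduler.get_last_lr()[0])
    scheduler.step()
# The learning rate ramps up linearly for 2 steps, then decays linearly toward 0.
```

Calling `scheduler.step()` once per optimizer step (not per epoch) is what these schedules expect, since `num_training_steps` counts optimizer updates.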

## AdamW (PyTorch)

[[autodoc]] AdamW

## AdaFactor (PyTorch)

[[autodoc]] Adafactor
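A common way to use `Adafactor` with an externally managed, fixed learning rate is to disable its internal time-dependent schedule; a sketch (assuming `torch` and `transformers` are installed, with illustrative hyperparameters):

```python
import torch
from transformers import Adafactor

model = torch.nn.Linear(2, 2)

# Disable Adafactor's relative-step schedule and parameter scaling,
# and supply a fixed external learning rate instead.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=False,
    relative_step=False,
    warmup_init=False,
    lr=1e-3,
)

# One illustrative update step.
loss = model(torch.randn(4, 2)).pow(2).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```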

## AdamWeightDecay (TensorFlow)

[[autodoc]] AdamWeightDecay

[[autodoc]] create_optimizer
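On the TensorFlow side, `create_optimizer` builds an `AdamWeightDecay` optimizer together with a matching `WarmUp` learning-rate schedule in one call; a sketch (assuming TensorFlow is installed, with illustrative values):

```python
from transformers import create_optimizer

# Returns (optimizer, lr_schedule): an AdamWeightDecay instance and the
# WarmUp schedule it uses (linear warmup, then polynomial decay).
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,
    num_train_steps=1000,
    num_warmup_steps=100,
    weight_decay_rate=0.01,
)
```

The returned schedule is a callable mapping the step number to a learning rate, so it can be inspected directly, e.g. `lr_schedule(0)`.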

## Schedules

### Learning Rate Schedules (PyTorch)

[[autodoc]] SchedulerType

[[autodoc]] get_scheduler

[[autodoc]] get_constant_schedule

[[autodoc]] get_constant_schedule_with_warmup

[[autodoc]] get_cosine_schedule_with_warmup

[[autodoc]] get_cosine_with_hard_restarts_schedule_with_warmup

[[autodoc]] get_linear_schedule_with_warmup

[[autodoc]] get_polynomial_decay_schedule_with_warmup

[[autodoc]] get_inverse_sqrt_schedule
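Any of the schedules above can also be created by name through the unified `get_scheduler` factory (the names correspond to `SchedulerType` values); a sketch, assuming `torch` and `transformers` are installed:

```python
import torch
from transformers import get_scheduler

model = torch.nn.Linear(2, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# "cosine" selects the cosine-with-warmup schedule; other SchedulerType
# names such as "linear" or "constant_with_warmup" work the same way.
scheduler = get_scheduler(
    "cosine",
    optimizer=optimizer,
    num_warmup_steps=10,
    num_training_steps=100,
)

for _ in range(100):
    optimizer.step()
    scheduler.step()
# After the full cosine cycle, the learning rate has decayed to 0.
```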

### Warmup (TensorFlow)

[[autodoc]] WarmUp

## Gradient Strategies

### GradientAccumulator (TensorFlow)

[[autodoc]] GradientAccumulator