Learning rate warmup (transformers.get_linear_schedule_with_warmup)


Learning rate warmup

  • During the warmup phase, the learning rate increases linearly from 0 to the initial lr set in the optimizer.

  • After the warmup phase, the schedule decreases the learning rate linearly from the optimizer's initial lr down to 0.
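The two phases above amount to a piecewise-linear multiplier applied to the optimizer's initial lr. A minimal pure-Python sketch of that multiplier (an illustrative re-implementation, not the library source):

```python
def lr_lambda(current_step, num_warmup_steps, num_training_steps):
    """Multiplier on the initial lr at a given training step."""
    if current_step < num_warmup_steps:
        # Warmup: rises linearly from 0 to 1.
        return current_step / max(1, num_warmup_steps)
    # Decay: falls linearly from 1 to 0 over the remaining steps.
    return max(0.0, (num_training_steps - current_step)
               / max(1, num_training_steps - num_warmup_steps))
```

For example, with 50 warmup steps and 100 total steps, the multiplier is 0.5 at step 25, peaks at 1.0 at step 50, and returns to 0.0 at step 100.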

Parameters

  • optimizer (Optimizer) – The optimizer for which to schedule the learning rate.

  • num_warmup_steps (int) – The number of steps for the warmup phase.

  • num_training_steps (int) – The total number of training steps.

  • last_epoch (int, optional, defaults to -1) – The index of the last epoch when resuming training.

Returns

  • torch.optim.lr_scheduler.LambdaLR with the appropriate schedule.
from transformers import get_linear_schedule_with_warmup

# Number of training steps: [number of batches] x [number of epochs].
total_steps = len(train_dataloader) * epochs

# Create the learning rate scheduler.
scheduler = get_linear_schedule_with_warmup(optimizer,
                                            num_warmup_steps=50,
                                            num_training_steps=total_steps)
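In a training loop, `scheduler.step()` is called once per batch (after `optimizer.step()`), so the lr traces out the warmup-then-decay shape over `total_steps`. A pure-Python trace of that trajectory, assuming an illustrative initial lr of 2e-5, 3 epochs of 100 batches, and 50 warmup steps (these numbers are assumptions, not from the original):

```python
initial_lr = 2e-5
total_steps = 100 * 3      # [number of batches] x [number of epochs]
num_warmup_steps = 50

def lr_at(step):
    """Learning rate the scheduler would set at a given step."""
    if step < num_warmup_steps:
        return initial_lr * step / max(1, num_warmup_steps)
    return initial_lr * max(0.0, (total_steps - step)
                            / max(1, total_steps - num_warmup_steps))

lrs = [lr_at(s) for s in range(total_steps)]
peak_step = max(range(total_steps), key=lambda s: lrs[s])
```

The trace starts at 0, peaks at the initial lr exactly when warmup ends (step 50 here), and decays linearly toward 0 by the final step.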

