Learning rate warmup

- During warmup, the learning rate increases linearly from 0 to the initial lr set in the optimizer.
- After the warmup phase, the schedule decreases the learning rate linearly from the optimizer's initial lr back down to 0 (a minimal sketch follows this list).
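To make the shape of this schedule concrete, here is a minimal sketch of how such a piecewise-linear multiplier can be expressed with `torch.optim.lr_scheduler.LambdaLR`. The helper `linear_warmup_lambda` and the toy model/optimizer are hypothetical illustrations (only the parameter names mirror the API below); the formula illustrates the behavior described above and is not claimed to be the library's verbatim source.

```python
import torch
from torch.optim.lr_scheduler import LambdaLR

def linear_warmup_lambda(num_warmup_steps, num_training_steps):
    # Hypothetical helper: returns a multiplier applied to the
    # optimizer's initial lr at each step.
    def lr_lambda(current_step):
        if current_step < num_warmup_steps:
            # Warmup: multiplier rises linearly from 0 to 1.
            return float(current_step) / float(max(1, num_warmup_steps))
        # Decay: multiplier falls linearly from 1 back to 0.
        return max(0.0, float(num_training_steps - current_step) /
                        float(max(1, num_training_steps - num_warmup_steps)))
    return lr_lambda

# Toy example (assumed model and optimizer, for illustration only):
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
scheduler = LambdaLR(optimizer, lr_lambda=linear_warmup_lambda(50, 1000))
```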
Parameters

- optimizer (Optimizer) – The optimizer for which to schedule the learning rate.
- num_warmup_steps (int) – The number of steps for the warmup phase.
- num_training_steps (int) – The total number of training steps.
- last_epoch (int, optional, defaults to -1) – The index of the last epoch when resuming training.

Returns

torch.optim.lr_scheduler.LambdaLR with the appropriate schedule.
```python
from transformers import get_linear_schedule_with_warmup

# Number of training steps: [number of batches] x [number of epochs].
total_steps = len(train_dataloader) * epochs

# Create the learning rate scheduler with 50 warmup steps.
scheduler = get_linear_schedule_with_warmup(optimizer,
                                            num_warmup_steps=50,
                                            num_training_steps=total_steps)
```
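The scheduler only changes the learning rate when it is stepped. A hedged sketch of where `scheduler.step()` sits in a typical training loop follows; `model`, `loss_fn`, and the structure of each batch are assumed from the surrounding training setup and are not defined in this excerpt.

```python
for epoch in range(epochs):
    for batch, labels in train_dataloader:  # assumed batch structure
        optimizer.zero_grad()
        loss = loss_fn(model(batch), labels)
        loss.backward()
        optimizer.step()
        scheduler.step()  # advance the linear warmup/decay by one step
```

Calling `scheduler.step()` once per batch (after `optimizer.step()`) is what makes `total_steps = len(train_dataloader) * epochs` the right value for num_training_steps.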