Learning Rate Scheduling Callback
Source: R/CallbackSetLRScheduler.R
Changes the learning rate based on the schedule specified by a torch::lr_scheduler.
As of this writing, the following are available:
- torch::lr_one_cycle(), where the default values for epochs and steps_per_epoch are the number of training epochs and the number of batches per epoch (see the sketch below)
- Custom schedulers defined with torch::lr_scheduler()
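For example, the one-cycle scheduler could be wrapped as in the following sketch. This is an illustration, not an excerpt from the package: it assumes that torch::lr_one_cycle()'s max_lr argument is forwarded to the scheduler through the constructor's ..., and it leaves epochs and steps_per_epoch at the defaults described above.

library(mlr3torch)
library(torch)

# Sketch: wrap torch::lr_one_cycle() in the callback.
# epochs and steps_per_epoch are omitted; per the description above they
# default to the number of training epochs and the number of batches per epoch.
cb_one_cycle = CallbackSetLRScheduler$new(
  .scheduler = lr_one_cycle,
  step_on_epoch = FALSE, # one-cycle is typically stepped after every batch
  max_lr = 0.01          # forwarded to lr_one_cycle() via ...
)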
Super class
mlr3torch::CallbackSet -> CallbackSetLRScheduler
Public fields
scheduler_fn
(lr_scheduler_generator)
The torch function that creates a learning rate scheduler.
scheduler
(LRScheduler)
The learning rate scheduler wrapped by this callback.
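Continuing the sketch above, the two fields can be inspected on a constructed callback. Whether scheduler is populated before training is an assumption here, not documented behaviour:

cb_one_cycle$scheduler_fn  # the generator passed as .scheduler (here lr_one_cycle)
cb_one_cycle$scheduler     # presumably only set once training has created the optimizer to wrap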
Methods
Method new()
Creates a new instance of this R6 class.
Usage
CallbackSetLRScheduler$new(.scheduler, step_on_epoch, ...)
Arguments
.scheduler
(lr_scheduler_generator)
The torch scheduler generator (e.g. torch::lr_step).
step_on_epoch
(logical(1))
Whether the scheduler steps after every epoch (otherwise after every batch).
...
(any)
The scheduler-specific initialization arguments.
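As a second sketch, scheduler-specific arguments such as step_size and gamma for torch::lr_step() would be passed through ... as described above. Again, this is an illustration based on the signature shown in Usage, not code taken from the package:

library(mlr3torch)
library(torch)

# Sketch: decay the learning rate by a factor of 0.5 every 10 epochs.
# step_size and gamma are forwarded to torch::lr_step() via `...`.
cb_step = CallbackSetLRScheduler$new(
  .scheduler = lr_step,
  step_on_epoch = TRUE, # lr_step() is stepped once per epoch
  step_size = 10,
  gamma = 0.5
)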