Stack for Hot Start Learners
This class stores learners for hot starting training, i.e. resuming or continuing training from an already fitted model. We assume that hot starting is only possible if a single hyperparameter (also called the fidelity parameter, usually controlling the complexity or expensiveness of the model) is altered and all other hyperparameters are identical.
HotstartStack stores trained learners which can potentially be used to
hot start a learner. A learner automatically hot starts while training if a
stack is attached to its
$hotstart_stack field and the stack contains a suitable learner.
For example, if you want to train a random forest learner with 1000 trees but
already have a random forest learner with 500 trees (hot start learner),
you can add the hot start learner to the
HotstartStack of the expensive learner
with 1000 trees. If you now call the
train() method (or
benchmark()), a random forest with 500 trees is fitted and combined
with the 500 trees of the hot start learner, effectively saving you
the cost of fitting 500 trees.
Hot starting is only supported by learners which have the property
"hotstart_forward" or "hotstart_backward". For example, an XGBoost learner
(in mlr3learners) can hot start forward by adding more boosting
iterations, and a random forest can hot start backward by removing trees.
The fidelity parameters are tagged with
"hotstart" in the learner's parameter set.
Stores hot start learners.
Creates a new instance of this R6 class.
HotstartStack$new(learners = NULL)
(list of Learners)
Learners added to the hotstart stack. If
NULL (default), an empty stack is created.
Add learners to hot start stack.
(list of Learners). Learners to add to the hotstart stack.
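As a brief sketch, a stack can also start out empty and have trained learners added later via $add(). This hedged example again uses the classif.debug learner from mlr3:

```r
library(mlr3)

# fit a cheap learner first
task = tsk("pima")
fitted = lrn("classif.debug", iter = 1)
fitted$train(task)

# create an empty stack, then add the trained learner to it
hot = HotstartStack$new()
hot$add(list(fitted))
```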
Calculates the cost for each learner in the stack to hot start the target
learner.
The following cost values can be returned:
NA_real_: Learner is unsuitable to hot start the target learner.
-1: Hot start learner in the stack and target learner are identical.
0: Cost for hot starting backwards is always 0.
> 0: Cost for hot starting forward.
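A sketch illustrating these cost values, assuming (as in the example below) that classif.debug supports forward hot starting via its iter parameter:

```r
library(mlr3)

task = tsk("pima")
fitted = lrn("classif.debug", iter = 5)
fitted$train(task)
hot = HotstartStack$new(list(fitted))

# forward hot start: positive cost (the additional iterations to fit)
hot$start_cost(lrn("classif.debug", iter = 10), task$hash)

# identical configuration: cost -1
hot$start_cost(lrn("classif.debug", iter = 5), task$hash)
```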
Helper for print outputs.
# train learner on pima task
task = tsk("pima")
learner = lrn("classif.debug", iter = 1)
learner$train(task)

# initialize stack with previously fitted learner
hot = HotstartStack$new(list(learner))

# retrieve learner with increased fidelity parameter
learner = lrn("classif.debug", iter = 2)

# calculate cost of hot starting
hot$start_cost(learner, task$hash)
#> [1] 1

# add stack with hot start learner
learner$hotstart_stack = hot

# train automatically uses hot start learner while fitting the model
learner$train(task)