Featureless torch learner. Output is a constant weight that is learned during training. For classification, this should (asymptotically) result in a majority class prediction when using the standard cross-entropy loss. For regression, this should result in the median for L1 loss and in the mean for L2 loss.
Parameters
Only those from LearnerTorch.
Super classes
mlr3::Learner -> mlr3torch::LearnerTorch -> LearnerTorchFeatureless
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$encapsulate()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$reset()
mlr3::Learner$train()
mlr3torch::LearnerTorch$dataset()
mlr3torch::LearnerTorch$format()
mlr3torch::LearnerTorch$marshal()
mlr3torch::LearnerTorch$print()
mlr3torch::LearnerTorch$unmarshal()
Method new()
Creates a new instance of this R6 class.
Usage
LearnerTorchFeatureless$new(
task_type,
optimizer = NULL,
loss = NULL,
callbacks = list()
)
Arguments
task_type
(character(1))
The task type, either "classif" or "regr".
optimizer
(TorchOptimizer)
The optimizer to use for training. Per default, adam is used.
loss
(TorchLoss)
The loss used to train the network. Per default, mse is used for regression and cross_entropy for classification.
callbacks
(list() of TorchCallbacks)
The callbacks. Must have unique ids.
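A direct call to the constructor might look as follows (a minimal sketch; constructing the learner via lrn("classif.torch_featureless") as in the Examples below is the more common route):
# Construct the learner directly via the R6 constructor;
# optimizer and loss are NULL, so the defaults (adam, cross_entropy) apply
learner = LearnerTorchFeatureless$new(task_type = "classif")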
Examples
# Define the Learner and set parameter values
learner = lrn("classif.torch_featureless")
learner$param_set$set_values(
epochs = 1, batch_size = 16, device = "cpu"
)
# Define a Task
task = tsk("iris")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> classif.ce
#> 0.64
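For regression, a minimal sketch (assuming the "regr.torch_featureless" key is registered and using the mtcars task) would look similar; under the default mse loss the learned constant should approach the mean of the training targets as training progresses:
# Regression variant: with the default mse loss the constant output
# should approach the mean of the training targets
learner = lrn("regr.torch_featureless")
learner$param_set$set_values(
  epochs = 1, batch_size = 16, device = "cpu"
)
task = tsk("mtcars")
ids = partition(task)
learner$train(task, row_ids = ids$train)
# All predictions are the same constant value
# (with only one epoch it will not yet have converged to the mean)
learner$predict(task, row_ids = ids$test)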