A fully connected feed-forward network with dropout after each activation function. The features can be either a single lazy_tensor column or one or more numeric columns (but not both).

Dictionary

This Learner can be instantiated using the sugar function lrn():

lrn("classif.mlp", ...)
lrn("regr.mlp", ...)

Properties

  • Supported task types: 'classif', 'regr'

  • Predict Types:

    • classif: 'response', 'prob'

    • regr: 'response'

  • Feature Types: “integer”, “numeric”, “lazy_tensor”

  • Required Packages: mlr3, mlr3torch, torch

Parameters

Parameters from LearnerTorch, as well as:

  • activation :: [nn_module]
    The activation function. Is initialized to nn_relu.

  • activation_args :: named list()
    A named list with initialization arguments for the activation function. This is initialized to an empty list.

  • neurons :: integer()
    The number of neurons per hidden layer. By default there is no hidden layer. Setting this to c(10, 20) creates a first hidden layer with 10 neurons and a second with 20.

  • p :: numeric(1)
    The dropout probability. Is initialized to 0.5.

  • shape :: integer() or NULL
    The input shape of length 2, e.g. c(NA, 5). Only required when there is a lazy_tensor input whose shape is unknown (NULL). Otherwise, the input shape is inferred from the number of numeric features.
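The parameters above can be combined when constructing the learner. A minimal sketch (assumes the mlr3torch and torch packages are installed; nn_tanh is the torch tanh module generator):

```r
library(mlr3torch)

# MLP with two hidden layers (32 and 16 neurons), tanh activations,
# and a dropout probability of 0.2 after each activation
learner = lrn("classif.mlp",
  neurons = c(32, 16),
  activation = nn_tanh,
  p = 0.2,
  epochs = 10, batch_size = 32, device = "cpu"
)
```

With neurons left at its default (no hidden layer), the network reduces to a linear model on the input features.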

Super classes

mlr3::Learner -> mlr3torch::LearnerTorch -> LearnerTorchMLP

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerTorchMLP$new(
  task_type,
  optimizer = NULL,
  loss = NULL,
  callbacks = list()
)

Arguments

task_type

(character(1))
The task type, either "classif" or "regr".

optimizer

(TorchOptimizer)
The optimizer to use for training. By default, adam is used.

loss

(TorchLoss)
The loss used to train the network. By default, mse is used for regression and cross_entropy for classification.

callbacks

(list() of TorchCallbacks)
The callbacks. Must have unique ids.
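Direct construction via the class (instead of the lrn() sugar) might look like the following sketch; it assumes mlr3torch's t_opt() and t_loss() dictionary constructors for optimizers and losses:

```r
library(mlr3torch)

# Equivalent to lrn("classif.mlp"), but with an explicit
# optimizer and loss instead of the adam/cross_entropy defaults
learner = LearnerTorchMLP$new(
  task_type = "classif",
  optimizer = t_opt("sgd", lr = 0.01),
  loss = t_loss("cross_entropy")
)
```

Callbacks, if supplied, would be passed as a list of TorchCallback objects with unique ids.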


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerTorchMLP$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner and set parameter values
learner = lrn("classif.mlp")
learner$param_set$set_values(
  epochs = 1, batch_size = 16, device = "cpu",
  neurons = 10
)

# Define a Task
task = tsk("iris")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> classif.ce 
#>       0.68