Below is a list of the neural network layers available in mlr3torch. Each entry gives the key under which the layer is registered together with its label; a short usage sketch follows the table.

Key Label
nn_adaptive_avg_pool1d 1D Adaptive Average Pooling
nn_adaptive_avg_pool2d 2D Adaptive Average Pooling
nn_adaptive_avg_pool3d 3D Adaptive Average Pooling
nn_avg_pool1d 1D Average Pooling
nn_avg_pool2d 2D Average Pooling
nn_avg_pool3d 3D Average Pooling
nn_batch_norm1d 1D Batch Normalization
nn_batch_norm2d 2D Batch Normalization
nn_batch_norm3d 3D Batch Normalization
nn_block Block Repetition
nn_celu CELU Activation Function
nn_conv1d 1D Convolution
nn_conv2d 2D Convolution
nn_conv3d 3D Convolution
nn_conv_transpose1d Transpose 1D Convolution
nn_conv_transpose2d Transpose 2D Convolution
nn_conv_transpose3d Transpose 3D Convolution
nn_dropout Dropout
nn_elu ELU Activation Function
nn_flatten Flattens a Tensor
nn_gelu GELU Activation Function
nn_glu GLU Activation Function
nn_hardshrink Hard Shrink Activation Function
nn_hardsigmoid Hard Sigmoid Activation Function
nn_hardtanh Hard Tanh Activation Function
nn_head Output Head
nn_layer_norm Layer Normalization
nn_leaky_relu Leaky ReLU Activation Function
nn_linear Linear Layer
nn_log_sigmoid Log Sigmoid Activation Function
nn_max_pool1d 1D Max Pooling
nn_max_pool2d 2D Max Pooling
nn_max_pool3d 3D Max Pooling
nn_merge_cat Merge by Concatenation
nn_merge_prod Merge by Product
nn_merge_sum Merge by Summation
nn_prelu PReLU Activation Function
nn_relu ReLU Activation Function
nn_relu6 ReLU6 Activation Function
nn_reshape Reshape a Tensor
nn_rrelu RReLU Activation Function
nn_selu SELU Activation Function
nn_sigmoid Sigmoid Activation Function
nn_softmax Softmax
nn_softplus SoftPlus Activation Function
nn_softshrink Soft Shrink Activation Function
nn_softsign SoftSign Activation Function
nn_squeeze Squeeze a Tensor
nn_tanh Tanh Activation Function
nn_tanhshrink Tanh Shrink Activation Function
nn_threshold Threshold Activation Function
nn_unsqueeze Unsqueeze a Tensor
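
Each key constructs the corresponding PipeOp via po() from mlr3pipelines, and layers are chained into a network with %>>%. Below is a minimal sketch; the layer keys (nn_linear, nn_relu, nn_head) come from the table above, while the specific parameter values (out_features, epochs, batch_size) are illustrative choices:

```r
library(mlr3torch)  # also attaches mlr3 and mlr3pipelines

# Chain layer PipeOps into a network: ingress -> linear -> ReLU -> head,
# then attach a loss, an optimizer, and the model PipeOp that trains it.
graph = po("torch_ingress_num") %>>%
  po("nn_linear", out_features = 32) %>>%
  po("nn_relu") %>>%
  po("nn_head") %>>%
  po("torch_loss", loss = t_loss("cross_entropy")) %>>%
  po("torch_optimizer", optimizer = t_opt("adam")) %>>%
  po("torch_model_classif", epochs = 10, batch_size = 32)

# Wrap the graph as a learner and train it on a task with numeric features.
learner = as_learner(graph)
learner$train(tsk("iris"))
```

Here nn_head infers the output dimension from the task, so the same graph can be reused across classification tasks with different numbers of classes.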