
Applies Layer Normalization over the last dims dimensions of the input.

State

The state is the value calculated by the public method $shapes_out().
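
For example, layer normalization does not change the shape of its input, so the output shape reported by $shapes_out() equals the input shape. A minimal sketch, assuming mlr3torch is attached and that $shapes_out() accepts a list of input shapes with NA marking the batch dimension:

# illustrative shape inference; NA denotes the unknown batch dimension
po_ln = po("nn_layer_norm", dims = 1)
po_ln$shapes_out(list(c(NA, 10)))
# layer norm preserves the shape, so this should return list(c(NA, 10))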

Credit

Parts of this documentation have been copied or adapted from the documentation of torch.

Input and Output Channels

One input channel called "input" and one output channel called "output". For an explanation see PipeOpTorch.

Parameters

  • dims :: integer(1)
    The number of dimensions over which normalization is applied, counting from the last dimension; see the construction sketch after this list.

  • elementwise_affine :: logical(1)
    Whether to learn affine-linear parameters initialized to 1 for weights and to 0 for biases. The default is TRUE.

  • eps :: numeric(1)
    A value added to the denominator for numerical stability.
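
A construction sketch showing how these hyperparameters can be set; the specific values are illustrative only and assume mlr3torch is attached:

# construct with illustrative hyperparameter values
po("nn_layer_norm", dims = 1, elementwise_affine = FALSE, eps = 1e-6)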

Internals

Calls torch::nn_layer_norm() when trained. The parameter normalized_shape is inferred as the last dims dimensions of the input shape.
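
As an illustration, for an input of shape (batch, 16, 32) and dims = 2, the inferred normalized_shape would be c(16, 32), so the module constructed during training roughly corresponds to the following (a sketch, not the literal construction code):

# roughly equivalent torch module for input shape (batch, 16, 32) and dims = 2
torch::nn_layer_norm(
  normalized_shape = c(16, 32),
  eps = 1e-05,
  elementwise_affine = TRUE
)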

See also

Other PipeOps: mlr_pipeops_nn_adaptive_avg_pool1d, mlr_pipeops_nn_adaptive_avg_pool2d, mlr_pipeops_nn_adaptive_avg_pool3d, mlr_pipeops_nn_avg_pool1d, mlr_pipeops_nn_avg_pool2d, mlr_pipeops_nn_avg_pool3d, mlr_pipeops_nn_batch_norm1d, mlr_pipeops_nn_batch_norm2d, mlr_pipeops_nn_batch_norm3d, mlr_pipeops_nn_block, mlr_pipeops_nn_celu, mlr_pipeops_nn_conv1d, mlr_pipeops_nn_conv2d, mlr_pipeops_nn_conv3d, mlr_pipeops_nn_conv_transpose1d, mlr_pipeops_nn_conv_transpose2d, mlr_pipeops_nn_conv_transpose3d, mlr_pipeops_nn_dropout, mlr_pipeops_nn_elu, mlr_pipeops_nn_flatten, mlr_pipeops_nn_gelu, mlr_pipeops_nn_glu, mlr_pipeops_nn_hardshrink, mlr_pipeops_nn_hardsigmoid, mlr_pipeops_nn_hardtanh, mlr_pipeops_nn_head, mlr_pipeops_nn_leaky_relu, mlr_pipeops_nn_linear, mlr_pipeops_nn_log_sigmoid, mlr_pipeops_nn_max_pool1d, mlr_pipeops_nn_max_pool2d, mlr_pipeops_nn_max_pool3d, mlr_pipeops_nn_merge, mlr_pipeops_nn_merge_cat, mlr_pipeops_nn_merge_prod, mlr_pipeops_nn_merge_sum, mlr_pipeops_nn_prelu, mlr_pipeops_nn_relu, mlr_pipeops_nn_relu6, mlr_pipeops_nn_reshape, mlr_pipeops_nn_rrelu, mlr_pipeops_nn_selu, mlr_pipeops_nn_sigmoid, mlr_pipeops_nn_softmax, mlr_pipeops_nn_softplus, mlr_pipeops_nn_softshrink, mlr_pipeops_nn_softsign, mlr_pipeops_nn_squeeze, mlr_pipeops_nn_tanh, mlr_pipeops_nn_tanhshrink, mlr_pipeops_nn_threshold, mlr_pipeops_nn_unsqueeze, mlr_pipeops_torch_ingress, mlr_pipeops_torch_ingress_categ, mlr_pipeops_torch_ingress_ltnsr, mlr_pipeops_torch_ingress_num, mlr_pipeops_torch_loss, mlr_pipeops_torch_model, mlr_pipeops_torch_model_classif, mlr_pipeops_torch_model_regr

Super classes

mlr3pipelines::PipeOp -> mlr3torch::PipeOpTorch -> PipeOpTorchLayerNorm

Methods

Method new()

Creates a new instance of this R6 class.

Usage

PipeOpTorchLayerNorm$new(id = "nn_layer_norm", param_vals = list())

Arguments

id

(character(1))
Identifier of the resulting object.

param_vals

(list())
List of hyperparameter settings, overwriting the hyperparameter settings that would otherwise be set during construction.
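
For example, hyperparameters can also be supplied at construction time via param_vals (the values below are illustrative):

op = PipeOpTorchLayerNorm$new(
  id = "nn_layer_norm",
  param_vals = list(dims = 1, eps = 1e-6)
)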


Method clone()

The objects of this class are cloneable with this method.

Usage

PipeOpTorchLayerNorm$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Construct the PipeOp
pipeop = po("nn_layer_norm", dims = 1)
pipeop
#> PipeOp: <nn_layer_norm> (not trained)
#> values: <dims=1>
#> Input channels <name [train type, predict type]>:
#>   input [ModelDescriptor,Task]
#> Output channels <name [train type, predict type]>:
#>   output [ModelDescriptor,Task]
# The available parameters
pipeop$param_set
#> <ParamSet(3)>
#>                    id    class lower upper nlevels        default  value
#>                <char>   <char> <num> <num>   <num>         <list> <list>
#> 1:               dims ParamInt     1   Inf     Inf <NoDefault[0]>      1
#> 2: elementwise_affine ParamLgl    NA    NA       2           TRUE [NULL]
#> 3:                eps ParamDbl     0   Inf     Inf          1e-05 [NULL]
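
A further sketch of how this PipeOp is typically chained after an ingress PipeOp within a Graph; this assumes mlr3, mlr3pipelines and mlr3torch are attached and uses the "iris" task, whose features are all numeric:

# chain the layer norm after a numeric ingress and train the graph on a task
graph = po("torch_ingress_num") %>>% po("nn_layer_norm", dims = 1)
md = graph$train(tsk("iris"))
# the trained graph's output is a ModelDescriptor describing the network built so far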