
Represents a neural network using a Graph that usually contains mostly PipeOpModules.

Usage

nn_graph(graph, shapes_in, output_map = graph$output$name, list_output = FALSE)

Arguments

graph

(Graph)
The Graph to wrap. Is not cloned.

shapes_in

(named list() of integer())
Shape info of the tensors that go into graph, one integer vector per input channel. Names must match graph$input$name, possibly in a different order.

output_map

(character)
Which of graph's outputs to use. Must be a subset of graph$output$name.

list_output

(logical(1))
Whether output should be a list of tensors. If FALSE (default), then length(output_map) must be 1.

Value

An nn_graph, i.e. a torch nn_module that wraps the Graph.

Examples

library(torch)
library(mlr3pipelines)
library(mlr3torch)

# build a Graph of PipeOpModules: linear -> relu -> linear
graph = mlr3pipelines::Graph$new()
graph$add_pipeop(po("module_1", module = nn_linear(10, 20)), clone = FALSE)
graph$add_pipeop(po("module_2", module = nn_relu()), clone = FALSE)
graph$add_pipeop(po("module_3", module = nn_linear(20, 1)), clone = FALSE)
graph$add_edge("module_1", "module_2")
graph$add_edge("module_2", "module_3")

# wrap the graph as a torch module; NA marks a flexible (batch) dimension
network = nn_graph(graph, shapes_in = list(module_1.input = c(NA, 10)))

x = torch_randn(16, 10)

network(module_1.input = x)
#> torch_tensor
#> -0.1652
#> -0.0971
#>  0.0747
#> -0.1365
#> -0.3213
#> -0.1505
#> -0.2514
#>  0.0201
#> -0.2495
#> -0.0511
#> -0.0209
#> -0.1577
#> -0.1063
#> -0.1217
#> -0.0642
#> -0.2225
#> [ CPUFloatType{16,1} ][ grad_fn = <AddmmBackward0> ]
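
Because the first dimension in shapes_in was declared as NA (taken here to be a free batch dimension), the wrapped network should also accept other batch sizes. A minimal sketch reusing network from above:

# a different batch size also works, since the batch dimension was given as NA
y = network(module_1.input = torch_randn(4, 10))
y$shape  # should be 4 1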
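
Since the result is a torch nn_module, the usual module tooling should apply; a sketch, assuming the wrapped modules are registered as submodules:

# the two nn_linear modules contribute a weight and a bias tensor each
length(network$parameters)  # expected: 4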
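
Finally, a sketch of list_output = TRUE, reusing graph and x from above: the network then returns its outputs as a list of tensors (here of length one, since the graph has a single output).

network_list = nn_graph(
  graph,
  shapes_in = list(module_1.input = c(NA, 10)),
  list_output = TRUE
)
out = network_list(module_1.input = x)
length(out)  # expected: 1; out[[1]] should have shape (16, 1)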