R/nn-nonlinear.R
nn_nonlinear.Rd
Shortcut to create a linear layer followed by a nonlinear activation function
nn_nonlinear(in_features, out_features, bias = TRUE, activation = nn_relu())
in_features: (integer) size of each input sample.
out_features: (integer) size of each output sample.
bias: (logical) If set to FALSE, the layer will not learn an additive bias. Default: TRUE.
activation: (nn_module) A nonlinear activation function. Default: torch::nn_relu().
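Conceptually, this is a thin wrapper that chains a linear layer and an activation module. A minimal sketch of a roughly equivalent module, assuming the torch nn_module(), nn_linear(), and nn_relu() APIs; the name nn_nonlinear_sketch and its internals are illustrative, not the package's actual implementation:

library(torch)

# Illustrative sketch: linear transform followed by the supplied activation
nn_nonlinear_sketch <- nn_module(
  "nn_nonlinear_sketch",
  initialize = function(in_features, out_features, bias = TRUE,
                        activation = nn_relu()) {
    # linear layer holding the learnable weights (and optional bias)
    self$linear <- nn_linear(in_features, out_features, bias = bias)
    # activation module applied to the linear layer's output
    self$activation <- activation
  },
  forward = function(x) {
    self$activation(self$linear(x))
  }
)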
# linear layer mapping 10 features to 1, followed by the default ReLU
net <- nn_nonlinear(10, 1)
# batch of 2 samples, each with 10 features, all set to 1
x <- torch_tensor(matrix(1, nrow = 2, ncol = 10))
net(x)
#> torch_tensor
#> 0
#> 0
#> [ CPUFloatType{2,1} ][ grad_fn = <ReluBackward0> ]
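Any other activation module can be supplied via the activation argument. An illustrative variant, assuming nn_tanh() from torch:

# same layer shape, but with a tanh activation instead of ReLU
net_tanh <- nn_nonlinear(10, 1, activation = nn_tanh())
net_tanh(x)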