General interface to recurrent neural network models
rnn(
mode = "regression",
timesteps = NULL,
horizon = 1,
learn_rate = 0.01,
epochs = 50,
hidden_units = NULL,
dropout = NULL,
batch_size = 32,
scale = TRUE,
shuffle = FALSE,
jump = 1,
sample_frac = 1
)
mode (character): Model mode. Default: "regression".
timesteps (integer): Number of timesteps to look back.
horizon (integer): Forecast horizon.
learn_rate (numeric or dials::learn_rate): Learning rate.
epochs (integer or dials::epochs): Number of epochs.
hidden_units (integer): Number of hidden units.
dropout (logical or dials::dropout): Whether to use dropout.
batch_size (integer): Batch size.
scale (logical): Whether to scale input features.
shuffle (logical): Shuffle examples during training. Default: FALSE.
jump (integer): Input window shift.
sample_frac (numeric): Fraction of the samples used for training.
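Since learn_rate, epochs, and dropout accept dials parameter objects, the specification can also be marked for tuning. A minimal sketch, assuming the standard tune/dials workflow applies to this engine:

```r
library(parsnip)
library(tune)

# Mark hyperparameters for tuning; concrete values would be
# chosen later, e.g. by tune_grid() over a dials grid.
rnn_tune_spec <-
  rnn(
    timesteps    = 20,
    horizon      = 1,
    learn_rate   = tune(),
    hidden_units = tune(),
    epochs       = 50
  )
```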
This is a parsnip
API to recurrent neural network models. For now, the only
available engine is torchts_rnn
.
Categorical features are detected automatically: a column of the input data (as defined in the formula)
is treated as categorical if it is logical
, character
, factor
or integer
.
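For instance, in the sketch below (the station column is character, so it would be picked up as categorical; weather_pl is the dataset used in the examples):

```r
library(torchts)
library(parsnip)

rnn_spec <- rnn(timesteps = 14, horizon = 1, epochs = 5)

# `station` is a character column, so the model would treat it
# as a categorical feature without any extra configuration.
# rnn_fit <- fit(rnn_spec, tmax_daily ~ date + station, data = weather_pl)
```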
Neural networks, unlike many other models (e.g. linear models), can return predictions before any training epoch has completed. This is because every neural network starts with randomly initialized parameters, which are then gradually tuned over subsequent iterations by a gradient descent algorithm.
If you'd like an untrained model, simply set epochs = 0
.
You still have to "fit" the model to comply with the standard parsnip
API procedure.
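A minimal sketch of this, reusing the tarnow_temp data prepared in the examples below:

```r
library(torchts)
library(parsnip)

# epochs = 0 skips training entirely: the parameters keep their
# random initial values.
untrained_spec <- rnn(timesteps = 20, horizon = 1, epochs = 0)

# Calling fit() is still required to pass through the parsnip API,
# even though no weights are updated.
# untrained_fit <- fit(untrained_spec, temp ~ date, data = tarnow_temp)
```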
library(torchts)
library(parsnip)
library(dplyr, warn.conflicts = FALSE)
library(rsample)
# Univariate time series
tarnow_temp <-
weather_pl %>%
filter(station == "TARNÓW") %>%
select(date, temp = tmax_daily)
data_split <- initial_time_split(tarnow_temp)
#> Error: `in_id` must be a positive integer vector.
rnn_model <-
rnn(
timesteps = 20,
horizon = 1,
epochs = 10,
hidden_units = 32
)
rnn_model <-
rnn_model %>%
fit(temp ~ date, data = training(data_split))
#> Warning: Engine set to `torchts`.
#> Error in analysis(x): object 'data_split' not found