General negative log likelihood loss function
loss_negative_log_likelihood.Rd
Bear in mind that the number of model outputs must match the number of distribution parameters. For example, if you use the normal distribution (tfprobability::tfd_normal()), which is described by two parameters (mean and standard deviation), the model should return two values per time step. In other words, it produces a distribution as a forecast rather than a point estimate. Once the model is trained, there are two options for generating the final forecast (both illustrated in the sketch below):
- use the expected value of the distribution (e.g. the mean for a normal distribution)
- sample a value from the distribution
Additionally, having the distribution, we can compute prediction intervals. Remember also the constraints imposed on the parameter values, e.g. the standard deviation must be positive.
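The following sketch is not part of this package's API; it only illustrates the two options and a prediction interval using the tfprobability distribution functions, assuming (for illustration) that the first predicted value per time step is the mean and the second is an unconstrained standard deviation mapped through softplus to keep it positive.
library(tfprobability)

# Hypothetical predicted parameters for one series: 10 time steps,
# column 1 = mean, column 2 = unconstrained standard deviation (assumed layout)
params <- matrix(runif(20), nrow = 10, ncol = 2)

# Enforce positivity of the standard deviation, e.g. with softplus
sigma <- log(1 + exp(params[, 2]))

dist <- tfd_normal(loc = params[, 1], scale = sigma)

point_forecast   <- tfd_mean(dist)    # option 1: expected value
sampled_forecast <- tfd_sample(dist)  # option 2: draw from the distribution

# 90% prediction interval from the distribution's quantiles
lower <- tfd_quantile(dist, 0.05)
upper <- tfd_quantile(dist, 0.95)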
Arguments
- distribution
A probability distribution function from the tfprobability package. Default: tfprobability::tfd_normal()
References
D. Salinas, V. Flunkert, J. Gasthaus, T. Januschowski, DeepAR: Probabilistic forecasting with autoregressive recurrent networks, International Journal of Forecasting (2019)
Examples
# y_pred: 2 series, 10 time steps, 2 values per time step
# (the two parameters of the normal distribution)
y_pred <- array(runif(40), c(2, 10, 2))
# y_true: the observed value per time step
y_true <- array(runif(20), c(2, 10, 1))
loss_negative_log_likelihood(
distribution = tfprobability::tfd_normal,
reduction = 'auto'
)(y_true, y_pred)
#> tf.Tensor(14.322809, shape=(), dtype=float32)
loss_negative_log_likelihood(reduction = 'sum')(y_true, y_pred)
#> tf.Tensor(286.45618, shape=(), dtype=float32)
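Since the calls above show a loss object invoked as loss(y_true, y_pred), it should also be usable like a standard Keras loss. The snippet below is a hypothetical usage sketch, not part of this documentation: 'model' stands for a placeholder keras model whose final layer returns two values per time step (the parameters of tfd_normal).
model %>% keras::compile(
  optimizer = 'adam',
  loss = loss_negative_log_likelihood(
    distribution = tfprobability::tfd_normal,
    reduction = 'auto'
  )
)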