Residual block for the WaveNet TCN
layer_tcn.Rd
This is a residual block built, among other things, from causal convolutional layers. It can be used as a drop-in replacement for recurrent layers when modeling sequences.
Usage
layer_tcn(
object,
nb_filters = 64,
kernel_size = 3,
nb_stacks = 1,
dilations = c(1, 7, 14),
padding = "causal",
use_skip_connections = TRUE,
dropout_rate = 0,
return_sequences = FALSE,
activation = "relu",
kernel_initializer = "he_normal",
use_batch_norm = FALSE,
use_layer_norm = FALSE,
use_weight_norm = FALSE,
input_shape = NULL,
...
)
Arguments
- nb_filters
The number of convolutional filters to use in this block
- kernel_size
The size of the convolutional kernel
- padding
The padding used in the convolutional layers, 'same' or 'causal'.
- dropout_rate
Float between 0 and 1. Fraction of the input units to drop.
- activation
The final activation used in o = Activation(x + F(x))
- kernel_initializer
Initializer for the kernel weights matrix (Conv1D).
- use_batch_norm
Whether to use batch normalization in the residual layers or not.
- use_layer_norm
Whether to use layer normalization in the residual layers or not.
- use_weight_norm
Whether to use weight normalization in the residual layers or not.
- dilations
Vector of dilation rates used in the residual blocks, e.g. c(1, 2, 4, 8) (typically powers of 2).
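The kernel size, number of stacks, and dilations together determine how far back in the sequence the block can see. A common back-of-envelope formula for the receptive field of a TCN whose residual blocks contain two convolutions is shown below; exact values depend on implementation details, so treat this as an approximation:

```r
# Approximate receptive field for the default settings above
# (assumes two causal convolutions per residual block).
kernel_size <- 3
nb_stacks   <- 1
dilations   <- c(1, 7, 14)

receptive_field <- 1 + 2 * (kernel_size - 1) * nb_stacks * sum(dilations)
receptive_field
#> [1] 89
```

If the receptive field is smaller than the dependencies in your data, increase `dilations` or `nb_stacks` rather than `kernel_size`, since dilations grow the receptive field exponentially at the same parameter cost.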
References
S. Bai, J.Z. Kolter, V. Koltun, [An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling](https://arxiv.org/abs/1803.01271) (2018)
Examples
# \donttest{
inp <- layer_input(c(28, 3))
tcn <- layer_tcn()(inp)
model <- keras_model(inp, tcn)
model(array(1, c(32, 28, 3)))
# }
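Assuming a working TensorFlow installation, a fuller sketch of using the layer in a small sequence-regression model might look like the following; the hyperparameter values are illustrative, not recommendations:

```r
library(keras)

# 28 time steps, 3 features per step
inp <- layer_input(shape = c(28, 3))

out <- inp %>%
  layer_tcn(
    nb_filters       = 16,
    kernel_size      = 3,
    dilations        = c(1, 2, 4),
    return_sequences = FALSE  # emit only the last time step
  ) %>%
  layer_dense(units = 1)

model <- keras_model(inp, out)
model %>% compile(optimizer = "adam", loss = "mse")
```

Set `return_sequences = TRUE` instead when stacking several TCN blocks or when a prediction is needed at every time step.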