This class implements a layer that calculates the LeakyReLU
activation function for each element of a single input.
Here is the activation function formula:
f(x) = alpha * x if x <= 0
f(x) = x if x > 0
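As a minimal sketch of this formula (illustrative only, not part of the layer's API), an element-wise LeakyReLU over a plain buffer could look like the following; the function name and the use of std::vector are assumptions:

```c++
#include <vector>

// Illustrative element-wise LeakyReLU: each element is transformed with
// f(x) = alpha * x for x <= 0 and f(x) = x for x > 0.
std::vector<float> LeakyReLU( const std::vector<float>& input, float alpha )
{
	std::vector<float> output( input.size() );
	for( size_t i = 0; i < input.size(); ++i ) {
		output[i] = input[i] > 0.f ? input[i] : alpha * input[i];
	}
	return output;
}
```

With alpha equal to 0, the negative branch returns 0 and the function reduces to ReLU, f(x) = max(0, x).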
void SetAlpha( float alpha );
Sets the multiplier used for negative values of x. It is equal to 0 by default, which makes the function equivalent to ReLU.
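A hedged usage sketch follows: the class name CLeakyReLULayer, the CPtr smart pointer, and the dnn network object are assumptions about the surrounding library and are not given in this section; only SetAlpha comes from the description above.

```c++
// Assumption: `dnn` is an already created network object that owns the math engine;
// CLeakyReLULayer and the construction pattern below are assumed names.
CPtr<CLeakyReLULayer> leakyReLU = new CLeakyReLULayer( dnn.GetMathEngine() );
leakyReLU->SetAlpha( 0.01f ); // multiply negative inputs by 0.01 instead of the default 0
dnn.AddLayer( *leakyReLU );
```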
There are no trainable parameters for this layer.
There is only one input, which accepts a data blob of arbitrary size.
There is only one output, which returns a blob of the same size as the input blob. Each element of the output contains the value of the activation function calculated on the corresponding element of the input.
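For instance, with alpha set to 0.1 (an illustrative value), an input element equal to -4 produces -0.4 at the same position of the output blob, while an input element equal to 2 is passed through unchanged.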