This class implements a layer that calculates the GELU (Gaussian Error Linear Unit) activation function for each element of a single input.
Precise formula:
f(x) = x * 0.5 * ( 1 + erf( x / sqrt( 2 ) ) )
Approximation:
f(x) = x * sigmoid( 1.702 * x )
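The approximation tracks the precise formula closely over typical activation ranges. As a standalone illustration (not part of the layer's API), the following C++ sketch evaluates both formulas side by side:

```c++
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Exact GELU: x * 0.5 * ( 1 + erf( x / sqrt( 2 ) ) )
double geluPrecise( double x )
{
	return x * 0.5 * ( 1.0 + std::erf( x / std::sqrt( 2.0 ) ) );
}

// Sigmoid approximation: x * sigmoid( 1.702 * x )
double geluApprox( double x )
{
	return x / ( 1.0 + std::exp( -1.702 * x ) );
}

int main()
{
	for( double x : { -2.0, -0.5, 0.0, 0.5, 2.0 } ) {
		std::printf( "x = %5.2f  precise = %8.5f  approx = %8.5f\n",
			x, geluPrecise( x ), geluApprox( x ) );
	}
	return 0;
}
```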
Use the following method to choose whether the exact value is calculated using the error function (TCalculationMode::Precise) or the sigmoid approximation is used (TCalculationMode::SigmoidApproximate):
void SetCalculationMode( TCalculationMode );
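A minimal usage sketch follows. The layer class name CGELULayer, the CPtr smart pointer, and the MathEngine() call are assumptions about the surrounding library, not confirmed by this section:

```c++
// Hypothetical setup; CGELULayer, CPtr and MathEngine() are assumed names.
CPtr<CGELULayer> gelu = new CGELULayer( MathEngine() );
// Choose the faster sigmoid approximation...
gelu->SetCalculationMode( TCalculationMode::SigmoidApproximate );
// ...or the exact erf-based calculation:
// gelu->SetCalculationMode( TCalculationMode::Precise );
```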
There are no trainable parameters for this layer.
There is only one input, which accepts a data blob of arbitrary size.
There is only one output, which returns a blob of the same size as the input blob. Each element of the output contains the value of the activation function calculated on the corresponding element of the input.
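To make the element-wise contract concrete, here is a standalone sketch (independent of the layer's API) that applies the precise formula to every element of a buffer, producing an output of the same size:

```c++
#include <cmath>
#include <vector>

// Applies GELU to each element; the output has the same size as the input.
std::vector<float> applyGelu( const std::vector<float>& input )
{
	std::vector<float> output( input.size() );
	for( size_t i = 0; i < input.size(); ++i ) {
		const float x = input[i];
		output[i] = x * 0.5f * ( 1.f + std::erf( x / std::sqrt( 2.f ) ) );
	}
	return output;
}
```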