
activation_SELU

The Scaled Exponential Linear Unit (SELU) activation function, which induces self-normalizing properties.

Equation

For an input vector x, the SELU activation function computes:

output = scale * (max(0, x) + min(0, alpha * (e^x - 1)))

Where:

alpha = 1.6732632423543772848170429916717

scale = 1.0507009873554804934193349852946
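
The element-wise behavior can be illustrated with a minimal NumPy sketch. The function name selu and the use of NumPy are assumptions for illustration, not part of this reference:

import numpy as np

# Constants from the equation above (illustrative sketch, not the reference implementation).
ALPHA = 1.6732632423543772848170429916717
SCALE = 1.0507009873554804934193349852946

def selu(x):
    # Element-wise: scale * (max(0, x) + min(0, alpha * (e^x - 1))),
    # which reduces to x for x > 0 and alpha * (e^x - 1) for x <= 0.
    x = np.asarray(x, dtype=np.float64)
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Negative inputs saturate toward -scale * alpha, roughly -1.7581.
print(selu(np.array([-2.0, 0.0, 2.0])))  # [-1.52017  0.       2.10140]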

Inputs

x

The input vector to the SELU activation function.

Outputs

output

The activated vector after applying the SELU function.
