activation_ELU

The Exponential Linear Unit (ELU) activation function passes positive inputs through unchanged and applies a smooth exponential curve to negative inputs. Unlike ReLU, it keeps a nonzero gradient for negative input values, which helps mitigate the vanishing gradient problem.

Equation

For an input vector x, the ELU activation function computes, element-wise:

If x > 0:

output = x

If x <= 0:

output = alpha * (e^x - 1)
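The two cases above can be sketched as a vectorized NumPy function. This is a minimal illustration, not the tool's actual implementation; the function name elu and the NumPy dependency are assumptions.

```python
import numpy as np

def elu(x, alpha=1.0):
    # Hypothetical sketch of the ELU rule described above:
    # identity for x > 0, alpha * (e^x - 1) for x <= 0.
    x = np.asarray(x, dtype=float)
    # np.expm1(x) computes e^x - 1 with better precision near zero.
    return np.where(x > 0, x, alpha * np.expm1(x))
```

For example, elu(2.0) returns 2.0, while elu(-1.0) with the default alpha = 1.0 returns e^-1 - 1, roughly -0.632. As x decreases, the output saturates toward -alpha.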

Inputs

x

The input vector to the ELU activation function.

alpha

A hyperparameter that sets the saturation value the output approaches for large negative inputs (commonly set to 1.0).

Outputs

output

The activated vector after applying the ELU function.
