
activation_mish

The Mish activation function applies a smooth, self-regularized non-linearity that multiplies the input by the hyperbolic tangent of the softplus of the input, supporting deeper network training without hard saturation.

Equation

For an input vector x, the Mish activation function computes:

output = x * tanh(softplus(x))

where softplus(x) = ln(1 + exp(x)).
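
For reference, the computation can be sketched in a few lines of Python with NumPy. This is an illustrative implementation, not this operator's interface; the function name mish and the example values are assumptions for the sketch.

import numpy as np

def mish(x):
    # Mish: x * tanh(softplus(x)), with softplus(x) = ln(1 + exp(x)).
    # np.logaddexp(0, x) evaluates ln(1 + exp(x)) in a numerically stable way.
    return x * np.tanh(np.logaddexp(0.0, x))

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(mish(x))  # mish(0) = 0; mish(2) ≈ 1.944

Because tanh(softplus(x)) approaches 1 for large positive x and 0 for large negative x, Mish behaves like the identity for large inputs and decays smoothly toward 0 for very negative ones, which is what avoids hard saturation.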

Inputs

x

The input vector to the Mish activation function.

Outputs

output

The activated vector after applying the Mish function.
