activation_ReLU
The Rectified Linear Unit (ReLU) activation function, which sets negative input values to 0 and leaves positive input values unchanged.
Equation
For an input vector x, the ReLU activation function computes, element-wise:
output = max(0, x)
Inputs
x
The input vector to the ReLU activation function.
Outputs
output
The activated vector after applying the ReLU function.
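Example
A minimal sketch of the element-wise computation in Python using NumPy; the function name relu below is illustrative and not tied to a specific library API.

import numpy as np

def relu(x):
    # Set negative entries to 0, keep positive entries unchanged
    return np.maximum(0, x)

# Usage: negative values become 0, positive values pass through
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]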