
    activation_ReLU

The Rectified Linear Unit (ReLU) activation function, which replaces negative input values with 0 and leaves positive input values unchanged.

    Equation

    For an input vector x, the ReLU activation function computes:

    output = max(0, x)

    Inputs

    x

    The input vector to the ReLU activation function.

    Outputs

    output

    The activated vector after applying the ReLU function.
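    To illustrate what the node computes, the following is a minimal NumPy sketch of the same elementwise operation. It is not the Bifrost graph API; the names relu and x here are placeholders for this example only.

    import numpy as np

    def relu(x: np.ndarray) -> np.ndarray:
        """Elementwise ReLU: negative components become 0, positive components pass through."""
        return np.maximum(0.0, x)

    # Example: negative components are clamped to 0, positive components are unchanged.
    x = np.array([-2.0, -0.5, 0.0, 0.5, 3.0])
    print(relu(x))  # [0.  0.  0.  0.5 3. ]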

    Parent page: Activation
