Effects interfaces

SSAO (Screen-space ambient occlusion) control

The information in MPassContext is sufficient to indicate whether a plug-in is being invoked to render passes that must support SSAO (screen-space ambient occlusion).

The internal algorithm for SSAO requires a specific normal-depth pass to produce intermediate results. By default, the renderer sets up a specific shader to be used in this pass. In the case where a plug-in shader causes geometry to be displaced or normals to be modified, it is possible to override this default shader.

This normal-depth pass can be detected by querying MPassContext information in a similar fashion to how shadow map passes can be detected (see Section 4.2.1 Shadowing control). For this pass, the pass identifier is: MPassContext::kNormalDepthPassSemantic.
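Detection follows the same pattern as for shadow-map passes. The C++ sketch below stands in for the real API: a plain string list replaces the semantics array returned by MPassContext, and the literal value of the constant is illustrative only, not Maya's actual string.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Illustrative stand-in for MPassContext::kNormalDepthPassSemantic;
// the literal string here is a placeholder, not the real value.
static const std::string kNormalDepthPassSemantic = "normalDepthPass";

// Returns true if any semantic for the current pass marks it as the
// normal-depth pass, i.e. the plug-in should bind its override shader.
bool isNormalDepthPass(const std::vector<std::string>& passSemantics)
{
    return std::find(passSemantics.begin(), passSemantics.end(),
                     kNormalDepthPassSemantic) != passSemantics.end();
}
```

In a real plug-in, the same comparison would be made against the semantics reported by MPassContext at draw time.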

If any tessellation shaders are required, then the override should be added to an MPxShaderOverride instead of an MShaderInstance.

To add support, a custom shader should take the following into account.

The following is a sample of the shader used for this pass, written in HLSL:

// Check if the back-facing normal needs to be flipped. 
extern bool isSingleSided = false; 
extern float mayaNormalMultiplier = 1.0f; 

// Shader semantic supported by Viewport 2.0 to indicate whether 
// the projection matrix flips the Z component of a point when transformed
// -1.0 if so, otherwise 1.0. 
float gProjZSense : ProjectionZSense; 

// Transform matrices used below (declarations not shown in the
// original excerpt; the semantics are the standard Viewport 2.0 names).
float4x4 gWVPXf  : WorldViewProjection;
float4x4 gWVXf   : WorldView;
float4x4 gWVITXf : WorldViewInverseTranspose;

// Vertex shader input structure.
struct VS_INPUT_NormalDepth
{
    float3 Pos : POSITION;
    float3 Normal: NORMAL;
};

// Vertex shader output structure.
struct VS_TO_PS_NormalDepth
{
    float4 HPos : SV_Position;
    float4 NormalDepth : TEXCOORD0;
};

// Vertex shader.
VS_TO_PS_NormalDepth VS_NormalDepth(VS_INPUT_NormalDepth In)
{
    VS_TO_PS_NormalDepth Out;
    
    // Transform the vertex from object space to clip space.
    Out.HPos = mul(float4(In.Pos, 1.0f), gWVPXf);
    
    // Record the normal and depth components for the pixel shader.
    // NOTE: This depends on whether the view direction is along the +Z or -Z axis.     
    // The projection matrix "Z sense" determines this. 
    Out.NormalDepth.xyz = mul(In.Normal, (float3x3)gWVITXf);
    Out.NormalDepth.z = gProjZSense * Out.NormalDepth.z;     
    Out.NormalDepth.w = gProjZSense * mul(float4(In.Pos, 1.0f), gWVXf).z; 

    return Out;
}

// Pixel shader output structure.
struct PS_OUT
{
    float4 Normal : SV_Target0;
    float4 Depth : SV_Target1;
};

// Pixel shader.
PS_OUT PS_NormalDepth(VS_TO_PS_NormalDepth In, bool isFrontFace : SV_IsFrontFace) 
{
    PS_OUT Out;

    float3 normal = normalize(In.NormalDepth.xyz);     

    if ( !isSingleSided )     
    {         
        float normalMul = isFrontFace ? mayaNormalMultiplier : -mayaNormalMultiplier;
        normal *= normalMul;     
    }     

    // Set the normal for an unsigned normalized integer target, and depth for a floating-point target.     
    Out.Normal = float4((normal + 1.0f) * 0.5f, 0.0f); 
    Out.Depth  = In.NormalDepth.wwww;

    return Out;
}
NOTE: Described above is the behavior of the internal pass for Maya 2016.
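The unorm packing used by the pixel shader, (normal + 1) * 0.5, can be checked host-side. This C++ sketch (function names are illustrative) packs a signed normal component for an unsigned-normalized render target and unpacks it again:

```cpp
#include <cassert>
#include <cmath>

// Pack a signed unit-normal component in [-1, 1] into the [0, 1] range
// expected by an unsigned-normalized (UNORM) target, as the SSAO
// normal-depth pixel shader does, and recover it on read-back.
float packUnorm(float n)   { return (n + 1.0f) * 0.5f; }
float unpackUnorm(float u) { return u * 2.0f - 1.0f; }
```

The depth component, by contrast, is written unmodified to a floating-point target, so no packing is required there.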

Motion blur control

The information in MPassContext is sufficient to indicate whether a plug-in is being invoked to render passes that must support motion blur.

The internal algorithm for motion blur requires a specific motion-vector pass that produces intermediate results. By default, the renderer sets up a specific shader to be used in this pass. In the case where a plug-in shader causes geometry to be displaced, it is possible to override this default shader.

This motion-vector pass can be detected by querying MPassContext information and checking for a pass semantic matching this string: motionVectorPass or MPassContext::kMotionVectorPassSemantic.

If any tessellation shaders are required, then the override should be added to an MPxShaderOverride instead of an MShaderInstance.

To add support, a custom shader should take the following into account.

// Shader semantics supported by Viewport 2.0.
extern float4x4 World : world;
extern float4x4 previousWorld : previousWorld;
extern float4x4 WorldViewProj : worldviewprojection;
extern float4x4 viewProjection : viewProjection;
extern float4x4 previousViewProjection : previousViewProjection;
extern float2 viewportSize : viewportPixelSize;

// The uniform variable will be set by Viewport 2.0 with the value of the
// hardwareRenderingGlobals.motionBlurShutterOpenFraction attribute,
// which denotes the percentage of frame time for which the shutter
// is open, with 0 denoting that the shutter is not open at all, and 1
// denoting that the shutter is open for 100% of the frame time.
extern float shutterOpenFraction = 0.200000f;

// Vertex shader input/output structures (not shown in the original
// excerpt; the field layout below is inferred from their usage).
struct VS_INPUT_MotionVector
{
    float3 pm : POSITION;
};

struct VS_TO_PS_MotionVector
{
    float4 Pc : SV_Position;
    float3 Pw : TEXCOORD0;
    float3 OtherFramePw : TEXCOORD1;
};

// Vertex Shader
VS_TO_PS_MotionVector VS_MotionVector(VS_INPUT_MotionVector In ) 
{ 
    VS_TO_PS_MotionVector Out; 
    Out.Pw = mul( float4(In.pm,1), World).xyz;
    Out.OtherFramePw = mul( float4(In.pm,1), previousWorld).xyz; 
    Out.Pc = mul( float4(In.pm,1), WorldViewProj );
    return Out; 
} 

// Pixel Shader
float4 PS_MotionVector(VS_TO_PS_MotionVector In ) : SV_Target
{ 
    float k = shutterOpenFraction * 100.0f; 
    float4 Pc = mul( float4(In.Pw, 1.0f), viewProjection ); 
    float4 prevPc = mul( float4(In.OtherFramePw, 1.0f), previousViewProjection ); 
    float2 curUV  = Pc.xy / Pc.w; 
    float2 prevUV = prevPc.xy / prevPc.w; 
    float2 vec = (curUV - prevUV) * shutterOpenFraction * 0.5f; 
    vec *= viewportSize; 
    float vecLength = length(vec); 
    vec *= min(k / vecLength, 1.0f); 
    vec /= viewportSize; 
    return float4( vec.x, -vec.y, 1.0f - (Pc.z / Pc.w), vecLength ); 
} 
NOTE: The algorithm above applies to Maya 2016 and is only one example of how motion vectors may be computed. A custom shader may implement motion blur in any manner appropriate.
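The clamping step in the pixel shader caps the motion vector at k = shutterOpenFraction * 100 pixels. This C++ sketch reproduces that math on the host (parameter names are illustrative) and returns the clamped length in pixels:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Host-side sketch of the pixel-shader clamp: the NDC-space motion
// vector (curUV - prevUV) is scaled by the shutter-open fraction,
// converted to pixel units, and clamped so its length never exceeds
// k = shutterOpenFraction * 100 pixels.
float clampedMotionLength(float dx, float dy,            // curUV - prevUV
                          float vpWidth, float vpHeight, // viewport size
                          float shutterOpenFraction)
{
    const float k = shutterOpenFraction * 100.0f;        // max length, pixels
    float vx = dx * shutterOpenFraction * 0.5f * vpWidth;
    float vy = dy * shutterOpenFraction * 0.5f * vpHeight;
    float len = std::sqrt(vx * vx + vy * vy);
    float scale = std::min(k / len, 1.0f);               // no-op if already short
    return len * scale;
}
```

With the default shutter fraction of 0.2, vectors are limited to 20 pixels regardless of how far the point moved between frames.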

The XML wrapper for this implementation for a given release can be found using the MEL command:

ogs -xml mayaMotionVector;

From the API, the MFragmentManager::getFragmentXML() method can be used to query this fragment. The sample plug-in fragmentDumper demonstrates the use of this method to dump out fragments:

dumpFragment -fn mayaMotionVector;

Depth of field control

The information in MPassContext is sufficient to indicate whether a plug-in is being invoked to render passes that must support depth of field (DOF).

The internal algorithm for DOF requires a specific pass to produce intermediate results. By default, the renderer sets up a specific shader for this pass.

The pass can be detected by querying MPassContext information and checking for a pass semantic matching this string: dofPass or MPassContext::kDOFPassSemantic.

In the case where a plug-in shader causes geometry to be displaced, it is possible to override this default shader.

The shader must compute the circle-of-confusion (CoC) value and depth value per pixel. These values are written to the R and G channels of a floating-point output target (R32G32), which is bound as color target 0. Depth testing is performed with a locally bound depth target.

The computation is as follows:

CoC = 0.5 * alpha * abs( Z - Zf ) / Z, where alpha = F*F / (A * (Zf - F))

Here Z is the view-space depth of the shaded point, Zf is the focus distance, F is the focal length, and A is the f-stop (aperture).

The following is an example of a shader used for this pass, written in HLSL:

float4 mayaCoCDepth( float alpha, float focusDist, float3 Pw, float4x4 view ) 
{
    float z = abs( mul( float4(Pw, 1), view ).z );
    float CoC = 0.5f * alpha * (z - focusDist) / z; 
    // Write to R, G channels of output
    return float4( CoC, z, 0, 1 ); 
} 

Where Pw is the world space position and view is the view matrix.

The current internal computation for alpha and focus distance is given below:

float ItoM = 0.0254f; // inches converted to m
float CMtoM = 0.01f; // cm converted to m

// Obtain the focus distance from the camera in m.
//
float focus = <camera shape’s focusDistance> * CMtoM;
focusDist = focus * 100; // Convert back to cm.

// Compute alpha, the COC at infinity, in m
//
float fStop = <camera shape’s fStop> * <camera shape’s focus region scale>; // Apply region scale
float F = <camera shape’s camera scale> * <camera shape’s focal length> * 0.001f; // in m
float alpha = F * F / (fStop * (focus - F));

// Convert to UV space (film apertures are in inches).
float apertureX = <camera shape’s horizontal film aperture> * ItoM;
float apertureY = <camera shape’s vertical film aperture> * ItoM;
alpha /= min( apertureX, apertureY );

Camera shape parameters are denoted by the <> delimiters in the above pseudo-code.
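The pseudo-code above can be sketched as a single C++ function over plain floats. Parameter names and the sample values in the test are illustrative; units follow the pseudo-code (cm for focus distance, mm for focal length, inches for film aperture):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Sketch of the internal alpha (CoC at infinity) computation,
// normalized into UV space by the smaller film aperture.
float computeAlpha(float focusDistanceCm, float fStop, float focusRegionScale,
                   float cameraScale, float focalLengthMm,
                   float apertureXInch, float apertureYInch)
{
    const float CMtoM = 0.01f;   // cm -> m
    const float ItoM  = 0.0254f; // inches -> m

    float focus = focusDistanceCm * CMtoM;              // focus distance in m
    float A     = fStop * focusRegionScale;             // effective f-stop
    float F     = cameraScale * focalLengthMm * 0.001f; // focal length in m
    float alpha = F * F / (A * (focus - F));            // CoC at infinity, m

    // Normalize by the smaller film aperture to move into UV space.
    float apertureX = apertureXInch * ItoM;
    float apertureY = apertureYInch * ItoM;
    return alpha / std::min(apertureX, apertureY);
}
```

For example, a 35 mm lens at f/5.6 focused at 5 m with a standard 35 mm film back yields an alpha of roughly 1.8e-3 in UV space.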

NOTE: The algorithm above applies to Maya 2016 and is only one example of how CoC may be computed. A custom shader may implement CoC in any manner appropriate, and does not need to use Maya’s camera node attributes.

Compositing control

The information in MPassContext is sufficient to indicate whether a plug-in is being invoked to render passes that must composite a beauty pass output without post effects over one with post effects.

The internal algorithm for compositing requires a post-effect pattern pass and a non post effect pattern pass to produce an alpha mask.

During a post effect pattern pass, each render item included in post effects is drawn with color write disabled and depth write enabled by default, in order to fill an intermediate depth target with the depth values of visible surfaces (whether opaque or transparent, because both alpha testing and blending are disabled). This pass can be detected by querying MPassContext information and checking for a pass semantic matching this string: PEPatternPass or MPassContext::kPEPatternPassSemantic.

During a non post effect pattern pass, each render item excluded from post effects is drawn with respect to the above depth target, but the color write is enabled in this pass so that the alpha mask target and the depth target can be updated when passing the depth test. Similar to the beauty pass, the shader assigned to each render item is used to shade its geometry by default, but only the alpha channel of the shading output is written to the alpha mask target. This pass can be detected by querying MPassContext information and checking for a pass semantic matching this string: nonPEPatternPass or MPassContext::kNonPEPatternPassSemantic.

Render items are excluded from post effects by default. The MPxGeometryOverride and MPxSubSceneOverride interfaces have render item level control, and if needed, they can specify that certain render items be included in post effects by calling MRenderItem::setExcludedFromPostEffects(false). The MPxDrawOverride interface has object-level control, and it can be included in post effects by either overriding its virtual function MPxDrawOverride::excludedFromPostEffect() to return false, or by starting the draw classification with drawdb/geometry/includePostEffects/ during registration.

In the case where a plug-in wants to produce a customized pattern during a non post effect pattern pass, MPxShaderOverride and MPxDrawOverride can use this information to perform either a more or less complex rendering, and MShaderInstance can be used from within either interface for convenience. To add support, any custom shader should take into account the following conditions:

After a non post effect pattern pass, the intermediate color target is bound as AlphaMask during internal compositing as shown below, while the beauty pass output with post effects is bound as SrcTarget and the beauty pass output without post effects is bound as DstTarget. The compositing operation combines the two such that SrcTarget appears in the background and DstTarget appears in the foreground, and it can be expressed as DstTarget over SrcTarget.

float4 CompositeWithAlphaMask( float3 UV,
                               texture2D SrcTarget, sampler SrcTargetSampler,
                               texture2D DstTarget, sampler DstTargetSampler,
                               texture2D AlphaMask, sampler AlphaMaskSampler )
{
    float4 srcColor = SrcTarget.Sample( SrcTargetSampler, UV.xy ); // Color with post effects
    float4 dstColor = DstTarget.Sample( DstTargetSampler, UV.xy ); // Color with no post effects
    float alphaMask = AlphaMask.Sample( AlphaMaskSampler, UV.xy ).a;
    return lerp( srcColor, dstColor, alphaMask );
}
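Per channel, the fragment reduces to a linear interpolation between the two beauty outputs, driven by the alpha mask. This C++ sketch (names illustrative) mirrors that behavior on the host:

```cpp
#include <cassert>
#include <cmath>

// Per-channel sketch of the compositing fragment: the alpha mask
// selects between the post-effect color (srcColor) and the
// non-post-effect color (dstColor), matching HLSL's lerp().
float compositeChannel(float srcColor, float dstColor, float alphaMask)
{
    return srcColor + (dstColor - srcColor) * alphaMask;
}
```

A mask value of 0 keeps the post-effect result, 1 replaces it with the non-post-effect result, and intermediate values blend the two.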

The XML wrapper for this implementation for a given release can be found using the MEL command:

ogs -xml maya_CompositeWithAlphaMask;

From the API, the MFragmentManager::getFragmentXML() method can be used to query this fragment. The example plug-in fragmentDumper in the Developer Kit demonstrates the use of this method to dump out fragments:

dumpFragment -fn maya_CompositeWithAlphaMask;

Frame and draw context

For information regarding frame and draw context, see Frame and draw contexts.

NOTE: If MPassContext::shaderOverrideInstance() is called to obtain an override shader at draw time, then the plug-in must update the isSingleSided parameter based on the lighting state obtained from MFrameContext.

IMPORTANT: For the geometry requirements of a custom shader, if the appropriate input streams are not provided, the renderer attempts to create them internally. This may occur per frame and thus affect performance. For example, if an MPxGeometryOverride provides the geometry for the custom shader but returns only position streams and not normal streams, the renderer attempts to derive the normal values internally.