The information in MPassContext indicates whether a plug-in is being invoked to render passes that must support SSAO (screen-space ambient occlusion).
The internal algorithm for SSAO requires a specific normal-depth pass to produce intermediate results. By default, the renderer sets up a specific shader to be used in this pass. In the case where a plug-in shader causes geometry to be displaced or normals to be modified, it is possible to override this default shader.
This normal-depth pass can be detected by querying MPassContext information in a similar fashion to how shadow map passes are detected (see Section 4.2.1 Shadowing control). For this pass, the pass identifier is MPassContext::kNormalDepthPassSemantic.
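Detection amounts to scanning the list of semantics reported for the current pass. The sketch below is illustrative only: std::string stands in for the MString values returned by MPassContext::passSemantics(), and the helper name is hypothetical. It shows the comparison a plug-in would perform against a pass semantic constant such as MPassContext::kNormalDepthPassSemantic.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Returns true if any semantic reported for the current pass matches the
// semantic we are looking for. In a real plug-in the list would come from
// MPassContext::passSemantics(); std::string stands in for MString here.
bool passHasSemantic(const std::vector<std::string>& passSemantics,
                     const std::string& wanted)
{
    return std::find(passSemantics.begin(), passSemantics.end(), wanted)
           != passSemantics.end();
}
```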
If any tessellation shaders are required, then the override should be added to an MPxShaderOverride instead of an MShaderInstance.
To add support, a custom shader should take into account the following condition: the isSingleSided parameter should be updated based on the lighting state obtained from MFrameContext when it is available to the plug-in.
The following is a sample of the shader used for this pass, written in HLSL:
// Check if the back-facing normal needs to be flipped.
extern bool isSingleSided = false;
extern float mayaNormalMultiplier = 1.0f;

// Transformation matrices bound by Viewport 2.0 via shader semantics.
// (Declarations added here so that the sample is self-contained.)
float4x4 gWVXf   : WorldView;
float4x4 gWVITXf : WorldViewInverseTranspose;
float4x4 gWVPXf  : WorldViewProjection;

// Shader semantic supported by Viewport 2.0 to indicate whether
// the projection matrix flips the Z component of a point when transformed:
// -1.0 if so, otherwise 1.0.
float gProjZSense : ProjectionZSense;

// Vertex shader input structure.
struct VS_INPUT_NormalDepth
{
    float3 Pos : POSITION;
    float3 Normal : NORMAL;
};

// Vertex shader output structure.
struct VS_TO_PS_NormalDepth
{
    float4 HPos : SV_Position;
    float4 NormalDepth : TEXCOORD0;
};

// Vertex shader.
VS_TO_PS_NormalDepth VS_NormalDepth(VS_INPUT_NormalDepth In)
{
    VS_TO_PS_NormalDepth Out;

    // Transform the vertex from object space to clip space.
    Out.HPos = mul(float4(In.Pos, 1.0f), gWVPXf);

    // Record the normal and depth components for the pixel shader.
    // Note: This depends on whether the view direction is along the +Z or -Z axis.
    // The projection matrix "Z sense" determines this.
    Out.NormalDepth.xyz = mul(In.Normal, (float3x3)gWVITXf);
    Out.NormalDepth.z = gProjZSense * Out.NormalDepth.z;
    Out.NormalDepth.w = gProjZSense * mul(float4(In.Pos, 1.0f), gWVXf).z;
    return Out;
}

// Pixel shader output structure.
struct PS_OUT
{
    float4 Normal : SV_Target0;
    float4 Depth : SV_Target1;
};

// Pixel shader.
PS_OUT PS_NormalDepth(VS_TO_PS_NormalDepth In, bool isFrontFace : SV_IsFrontFace)
{
    PS_OUT Out;

    float3 normal = normalize(In.NormalDepth.xyz);
    if (!isSingleSided)
    {
        float normalMul = isFrontFace ? mayaNormalMultiplier : -mayaNormalMultiplier;
        normal *= normalMul;
    }

    // Set the normal for an unsigned normalized integer target, and the depth
    // for a floating-point target.
    Out.Normal = float4((normal + 1.0f) * 0.5f, 0.0f);
    Out.Depth = In.NormalDepth.wwww;
    return Out;
}
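The (normal + 1.0f) * 0.5f remap in PS_NormalDepth exists because the normal target is an unsigned normalized format that can only store values in [0, 1]. Below is a minimal C++ check of that remap and of the inverse a consumer of the target would apply (the helper names are hypothetical):

```cpp
// Encode one normal component from [-1, 1] into [0, 1] for an unsigned
// normalized (UNORM) render target, mirroring (normal + 1.0f) * 0.5f in
// the pixel shader above, plus the inverse decode.
float encodeNormalComponent(float n) { return (n + 1.0f) * 0.5f; }
float decodeNormalComponent(float e) { return e * 2.0f - 1.0f; }
```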
Note: The behavior described above is that of the internal pass for Maya 2016.
The information in MPassContext indicates whether a plug-in is being invoked to render passes that must support motion blur.
The internal algorithm for motion blur requires a specific motion-vector pass that produces intermediate results. By default, the renderer sets up a specific shader to be used in this pass. In the case where a plug-in shader causes geometry to be displaced, it is possible to override this default shader.
This motion-vector pass can be detected by querying MPassContext information and checking for a pass semantic matching the string motionVectorPass (MPassContext::kMotionVectorPassSemantic).
If any tessellation shaders are required, then the override should be added to an MPxShaderOverride instead of an MShaderInstance.
To add support, a custom shader should take into account information from MFrameContext when it is available to the plug-in.
Required output values: the pixel shader writes the motion vector to the R and G channels, normalized depth to the B channel, and the motion vector length to the A channel (see the return value of PS_MotionVector below).
The following is a sample of the shader used for this pass, written in HLSL:
// Shader semantics supported by Viewport 2.0.
extern float4x4 World : world;
extern float4x4 previousWorld : previousWorld;
extern float4x4 WorldViewProj : worldviewprojection;
extern float4x4 viewProjection : viewProjection;
extern float4x4 previousViewProjection : previousViewProjection;
extern float2 viewportSize : viewportPixelSize;

// This uniform variable is set by Viewport 2.0 with the value of the
// hardwareRenderingGlobals.motionBlurShutterOpenFraction attribute,
// which denotes the fraction of frame time for which the shutter
// is open: 0 denotes that the shutter is not open at all, and 1
// denotes that the shutter is open for 100% of the frame time.
extern float shutterOpenFraction = 0.200000f;

// Vertex shader input and output structures.
// (Declarations added here so that the sample is self-contained.)
struct VS_INPUT_MotionVector
{
    float3 pm : POSITION;
};

struct VS_TO_PS_MotionVector
{
    float4 Pc : SV_Position;
    float3 Pw : TEXCOORD0;
    float3 OtherFramePw : TEXCOORD1;
};

// Vertex shader.
VS_TO_PS_MotionVector VS_MotionVector(VS_INPUT_MotionVector In)
{
    VS_TO_PS_MotionVector Out;
    Out.Pw = mul( float4(In.pm, 1), World ).xyz;
    Out.OtherFramePw = mul( float4(In.pm, 1), previousWorld ).xyz;
    Out.Pc = mul( float4(In.pm, 1), WorldViewProj );
    return Out;
}

// Pixel shader.
float4 PS_MotionVector(VS_TO_PS_MotionVector In) : SV_Target
{
    // Maximum motion vector length, in pixels.
    float k = shutterOpenFraction * 100.0f;

    float4 Pc = mul( float4(In.Pw, 1.0f), viewProjection );
    float4 prevPc = mul( float4(In.OtherFramePw, 1.0f), previousViewProjection );

    float2 curUV = Pc.xy / Pc.w;
    float2 prevUV = prevPc.xy / prevPc.w;

    // Scale the motion vector by the shutter-open fraction and clamp its
    // pixel-space length to k.
    float2 vec = (curUV - prevUV) * shutterOpenFraction * 0.5f;
    vec *= viewportSize;
    float vecLength = length(vec);
    vec *= min(k / vecLength, 1.0f);
    vec /= viewportSize;

    return float4( vec.x, -vec.y, 1.0f - (Pc.z / Pc.w), vecLength );
}
Note: The algorithm above applies to Maya 2016 and is only one example of how motion vectors may be computed. A custom shader may implement motion blur in any manner appropriate.
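One detail worth noting in PS_MotionVector is the clamp: k = shutterOpenFraction * 100.0f caps the motion vector at k pixels via vec *= min(k / vecLength, 1.0f). The following standalone C++ sketch mirrors just that clamping step (the function name is hypothetical):

```cpp
#include <algorithm>
#include <cmath>

// Returns the pixel-space length of a 2D motion vector (vx, vy) after the
// clamp vec *= min(maxLen / vecLength, 1.0f) used in the pixel shader
// above: vectors shorter than maxLen are unchanged, longer ones are
// scaled down to exactly maxLen.
float clampedMotionLength(float vx, float vy, float maxLen)
{
    float len = std::sqrt(vx * vx + vy * vy);
    if (len <= 0.0f)
        return 0.0f;
    float s = std::min(maxLen / len, 1.0f);
    return len * s;
}
```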
The XML wrapper for this implementation for a given release can be found using the MEL command:
ogs -xml mayaMotionVector;
From the API, the MFragmentManager::getFragmentXML() method can be used to query this shader fragment.
The sample plug-in fragmentDumper demonstrates the use of this method to dump out fragments:
dumpFragment -fn mayaMotionVector;
The information in MPassContext indicates whether a plug-in is being invoked to render passes that must support depth of field (DOF).
The internal algorithm for DOF requires a specific pass to produce intermediate results. By default, the renderer sets up a specific shader for this pass.
The pass can be detected by querying MPassContext information and checking for a pass semantic matching the string dofPass (MPassContext::kDOFPassSemantic).
In the case where a plug-in shader causes geometry to be displaced, it is possible to override this default shader.
The shader must compute circle-of-confusion (CoC) and depth values per pixel. These values are written to the R and G channels of a floating-point output target (R32G32), which is bound as color target 0. Depth testing is performed with a locally bound depth target.
The computation is as follows:
CoC = 0.5 * alpha * abs(Z - Zf) / Z, where alpha = F*F / (A * (Zf - F))
Where:
F is the focal length of the lens
A is the aperture number
Zf is the focus distance
alpha is the CoC at infinity
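As a numeric sanity check, the formula can be transcribed directly into C++ (a sketch only; the function name is hypothetical, and all quantities are assumed to be in consistent units such as meters):

```cpp
#include <cmath>

// Numeric form of the circle-of-confusion formula above:
//   CoC = 0.5 * alpha * |Z - Zf| / Z, with alpha = F*F / (A * (Zf - F)),
// where Z is the depth of the shaded point, Zf the focus distance,
// F the focal length and A the aperture number.
float computeCoC(float Z, float Zf, float F, float A)
{
    float alpha = (F * F) / (A * (Zf - F)); // CoC at infinity
    return 0.5f * alpha * std::fabs(Z - Zf) / Z;
}
```

At the focus distance the CoC is zero, and it grows on either side of it.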
The following is an example of a shader used for this pass, written in HLSL:
float4 mayaCoCDepth( float alpha, float focusDist, float3 Pw, float4x4 view )
{
    float z = abs( mul( float4(Pw, 1), view ).z );
    float CoC = 0.5f * alpha * (z - focusDist) / z;

    // Write to the R and G channels of the output.
    return float4( CoC, z, 0, 1 );
}
Where Pw is the world space position and view is the view matrix.
The current internal computation for alpha and the focus distance is given below:
float ItoM = 0.0254f;  // inches converted to m
float CMtoM = 0.01f;   // cm converted to m

// Obtain the focus distance from the camera, in m.
float focus = <camera shape's focusDistance> * CMtoM;
focusDist = focus * 100;  // Convert back to cm.

// Compute alpha, the CoC at infinity, in m.
float fStop = <camera shape's fStop> * <camera shape's focus region scale>;  // Apply region scale
float F = <camera shape's camera scale> * <camera shape's focal length> * 0.001f;  // in m
float alpha = F * F / (fStop * (focus - F));

// Convert to UV space.
float apertureX = <camera shape's horizontal film aperture> * 0.001f;
float apertureY = <camera shape's vertical film aperture> * 0.001f;
alpha /= min( apertureX, apertureY );

Camera shape parameters are denoted by the <> delimiters in the above pseudo-code.
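Transcribed into plain C++, the pseudo-code above might look as follows. This is a sketch: the function and struct names are hypothetical, each <camera shape's ...> value becomes a parameter, and the constants and unit conversions follow the pseudo-code verbatim.

```cpp
struct DofParams
{
    float focusDistCm; // focus distance, in cm
    float alpha;       // CoC at infinity, normalized to UV space
};

// Compute the DOF shader inputs from camera attribute values, following
// the pseudo-code above step by step.
DofParams computeDofParams(float focusDistanceAttr, float fStopAttr,
                           float focusRegionScaleAttr, float cameraScaleAttr,
                           float focalLengthAttr,
                           float horizontalApertureAttr,
                           float verticalApertureAttr)
{
    const float CMtoM = 0.01f;                       // cm converted to m

    float focus = focusDistanceAttr * CMtoM;         // focus distance, in m
    float focusDist = focus * 100.0f;                // convert back to cm

    float fStop = fStopAttr * focusRegionScaleAttr;  // apply region scale
    float F = cameraScaleAttr * focalLengthAttr * 0.001f; // focal length, in m
    float alpha = F * F / (fStop * (focus - F));     // CoC at infinity, in m

    // Convert to UV space.
    float apertureX = horizontalApertureAttr * 0.001f;
    float apertureY = verticalApertureAttr * 0.001f;
    alpha /= (apertureX < apertureY) ? apertureX : apertureY;

    return DofParams{focusDist, alpha};
}
```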
Note: The algorithm above applies to Maya 2016 and is only one example of how the CoC may be computed. A custom shader may implement the CoC in any manner appropriate, and does not need to use Maya's camera node attributes.
Most internal Maya shapes are affected by post effects, with some exceptions such as image planes.
In contrast, plug-in render items are excluded from post effects by default.
The MPxGeometryOverride and MPxSubSceneOverride APIs can override this default behavior per render item by calling MRenderItem::setExcludedFromPostEffects() with false.
The MPxDrawOverride API, on the other hand, provides object-level control: an object can be included in post effects either by overriding the virtual function MPxDrawOverride::excludedFromPostEffect() to return false, or by starting the draw classification string with drawdb/geometry/includePostEffects/ during registration. Lastly, semitransparent pixels on all surfaces, whether native or drawn by a plug-in, are excluded from post effects in Viewport 2.0.
In order to composite opaque surfaces affected by post effects with those excluded from post effects, Viewport 2.0 performs a full-screen pass that blends the original beauty pass output with the output of the post-effect passes. The blend is driven by an alpha mask render target, which is generated by a post-effect pattern pass followed by a non-post-effect pattern pass.
During the post-effect pattern pass, render items included in post effects are drawn with color writes disabled and depth writes enabled in order to fill an intermediate depth buffer.
A plug-in can detect this pass by querying the MPassContext for a pass semantic matching the PEPatternPass string constant (defined as MPassContext::kPEPatternPassSemantic).
During the non-post-effect pattern pass, render items excluded from post effects are drawn with the same depth buffer, but with color writes enabled and the alpha mask render target set.
A plug-in can detect this pass by querying the MPassContext for a pass semantic matching the nonPEPatternPass string constant (defined as MPassContext::kNonPEPatternPassSemantic).
By default, render items are drawn in the non-post-effect pattern pass with the same shader as in the beauty pass; however, only the alpha channel of the shader output is written to the alpha mask render target, due to its 8-bit alpha format.
At the same time, plug-ins using the MPxShaderOverride and MPxDrawOverride APIs can use the above information to perform arbitrarily complex rendering in this pass, generating custom patterns. For convenience, MShaderInstance can be used with either of these APIs. More information about the render target can be obtained from an MDrawContext instance.
After the non-post-effect pattern pass, the screen-space compositing pass is performed, as illustrated by the shader pseudo-code below. The AlphaMask texture in this code represents the alpha mask render target, SrcTarget is the output of the post-effect passes, and DstTarget is the beauty pass output without post effects. The result is DstTarget blended over SrcTarget.
float4 CompositeWithAlphaMask( float3 UV,
                               texture2D SrcTarget, sampler SrcTargetSampler,
                               texture2D DstTarget, sampler DstTargetSampler,
                               texture2D AlphaMask, sampler AlphaMaskSampler )
{
    float4 srcColor = SrcTarget.Sample( SrcTargetSampler, UV.xy );  // Color with post effects
    float4 dstColor = DstTarget.Sample( DstTargetSampler, UV.xy );  // Color with no post effects
    float alphaMask = AlphaMask.Sample( AlphaMaskSampler, UV.xy ).a;
    return lerp( srcColor, dstColor, alphaMask );
}
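The behavior of the lerp is easy to verify in isolation: an alpha-mask value of 0 selects the post-effect color (src) and 1 selects the beauty-pass color (dst). A scalar C++ sketch of the same operation (the function name is hypothetical):

```cpp
// Scalar version of lerp(src, dst, alphaMask) from the compositing
// shader above: blends the post-effect color (src) toward the
// beauty-pass color (dst) by the alpha mask value.
float compositeChannel(float src, float dst, float alphaMask)
{
    return src + (dst - src) * alphaMask;
}
```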
The XML wrapper for this implementation for a given release can be found using the MEL command:
ogs -xml maya_CompositeWithAlphaMask;
From the API, the MFragmentManager::getFragmentXML() method can be used to query this shader fragment.
The example plug-in fragmentDumper in the Developer Kit demonstrates the use of this method to dump out fragments:
dumpFragment -fn maya_CompositeWithAlphaMask;
Finally, after opaque surfaces have been composited, transparent surfaces are rendered without post effects.
For information regarding frame and draw context, see Frame and draw contexts.
Note: If MPassContext::shaderOverrideInstance() is called to obtain an override shader at draw time, then the plug-in must update the isSingleSided parameter based on the lighting state obtained from MFrameContext.
Important: Regarding the geometry requirements of a custom shader, if the appropriate input streams are not provided, the renderer internally attempts to create them. This may occur per frame and thus affect performance. For example, if an MPxGeometryOverride were written to provide the geometry for the custom shader, and the code returned position but not normal streams, then the renderer would attempt to derive the normal values internally.