MPxShadingNodeOverride (along with its subclass MPxSurfaceShadingNodeOverride) is the API entry point to override the hardware shading fragment used to render a plug-in software shading node in Viewport 2.0. Unlike MPxShaderOverride, implementations of MPxShadingNodeOverride are not responsible for the full draw. In fact, such implementations do no drawing at all.
Implementations of MPxShadingNodeOverride must be associated with specific types of shading nodes. In most cases, a plug-in defines a shading node, and a separate MPxShadingNodeOverride is written to provide Viewport 2.0 support. MPxShadingNodeOverride implementations must be registered with MDrawRegistry using a classification string. Shading nodes with classification strings that satisfy the override classification are translated for Viewport 2.0 using the override. The classification string must begin with "drawdb/shader" to be recognized by the system.
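A minimal registration might look like the following sketch. It is not taken from a devkit sample; the classification string, registrant id, and the BrickTextureOverride class (sketched after the next paragraph) are hypothetical placeholders.

// Hypothetical plug-in entry points that register a shading node override.
// BrickTextureOverride is the override class sketched below; the shading node
// itself must be registered elsewhere with a classification that includes the
// same string, which must begin with "drawdb/shader".
#include <maya/MFnPlugin.h>
#include <maya/MDrawRegistry.h>

static const MString sDrawDbClassification("drawdb/shader/texture/2d/brickTexture");
static const MString sRegistrantId("brickTexturePlugin");

MStatus initializePlugin(MObject obj)
{
    MFnPlugin plugin(obj, "Example", "1.0", "Any");
    // ... register the MPxNode-derived shading node here, using a node
    //     classification that includes sDrawDbClassification ...
    return MHWRender::MDrawRegistry::registerShadingNodeOverrideCreator(
        sDrawDbClassification, sRegistrantId, BrickTextureOverride::creator);
}

MStatus uninitializePlugin(MObject obj)
{
    MFnPlugin plugin(obj);
    return MHWRender::MDrawRegistry::deregisterShadingNodeOverrideCreator(
        sDrawDbClassification, sRegistrantId);
}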
The responsibility of a shading node override is to define a hardware shading fragment to be associated with a software shading node. Ideally, the fragment has input parameters that correspond exactly to the input attributes on the node, and output parameters that correspond exactly to the output attributes on the node. This is not always possible; typically, hardware fragments support only a subset of the functionality of the software shading node. Whenever possible, a shading node override should be able to operate in isolation using only information from the shading node it is operating on. It should not need to know anything about other nodes in the shading network.
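A minimal override therefore only needs to name its fragment. The sketch below assumes a hypothetical BrickTextureOverride class whose fragment, pluginBrickTexture, has been registered separately with MFragmentManager (see below); only fragmentName() is strictly required.

#include <maya/MPxShadingNodeOverride.h>
#include <maya/MViewport2Renderer.h>
#include <maya/MString.h>

// Hypothetical override for a brick texture node; it does no drawing, it
// simply names the hardware fragment that represents the node.
class BrickTextureOverride : public MHWRender::MPxShadingNodeOverride
{
public:
    static MHWRender::MPxShadingNodeOverride* creator(const MObject& obj)
    {
        return new BrickTextureOverride(obj);
    }

    MHWRender::DrawAPI supportedDrawAPIs() const override
    {
        return MHWRender::kOpenGL | MHWRender::kOpenGLCoreProfile | MHWRender::kDirectX11;
    }

    // Name of the fragment (or fragment graph) registered with
    // MFragmentManager to use for this node type.
    MString fragmentName() const override
    {
        return "pluginBrickTexture";
    }

private:
    BrickTextureOverride(const MObject& obj)
        : MHWRender::MPxShadingNodeOverride(obj)
    {
    }
};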
When translating a shading network for display in Viewport 2.0, Maya traverses upstream in the dependency graph in a depth-first manner, starting with the surface shader connected to the shading engine. For each node it visits, it creates an instance of the shading node override associated with the node type. The override created may be internal or defined by a plug-in (the system sees no difference between the two). After creating an override for a node, Maya requests from the override the name of the fragment or fragment graph to be used for the node. It then connects the fragments together in a graph which approximates the DG connections of the nodes in the Maya shading network. Once fragments for all nodes in the network are created and connected together, Maya connects some global fragments for lighting and geometry information and then compiles the final fragment graph into a shading effect. This shading effect is then applied when drawing scene objects to which the original surface shader has been assigned. If there is a problem translating any node in the network (for example, no override, a bad fragment, or an inability to form connections), Maya simply prunes the translation at that node, and the sub-network rooted there contributes nothing to the final shading effect.
When there is a change to an existing shading network, the override system is again invoked. If the change alters the topology of the Maya shading network, or if an override for a node in the network indicates that the change is equally significant, then the pre-existing overrides are all deleted and the translation is redone from scratch. If the change is simply a change to an attribute value, then the overrides are invoked to update the values of any parameters on the final shading effect that correspond to the parameters on the original fragments. Much of the value updating is handled automatically by Maya, but this can be overridden in a specific implementation where needed.
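An override participates in this decision through MPxShadingNodeOverride::valueChangeRequiresFragmentRebuild(). The following hedged sketch assumes a hypothetical attribute ("outputFormat") whose value determines which fragment the override would return.

#include <maya/MPlug.h>

// Assumed to be declared on the hypothetical BrickTextureOverride class above.
bool BrickTextureOverride::valueChangeRequiresFragmentRebuild(const MPlug* plug) const
{
    // Treat a change to this attribute as "equally large" as a topology
    // change: it alters the fragment we would return from fragmentName(),
    // so ask Maya to redo the translation from scratch.
    if (plug &&
        plug->partialName(false, false, false, false, false, true) == "outputFormat")
    {
        return true;
    }
    return MHWRender::MPxShadingNodeOverride::valueChangeRequiresFragmentRebuild(plug);
}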
Since Maya sees no difference between internal and plug-in shading node overrides, implementing MPxShadingNodeOverride for a plug-in software shading node is the best way to provide a Viewport 2.0 representation that works with all of the Viewport 2.0 systems. Lights, shadows, screen-space effects, transparency, acceleration structures and integration with unrelated plug-ins all work automatically.
Figure 43: A non-trivial Maya shading network and how it is translated into the hardware fragment graph used to create the final shading effect. Each shading node has an associated shading node override. The override specifies the hardware fragment or fragment graph to be created for the node. Those fragments are joined together, along with some global lighting and geometry fragments, to produce a final fragment graph. The input parameters of the graph drive the input parameters of the contained fragments.
Shader fragments and fragment graphs are managed by MFragmentManager. New fragments and/or graphs may be defined using XML and registered with Maya through MFragmentManager. These fragments and graphs may then be referenced by implementations of MPxShadingNodeOverride. For the purposes of MPxShadingNodeOverride, fragments and fragment graphs can be used interchangeably. A fragment graph is merely a special case of a fragment which still has input and output parameters but is composed of other fragments instead of directly defining shading code. Throughout this guide, any reference to a fragment also applies to a fragment graph.
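For example, a fragment defined in an XML file might be registered through MFragmentManager as in the sketch below. The file name and fragment name are hypothetical, and the file is assumed to be found on the fragment search path.

#include <maya/MViewport2Renderer.h>
#include <maya/MFragmentManager.h>

// Register the brick texture fragment once (for example, at plug-in load time).
bool registerBrickFragment()
{
    MHWRender::MRenderer* renderer = MHWRender::MRenderer::theRenderer();
    if (!renderer) return false;

    MHWRender::MFragmentManager* fragMgr = renderer->getFragmentManager();
    if (!fragMgr) return false;

    // Nothing to do if a fragment with this name is already registered.
    if (fragMgr->hasFragment("pluginBrickTexture")) return true;

    // addShadeFragmentFromFile() searches the fragment path for the file and
    // returns the registered fragment name (empty on failure).
    const MString fragName =
        fragMgr->addShadeFragmentFromFile("pluginBrickTexture.xml", false /*hidden*/);
    return fragName.length() > 0;
}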
The XML formats for fragments and fragment graphs are fully defined by XML Schema Documents (XSDs) (see XML Schema). However, at a high level, the format allows the author to define the input and output parameters of a fragment as well as define the code for a Cg and/or HLSL method to implement the shading as required for OpenGL and/or DirectX 11. For fragment graphs, the format allows you to name multiple pre-existing fragments, define how they are connected, and define how the attributes of those fragments map to the external input and output interface of the fragment graph.
All supported internal Maya shading nodes are implemented using fragments and fragment graphs. In fact, MFragmentManager allows you to query the XML for many of Maya’s internal fragments. It also provides facilities for instructing Maya to dynamically dump intermediate fragment graphs, along with the final effect definition, to disk. This allows fragment authors to examine the process of how Maya takes individual fragments and joins them together to ultimately produce the final shading effect. With careful examination, you can produce fragments which integrate very well with pre-defined Maya fragments. Also, plug-ins may reuse Maya’s internal fragments, if possible, instead of defining their own.
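The sketch below illustrates these debugging facilities. The output directories are hypothetical; the fragment queried by name is the internal Maya Phong surface fragment (mayaPhongSurface) mentioned later in this section.

#include <maya/MViewport2Renderer.h>
#include <maya/MFragmentManager.h>

void dumpFragmentDebugInfo()
{
    MHWRender::MRenderer* renderer = MHWRender::MRenderer::theRenderer();
    MHWRender::MFragmentManager* fragMgr =
        renderer ? renderer->getFragmentManager() : nullptr;
    if (!fragMgr) return;

    // Ask Maya to write intermediate fragment graphs and final effect
    // definitions to disk as shading networks are translated.
    fragMgr->setIntermediateGraphOutputDirectory("C:/temp/fragments");
    fragMgr->setEffectOutputDirectory("C:/temp/effects");

    // Query the XML for one of Maya's internal fragments by name.
    MString xmlBuffer;
    if (fragMgr->getFragmentXML("mayaPhongSurface", xmlBuffer))
    {
        // Examine or save xmlBuffer as needed.
    }
}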
The easiest way to learn how to define fragments in XML is to examine pre-existing definitions, either from the Developer Kit or by dumping out some of Maya's internal fragments. The following is a sample fragment taken from the brickShader developer kit sample. The XML first defines the name and settings of the fragment, then the input parameters (called properties), default values for some of the inputs, and the output parameters. Finally, it defines two implementations of the fragment: one for OpenGL (written in Cg) and the other for DirectX 11 (written in HLSL). Although the two implementations are identical here (Cg and HLSL have a lot in common), this is not always the case. In particular, you must handle the differences between GL and DX with regard to texture access and matrix order.
<fragment uiName="pluginBrickTexture" name="pluginBrickTexture" type="plumbing" class="ShadeFragment" version="1.0">
    <description><![CDATA[Brick procedural texture fragment]]></description>
    <properties>
        <float3 name="brickColor" />
        <float3 name="jointColor" />
        <float name="blurFactor" />
        <float2 name="uvCoord" semantic="mayaUvCoordSemantic" flags="varyingInputParam" />
        <float2 name="uvFilterSize" />
    </properties>
    <values>
        <float3 name="brickColor" value="0.750000,0.300000,0.100000" />
        <float3 name="jointColor" value="0.750000,0.750000,0.750000" />
    </values>
    <outputs>
        <float3 name="outColor" />
    </outputs>
    <implementation>
        <implementation render="OGSRenderer" language="Cg" lang_version="2.1">
            <function_name val="pluginBrickTexture" />
            <source><![CDATA[
//
// Helper function for implementing brick texture fragment
//
float btnplinearstep(float t, float a, float b)
{
    if (t < a) return 0.0f;
    if (t > b) return 1.0f;
    return (t - a)/(b - a);
}

//
// Actual brick texture fragment code, corresponds to the function_name tag
// in the implementation definition and signature matches input/output
// parameter definitions of the fragment
//
float3 pluginBrickTexture(
    float3 brickColor,
    float3 jointColor,
    float blurFactor,
    float2 uv,
    float2 uvFilterSize)
{
    uv -= floor(uv); // map uv to 0-1 range

    float v1 = 0.05f; float v2 = 0.45f; float v3 = 0.55f; float v4 = 0.95f;
    float u1 = 0.05f; float u2 = 0.45f; float u3 = 0.55f; float u4 = 0.95f;

    float du = blurFactor*uvFilterSize.x/2.0f;
    float dv = blurFactor*uvFilterSize.y/2.0f;

    float t = max(
        min(btnplinearstep(uv.y, v1 - dv, v1 + dv) - btnplinearstep(uv.y, v2 - dv, v2 + dv),
            max(btnplinearstep(uv.x, u3 - du, u3 + du), 1.0f - btnplinearstep(uv.x, u2 - du, u2 + du))),
        min(btnplinearstep(uv.y, v3 - dv, v3 + dv) - btnplinearstep(uv.y, v4 - dv, v4 + dv),
            btnplinearstep(uv.x, u1 - du, u1 + du) - btnplinearstep(uv.x, u4 - du, u4 + du)));

    return t*brickColor + (1.0f - t)*jointColor;
}
]]></source>
        </implementation>
        <implementation render="OGSRenderer" language="HLSL" lang_version="11.0">
            <function_name val="pluginBrickTexture" />
            <source><![CDATA[
//
// Helper function for implementing brick texture fragment
//
float btnplinearstep(float t, float a, float b)
{
    if (t < a) return 0.0f;
    if (t > b) return 1.0f;
    return (t - a)/(b - a);
}

//
// Actual brick texture fragment code, corresponds to the function_name tag
// in the implementation definition and signature matches input/output
// parameter definitions of the fragment
//
float3 pluginBrickTexture(
    float3 brickColor,
    float3 jointColor,
    float blurFactor,
    float2 uv,
    float2 uvFilterSize)
{
    uv -= floor(uv); // map uv to 0-1 range

    float v1 = 0.05f; float v2 = 0.45f; float v3 = 0.55f; float v4 = 0.95f;
    float u1 = 0.05f; float u2 = 0.45f; float u3 = 0.55f; float u4 = 0.95f;

    float du = blurFactor*uvFilterSize.x/2.0f;
    float dv = blurFactor*uvFilterSize.y/2.0f;

    float t = max(
        min(btnplinearstep(uv.y, v1 - dv, v1 + dv) - btnplinearstep(uv.y, v2 - dv, v2 + dv),
            max(btnplinearstep(uv.x, u3 - du, u3 + du), 1.0f - btnplinearstep(uv.x, u2 - du, u2 + du))),
        min(btnplinearstep(uv.y, v3 - dv, v3 + dv) - btnplinearstep(uv.y, v4 - dv, v4 + dv),
            btnplinearstep(uv.x, u1 - du, u1 + du) - btnplinearstep(uv.x, u4 - du, u4 + du)));

    return t*brickColor + (1.0f - t)*jointColor;
}
]]></source>
        </implementation>
    </implementation>
</fragment>
As previously mentioned, Maya combines the fragments for each node in a shading network and turns the overall fragment graph into a shading effect. The parameters of this effect come from the input parameters of all the fragments. In order to avoid name clashes when multiple fragments define parameters with the same name, Maya renames most parameters uniquely.
In most cases, the values for the parameters on the shading effect are automatically driven by the attributes of the Maya nodes that were used to create the effect. This is done by matching the attributes on each Maya node to the parameters of the corresponding fragment. The name and type of the attributes must match the name and type of the parameters for this automatic relationship to be established. In the sample XML above, the input parameters (or properties) of the brick texture fragment have the same name and data type as the input attributes defined for the brickTexture plug-in node. Thus, Maya automatically sets the values for those parameters on the final shading effect using the values from the attributes on the brickTexture node. No further work is required by the shading node override.
Plug-ins may also specify associations between attributes and parameters of the same type, but with different names, by implementing MPxShadingNodeOverride::getCustomMappings(). This method is called immediately after the fragment is created, but before the automatic mappings are done. No automatic mapping is performed for any parameter on the fragment that already has a custom mapping.
Any attribute on the node that has no mapping to a parameter on the fragment is ignored. Similarly, any parameter on the fragment without a mapping to an attribute on the node is ignored (unless custom parameter setting is done by MPxShadingNodeOverride::updateShader()).
All functionality is driven through these attribute-parameter mappings. When Maya is traversing the shading network and building and connecting fragments, it only traverses connections where the input attribute on the node has a defined mapping (custom or automatic). Also, as fragments are combined for all the nodes in the Maya shading graph, their parameters are renamed in order to avoid collisions (allowing the same fragment type to be used multiple times in a graph). Only parameters with mappings are renamed; all others may suffer name collisions, which produce unpredictable results.
In addition to informing Maya of the relationship between attributes and parameters, custom mappings may be used to prevent Maya from trying to connect other fragments to a particular parameter; or, to prevent a parameter from being renamed (name collisions become the responsibility of the user). Custom mappings may also be used to tell Maya to rename a parameter to avoid name collisions but not to associate it with any attribute (set the attribute name of the mapping to an empty string). The values for such a parameter must be set manually by implementing MPxShadingNodeOverride::updateShader(). This can be useful when a parameter is not directly driven by an attribute, but must be set with a computed value.
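A sketch of getCustomMappings() covering both uses follows; the parameter and attribute names ("jointColor"/"mortarColor" and "noiseScale") are hypothetical.

#include <maya/MPxShadingNodeOverride.h>

// Assumed to be declared on the hypothetical BrickTextureOverride class above.
void BrickTextureOverride::getCustomMappings(
    MHWRender::MAttributeParameterMappingList& mappings)
{
    // Associate the fragment parameter "jointColor" with a node attribute
    // named "mortarColor" (arguments: parameter name, attribute name,
    // allowConnection, allowRename).
    mappings.append(MHWRender::MAttributeParameterMapping(
        "jointColor", "mortarColor", true, true));

    // Empty attribute name: Maya renames the parameter to avoid collisions
    // but does not drive it from any attribute; updateShader() must set it
    // manually (see the sketch later in this section).
    mappings.append(MHWRender::MAttributeParameterMapping(
        "noiseScale", "", false, true));
}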
Like input parameters, if the output parameters of the fragment specified by an override match the name and type of output attributes on the associated shading node, Maya automatically forms connections between the output of the fragment and the inputs of other fragments as required by the shading network. Any output attributes on the Maya node that do not have a corresponding output parameter on the fragment are ignored. Output parameters on the fragment which have no matching output attribute on the node are also ignored.
Currently, the fragment system only supports one output parameter per fragment for normal shading fragments. To create a fragment with multiple outputs (to match multiple outputs on a Maya shading node), such a fragment must define its single output parameter as a “struct” output. A separate fragment must be created to define the struct type and the main fragment must be connected to this new struct definition fragment in a fragment graph. The graph can then be used by Maya and Maya automatically matches the names and types of the struct members to the output attributes of the shading node where required. See the checkerShader or fileTexture sample plug-ins for examples of how struct output fragments are created and used.
Again, like input parameters, you can define custom mappings for the cases where the names of the output parameters on the fragment do not match the names of the output attributes on the node (as long as the types still match). Implement MPxShadingNodeOverride::outputForConnection() to handle custom output mappings.
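A hedged sketch is shown below. Note that the exact signature of outputForConnection() has varied across Maya versions; this sketch assumes the form that also receives the attribute-parameter mapping list, and the "result"/"outColor" names are hypothetical.

#include <maya/MPlug.h>

// Assumed to be declared on the hypothetical BrickTextureOverride class above.
MString BrickTextureOverride::outputForConnection(
    const MPlug& sourcePlug,
    const MHWRender::MAttributeParameterMappingList& mappings)
{
    // Maya asks which fragment output corresponds to the output plug that
    // feeds a downstream node; map the node's "result" attribute to the
    // fragment's "outColor" parameter.
    if (sourcePlug.partialName(false, false, false, false, false, true) == "result")
    {
        return MString("outColor");
    }
    return MHWRender::MPxShadingNodeOverride::outputForConnection(sourcePlug, mappings);
}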
An implementation of MPxShadingNodeOverride which specifies a fragment to use, along with optional custom attribute-parameter mappings, functions well in the Viewport 2.0 shading system. The parameter values on the final effect are automatically set to the values of the attributes on the Maya node whenever the Maya node changes. However, if additional control is required, the implementation may also override the updateDG() and updateShader() methods. These two methods are called when Maya needs to update the parameter values on the final shader, and can be used to set the values of parameters on the final effect which do not map directly to attributes on the shading node.
In updateDG(), the override should query and cache any information it needs from the Maya dependency graph. It is an error to attempt to access the DG in updateShader(), and doing so may result in instability.
In updateShader(), the override is given the MShaderInstance of which the fragment it specified is a part. It is also given the full list of attribute-parameter mappings known to Maya for the node (both automatic and custom). Since most parameters are renamed from the original names on the fragment, the implementation must use the "resolved" name from the mappings to set values on the MShaderInstance. The implementation may set the value of any parameter on the shader instance; however, any parameter with a mapping that defines an attribute is set automatically. Only parameters without a mapping or with a mapping that has no attribute need to be handled. Although it is possible to set values on the MShaderInstance for parameters from other fragments in the Maya shading node graph, this behaviour is not recommended or supported. Such values may get overwritten and behaviour is unpredictable.
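A sketch of this pattern follows, continuing the hypothetical example above: the "noiseScale" parameter was given a custom mapping with an empty attribute name, and its value is computed from a hypothetical "scale" attribute cached in updateDG(). fObject and fCachedScale are assumed member variables of the override class (the MObject passed to the constructor and a cached float, respectively).

#include <maya/MFnDependencyNode.h>
#include <maya/MPlug.h>
#include <maya/MShaderManager.h>

// Query and cache everything needed from the DG here; DG access is not
// allowed once updateShader() is called.
void BrickTextureOverride::updateDG()
{
    MStatus status;
    MFnDependencyNode node(fObject, &status);  // fObject cached in the constructor
    if (status == MS::kSuccess)
    {
        MPlug plug = node.findPlug("scale", true, &status);
        if (status == MS::kSuccess)
        {
            plug.getValue(fCachedScale);
        }
    }
}

// Set any parameters that are not automatically driven by attributes. Use the
// resolved parameter name, since parameters are renamed when fragments are
// merged into the final effect.
void BrickTextureOverride::updateShader(
    MHWRender::MShaderInstance& shader,
    const MHWRender::MAttributeParameterMappingList& mappings)
{
    const MHWRender::MAttributeParameterMapping* mapping =
        mappings.findByParameterName("noiseScale");
    const MString paramName =
        mapping ? mapping->resolvedParameterName() : MString("noiseScale");
    shader.setParameter(paramName, fCachedScale * 2.0f);  // hypothetical computed value
}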
See the developer kit samples phongShader and fileTexture for examples of custom parameter setting.
In hardware shading terms, all fragment input parameters discussed so far are considered uniform parameters. It is also possible to access varying parameters in fragment code (that is, parameters which are driven by vertex data), as well as parameters with semantics whose values are automatically populated by the rendering system (also called system parameters).
In both cases, this is accomplished by specifying a “semantic” in addition to a name and type when defining the input parameter. In the case of varying parameters, you must also mark the parameter as varying. In the brick texture XML sample above, the “uvCoord” parameter is an example of a varying parameter. The semantic “mayaUvCoordSemantic” is a special semantic that, when used on parameters with the name “uvCoord”, causes Maya to ensure that the correct UV set is made available to the fragment (based on UV linking). The table below lists semantics that are recognized for varying parameters.
Name | Type | Meaning |
Pm | 3-float | Object space position |
Pw | 3-float | World space position |
Pv | 3-float | View space position |
Nm | 3-float | Object space normal |
Nw | 3-float | World space normal |
U0 - U7 | 2-float or 3-float | UV coordinates from 1 of 8 specific channels. It is preferable to use the “mayaUvCoordSemantic” semantic if UVs are needed, as Maya handles the allocation of UV channels among all fragments in the final shading effect so that there are no collisions. |
mayaUvCoordSemantic | 2-float or 3-float | A non-specific UV channel. Maya ensures that the correct UV data is filled in based on UV linking in the Maya dependency graph. The name of the parameter must be set to “uvCoord”. |
Tw | 3-float | World space tangent. It is preferable to use the “tangent” semantic if tangents are needed, as Maya handles allocating channels among all fragments in the final shading effect so that there are no collisions. |
tangent | 3-float | World space tangent. Maya ensures that the correct tangent data is filled in based on UV linking in the Maya dependency graph. The name of the parameter must be set to “mayaTangentIn”. |
Bw | 3-float | World space bitangent. It is preferable to use the “bitangent” semantic if bitangents are needed, as Maya handles the allocation of channels among all fragments in the final shading effect so that there are no collisions. |
bitangent | 3-float | World space bitangent. Maya ensures that the correct bitangent data is filled in based on UV linking in the Maya dependency graph. The name of the parameter must be set to “mayaBitangentIn”. |
Vw | 3-float | World space view vector |
fcolor or C_4F | 4-float | Vertex color. Supports 1 varying parameter in OpenGL mode. |
colorset | 4-float | Vertex color. Supports more than 1 varying parameter in both OpenGL and DX11 mode. |
The semantics for system parameters are detailed in Semantics supported by Viewport 2.0 to be used with MShaderInstance. These are often used to access active matrices like the current world-view-projection matrix. See the depthShader developer kit sample for an example of using system parameters.
MPxSurfaceShadingNodeOverride is an extension of MPxShadingNodeOverride, specifically for surface shaders. Plug-in surface shader nodes that may be connected directly to a Maya shading engine should define an override which derives from this class instead of MPxShadingNodeOverride when providing support for Viewport 2.0.
Like MPxShadingNodeOverride, implementations of MPxSurfaceShadingNodeOverride must be associated with specific types of shading nodes. MPxSurfaceShadingNodeOverride implementations must be registered with MDrawRegistry using a classification string. Shading nodes with classification strings that satisfy the override classification are translated for Viewport 2.0 using the override. The classification string must begin with "drawdb/shader/surface" to be recognized by the system as a surface shading node.
In addition to providing all the functionality of MPxShadingNodeOverride, MPxSurfaceShadingNodeOverride also lets the override specify attributes and parameters that are treated in unique ways specific to surface shaders.
See the lambertShader sample plug-in in the developer kit for examples of the use of these special settings.
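As a hedged sketch (not taken from that sample), a Lambert-like surface override might expose those hooks as follows; the fragment name, parameter names, and attribute name are assumptions.

#include <maya/MPxSurfaceShadingNodeOverride.h>
#include <maya/MViewport2Renderer.h>

// Hypothetical surface shading node override for a Lambert-like plug-in shader.
class MyLambertOverride : public MHWRender::MPxSurfaceShadingNodeOverride
{
public:
    static MHWRender::MPxSurfaceShadingNodeOverride* creator(const MObject& obj)
    {
        return new MyLambertOverride(obj);
    }

    MHWRender::DrawAPI supportedDrawAPIs() const override
    {
        return MHWRender::kOpenGL | MHWRender::kOpenGLCoreProfile | MHWRender::kDirectX11;
    }

    MString fragmentName() const override { return "myLambertSurface"; }

    // Fragment parameter to treat as the shader's primary color.
    MString primaryColorParameter() const override { return "color"; }

    // Fragment parameter to treat as the shader's transparency.
    MString transparencyParameter() const override { return "transparency"; }

    // Node attribute that accepts bump/normal-map connections.
    MString bumpAttribute() const override { return "normalCamera"; }

private:
    MyLambertOverride(const MObject& obj)
        : MHWRender::MPxSurfaceShadingNodeOverride(obj) {}
};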
Maya also treats MPxSurfaceShadingNodeOverride objects differently than MPxShadingNodeOverride objects. In particular, Maya attempts to connect the shader fragments for lights to the fragment provided by the implementation of MPxSurfaceShadingNodeOverride. The following named input parameters on the fragment are recognized and have the specified Maya lighting parameters automatically connected to them.
Parameter name | Parameter type | Automatically connected value from lights |
Lw | 3-float | World space light direction vector for diffuse lighting |
HLw and SLw | 3-float | World space light direction vector for specular lighting |
diffuseI | 3-float | Diffuse irradiance for light |
specularI | 3-float | Specular irradiance for light |
ambientIn | 3-float | Total contribution from all ambient lights |
When there are multiple lights in the scene, the contribution from the fragments that make up the final shading effect is computed once for each light and then the results are accumulated. Therefore, the parameters above all apply to the light that is currently being computed. Overrides should not define custom attribute-parameter mappings for the special light parameters. See the fragments and fragment graphs for the Maya Phong shader (mayaPhongSurface) for examples of how light information can be used (dump the fragment and fragment graph XML using MFragmentManager).
Sample plug-ins from the developer kit which implement MPxSurfaceShadingNodeOverride include: lambertShader, phongShader, depthShader, and onbShader.