Data Classes

These objects describe the renderable geometry in the scene and present a thin, direct abstraction over the underlying GPU resources they represent. Shapes use them to describe the renderable geometry, and shaders use them to bind and render that geometry; in both cases this applies equally to Maya's internal shapes and shaders and to plug-in ones. This clean separation means that custom shaders can work with Maya shapes, Maya shaders can work with custom shapes, and custom shaders and custom shapes can work together.

The basic "handshake" between shapes and shaders is as follows:

  1. The shape is queried for the list of render items it needs to render (including multiple material sub-geometries, wireframe selection, component display, etc).
  2. The shader is queried for its geometry requirements (e.g. "I need positions and UV set foo").
  3. Maya computes the superset of all the geometry requirements for each shape (based on all the render items that use it).
  4. The shape populates all the geometry buffers based on its current state.
  5. The shader receives the list of render items it needs to render, and it can pull out the geometry buffers it needs for each geometry item.
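The merging in step 3 can be sketched with a deliberately simplified model. The types below (`Requirement`, `RenderItem`, `mergeRequirements`) are hypothetical stand-ins for the real MHWRender classes (MRenderItem, MGeometryRequirements) and show only how per-item shader requirements are combined into one de-duplicated superset for the shape:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical stand-in for a shader's requirement of one geometry stream
// (step 2 of the handshake), e.g. "position" or "uvSet:foo".
struct Requirement {
    std::string stream;
};

// Hypothetical stand-in for a render item (step 1): each item carries the
// requirements of the shader assigned to it.
struct RenderItem {
    std::string name;                             // e.g. "shaded", "wireframe"
    std::vector<Requirement> shaderRequirements;  // step 2
};

// Step 3: merge the requirements of every render item into one
// de-duplicated superset, so a stream shared by several items
// (typically "position") is only generated once in step 4.
std::vector<std::string> mergeRequirements(const std::vector<RenderItem>& items) {
    std::vector<std::string> merged;
    for (const auto& item : items)
        for (const auto& req : item.shaderRequirements)
            if (std::find(merged.begin(), merged.end(), req.stream) == merged.end())
                merged.push_back(req.stream);
    return merged;
}
```

In step 4 the shape would then fill exactly one vertex buffer per merged stream, and in step 5 each shader pulls only the buffers its own render items require.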

However, unlike previous Maya viewports, Viewport 2.0 caches and re-uses as much data and information as possible; unless the scene state changes, the earlier steps are skipped and only the final render call is made, drawing from the cached state.


MGeometry represents the vertex and index data for any renderable geometric entity in Maya. These are most commonly shapes (meshes, NURBS surfaces, subdivision surfaces), but it may also be used to represent any other in-viewport elements (including the grid, manipulators, tool feedback, etc).

In the case of DAG objects, each MGeometry instance holds all the renderable data for all instances of a single object. This includes all of the vertex and index buffer data that describes the shape’s control points.

MGeometry is used in several places in the Viewport 2.0 API. In some cases it is simply meant to provide read-only access to existing geometry data, and in others the user is required to fill the MGeometry object with vertex and index data needed to draw a particular object.


MVertexBuffer is a thin wrapper around a graphics card vertex buffer. Each vertex buffer has a name, a semantic (position, normal, uv, etc.), and a type (e.g. float3); this data is encapsulated by an instance of the MVertexBufferDescriptor class. Other than hardware memory and shader interpolant limits, Maya places no limit on the number of vertex buffers that can be added to an MGeometry object (e.g. an object can have multiple position streams, multiple normal streams, etc.).
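As a rough sketch of what a descriptor carries, the simplified model below is hypothetical (the real MHWRender::MVertexBufferDescriptor exposes comparable name, semantic, data type, and dimension properties) and illustrates that several streams may share one semantic, such as two uv sets:

```cpp
#include <string>
#include <vector>

// Hypothetical, simplified model of a vertex buffer descriptor; the real
// class is MHWRender::MVertexBufferDescriptor. Only a few semantics and
// one data type are modelled here for brevity.
enum class Semantic { kPosition, kNormal, kTexture, kColor, kTangent };
enum class DataType { kFloat };

struct VertexBufferDescriptor {
    std::string name;   // distinguishes multiple streams of one semantic
    Semantic semantic;  // position, normal, uv, ...
    DataType dataType;  // element component type
    int dimension;      // e.g. 3 for a float3 position
};

// A geometry object modelled as a flat list of its vertex stream descriptors.
using Geometry = std::vector<VertexBufferDescriptor>;

// Count the streams of a given semantic; nothing forbids more than one.
inline int countSemantic(const Geometry& g, Semantic s) {
    int n = 0;
    for (const auto& d : g)
        if (d.semantic == s) ++n;
    return n;
}
```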


MIndexBuffer is the index equivalent of MVertexBuffer. An MGeometry object can include 0, 1, or multiple index buffers depending on how many renderable objects (render items, see below) the MGeometry object represents.
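To make the "multiple index buffers over shared vertex data" idea concrete, here is a hypothetical sketch (the real classes are MHWRender::MGeometry, MVertexBuffer, and MIndexBuffer): one quad's positions indexed two ways, once as a triangle list for shading and once as a line list for wireframe.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical, simplified geometry: one shared vertex stream plus any
// number of index buffers, one per renderable item.
struct Geometry {
    std::vector<float> positions;                           // shared float3 stream
    std::vector<std::vector<std::uint32_t>> indexBuffers;   // 0, 1, or many
};

// A unit quad indexed two ways over the same four vertices:
// two triangles (shaded item) and four boundary edges (wireframe item).
inline Geometry makeQuad() {
    Geometry g;
    g.positions = {0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0};  // 4 vertices
    g.indexBuffers.push_back({0, 1, 2,  0, 2, 3});          // triangle list
    g.indexBuffers.push_back({0, 1,  1, 2,  2, 3,  3, 0});  // line list
    return g;
}
```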


MGeometryRequirements describes the geometry streams and index buffers that are required to draw all render items associated with a specific object. This class is passed to implementations of MPxGeometryOverride to indicate what geometry needs to be produced.

The list of vertex buffers stored on this object is formed by taking the union of the requirements from all of the render items stored on the object. Any one render item can be examined to determine the requirements for that item; however, all data is shared where possible.


MRenderItem describes a renderable primitive: something that is actually rendered on the graphics card. Each MRenderItem associates a piece of geometry with the shader used to draw it.

The render item is deliberately a lightweight object, making it easy and efficient to share heavy vertex/index data across multiple renders of the same underlying geometry (e.g. drawing wireframe over shaded mode re-uses the same position data).

Some render passes (e.g. shadow pass, depth pass) will use the display mode/option to select which render items to display, but will override the material in order to render other surface properties.

The shader associated with a render item is what drives the geometry requirements that need to be fulfilled to draw the object. These requirements can be retrieved from MRenderItem and will be nonempty if a shader is assigned. Shaders can be acquired through MShaderManager.

Render items may be enabled or disabled to allow or prevent them from drawing, without having to delete the render item and recreate it later.

Render items are created in two places. First, Maya automatically creates one render item for each shader assignment to each instance of each object. These render items may be disabled but not removed and are automatically associated with the shader described by the Maya shading assignment. Second, users may add additional render items per instance for any custom purpose.
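The ownership rules for the two render-item populations can be sketched as follows. This is a hypothetical model only; in real plug-in code the item list is typically an MRenderItemList received in MPxGeometryOverride::updateRenderItems(), and user items are built with MRenderItem::Create().

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for a render item, keeping only the fields needed
// to illustrate the ownership rules described above.
struct RenderItem {
    std::string name;
    bool mayaOwned;   // created automatically from a Maya shading assignment
    bool enabled;     // items can be toggled without being recreated
};

// Maya-owned items may be disabled but never removed; user-created items
// may be removed outright. Returns true if the item was actually removed.
bool tryRemove(std::vector<RenderItem>& items, const std::string& name) {
    for (auto it = items.begin(); it != items.end(); ++it) {
        if (it->name != name) continue;
        if (it->mayaOwned) {
            it->enabled = false;  // best we can do: disable, not delete
            return false;
        }
        items.erase(it);
        return true;
    }
    return false;  // no such item
}
```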


MVertexBufferDescriptor is used to describe the properties of either an existing vertex buffer or one that needs to be created (i.e. to specify geometry requirements).


MVertexBufferDescriptorList is a simple list of MVertexBufferDescriptor objects.


MRenderItemList is a simple list of MRenderItem objects. Each item is owned by the list.