These objects describe the renderable geometry in the scene and present a thin, explicit abstraction over the underlying GPU resources they represent. Shapes use them to describe renderable geometry, and shaders use them to bind and render that geometry; in both cases this applies equally to Maya's internal shapes and shaders and to plug-in implementations. This clean separation means that custom shaders can work with Maya shapes, Maya shaders can work with custom shapes, and custom shaders and custom shapes can work together.
However, unlike previous Maya viewports, Viewport 2.0 caches and reuses as much data as possible; unless the scene state changes, only the final render call is issued, drawing from the cached state.
MGeometry represents the vertex and index data for any renderable geometric entity in Maya. This is most commonly a shape (mesh, NURBS surface, subdivision surface), but MGeometry may also represent other in-viewport elements, including the grid, manipulators, and tool feedback.
In the case of DAG objects, each MGeometry instance holds all the renderable data for all instances of a single object. This includes all of the vertex and index buffer data that describes the shape’s control points.
MGeometry is used in several places in the Viewport 2.0 API. In some cases it simply provides read-only access to existing geometry data; in others, the user must fill the MGeometry object with the vertex and index data needed to draw a particular object.
MVertexBuffer is a thin wrapper around a graphics card vertex buffer. Each vertex buffer has a name, a semantic (position, normal, uv, etc.), and a type (e.g. float3), and this metadata is encapsulated by an instance of the MVertexBufferDescriptor class. Aside from hardware memory and shader interpolant limits, Maya imposes no limit on the number of vertex buffers that can be added to an MGeometry object (e.g. an object can have multiple position streams, multiple normal streams, and so on).
MIndexBuffer is the index equivalent of MVertexBuffer. An MGeometry object can hold zero, one, or many index buffers, depending on how many renderable entities (render items; see below) the MGeometry object represents.
MGeometryRequirements describes the geometry streams and index buffers that are required to draw all render items associated with a specific object. This class is passed to implementations of MPxGeometryOverride to indicate what geometry needs to be produced.
The list of vertex buffers it stores is formed by taking the union of the requirements of all render items associated with the object. Any one render item can be examined to determine the requirements for that item alone; however, all data is shared where possible.
The render item is deliberately a lightweight object, making it easy and efficient to share heavyweight vertex/index data across multiple renders of the same underlying geometry (e.g. drawing wireframe-on-shaded mode re-uses the same position data).
The shader associated with a render item drives the geometry requirements that must be fulfilled to draw the object. These requirements can be retrieved from MRenderItem and will be non-empty whenever a shader is assigned. Shaders can be acquired through MShaderManager.
Render items are created in two ways. First, Maya automatically creates one render item for each shader assignment on each instance of each object; these render items may be disabled but not removed, and they are automatically associated with the shader described by the Maya shading assignment. Second, users may add additional render items per instance for any custom purpose.