Texture Mapping

mental ray supports texture, bump, displacement, and reflection mapping, all of which may be derived from an image file or defined procedurally using user-supplied shaders. Although the mental ray shader interface provides various support functions, all of these mapping techniques can be implemented entirely in shaders.

Texture Files

Procedural textures are computed by shaders, while image textures are read from image files. In practice, textures are a combination of both methods: a procedural texture shader accepts an image texture as a parameter, reads and filters pixels from it, and modifies the lookup according to its other parameters to implement projections, scaling, cropping, replication, and other common operations on texture images. Purely procedural texture shaders that do not rely on texture images at all also exist, for example marble or cloud shaders. Some shaders use textures for special purposes; for example, a fur shader that computes hair in a volume might use a texture image to control the hair length or brushing direction.
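
The following C sketch illustrates this combination under simplifying assumptions: the shader name and its parameters (tex, repeat_u, repeat_v) are hypothetical, the first texture space of the object is used, and only mi_eval_tag, mi_eval_scalar, and mi_lookup_color_texture from the standard shader interface are assumed. It is a minimal sketch, not a production shader.

    #include "shader.h"

    /* Hypothetical parameter block: an image texture plus repeat factors. */
    struct simple_tex_t {
        miTag    tex;       /* image texture to sample      */
        miScalar repeat_u;  /* horizontal repetition factor */
        miScalar repeat_v;  /* vertical repetition factor   */
    };

    DLLEXPORT int simple_tex_version(void) { return 1; }

    DLLEXPORT miBoolean simple_tex(
        miColor *result, miState *state, struct simple_tex_t *paras)
    {
        miTag    tex      = *mi_eval_tag   (&paras->tex);
        miScalar repeat_u = *mi_eval_scalar(&paras->repeat_u);
        miScalar repeat_v = *mi_eval_scalar(&paras->repeat_v);
        miVector coord;

        if (repeat_u <= 0.0f) repeat_u = 1.0f;   /* guard against unset values */
        if (repeat_v <= 0.0f) repeat_v = 1.0f;

        /* Start from the first texture space of the intersected object
         * and apply a simple procedural modification (scaling). */
        coord    = state->tex_list[0];
        coord.x *= repeat_u;
        coord.y *= repeat_v;
        coord.z  = 0.0f;

        /* Read and filter a pixel from the image texture. */
        if (!tex || !mi_lookup_color_texture(result, state, tex, &coord)) {
            result->r = result->g = result->b = 0.0f;
            result->a = 1.0f;
        }
        return miTRUE;
    }

A shader of this kind would typically be attached to a color parameter of a material shader, as described later in this section.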

The following table lists the file formats accepted by mental ray for reading texture image files:

format    description                     color map  compress      comp.    bits/comp.    extensions

rla, rlb  Wavefront image                 -          RLE           3, 4     8, 16         color+z
pic       Softimage image                 -          RLE, -        3, 4     8
alias     Alias image                     -          RLE           3        8
rgb       Silicon Graphics color          -          RLE, -        3, 4     8
jpg       JFIF image                      -          JPEG          3        8
png       Portable Network Graphics       -          RLE, -        3, 4     8
                                          yes        RLE, -        3, 4     8
exr       OpenEXR                         -          All           1, 3, 4  half, float   tiling, layers, pyramid
tif*      TIFF image                      -          RLE, Deflate  1        1, 4, 8
                                          -          RLE, Deflate  3, 4     8, 16, float
                                          yes        RLE, Deflate  3, 4     4, 8
iff       Maya IFF image                  -          RLE, -        1, 3, 4  8, 16, float  tiling, color+z
picture   Dassault Systèmes PICTURE       -          RLE           3        8
hdr       Radiance RGBE                   -          -             4        8
ppm       Portable pixmap                 -          -             3        8, 16
tga       Targa image                     -          RLE, -        1, 3, 4  8
                                          -          RLE, -        3        5
                                          -          RLE, -        4        5/1
                                          yes        RLE, -        3, 4     8
lwi       Solidworks texture (read-only)  -          RLE           3        8
bmp       MS Windows/OS2 bitmap           -          -             3, 4     8
                                          yes        -             3, 4     1, 4, 8
dds       DirectX texture                 -          DXTn          1, 3, 4  8, 16, float
qnt       Quantel/Abekas YUV image        -          YUV           3        3
ct*       mental images texture           -          -             4        8, 16, float
st*       mental images alpha texture     -          -             1        8, 16
                                          -          -             1        float
vt, wt    mental images basis vectors     -          -             2        16
zt        mental images depth             -          -             1        float
nt, mt    mental images vectors           -          -             3        float
tt        mental images label (tag)       -          -             1        32
bit       mental images bit mask          -          -             1        1
map       memory mapped texture           -          -             any      any           pyramid
          remap (tiled) texture           -          -             any      any           tiling, pyramid

In the table, any combination of the comma-separated values determines a valid format subtype. For example, the SGI RGB image format is read if the data type is 8 bits per component, with or without alpha, and either RLE-compressed or uncompressed.

Note: The actual image format is determined by analyzing the file content, not just by checking the filename extension. This allows, for example, replacing texture files with memory-mapped textures without changing the name.

The asterisk (*) stands for an omitted, optional suffix; for example, ct* includes ctfp (floating point), cth (HDR), and ct16 (16 bits).

The extensions column lists special file format features:

Color+Z
The format allows storing a color image and a depth layer in a single file.
Layers
The format supports storing multiple layers (or images) with different data types (or channels) in a single file.
Tiling
Images are stored as tiles rather than as a single large block, which allows mental ray to optimize file access using texture caching. Tiling can typically be combined with layers and pyramids.
Pyramid
The image is stored together with a sequence of pre-filtered, lower-resolution subimages, which speeds up texture filtering in mental ray.

Typical image types such as black-and-white, grayscale, color-mapped, and true-color images, optionally compressed, are supported. Some of them can carry additional alpha channel information (where the number of components is greater than 3). The collection covers industry-standard, platform-independent formats such as TIFF, JFIF/JPEG, and OpenEXR, but also UNIX-specific (PPM) and Windows-specific (BMP, DDS) types, as well as proprietary formats of widely used applications. The mental ray proprietary formats, normally created by mental ray itself, exist mainly to exchange data that cannot be stored in other formats. As a special case, mental ray allows storing RGBE data in file formats that accept RGBA. For formats that support multiple layers, mental ray can store several frame buffers in the same file. For the IFF and RLA formats, both the color and the depth buffer can be stored in a single file. For OpenEXR files, the number of layers is essentially unlimited.

A user-defined material shader is not restricted to the above applications for textures. It is free to evaluate any texture and any number of textures for a given point, and use the result for any purpose.

Typical texture shaders accept a named texture map as an input parameter and return a color value taken from the texture image. Such shaders can be connected to any color input of other shaders, for example the diffuse color input of a material, so that the color of the diffuse component varies across the surface. To shade a given point on a surface, the coordinates of the point in texture space are determined first; the diffuse color used in the shading calculation is then the value of the texture map at these coordinates. The shader interface is extremely flexible and permits user-defined shaders to use a number of different approaches, or completely different formats. The remainder of this section describes the standard shader parameters only.

The standard mental ray material shaders support texture mapping for all standard material parameters except the index of refraction. Typical parameters like shininess, transparency, and reflectivity are scalar values that may be mapped with a scalar map. Many shaders use color maps to implement bump mapping, a technique that simulates surface structure by sampling the color map several times to determine a normal perturbation; other shaders accept a vector map (normal map) directly, which requires only a single sample to achieve the same effect.
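
As a rough sketch of the multi-sample approach, the following hypothetical bump shader samples a scalar height texture three times with a small, arbitrary offset and perturbs the state normal along the object's bump basis vectors, assuming the geometry provides them in state->bump_x_list and state->bump_y_list. A production shader would handle texture spaces, sample offsets, and dependent state values more carefully.

    #include "shader.h"

    /* Hypothetical parameters: a scalar height texture and a bump strength. */
    struct simple_bump_t {
        miTag    tex;        /* height (bump) texture              */
        miScalar strength;   /* scale factor for the perturbation  */
    };

    DLLEXPORT int simple_bump_version(void) { return 1; }

    DLLEXPORT miBoolean simple_bump(
        miColor *result, miState *state, struct simple_bump_t *paras)
    {
        miTag     tex      = *mi_eval_tag   (&paras->tex);
        miScalar  strength = *mi_eval_scalar(&paras->strength);
        miScalar  h0 = 0, hu = 0, hv = 0;
        miVector  coord;
        const miScalar du = 0.001f;          /* arbitrary sample offset */

        if (tex && state->bump_x_list && state->bump_y_list) {
            /* Sample the height map three times around the shading point. */
            coord = state->tex_list[0];
            mi_lookup_scalar_texture(&h0, state, tex, &coord);
            coord.x += du;
            mi_lookup_scalar_texture(&hu, state, tex, &coord);
            coord.x -= du; coord.y += du;
            mi_lookup_scalar_texture(&hv, state, tex, &coord);

            /* Perturb the normal along the bump basis vectors by the
             * height differences, then renormalize. */
            state->normal.x += strength * ((hu - h0) * state->bump_x_list[0].x +
                                           (hv - h0) * state->bump_y_list[0].x);
            state->normal.y += strength * ((hu - h0) * state->bump_x_list[0].y +
                                           (hv - h0) * state->bump_y_list[0].y);
            state->normal.z += strength * ((hu - h0) * state->bump_x_list[0].z +
                                           (hv - h0) * state->bump_y_list[0].z);
            mi_vector_normalize(&state->normal);
        }
        /* Pass through white; a material shader evaluated after this
         * point shades with the perturbed normal. */
        result->r = result->g = result->b = result->a = 1.0f;
        return miTRUE;
    }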

Color textures are normally not implemented in the material shader but in a separate texture shader, which is then referenced by the material shader. This separation of work between material and texture shaders allows more flexibility, because any texture shader may be combined with any material shader without programming the combination into a new material shader. Texturing also does not have to be programmed into every material shader parameter. Instead, the material shader offers simple parameters like "diffuse", "ambient", "transparency", and so on. Each parameter may be assigned a static color such as "white", or it may be attached to a texture shader. Even if the material shader was never programmed to accept textures, every operation for which it has an input parameter becomes controllable by a texture. In fact, it is possible to build whole multi-level graphs of shaders using parameter assignment.
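
The mechanism that makes this possible is that a material shader reads its parameters only through mi_eval_* calls, which return either the constant value given in the scene or, if a shader was assigned to the parameter, the result of calling that shader. The fragment below is a minimal sketch of this, with hypothetical shader and parameter names and a deliberately crude one-term shading model in place of a real illumination loop.

    #include "shader.h"

    /* Hypothetical material parameters: each color may be a constant
     * or the output of an attached texture shader. */
    struct simple_mtl_t {
        miColor ambient;
        miColor diffuse;
    };

    DLLEXPORT int simple_mtl_version(void) { return 1; }

    DLLEXPORT miBoolean simple_mtl(
        miColor *result, miState *state, struct simple_mtl_t *paras)
    {
        /* mi_eval_color returns the constant value, or calls the attached
         * shader (for example a texture shader) and returns its result.
         * The material shader does not know or care which it was. */
        miColor *ambient = mi_eval_color(&paras->ambient);
        miColor *diffuse = mi_eval_color(&paras->diffuse);
        miScalar facing  = -state->dot_nd;   /* crude facing-ratio term */

        if (facing < 0.0f)
            facing = 0.0f;

        result->r = ambient->r + diffuse->r * facing;
        result->g = ambient->g + diffuse->g * facing;
        result->b = ambient->b + diffuse->b * facing;
        result->a = 1.0f;
        return miTRUE;
    }

In the scene description, the diffuse parameter of such a material can then be given either a literal color or a texture shader instance, without any change to the material shader itself.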

Shaders that do all the work internally are called monolithic shaders, while shaders designed for easy graph building are often called base shaders.

Determining the texture coordinates of a point on a surface to be shaded requires defining a mapping from points on the surface to points in texture space. Such a mapping is itself referred to as a texture space for the surface. Multiple texture spaces may be specified for a surface. If the geometry is a polygon or subdivision surface, a texture space is created by associating texture vertices with the geometric vertices. If the geometry is a free-form surface, a texture space is created by associating a texture surface with the surface. A texture surface is a free-form surface which defines the mapping from the natural surface parameter space to texture space. Texture maps, and therefore texture spaces and texture vertices, may be one, two, or three dimensional.
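
Inside a shader, each texture space of the intersected object appears as one entry of the state's tex_list array. The hypothetical sketch below merely selects one of these spaces through a parameter; a production shader would also verify that the object actually defines that many texture spaces.

    #include "shader.h"

    /* Hypothetical parameters: which texture space to use and which
     * image texture to sample in that space. */
    struct space_tex_t {
        miInteger space;   /* index into the object's texture spaces */
        miTag     tex;     /* image texture                          */
    };

    DLLEXPORT int space_tex_version(void) { return 1; }

    DLLEXPORT miBoolean space_tex(
        miColor *result, miState *state, struct space_tex_t *paras)
    {
        miInteger space = *mi_eval_integer(&paras->space);
        miTag     tex   = *mi_eval_tag    (&paras->tex);
        miVector  coord;

        /* Each texture space defined for the object (texture vertices on
         * polygons, a texture surface on free-form surfaces) shows up as
         * one entry of state->tex_list. */
        coord = state->tex_list[space];

        if (!tex || !mi_lookup_color_texture(result, state, tex, &coord)) {
            result->r = result->g = result->b = 0.0f;
            result->a = 1.0f;
        }
        return miTRUE;
    }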

Pyramid textures are a variant of mip-map textures. When loading a texture that is flagged for filtering, for example with the filter option in the scene description, mental ray builds a hierarchy of texture images at different resolutions that allows elliptical filtering of texture samples. The map texture file format may already contain such a pyramid, which can save considerable time and memory during rendering. Without filtering, distant textures would be point-sampled at widely separated locations, missing the texture areas between the samples and causing texture aliasing. Texture filtering instead projects the screen pixel onto the texture, which results in an elliptical area on the texture. Pyramid textures allow sampling this ellipse very efficiently, taking every texture pixel inside the ellipse into account without sampling each one individually. Pyramid textures are not restricted to square or power-of-two resolutions, and work with any RGB or RGBA picture file format. The shader can either rely on mental ray's texture projection or specify its own, and filter blurriness can be adjusted per texture.
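
mental ray performs this filtering internally, but the core idea of a texture pyramid can be sketched independently of the shader interface: project the pixel footprint into texture space, choose the pyramid level whose texels roughly match the footprint size, and sample there instead of point-sampling the full-resolution image. The stand-alone helper below shows only this level-selection arithmetic; it illustrates the general technique and is not mental ray's actual implementation.

    #include <math.h>

    /* Pick a pyramid (mip) level from the texture-space footprint of a
     * screen pixel. du_dx, dv_dx, du_dy, dv_dy are the derivatives of
     * the texture coordinates with respect to screen x and y, in texels.
     * Level 0 is the full-resolution image; each level halves the size. */
    static int pyramid_level(double du_dx, double dv_dx,
                             double du_dy, double dv_dy, int max_level)
    {
        /* Approximate the elliptical footprint by its longer axis. */
        double ext_x = sqrt(du_dx * du_dx + dv_dx * dv_dx);
        double ext_y = sqrt(du_dy * du_dy + dv_dy * dv_dy);
        double ext   = ext_x > ext_y ? ext_x : ext_y;

        /* A footprint of about 2^n texels maps to pyramid level n. */
        int level = ext > 1.0 ? (int)floor(log2(ext)) : 0;
        if (level > max_level)
            level = max_level;
        return level;
    }

For a 1024x1024 texture, for example, a footprint of about 8 texels selects level 3, whose 128x128 subimage already averages 8x8 blocks of the original texels.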

A procedural texture is free to use the texture space in any way it wants, but texture files are always defined to have unit size and to repeat through all of texture space. That is, the lower-left corner of the image maps to (0.0, 0.0) in texture space and, because of the repetition, also to (1.0, 0.0), (2.0, 0.0), and so on; likewise, the lower-right corner maps to (1.0, 0.0), (2.0, 0.0), etc., and the upper-right corner to (1.0, 1.0), (2.0, 2.0), etc.
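
For an image texture this repetition amounts to wrapping each coordinate back into the unit interval before the lookup, as in the following trivial helper (not part of the shader interface):

    #include <math.h>

    /* Wrap a texture coordinate into [0, 1) so the unit-size image
     * repeats through all of texture space; for example, 2.25 and
     * -0.75 both map to 0.25. */
    static double wrap_unit(double t)
    {
        return t - floor(t);
    }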
