We added the choice of NVIDIA's OptiX GPU denoiser or Intel's Open Image Denoise CPU denoiser for rendering and lightmap baking. Set this in the UI or with Python. Use setDenoiserModel to switch to the AOV OptiX Denoiser instead of the default HDR OptiX Denoiser for raytracing.
It is possible to combine GPU raytracing with CPU denoising and vice versa. As a result, CPU raytracing with CPU denoising is now possible if you don't have a compatible RTX GPU.
Video captions: For denoising, you can now select between the NVIDIA GPU denoiser and Intel's Open Image CPU denoiser. This can save a lot of GPU memory when rendering a huge image.
We suggest trying the CPU denoiser, though it is slower, when rendering very large images (such as 16K), where the graphics card might run out of memory. This provides better results for lightmaps and image rendering. In Render Settings > General Settings > Antialiasing > Type, select either Auto or CPU.
To set the default behavior used when rendering, select Edit > Preferences > Render Settings > Antialiasing and set values for Denoiser, Denoiser Type, and Denoiser Inputs.
Denoiser - Sets the image denoiser used for raytracing. Choose from:
Denoiser Type - Sets the denoiser used. Choose from:
GPU/Auto - If the hardware and driver version support it, a GPU-based denoiser will be used; otherwise, a CPU-based denoiser will be used.
If GPU/Auto is chosen and the image resolution is set higher than the GPU denoiser can handle, or no compatible hardware is found, VRED automatically switches to the CPU denoiser.
CPU - Always uses the CPU-based denoiser.
Denoiser Inputs - Sets the input buffers used for the denoiser. Choose from:
To set which denoiser is used, call setDenoiserType(CPU) for the CPU denoiser and setDenoiserType(GPU) for the GPU denoiser.
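As a minimal sketch for VRED's Script Editor, assuming CPU and GPU are the constants accepted by setDenoiserType exactly as written in the calls above:

    # Use the Intel Open Image CPU denoiser, for example before rendering a very large image.
    setDenoiserType(CPU)

    # Switch back to the NVIDIA OptiX GPU denoiser.
    setDenoiserType(GPU)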
VRED provides the option to use one of two variants of the OptiX denoiser. The default is the HDR denoiser; the newer one is the AOV denoiser, which supports some additional features.
setDenoiserModel(2) switches to the AOV denoiser; setDenoiserModel(0) switches back to the default HDR denoiser.
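For example, a small sketch that switches models around a render; only the values 0 and 2 given above are assumed here:

    # Switch to the newer AOV OptiX denoiser before rendering.
    setDenoiserModel(2)

    # ...render or bake lightmaps...

    # Restore the default HDR OptiX denoiser afterwards.
    setDenoiserModel(0)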