That's a very valid question… I imagine this goes to the render dev team; this may be the only situation where we need to have the vector pass together with motion blur.
What do you think @brecht ?
I will run tests with the temporal denoiser and post results.
The steps to denoise the animation are described in D11442: run
bpy.ops.cycles.denoise_animation()
in the Python console (and wait, as Blender will freeze while it denoises the animation).
Note: the part you might be confused about is bpy.ops.cycles.denoise_animation(). You might think you need to put in some parameters, but you don't. If you just use the command bpy.ops.cycles.denoise_animation() as-is, it will work.
I will post a video showing what to do if you want it.
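For anyone who wants to script this rather than type it in the console, the step can be wrapped in a small helper. This is only a sketch: it assumes a Blender build with Cycles and the denoise_animation operator, and the render path below is a hypothetical example, not a path from this thread.

```python
def run_batch_denoise(render_path: str) -> str:
    """Kick off Cycles' batch animation denoiser; returns a status string.

    bpy only exists inside Blender, so the import is guarded for
    illustration purposes.
    """
    try:
        import bpy
    except ImportError:
        return "bpy unavailable: run this from Blender's Python console"
    # Point the render output at the EXR sequence you already rendered
    # (with Denoising Data and Vector passes enabled).
    bpy.context.scene.render.filepath = render_path
    bpy.ops.cycles.denoise_animation()  # no parameters needed
    return "denoised"

print(run_batch_denoise("//renders/frame_"))  # hypothetical output path
```

Blender will appear frozen while the operator walks through the frames, exactly as described above.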
Great, thanks for this info! This was the thing that I was missing. All the OpenEXR and passes stuff is pretty self-explanatory. But I didn't know that I could just leave the parameters empty.
I have finally finished the renders. The video can be watched at the link below, or downloaded from it so you don't have to deal with Google's compression (note: you may need to wait a bit for the 4K version of this video to process if you want to stream it rather than download it):
Some notes:
- The denoising setups compared are OptiX Standard, OptiX Temporal, and OIDN. OIDN had prefiltering set to Accurate and made use of the Albedo and Normal passes.
- Renders are named like 0.01 - OptiX Standard. The 0.01 is the Noise Threshold setting for that specific render: the sample count was set to 4096 and adaptive sampling was enabled to achieve a specific noise level.
- All renders had Animated Seed turned OFF. I personally found that having Animated Seed off helped OptiX temporal denoising, so that's why I did it. However, it should be noted that adaptive sampling does act somewhat like an animated seed, and my testing was rather limited, so it may in fact be better with Animated Seed turned on (I will do more testing).
- The scene is the "Classroom" scene from this site: Demo Files — blender.org
There are more denoising setups that could be tested: different prefiltering settings for OIDN (None and Fast), a complex multi-pass OIDN denoising setup in the compositor, NLM, NLM temporal denoising, temporal re-projection like Statix talked about, etc. I'm sorry if I didn't test the specific denoiser you wanted to see.
I will try to run some more tests on more scenes in coming days and post the results here.
Hi, you have put a lot of effort into Cycles over the last months and I really appreciate this.
I guess many other users feel the same.
Cheers, mib
An update on this post. I have tested with the Animated Seed setting on and off to see how it impacts temporal denoising.
Having the animated seed turned off did help with temporal stability, but it introduced structured patterns across frames in the denoised result, most noticeable when using a lower sample count.
I rendered two more scenes, the Junkshop and Monster scenes from the Blender demo files: Demo Files — blender.org
Some notes about the renders:
- The denoising setups compared are OptiX Temporal, Standard OIDN, and Multi-Pass OIDN. Let's explain each.
- OptiX Temporal is the new OptiX temporal denoiser, making use of OptiX 7.3. The denoiser was given the Render, Denoising Albedo, Denoising Normals, and Vector passes.
- Standard OIDN is OIDN 1.4 with prefiltering set to Accurate. The denoiser was given the Render, Denoising Albedo, and Denoising Normals passes.
- Multi-Pass OIDN is OIDN 1.4 with prefiltering set to Accurate, applied to each render pass (e.g. Diffuse Direct, Diffuse Indirect, Glossy Direct, etc.) prior to merging them to recreate the image. The Denoising Albedo and Denoising Normals passes are used as the inputs for each step of denoising. I ran this test as I've seen a few sources online suggest this produces better detail preservation than Standard OIDN, and I can confirm it in some scenes.

Note: I would recommend downloading the file to watch it in the highest quality:
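To make the multi-pass idea concrete, here is a toy sketch (not Blender or OIDN code): each light pass is "denoised" separately with a stand-in filter, then the passes are summed to recreate the image. The pass names and pixel values are made up for illustration, and real Cycles recombination also multiplies the direct/indirect light passes by their colour passes before summing.

```python
def denoise(pixels):
    """Placeholder denoiser: a simple 3-tap moving average standing in
    for OIDN. Images are 1-D lists of floats for brevity."""
    n = len(pixels)
    out = []
    for i in range(n):
        window = pixels[max(0, i - 1):min(n, i + 2)]
        out.append(sum(window) / len(window))
    return out

# Hypothetical noisy per-pass pixel rows.
passes = {
    "diffuse_direct":   [0.8, 1.2, 0.9, 1.1],
    "diffuse_indirect": [0.1, 0.3, 0.2, 0.2],
    "glossy_direct":    [0.4, 0.0, 0.5, 0.1],
}

# Denoise each pass separately, then sum to recreate the image.
denoised = {name: denoise(p) for name, p in passes.items()}
combined = [sum(vals) for vals in zip(*denoised.values())]
print(combined)
```

The appeal of this setup is that each pass is smoother and simpler in isolation than the combined image, which is why some people report better detail preservation from denoising per pass.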
Now, let’s talk about some of the results.
Comparing OptiX Temporal to Standard OIDN, detail is generally lower with OptiX Temporal. However, if you look above the bed at the purple cloak, OptiX Temporal is more temporally stable than the other options. Overall, it seems the OptiX Temporal denoiser loses detail compared to the other options, including the non-temporal form of OptiX denoising, in favor of temporal stability. Which you prefer is up to you.
I'm probably going to run one more test comparing denoisers unless anyone has anything specific they'd like me to test. The test I am conducting compares NLM Temporal denoising to OptiX Temporal denoising.
One thing I would like to note for people who have been following the tests I've done: my results are only applicable to the scenes I tested. There are many more test scenes with different objects, material types, movements, etc., that will have different results.
I got an NLM Temporal test done. It can be downloaded from the link below.
Some details about what was used:
- The renders include the Denoising Albedo, Denoising Normal, and Vector auxiliary passes.
- Denoising used the Denoising Data with 5 frames of temporal history.
- The Strength and Feature Strength are set to 0.5, and the denoising radius is either 8px or 24px, as noted in the name of the render.

Have you tried with animated seed/noise on? Or what was the motivation to turn it off?
I have run tests with the animated seed off and on.
Having it on led to greater temporal instability after denoising. There are "artifacts" that occur due to the animated seed being turned off, but I was personally willing to accept those artifacts in my tests for improved temporal stability.
Try using low samples; it's difficult to see the difference when the samples are already high. Use between 4 and 8 samples and let's see the real difference in denoising.
OptiX 7.5 has been released with some denoiser-related news, though I understand very little about what the changes actually are.
With regard to the denoising changes, it seems two main things have occurred. At least, that's what I interpreted the release notes to mean.
I wonder if the UPSCALE feature would be available for RTX cards only…
NVIDIA® OptiX™ Version 7.5.0
Graphics Hardware:
● All NVIDIA GPUs of Compute Capability 5.0 (Maxwell) or higher are supported.
Graphics Driver:
● NVIDIA OptiX 7.5.0 requires that you install an R515+ driver.
● Windows 8.1/10 64-bit; Linux RHEL 4.8+ or Ubuntu 10.10+ 64-bit
Hi, just trying this out. I need to do it from my own EXR files generated with Blender’s file output node.
I've set the file output node as below so that its layers and naming convention match what Blender usually outputs:
This is the comparison between the Blender output on the left and my EXR on the right:
but it’s giving me an error:
bpy.ops.cycles.denoise_animation()
Error: Could not find a render layer containing denoising data and motion vector passes
I've set Blender's output to the location and file name of my own EXR sequence prior to running the script, so I'm not sure what the issue could be. Is my naming convention incorrect in the file output node?
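One way to debug this is to dump the channel names from the EXR and check that the denoising data and vector passes sit on the same render layer. The sketch below is pure Python over a hypothetical channel list; it assumes the usual "<layer>.<pass>.<component>" channel naming convention, so verify against what your files actually contain (e.g. by inspecting them with OpenImageIO's iinfo tool).

```python
# Passes the batch denoiser complains about when missing; names are
# an assumption based on what Blender's own output shows in the UI.
REQUIRED_PASSES = ["Denoising Normal", "Denoising Albedo", "Vector"]

def missing_passes(channel_names, layer="ViewLayer"):
    """Return the required passes that no channel of `layer` provides."""
    present = {name.split(".")[1]
               for name in channel_names
               if name.startswith(layer + ".") and name.count(".") >= 2}
    return [p for p in REQUIRED_PASSES if p not in present]

# Example channel list as a File Output node might write it.
channels = [
    "ViewLayer.Combined.R", "ViewLayer.Combined.G",
    "ViewLayer.Combined.B", "ViewLayer.Combined.A",
    "ViewLayer.Denoising Normal.X", "ViewLayer.Denoising Albedo.R",
]
print(missing_passes(channels))
```

If the check reports a missing pass (here, Vector), the File Output node is dropping or renaming it, which would explain the "could not find a render layer" error.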