Cycles feedback

@JohnDow
Thanks for the new pictures.
Which NLM radius setting did you render with, 8 px or 24 px?
If it was only 8 px, could you please render another one at 24 px for comparison?

The SuperImageDenoiser does the best job by far in this diner scene.

[quote="Alaska, post:807, topic:18598, full:true"]
@Pinus I will run tests for you.
[/quote]

@Alaska
Fantastic, the direct comparison is always the best, thank you!

I used default settings with the radius set to 8 px. I’ve updated the original post with a 24 px version, and I also added a ‘ground truth’ version: 3000 samples, no denoising.
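In case anyone wants to tweak the same setting from Python, this is roughly it; a minimal sketch assuming the 2.9x property names (double-check them in your build):

```python
import bpy

# NLM's render denoising settings live on the view layer in the 2.9x API (assumed names).
view_layer = bpy.context.view_layer
view_layer.cycles.use_denoising = True
view_layer.cycles.denoising_radius = 24  # pixels; the default is 8
```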

Since the full benefits of OIDN 1.4 are available through the compositor now, what you can do to improve detail capture is run the denoising normal output through a sharpen filter before piping it into the Denoise node (as long as the prefiltering is set to ‘None’).

Of course this means a little bit of noise will remain, but not in a way that is any worse than the noise ignored by the old NLM algorithm. Then there’s denoising the individual lighting passes and combining them, if you really want to make sure every tiny detail is intact.
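Something like this in Python, if the 2.93/3.0 node and socket names are what I remember; treat it as a sketch with assumed names rather than a definitive setup:

```python
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# The Denoise node wants the denoising Normal/Albedo passes from Cycles.
view_layer.cycles.denoising_store_passes = True

scene.use_nodes = True
nodes, links = scene.node_tree.nodes, scene.node_tree.links

rl = nodes.new('CompositorNodeRLayers')
sharpen = nodes.new('CompositorNodeFilter')
sharpen.filter_type = 'SHARPEN'
denoise = nodes.new('CompositorNodeDenoise')
denoise.prefilter = 'NONE'  # the trick only makes sense with prefiltering off
out = nodes.new('CompositorNodeComposite')

# Sharpen the denoising normal before it guides the Denoise node.
links.new(rl.outputs['Denoising Normal'], sharpen.inputs['Image'])
links.new(sharpen.outputs['Image'], denoise.inputs['Normal'])
links.new(rl.outputs['Denoising Albedo'], denoise.inputs['Albedo'])
# With in-render denoising disabled, 'Image' is the noisy render.
links.new(rl.outputs['Image'], denoise.inputs['Image'])
links.new(denoise.outputs['Image'], out.inputs['Image'])
```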

2 Likes

The NLM denoiser actually creates very annoying patches/spots that are just as bad as washed-out textures. Lukas Stockner once said that he was working to solve that problem, but as long as it is not fixed, NLM creates those patches/spots and they are very noticeable to the eye.

There is no point in making comparisons in this thread with denoisers that are not included in official Blender (2.93 or 3.0). There are node combinations using the OIDN Denoise node that give much better results than a single OIDN Denoise node:

I think SID uses some of these kinds of node combinations in the compositor.

Also in those tests the total render+processing time was not shown, which is important. One of the methods could get a good result but with a lot of processing time.

You are right: if we are talking about low sample rates, then OIDN and OptiX are better than NLM. But when it comes to removing noise at high sample rates, NLM is better.

The combination of a high sample rate and NLM at 24 px produces super homogeneous, crisp images, and it also preserves the color transitions.

NLM shows its limits at low sample rates, especially with reflective surfaces (like the screen) and indirect light, but this can be adjusted.

NLM handles surfaces better than OIDN (see the paper box texture) and produces no light where there is none (the metal head of the pepper shaker); compare NLM at 24 px with the 3000-sample no-denoiser render and with OIDN. NLM is not perfect, but it is much closer to the original (3000 samples, no denoiser = the reference).

When you need a high-resolution image in a large format (poster print), you can clearly see where OIDN or OptiX have their weaknesses. Also, at high sample rates OIDN produces clearly visible washed-out patches in areas that are actually calm; with NLM this does not happen for me.

I will upload a few sample images in the next few days.

Thanks to all for this fair and open discussion.

@Ace_Dragon
thanks for the tip, I’ll give that a try!

If you have a clear example where NLM is better at reducing noise, taking render times into account, you should provide the scene’s .blend file here to convince the developers. Otherwise the discussion in this thread may start to get in the way of other Cycles-X related issues.

That is not entirely true. So that the NLM denoiser does not have problems with fireflies and generate even more spots/patches, the input to the NLM denoiser is freed from fireflies before the noise reduction is performed (this is why, for the OIDN Denoise node, it is preferable to use the “Image” output instead of the “Noisy Image” output in versions prior to 3.0). If you remove fireflies, then you are removing lighting:

Speaking of denoisers, keep in mind: if you use an HDRI as the single light source in your scene, OptiX can introduce a ton of noise if you touch the exposure setting in the color management. Here is an example:

HDRI is the only light source in the scene. Exposure set to 5.
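Roughly, the test setup as a script; a sketch assuming the 3.0-style property names (in 2.93 the render denoising settings live on the view layer, not the scene):

```python
import bpy

scene = bpy.context.scene
scene.view_settings.exposure = 5.0  # color management exposure used for the test
scene.cycles.samples = 100

# OIDN pass of the comparison: albedo + normal guiding, prefilter None.
scene.cycles.use_denoising = True
scene.cycles.denoiser = 'OPENIMAGEDENOISE'
scene.cycles.denoising_input_passes = 'RGB_ALBEDO_NORMAL'
scene.cycles.denoising_prefilter = 'NONE'

# For the OptiX pass, switch the denoiser and render again:
# scene.cycles.denoiser = 'OPTIX'
```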

100 samples, OIDN, A+N, Prefilter set to none:

100 samples, OptiX, A+N:

Same scene but with all lights disabled, exposure still set to 5:

OIDN:

OptiX:

1 Like

@Pinus
To each their own eyes.
In all those tests (thank you @JohnDow) I can’t see one better than the first OIDN.
The “better” preservation of textures in the Kcycles and Superdenoiser tests looks like boosted detail contrast, as you can see by comparing the wooden wall panels with the ground truth image. NLM gives splotches here and there (look at the reflections in the mugs and the TV).

Did a quick test at 128 samples.

While NLM does preserve textures quite a bit better, the rest of the image is a disaster. Given that OIDN is constantly being improved by Intel (I think there’s even a GPU-accelerated version coming up), I don’t see a point in keeping NLM around. The time it would take to improve or even maintain it for 3.0 could be spent on much more important features.

By the way, those denoising artifacts I showed earlier were caused by adaptive sampling creating uneven noise patterns in dark areas, but that’s a separate issue. When using uniform sampling, AI denoising works fine.
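For anyone hitting the same artifacts, switching back to uniform sampling is roughly this; a minimal sketch, with the property names assumed from the 2.9x/3.0 API:

```python
import bpy

# Force uniform sampling so every pixel receives the full sample count.
bpy.context.scene.cycles.use_adaptive_sampling = False
bpy.context.scene.cycles.samples = 128
```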

1 Like

That test looks like a big “R.I.P. NLM” gravestone

3 Likes

I was about to upload multiple images, but I’m too new ; )
and I’m only allowed to upload one image per post… : (
How long does it take to be allowed to upload multiple images here?

This thread is filling up with images about noise reduction, and this is getting a bit off topic from my point of view.
As I have said repeatedly, images are not useful in this case. If you want to convince developers, you should share a .blend file showing that the NLM denoiser works much better than OIDN 1.4 and is therefore worth keeping. With a .blend file, developers and other users will be able to experiment and verify (or not) what you say.
You can share .blend files via Google Drive or Dropbox, or, if the file is no larger than 24 MB, from here:
https://pasteall.org/blend/

I understand what you mean. NLM in Blender was always only useful for removing residual noise at very high render sample counts for production, and it worked well for that case, even with a possible solution for temporal consistency.
You could share those .blend files where, at these very high production sample counts with only residual noise left, NLM preserves textures better than OIDN 1.4 with the None or Accurate prefilter option. If OIDN 1.4 is only slightly inferior to NLM in this case, I don’t think it’s worth continuing to insist on keeping NLM when OIDN is constantly improving with each new release, and when we have the ability to use the Denoise node in the compositor for more flexibility.

3 Likes

@YAFU
We have to discuss this on the basis of concrete examples, because someone at Blender decided alone, without consultation, to throw NLM overboard, and many in the Blender community do not agree with this.

Examples also serve to make others aware of the problems.
OIDN is an Intel project; even if it is open source, what can the Blender devs do here? It’s the same situation with OptiX: Nvidia is taking care of that.

All AI denoisers still have a lot to “learn”. Probably in 2-3 years they will be much better and will make NLM obsolete, but until then we need a working alternative for high-sample, high-resolution render projects.

With NLM we would also have an independent project.

I’m happy to share my .blend file with the responsible Blender developers, but I want to put this on a broad basis, because I absolutely can’t understand the decision to drop NLM. I also want a simple and clean solution, not another “known issue” followed, as so often, by “you have to tweak this and that, then add these nodes in the compositor, and then it works”… (like Mantaflow, what a disaster…)

We need simple solutions; select the denoiser, choose a few settings, render = done.

Let the users decide which denoiser they want to use; that’s actually the strength of “open source”, isn’t it?

Blender, like many other large open source projects, does not only use its own solutions; it also uses a lot of external libraries/projects:
https://svn.blender.org/svnroot/bf-blender/trunk/lib/linux_centos7_x86_64/

Resources for development and maintenance are limited. If developing and maintaining an in-house solution does not have greater advantages than using an external one, then the external one is used.

I’m sorry it took a while to get results. I was busy with other stuff and got distracted.

Here are the results. For the best viewing experience, open them in a new tab and switch between tabs to compare them.

Scene rendered at 512 samples without denoising


Scene rendered at 512 samples with NLM denoising with a radius of 8px


Scene rendered at 512 samples with NLM denoising with a radius of 24px


Scene rendered at 512 samples with OIDN 1.4 denoising with prefilter set to None and making use of the denoising Albedo and Normal passes


Of these tests, I personally believe OIDN 1.4 with the prefilter set to None and making use of the denoising Albedo and Normal passes is the best option at 512 samples. Both of the NLM renders have blotches that the OIDN one does not. OIDN does appear to have issues with some texture detail on the pillows being distorted, but I believe the removal of blotches is more desirable than this artifact.

It should be noted that there are parts of the scene with artifacts in all of these renders. This is an issue with the scene, not the denoisers or the renders.

It should also be noted that the texture detail in many parts of this scene is really fine, meaning it can be hard for the denoisers to work with.


Scene rendered at 4092 samples without denoising


Scene rendered at 4092 samples with NLM denoising with a radius of 8px


Scene rendered at 4092 samples with NLM denoising with a radius of 24px


Scene rendered at 4092 samples with OIDN 1.4 denoising with prefilter set to None and making use of the denoising Albedo and Normal passes


Out of these options at 4092 samples, NLM and OIDN 1.4 each have different aspects I prefer, but I would generally pick OIDN 1.4.

6 Likes

My two cents on the denoiser debate is that you have all left repeatability/reproducibility out of the question, which is a giant topic for animations. The best denoiser isn’t worth anything if the solution it determines or guesses (AI) is different on every frame; then you get soft wobbling splotches instead of pixel noise, which is a lot harder to deal with in compositing. Does anyone know how NLM handles this? I’m coming from 3ds Max with V-Ray / Corona and haven’t done any animation tests with 2.9x NLM so far.

NLM has an animation denoising component with inputs to increase temporal stability. It can be called from the Python console; however, I personally was unable to get it to work, probably because I wasn’t using it properly.
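For reference, the operator I mean is the one below; a rough sketch from memory of the 2.9x API, with purely hypothetical file paths, so don’t take the argument names as gospel:

```python
import bpy

# The animation must first be rendered to multilayer EXR frames that include
# the denoising data passes; the operator then denoises those saved frames.
bpy.ops.cycles.denoise_animation(
    input_filepath="//render/frame_####.exr",     # hypothetical input pattern
    output_filepath="//denoised/frame_####.exr",  # hypothetical output pattern
)
```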

OptiX supports animation denoising with inputs to increase temporal stability, and Patrick Mours is working on a patch to add support for it in Blender/Cycles, once again called from the command line. From my own limited testing, the OptiX animation denoiser increases temporal stability when compared against standard OptiX, but it seems to only really match OIDN in temporal stability, and OIDN doesn’t have a temporal component. Although, my testing is limited and I might have been using something wrong, so don’t take my word on this.

Presumably in the future OIDN will add a temporal component to increase temporal stability, and Blender/Cycles will update to make use of that. But that feature isn’t out yet so we can’t talk about it.

Disney has a temporal denoiser, which I believe is AI-based, that’s really good. But it doesn’t appear to be publicly available, and it’s not in Blender, so comparisons can’t really be made.

There are some other temporal denoisers you can access. But none of them are included in Blender/Cycles by default.

2 Likes

I think the Super Image Denoiser uses the Cycles NLM temporal denoiser feature when you choose the Temporal denoiser option in SID (Blender 2.93 or earlier).

Regarding NLM vs OIDN 1.4: as I have pointed out before, the NLM denoiser probably preserves fine texture details better than default OIDN, or a single OIDN node in the compositor, at high render sample counts. But you have to take two things into account: NLM will change the lighting and shadows of the scene, for the reasons I explained before regarding processed fireflies; and a custom OIDN group in the compositor is going to be superior to NLM even at preserving fine details at high render sample counts.

Here I share an OIDN group for those who want to append it to their scene and do tests (the OIDN nodes there use the Accurate prefilter):

Perhaps Blender should include by default some add-on that adds a custom OIDN group in the compositor and enables the corresponding passes, to make it easier for the user.
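Such a helper could be quite small. Here is a rough sketch of the idea; the file path and the group name “OIDN_Denoise” are placeholders (not the actual group shared above), and the pass property name is assumed from the 2.9x/3.0 API:

```python
import bpy

blend_path = "/path/to/oidn_group.blend"  # placeholder path to a shared .blend

# Append the compositor node group from the shared file.
with bpy.data.libraries.load(blend_path, link=False) as (data_from, data_to):
    data_to.node_groups = [n for n in data_from.node_groups if n == "OIDN_Denoise"]

# The group relies on the denoising data passes being rendered.
bpy.context.view_layer.cycles.denoising_store_passes = True

# Drop an instance of the group into the scene's compositor tree.
scene = bpy.context.scene
scene.use_nodes = True
group_node = scene.node_tree.nodes.new('CompositorNodeGroup')
group_node.node_tree = bpy.data.node_groups["OIDN_Denoise"]
```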