2024-02-20 Render & Cycles Meeting

Attendees

  • Brecht Van Lommel (Blender)
  • Thomas Dinges (Blender)
  • Weizhen Huang (Blender)
  • Lukas Stockner (Blender)
  • Christophe Hery (Meta)

Notes

  • Not much work remains for Blender 4.1: some small things for OpenImageDenoise, and keeping track of any significant new bugs that are reported.
  • There was a review of the following work in progress:
    • Blue-noise dithered sampling. The biggest downside is that it doesn’t do anything with the default setting of 1024 max samples; it only works with something small like 4. We don’t have a known solution at this point; maybe something simple like using it for the first few samples and then switching is fine in practice. Control variates were brought up as a potential solution. This will have to be tested and researched a bit. (A sketch of the basic idea follows these notes.)
    • Volume stack priority and nested IOR. We decided to make priority an object-level property, the same as in RenderMan, for example. We did not see a good reason to have it at the shader node level, and it seems convenient to be able to change it for objects that share the same material. (A sketch of priority-based resolution also follows these notes.)
    • Thin-film iridescence for the Principled BSDF
    • Multi-scale Principled Hair Huang Model
  • It’s now possible to test pull requests on the buildbot on NVIDIA GPUs with the +gpu blender-bot option. There’s also a mechanism for tracking NVIDIA and Apple Silicon GPU performance over time; more info on that will follow.
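As a rough illustration of the blue-noise dithered sampling item above: the general idea (following the published blue-noise dithering approach, not necessarily the Cycles work in progress) is to offset a shared low-discrepancy sequence per pixel with a value from a blue-noise mask, so neighbouring pixels get decorrelated samples and the remaining error is distributed as visually pleasant blue noise. A minimal Python sketch, with a white-noise placeholder standing in for a real optimized mask:

```python
# A minimal sketch of blue-noise dithered sampling via a per-pixel
# Cranley-Patterson rotation. The mask, helper names and structure are
# illustrative only, not the actual Cycles kernel code.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a precomputed blue-noise dither mask (e.g. a 128x128 texture);
# a real implementation would load an optimized blue-noise mask instead of
# white noise.
DITHER_MASK = rng.random((128, 128))

def van_der_corput(i, base=2):
    """Radical inverse of i in the given base (a 1D low-discrepancy point)."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def dithered_sample(pixel_x, pixel_y, sample_index):
    """1D sample for this pixel: shared sequence plus a per-pixel shift."""
    u = van_der_corput(sample_index)
    shift = DITHER_MASK[pixel_y % 128, pixel_x % 128]
    return (u + shift) % 1.0  # toroidal shift keeps the sample in [0, 1)
```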
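And for the volume priority item: a hedged sketch of how an object-level priority could resolve overlapping volumes, in the spirit of the classic nested-dielectrics approach. The helper below is hypothetical and assumes the convention that a higher priority value overrides lower ones; the actual Cycles behaviour is still being designed.

```python
# A hedged sketch of object-level volume priority, in the spirit of the
# classic nested-dielectrics approach. Hypothetical helper, assuming the
# convention that a higher priority value overrides lower ones; the actual
# Cycles design may differ since this is still work in progress.
def active_volume(volume_stack):
    """volume_stack: list of (object_name, priority, ior) enclosing the ray."""
    if not volume_stack:
        return None  # the ray is travelling through the ambient medium
    # The highest-priority volume on the stack determines the medium; lower
    # priority volumes nested inside it are effectively ignored.
    return max(volume_stack, key=lambda entry: entry[1])

# Example: water (priority 1) containing an air bubble (priority 2).
stack = [("water", 1, 1.33), ("air_bubble", 2, 1.0)]
print(active_volume(stack))  # ('air_bubble', 2, 1.0)
```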

Practical Info

This is a weekly video chat meeting for planning and discussion of Blender rendering development. Any contributor (developer, UI/UX designer, writer, …) working on rendering in Blender is welcome to join and add proposed items to the agenda.

For users and other interested parties, we ask that you read the meeting notes instead, so that the meeting can remain focused.


Maybe I’m missing something, but if I remember correctly, one of the key points of that method was to "produce significantly more faithful images, especially at low sampling rates." In their original presentation, the authors also said: "The method works best with low sample counts when the amount of approximation areas is the highest." In other words, it’s not a ‘limitation’ or ‘downside’, it’s just a feature :slight_smile:

Sounds nice. When we use denoisers, we can set the number of samples after which the denoiser starts working. I think something like that could be used here: the first 9-16 "bare" samples rendered using blue noise, and the rest using traditional methods. As for adaptive sampling in this situation, it should only kick in after the blue-noise samples have finished rendering.
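A minimal sketch of that staged schedule (the samplers and convergence test are hypothetical and passed in as callables; this is not Cycles code):

```python
# A minimal sketch of the two-phase schedule described above: a fixed number
# of blue-noise samples first, the regular pattern afterwards, and adaptive
# sampling only consulted once the first phase is done.
BLUE_NOISE_SAMPLES = 16  # e.g. the 9-16 initial "bare" samples suggested above

def render_pixel(pixel, max_samples, blue_noise_sampler, regular_sampler,
                 shade, converged):
    """Accumulate radiance for one pixel using the two-phase schedule.

    blue_noise_sampler / regular_sampler: callables (pixel, index) -> sample
    shade: callable turning a sample into a radiance estimate
    converged: adaptive-sampling test, only honoured after the first phase
    """
    total = 0.0
    for i in range(max_samples):
        sampler = blue_noise_sampler if i < BLUE_NOISE_SAMPLES else regular_sampler
        total += shade(pixel, sampler(pixel, i))
        # Adaptive sampling only kicks in once the blue-noise phase is done.
        if i >= BLUE_NOISE_SAMPLES and converged(pixel, i):
            return total / (i + 1)
    return total / max_samples
```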

Indeed, for previews in the viewport or with the Direct Light and Limited GI presets, I don’t use the default of 1024 max samples. Nobody is interested in keeping it for integrator presets lower than the Default one.
I use 8, 24, 32, or 48 samples for viewport previews.
NPR work with Direct Light may use fewer than 8.

What does that base of 4 mean here?
Does it mean the user is restricted to 4, 16, 64, and that 8, 24, 32 will be garbage?
Or is there no downside at low sample counts for amounts other than powers of 4?
If there is a benefit only for 3 or 4 different sample counts, the feature is not worth the effort.
If there is a benefit for all 64 or 256 sample counts in the range from 0 to 64 or 0 to 256, it is worth it.

  • Blue-noise dithered sampling provides a benefit at any low number of samples.
  • Sampling patterns tend to work best with 4, 16, 64, … samples, with or without blue-noise dithered sampling (see the illustration at the end of this post).
  • Any number of samples should give correct results.

Combining different techniques like this can increase noise compared to using a single sampling pattern for every sample. Obviously some testing will need to be done for this specific configuration, but it may not be something that actually comes to Cycles due to these properties.
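As a small, self-contained illustration of the power-of-four point above (using a plain Hammersley set rather than the Cycles sampler): for 4, 16, or 64 samples a 2D low-discrepancy set fills a √N × √N grid of strata with exactly one point each, while an in-between count like 24 leaves the strata unevenly occupied.

```python
# A self-contained illustration using a plain Hammersley set (not the Cycles
# sampler): powers of four fill a sqrt(N) x sqrt(N) grid of strata with
# exactly one point each, other counts leave the strata unevenly occupied.
import numpy as np

def radical_inverse(i, base=2):
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def hammersley_2d(n):
    """First n points of a simple 2D low-discrepancy (Hammersley) set."""
    return [(i / n, radical_inverse(i)) for i in range(n)]

def stratum_occupancy(n, grid):
    counts = np.zeros((grid, grid), dtype=int)
    for x, y in hammersley_2d(n):
        counts[min(int(y * grid), grid - 1), min(int(x * grid), grid - 1)] += 1
    return counts

for n in (4, 16, 24, 64):
    grid = max(1, int(round(n ** 0.5)))
    occ = stratum_occupancy(n, grid)
    evenness = "even" if occ.min() == occ.max() else "uneven"
    print(f"{n} samples -> {evenness} occupancy on a {grid}x{grid} grid")
```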

It is hard to say anything without any tests. You may be right and it is a dumb decision to switch noise patterns on the fly, or maybe it is not a big problem, or even not a problem at all. Time will tell.

PS. On a separate note, there are a lot of different pattern-based methods of increasing viewport performance. You just have to pick which one (or which few) fits your needs. For example, these two images may look the same, but one of them has half of its pixels missing. The color information for the missing pixels was taken from the neighboring ones. In other words, you can get away with only half of the pixels actually rendered without your brain even noticing the difference. Some wouldn’t even notice if those pixels were actually missing :slight_smile:
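A minimal sketch of that checkerboard idea (not an existing Cycles feature): render only half the pixels in a checkerboard pattern and fill each skipped pixel with the average of its rendered neighbours.

```python
# A minimal sketch of the checkerboard idea: only pixels with (x + y) even
# are rendered; each skipped pixel is filled with the average of its
# rendered 4-neighbours.
import numpy as np

def checkerboard_fill(image):
    """image: HxW float array where only (x + y) even pixels hold real values."""
    h, w = image.shape
    filled = image.copy()
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:  # this pixel was skipped during rendering
                neighbours = [image[ny, nx]
                              for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w]
                filled[y, x] = sum(neighbours) / len(neighbours)
    return filled
```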


Devtalk uses lossy image compression on images uploaded to the platform. This will impact the ability to spot the actual differences between the images you uploaded.


The biggest downside is that it doesn’t do anything with the default setting of 1024 max samples; it only works with something small like 4.

Are there any drawbacks to it, like slower renders? If it produces improved results at lower samples but the same at higher, then it still sounds like an objective improvement in practice.

There is no noticeable difference in render speed.

What Brecht was talking about was actually this:
If the max sample count is set to 4, then at 4 samples per pixel, you will generally get a blue noise pattern.

If the max sample count is set to 1024, then at 4 samples per pixel you are unlikely to get a blue noise pattern. At 1024 samples you will get a blue noise pattern, but since that is a high sample count, you are unlikely to notice the benefit because there is so little noise left.

So it improves results at low sample counts, but only if you’ve set the sample count to be low. If the sample count is high, there are few to no benefits at low sample counts.

For this blue noise dithering technique to function properly, the number of samples actually rendered and the “max samples” setting must be the same.
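In practice that just means setting the Cycles sample count to the number you actually intend to render, for example in the UI or (assuming the Cycles engine is active) from Python:

```python
import bpy

# Cycles "max samples"; set it to the count you actually want the blue noise
# pattern at, rather than stopping a 1024-sample render early.
bpy.context.scene.cycles.samples = 16
```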
