What happened to Scrambling Distance? It's been used at Theory and it will help speed up rendering

Thanks for pointing out the importance of these experimental features. Time saved on rendering = money saved = more money for devs.
It's an important and useful feature for certain use cases, and even though it may break certain things, that all depends on the use case and the artist's experience, as long as it provides a reasonably good-looking result as fast as possible.

Is there a build with the Persistent Data patch and all the other features mentioned here available for testing?
Also, how usable are Scrambling Distance and Dithering for animation rendering? Are they usable for denoising outside of Blender?

1 Like

Persistent Data has been broken for quite a while, so there aren't any builds including it. Scrambling and dithered both work with Blender's denoiser, but with external denoisers I'm not too sure they would have the best of luck. (Just dithered should be fine for external denoisers though; it should make them work even better, actually.)
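For the "how does it work" part: as I understand it, Scrambling Distance simply scales down the per-pixel randomization of the shared Sobol pattern, so neighbouring pixels trace more similar (more coherent, and therefore faster) paths. Here's a minimal conceptual sketch in Python, with hypothetical names, not the actual Cycles code:

```python
import random

def per_pixel_offset(pixel_hash, dim, scrambling_distance):
    """Cranley-Patterson offset for one sample dimension of one pixel,
    scaled by the scrambling distance (hypothetical helper)."""
    rng = random.Random(pixel_hash * 7919 + dim)  # deterministic per pixel/dim
    return rng.random() * scrambling_distance

def scrambled_sample(sobol_value, pixel_hash, dim, scrambling_distance):
    # Rotate the Sobol value (shared by all pixels) by the scaled offset, mod 1.
    return (sobol_value + per_pixel_offset(pixel_hash, dim, scrambling_distance)) % 1.0

# At 1.0 every pixel is fully decorrelated (standard behaviour); at low
# values all pixels cluster around the same pattern:
for sd in (1.0, 0.1, 0.01):
    print(sd, [round(scrambled_sample(0.5, h, 0, sd), 3) for h in (101, 102, 103)])
```

That clustering would also explain the denoiser caveat above: at low values the remaining noise is spatially correlated rather than independent per pixel, while dithering only changes how the scrambling is distributed across the image, which is why it should be safe for external denoisers.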

1 Like

LINK to all renders in .png: https://1drv.ms/f/s!AtmMeBB1cwnzmY9V4vvP2f2obQrRGQ

So I've been trying the build for the last 3 days for test renders, and from what I've seen it works… just great.

There is really just one setting, and I found a key to understanding what to expect, at least in my cases, which are texture-heavy scenes lit by HDRIs (or sun, spot and point lights) with lots of indirect light.

Tbh I'm still unable to set Simplify AO consistently right, so I just leave it be. Compared to this, Simplify AO is so much harder to set correctly.

Set correctly, this literally turns a 1070 into a 1080 Ti.

I've been trying 3 different scenes so far with DS (Dithered Sobol) and SD (Scrambling Distance).

Here I present the first one. It's rather simple geometry with 4K textures and a 4K HDRI. I have a much heavier one I will post later.

I have a lot of findings that I put into a graph to better understand them and draw some conclusions. It's rather interesting :)

Graph:

Reference render, 10000 samples, Dithered Sobol:

Scrambling Distance from 0.01 to 1.0

Scrambling Distance divided by reference

700 samples vs 1400 samples at Scrambling Distance 0.01 (I probably should've done 0.1 instead because 0.01 is really asking for trouble anyway)

Lastly, this is Correlated Multi-Jitter vs Dithered Sobol.
When it comes to Dithered Sobol I see little difference, at least in texture-heavy scenes. But maybe with some change to the denoiser it could potentially work better.


In conclusion, I will from now on always use Scrambling Distance builds, and I would be super, super, super happy to see it in master (experimental), because the feature 100% deserves it.

For archviz I am sure it could be in the Supported feature set too, but I am not able to test every possible scenario alone, and there has been talk that it falls short in some very specific areas like hair.

  • 0.25: almost the same = 27% faster
  • 0.2: pretty great already = 30% faster
  • 0.1: good preview / often acceptable = 40% faster
  • 0.01: asking for trouble! = 48% faster

11 Likes

Wow, this study is amazing!

I think this test makes clear that, at the very least, this should be in experimental to give everyone the ability to test it. This will greatly help us… in fact it will help us A LOT!!!

@brecht the interest from users is there; don't you think we could have this implemented during the beta at least, and take a final decision before the 2.8 release?

9 Likes

Hey, I just wanted to note that the problem I was having with the build wasn't because of the build; it was an error on Windows (probably because of one of its recent freaking updates), and the fact that I had too many Blender builds in the same folder.
I didn't know this, maybe you did hehe, but to test builds it's better to keep them in a folder separate from the official main Blender install.

@JuanGea sorry man, I just read your response about going to blender.chat to see how to fix the issue. I think I just glossed over it while reading the thread.

Anyway, I can get back to doing some testing after work. I really want to see how it works with hair and volumes. :smiley:

1 Like

Do you have a Linux build, maybe? I have a scene that literally takes 27 h to render and I wonder how much faster it would be with scrambling distance and dithering.

No problem @JulianPerez :slight_smile:

Perfect. Exactly what I think.

1 Like

LINK to all renders in .png: https://1drv.ms/f/s!AtmMeBB1cwnzmZABj6pzYKWkOc8ENw

Here is the much heavier scene. There's a spot light for each lamp, plus a point light in the space. There's lots of geometry not visible to the camera on the floors above and below.

It has clamping and Filter Glossy turned off, so don't mind the fireflies near the windows. It nicely shows how their distribution changes though, especially with SD 0.001.

0.001 was really extreme, just to see what happens… It's interesting though that when it comes to GI, 0.05 is already fine. It might change the lighting a bit in places, but it's very minor.

Scrambling Distance helped slightly less here, as you can see from the blue line representing the previous scene in the graph. It's still very significant though.

As usual here is the gif:

What worries me a bit is the underside of the lamps, which is a light grey color with Specular 0.1 and Roughness 0.250. With the lower scrambling distance values you can see the reflections of the lights (or anything, really) don't get spread out nearly enough, as they are sampled from fewer angles; this improves with more samples. It looks fine from 0.05 up, although different (maybe even better, actually).
Scene2_pointlightreflections

So in conclusion this scene also benefits from Scrambling Distance. And when I take into account the higher sample count needed to get clean reflective caustics and better data for the denoiser, it certainly helps a ton.

3 Likes

I finally had some time to do more testing. I took an old scene I had as a base to test different materials, and threw in an additional simple smoke sim just to have several cases in the same scene.
All the renders share the same settings (see the scripted version after the list):

  • Denoising off
  • Clamp indirect: 100
  • Light threshold: 0.02
  • Reflective caustics on
  • Refractive caustics off
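
For reference, here is the same setup as a quick Blender Python snippet (a sketch; the property names are from the 2.8-era Cycles API and may differ in these experimental builds):

```python
import bpy

scene = bpy.context.scene
scene.cycles.sample_clamp_indirect = 100.0       # Clamp indirect: 100
scene.cycles.light_sampling_threshold = 0.02     # Light threshold: 0.02
scene.cycles.caustics_reflective = True          # Reflective caustics on
scene.cycles.caustics_refractive = False         # Refractive caustics off
# Denoising is a per-view-layer setting in 2.8:
bpy.context.view_layer.cycles.use_denoising = False
```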

Here’s the comparison gif, images below (open the images in a new tab to see them at full scale):
Comparison

From these tests I can say that anything other than SSS and volumes (for smoke) gets pretty good results even at low scrambling distance values. SSS gets messed up with anything lower than 0.2; at that value the subsurface result looks slightly different, but I wouldn't say it's bad; with enough samples and the denoiser activated, the difference won't be noticeable unless you have side-by-side comparisons.
I can't see any glaring error or artifact on the hair; another test with fur on an animal or a carpet might be necessary.
At the extreme value of 0.005 the fire and smoke get a cool flat look, almost like 2D :stuck_out_tongue: Too bad the domain is so visible there; otherwise it could be used to create some NPR effects…

2 Likes

Nice test! Could you try one Scrambling Distance value with different amounts of samples? Also watch out: if you don't warm up the PC before benchmarking, these short renders will render considerably slower. Just render for about 1 minute and then relaunch the render.
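
If anyone wants to script that warm-up, here's a quick sketch for Blender's Python console (using only the standard render operator; the warm-up pass can be a reduced-sample render):

```python
import time
import bpy

bpy.ops.render.render()   # throwaway warm-up render so GPU clocks stabilize

start = time.time()
bpy.ops.render.render()   # the render you actually measure
print(f"Render time: {time.time() - start:.1f} s")
```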

It seems the speedup in this scene is much smaller compared to mine: only 7.5% at 0.2 versus 1.00, and you already get diminishing returns. 0.5 even takes longer.

I think it could be because some of the shading you have is rather complex in places, like glass, fire, SSS and hair, and scrambling the sampling distance doesn't help that much there?

But it could also be that the rest of the scene is quite simple… I found out that for super simple scenes, like looking at a plane from above with just a sunny HDRI, the speedup from 1 to 0.1 is about 14%. The result is 100% identical.

I can't say why the render at 0.5 took longer than the one at 1 (maybe another process was using a lot of resources at that moment), but overall I think the difference in render times comes from the materials: all of them use just the Principled shader or Principled Volume, but the ones that are usually more complex (SSS, hair, volume) make the render slower.

You're right, it seems that simpler scenes get the most benefit from scrambling distance. But even with complex materials, any speed-up, no matter how small, can help a lot.

Here’s the scene with SD at 0.2 and different samples: 100, 250, 500 and 1000 respectively.

Without the fire, which is difficult for Cycles anyway, the scene at 500 samples and SD 0.2 looks fine; hair, SSS, and volume (yellow cube) don't appear to have any visible artifacts.

Honestly, the more I see these tests, the less I feel the need for the scrambling feature. I'd rather see effort invested in things like adaptive sampling or better smoothing in the denoiser.
That said, I understand people like @LordOdin, who found a way to turn the flaws of the feature into an advantage.

Except for cases where you have a close-up of a character lit only by a bonfire, scrambling distance can help a lot in scenes where the materials aren't very complex (think archviz and VFX). Plus, there's not much effort needed, as it's pretty much ready to be included anyway.

1 Like

The thing is that Scrambling Distance and Dithered Sobol are already developed and well tested in production by Theory in every single project they do, so it's a matter of "enabling" them in master, while what you are asking for is something that still has to be developed.

Could it be further improved? I'm sure it could, but so could sampling, or any other Cycles feature; that does not prevent Cycles from having other features.

It's clear that those two features are super beneficial in a lot of situations, so I don't understand why you say that the more you see the tests, the less you see their benefits… the benefit is render time and less noise, and for denoising, Dithered Sobol is far better than plain Sobol.

Cheers!

1 Like

Also, what are the flaws you see in the feature? Because I only see problems that are solved with more sampling, like the noise itself :slight_smile:

So here’s the last render.

I actually rendered only 2 images, with a high sample count, at 4450 × 3150 px.

Speedup was 30% at SD 0.1, which is pretty impressive.

The visual difference is that the wooden beams are slightly darker on the sides facing the camera, and it might be my subjective assessment, but 0.1 was denoised very slightly better and the already pretty much invisible "clouds" are even less visible.

I reckon it would be about 25% at SD 0.15 without a visible difference. I don't have the time to re-render right now though.

This is the reference at SD 1.0, rendered at 3333 samples in 10 hours 40 minutes:

This is the scrambled version at SD 0.1, rendered at 3333 samples in 7 hours 30 minutes:
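
(For anyone checking the math, the 30% figure follows directly from those two timings:)

```python
t_ref = 10 * 60 + 40   # SD 1.0: 10 h 40 min = 640 min
t_sd = 7 * 60 + 30     # SD 0.1: 7 h 30 min = 450 min
print(f"{(t_ref - t_sd) / t_ref:.1%} faster")   # -> 29.7% faster
```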

7 Likes

Can you render it with, like, the lowest amount of samples possible in both cases? I mean stock 2.79 and with SD etc. turned on at 0.1, like an actual real-world case, because rendering this at 10k samples is… not really a real benefit. I mean, the big advantage of SD is that it is much cleaner at the same amount of samples, so the question is how much speed we can gain. I would love to test this on my newest works, but I need Linux builds, sadly.

Well, it has a benefit: you reach the same amount of samples, which gives you a clean picture, in less time; in this case with a 30% time reduction :slight_smile:

I want to post my test pictures; what I do is reduce the amount of samples I need. I will do both pictures with the same amount of samples.

But IMHO there is enough information and testing here to show that those two features could at least be present as experimental features to let everyone test them. So far Linux and OS X users could not test them because we don't have Linux and OS X builds, which is a pity.

Cheers!

3 Likes

I still haven't had time to publish my own tests; I'm finishing a big project, and it's the one I will use to show this.

But @brecht, what do you think about all this?

Don't you think this deserves to be included at least as an experimental feature? (Especially since it does not require additional development.)

Cheers!

3 Likes