What happened to Scrambling Distance? It's been used at Theory, and it would help speed up rendering.

Actually, I remember the D808 case; I just read through it again. You said back then that one of the reasons you dropped it was that it would break backward compatibility. But since 2.8 is doing that anyway… maybe there is a new approach that would let us actually implement adaptive sampling and scrambling distance. I mean, they did the job… even back in the testing era, and that was a long time ago… roughed out on the fly.

1 Like

Then it's the perfect time to include it.

On the other hand, I switched from the build with SD and DS to a standard master daily build, and nothing broke; the only difference was that instead of Dithered Sobol the sampling pattern fell back to Correlated Multi-Jitter, but everything worked normally.

I hope those two features can be included in master as experimental.

1 Like

The build will crash with textures plugged into volumes. It's unrelated to scrambling or dithered sampling, though.

You should never, ever get slower results with scrambling, unless you are using a CPU with adaptive clocks or you are doing stuff in the background, which you shouldn't be doing when benchmarking anyway.

Also, adaptive sampling definitely wasn't finished and would take a lot of work on Lukas' part to get implemented. But scrambling and dithered could simply be committed today and work in master (2.79, lol).

I'm sure with the API changes there would need to be some GUI adjustments for 2.8.
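For anyone who hasn't dug into the patch, the rough idea can be sketched in a few lines. This is a toy sketch under my own assumptions, not the actual Cycles code: each pixel offsets its low-discrepancy sequence by a per-pixel hash, and the scrambling distance simply scales that offset, so as the value drops, nearby pixels trace increasingly similar (more coherent) rays. A radical-inverse sequence stands in for Sobol here, and `hash01` is a made-up pixel hash:

```python
def radical_inverse_base2(i: int) -> float:
    # Van der Corput sequence in base 2; stand-in for one Sobol dimension.
    result, f = 0.0, 0.5
    while i:
        result += f * (i & 1)
        i >>= 1
        f *= 0.5
    return result

def hash01(px: int, py: int) -> float:
    # Cheap deterministic per-pixel hash mapped into [0, 1).
    h = (px * 73856093) ^ (py * 19349663)
    return (h & 0xFFFFFF) / float(0x1000000)

def scrambled_sample(i: int, px: int, py: int, scrambling_distance: float) -> float:
    # Cranley-Patterson-style rotation, scaled by the scrambling distance:
    # 0.0 -> every pixel shares the same sequence (maximum coherence),
    # 1.0 -> pixels are fully decorrelated (stock behaviour).
    offset = hash01(px, py) * scrambling_distance
    return (radical_inverse_base2(i) + offset) % 1.0
```

At a distance of 0.0 every pixel draws identical sample values, which is exactly why very low settings can produce visible correlation artifacts while speeding up GPU rendering.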

3 Likes

I want to share with all of you a live stream I did reviewing the OptiX denoiser AND these two features. The interesting part for this thread is where I go over the features; I can't find anything that hurts Blender or any user, and it works flawlessly.

The Scramble Distance and Dithered Sobol part starts at around minute 25.

I hope it's ok that I put this here; I sincerely think it is related to all this.

Cheers!

5 Likes

This video does a really good job showing all the good parts and bad parts of scrambling. It took @JuanGea less than 3 days to fully understand scrambling and make this in-depth video explaining how it should be used… 3 days! If an artist can't figure this out, they probably shouldn't be trying to use Blender in the first place.

Anyone doubting the importance of these features just watch the video. (He even shows you how to break your renders without using scrambling haha)

4 Likes

Something that wouldn't be entirely doable without the help of scrambling + dithered, because it would just take too long. Also, it's helped a little with the OptiX denoiser.

8 Likes

Thanks for pointing out the importance of these experimental features. Time saved on rendering = money saved = more money for the devs.
It's an important and useful feature for certain use cases, and even though it may break certain things, that's all use-case and artist-experience dependent, as long as it provides a reasonably good-looking result as fast as possible.

Is there a build with the Persistent Data patch and all the other features mentioned here available for testing?
Also, how usable are Scrambling Distance and Dithering for animation rendering? Are they usable for denoising outside of Blender?

1 Like

Persistent data has been broken for quite a while, so there aren't any builds including it. Scrambling and dithered both work with Blender's denoiser, but I'm not too sure external denoisers would have the best of luck. (Dithered alone should be fine for external denoisers, though; it should actually make them work even better.)

1 Like

LINK to all renders in .png: https://1drv.ms/f/s!AtmMeBB1cwnzmY9V4vvP2f2obQrRGQ

So I've been using the build for the last 3 days for test renders, and from what I've seen it works… just great.

There is really just one setting, and I found a key to understanding what to expect, at least in my cases: texture-heavy scenes lit by HDRs (or sun, spot and point lights) with lots of indirect light.

Tbh, I'm still unable to set Simplify AO consistently right, so I just leave it be; compared to this, Simplify AO is much harder to get right.

Set correctly, this literally turns a 1070 into a 1080 Ti.

I’ve been trying 3 different scenes so far with DS and SD.

Here I present the first one. It’s rather simple geometry with 4k textures and 4k HDR. I have one much heavier I will post later.

I have a lot of findings that I put into a graph to better understand them and draw some conclusions. It's rather interesting :)

Graph:

Reference render 10000 samples, Dithered Sobol:

Scrambling Distance from 0.01 to 1.0

Scrambling Distance divided by reference

700 samples vs 1400 samples at Scrambling Distance 0.01 (probably should’ve done 0.1 instead because 0.01 is really asking for trouble anyways)

Lastly this is Correlated Multi Jitter vs Dithered Sobol.
When it comes to Dithered Sobol, I see little difference, at least in texture-heavy scenes. But maybe with some changes to the denoiser it could work better.


In conclusion, I will from now on always use Scrambling Distance builds, and I would be super, super, super happy to see it in master (experimental), because the feature deserves it 100%.

For archviz I'm sure it could be in the Supported feature set too, but I can't test every possible scenario alone, and there has been talk that it falls short in some very specific areas like hair.

  • 0.25: almost the same = 27% faster
  • 0.2: pretty great already = 30% faster
  • 0.1: good preview / often acceptable = 40% faster
  • 0.01: asking for trouble! = 48% faster
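To put those percentages in perspective, here is a quick back-of-the-envelope calculation. The speedups are the scene-dependent figures measured above; the 27-hour base render is just an illustrative number of my own:

```python
def hours_saved(base_hours: float, speedup_pct: float) -> float:
    # Time shaved off a render for a given percentage speedup.
    return base_hours * speedup_pct / 100.0

# Measured speedups per scrambling distance, for a hypothetical 27-hour render:
for sd, pct in [(0.25, 27), (0.2, 30), (0.1, 40), (0.01, 48)]:
    print(f"SD {sd}: ~{hours_saved(27.0, pct):.1f} h saved out of 27 h")
```

Even the conservative 0.25 setting would claw back roughly 7 hours on a render of that length, which over an animation adds up fast.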

11 Likes

Wow, this studio is amazing!

I think this test makes it clear that, at the very least, this should be in experimental to give everyone the ability to test it. This will greatly help us… in fact it will help us A LOT!!!

@brecht the interest of users is there; don't you think we could have this implemented during the Beta at least, and take a final decision before the 2.8 release?

9 Likes

Hey, I just wanted to note that the problem I was having with the build wasn't because of the build; it was an error on Windows (probably caused by one of its recent freaking updates) and the fact that I had too many Blender builds in the same folder.
I didn't know this (maybe you did, hehe), but for testing builds it's better to keep them in a folder separate from the official Blender install.

@JuanGea sorry man I just read your response about going to blender.chat to see how to fix the issue. I think I just glazed over it while reading the thread.

Anyway, I can get back to doing some testing after work. I really want to see how it works with hair and volumes. :smiley:

1 Like

Do you have a Linux build, maybe? I have a scene that literally takes 27 hours to render, and I wonder how much faster it would be with scrambling distance and dithering.

No problem @JulianPerez :slight_smile:

Perfect. Exactly what I think.

1 Like

LINK to all renders in .png: https://1drv.ms/f/s!AtmMeBB1cwnzmZABj6pzYKWkOc8ENw

Here is the much heavier scene. There's a spot light for each lamp, plus a point light in the space. There's lots of geometry not visible to the camera on the floors above and below.

It has clamping and filter glossy turned off, so don't mind the fireflies near the windows. It nicely shows how their distribution changes, though, especially with SD 0.001.

0.001 was really extreme, just to see what happens… It's interesting, though, that when it comes to GI, 0.05 is already fine. It might change the lighting a bit in places, but it's very minor.

Scrambling distance helped slightly less here, as you can see from the blue line representing the previous scene in the graph. It's still very significant, though.

As usual here is the gif:

What worries me a bit is the underside of the lamps, which is a light grey color with Specular 0.1 and Roughness at 0.250. At the lower scrambling distance values you can see that the reflections of the lights (or anything, really) don't get spread out nearly enough, as they are sampled from fewer angles; this improves with more samples. It looks fine from 0.05 up, although different (maybe even better, actually).
Scene2_pointlightreflections

So in conclusion, this scene also benefits from Scrambling Distance. And when I take into account the higher sample count needed to get clean reflective caustics and better data for the denoiser, it certainly helps a ton.

3 Likes

I finally had some time to do more testing. I took an old scene I had as a base to test different materials and threw in an additional simple smoke sim, just to have several cases in the same scene.
All the renders share the same settings:

  • Denoising off
  • Clamp indirect: 100
  • Light threshold: 0.02
  • Reflective caustics on
  • Refractive caustics off
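In case anyone wants to replicate the setup from a script, the list above maps roughly to this. A sketch only: the stock 2.8 property names are to the best of my knowledge, and the scrambling distance property exists only in the patched builds, so its name below is a pure assumption:

```python
import bpy

scene = bpy.context.scene

scene.cycles.sample_clamp_indirect = 100.0    # Clamp indirect: 100
scene.cycles.light_sampling_threshold = 0.02  # Light threshold: 0.02
scene.cycles.caustics_reflective = True       # Reflective caustics on
scene.cycles.caustics_refractive = False      # Refractive caustics off

# Denoising is per view layer in 2.8:
bpy.context.view_layer.cycles.use_denoising = False  # Denoising off

# Only in builds carrying the patch; the property name is a guess:
# scene.cycles.scrambling_distance = 0.2
```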

Here’s the comparison gif, images below (open the images in a new tab to see them at full scale):
Comparison

From these tests I can say that anything other than SSS and volumes (for smoke) gets pretty good results even at low scrambling distance values. SSS gets messed up with anything lower than 0.2; at that value the subsurface result looks slightly different, but I wouldn't say it's bad, and with enough samples and the denoiser activated, the difference won't be noticeable unless you have side-by-side comparisons.
I can't see any glaring error or artifact in the hair; another test with fur on an animal or a carpet might be necessary.
At the extreme value of 0.005 the fire and smoke get a cool flat look, almost like 2D :stuck_out_tongue: Too bad the domain is so visible there, otherwise it could be used to create some NPR effects…

2 Likes

Nice test! Could you try one Scrambling Distance value with different sample counts? Also watch out: if you don't heat up the PC before benchmarking, these short renders will render considerably slower. Just render for about a minute and then relaunch the render.

It seems the speedup in this scene is much smaller compared to mine: only 7.5% at 0.2 versus 1.00, and you already get diminishing returns. 0.5 even takes longer.

I think it could be because your shading is rather complex in places, and scrambling the sampling distance doesn't help that much with things like glass, fire, SSS and hair?

But it could also be that the rest of the scene is quite simple… I found that for super simple scenes, e.g. looking at a plane from above with just a sunny HDR, the speedup from 1.0 to 0.1 is about 14%, and the result is 100% identical.
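The warm-up advice above is worth automating. Here's a minimal, generic harness (my own sketch, nothing Blender-specific): `render` is any callable that performs one render, e.g. a wrapper that launches a background render from the command line:

```python
import time

def benchmark(render, warmup_seconds: float = 60.0, runs: int = 3) -> float:
    # Spin the renderer first so adaptive CPU/GPU clocks settle,
    # then time the real runs and keep the best one.
    start = time.perf_counter()
    while time.perf_counter() - start < warmup_seconds:
        render()
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        render()
        times.append(time.perf_counter() - t0)
    return min(times)
```

Taking the best of several runs (instead of one cold run) avoids exactly the kind of misleading result seen with the 0.5 render above.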

I can't say why the render at 0.5 took longer than the one at 1.0 (maybe another process was using a lot of resources at that moment), but overall I think the difference in render times comes down to the materials. All of them use just the Principled shader or Principled Volume, but the ones that are usually more complex (SSS, hair, volume) make the render slower.

You're right, it seems that simpler scenes get the most benefit from scrambling distance. But even with complex materials, any speedup, no matter how small, can help a lot.

Here’s the scene with SD at 0.2 and different samples: 100, 250, 500 and 1000 respectively.

Without the fire, which is difficult for Cycles anyway, the scene at 500 samples and SD at 0.2 looks fine; hair, SSS, and volume (the yellow cube) don't appear to have any visible artifacts.

Honestly, the more I see these tests, the less I feel the need for the scrambling feature. I'd rather see effort invested in things like adaptive sampling or better smoothing in the denoiser.
That said, I understand people like @LordOdin, who found a way to turn the feature's flaws into an advantage.

Except for cases like a close-up of a character lit only by a bonfire, scrambling distance can help a lot in scenes where the materials aren't very complex (think archviz and VFX). Plus, there's not much effort needed, as it's pretty much ready to be included anyway.

1 Like