Cycles feedback

I believe LazyDodo meant you launch Blender, as in the GUI version of Blender, from the command prompt and use the -noaudio option to stop Blender from producing audio. You then go to the scene you want to render and press Render/F12. Then maybe run that test a few times: once with normal settings, once with the suggestions by lsscpp (keep user interface, lock user interface, that sort of thing).

Or maybe I didn’t understand your original comment.

Okay, I’ll test it later.

Meanwhile…

I did a little more digging…

Excuse me @JohnDow, but I’m still stuck somewhere between the posts above from the last few days, and I didn’t quite get one thing.
In your tests you get roughly equal render times when you render your files from the regular command line, and when you render them from the command line while ALSO having your file open in a full UI Blender instance.
Am I right so far?
So, my very question is: why? What is the benefit? Why bother opening Blender?
Because to my understanding, the “command line” part is the reason why you get the same render time in both cases. In the latter case Blender just happens to be open, that’s all. It doesn’t seem to be an obstacle to performance, unless your memory explodes due to heavy scenes.
I work like this everyday: more than one Blender instance opened, maybe one of them rendering. Maybe another one doing realtime viewport cycles duties.
I don’t see any new discovery in those numbers. But I’m probably missing the point, so this is what I meant when I said I’m confused, and it’s why I am asking.

I believe it’s more of a thing where you can open Blender, work on your scene, and then, when you want a “final preview render” to see how things have turned out, modify the scene to correct small issues. You can do this “render” part in two different ways: keep the Blender interface open and press F12, or render the scene from the command line. Rendering from the command line means the scene renders quicker. And you can render from the command line, getting faster renders, while keeping Blender open so you can quickly go back to modifying your scene.
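For reference, the command-line route described above could be wrapped in a tiny launcher script. This is only a sketch: `scene.blend` and the frame number are placeholders, and the `blender` executable is assumed to be on your PATH.

```python
import subprocess

def background_render_cmd(blend_file, frame=1, blender="blender"):
    """Build a command that renders one frame without opening the UI.

    -b       : run Blender in the background (headless)
    -noaudio : disable the audio system, as suggested above
    -f       : render the given frame and exit
    """
    return [blender, "-b", "-noaudio", blend_file, "-f", str(frame)]

cmd = background_render_cmd("scene.blend", frame=1)
print(" ".join(cmd))  # blender -b -noaudio scene.blend -f 1
# subprocess.run(cmd, check=True)  # uncomment to actually launch the render
```

You can run this from a second terminal while your interactive Blender session stays open, which is exactly the workflow described above.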

I think it’s more of an investigation than an actual use case for Blender? Not sure. I’m not @JohnDow and thus don’t know exactly why this investigation has been done.

1 Like

That’s not what I meant; @JohnDow got it right.

I noticed that there’s an issue with adaptive sampling and denoising in the viewport. Basically, adaptivity leaves too much noise in the darker areas of the image and the noise pattern is not uniform, which doesn’t play well with AI denoisers. In this example large areas don’t get any samples at all, so they’re left black, and the denoiser sees them as real detail, creating black artefacts.


Here’s how it looks without adaptive sampling:

I left the adaptive sampling settings at their defaults because changing the noise threshold doesn’t do anything, and min samples has to be raised close to the total samples to reduce the noise. But that defeats the point of using adaptive sampling.
However, the noise pattern is much better in final renders and works with denoising, so it seems that something is wrong with adaptive sampling in the viewport specifically. Is this a known issue?
Maybe it would be better to disable adaptive sampling in the viewport when denoising is used?

1 Like

Adaptive sampling just doesn’t work well in that case, regardless of whether denoising is used. An example .blend file would be useful for investigating it.

Note that the default viewport settings are:

  • Noise Threshold: 0.1
  • Max Samples: 1024
  • Min Samples: 0 (automatically set as 32 based on the noise threshold)
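As an aside, the “0 → 32” behaviour suggests the automatic minimum is derived from the noise threshold. The exact formula lives in the Cycles source; the heuristic below is only a guess that happens to reproduce the numbers quoted here (threshold 0.1 → 32 minimum samples), not the actual code:

```python
import math

def guessed_min_samples(noise_threshold, floor=4):
    """Hypothetical heuristic: a lower noise threshold demands more
    minimum samples before adaptive termination may kick in.
    0.1 ** -1.5 is ~31.6, which rounds up to the quoted 32."""
    return max(floor, math.ceil(noise_threshold ** -1.5))

print(guessed_min_samples(0.1))  # 32
```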

Regarding foreground vs. background rendering performance: Sergey just today committed a change that makes the final render display use the same system as the viewport render, which may help improve performance.

2 Likes

The adaptive sampler is working really well; it’s good to see that it is now on by default :smiley:

This actually makes a lot of sense.
There are addons that can launch a CLI render of the currently opened blend file (‘Loom’ comes to mind).

Do you mean 0.01? I did some more testing and 0.01 produces the same result as 0 (default value), while 0.1 is much more noisy. Also it seems that 0.01 is a hardcoded minimum threshold for viewport rendering as going lower doesn’t change anything. Can you confirm if that’s the case?

And speaking generally, why is adaptive sampling leaving so much noise in darker, more complicated areas? I thought the point of adaptivity is to give more samples to difficult parts of the image to produce a more uniform noise level overall. But in my example it seems to do the opposite, as seen in the sample count pass:


Shouldn’t sample counts be inverted?

You might not be using the very latest cycles-x version with the adaptive sampling changes. If you have a .blend file that works poorly with that version, we can investigate it.

Maybe a stupid question, but does Cycles-X already include support for the ray-based nodes, such as Bevel and AO? I’ll finally have the time to try it soon.

Thanks.

It does. It was broken a month ago, but I think it’s working fine now. There may be some inconsistencies in behaviour, but it’s fully functional.

1 Like

Ok, I tried a new build and it’s much better with separate settings for viewport and final render. And the noise threshold can go below 0.01 now.
But the issue with insufficient sampling in dark areas still remains. I will clean up the scene and send it to you later.

Adaptive sampling is not a magic solution that will resolve every kind of noise. For the renderer to properly measure the noise level, you need enough sampling first, or it can’t tell the difference between noise and lights/details.
How many min samples do you have in your scene?

Operating system: Windows-10-10.0.19041-SP0 64 Bits
Graphics card: NVIDIA GeForce RTX 3060/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 471.96

Blender Version
Broken: version: 3.0.0 Alpha, branch: cycles-x, commit date: 2021-09-06 18:07, hash: rBd8088307847c

It appears volume rendering has gotten worse. In the render below, when a volume is inside another volume it gets masked out by the inner volume object’s material. Previously it performed the same as 2.93.

2.93 latest RC vs CyclesX


1 Like

There was a scene provided by another user that shows a similar issue to the ones you’re talking about. However, from testing with the latest Cycles-X, it seems that this issue only occurs in the viewport when the noise threshold is set to 0; 0.1 seems to be fine, as do probably a lot of other values.

Caustics can also be a cause for problems with adaptive sampling:

I thought that issues like the one with caustics were caused by the light being highly unlikely to be hit. As such, most rays end up traversing the scene and hitting various parts of it that have very little variance, resulting in very little noise in that area. And so the adaptive sampling system just assumes the noise threshold has been reached and stops. But if it continued sampling, at some point a ray would eventually hit the light, resulting in a bright spot detected as “a lot of noise”, and the adaptive sampling system wouldn’t stop prematurely. Obviously, in the case of caustics this issue can be mitigated with different rendering techniques, some of which are planned for the future of Cycles?
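The failure mode described above can be sketched with a toy pixel whose brightness comes from a rare event (every name and number here is made up for illustration, not taken from Cycles): the noise estimate over an early sample window is zero, so an adaptive sampler would stop long before any of the rare bright paths are seen.

```python
def sample_pixel(i):
    # A "caustic-like" pixel: only 1 path in 10,000 hits the small bright light.
    return 100.0 if i % 10_000 == 9_999 else 0.0

def noise_estimate(samples):
    # Standard deviation, as a stand-in for whatever error metric the sampler uses.
    mean = sum(samples) / len(samples)
    return (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5

early = [sample_pixel(i) for i in range(64)]     # a small early sample window
late = [sample_pixel(i) for i in range(20_000)]  # enough samples to catch the event

print(noise_estimate(early))      # 0.0 -> looks "converged", sampling stops too early
print(noise_estimate(late) > 0)   # True -> the variance was there all along
```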

Both scenes can be found here: Adaptive Sampling issues - Google Drive

41ce217d3ce9 GUI/CLI vs. 7bbe12b64f46 GUI/CLI

Win10 2004 / 3600X / 16Gb / GTX1660 (471.68)

Barbershop scene (CUDA, 2048x858, custom camera position, 100 samples, OIDN):

41ce217d3ce9

GUI: 95
CLI: 82

7bbe12b64f46

GUI: 89
CLI: 82

Profiler:

Additional notes:

  1. For some reason 7bbe12b64f46 crashes a lot. Besides that, having ‘Persistent Data’ on either crashes Blender when you press F12 or (usually) breaks the renderer after the first render (CUDA).

  2. I encountered a problem (or not?) with tiled rendering. I tried it with one of my test scenes. I set the image resolution to 2K (2560x1440) and then tried different tile settings. It worked just fine with tiles set to 256, 512 and 1024. But when I set it to 2048, Cycles rendered both tiles, and right after the second tile was denoised they both disappeared and I ended up with no picture at all: just a blank screen with the timestamp of the render time (the saved picture was just black). The same happened when I set the image resolution to 8K and the tile size to 4096. All four tiles rendered just fine, but after the last (4th) tile was denoised, the picture disappeared again. I tried the same with the CLI and saw a bunch of errors after each tile:

  • Error writing tile Could not open “E:\Downloads\Blender\Program\BlenderTemp\blender_a08428\cycles-tile-buffer-9604-2099307596352.exr” (bad allocation)
  • called OpenEXROutput::write_tiles without an open file
  • Error writing tile called OpenEXROutput::write_tiles without an open file
  • Error opening tile file E:\Downloads\Blender\Program\BlenderTemp\blender_a08428\cycles-tile-buffer-9604-2099307596352.exr
  • Error reading tiles from file.

Cycles had no problem rendering the 8K image with the tile size set to 1024 or with Auto Tile disabled.

1 Like

In my scene GPU + CPU takes almost twice as long compared to GPU alone. Could someone check, please? My PC is old. I use the cycles-x build from 06-Sep-21. It only happens with this scene.