Cycles feedback

The most likely cause of the AO pass being “messed up” is that the AO pass isn’t fully implemented yet. It was recently added back to Cycles-X using shader ray tracing, as discussed in rB9aa5aeee46ae. However, shader ray-traced AO probably doesn’t take transparency into account properly, which may be why you’re running into this issue. It will hopefully be fixed when the AO pass is updated to use a technique other than shader ray tracing, something that seems to be planned based on the list in ⚓ T87836 Cycles X - GPU Performance:

  • Avoid shader ray-tracing for AO pass

Is there any way of bypassing it? If I’m desperate enough, I could use Eevee just to render an AO pass separately and blend it into my composite, or use an older version of Blender just for the AO pass.

In the case of your scene it seems like transparency is used to cut out pixels from something resembling a Minecraft character. What you can do to get AO working “properly” is to cut out those pixels yourself by removing geometry rather than using a transparency texture.
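Assuming the texture maps one texel per face, as Minecraft-style skins typically do, the “cutting out” step amounts to deleting every face whose texel is transparent. A minimal sketch of that decision (a hypothetical helper, not a Blender API):

```python
# Sketch: decide which quads of a texture-mapped grid to keep, assuming
# each quad maps 1:1 onto one texel of the transparency texture (as with
# a Minecraft-style skin). All names here are hypothetical.

def quads_to_keep(alpha_mask, threshold=0.5):
    """Return (row, col) indices of quads whose texel alpha is above threshold."""
    keep = []
    for row, texel_row in enumerate(alpha_mask):
        for col, alpha in enumerate(texel_row):
            if alpha > threshold:
                keep.append((row, col))
    return keep

# Example: a 2x3 mask where only two corners are opaque.
mask = [
    [1.0, 0.0, 1.0],
    [0.0, 0.0, 0.0],
]
print(quads_to_keep(mask))  # [(0, 0), (0, 2)]
```

The faces not in the returned list would be deleted, so the AO pass sees real geometry edges instead of a transparency texture.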

If you don’t want to go through the process of cutting everything out, I would personally recommend rendering the AO pass again in a different render engine (regular Cycles or Eevee), or rendering the entire scene in regular Cycles rather than parts in Cycles and parts in Cycles-X. As a side note, AO doesn’t need a lot of samples to resolve to a relatively noise-free result, so if you’re rendering the AO pass in regular Cycles you can probably get away with a much lower sample count and still get comparable image quality.
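To put a number on the “AO needs fewer samples” point: the AO integrand is a bounded 0/1 visibility term, so its Monte Carlo standard error shrinks like 1/sqrt(N). A toy estimator (plain Python, not Cycles code) illustrating this:

```python
# Illustration of why AO converges quickly: the integrand is a bounded
# visibility term, so the Monte Carlo error falls off as 1/sqrt(N).
# This is a toy model, not the Cycles implementation.
import random

def estimate_occlusion(true_fraction, samples, rng):
    """Monte Carlo estimate of a binary occlusion integral."""
    hits = sum(1 for _ in range(samples) if rng.random() < true_fraction)
    return hits / samples

rng = random.Random(0)
for n in (16, 256, 4096):
    est = estimate_occlusion(0.3, n, rng)
    print(n, abs(est - 0.3))  # error tends to shrink as n grows
```

In practice this is why an AO-only render at a few hundred samples often looks as clean as a full render at thousands.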


Looks like after the July 14th build, Cycles-X no longer recognizes my GTX 980 and reports no available GPUs. It reverts to CPU-only rendering.

So I was wondering, with all the recent progress on the Cycles-X project that is visible on Phabricator, can we have some info about AMD and Intel GPU support?
I noticed a point about AMD in the 2021-06-08 Blender Rendering Meeting.
Can we possibly have a little sneak peek at what’s going on inside?


This issue should be fixed when ⚙ D11976 Fix background "leaking" into combined in Cycles X is merged into the Cycles-X branch.

  1. Denoising with the normal pass causes normal-map colors to show up in the final render, blended with the actual render

  2. Viewport is significantly slower with both of my 3090s enabled, while renders are about twice as fast. Would an NVLink bridge help?


Yeah, I see 1 also; it seems to only happen with the OptiX denoiser.

The workaround is to use the compositor: set up the Denoise node but don’t connect the normal data.

Hmm, not in my case. The Denoise node in the compositor uses Open Image Denoise, which is completely fine on my side even with the normal connected. Are you having problems with both OptiX and Open Image Denoise?
[Screenshots: OptiX viewport; OptiX final render; Open Image Denoise viewport; final render with the Open Image Denoise node in the compositor (with all 3 channels plugged in); Open Image Denoise in the render tab without the compositor]
Open Image Denoise is completely fine in both the compositor and the render tab, in both the viewport and the final render, while OptiX has problems in the viewport and also in the final render. (EDIT: I messed up the settings in the previous edit; the screenshots are fixed now.)
PS: Also, Cycles X is finally consistently faster than master on my machine now.
Also, I thought the OIDN in the render tab and the compositor were the same, but apparently the compositor one is better.

This is specifically an issue with OptiX for me.


Follow-up to dual cards being slower than one:

I think it’s specifically the OptiX denoising step that is much slower when both cards are enabled. When disabling OptiX, using dual cards is substantially faster in the viewport. However, when enabling OptiX in the viewport, things get very, very chuggy. There seems to be something broken about OptiX when more than one GPU is being used.

Cycles X is saying “Out of Memory” when rendering scenes at 10,000 x 5,000 pixels, while non-Cycles-X would normally have handled this exact render. Is this a new limitation (and I need to upgrade my 32 GB of RAM), or maybe a bug?
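For a sense of scale, a single full-float RGBA buffer at that resolution is already sizeable. Some rough arithmetic (a sketch of a lower bound, not Cycles’ actual accounting, which also includes scene data and denoising buffers):

```python
# Rough buffer-size arithmetic for a 10,000 x 5,000 render, assuming
# 32-bit float RGBA storage. This is only a lower bound; real renderer
# memory use also covers scene data, BVH, and denoising passes.
def pass_bytes(width, height, channels=4, bytes_per_channel=4):
    return width * height * channels * bytes_per_channel

combined = pass_bytes(10_000, 5_000)  # one RGBA float pass
print(combined / 2**20)               # ~762.9 MiB
# Denoising typically needs extra passes (albedo, normal, ...), so a
# handful of full-resolution float buffers quickly reaches gigabytes.
print(5 * combined / 2**30)           # ~3.73 GiB for five such buffers
```

This is why a resolution that regular tiled Cycles handled fine can exhaust memory when the whole frame is kept resident at once.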

Hi, tiled rendering uses less memory; they are working on tiles for high-resolution images. This will be slower, but you can render. Luxcore has used this for years now.

Cheers, mib
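The idea behind tiled rendering can be sketched in a few lines: split the frame into rectangles and render one at a time, so peak buffer memory scales with the tile size rather than the full resolution. The names below are illustrative, not the Cycles implementation:

```python
# Sketch of how tiled rendering bounds peak memory: split the full frame
# into tiles, render each at tile resolution, and composite the results.
def make_tiles(width, height, tile_size):
    """Yield (x, y, w, h) rectangles covering the full frame exactly once."""
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            yield (x, y,
                   min(tile_size, width - x),
                   min(tile_size, height - y))

tiles = list(make_tiles(10_000, 5_000, 2048))
print(len(tiles))  # 15 tiles: 5 columns x 3 rows
# Peak buffer memory is now proportional to tile_size**2, not width*height.
```

The trade-off mentioned above follows directly: more tiles means more per-tile overhead (BVH reuse, scheduling), which is why tiled rendering is slower but fits in memory.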


This list popped up on Blenderartists today. At the bottom of it you can see there are steps planned to support high-res renderings.

https://developer.blender.org/T87837

@mib2berlin While I like Luxcore overall, tiled rendering is technically available, but I would not call it usable. I did a lot of tests, because I often render high-res scenes, and it was not satisfactory. For one, it’s almost impossible to set up and find the best settings unless you are willing to do a lot of test renderings for each scene. And then it comes with a compromise in quality if you really want to keep speed and memory within reasonable limits. The developers themselves admitted that this is a relic that needs a lot of work to get up to date.


Just in case, for those with GTX 9xx series cards where the GPU is unavailable and grayed out on the render tab (at least on Linux): a workaround is to open the scene and then, in Edit > Preferences > System, change something there, for example choose the OptiX tab and then the CUDA tab again. This triggers a kind of refresh, and the GPU becomes available in the render tab.

https://photos.app.goo.gl/KWZrSje7pjhv9Gpt5
Here is a video comparing the viewport render speeds of 1 vs. 2 3090s, with OptiX enabled and disabled.

Without OptiX, 2 3090s are twice as fast, and smoother.
With OptiX, using 2 GPUs is slower and chuggier.

EDIT: I made a mistake in the video; the OptiX clips are flipped. I switched the 1-card vs. 2-card clips.

The new build a966ed155046 cannot recognize my GPU device in OptiX mode.

It seems OptiX does not work; it actually renders on the CPU.

With the new daily build (23.7.2021), Cycles-X is about 3% faster for me. Thank you for that. (I’m testing daily.)

What I’m still missing: I’m not able to change the render sample count. Well, I can change it, but it has no effect until I save & reload the file.
