Cycles Requests

That’s a technique for combining different sampling strategies, and it’s already in use all over Cycles. For instance here:
[image]
That’s what “Multiple Importance” does.
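For anyone wondering what that means in practice: multiple importance sampling weights each sample by how likely every strategy was to generate it, so neither light sampling nor BSDF sampling dominates the estimate. A minimal sketch of Veach’s power heuristic in illustrative Python (not actual Cycles code):

```python
def power_heuristic(pdf_a, pdf_b, beta=2.0):
    """MIS weight for a sample drawn with strategy A when strategy B
    could also have generated it (Veach's power heuristic)."""
    a, b = pdf_a ** beta, pdf_b ** beta
    return a / (a + b) if (a + b) > 0.0 else 0.0

def combine_direct_light(f_light, pdf_light, f_bsdf, pdf_bsdf):
    """Combine one light sample and one BSDF sample of the same
    direct-lighting integral without double counting."""
    result = 0.0
    if pdf_light > 0.0:
        result += (f_light / pdf_light) * power_heuristic(pdf_light, pdf_bsdf)
    if pdf_bsdf > 0.0:
        result += (f_bsdf / pdf_bsdf) * power_heuristic(pdf_bsdf, pdf_light)
    return result
```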


Hey, thanks for the useful reply. But I think bidirectional rendering can be used for more than just caustics and speed. For example, if I have a ball with subsurface scattering enabled and a light inside the ball, nothing changes on the outside; it looks like a normal ball. With bidirectional rendering, I think it could detect the light inside the ball and show the ball being illuminated from within. It would also enable physically accurate lens flares, which is always a plus (I know lens flares are caustics as well), but I think the uses of bidirectional rendering alone are a massive plus. I’m sorry if I sound annoying, since apparently this feature has been requested multiple times before, but I do think Blender would be better with it. Again, I’m sorry if there’s something I’m completely missing.


if I have a ball with subsurface scattering enabled and a light inside the ball, nothing changes on the outside

That’s a limitation of the shortcuts in the SSS algorithm, not of forward-only path tracing. Use a volume shader or give the ball thickness (e.g., with a Solidify modifier) and this will work fine in current Cycles.
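If it helps, here is a minimal bpy sketch of the “give the ball thickness” workaround; the object name and thickness are placeholders for your own scene:

```python
import bpy

# Placeholder object name; replace with the actual mesh in your scene.
ball = bpy.data.objects["Ball"]

# Give the shell real thickness so a light placed inside the ball actually
# scatters out through the surface instead of relying on the SSS shortcut.
solidify = ball.modifiers.new(name="Shell", type='SOLIDIFY')
solidify.thickness = 0.05  # in scene units; tune to the object's scale
```

A volume shader on the ball is the other route mentioned above, and it avoids adding any geometry.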

Bidir is almost never used in production rendering because it tends to cause more problems than it solves, and most of the problems it does solve are easily cheated around anyway (e.g., doing lens flares in post). Read Lukas’ post again, particularly this bit:

It’s not a question of how hard some algorithm is - I could easily implement a basic form of this, or SPPM, or VCM, or whatever in Cycles in a week or so. The problem is that Cycles is a production renderer, and that comes with expectations.

Everything we include has to work with live preview rendering, GPU rendering, ray visibility options, arbitrary surface materials, arbitrary light materials, volumetrics, DoF and motion blur, tiled rendering, denoising and so on - that’s not even close to the full list. Every single point I mentioned above is a complication/challenge when implementing VCM, for example.


Not to mention that getting robust bidirectional path tracing working on both CPU and GPU is really tricky.
So, given where Cycles is today, it’s probably not considered worth the effort.


One problem with BiDir rendering is that it breaks the ray-type-dependent materials that Cycles was designed to allow. It’s just not compatible with the core philosophies of Cycles, I’m afraid.

That said, I really would like a way to properly get caustics within Cycles. BiDir is effective for that but not compatible. I think there are other algorithms though.


Honestly I wouldn’t mind bidirectionality even if it means no ray type variables.

Unless I’m underestimating how often ray switching is used in shaders? To me it’s a trick to hide some objects, create special effects, or speed up renders, but I seldom use it.
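For context, “ray switching” here means shaders that branch on the Light Path node’s ray type, e.g. swapping in a cheap transparent BSDF on shadow rays. A hedged bpy sketch (the material name is a placeholder, and it assumes the default Material Output node exists):

```python
import bpy

# Placeholder material name; assumes it already uses nodes.
mat = bpy.data.materials["Glass"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

light_path = nodes.new("ShaderNodeLightPath")
glass = nodes.new("ShaderNodeBsdfGlass")
transparent = nodes.new("ShaderNodeBsdfTransparent")
mix = nodes.new("ShaderNodeMixShader")

# On shadow rays, use the transparent BSDF instead of glass:
# shadows stay cheap and clean, at the cost of physical accuracy.
links.new(light_path.outputs["Is Shadow Ray"], mix.inputs["Fac"])
links.new(glass.outputs["BSDF"], mix.inputs[1])
links.new(transparent.outputs["BSDF"], mix.inputs[2])
links.new(mix.outputs["Shader"], nodes["Material Output"].inputs["Surface"])
```

This per-ray-type branching is exactly what a light subpath traced from the lamp cannot evaluate consistently, which is why bidirectional methods clash with it.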


I did try using a Solidify modifier, but because of the size of the scene and the poly count, a Solidify modifier is a no-go. And using Cycles in an underwater environment with a lot of caustics produces a lot of noise (I know there’s a way to fake caustics in the shader, but it has a lot of limitations). Do you have any idea if this limitation in the SSS algorithm will be fixed? Maybe with something like a “random walk v2”, like the one in Maya; I don’t know exactly what the difference is, or whether it fixes the issue you mentioned with the current Random Walk SSS. Thanks for the response, by the way.

Cycles Random Walk SSS has recently been optimized:

https://developer.blender.org/D9932


FWIW, lately I use it quite a bit.

Shame. I would’ve loved to see Cycles support some sort of physically accurate caustic rendering, or any type of BiDir. I think it would make a really big impact on underwater scenes, and on Blender as a whole, for me.

Hi, maybe the developers are already aware of this (it’s open code from two years ago), but if not, here is something new that could perhaps be adapted; I really don’t know if that is possible. In any case, it is open-source research.

It is new open-source research by Jacopo Pantaleoni that could potentially be implemented in Cycles, a new contribution to the rendering world. Here is the paper:

Fermat is a high-performance, research-oriented, physically based rendering system, trying to produce beautiful pictures following the mathematician’s principle of least time.

https://nvlabs.github.io/fermat/index.html

[image: left, Fermat (an open-source research renderer); right, a path tracer]

Quote from Jacopo Pantaleoni:

only GPU - CPU BVH builds are abysmally slow and would never be fit for realtime graphics

the renderer is Fermat (an opensource research testbed), which uses OptiX just for BVH build and tracing rays, and schedules shading manually. Today it could also be rebuilt on DXR or VulkanRT.

Greetings.


Have you checked out the LuxCore render engine? It has a bidirectional engine, as well as a Path Tracing + Light Tracing method for caustics, which uses the CPU for the specular rays and the GPU for everything else, resulting in greater speeds than the bidirectional engine.
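To illustrate why tracing from the light helps with caustics: rays leave the lamp, refract through the specular interface, and are splatted wherever they land, so the bright focal bands appear without waiting for camera rays to find them by chance. A toy, purely illustrative 2D sketch (made-up geometry and constants, not LuxCore or Cycles code):

```python
import math
import random

# Toy 2D light tracer: rays start at a light above a wavy water surface,
# refract once, and are splatted where they hit the floor. The bright
# bands in the output are the caustic. All numbers are made up.
IOR = 1.33        # index of refraction of water
FLOOR_Y = -1.0    # floor depth below the surface (surface sits at y = 0)
N_RAYS = 100_000
N_BINS = 64
bins = [0] * N_BINS

def refract(dx, dy, nx, ny, eta):
    """2D Snell's law; returns None on total internal reflection."""
    cos_i = -(dx * nx + dy * ny)
    sin_t2 = eta * eta * (1.0 - cos_i * cos_i)
    if sin_t2 > 1.0:
        return None
    k = eta * cos_i - math.sqrt(1.0 - sin_t2)
    return eta * dx + k * nx, eta * dy + k * ny

for _ in range(N_RAYS):
    # Sample a downward ray from the light hitting the surface at x.
    x = random.uniform(-1.0, 1.0)
    slope = 0.3 * math.cos(6.0 * x)     # wavy surface: only the normal wobbles
    inv_len = 1.0 / math.hypot(slope, 1.0)
    nx, ny = -slope * inv_len, 1.0 * inv_len
    d = refract(0.0, -1.0, nx, ny, 1.0 / IOR)
    if d is None:
        continue
    dx, dy = d
    # Intersect the refracted ray with the floor and splat the hit point.
    t = FLOOR_Y / dy
    hit_x = x + t * dx
    b = int((hit_x + 1.5) / 3.0 * N_BINS)
    if 0 <= b < N_BINS:
        bins[b] += 1

# Crude ASCII visualisation: denser characters = brighter caustic bands.
peak = max(max(bins), 1)
ramp = " .:-=+*#%@"
print("".join(ramp[min(9, v * 9 // peak)] for v in bins))
```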


I think RenderMan is coming out soon, so it would be great to give it a shot, since it apparently supports everything we discussed.

I saw that paper two days ago, but I missed the video. Frankly, it looks too good to be true, as usual.
It reminds me of the first Brigade demos I saw in the past.
One promising thing is this quote from the abstract:

[…] and enhancing its robustness to complex lighting scenarios. The core algorithms are highly scalable and low-overhead, requiring only minor modifications to an existing path tracer.

I really hope that this time we can have some sort of implementation.
Too many times the hype (read: drool) has been frustrated by some fair, technical explanation of why “we just can’t put this into Cycles”.
The BF is looking for a skilled Cycles developer, so who knows…
Let’s all cross our Blender fingers!


Unfortunately, being GPU-only seems like a disqualifying factor.

There you go… that’s me right now :unamused:

Everything that runs on the GPU can be implemented to run on the CPU as well; it just might not have the same performance benefits.


There is hope in the universe!
OK, that makes sense, since AFAIK GPU code is sort of a subset of CPU instructions.


Hey, I’ve been running into a specific issue with Blender as a whole when previewing renders in the viewport in Cycles. When I want to see how detailed my final image is, I have to run a render, only to find out that I didn’t have enough resolution to capture the details, and then I have to bump up the resolution in the render settings and re-render. It would be good to have an option like Maya or C4D have, where the render preview can either be a separate window or be embedded directly in the viewport.

Hi, maybe I misunderstood, but you can set the viewport sample count to 0 to render forever until it is clean, or raise the sample count after the render stops.
That does not restart the render: for example, set samples to 16, start the viewport render, then change it to 32, and it continues from sample 16.

Cheers, mib
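For reference, the same settings can also be changed from Python; a minimal sketch, assuming the property names used in recent Blender releases:

```python
import bpy

scene = bpy.context.scene
scene.cycles.preview_samples = 0  # 0 = viewport keeps refining until you stop it
scene.cycles.samples = 512        # sample count used for the final render
```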