"Principled v2" feedback/discussion thread

Note that this is a single bounce of “pure white” (Illuminant E) light at zero roughness.
Deeper bounces will very quickly saturate, and roughness also deepens the color for basically the same reason (because the surface will sometimes reflect itself).


Here is copper and gold after 0–4 bounces (0 means you’re still at white light), using unpolarized light.


Copper seems to be more of a saturated brownish-red; could be the sample, who knows.

It’s gonna look more saturated when it’s lit darker, and also from bounce light.

Note that the saturation actually increases really fast. Already on the second bounce, gold is out of gamut at F0 (in terms of sRGB, the blue channel ought to be negative). For copper it takes five bounces to get there.
Here are some deeper bounces for copper and gold:


The swatches go from 0-32 bounces from bottom to top and from 0-90° in 1° increments from left to right. Here they are shown in AgX to avoid the spectral problems. But of course that means the colors aren’t quite as directly usable.
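The bounce-powering effect is easy to sketch in Python: each bounce multiplies the light by the surface’s per-channel reflectance, so white light after k bounces becomes F**k per channel, and the weakest channel collapses fastest. The linear-sRGB F0 for gold below is an illustrative assumption, not measured data.

```python
# Sketch: color of initially white light after k bounces off the same metal.
# gold_f0 is an assumed linear-sRGB F0, purely for illustration.

gold_f0 = (1.0, 0.78, 0.34)

def color_after_bounces(f0, k):
    """Per-channel reflectance after k successive bounces."""
    return tuple(c ** k for c in f0)

for k in range(5):
    r, g, b = color_after_bounces(gold_f0, k)
    print(f"bounce {k}: ({r:.3f}, {g:.3f}, {b:.3f})")
```

Since (F**k)**2 = F**(2k), squaring the swatch color is the same as doubling the bounce count, which is the point made further down the thread.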


Ok, it’s difficult to check whether this material appearance is correct, since we don’t have the sample or reference images. In my tests comparing your equation against the values from the refractive-index site I posted, the result seems correct.
I think the next step should be to compare the n/k materials with reference n/k materials from other render engines.

Edit: I have rechecked the Born and Wolf equation without squaring for R0; it gives the same result as your equation’s R0 output.
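For anyone following along, the Born and Wolf normal-incidence reflectance for a complex IOR n + ik can be sketched in a few lines; the two forms below are algebraically identical. The sample n/k values are rough assumptions for gold in the red, not vetted data.

```python
# Normal-incidence reflectance from a complex refractive index n + ik.
# Born & Wolf closed form:
def r0(n, k):
    return ((n - 1.0)**2 + k**2) / ((n + 1.0)**2 + k**2)

# Same thing via complex arithmetic, |(m - 1) / (m + 1)|^2:
def r0_complex(n, k):
    m = complex(n, k)
    return abs((m - 1) / (m + 1))**2

print(r0(0.2, 3.0))  # assumed gold-ish n/k in the red, illustration only
```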

I found the paper from Adobe (with the F82 math). Lukas posted it before, but his link is not working anymore.

I have built the Gulbrandsen grazing-equation node setup for the tint color. Seems to work decently as well.

Thanks, I fixed it.


The bottommost subtract and add nodes should use r, not the square root of r.


Yeah, it’s just what happens if you plug in 0 for the angle. Cosines become 1, sines become 0, so a bunch of stuff falls away automatically.

Squaring it will effectively amount to doubling the number of bounces.


Thanks for the paper link

Parameter values for real metals
In Table 1 we show parameter values for several common metals. To derive these, we started with tabulated spectral complex refractive indexes for each metal, attempting to use measured data with the best balance of recency and resolution within the visible spectrum [17]. Then, for each metal, using 1-nm increments, we calculated the reflectivity at 0°, 82°, and 90° using the real Fresnel equations and integrated these values with Illuminant D65 (to align with the white point of many common RGB color spaces) and the 2° physiologically relevant XYZ color-matching functions (those transformed from the 2° LMS cone fundamentals) [15]. We then converted the XYZ data into various color spaces, performing a few minor additional adjustments to deal with the limitations of the RGB conversions. Finally, we set the F0 parameter to the 0° reflectivity directly and set the F82-tint parameter to the ratio of our calculated reflectivity at 82° and the value predicted by the Schlick approximation at 82°.

[15] http://cvrl.ioo.ucl.ac.uk/
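The F0/F82-tint derivation the excerpt describes boils down to evaluating Schlick at 82° and dividing the “true” 82° reflectivity by it. A minimal sketch, where `f0` and `true_f82` are placeholder values standing in for the paper’s integrated spectral data:

```python
import math

# Schlick approximation: F(theta) = F0 + (1 - F0) * (1 - cos(theta))^5
def schlick(f0, cos_theta):
    return f0 + (1.0 - f0) * (1.0 - cos_theta)**5

cos82 = math.cos(math.radians(82.0))

f0 = 0.95        # assumed 0-degree reflectivity for one channel
true_f82 = 0.98  # assumed integrated reflectivity at 82 degrees (placeholder)

# F82 tint = true 82-degree reflectivity / Schlick's prediction at 82 degrees
f82_tint = true_f82 / schlick(f0, cos82)
print(f82_tint)
```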

Interesting

Steel, titanium, silver, gold n/k; Born and Wolf + Gulbrandsen tint; AgX.



Do you have a comparison with default 4.0 maybe? Edit: Nvm.

What are you talking about? This is default 4.0.

I suppose the HDRI lighting the models differently based on their position is a bit misleading (gold being lit green while silver is lit blue). Never mind then.

Rendered with 4.1.0 alpha. Since kram and I have posted the node setups, you can do your own tests. If something is still unclear, please ask.

Exactly. You can see the HDR used in the single silver rendering above. It has sky, trees, bushes, and grass. I wonder why such a question came up; it’s just reflections.
I deactivated the HDR for the four material objects because the background looked too distracting.

I think gold looks lit green because the blue sky is reflected yellowish by gold, gaining that green tint

No, that’s gold being lit teal (which is both green and blue in terms of channels), but since gold is so yellow (red and green in terms of channels), it turns green. Gold is extremely reflective in red and barely reflective in blue.

It’s possible that in reality the sky wouldn’t look as greenish in gold, but if there is a significant difference, it’s going to be one you’d need spectral rendering to capture.

I want to express my gratitude to the Cycles module team for their outstanding work on the shaders – bravo! However, I’ve noticed that the Cycles glass shader seems to be lacking in functionality. Unlike real-world glass properties, it often doesn’t allow light to pass through and requires workarounds to achieve desired results. I’m curious about the intended function of the Cycles glass shader and whether there are plans to address its limitations. I’m aware of the architectural glass toggle that @lukasstockner97 is working on, but the current glass node appears to be ineffective. Thank you!


in what way is it ineffective? Could you explain?

The default glass shader simply won’t allow light through it. It’s easily evident. I don’t know what else to explain. https://youtube.com/clip/Ugkxw_iUAdO5XuHnDJ_ECAQx-7vN2NErgJ8V?si=z3ptMguZNS4JdXzl

I saw that this morning. Cycles does let light through glass, it’s just that Cycles is a heavy ray tracer, and it doesn’t let as much through compared to other engines. Comparing it to LuxCore, which is developed to deal with such things, is a bit of an unfair comparison. Arnold, another really commonly used ray-tracing engine, has the same behavior as Cycles and lets in less light compared to other engines. It isn’t broken, it’s just the nature of the engine.

Architectural glass will do the hack in the video and pretty much eliminates that issue.

I acknowledge that it’s not broken, but the current behavior clearly falls short of the expected ideal, particularly in architectural visualization where there’s an expectation of light transmission through transparent mediums. No excuses should be made, given that this aligns with the inherent nature of glass objects.

We’re all quite aware of the workarounds and tricks, but that isn’t ideal IMHO.


@Roggii and @Bobo_The_Imp, I’m just going to clarify what’s happening with the glass.

Lights are primarily sampled in Cycles (and many other ray tracing render engines) through a process called “Next Event Estimation” (NEE). The way this works is that a ray will hit a surface (e.g. the ground) and decide whether it should do NEE or continue with random ray tracing. If it decides to do NEE, it then picks a light and fires a ray straight at it. If there is an object in the way that’s not transparent, that ray is considered “in shadow” for that light.

The issue with glass is that it’s not 100% transparent, so it gets classed as an obstacle and marks the ray as “in shadow” (unless you are using the architectural glass trick).

So the fix is simple right? Just don’t mark rays that hit the glass during NEE as “in shadow”? Not really.

This can be done in two main ways.

  1. Make it so when a NEE ray hits a piece of glass, make the glass transparent. This is basically the same as the architectural glass trick.
  2. Just don’t mark the ray as “in shadow” and keep the glass as glass. But then you encounter an issue: the distortion caused by the glass is likely to make the ray miss the light. So even with this technique, things aren’t going to improve much.
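A minimal sketch of the NEE decision described above. The scene, light, and occlusion queries here are hypothetical stand-ins, not the Cycles API:

```python
import random

def sample_direct_light(hit_point, lights, is_occluded):
    """Next Event Estimation: pick one light and fire a shadow ray at it."""
    light = random.choice(lights)
    if is_occluded(hit_point, light["position"]):
        return (0.0, 0.0, 0.0)  # any blocker (including glass) => "in shadow"
    return light["emission"]

def shade(hit_point, lights, is_occluded, trace_random_ray, rng=random.random):
    # Randomly choose between NEE and continuing the random walk.
    if rng() < 0.5:
        return sample_direct_light(hit_point, lights, is_occluded)
    return trace_random_ray(hit_point)
```

The glass problem is visible in `sample_direct_light`: a non-transparent blocker, glass included, makes the shadow ray return black, so direct light through glass is only ever found by the random-walk branch.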

So what can be done to improve it? A few things, quite a few of which are already in Cycles.


Make glass rough:

When a ray encounters perfectly smooth glass, the ray passes through it in accordance with Snell’s law and is likely to miss the light. But if the ray encounters a rough piece of glass, it can decide to trigger NEE off of that piece of glass, ensuring it hits a light. This can greatly help with the transmission of light through the glass.

Sharp vs Rough Glass

Smooth Glass:

Rough Glass:

Obviously increasing the roughness of the glass isn’t great for most scenes. For example windows, which are expected to be smooth.

So Cycles has a feature called “Filter Glossy”, which will increase the roughness of the glass when a ray hits it after previously hitting a diffuse surface. This allows the glass to remain smooth to the camera and certain material types, but appear rough when it’s needed the most. It works similarly (but not the same) as this setup:

This setting can be found under Render Settings -> Light Paths -> Caustics -> Filter Glossy.

Note: Increasing the Filter Glossy setting will decrease the accuracy of caustics through the glass, because the caustics are being rendered as if the glass were rough.
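A simplified sketch of the idea (Cycles’ actual heuristic is more involved, e.g. it blurs based on roughness accumulated along the whole path, but the gist is the same):

```python
def effective_roughness(material_roughness, filter_glossy, came_from_diffuse):
    """Bump up glass roughness only for rays that already bounced off a diffuse surface."""
    if came_from_diffuse:
        return max(material_roughness, filter_glossy)
    return material_roughness  # camera rays still see perfectly smooth glass

print(effective_roughness(0.0, 0.2, came_from_diffuse=False))  # 0.0
print(effective_roughness(0.0, 0.2, came_from_diffuse=True))   # 0.2
```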


Make it so NEE works with glass:

There is a technique to try and make NEE work with transmissive materials like glass. It’s called “Manifold Next Event Estimation”. The idea is simple. Do NEE, but take into consideration the distortion of the ray as it passes through the glass material.

Cycles has this feature. It’s known as “Shadow Caustics”.

Shadow Caustics vs Normal rendering

Normal rendering:

Shadow Caustics:

You will have noticed something: there’s a section of the “shadow” that is just black when it would be expected to be red. There are quite a few limitations with the Shadow Caustics feature, and one of them is causing that issue. I can’t remember the technical details on why this happens, but it’s a known issue, and so Manifold Next Event Estimation is usually only used for specific setups (e.g. rendering light passing through the cornea and onto the iris of an eyeball). Maybe it can be improved in the future?


Make it so the random rays are more likely to hit the light:

I mentioned earlier that when a ray hits a diffuse surface, it can decide to do NEE or continue tracing rays randomly. Well, the “continue randomly” method is how a lot of the light passing through the glass is actually sampled. So what if we improve this?

There are many ways of doing it, but I’ll only mention a few.

Learn which way the light is coming from:

If the ray tracer had some way to learn which directions the light was coming from, even when bouncing off and passing through other objects, then that learned data can be used to help guide rays to focus on those important regions.

Cycles has a feature for this: it’s called Path Guiding. Currently it can only be used on the CPU, but Intel is working on GPU support.

Normal vs Path Guiding

Normal Rendering:

Path Guiding:

Disclaimer: Path Guiding is not designed to solve caustics. It’s just that as a side effect of how it works, it can help with caustics solving.

Make the light easier to hit:

The main issue with randomly hitting a light through a piece of glass is that the light is so small from the perspective of the ground in the shadow of the glass that picking a random direction that actually hits the light is unlikely. So if we could make the light appear bigger to the ground, that would increase the chance it gets sampled. The main ways of doing this are increasing the size of the light and moving it closer to the ground object. Obviously this technique can only be applied in select situations.
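The size argument can be sketched with a bit of solid-angle math: a uniformly random direction hits a sphere light with probability proportional to the spherical cap the light subtends, which for small lights grows with the square of radius/distance. The radii and distance below are made-up example values:

```python
import math

def hit_probability(light_radius, distance):
    """Fraction of uniformly random hemisphere directions that hit a sphere light."""
    sin_theta = min(1.0, light_radius / distance)
    cos_theta = math.sqrt(1.0 - sin_theta**2)
    solid_angle = 2.0 * math.pi * (1.0 - cos_theta)  # spherical cap subtended
    return solid_angle / (2.0 * math.pi)             # relative to the hemisphere

print(hit_probability(0.01, 2.0))  # tiny light: almost never hit by chance
print(hit_probability(0.05, 2.0))  # 5x the radius: roughly 25x the hit chance
```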

Examples of the impacts of increasing light size

Small:

Medium:

Large:


And then we can combine some of these techniques together.
So here’s an example showing what a small increase in light size, path guiding, and a bit of filter glossy can do.

Normal rendering vs Adjusted rendering

Original:

Slightly bigger light, path guiding, and a little bit of filter glossy:

Note: This is what changed between the original scene and the modified scene:

  1. Light radius was increased from 0.01m to 0.05m.
  2. Filter glossy was increased from 0.0 to 0.2.
  3. Path Guiding was enabled.

Other techniques:

There are obviously other techniques that can be applied to help out with this situation. Or modifications can be made to existing techniques to make them better.

Reverse the ray tracing direction:

The main issue we have with sampling light through a piece of glass is that it’s unlikely that a ray will hit the ground, then randomly pick a direction that actually hits the light. But what if we did it the other way? We trace rays from the light to the ground?

Well, you get a similar issue. It’s unlikely that the ray will hit the ground and pick a random direction that hits the camera.

But if you combined “light to ground” with “camera to ground” ray tracing, you can get the benefits of both approaches.

This technique is known as “bi-directional path tracing”, or a modified form of it called “photon mapping” can be used. Cycles doesn’t support these techniques. I can’t remember if there was a specific reason they aren’t supported, or if it’s just a “we haven’t gotten around to it” thing. So you’ll need to wait for a comment from a Cycles developer about that.


And just a quick recap.

The glass is not broken. It does allow light through it. It’s just that depending on the scene setup and render settings, it can be hard for Cycles to sample light through the glass.

If your only desire was to sample the light more often, without using any tricks, then Path Guiding and bi-directional path tracing are the kinds of options you want to pick. Maybe increase the light size if you can.

But if you’re willing to lose some accuracy, you can adjust the filter glossy settings and use the architectural glass trick.
