Cycles and surface normals (normalmap shading)

I’ve been trying to get to the bottom of a problem that I (and, judging by forum posts and bug reports, others) face when using modified surface normals in materials. The resulting reports are usually something along the lines of “broken normal map shading” or “bump mapping looks off in Cycles but not in EEVEE”.

Here’s a list of normalmap-related issues and forum posts, just from a quick search:


#126904 - Cycles: Normal Maps appear "flat" with low resolution mesh - blender - Blender Projects
#131605 - Cycles normal shading issue in assets with baked normal maps - blender - Blender Projects
#74367 - Bump Map and Normal Map causes flat shading. - blender - Blender Projects
#95729 - cycles normal shading issue with normalmaps and heightmaps - blender - Blender Projects

Bad normalmap issue

Normal map artifacts in cycles but not eevee - Materials and Textures - Blender Artists Community
Blender use normal map to render strange shadows in cycles, but everything is normal in eevee - Lighting and Rendering - Blender Artists Community

I would like to start by explaining what I think is going on, and why I think this is an important problem to solve.

Since this touches upon a core aspect of cycles, I probably won’t be able to tackle the problem myself, but I might be able to do some experimentation in code to get closer to a solution.


Tangent space normal maps are a clever way to keep surface detail high and poly count low, while still allowing the geometric surface to be deformed. In a perfect dot(n, l) world they work nearly flawlessly. In a path-traced world, the trick can fall apart: issues arise when the sampled surface does not face the same direction as the geometric normal, and reflected rays can intersect the surface and create shading artifacts.
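To make the failure mode concrete, here’s a minimal standalone sketch (the names are mine, this is not Cycles code): a reflection direction can be perfectly valid with respect to the shading normal and still point below the geometric surface, which a simple dot(n, l) evaluation never notices but a path tracer has to trace.

```cpp
#include <cmath>

struct float3 { float x, y, z; };

static float dot(const float3 &a, const float3 &b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Reflect an incident direction `i` (pointing toward the surface)
// about the normalized normal `n`.
static float3 reflect(const float3 &i, const float3 &n) {
  const float d = 2.0f * dot(i, n);
  return {i.x - d * n.x, i.y - d * n.y, i.z - d * n.z};
}

// N  = shading normal after normal mapping
// Ng = geometric (true) normal of the triangle
// I  = direction from the shading point toward the viewer
// Returns true when the mirrored view ray re-enters the geometry,
// which is where the shading artifacts come from.
bool reflection_goes_below_surface(const float3 &I, const float3 &N, const float3 &Ng) {
  const float3 wi = reflect({-I.x, -I.y, -I.z}, N);
  return dot(wi, Ng) < 0.0f;
}
```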

Despite this, they are still crucial in many workflows, especially photogrammetry. If you are on a tight performance budget, a strongly decimated mesh with a heavy normalmap can go a long way.
I would like to argue that the current implementation is insufficient.


To demonstrate the issue, I have made the following example. It’s a high-poly mesh baked onto a decimated version of itself. A bit of an extreme example, and not best practice, but not unlike what might come out of a photoscan. The blend file can be found here.

The normal and position passes look clean. Yet the shading result looks wrong, and while this is currently by design, I think it would be worthwhile to reconsider that design so that it aligns better with real-world scenarios.

To demonstrate that this is not a hypothetical issue, here’s an example with a photogrammetry asset from Polyhaven:


(notice the lack of definition of surface details)

And from Quixel:


(notice the artifact-y stripes along the top)

The Cycles result suffers from two problems:
First, Bump Map Correction. This recently added toggle lets us disable that correction. With the option enabled, the applied normalmap looks wrong to me: the surface is only ever darkened, by normals pointing away from the light source. With the option disabled, the surface looks more natural, with highlights where the normals are bent towards the direction of the light.

This is only part of the solution though. The second problem, possibly shadow-terminator related, causes dark spots to show up where the vertex normals deviate strongly, as shown on the top of this object. It can be mitigated by splitting the mesh by edge and baking an object-space normal map instead. That is not a viable workflow for users in practice, but it shows what the result could look like.

I’m diving into the Cycles source to see if I can spin up a proof of concept for a fix : )

20 Likes

Yes, something has been quite wrong with Cycles since forever… One thing I noticed especially concerns the built-in textures like noise (which should get ‘perfect’ derivatives for shading from the texture).

The strange thing is this: when the texture is set to displacement (shading only, no geometry changes), it looks much better… I think the same is true for normal maps, but I haven’t tested it thoroughly:


1 Like

I found the discussion:

basically it’s a known issue…

About this issue with the Bump node: I think you should always set “Strength” to 1 and play with “Distance” to modulate the intensity.

I may be wrong but, to my understanding, the Bump node’s “Distance” is the equivalent of the Displacement node’s “Height”.

“Strength” looks like a mix factor between the shaded result without the normal map and the shaded result with it.
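Something like this is how I picture it, as a simplified standalone sketch (the function names are mine, not actual Cycles code):

```cpp
#include <cmath>

struct float3 { float x, y, z; };

static float3 normalize(const float3 &v) {
  const float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
  return {v.x / len, v.y / len, v.z / len};
}

// My understanding: "Distance" scales the height field *before* the
// perturbed normal is computed, while "Strength" merely blends between
// the original normal and the perturbed one afterwards.
float3 apply_bump_strength(const float3 &N, const float3 &N_perturbed, float strength) {
  return normalize({N.x + strength * (N_perturbed.x - N.x),
                    N.y + strength * (N_perturbed.y - N.y),
                    N.z + strength * (N_perturbed.z - N.z)});
}
```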
Let me know your thoughts.

1 Like

The setup you have can be largely mitigated by disabling the “Bump Map Correction” boolean in the material settings. See this:

I personally think that this option should be disabled by default rather than enabled, as I believe it is problematic in more cases than it is helpful.

However, this only works if the true normal ≈ vertex normal. When that is not the case, as highlighted in my example, this setting is not enough on its own.

Thanks for your investigations.

I just gave your blend file a go and Cycles renders it like this (I had to shift the HDR to around 152° to more closely match your rendering):

Now for the fun part. Octane renders it like this:

Very similar artefacts and facetting.

The Octane version I used is the OctaneBlender Addon 29.13 Beta Release in Blender 4.2.5

2 Likes

Now that’s interesting!

Very similar indeed. Can you see the normal pass in octane?

Sure. Looks different than in Cycles (object space vs. world space or screen space or whatever):

But your normal pass doesn’t look “clean” when you look closely. It’s just harder to see with the super-saturated red, green and blue colors, but there are some sort of dents running across the faces.
In the Octane normal shading pass they’re easier to spot, especially on the lower two sides of the cube.

1 Like

That’s true; due to the lack of bit depth there is some streaking and banding going on in the worst areas, but it’s minor.

In my previous tests I also tried Unity’s and Unreal Engine’s path tracers.
(both game engines have an experimental full path tracing mode, no rasterizing involved there)

Unity looks clean, Unreal shows the same artifacts.

2 Likes

Interesting results.

By the way, while further investigating the problem by giving a render with Houdini Karma a try, I found that most of the HDRs shipping with Blender (in the /datafiles/studiolights/world folder) are “damaged”, most likely because they were resized with a sharpening filter. This causes negative values around bright parts of the images, e.g. around the sun. It isn’t affecting the normals / bump problem, but it’s quite shocking nonetheless.
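If you want to check your own files, a small OpenImageIO scan is enough. This is an untested sketch, and the filename is just an example:

```cpp
#include <OpenImageIO/imageio.h>

#include <cstdio>
#include <vector>

int main() {
  // The path is just an example; point it at any of the studiolight HDRs.
  auto in = OIIO::ImageInput::open("sunrise.exr");
  if (!in) return 1;

  const OIIO::ImageSpec &spec = in->spec();
  std::vector<float> pixels((size_t)spec.width * spec.height * spec.nchannels);
  in->read_image(0, 0, 0, spec.nchannels, OIIO::TypeDesc::FLOAT, pixels.data());
  in->close();

  // Count samples below zero; a clean HDR environment should have none.
  size_t negative = 0;
  for (const float v : pixels)
    if (v < 0.0f) negative++;

  std::printf("%zu of %zu samples are negative\n", negative, pixels.size());
  return 0;
}
```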

In the “sunrise.exr” we’re currently using here, the area around the sun looks like this:

Many of the dark pixels are actually negative, which will very likely not improve any rendering. Closeup of the sun:

Most of the other HDRs have the same problem:



I also checked older Blender versions (3.6 etc.) and current main, and it looks like these files have had this issue since forever. I guess I will file a bug report…

EDIT: Done #132172 - Factory HDRs coming with Blender have artefacts / negative pixels - blender - Blender Projects

2 Likes

And it’s getting weirder. I now have two renderings, both with Houdini / Karma.

#1 with Karma XPU (i.e. CPU and GPU combined):


#2 with Karma CPU:


XPU looks clean; the exact same scene with CPU shows similar artefacts to Cycles / Octane / Unreal Engine.

EDIT: And finally a rendering with Houdini / Mantra:

EDIT #2: Blender 4.3 with Radeon ProRender

1 Like

In the following ticket, Alaska replies:

Yes, this is a known limitation. When sampling lights, we include how the light relates to the position and, more importantly, the normals of the mesh to help decide whether or not to sample the light.

The normals we use are the normals prior to any normal mapping. Which means in your cube scene, we use the “smooth normals”.

This causes issues for your scene, as the normals on the top of the cube “curve” around to the side of the cube, causing parts of the light to be considered “behind the surface”.

And it gives the lines to modify, but you’ll have to test them:

Just quickly adding to this, there are probably more places you need to modify than I listed in that comment. Those were just the main places that were having an impact.
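To picture the culling Alaska describes, here’s a minimal standalone sketch (hypothetical names, not the actual kernel code): light sampling consults the interpolated smooth normal, so a light below its hemisphere is skipped even when the normal-mapped normal faces it.

```cpp
#include <cmath>

struct float3 { float x, y, z; };

static float dot(const float3 &a, const float3 &b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

static float3 direction(const float3 &from, const float3 &to) {
  const float3 d = {to.x - from.x, to.y - from.y, to.z - from.z};
  const float len = std::sqrt(dot(d, d));
  return {d.x / len, d.y / len, d.z / len};
}

// P        = shading point, light_P = a point on the light
// N_smooth = interpolated vertex normal, *before* any normal mapping
// The light is skipped when it lies behind the smooth normal's
// hemisphere, even if the normal-mapped normal would receive it.
bool light_considered(const float3 &P, const float3 &N_smooth, const float3 &light_P) {
  const float3 L = direction(P, light_P);
  return dot(N_smooth, L) > 0.0f;
}
```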

1 Like

I vaguely recall having looked into this once as well. I don’t remember the details, but related problems occur with reflection / specular lighting and normal maps, where the normal is clamped to prevent bounced rays from going below the surface (I guess this is the ‘Bump Map Correction’ checkbox?). It’s hard to really solve in a ray tracer.

This is the “Maybe Ensure Valid Reflections” function.
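If I remember it right, the idea of that function is roughly the following. This is a deliberately crude standalone illustration of the concept; the actual Cycles code solves for the corrected normal analytically:

```cpp
#include <cmath>

struct float3 { float x, y, z; };

static float dot(const float3 &a, const float3 &b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

static float3 normalize(const float3 &v) {
  const float len = std::sqrt(dot(v, v));
  return {v.x / len, v.y / len, v.z / len};
}

static float3 reflect(const float3 &i, const float3 &n) {
  const float d = 2.0f * dot(i, n);
  return {i.x - d * n.x, i.y - d * n.y, i.z - d * n.z};
}

// Nudge the shading normal N toward the geometric normal Ng until the
// reflected view direction no longer points below the surface. Assumes
// the viewer is above the surface (dot(I, Ng) > 0), so the loop ends.
float3 clamp_shading_normal(const float3 &Ng, const float3 &I, float3 N) {
  float3 R = reflect({-I.x, -I.y, -I.z}, N);
  while (dot(R, Ng) < 0.0f) {
    N = normalize({0.9f * N.x + 0.1f * Ng.x,
                   0.9f * N.y + 0.1f * Ng.y,
                   0.9f * N.z + 0.1f * Ng.z});
    R = reflect({-I.x, -I.y, -I.z}, N);
  }
  return N;
}
```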

Very interesting to see the different results! Especially curious that Karma CPU looks so different from XPU. Mantra looks nice. ProRender too.

and

Thank you! I’m going to experiment a little bit; I have no experience with the Blender / Cycles codebase, so it will take a while.

It’s tricky, as I believe two things are to blame: Ensure Valid Reflections, and a second culprit as highlighted in the Git comment from Alaska.

Somebody else had a very similar complaint/observation to this thread.

Various parts of Cycles will use the geometric normals (or smoothed geometric normals, depending on settings), not the shader normals, to make decisions about various things. This leads to the issue observed.

A simple “fix” is to just overwrite these geometric normals with your shader normals. Here’s an example of this:

EEVEE (“Desired”) result:

Stock Cycles (Has the shading issue):

Overwritten geometric normals:

Code change:

--- a/intern/cycles/kernel/svm/closure.h
+++ b/intern/cycles/kernel/svm/closure.h
@@ -448,7 +448,9 @@ ccl_device
 
       if (bsdf) {
         bsdf->N = N;
+        sd->N = N;
+        sd->Ng = N;
         float roughness = param1;
 
         if (roughness == 0.0f) {

You might be wondering: if the fix is so simple (basically two lines of code), then why isn’t it done?

  • This specific code change only works for the diffuse BSDF, although it’s easy to expand to other BSDFs.
  • This method is extremely hacky and probably has some knock-on issues.
  • And most importantly, it doesn’t work if you have more than one shader. The Principled BSDF is made up of multiple shaders, meaning this method is incompatible with it.

I’ve had ideas on how to resolve this (instead of overwriting the geometric normals, just use the shader normals in the various areas that currently use geometric normals and lead to the observed issue), but I’ve always been concerned about introducing bias if different parts of the rendering don’t match up. An example is this bug report: #120119 - 4.1 Regression: Cycles: Area lamp artifacts - blender - Blender Projects

This bug occurred because the normals during next event estimation did not match the normals during forward path tracing, meaning any light that was large enough to be “commonly sampled via forward path tracing” ended up rendering incorrectly.
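In sketch form (hypothetical names, not the actual kernel code), the inconsistency looks like this; the two direct-light estimators that get combined consult different normals, so one can accept a sample the other rejects:

```cpp
struct float3 { float x, y, z; };

static float dot(const float3 &a, const float3 &b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Next event estimation culls the light using one normal...
bool nee_accepts(const float3 &N_used_for_nee, const float3 &L) {
  return dot(N_used_for_nee, L) > 0.0f;
}

// ...while forward (BSDF) sampling evaluates with another.
bool bsdf_accepts(const float3 &N_used_for_bsdf, const float3 &L) {
  return dot(N_used_for_bsdf, L) > 0.0f;
}

// Whenever the answers differ, one technique sees energy the other does
// not, the two estimators no longer agree, and the combined result is
// biased; that mismatch is the mechanism behind #120119.
bool estimators_disagree(const float3 &N_nee, const float3 &N_bsdf, const float3 &L) {
  return nee_accepts(N_nee, L) != bsdf_accepts(N_bsdf, L);
}
```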

4 Likes

Interesting. I guess the issue is that neither the geometric normal nor the vertex normal is a good stand-in for the normal we want.

The Principled BSDF consists of multiple shaders, but they share the same surface normal.
But indeed, if you blend multiple materials with different normalmaps, for example, then I see how that creates trouble.

Can you elaborate on why this only works for diffuse?

Simply because the code snippet I provided only overwrites the geometric normals with the shader normals in the diffuse shader’s setup code. It does not overwrite the geometric normals in any of the other shaders’ setup code.

But it’s as “simple” as copying those two lines of code into the other shaders’ setup code.
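For example (untested and hypothetical; the surrounding code is paraphrased), another closure’s setup block in the same file would get the same two lines:

```cpp
/* Sketch: the same overwrite as in the diffuse case, applied in another
 * closure's setup block in intern/cycles/kernel/svm/closure.h. */
if (bsdf) {
  bsdf->N = N;
  sd->N = N;  /* overwrite the shading normal */
  sd->Ng = N; /* overwrite the geometric normal */
  /* ... the rest of that closure's setup ... */
}
```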

Oh ok, I understand.