Blender 4.2 - EEVEE-Next Feedback

We felt the same way before. And sometimes when we hear people say “if this does not work for you, then go back to version blah blah”… we feel even more insulted LOL.

But stepping back, we realized that the Blender Foundation has kept most versions available so far. In that sense, if we have a particular scene that works well with, for example, version 2.8, there is no reason to migrate it anywhere else, because the 2.8 license does not expire (it’s not cloud-based stuff… unlike the recent Autodesk family we migrated from). So as long as 4.1 keeps working on our systems, we keep both 4.1 and 4.2 (or later, if we choose to bypass 4.2 completely). The only problem is when Windows gets updated… I think Blender dropped support for Win 7 some time ago. Then again, by the time something like that happens, Next will have improved so much that we may not even miss Legacy anymore.

We remember when EEVEE Legacy first came out… I have to admit we thought it was a joke, especially for some of our teammates coming from a game-engine background. But Legacy has improved a lot since then, and in fact our production is currently married to it.

7 Likes

Or at least delay the release until it reaches the level of stability and quality that is all but expected of an LTS release. It really feels like we are so close now, and the BF probably does not have the resources to copy Unity’s mess of maintaining similar-but-different rendering backends side by side indefinitely.

Again, the fact that this will be an LTS release has to be emphasized. It may be better not to take the high-risk move of being gung-ho about the release deadline; for starters, Blender is a FOSS project, and it does not have shareholders that will bolt over even a small delay.

2 Likes

The last build of Blender 4.2 crashes A LOT when using EEVEE Next with old EEVEE Legacy files.

I opened a bug report with the file attached.

4 Likes

@Emi_100 I experienced exactly this on my last animation… I confirm all this. Thanks for reporting the bug!

@Lawrence_Teng Blender still has 2.8 available, you say!?

I still keep this one... and some shortcuts are still the same!

2 Likes

Any issues and possible fixes you’ve encountered? I’m also dealing a lot with hair and SSS, and I’m figuring out a jump to Next as well.

  • If the hair shader uses translucency, a Thickness value of 0 in the hair’s Material Output gives the closest match in Next
  • Enabling Jittered Shadows on lights seems to improve shadows on hair
  • I simply use the best render and shadow settings (highest resolution, lowest resolution limit, highest precision, etc.). It makes a small difference, though I can’t say yet if it improves things enough. I keep samples at 64

Current issues:

  • Hair roots look harsh, due to the lack of Contact Shadows and the SSS skin no longer blending with the roots
  • Next’s noise is mostly ignored by the Denoise node (in Legacy I use it to remove noise in hair). Bug reported

Biggest one for me is the harsh hair roots. Any render, material or light settings that you found to improve this?

Visually, we found that an entire group of fur/hair in a single colour looks okay in Legacy, but in Next, because of the absence of blending, we have to stagger the colours. The idea is to use colours that are very close to each other in shade: different enough to avoid that single same-colour mat that looks very harsh in Next, but at the same time not giving the impression of dirty, uneven hair. Just be as subtle as you can get away with (it depends on lighting, etc.). Basically this gives the illusion of “blending”. We have two colours in our example below, but maybe one could use more than two… we have not tried that.
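As a rough illustration of the colour-staggering idea (not anyone’s actual shader setup; the base colour and the jitter amounts are made-up values), subtle variants around one base colour can be generated by nudging hue and value slightly:

```python
import colorsys

def stagger_colors(base_rgb, count=2, value_step=0.04, hue_step=0.008):
    """Return `count` subtle variants of base_rgb (components in 0..1).

    Each variant nudges value and hue a little, so the shades stay
    close enough to read as one coat of fur, but different enough to
    break up the flat single-colour look.
    """
    h, s, v = colorsys.rgb_to_hsv(*base_rgb)
    variants = []
    for i in range(count):
        offset = i - (count - 1) / 2          # centre the offsets around 0
        nh = (h + offset * hue_step) % 1.0    # tiny hue drift
        nv = min(max(v + offset * value_step, 0.0), 1.0)  # clamp value
        variants.append(colorsys.hsv_to_rgb(nh, s, nv))
    return variants

# Two close shades around a mid-brown:
for rgb in stagger_colors((0.35, 0.22, 0.12), count=2):
    print(tuple(round(c, 3) for c in rgb))
```

The step sizes are the knob: larger steps read as dirty, uneven hair; smaller steps disappear entirely, so tune them per lighting setup.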

Also, we reduced the diameter of the hair, but we have more of it. So we duplicate the hair/particle system and change the seed of the children; again, this staggers the effect so that the hair-root area looks “softer”. We find that duplicating the system and changing the children’s seed works better than simply using a higher count in a single hair/particle system, where Blender will crash or just refuse to render (that happens even in Legacy).
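A minimal sketch of the split described above, in plain Python (the totals and seed offsets are hypothetical; in Blender this corresponds to duplicating the particle system and giving each copy a distinct seed):

```python
def split_hair_systems(total_hairs, systems=4, base_seed=1):
    """Split one heavy particle system into several lighter ones.

    Instead of a single system with `total_hairs` strands (which can
    crash or refuse to render), return (count, seed) pairs for
    `systems` duplicated systems, each with a distinct seed so the
    strands are distributed differently and the root area reads softer.
    """
    per_system = total_hairs // systems
    remainder = total_hairs - per_system * systems
    plan = []
    for i in range(systems):
        # Spread any remainder across the first few systems.
        count = per_system + (1 if i < remainder else 0)
        plan.append((count, base_seed + i))
    return plan

# 100k hairs as four 25k systems with seeds 1..4:
print(split_hair_systems(100_000))
# → [(25000, 1), (25000, 2), (25000, 3), (25000, 4)]
```

The strand total stays the same; only the seeds differ, which is what staggers the root placement between the duplicated systems.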

It is not perfect, but this is what we needed to soften the root area. For characters that are entirely covered by fur, we did not have to worry, since the root/skin boundary is never visible. But some animals (e.g. monkeys, and I guess humans LOL) will have this boundary.

Because there are no contact shadows, the eyes of our characters look a bit washed out. The contact shadow in Legacy created a beautiful shaded effect on our characters’ irises, but in Next we have to pull a texture trick. We will see. The absence of contact shadows also means we “darken” our hair/fur colours a bit to make them stand out. We are not planning to jump to Next right away; 4.1 and Legacy are very good for us at the moment, so we are on standby when it comes to Next.

Hope this helps you.

1 Like

Thanks for sharing.
I’m playing around with duplicating the particle system with different seeds, to see how my work can benefit from it.

Here is my most basic setup for hair:

I realize now that I need to set Thickness to 0 because I use translucency.

In some cases I also use Alpha, as it makes hair look light and fuzzy without the need to lower the Hair Shape Diameter.
This does create some noise, so it’s important for me that the Denoise node in the Compositor works with Next (as it did with Legacy).

2 Likes

That’s a nice beefy hair shader setup you’ve got there. But then, that is to be expected, because your character has special hair that is unique. Ours is not that complex, and because a lot of our characters have fur, we try to keep our sanity LOL.

We find that reducing the hair diameter and replicating the particle system is effective, in combination with the mixed colours, at inducing an artificial “anti-aliasing” effect on the skin/hair area. We retrofitted this setup to our Legacy scenes and the effect looks even better, though it is very subtle because the blending is already there; for close-ups it will be nice.

I am not really technically knowledgeable about how particles are calculated, but replicating the particle systems seems to behave as if they were “instanced”. I may be way off, but it feels that way to us, because the render time with this method is a lot shorter than with a single particle set containing the same amount of hair. The good thing about replicating the particle systems is that Next did not seem to mind it at all; it did not take much longer to render (still longer than Legacy, but not earth-shattering). The same furry character on a simple white background renders about 5 to 15 seconds longer in Next. But the same character in one of our big backgrounds… Next seems to freak out and takes astronomically longer. We are probably not setting it up correctly.

1 Like

I assume you still put everything in a Node Group, for quick access to properties per character? It might also speed up the building of shaders, as there are fewer nodes to process (assuming all characters share the node group).

I fear I cannot use much thinner hair, as it doesn’t always fit a stylized look. So I’ll need a root fix without scaling.

In Next, I’m looking at an increase of 1-3 seconds in render time, depending on settings.
Though I’m talking about average render times below 10 seconds, on a laptop with an RTX 2060.

Legacy HD without Denoise node: ~2 seconds

The longest part is still building the shaders.
A first render can take up to a minute longer, and with Raytracing another 20-30 seconds. That’s why Blender seems frozen the first time you render in the viewport.

1 Like

While the new shadows look better in real time, they aren’t really accurate, especially at bigger light sizes. The old EEVEE is not perfect, but still a lot better. Is this something that’s going to be fixed? :slight_smile:
Look under the chin and ear, and also above the nose.

1 Like

You seem to be using Shadow Tracing, which is expected to give inaccurate results at a very high light radius. Turn on Jittered Shadows in the render settings and under the light to help with that.

1 Like

There is no visible difference whatsoever when enabling jittered shadows.

Did you only enable it in render settings or did you also enable it in the light settings (as in, the settings of the specific light you want to enable it on)?

Ah yes, there it is. I had an addon (Photographer) that hid some settings under the light. But yup, okay, so we still have to fiddle with settings to get EEVEE to look proper. Good to know!

It’s a very cute character. There are a lot of very talented artists here. So cool to see everyone’s work. Your character has these nice eyes. Did you notice any difference in the way Next vs Legacy renders the iris?

When testing new versions of Blender, I recommend enabling “Keep showing Blender Light panel” in Photographer’s Preferences > Light panel :slight_smile: New version is coming soon with EEVEE next support.

Eye textures are flat, so that won’t give issues. I’m currently upgrading the eyes to fake the iris depth with normal vectors, which shouldn’t give issues either.

Do you mean eyes that require refraction? Because that one definitely requires some special care.

The Material Nodes with the Mix Shader are also mentioned here:
https://developer.blender.org/docs/release_notes/4.2/eevee_migration/#:~:text=Setup%20to%20disable%20shadow%20per%20material

1 Like

Thanks for the comparison.
I’m attaching an example from my own stuff (to avoid NDA issues with our studio, etc.). The eyes have a concave iris area (just the traditional 3D eyeball shape), nothing unusual. We noticed that Next and Legacy render our eyes differently. The Legacy one seems more vivid, and there is a subtle shading effect, particularly on the upper edges of the eyeball/iris area (probably from the eyelash contact shadow, which Next does not have?). This is fixable with texture/shader settings of course; not the end of the universe.
[attached image: eevee_iris_eyes]

1 Like

In your image, Legacy has a more distinct shadow or AO at the bottom of the eye, near the eyelid.

Does the eye white have SSS? If yes, you might want to play with Thickness in the Material Output node.
Otherwise, there are enough other settings to play with… :sweat_smile:

1 Like

Yes we have SSS on the eyeball. Yeah we have a bit of playing around to do with the settings :scream_cat:

We have been so spoiled by Legacy that we have almost become negligent, even with scene optimization. Legacy was so forgiving and so fast. Now with Next, it’s like we can hear that the “schoolmaster is on his way”, so we are trying to pull our crap together again and behave like we do with our Cycles scenes.

2 Likes