Blender 4.2 - EEVEE-Next Feedback

We have now extended the range to [1, 9] in Blender v4.2.

They are not expected to match. In the compositor, the biggest possible size is 9, and you go down from there by halving the effective bloom size; in EEVEE, the lowest possible size is 1, and you go up from there by doubling the effective bloom size. The maximum possible size for EEVEE depends on the image resolution and is logarithmic in nature. I feel the compositor’s approach is better in that particular case, because smaller sizes are less useful. But really, I think we should have a [0, 1] float for the size.
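To make the two parameterizations concrete, here is a rough sketch of the halving/doubling behavior described above. The base radii and the normalization are my own assumptions, purely for illustration, not the actual implementation:

```python
import math

def compositor_radius(size: int, image_width: int) -> float:
    """Compositor: 9 is the largest size; each step down halves the radius.
    The normalization (size 9 ~ half the image width) is a made-up example."""
    assert 1 <= size <= 9
    return (image_width / 2) / 2 ** (9 - size)

def eevee_radius(size: int, image_width: int) -> float:
    """EEVEE: 1 is the smallest size; each step up doubles the radius.
    The usable maximum is logarithmic in the image resolution."""
    max_size = int(math.log2(image_width))  # resolution-dependent cap
    return 2.0 ** min(size, max_size)
```

A [0, 1] float size would map onto either scheme by interpolating the exponent instead of stepping it in integer increments.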

Eventually, I would like to do something like you mentioned. So basically:

  • Remove the Mix parameter and replace it with a simple Strength input, whose range is from zero up.
  • Provide 3 outputs from the node:
    • Image: the input plus the glare.
    • Glare: the generated glare only. Should it be affected by the strength?
    • Highlights: the highlights only.
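In per-pixel terms, the proposal boils down to something like the sketch below. The `glare` and `highlights` inputs stand in for the node’s real internal highlight extraction and glare generation, which are not shown here:

```python
def glare_outputs(image, glare, highlights, strength):
    """Combine the proposed node outputs for one pixel value.

    `strength` replaces the old Mix parameter and ranges from zero up.
    `glare` and `highlights` stand in for the node's internal results.
    """
    return {
        "Image": image + strength * glare,  # the input plus the glare
        "Glare": glare,         # open question: scale by strength or not?
        "Highlights": highlights,
    }

out = glare_outputs(image=0.5, glare=0.2, highlights=0.8, strength=2.0)
```

Whether the Glare output should also be multiplied by `strength` is exactly the open question noted above.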

I wrote a more detailed task about the possible improvements to the Glare node.

9 Likes

Let me ask again how to generate half-alpha shadows without noise.
It seems jittered shadows won’t help, and the Render Method won’t help either.
Any suggestions or ways to denoise?


I don’t see why realism has to be mutually exclusive with creativity. I love how UE5 looks, and I can’t wait for EEVEE Next. I mainly do VFX and previz work in Blender, and EEVEE Next is exactly what I need to speed up my work.

I never suggested that it was.

Updating something like EEVEE to take advantage of new hardware features and to keep it maintainable for the future is a lot of work, and there are not many developers available to do it. Updating EEVEE while also keeping the old version doubles (or maybe even triples, because you now have to make sure two render engines that use the same structures don’t break each other) the work.

Please remember that Blender is created by real people, who put a lot of work and love into creating the best they can, not some nebulous ‘them’ who ‘poorly manage’ software.

10 Likes

Just to single out EEVEE from your comments: it is still under development. They are planning a separate NPR engine that will allow for more creativity.

Until they are finished with the respective engines, telling them they are incorrect or that they are intentionally making the software worse doesn’t help anyone. Developers know where their work is deficient.

6 Likes

I’ve seen this type of statement about the planned NPR engine a lot, but it is really nonsense. You can’t take away people’s arms by promising to give them robotic arms in the future.

Creating a completely new NPR engine only means it will take more time, not less. It’s not something that will be finished tomorrow; very likely it will take many years, and what do you expect artists to do in the meantime? Especially if something was doable in EEVEE Legacy and the ability is lost in EEVEE Next, that is simply a regression due to lost functionality, and there should not be any excuses.

2 Likes

EEVEE Next clearly has a focus on some of the great new tech that we can use in rendering, such as raytracing, global illumination, and vertex displacement. While these features help achieve photo-realistic art styles, I don’t think it takes away from NPR styles.

I understand your frustration here, but can you give examples of art styles that were possible in Legacy EEVEE that are no longer possible in EEVEE Next?

That was misphrased: the NPR engine isn’t meant to be a separate engine altogether, but rather an extra step between rendering and compositing.
Besides, what exactly was “lost” from Legacy to Next? The main one I can name is Shader to RGB refraction, but that’s only because they ran out of time to implement it for 4.2.
Even then, if there is an unimplemented feature that you absolutely rely on, with no workarounds, nothing stops you from using an older version until it is implemented.

1 Like

I am a Blender user, not a dev; I have no incentive to falsely promote the system. But Blender’s development can’t be finished the moment the devs want it finished. A user community with access to dev logs has to be realistic about the devs’ ability to match intent. This is an argument of expectations vs. reality. People are way too harsh on Blender without contributing code.

And yes, the NPR engine is supposed to be an intermediary between the final rendered image and the compositor, or something to that effect. I’m not exactly an expert, but they are working on an NPR engine. Naming semantics aside (whether it’s an engine, a render layer stage, or whatever), ‘NPR engine’ is the simplest vernacular to apply.

1 Like

It’s not simply the engine, but design, implementation, feedback, and revision. All of these take time, and you also risk changes of plans, priorities, and more, not to mention a different learning curve and a changed workflow on the user side.
Again, you can’t take away people’s arms by promising them robotic arms in the future. And as long as the NPR engine is not coming out right away, it must not be an excuse for the issues NPR artists are facing now.
It’s not the first time in development that things haven’t gone according to plan. You could have learned from that history.

There are lots of differences in various aspects; that’s why so much feedback, so many questions, and so many bug reports have been posted all over the place. Some of them have been fixed, solved, or responded to; others have not.

I also just asked a question about “noisy shadows when alpha is halfway between 0 and 1”. If you have solutions for it, please let me know.

2 Likes

Please see my comments above about the NPR “engine”.
It’s also ironic that you mention “expectation” vs. “reality”: the reality is that everything takes time, and I don’t believe in anything that is merely planned for the future; I only believe in what is already finished.

One thing I also want to remind you of is that the reason people are angry is not that features they were expecting are unimplemented, but that features that used to be available have been lost, for whatever reason.

Note that many issues have already been identified as bugs, some solved and some not. And there may be many more to come, with or without solutions, while the release date is very close.

Having already done a lot of testing, investigating, and reporting myself, I can totally understand people’s distrust of and panic about the development, which should have been given more time to actually mature instead of being rushed right around BCON3.

1 Like

This is the final thing I will say, as you are clearly just venting: I am not a developer, I am a Blender user. I am aware of things that may be considered as regressions. Being mad on the forums isn’t helping anything. Contribute code if you believe Blender needs to go a different direction. This isn’t the place to be angry. Go to Reddit, Discord, or Twitter and shout there.

As for feedback about EEVEE, we can continue from here, as per the thread title.

4 Likes

I am not the person who started this discussion. I simply want to point out that “planning an NPR engine in the future” should not be used as an excuse for current regressions. NPR artists deserve to have their voices heard and issues addressed now, not years later.

If you’re a user who is not directly involved in development, please allow the developers to handle the discussions without escalating the situation unnecessarily.

I want to clarify that I am not just venting. I have put in significant effort in investigating and reporting these issues, as you can see from my progress in this thread and in the bug reports. Your accusation is not constructive and only highlights a lack of understanding of the situation.

3 Likes

I would also like to know. Lowering the light’s resolution limit only goes so far.

On top of that, is it even possible to get colored shadows?

1 Like

(YouTube link)

Darkening occurs where objects intersect when the HDR light is turned on.
How can I visualize architecture with this obscuration where objects intersect?

P.S. Moderators, give me the opportunity to publish photos and videos.

I believe this is an inherent limitation of EEVEE being a raster engine and is not possible to “fix”, so the best you can do is increase the samples until it looks “good enough”, but at that point it might be better to just use Cycles. Maybe this could be solved with HWRT, but I am not sure if ray traced shadows are a part of the plan.

Then it is another regression from EEVEE Legacy that you underestimate.
Shadows from half-transparent objects can matter in both NPR and PBR.
Cycles may not be a good option due to workflow and performance. In fact, if you need to end up in Cycles anyway, what’s the point of upgrading from Legacy to EEVEE Next, when you could just keep using Legacy?
Increasing the sample count may not be an option either, as its performance will definitely not be comparable to EEVEE Legacy.

And so on and so forth.

1 Like

This is likely ambient occlusion obscuring your view. If it is impacting your ability to see your work, you can change some related settings or turn it off entirely.

To turn it off, you can either:

  1. Turn off EEVEE-Next’s ray tracing, as the ambient occlusion effect comes from the “Fast GI” fallback used by the ray tracing feature.
  2. Or increase the ray tracing Max Roughness to 1 so the “Fast GI” fallback is never used.

If you want to adjust settings to reduce its impact, then you can:

  1. Head to the EEVEE Render setting for ray tracing.
  2. Expand the Fast GI Approximation option.
  3. Adjust the Distance parameter.
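The same steps can be scripted. Below is a sketch using the Blender 4.2 Python API; the property names (`use_raytracing`, `trace_max_roughness`) are my best recollection of the RNA names, so verify them against the tooltips in your build before relying on this:

```python
import bpy

eevee = bpy.context.scene.eevee  # EEVEE-Next render settings

# Option 1: turn off ray tracing entirely, which also disables the
# "Fast GI" fallback that causes the darkening.
eevee.use_raytracing = False

# Option 2: keep ray tracing on, but raise Max Roughness to 1.0 so the
# "Fast GI" fallback is never used.
eevee.use_raytracing = True
eevee.ray_tracing_options.trace_max_roughness = 1.0

# To merely reduce the effect instead, adjust the Distance parameter under
# Render Properties > Ray Tracing > Fast GI Approximation (hover over the
# field to see its exact Python path in your build).
```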