This is a weekly video chat meeting for planning and discussion of Blender Viewport & EEVEE module development. Any contributor (developer, UI/UX designer, writer, …) working on Viewport & EEVEE in Blender is welcome to join.
For users and other interested parties, we ask that you read the meeting notes instead so that the meeting can remain focused.
Next Meeting: 27 January 2025, 11:30 AM to 12:00 PM Amsterdam Time (Your local time: 2025-01-27T10:30:00Z → 2025-01-27T11:00:00Z)
Attendees
Clement
James
Jeroen
Miguel
Fixing bugs
Last month a large set of selection and overlay bugs was solved. A few remain that need input from Clement to decide on the best way to solve them.
There was also a discussion about guarding against misuse of the draw manager API. Sometimes extraction or drawing fails because the input mesh isn't consistent. Guarding against this would surface the issue during development, resulting in better triaging and a clearer understanding of what is and isn't supported. This needs more time to flesh out how it could be added without adding runtime overhead; asserts are one option that comes to mind.
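A minimal sketch of the assert idea, assuming a hypothetical validation entry point (the names `Mesh`, `mesh_is_consistent`, and `drw_mesh_extract` are illustrative, not the actual draw manager API): validate the input once at submission time so inconsistent data asserts in debug builds instead of failing deep inside extraction, while release builds pay no runtime cost.

```cpp
/* Hypothetical sketch, not Blender code: guard draw manager input with
 * asserts that compile away in release builds (NDEBUG). */
#include <cassert>
#include <cstddef>

struct Mesh {
  std::size_t verts_num;
  std::size_t corners_num;
  const int *corner_verts; /* Maps each corner to a vertex index. */
};

static bool mesh_is_consistent(const Mesh &mesh)
{
  if (mesh.corners_num > 0 && mesh.corner_verts == nullptr) {
    return false;
  }
  for (std::size_t i = 0; i < mesh.corners_num; i++) {
    /* Every corner must reference a valid vertex. */
    if (mesh.corner_verts[i] < 0 ||
        std::size_t(mesh.corner_verts[i]) >= mesh.verts_num)
    {
      return false;
    }
  }
  return true;
}

void drw_mesh_extract(const Mesh &mesh)
{
  /* Fails loudly during development; zero overhead in release builds. */
  assert(mesh_is_consistent(mesh));
  /* ... extraction would happen here ... */
}
```

The design choice is that validation lives at the API boundary, so a triage immediately shows *which* input was inconsistent rather than a crash somewhere inside extraction.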
A regression test is failing when running on the buildbot. The cause is that the buildbot runs macOS 13, which doesn't support atomic textures and falls back to device atomics. However, fixing the issue now makes the command buffer time out. Command buffer timeouts have happened on the buildbot in the past, but couldn't be reproduced on our development systems.
Are there any plans to tackle shader compilation times? In my daily usage, I often wait over 10 seconds for a shader to compile, even though some shaders are quite simple, like the Principled Shader.
Obviously these issues don't arise with simple scenes, but I do get these long compilation times in production work. Here is a video that demonstrates the issue: compilation takes 10+ seconds. During this time only one core is working on the shader compilation, even though my settings are set to use 22 cores, and I have an i9-12900 + 3080 Ti, which performs very similarly to the 3090.
I also have a question. Cycles has a feature called Persistent Data that allows skipping the re-loading of textures for subsequent frames after the first one. Do you plan to implement that for Eevee? It is a different issue from the previous post, although similar. Texture loading in one of my scenes takes around 40 seconds every frame, where the textures are about 8 GB in total, while rendering takes about 2 seconds.
See the Projects 2025 blog post about shader compilation performance. Note that there is already a lot of work being done on this topic (parallel shader compilation, the Vulkan backend, etc.).
Texture loading is a known issue. In your case the textures should already be in CPU memory. To reduce errors when rendering many frames, we re-upload them to the GPU; otherwise memory leaks could happen due to fragmentation. Vulkan/Metal likely offer more control over texture memory, but it is unclear how much work that would be and how much it would benefit render times.
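The trade-off above can be sketched with a hypothetical persistent texture cache (the types `GPUTexture` and `TextureCache` are illustrative, not Blender's actual GPU API): later frames reuse the cached handle instead of re-uploading, which is where the 40-second-per-frame cost would be saved, but the cache must be invalidated whenever a source image changes, which is exactly the kind of bookkeeping the per-frame re-upload avoids.

```cpp
/* Hypothetical sketch, not Blender code: cache GPU textures across
 * frames, keyed by the source image, so subsequent frames skip the
 * upload. The cost is manual invalidation and long-lived GPU memory. */
#include <cstddef>
#include <cstdint>
#include <unordered_map>

struct GPUTexture {
  std::uint64_t id; /* Opaque handle standing in for a real GPU texture. */
};

class TextureCache {
  std::unordered_map<const void *, GPUTexture> cache_;
  std::uint64_t next_id_ = 1;

 public:
  /* Upload only on first use; later frames reuse the cached handle. */
  GPUTexture ensure(const void *image)
  {
    auto it = cache_.find(image);
    if (it != cache_.end()) {
      return it->second; /* Cache hit: no re-upload. */
    }
    GPUTexture tex{next_id_++}; /* Stands in for the real GPU upload. */
    cache_.emplace(image, tex);
    return tex;
  }

  /* Must be called when an image changes, or renders use stale data. */
  void invalidate(const void *image)
  {
    cache_.erase(image);
  }

  std::size_t size() const
  {
    return cache_.size();
  }
};
```

Usage would be one `ensure()` per texture per frame: the first frame pays the upload, every later frame is a lookup.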
Any plans to have a viewport render preview for Eevee Next?
Hint: if camera jitter is impossible to preview, this dramatically changes how any shot in which a foreground object occludes things in the main scene will look.
Very specifically, if I place an object in the foreground that strategically hides things behind it, and the preview looks very different from the render, I will see something I wasn't meant to see (far more often than the opposite case of not seeing what you were meant to see, though I suppose that is also possible).
But my question is generic: any plans to have a viewport preview that’s accurate?
Not planned. It's also not clear how usable it would be, as adding camera jittering in the viewport has big drawbacks: overlays, Grease Pencil, and add-ons are not aware of it and would just mess up the viewport.
I am using Eevee with an Nvidia 1050 Ti, and it can be really slow with procedural noise bump.
Would love to see a global Bump switch like the one we have for Raytracing, for example.
That way we would be able to uncheck Bump in the viewport and turn it on only before the final render.
Right now, you need to comb through all the materials to unplug all the bump nodes, which is not ideal in my humble opinion.
Or do we already have something like this in Blender?