2023-03-13 Eevee/Viewport Module Meeting

Practical Info

This is a weekly video chat meeting for planning and discussion of Blender Eevee/viewport module development. Any contributor (developer, UI/UX designer, writer, …) working on Eevee/viewport in Blender is welcome to join.

For users and other interested parties, we ask that you read the meeting notes instead so that the meeting can remain focused.

  • Google Meet
  • Next Meeting: March 20, 2023, 11:30 AM to 12:00 PM Amsterdam Time (Your local time: 2023-03-20T10:30:00Z to 2023-03-20T11:00:00Z)


Attendees

  • Clement
  • Jeroen
  • Omar
  • Michael
  • Miguel
  • Thomas

Planning 2023

  • Last week Clement visited Blender HQ for several planning sessions about what we will be doing in the upcoming year. A blog post will be written about this, but in short:
    • Focus on Eevee-next, porting existing code to the new code-base where possible.
    • Research how GI can be implemented in Eevee-next. There are many different algorithms that all have side effects.
    • Viewport compositor work will focus on support for render passes (both Eevee and Cycles). This requires changes to the RenderEngine API.
    • Focus on Vulkan Backend.
  • Eevee-next planning has been updated (https://projects.blender.org/blender/blender/issues/93220)
  • Michael and Jason also visited Blender HQ for some technical sessions and planning for the Metal Viewport. Focus will be on adding features that Eevee-next requires (SSBO, Indirect draw/compute), performance and handling bigger scenes.


  • Rotation API has landed.
  • Current volume algorithm has been ported to Workbench-next.
  • A design has been made for porting subsurface scattering to Eevee-next.
  • Work has been done on adding volume support back to Eevee-next. This is still a work in progress for the upcoming period.
  • There was a discussion about line-art and NPR rendering. The outcome is that line-art will use geometry nodes. A GPU implementation was also discussed, but it isn’t a priority and we will not spend much of our time on it.

Viewport compositor



Exciting stuff! For screen-space GI, be sure to check out Godot’s recent SDF-based one-click solution for real-time open-world GI. Especially because, as far as I know, it’s a technique they invented. They also have a higher-quality solution that can be applied to specific volumes, like cubemaps, and that also works in real time.


I hope Blender can implement DDGI for global illumination.


DDGI? No! RTGI is better: real-time and dynamic! Look at “GI-1.0” from AMD GPUOpen, an equivalent to Lumen and fully open source! Actually, the implementation chosen for the first release of Eevee-next GI is “Surfel GI”. Surfel GI was chosen for Forza Horizon 5 too! The technology of Forza Horizon 5: an Xbox Series X masterpiece | Eurogamer.net

Surfel GI is very promising; I hope to see the alpha in EEVEE-Next Blender soon!

“GI-1.0” (AMD GPUOpen), based on voxel GI. Docs: https://gpuopen.com/download/publications/GPUOpen2022_GI1_0.pdf


An interesting read about RTGI (“GI-1.0”, AMD GPUOpen).


I believe you’re all enthusiastic about what we are doing, but please stay on topic!

Discussing which GI is better, or details of the implementation, isn’t what this topic is about. Other people reading this topic can be misinformed by information that doesn’t originate from the actual engineers and developers working on the implementation. This isn’t efficient for development, and we (the development team) then need to spend time correctly informing people. Time that could have been spent on actual engineering and development.

There seems to be a resemblance between a GI solution and a religion. We are not and cannot be following a specific implementation. Our implementation is custom-tailored, as we have different requirements. We do get inspired by other implementations, but getting information from random news articles, whitepapers, and YouTube videos isn’t helpful at all. Trust us, we know them and have more insight due to our experience and connections with industry partners.

We don’t spend time discussing this, as our goal is to deliver a working open source GI solution, not to explain the pros and cons of all the hundreds of different algorithms out there. Especially as that doesn’t bring anything back to Blender.

If you want to learn more about our implementation, you’re free to study the source code and help out. If you’re not able to, please wait until we have something to show and for you to test. In the meantime, don’t confuse other people.