Real-time Compositor: Feedback and discussion

What are the next nodes planned for implementation? Inpaint, Z compare, and the keyer node would all be great.

Last week I was working on the Fast Gaussian mode of blur, which is needed to implement the Glare node. Unfortunately I hit some roadblocks along the way, so it is still not finished.
For next week, I will probably look into other modes of the Glare node as well as variable-size blurring and reduction operations, and maybe the Inpaint node as well.
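
For anyone curious what a fast Gaussian approximation can look like in general terms: a classic trick is to run a cheap box blur several times, since repeated box filters converge towards a Gaussian. Below is a minimal 1D NumPy sketch of that idea; it is only an illustration of the general technique, not the actual implementation used in the compositor.

```python
import numpy as np

def box_blur_1d(signal, radius):
    """One box-blur pass using a running sum, O(n) regardless of radius."""
    window = 2 * radius + 1
    padded = np.pad(signal, radius, mode="edge")
    csum = np.cumsum(np.insert(padded, 0, 0.0))
    return (csum[window:] - csum[:-window]) / window

def fast_gaussian_1d(signal, sigma, passes=3):
    """Approximate a Gaussian blur of the given sigma by repeated box blurs."""
    # Pick a box radius so that `passes` box filters roughly add up to the
    # requested variance (the variance of one box of width w is about w^2 / 12).
    radius = max(1, int(round(np.sqrt(3.0 * sigma * sigma / passes))))
    result = signal.astype(np.float64)
    for _ in range(passes):
        result = box_blur_1d(result, radius)
    return result

# Blurring an impulse approaches a Gaussian bell curve.
impulse = np.zeros(65)
impulse[32] = 1.0
blurred = fast_gaussian_1d(impulse, sigma=4.0)
print(blurred.argmax(), round(blurred.sum(), 3))  # peak stays at 32, energy is preserved
```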


Thank you so much for all you've done, @OmarEmaraDev.

I had a feeling glare would be a tough one! It's one of the most visually appealing compositing nodes, which led me to believe there would be an added level of sophistication in getting it up and running. Really hoping it can be figured out! Fog Glow is one of my most used compositing effects, and being able to see it live while working on a project will be so valuable!

You really are paving the way for the next generation of artists. Gone are the days of waiting minutes to see a render with compositing effects. You are pushing real-time compositing to the next level and making Blender so much more reactive and responsive! I really appreciate what you've done so far, and look forward to what you get to next!


Doesn't EEVEE's bloom effect use a Gaussian blur to achieve that?

No, EEVEE uses a special algorithm that roughly combines increasingly lower-resolution smoothed versions of the image to generate its bloom. So it is a lower-quality bloom, but one that can be computed very fast.
Part of the real-time compositor project would be to extend the modes in the Glare node to include faster methods like the one used by EEVEE. But this will likely be done after we are done with the existing ones.
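
To give a rough idea of how that kind of bloom works: the over-bright parts of the image are repeatedly downsampled (which also smooths them), and the increasingly lower-resolution copies are upsampled and added back together, giving a wide, soft glow far cheaper than a large Gaussian blur. Below is a small NumPy sketch of that general idea; it is a toy illustration, not EEVEE's actual implementation, and the threshold/strength parameters are made up for the example.

```python
import numpy as np

def downsample(img):
    """Halve the resolution by averaging 2x2 blocks (downsampling also smooths)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def upsample(img, shape):
    """Nearest-neighbour upsample back to the full resolution."""
    rows = np.arange(shape[0]) * img.shape[0] // shape[0]
    cols = np.arange(shape[1]) * img.shape[1] // shape[1]
    return img[np.ix_(rows, cols)]

def pyramid_bloom(luminance, levels=5, threshold=1.0, strength=0.05):
    """Sum increasingly lower-resolution copies of the over-bright areas."""
    bright = np.maximum(luminance - threshold, 0.0)  # keep only over-threshold light
    mips, level = [], bright
    for _ in range(levels):
        level = downsample(level)
        mips.append(level)
    glow = np.zeros_like(luminance)
    for mip in mips:  # every level adds a wider, softer halo
        glow += upsample(mip, luminance.shape)
    return luminance + strength * glow

# A single very bright pixel spreads into a soft halo around it.
image = np.zeros((64, 64))
image[32, 32] = 50.0
print(pyramid_bloom(image)[32, 40])  # non-zero: the glow reaches neighbouring pixels
```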


Love it so far! I'm just concerned that I don't have my luma, chroma, and vector scopes in real time from the viewport, which makes color grading in the viewport a little redundant, since I will have to render to use those scopes anyway.

Will the scopes be ported to the 3D View too?


It's not as practical, but you can use the False Color view transform instead of Filmic to check whether you are overexposed. But yeah, scopes really do need to work in the viewport.

I haven't considered this, to be honest, because I am not familiar with the workflow of using scopes in color grading. But since you brought it up, we will definitely look into it, though not very soon, as our priorities are elsewhere right now.

How do you use scopes in a normal compositing session? Do you open an Image Editor showing the output image of the Viewer node?


Yes, viewer node to image editor preview.

This is key for all renders: the luma scope for white and black balance that holds up on any screen, the chroma scope for color consistency between shots, and the vectorscope for color weight and finer balance, keeping renders consistent from shot to shot. All of these are essential for rendering and color grading, which is a key feature of the compositor.
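
For anyone unfamiliar with these tools: a luma waveform, for instance, plots the luminance of every pixel against its horizontal position in the frame, so crushed blacks or clipped whites are obvious no matter what display you are on. Here is a minimal NumPy sketch of how such a scope can be computed from an RGB buffer, purely to illustrate the concept rather than how Blender's image editor builds its scopes:

```python
import numpy as np

def luma_waveform(rgb, bins=256):
    """Build a luma waveform: one luminance histogram per image column.

    rgb  : float array of shape (height, width, 3), values roughly in 0..1
    bins : vertical resolution of the scope

    Returns an array of shape (bins, width); row 0 is black, the last row is
    white, and larger values mean more pixels of that column at that level.
    """
    # Rec. 709 luma weights (an assumption; other weightings exist).
    luma = rgb[..., 0] * 0.2126 + rgb[..., 1] * 0.7152 + rgb[..., 2] * 0.0722
    levels = np.clip((luma * (bins - 1)).round().astype(int), 0, bins - 1)
    height, width = luma.shape
    waveform = np.zeros((bins, width), dtype=np.int64)
    columns = np.broadcast_to(np.arange(width), (height, width))
    np.add.at(waveform, (levels, columns), 1)  # accumulate counts per (level, column)
    return waveform

# A horizontal gradient shows up as a diagonal line on the scope,
# and every pixel of the image is accounted for exactly once.
gradient = np.linspace(0.0, 1.0, 128)[None, :, None].repeat(64, axis=0).repeat(3, axis=2)
scope = luma_waveform(gradient)
print(scope.shape, scope.sum())  # (256, 128) 8192
```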


Yes PLEASE! Having live scopes for image-level assessment would be truly groundbreaking. It's very annoying to push renders to a viewer window just for scope access or even colour-picker spot checks.

Also, being able to have multiple previews open at the same time to compare scopes would be very helpful; currently you have to render an image to have multiple open at once.


And while we are speaking of that, a real-time pixel color check (like the right-click sampling in the Image Editor) would be awesome too.


Awesome progress!
@OmarEmaraDev Regarding the histogram/scope/pixel checker, do you remember my suggestion during your GSoC? IMO that could be an exciting aspect of the viewport compositor (especially with the new geo-nodes Viewer node).
(I am talking about the "numerical viewer node" suggestion; ignore the first one…)

I remember. :slight_smile: But I am not sure if it is even feasible at the UX level, because sampling of the blocks will not work for any reasonably detailed image and would be rather hard to control even for uniform images. But that's something for another time.

Of course, the idea is a little raw, but I am not sure how exactly the compositor relates to the window's screen pixels, so I didn't quite get your reasoning.

In the meantime, I made a geo-nodes (float) viewer.

Maybe for Blender 4.2 :laughing:


I'm working with some 1920x1080 footage in a PNG image sequence. When I connect the footage directly to the Composite node, it extends past the viewport (which is understandable on a 1920x1200 monitor).

Is there a way to scale the compositor image output down to match the Layout viewport's view rectangle? I tried the Scale node with yesterday's build, but it didn't affect the footage display.


I'm not sure why the Scale node doesn't work for your case. Are you using the Render Size mode? Can you demonstrate the issue with a screenshot?

Sure. Here's a 1920x1080 image being routed through a Scale node set to Absolute, and resized to 1280x720. However, the image still extends off the viewport edges.

That seems expected; what you probably really need is to set the scale mode to Render Size with Fit sizing.
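
For reference, this is roughly how that setup looks when done from Python. Treat it as a sketch: the node type names and the enum identifiers ("RENDER_SIZE", "FIT") are what I expect the Python API to expose, so double-check them against your build, and reuse your existing nodes rather than creating new ones if you already have a tree set up.

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes = scene.node_tree.nodes
links = scene.node_tree.links

# Minimal tree: image -> scale -> composite.
image_node = nodes.new("CompositorNodeImage")
scale_node = nodes.new("CompositorNodeScale")
composite_node = nodes.new("CompositorNodeComposite")

# "Render Size" scales the input to the scene's render resolution,
# and "Fit" letterboxes it instead of stretching or cropping.
scale_node.space = "RENDER_SIZE"
scale_node.frame_method = "FIT"

links.new(image_node.outputs["Image"], scale_node.inputs["Image"])
links.new(scale_node.outputs["Image"], composite_node.inputs["Image"])
```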

Hmm. Doesn't seem to affect the output image size.


It seems to work for me. Can you open a bug report?
