Real-time Compositor: Feedback and discussion

Glad to finally have a way to get proper viewport glow in Cycles.

Currently, Fog Glow produces radically different results between the CPU and GPU compositors with the same settings. Is there a way to use the GPU compositor for the final render? I cannot find one.
I have also heard that the Eevee-style bloom being used for Fog Glow is a temporary solution, and that it will be moved to its own node once a better solution is found. Is that true?


@OmarEmaraDev I just saw your weekly update and saw that you implemented the approximated fog glow using my Eevee bloom idea!!! I am so thankful, man! We finally have a temporary fog glow implementation that works in the real-time compositor! You are my hero!!! You are going to save a lot of artists with this one!

@etti Yes, it is a temporary solution! I thought it would be better to have an inaccurate fog glow that we can use in real time for now, rather than omitting fog glow from the real-time compositor entirely and waiting a long time for an accurate FFT-based version. This way artists still have a real-time way to see glow until the FFT implementation is finished. :slight_smile:

I was definitely begging for this feature up above, so I'm so happy @OmarEmaraDev was able to take some time to break it down and make it work! Even if it's just a temporary solution, it's so much better than no fog glow at all in the real-time compositor! :slight_smile:


We can take the position of a fragment, transform it into an object's local space, and derive a texture coordinate from its X/Y to, for instance, draw a widget on top of the offscreen buffer.

local = matrix.inverted() @ fragment_pos
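Expanded a bit, here is a minimal sketch of that idea using Blender's mathutils; the matrix, the fragment position, and the UV derivation below are illustrative placeholders, not an actual compositor API.

from mathutils import Matrix, Vector

object_matrix = Matrix.Identity(4)        # world matrix of the widget's object (placeholder)
fragment_pos = Vector((0.25, -0.1, 0.0))  # world-space position of the fragment (placeholder)

# Bring the fragment into the object's local space and use its X/Y as a texture coordinate.
local = object_matrix.inverted() @ fragment_pos
uv = (local.x, local.y)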

We can do something similar in 2D without rotation, too, for things like cursors or UI elements composited over the rendering:

// blend the sampled element over the main image based on distance from its screen position
vec2 diff = texCoord - element_pos_in_screen_space;
float clip_d = clamp(length(diff) * multi, 0.0, 1.0);
vec4 element_sample = texture(element_buffer, diff);
fragColor = mix(main, element_sample, clip_d);


Using the GPU compositor for final render is not possible at the moment, but is planned for the next milestone. We did discuss the possibility of including the bloom-like implementation as a separate mode or node, but there is no concrete plan at the moment.

I would suggest you try running the bloom output through an exponential function to simulate a physically based lenticular halo PSF; that way you get an output that is a bit closer to the CPU compositor. An easy way to do this is to set Mix to 1, use an RGB Curves node, and add the result to the input.
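For anyone who wants to script that suggestion, here is a rough bpy sketch of the node chain. The node and socket names assume a Blender 3.x compositor tree and are not taken from the post above; the exponential shaping itself still has to be drawn on the RGB Curves node.

import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes = scene.node_tree.nodes
links = scene.node_tree.links

render = nodes.new("CompositorNodeRLayers")

glare = nodes.new("CompositorNodeGlare")
glare.glare_type = 'FOG_GLOW'
glare.mix = 1.0  # output only the glow so it can be shaped and re-added

curves = nodes.new("CompositorNodeCurveRGB")
# Shape the combined curve toward an exponential response by hand
# (or via curves.mapping.curves[3].points) before rendering.

add = nodes.new("CompositorNodeMixRGB")
add.blend_type = 'ADD'

composite = nodes.new("CompositorNodeComposite")

links.new(render.outputs["Image"], glare.inputs["Image"])
links.new(glare.outputs["Image"], curves.inputs["Image"])
links.new(render.outputs["Image"], add.inputs[1])   # original image
links.new(curves.outputs["Image"], add.inputs[2])   # shaped glow
links.new(add.outputs["Image"], composite.inputs["Image"])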


Maybe I'm misunderstanding something, but it looks like the Map UV node doesn't respect the passepartout?

edit: the Displace node too?

There is an issue with colors changing when disabling a Hue Correct node (mostly in oranges and reds) even though there are no adjustments on the node. It seems to be caused by my mixer node group, which is just Combine Color and Separate Color nodes. Can anyone else reproduce it? The green frame on the far right is where the nodes are; just plug any image into the Scale node on the left.

photonode.blend

Question:
Now that the compositor editor aims to be a realtime experience,
how will we handle heavy-to-compute image processing functions?

Let's say someone wants to implement an AI upscale node in the compositor. Would such an implementation get rejected because it can't be made into a real-time experience?

An upscaler in the viewport doesn't make much sense, but your question is still valid for any other heavy computation that would have to be delivered in real time.


Thanks for letting me know; my main annoyance was just that I'd need to fiddle around with the settings after already rendering the scene. So if this lands in 3.6, maybe the exponential function could just be built in to make it match the CPU result, since final renders won't use the viewport image anyway.

Question @OmarEmaraDev: will the Realtime Compositor support Viewport Render Image / Animation?


Having an expensive viewport upscaler looks like solving a problem with additional computation because you don’t want to do a computation in the first place.


@silex You misinterpreted my question; I never suggested it should be done in the viewport.

I'm asking what will become of heavy image-processing nodes. Are such nodes doomed to be rejected because they cannot be translated into this new real-time experience? AI upscaling is just one example; many other examples could be found.


Aah, sorry about that. To me it sounded like you did. Thanks for the clarification.

If I understand it correctly, the plan is to eventually have GPU acceleration in the standard compositor, so the nodes that are expensive now might not be in the future.

What examples are you thinking of specifically? I'm not aware of any planned compositor nodes that do resource-intensive image processing. This feels like a very hypothetical concern, unless you have something specific in mind.

I'm thinking mostly of AI image processing functions:
AI upscaling
AI style transfer, etc.
The compositor seemed like the perfect place to implement such heavy processing tools, until everything became realtime-focused.

Can you share that blend file?

The exponential function is too subjective to apply as part of the node, because we are not applying it as a convolution, so I am unlikely to consider doing that.

I don't think we will ever reject an operation simply because it can't become sufficiently realtime, because the realtime compositor will also have the ability to execute as part of the render pipeline, where realtime execution is not a hard requirement. How will we handle those nodes for the viewport? I am not sure yet, but we will make that decision then.

That seems technically possible, though a discussion of the design is probably needed. I will try to bring that up and see if we can implement it.


Managed to get the color shifting issue confirmed: #106965 - viewport compositor - color change when disabling Hue correction node with no adjustments - blender - Blender Projects
You can voice your opinion there if you use Blender for any color correction work. To replicate, add a Color Correction node, push the saturation high, and add a Hue Correct node (colors will shift with no adjustments to the node).
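For anyone trying to reproduce this from a script, here is a rough bpy sketch of those steps. The node and property names assume Blender 3.x and are my own guesses at the setup, not taken from the attached file.

import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes = scene.node_tree.nodes
links = scene.node_tree.links

image = nodes.new("CompositorNodeRLayers")
correct = nodes.new("CompositorNodeColorCorrection")
correct.master_saturation = 2.0              # push saturation high
hue = nodes.new("CompositorNodeHueCorrect")  # leave the curve untouched
viewer = nodes.new("CompositorNodeViewer")

links.new(image.outputs["Image"], correct.inputs["Image"])
links.new(correct.outputs["Image"], hue.inputs["Image"])
links.new(hue.outputs["Image"], viewer.inputs["Image"])

# Toggling hue.mute should not change the colors, but reportedly does.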

FR: It would be great if there were a way or an option to make the realtime compositor ignore the Line Art GP modifier edges. This would make realtime stylized rendering even more possible. While it might be helpful in some cases to also process the line art, it is generally preferred that the line art is drawn on top of all the processing.