Real-time Compositor: Feedback and discussion

Can I ask for clarification about the workaround you posted before? As I see it, some people will ask for this, and I guess it would help them.

So as I understand it, this workaround should only be used in the viewport (to match the CPU bloom), and for the final render it should be muted, right?

Let me clarify that the Fog Glow implementation in the Realtime Compositor is essentially a higher quality and more temporally stable version of the Bloom implementation in EEVEE. This is more of an artistic effect than a realistic approximation of eye or camera response to highlights.

On the other hand, the Fog Glow implementation in the CPU Compositor is a not-so-good approximation of the response of the eye or camera to highlights, implemented by convolving the image with a specific point spread function, which in layman's terms means we spread the highlights into neighbouring regions in a specific way described by this function. This function roughly takes the shape of an exponential, which is why I proposed you approximate it using an exponential curve, as you demonstrated in your image.

So your assessment is correct, final renders should have the RGB Curves node muted, since it already takes an exponential form. Though don’t expect the workaround to work nicely in all cases, since trying to adjust the point spread function by adjusting its result is fragile.
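To make the idea concrete, here is a toy 1-D sketch in Python of spreading highlights by convolving with an exponentially decaying point spread function. This is only an illustration of the concept, not Blender's actual implementation; the kernel radius and falloff rate are arbitrary values chosen for the example:

```python
# Toy 1-D "fog glow": convolve an image row with an exponential
# point spread function so bright pixels bleed into their neighbours.
# Radius and falloff are arbitrary illustration values.
import math

def exponential_psf(radius, falloff):
    # Kernel values decay exponentially with distance from the centre.
    kernel = [math.exp(-falloff * abs(i)) for i in range(-radius, radius + 1)]
    total = sum(kernel)
    return [k / total for k in kernel]  # normalize so total energy is preserved

def convolve(row, kernel):
    radius = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - radius
            if 0 <= idx < len(row):  # ignore samples outside the row
                acc += row[idx] * k
        out.append(acc)
    return out

row = [0.0, 0.0, 0.0, 10.0, 0.0, 0.0, 0.0]  # a single bright highlight
glow = convolve(row, exponential_psf(radius=3, falloff=1.0))
# The highlight is spread symmetrically, falling off exponentially.
```

The normalized kernel keeps the total energy constant, which is why adjusting the result after the fact (as with the RGB Curves workaround) only roughly approximates changing the function itself.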

9 Likes

I should ask something regarding this. If this patch for GPU Anisotropic Kuwahara gets merged, will the CPU implementation be updated to use the same method as the GPU one? Mainly asking since using the GPU compositor for final renders is still not a thing, and I prefer how the new implementation looks by a lot.

1 Like

Yes, if this patch gets approved, we will follow it with a patch that implements the same method for the CPU.

3 Likes

There is an option to enable GPU compositing on the final render in Blender 4.0. The steps to do this are as follows:

  • Select Edit -> Preferences from the top of the Blender window
  • In the Preferences window, select the Experimental tab
  • Enable the option called Experimental Compositors
  • Open the Compositor node editor
  • Tick Use Nodes
  • In the sidebar (accessed by pressing N), select the Options tab
  • Change the Execution Mode to Realtime GPU

4 Likes

Tested Anisotropic Kuwahara in viewport with the official hair demo. Looks very convincing and fast!

21 Likes

Massive difference between the viewport and the final render. The glow is almost entirely lost. In the final render you can see a white outline around the hair, which comes from the glow, but anything outside the silhouette is lost. This is the setup I'm using: basically it uses the alpha channel as a mask for a Mix node between the regular image and the glowy image.

Can anyone help me make final render match viewport compositor?

You are using the alpha of the render as the alpha of the composite. This will clip everything in your node setup that lies outside that matte.
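As a minimal numeric sketch of that clipping (illustrative pixel values, not the actual node evaluation): matting the composite with the render's alpha multiplies everything outside the silhouette by zero, so any glow added there disappears:

```python
# Sketch: using the render's alpha as the composite's matte zeroes out
# anything (like glow) that lies outside the render's silhouette.
# Pixels are scalar values; numbers are purely illustrative.

render_alpha = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]   # silhouette covers pixels 2-3
with_glow    = [0.1, 0.4, 1.0, 1.0, 0.4, 0.1]   # glow bleeds outside it

# Matting the composite with the render's alpha:
matted = [v * a for v, a in zip(with_glow, render_alpha)]
# The glow outside the silhouette (pixels 0-1 and 4-5) is clipped to zero.
```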

3 Likes

I knew somebody would comment on this; I should have unplugged it. No, that's not it. I tried without it and got the same result; I plugged it in as a last resort to see if it would change anything, but it didn't. Regardless of that, the alpha result is the same. And anyway, if the alpha were to change anything, it would change in the viewport too, since I also plugged it into the Viewer node.

See five posts above

Great job so far, @OmarEmaraDev. One question: according to the Intel website https://www.openimagedenoise.org , Open Image Denoise is now supported on the GPU as well. Would it be possible to make the denoiser options and the node supported on the GPU as well, and not just the CPU?

It’s likely that the OpenImageDenoise node will get GPU support in the future. But first the OpenImageDenoise integration in Blender/Cycles must be updated to the new version so that the GPU can be supported. Current progress on updating OIDN to the new version can be found here: #108314 - WIP: Update to OpenImageDenoise 2.0 - blender - Blender Projects

3 Likes

This was raised earlier. As I understand it, the viewport uses the GPU implementation of glow, while the final render still uses the old CPU version. Right now it's not possible to match them accurately, only to eyeball it.

In the near future the render compositor will use the GPU-based viewport version. This is quite problematic because people are not aware of it and think it's a bug.

@EliotMack Would you say the Distance option of the Inpaint node is integral to its usefulness? Or would it be okay to practically have the distance hard coded to infinity such that it inpaints all transparent pixels?

Hard coded to infinity would be fine. I’m using it to inpaint the area behind an actor on a green screen.
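For intuition, inpainting all transparent pixels (distance effectively infinite) can be sketched as a breadth-first flood outward from the opaque pixels, with each transparent pixel inheriting the colour of its nearest filled neighbour. This is a simplified stand-in for illustration, not the node's actual algorithm:

```python
from collections import deque

def inpaint_all(grid):
    # grid: 2-D list where opaque pixels hold a colour value and
    # transparent pixels hold None. Flood outward from the opaque
    # pixels so every transparent pixel inherits a nearby colour.
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    queue = deque((y, x) for y in range(h) for x in range(w)
                  if out[y][x] is not None)
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and out[ny][nx] is None:
                out[ny][nx] = out[y][x]  # copy the neighbour's colour
                queue.append((ny, nx))
    return out

grid = [
    ["G", None, None],
    [None, None, None],
    [None, None, "B"],
]
filled = inpaint_all(grid)  # every None replaced by a nearby colour
```

With no distance limit, the flood simply runs until no transparent pixels remain, which matches the "inpaint everything behind the matte" use case.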

-Eliot

Also, when will the Movie Clip input node in the Compositor get a Color Space pull-down? There is already one on the Image Texture input in the material node graph that works well.

1 Like

We created two test builds for testing automatic realization practically. Details and demonstrations are provided in the following comments.

4 Likes

@OmarEmaraDev, are the last 6 unported nodes, specifically Defocus and Cryptomatte, dependent on Sergey's refactorings, or what is the timeframe for them?

Those nodes are not really dependent on Sergey's work, but I am just not giving them priority since they require passes anyway and will not be very useful without them:

  • Vector Blur.
  • Defocus.
  • Cryptomatte.

I already started on the preparatory work for those two nodes:

  • Inpaint.
  • Double Edge Mask.

I just haven't gotten to implementing this node yet; as far as I can tell, it can probably even be computed on the CPU and cached:

  • Keying Screen.

This node requires a special algorithm called the FFT that is very hard to implement in an optimized way, so we are looking into using a library for it, but the integration process is not straightforward:

  • Fog Glow Glare.
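As background on why the FFT matters here (a naive textbook sketch, nowhere near the optimized implementation a library would provide), the convolution theorem lets a large convolution like Fog Glow's be computed as a pointwise product in the frequency domain:

```python
import cmath

def fft(x, inverse=False):
    # Naive recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two.
    n = len(x)
    if n == 1:
        return x[:]
    sign = 1 if inverse else -1
    even = fft(x[0::2], inverse)
    odd = fft(x[1::2], inverse)
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

def fft_convolve(a, b):
    # Circular convolution via the convolution theorem:
    # forward FFT both signals, multiply pointwise, inverse FFT, normalize.
    n = len(a)
    fa, fb = fft(a), fft(b)
    product = [x * y for x, y in zip(fa, fb)]
    return [v.real / n for v in fft(product, inverse=True)]

signal = [1.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
kernel = [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25]  # small symmetric blur
blurred = fft_convolve(signal, kernel)
```

For an image of n pixels and a glare kernel of comparable size, this turns an O(n^2) direct convolution into O(n log n), which is what makes the effect tractable at render resolutions.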

16 Likes

Hi, is it possible to add effects to different objects in the viewport real-time compositor? I need to add blur to the box and not to Suzanne.