Real-time Compositor: Feedback and discussion

Guys, is it possible, or will it be possible in the future, to mix the Mist pass with the viewport result in real time?

Question 2

Is there any way to add a LUT baked in DaVinci Resolve to the render, either through existing functionality or the real-time compositor?

Thanks a lot for the answer. Blender devs, you are the best, guys!

Same here. To get good-looking bloom I needed 16 Fast Gaussian blurs connected in parallel, with a base radius of 0.001, multiplying each one's size by 1.6 (adjustable) to get the size of the next one in the stack.
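For reference, the blur sizes in that stack follow a simple geometric progression. Here is a quick sketch of the arithmetic only (illustrative, not actual Blender node or `bpy` code; the 0.001 base radius and 1.6 multiplier are the values from the setup above):

```python
# Geometric progression of blur sizes for a 16-blur parallel bloom stack.
# base_radius and growth are the values described above; each blur node
# in the stack would use the corresponding entry of `sizes`.
base_radius = 0.001
growth = 1.6  # adjustable

sizes = [base_radius * growth ** i for i in range(16)]

print(sizes[0])   # smallest blur in the stack
print(sizes[-1])  # largest blur in the stack
```

The outputs of all 16 blurs are then summed (each weighted as desired) to approximate a wide, soft bloom falloff.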

Do you have a tutorial I can read?

Is @OmarEmaraDev back to work on the Real-time Compositor? Can’t believe that Blender 3.5 is almost out and we aren’t going to have the Fog Glow node in real time :frowning:

1 Like

Yes, I am now back again starting this week, though I will only be working part time and will thus only be available every other week.

Can’t believe that blender 3.5 is almost out and we aren’t going to have the fog glow node in realtime.

Indeed, the 3.5 ship has sailed already, unfortunately. But I am already eyeing an implementation of Fog Glow for 3.6.

22 Likes

Great to hear that you’re back! I read the EEVEE-Viewport meeting notes at 2023-03-13 Eevee/Viewport Module Meeting, and am glad that you’re working toward enabling multi-layer workflows.

We just released a beta of our new virtual production system that uses the Viewport Compositor heavily. There are both an iOS app called Jetset (sign up at Lightcraft Jetset if interested) and a desktop side called Autoshot that uses the Jetset-captured data to build a full 3D tracked and composited VFX shot in Blender. (Downloads are free at Lightcraft Jetset.)

I already built a workflow that tracks 3D garbage mattes with the Viewport Compositor (Lightcraft Jetset) and then wires them into the 2D compositor (Lightcraft Jetset), but it would be amazing to have all of that wired into the Viewport Compositor in real time.

It’s exciting to see all the viewport & EEVEE-Next work flying along!

4 Likes

Hey @OmarEmaraDev!

When you get around to implementing the ID mask node for the real-time compositor, could you also update it to account for ID masks of objects that are out of focus?

Currently the ID mask produces jagged edges for objects that are out of focus, see below.

The cryptomatte node produces the correct mask for objects out of focus, but it cannot be used inside node groups.

Thanks for all your hard work!
-Tim

3 Likes

This is inherent to the ID pass itself, not the node implementation, as far as I know. And that’s one of the reasons why we have the Cryptomatte node. So what you are asking is not possible from the compositor's point of view.

That’s unfortunate to hear. Who is responsible for maintaining the ID pass?

On a side note, is there a way to make the cryptomatte work inside a node group instead?

1 Like

The ID pass generation is the responsibility of whatever engine produces it, but it is standard at this point. I wouldn’t rely on it changing somehow.

I am not sure about Cryptomatte to be honest, I haven’t looked into it yet.

2 Likes

It “just” looks like it’s sampled only once, when it would need at least something like 16 or 32 samples to be somewhat smooth. At least, these jagged edges have the signature look of low Monte Carlo sampling.

2 Likes

I hope it’s that simple of a fix. If not, I would hope that the ID mask could reuse whatever code gives the Cryptomatte its smooth-edged mask.

Whether or not the current ID mask output is the “standard”, I think we can all agree that it leaves a lot to be desired.

1 Like

I might be wrong, but AFAIK ID passes are not antialiased because each pixel's ID either points to an object or it doesn't; there is no in-between. It makes sense that it looks that way when it's out of focus.

Have the Mist or Z passes been implemented yet?

Thanks

It looks like that if the Noisy Image pass is mixed with the Image in the compositor, the result is a full black screen for the Noisy Image:
0% mix:

100% mix:

No passes are supported yet. This is the next milestone for the project though.

That’s also because the noisy pass is not supported yet and just returns a zero color.

Well, it is not about how many samples you can take. Let's say you opt to accumulate the IDs of objects, much like other light passes. Further, say you have two touching objects, whose IDs are 1 and 1000 respectively. If you render the ID pass with an infinite number of samples, you will get a smooth gradient (maybe linear?) from 1 to 1000. Now, how would you generate a mask for the first object? Split the difference and use a threshold of 500? What if there is another object whose ID is 500?

So you see, accumulating more samples will not really solve the issue. That’s why we have Cryptomatte.
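To make that concrete, here is a toy sketch (purely illustrative; the IDs are nudged to 1 and 999 so the averages land on a round number) showing that an accumulated edge pixel becomes indistinguishable from a pixel inside a third object:

```python
def accumulate_ids(samples):
    # Average the per-pixel ID samples, as if the ID were a light pass.
    return sum(samples) / len(samples)

# Edge pixel half-covered by object 1 and half by object 999:
edge_pixel = accumulate_ids([1] * 8 + [999] * 8)

# Pixel fully inside a hypothetical third object with ID 500:
interior_pixel = accumulate_ids([500] * 16)

# Both accumulate to the same value, so no threshold can tell
# "antialiased boundary" apart from "object 500".
print(edge_pixel, interior_pixel)
```

This ambiguity is exactly what Cryptomatte avoids by not blending IDs together in a single channel.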

2 Likes

@OmarEmaraDev I forgot about these. Should I repost these as bugs on Gitea?

I haven't really verified with the latest versions yet, but I think at least some of these are still there. Not sure about the cases where the viewport compositor looks more correct.

1 Like

Most of those are expected differences as far as I can see, with some having more correct behavior in the realtime compositor.
There are one or two that should be handled, though, so I will look into them and submit a fix directly. Thanks for bringing them to my attention.

2 Likes

How would the Cryptomatte work in the example you gave, and why couldn't the ID mask be adapted to use Cryptomatte's method on the backend?

Ah yes, that makes sense; there can be no blending of values. I suppose the only way is to have a pass for each ID, then. How does Cryptomatte do it?