Real-time Compositor: Feedback and discussion

If it was a background image, yes. But the Environment pass needs to be added by definition. See the following figure in the manual:

https://docs.blender.org/manual/en/latest/render/layers/passes.html#id5

Further, the pass description mentions:

Emission from the directly visible background.

So if the background is not visible, that is, occluded by an object, it should be zero.
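
For reference, a minimal Python sketch of enabling that pass, assuming Cycles and the default active view layer (the corresponding Render Layers output is typically labeled "Env"):

```python
import bpy

# Minimal sketch, assuming Cycles: enable the Environment pass on the active
# view layer so it shows up as an output on the Render Layers node.
view_layer = bpy.context.view_layer
view_layer.use_pass_environment = True

# The pass stores emission from the directly visible background only,
# so pixels covered by objects are zero, as described above.
```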

Oh I see, the image confused me because the environment pass is also shown behind the objects; of course it should behave as you describe.

I would just like to add positive feedback to this thread - The multi-pass real-time compositor is a dream come true. Thank you so much for all the work you've put in.

5 Likes

With the new passes, Grease Pencil/Line Art disappears when using the compositor. While annoying, does that mean GP is some kind of separate layer that could be accessed on its own?

And a question - will it be possible to have access to view layers?

Rendering GP to its own view layer - it already can be, yes.

1 Like
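
For final renders (the real-time compositor does not expose other view layers, as noted later in the thread), a hedged sketch of recombining a GP-only view layer over the main layer with Alpha Over; the view layer names 'ViewLayer' and 'GP Layer' are placeholders:

```python
import bpy

# Hedged sketch: composite a Grease Pencil-only view layer back over the main
# layer with Alpha Over in the (final render) compositor.
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

main_rl = tree.nodes.new('CompositorNodeRLayers')
main_rl.layer = 'ViewLayer'      # main scene layer (placeholder name)

gp_rl = tree.nodes.new('CompositorNodeRLayers')
gp_rl.layer = 'GP Layer'         # assumed view layer containing only the GP objects

over = tree.nodes.new('CompositorNodeAlphaOver')
tree.links.new(main_rl.outputs['Image'], over.inputs[1])  # background
tree.links.new(gp_rl.outputs['Image'], over.inputs[2])    # GP drawn on top

composite = tree.nodes.new('CompositorNodeComposite')
tree.links.new(over.outputs['Image'], composite.inputs['Image'])
```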

That's a good point. GP is technically its own separate layer that gets drawn over the render, so now that the Render Layers node returns actual passes, it naturally will not be visible. Since it is not accessible in any other way either, that's a problem.

We can either:

  • Option A: Make the combined Image pass a special case that includes GP. This means GP will be included in compositing as before, but it will not be visible when the user combines passes manually, as one typically does.
  • Option B: Draw GP on top of the compositing result. This means GP will be excluded from compositing, but it will be visible even when users combine passes manually.

And a question - will it be possible to have access to view layers?

There are no plans for that at the moment.

The best option, from a user perspective, would be having GP as its own AOV. Of the two options you propose, both could be useful in their own way. Option B would be better/more logical for Line Art, but… combining GP with the Kuwahara filter has potential as well.

1 Like

Quickly skimmed through the discussion; forgive me if this has already been covered. I'm a video essayist who makes VFX-heavy content, and the GPU compositor looks super promising for real-time keying and footage integration in the 3D viewport. However, when I try to use a Viewer node in the Compositor as the data input for an Image Texture node in a Material graph, it results in a purple error material. Are there any plans to support this workflow in the future? (I can't wait to bail on Adobe for good haha. Keep up the great work!)

Here's my current setup for what I'm trying to do, in case it helps:

2 Likes

This would create a dependency loop, which is not allowed.
Blender evaluates in this order: texture shading → rendering → compositing.
See? If you plug the Viewer back into texture shading, the loop would go on forever.

What you need is the long-planned texture nodes. Go and read the blog post.

1 Like

@baoyu's assessment is correct. But maybe you can do it the other way around: create an AOV pass that contains the UV of the plane, then use the Map UV node to map your processed image back onto the scene.
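
A minimal sketch of that idea in Python, assuming a material on a plane object named "FootagePlane" (both the object name and the "uv_aov" AOV name are placeholders):

```python
import bpy

# Hedged sketch: register a Color AOV on the view layer and write the plane's
# UV into it from the material, so the compositor can use it with Map UV.
view_layer = bpy.context.view_layer
aov = view_layer.aovs.add()
aov.name = "uv_aov"
aov.type = 'COLOR'

mat = bpy.data.objects["FootagePlane"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

uv_node = nodes.new('ShaderNodeUVMap')
aov_out = nodes.new('ShaderNodeOutputAOV')
aov_out.aov_name = "uv_aov"
links.new(uv_node.outputs['UV'], aov_out.inputs['Color'])
```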

We will opt for Option A because it retains the old behavior. But there are plans for better EEVEE GP integration, so what you are describing should hopefully become possible.

2 Likes

Just wanted to say that the viewport compositor is magic.
Tried it in 4.3 with passes and it's just a joy to work with.

10 Likes

Thanks for the suggestion. I'm new to Shader AOV passes, so I may not have this set up correctly. It seems the viewport compositor doesn't like plugging the UV AOV pass into the Map UV node (in this setup, I'm trying to map the noise texture to the footage plane before doing anything fancy with keying). The AOV output specifies a single numerical value passthrough; does it recognize UV coordinate data as such, or do I need to create an AOV for each axis?

@baoyu, you're totally right; texture node functionality is exactly what I'm looking for. Thanks for the link. Granted, I'm basically trying to do what the blog post says might not be practical - using heavy texture processing in real-time. My hope of not having to bake/render a chromakey pass on footage may be too much of a stretch :smile:

I appreciate all the feedback. I know this isn't a troubleshooting thread; sorry if I'm derailing. I'm glad to move this to a more appropriate place.

Store the UV in a Color AOV, making sure to add 1 to the Z component. Then use it in the compositor.

2 Likes
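
A minimal compositor-side sketch of that suggestion, assuming the "uv_aov" Color AOV from the earlier sketch exists on the view layer (AOVs appear as outputs on the Render Layers node under their name). The "+1 on Z" part can be done in the shader by routing the UV through a Combine XYZ node whose Z input is set to 1, since the UV's own third component is zero:

```python
import bpy

# Hedged sketch: feed the UV stored in the "uv_aov" Color AOV into the Map UV
# node to re-project a processed image onto the plane. "uv_aov" and the Image
# node's contents are placeholders.
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

rl = tree.nodes.new('CompositorNodeRLayers')
footage = tree.nodes.new('CompositorNodeImage')   # processed/keyed footage to re-map
map_uv = tree.nodes.new('CompositorNodeMapUV')

tree.links.new(footage.outputs['Image'], map_uv.inputs['Image'])
tree.links.new(rl.outputs['uv_aov'], map_uv.inputs['UV'])
```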

Should the Cryptomatte node be working in the viewport compositor? I saw demos on Twitter, but it is not working for me.

This is a bit embarrassing but, even after 10 years of game dev, I've never thought twice about how a UV map works. It being data packed in a texture kinda blew my mind haha.

Anyway, appreciate the direction. I'm getting somewhere but ran into a few caveats. I want to capture light/shadow data and apply it to the keyed footage, so I need the image plane to receive light, but also be transparent to allow the background to be seen behind the keyed subject.

  • Considered render passes but, as you're aware, that's not yet supported by the viewport compositor.
  • Tried making the plane transparent, but the shader alpha also affects the keyed image mapped in the compositor.
  • Even if the above worked, the keyed subject wouldn't receive light (due to transparency) or cast shadows (since it's a post-process).

I'm not a compositor wizard, so there are probably solutions or workarounds for these. I'd be glad for any advice you have. Also, I'm impressed by the keying nodes in the compositor. Coming from Keylight in AE, I'm able to get comparable results in less time.

1 Like

The Ellipse Mask node also doesn't seem to be image-size independent.

What do you mean? Its size is defined as a fraction of the render or input size, so it is not pixel-size dependent.

You're right, my bad. It's the Blur node after the mask that gives different results, since it's pixel-based. Sorry.
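
For reference, a hedged sketch of making the blur resolution-independent via the Blur node's Relative option, with assumed example values:

```python
import bpy

# Hedged sketch: switch a compositor Blur node to relative sizing so the blur
# is expressed as a percentage of the image size rather than in pixels.
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

blur = tree.nodes.new('CompositorNodeBlur')
blur.use_relative = True     # interpret size as a fraction of the image
blur.factor_x = 5.0          # percent of image width (example value)
blur.factor_y = 5.0          # percent of image height (example value)
```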

Sorry if this has already been asked, exists as a feature request somewhere, or is already an existing option, but now that we have a really useful real-time GPU compositor, are there any plans for tagging a node as "skip in real-time compositor"?

I use the Denoise node in all my renders, but its existence prevents the real-time compositor from working. Since I often take a quick look in rendered view and then render at full resolution as a test, having to connect and disconnect that node every time is really annoying :frowning:

4 Likes
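
Until something like that exists, one possible stopgap (a hedged sketch, not a built-in option, and only useful if the real-time compositor treats muted nodes as pass-through) is toggling the Denoise node's Mute flag from a small script instead of re-wiring it by hand:

```python
import bpy

# Hedged workaround sketch: mute/unmute every Denoise node in the scene's
# compositor tree instead of disconnecting it manually. Muted nodes pass
# their input through unchanged.
tree = bpy.context.scene.node_tree
for node in tree.nodes:
    if node.type == 'DENOISE':
        node.mute = not node.mute
```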

Seconded. Something similar to the "Is Viewport" node in Geometry Nodes could do it.

5 Likes