Incredible work so far Omar, I am really excited to see how this develops! I was very impressed when I tried a build and could overlay several video files with masking and blurs, all playing back in realtime. As mentioned by others, seeing this integrate with the VSE long-term would be amazing.
You mentioned redoing some of the noise nodes. Personally, I would love to see integration with the newly proposed texture nodes as a long-term goal. You could do some very nice 2D mograph using the realtime compositor and texture nodes. Has there been any talk of the texture nodes being usable in the compositor in the future?
We haven't considered that yet, most likely because the new texture nodes are still in early stages of development, so it will take some time before we think about integration with the compositor. What we do know, however, is that the existing Texture nodes cannot be used due to missing GPU acceleration and will be deprecated and removed when the new texture nodes get introduced, so we will have to provide alternatives. Initially, the alternatives will likely just be a port of the shader texture nodes to the compositor, similar to how Geometry Nodes uses them.
I just gave it a try and blurring the default cube in the viewport alone brings pure joy!
This is fantastic and will save tons of time, enabling a lot of great effects that would otherwise be almost impossible.
Not sure if this is already fixed in the patches, but having a material with a Mix Shader node connected to a Diffuse BSDF or Principled BSDF crashes Blender, regardless of the viewport compositor. This does not happen in the normal 3.3 alpha build from the same day (June 28).
Also, a suggestion. I guess this is a bit out of scope for the project, but it would be convenient if Node Wrangler supported the Shift + Alt + LMB shortcut in the Compositor to connect a node to the Composite node, like how it connects to the Group Output node in Geometry Nodes.
Hi, just opened it, but one issue that stopped me from trying it immediately is the "unsupported node" message. This was because I had a Texture node in the branch. That's fine, I understand it's still a work in progress, but I'd suggest that muting unsupported nodes should then get it all working. At the moment it seems you have to remove the unsupported nodes entirely, which is a bit of a nuisance since you then have to re-add them afterwards.
Also getting errors that frames and reroutes are unsupported. Perhaps they can be removed from the unsupported node list, as they don't influence the result.
Amazing to see the lens dispersion in real time though! Awesome work.
As expected for macOS, this version goes up in smoke on the latest macOS version.
Just a cube, Render Layers node, Lens Distortion, Composite, and Viewer node (all of which are supposed to work).
Why is it available for download if it doesn't work out of the box after compiling for Mac? Did anyone test the build before uploading?!
Always the same with Windows developers…
So nice to see this well developed already. It's going to be a real game changer for Blender. Even small things, like doing some brightness/contrast adjustments and seeing them in the viewport, are a big QoL improvement.
But I was wondering if there are any plans to address the issue where any sRGB image or video added to the compositor gets Blender's view transform from the Render tab applied to it, ruining the colors? Some direct control over which parts of the compositor node tree get color managed would be really helpful.
I initially just made unsupported nodes pass their input through or return zero, but that was turned into this error during review, which I now realize needs more work. If you think the other approaches are better, we can look into restoring the old behavior.
I didn't realize reroutes and frames were taken into account; I will make sure to fix that regardless.
macOS versions that don't support OpenGL 4.3 are expected not to work. We should have communicated that better, indeed.
There was recent work in that area with the Color Space node and the Image node's color space settings. While I will not be working on that area soon myself, I will definitely look into those concerns once the realtime compositor project is in a good enough state to prioritize other work.
Will this be ported to Metal any time soon?
Because otherwise it will be the same dudu as the Magic Mouse implementation in add-ons!
And not a single dev feels responsible when asked to fix those glitches on macOS, yet every other tiny thing gets fixed or implemented for Linux and Windows, including the newest, fanciest equipment, but not the Magic Mouse or any Apple hardware that sells by the millions with the Mac. (Supposedly because not many users are on macOS… but what about Linux? How many users are there, and is Linux really used by more people?)
It's a shame, and please don't wonder why you're getting these comments.
It's not against you; it's about how it is rated in-house by the devs!
(The above are comments about Apple, Mac, and the Magic Mouse from blender.today on YouTube and other devs, collected over the past 1.5 years.)
I think the best behavior would be to make them act like they are muted but display a little warning icon at the top, like in Geometry Nodes. Ideally it would display a warning ("unsupported node" or something like that) when hovering over the icon or node.
The real-time compositor is built on top of the GPU module, which uses whatever backend is preferred by the underlying system. So when Apple finishes Metal support in Blender, it will automatically work on Metal, hopefully with no changes in the real-time compositor itself.
I will look into displaying an icon similar to GN.
Provided muted nodes and nodes that don't lead to the active Composite node are excluded from your invalid-node check, you should be good to go.
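To illustrate the behavior being discussed, here is a toy sketch in Python (purely hypothetical, not Blender's actual implementation): a validity check that walks the tree from the output, so muted nodes and nodes that don't feed the output never trigger the "unsupported node" error.

```python
# Toy node graph: name -> (kind, muted, input node names).
# Only unsupported nodes that are reachable from the output AND not
# muted should invalidate the tree.

UNSUPPORTED = {"TEXTURE"}  # hypothetical set of unsupported node kinds

def reachable_nodes(graph, output):
    """Collect all nodes that contribute to `output`, depth-first."""
    seen, stack = set(), [output]
    while stack:
        name = stack.pop()
        if name in seen:
            continue
        seen.add(name)
        _kind, _muted, inputs = graph[name]
        stack.extend(inputs)
    return seen

def invalid_nodes(graph, output):
    """Unsupported nodes that are reachable and not muted."""
    return {
        name for name in reachable_nodes(graph, output)
        if graph[name][0] in UNSUPPORTED and not graph[name][1]
    }

graph = {
    "render": ("RENDER_LAYERS", False, []),
    "tex_muted": ("TEXTURE", True, []),     # muted -> should be ignored
    "tex_orphan": ("TEXTURE", False, []),   # not connected -> ignored
    "blur": ("BLUR", False, ["render", "tex_muted"]),
    "composite": ("COMPOSITE", False, ["blur"]),
}
print(invalid_nodes(graph, "composite"))  # -> set()
```

With this scheme, only unmuting the texture node or wiring the orphan into the active branch would surface the error.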
If you set the colour management to Standard, then you can use the Color Space node at any point in the branch that you want. It can convert from and to all colour spaces. You do need to gamma correct it with a Gamma node after the Color Space node, though; not sure why.
Is there any info available about hardware requirements for this (other than OpenGL 4.3 support)?
Once multi-pass rendering is supported, will this allow fetching render layers from several different scene datablocks? For example, to composite a split screen from several camera views (each scene obviously only ever containing a single active camera)?
@OmarEmaraDev, could you give me a rough idea of which areas of the source code I need to change to add a depth input to the Displace node? I want to make it so that pixels are processed in order of z-depth, so that front areas are never overwritten by static backgrounds such as HDRIs. For example, here the top monkey is displaced to the right, but the displaced version is missing the whole right-hand side of the face, presumably because the static background pixels were calculated afterwards.
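For what it's worth, the depth-ordered write being asked about can be sketched in plain Python (a hypothetical 1D illustration of the idea, not the actual Displace node code): a displaced pixel only lands at its destination if it is nearer than whatever has already been written there.

```python
# Hypothetical sketch: displace a 1D row of pixels while resolving
# overlaps by depth, so nearer (foreground) pixels always win over
# background pixels, regardless of processing order.

def displace_with_depth(colors, depths, offsets, bg=0.0):
    n = len(colors)
    out_color = [bg] * n
    out_depth = [float("inf")] * n
    for i in range(n):
        j = i + offsets[i]
        # A write only succeeds if this pixel is nearer (smaller depth)
        # than whatever already landed at the destination.
        if 0 <= j < n and depths[i] < out_depth[j]:
            out_depth[j] = depths[i]
            out_color[j] = colors[i]
    return out_color

# Foreground pixel (depth 0.5) displaced onto a background pixel
# (depth 1.0) keeps the foreground color.
print(displace_with_depth([1.0, 2.0, 3.0], [0.5, 1.0, 1.0], [1, 0, 0]))
# -> [0.0, 1.0, 3.0]
```

On the GPU this per-destination depth test would need atomic operations or a sort, which is presumably why the current node doesn't do it.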
The term "gamma correct" doesn't help in this conversation either. The reason for the need, though, is this: after you transform from the Linear BT.709 working space to a formed Filmic sRGB image, that image data is ready to be sent to the monitor. Your monitor has a built-in transfer function, so Filmic sRGB assumes the image will go through that monitor transfer function afterwards; instead, it gets sent back into the compositor, which works in the Linear BT.709 space. Therefore, what should be done is to add another node after it and convert from "ready to be displayed by an sRGB monitor" to "Linearized Closed Domain sRGB". In other words:
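For anyone following along, here is a sketch of the display encoding in question, assuming the monitor's transfer function is the standard piecewise sRGB curve (IEC 61966-2-1); the "gamma correct" step amounts to applying the inverse of this encoding to get back to linearized values.

```python
# Assumption: the monitor expects piecewise sRGB-encoded input.

def srgb_encode(linear):
    """Linear value -> display-encoded sRGB (what the monitor expects)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Display-encoded sRGB -> linear (undoes the display encoding)."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# Round-tripping recovers the original value, which is what the extra
# node after the Color Space node is effectively doing in one direction.
x = 0.5
assert abs(srgb_decode(srgb_encode(x)) - x) < 1e-9
```

Note that a plain power-law Gamma node is only an approximation of this piecewise curve, which may explain small mismatches near black.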
The official hardware requirements are the same as Blender's (see the Requirements & Compatibility page on blender.org). However, this is not a hard requirement: any GPU that supports 1) compute shaders and 2) image load/store operations should work. In fact, I developed the project on an old ATI GPU from 12 years ago that only officially supports OpenGL 3.2 and does not meet Blender's hardware requirements. So just try your hardware even if it is old; it might just work.
Compositing multiple scenes should be possible yes. More information about this should be available once we start the implementation.
By the way, do you know why, even after gamma correcting, the Linear to Filmic sRGB conversion still doesn't match the result of setting the view transform to Filmic in the color management options?