I was wondering, are there any plans for further work on the displacement feature from the material output? I find the idea as such pretty cool, but from my experimenting with it, I got the impression there is something missing for it to become really useful.
The problem I’m regularly running into with it is that it doesn’t deal nicely with highly detailed (fine-noise) textures. Apparently it doesn’t do any antialiasing when sampling the texture, so input at a scale smaller than the tessellation size contributes to the calculated vertex displacement in an undesirable way.
Everything would probably come out nicely when setting the adaptive subdivision to subpixel size, but it seems typical memory resources are currently insufficient for that (I tried setting it to 0.25 px, and it turned out that 32 GB of memory were not enough).
So maybe the feature should be made more configurable? I was thinking along the lines of:
For calculating vertex displacement, antialias / low-pass-filter the input texture on a configurable scale (probably relative to the local face size)
Provide input to the shader that allows dealing with the residual in a flexible way, e.g. bump mapping it by whatever means works best. Probably making both the normals before and after displacement available (through the Geometry input node) would be good enough.
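The filtering idea above could be sketched as picking a mip level from the local face size, so that detail finer than the tessellation is averaged away before it drives vertex displacement. This is a minimal illustration, not Cycles code; the function and parameter names are hypothetical:

```python
import math

def displacement_mip_level(face_edge_length, texel_size, scale=1.0):
    """Choose a mip level so the filter width roughly matches the
    local face size (times a user-configurable scale factor).

    face_edge_length: average edge length of the face, in texture space
    texel_size: size of one level-0 texel, in the same units
    """
    filter_width = scale * face_edge_length
    if filter_width <= texel_size:
        return 0  # faces smaller than a texel: sample level 0 directly
    # each mip level doubles the filter width
    return math.log2(filter_width / texel_size)

# A face spanning 8 texels would be sampled around mip level 3,
# so noise finer than the tessellation no longer reaches the vertices.
print(displacement_mip_level(8 / 1024, 1 / 1024))  # 3.0
```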
We would like to have better texture filtering for displacement so that there is proper antialiasing, but it just hasn’t been implemented yet. It would be automatic based on the area of the faces surrounding the vertex.
For the residual detail we have the Displacement + Bump method to handle it automatically.
Using dicing rate 0.25x would certainly be inefficient, it means you are creating about 16 quads per pixel.
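The 16-quads figure follows directly from the dicing rate: a rate of r pixels per micropolygon edge gives roughly (1/r)² micro-quads per pixel. A quick check:

```python
def quads_per_pixel(dicing_rate_px):
    """Approximate micro-quads per pixel for a given dicing rate,
    expressed in pixels per micropolygon edge."""
    edges_per_pixel = 1.0 / dicing_rate_px
    return edges_per_pixel ** 2

print(quads_per_pixel(0.25))  # 16.0: four edges per pixel in each direction
print(quads_per_pixel(1.0))   # 1.0: roughly one quad per pixel
```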
Ah, okay, thanks for the answer, Brecht. Sounds like it is a feature for a little bit further into the future.
About the hybrid displacement + bump: I’m aware of that. When I mentioned dealing with residuals manually, I had in mind cases where self-occlusion affects area-averaged brightness enough that there is a noticeable difference between true displacement and bump mapping, no matter the viewing distance. I failed to create an example .blend file demonstrating this, though, so maybe the situation is rather uncommon.
When comparing Cycles microdisplacement to other renderers, it seems very slow not only at rendering, but especially in reacting to adjustments. For example, when tweaking displacement in the shader you want to see changes in real time, or close to it, but at this point you have to restart the render preview or retrigger the calculation by entering and exiting Edit Mode. Are there plans to improve this? It’s hard to know what you will get in the end, and setting up takes too much time. Octane and Redshift have real-time response for the same feature.
That was an example of a similar issue someone else posted.
I was working procedurally, using nodes, when the issue occurred. So far, True Displacement works with a subdivided plane, yet with complex objects (e.g. a quad sphere, cube, etc.) the mesh splits apart at the edges.
More than likely it’s a bug or a paper cut, even though the node only works as part of an experimental feature.
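For reference, here is a typical script-side setup for testing adaptive subdivision with true displacement. It only runs inside Blender; the property paths are from the 2.8x/2.9x Python API and may differ in other versions:

```python
import bpy

scene = bpy.context.scene
obj = bpy.context.active_object

# Adaptive subdivision is still behind the experimental feature set.
scene.cycles.feature_set = 'EXPERIMENTAL'
scene.cycles.dicing_rate = 1.0  # pixels per micropolygon edge

# The object needs a Subdivision Surface modifier with adaptive mode enabled.
obj.modifiers.new(name="Subdivision", type='SUBSURF')
obj.cycles.use_adaptive_subdivision = True

# Displacement + Bump ("BOTH") lets bump mapping handle the residual detail.
mat = obj.active_material
mat.cycles.displacement_method = 'BOTH'
```

This is the kind of configuration under which the edge-splitting above would be worth reproducing and reporting.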