I would like to add a new primitive type in Blender: an SDF (Signed Distance Field).
I thought it would be possible to add it the way we add a mesh or a metaball, but it's complicated, because it needs a special renderer with z-buffer evaluation in the fragment shader, plus some support on the rendering side. The interesting part, I hope, is that the SDF could be customized or written with a node graph or directly in GLSL / HLSL.
I know it will be difficult, but I want to try.
I have downloaded the source code and compiled it successfully, but I don't know where to start.
Do you have some idea what this new type would require in the Blender source code?
I think I need to create an editor module and a new DNA type, but how do I write the renderer to display the primitive in edit mode, and add support for rendering it in Eevee?
In my local software, I overlay a fullscreen shader where I define gl_FragDepth based on the distance to the SDF, so that the SDF merges correctly with the other meshes. I hope I can do the same in Blender. Do you have any ideas what I need to do?
I would just like to clarify one point: for the moment this is a local test, but if the community is interested, maybe I will share my code.
I think @brecht posted a link to a commit in another topic that was the beginning of adding an OpenVDB volume type. This would probably be the best place to look, since SDFs are well handled in OpenVDB anyway.
I’m quite aware of what signed distance fields are. OpenVDB is not a mesh library; it is a library for storing and manipulating sparse volumetric grids, which happen to be a useful representation for arbitrary signed distance functions (or any scalar field). Since the grid is discretized there is some loss of precision, but for most real-world cases this should not be a problem.
In general for implicit surfaces the input could be a shader, volume dataset (like OpenVDB) or a composition of mathematical functions like metaballs. And the evaluation can be done at render time or by generating a mesh.
I guess the idea here is to do render time evaluation based on a shader. The quickest way to implement that might be to just reuse a mesh and Cycles/Eevee shader nodes. Then you can render it as an implicit surface based on some setting in the material similar to how it works for volumes, and even support volume datasets automatically.
In many cases it’s more efficient to generate a mesh than to do the evaluation at render time, also to make it interact with other objects, modifiers, selection code, etc. The advantage of render time is that there are no memory usage constraints.
Surface meshes alone can be pretty limiting for some kinds of evaluation. I had a brief poke at the metaball code to see what it would take to add enough complexity for something like implicit skinning, and the fact that it basically evaluates a mesh only at the isosurface that is displayed, and doesn’t store the more general scalar field, makes the projection step impossible and gradient-based blending operators quite difficult to do well. I think to do that effectively you would need to store the scalar field in a grid structure like OpenVDB. I guess you could preserve data on both the mesh and the field at the same time.
OK, I hadn't understood very well what OpenVDB was. I thought it was a data structure plus an automatic meshing tool.
In fact, for my use it's easier to write GLSL code than to use the node graph for the renderer.
So I would like to add some SDF shapes to a normal scene and have them rendered in Blender.
And I would like to play with these primitives the way we play with metaballs, but with more control,
with GLSL code associated with this type of primitive.
That way we would have a robust way to customize or construct many complex scenes, I think.
Physics is also easier with SDFs, like sphere/sphere collision. If we could, for example, link an SDF shape with a particle emitter, that could generate many types of effects. The problem may be the cost on the GPU, but for some kinds of effects I think they can do great things.
I thought it was possible to render an SDF shape without generating a mesh.
The ray/shape calculation is not as fast as with a classic mesh, but it could also be approximated with a less precise mesh shape.
I think there are many ways of doing that; this is why I would like to try it.
But as I understand it, writing a custom shader node in GLSL is not possible at the moment?
Maybe we could customize the metaball code to support another type of scalar field,
but the mesh generated by marching cubes is not great, and in my case I don't necessarily need a mesh, I think.
I’m not an expert, but in my opinion, for a proper integration with Blender, rewriting/extending the metaball code to use something like OpenVDB to store the fields would be the most flexible way to go. It would also be nice to support more arbitrary scalar fields (like an SDF generated from a mesh surface, for collision). However, be warned that the metaball code is old and somewhat fiddly in how it recalculates/interacts with the depsgraph, so it's not an easy task. Take a look at blenkernel/intern/mball_tessellate.c for how the surface is currently evaluated, if you are interested.