Input socket for Image Texture shader node

The problem:

If you want to create a re-usable nodegroup for texture mapping, for example this trick:

It can’t really be done cleanly, because the image texture is needed inside the nodegroup (twice, even), with different mappings going into its vector socket and the outputs being blended afterwards.

One solution for this is not to create a nodegroup, but an add-on which replaces an image texture node with a node group containing the same image, for example like

does.

It’s a pity the Image Texture node (all texture nodes, actually) conflates two things: generating/loading the texture data (colors) and doing the mapping. It would be nice if there were a simple ‘texture’ node which provides the color data (and has no inputs), and a ‘texturing’ node which takes a texture input and a mapping vector input and outputs the same as the current (Image) Texture nodes.

As a quick test I hacked the current Image Texture node to have an extra input socket where you can plug in another Image Texture node to override the image data, and this works like a charm. It makes texture mapping/texturing nodegroups much more reusable and simpler.
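To make the override behaviour concrete, here is a toy model of the prototype in plain Python (not the Blender API; the class and attribute names are invented stand-ins):

```python
class ImageTextureNode:
    """Toy stand-in for the hacked Image Texture node.

    If another ImageTextureNode is plugged into the extra override
    socket, that node's image data replaces this node's own image;
    the mapping (the vector input) of *this* node is still the one used.
    """

    def __init__(self, image):
        self.image = image              # callable uv -> color, stand-in for image data
        self.image_override = None      # the extra input socket from the hack

    def sample(self, uv):
        source = self.image_override.image if self.image_override else self.image
        return source(uv)
```

Only the image data is taken from the overriding node, which is why its own vector input ends up present but ignored, as noted in the downsides below.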

My quick hack has some downsides:

  • It’s a quick and dirty hack, and I probably did things the wrong way; I’d need to put some more research into doing it the ‘proper’ way.
  • By just re-using the current Image Texture node it works only for image textures, and not for other texture types.
  • The UI is a bit strange, because the overridden texture info is not hidden in any way, and the vector input socket of the ‘overriding’ Image Texture node is present but ignored.

Anyway, if there’s any interest in this I could post my (extremely simple) patch, and maybe try to develop a cleaner way to get a separate ‘texturing’ node as described above.


An example to illustrate what I have right now.

First the way you need to do it now:


The nodegroup uses a Voronoi texture to displace the texture, and to mask the seams it is blended with the same texture on the Voronoi edges (an example of normal seamless tiling is on the right). Because this needs to use the same texture twice, you can’t put the blending step into the group node, which makes it much more cumbersome to re-use this nodegroup in another scene.
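The trick in the group can be sketched in plain Python, with the texture and the Voronoi outputs as stand-in callables (the function and parameter names are invented, not Blender API):

```python
def hide_seams(texture, cell_offset, edge_mask, uv):
    """Sample `texture` with a random per-cell Voronoi offset, then blend
    the undisplaced texture back in on the cell edges to hide the seams
    the offset introduces.

    `texture` is any callable uv -> color; `cell_offset` and `edge_mask`
    stand in for the Voronoi Texture node's outputs (offset per cell,
    and a mask that is 1.0 on cell edges, 0.0 inside a cell).
    """
    u, v = uv
    du, dv = cell_offset(uv)
    displaced = texture((u + du, v + dv))   # texture, randomly offset per cell
    plain = texture(uv)                     # untouched tiling texture
    m = edge_mask(uv)
    # Linear blend: on the edges the plain texture shows, hiding the seams.
    return tuple((1.0 - m) * d + m * p for d, p in zip(displaced, plain))
```

Note that `texture` is sampled twice with different coordinates, which is exactly why the blend step can’t live inside a nodegroup unless the texture itself can be passed in.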

Now how it looks with my prototype:

Note we can just pass the texture info into the nodegroup once.
The contents of the node group are:

Where the texture image is overridden from the group input.

Kinda sounds like you’d like to see the fields proposal from geo nodes extended to shaders.

That is a very interesting proposal. But I think my proposal here is something much, much simpler.

I only propose to split the Image Texture node to separate its two conflated functions, because that makes texture mapping nodegroups actually usable. For backwards compatibility I would propose to implement this as two new nodes that, when combined, behave the same as the existing Image Texture node (and leave the existing Image Texture node alone). Or maybe, like in my prototype, as an extra mode of the existing Image Texture node.
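A minimal sketch of what the two new nodes would look like, again as a plain-Python model with invented names rather than real node types:

```python
class ImageDataNode:
    """Proposed 'texture' node: no inputs, only provides the color data."""

    def __init__(self, pixels):
        self.pixels = pixels        # callable uv -> color, stand-in for the image data


class TexturingNode:
    """Proposed 'texturing' node: takes texture data plus a mapping vector
    and outputs a color, like the current Image Texture node does."""

    def __init__(self, image_data):
        self.image_data = image_data

    def sample(self, vector):
        return self.image_data.pixels(vector)
```

Combining the two reproduces today’s Image Texture node, so the existing node can be left alone for backwards compatibility, while a nodegroup only needs an ImageDataNode-style socket to become reusable.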

It would be more logical to expand the same idea to all texture nodes, but I’m not really sure how hard that would be. I’m still in the brainstorming stage (which is why I posted it here and not on RCS).

I think this is a good problem to solve, but it needs a clearer design. There are two approaches that seem reasonable to me.

  • Support image datablock socket types, similar to objects and collections in geometry nodes. This is conceptually quite straightforward, but only solves the problem for image textures. It would need some version patching on the existing image texture node. One thing unclear here though is how to deal with the active texture concept, as used for viewport display, baking, etc.
  • Add a mechanism to allow any texture node (or partial node graph) to be used as a function that can be sampled. This is a more powerful solution, but raises some design questions: how to specify which input is the texture coordinate for a node graph, whether it’s something we can implement by just modifying the node graph on the Blender side without renderers needing to be aware of it, whether this maps to an existing concept in MaterialX and whether it’s worth trying to be compatible with that, etc.

The idea presented in this topic falls somewhere in between, but I think it more clearly needs to be one or the other.
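The second approach can be sketched as a wrapper that turns a partial node graph into a sampleable function, with one named input socket designated as the texture coordinate. This is only a toy model of the design question (the function names and the dict-of-sockets representation are invented):

```python
def make_sampleable(subgraph, coord_socket):
    """Wrap a partial node graph (modelled here as a callable taking a
    dict of input-socket values) as a texture function uv -> color.

    `coord_socket` names the input that acts as the texture coordinate;
    rebinding only that socket per sample is the graph rewrite that, in
    the proposal, Blender could do without renderers being aware of it.
    """
    def texture(uv, **fixed_inputs):
        inputs = dict(fixed_inputs)
        inputs[coord_socket] = uv
        return subgraph(inputs)
    return texture


def example_subgraph(inputs):
    # Stand-in for a small node graph: an image lookup scaled by a strength input.
    u, v = inputs["Vector"]
    strength = inputs.get("Strength", 1.0)
    return (strength * u, strength * v, 0.0)


tex = make_sampleable(example_subgraph, coord_socket="Vector")
```

Any other inputs stay bound per call, so the same wrapped graph can be sampled at arbitrary coordinates by a consumer nodegroup, which is the property the image-override prototype only provides for image textures.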


This is more or less what I did in this prototype (minus proper UI work). It has the big advantage that it’s easy to implement, backwards compatible, and solves the most prevalent use case. (Procedural textures don’t really have the problems with repeating patterns which need these kinds of texture-mapping tricks.)

But it feels a bit half-baked to do it for image textures but not for any of the other texture nodes…

Making any node option exposable as a socket is something we should aim for. So extending that to image datablocks aligns with that goal, regardless of whether it fully solves this particular problem.
