Possible unification of everything related to texture+image nodes

Currently, node-based operations on textures or images can be performed with either Compositing or Material nodes. With upcoming features such as Everything Nodes, there will also be yet another dedicated editor specialized for textures.

The point is that these three editors all serve the same purpose, yet they feel duplicated (also in terms of menu design) while their capabilities differ. For example:

Compositing

  • Has Matte operations to mask pixels, plus other useful pixel operations such as Crop, and intuitive, easy-to-understand operations such as Rotate or Transform.
  • Can access any mask created in tracking and use it as a sort of “shape drawing” feature.
  • Has no texture generators at all (perhaps the plan is to get advanced textures from the Texture Node Editor).

Shading

  • More or less the opposite capabilities of the Compositor.
  • No access to masking.
  • No intuitive pixel transformations (UV vector mapping can be used, but it is not exactly the same thing).
  • No pixel filters at all (such as the Blur or Bokeh filters the Compositor has).
  • Some basic texture generators (Brick, Voronoi) exist, but the set could be expanded further.

So the point of this thread is to consider whether it is a good idea to merge the concepts from all of these editors into a single one. While refactoring all of these modes at once seems an impossible task, what makes the most sense is to make the new texture editor highly advanced, supporting every possible image operation and borrowing ideas from everywhere.

Perhaps the existing editors (Shading + Compositor) can stay as they are, but in terms of usage they would have to be safely deprecated; one could instead rely on the texture editor for all of the advanced image generation and editing capabilities.

Those editors were one in 2.79.
They were just different views of the same editor.
But the textures they shared were Blender Internal textures.

Lots of generic nodes are shared between compositor and shading nodes (RGB, Value, Bright/Contrast, Gamma, MixRGB, RGB Curves, Combine/Separate RGB, Math, Map Range …).
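As a concrete illustration of one of those shared generic nodes, here is a minimal pure-Python sketch of what a Map Range node computes in its linear mode (the function name and `clamp` parameter are illustrative, not Blender API):

```python
def map_range(value, from_min, from_max, to_min, to_max, clamp=False):
    """Linearly remap `value` from [from_min, from_max]
    to [to_min, to_max], optionally clamping the factor."""
    t = (value - from_min) / (from_max - from_min)
    if clamp:
        t = min(max(t, 0.0), 1.0)
    return to_min + t * (to_max - to_min)

# Remap 5 from the range 0..10 into 0..1:
half = map_range(5.0, 0.0, 10.0, 0.0, 1.0)  # 0.5
```

Because the operation is just this coordinate-free arithmetic, the same node makes sense in the Compositor, in shaders, and in any future texture editor.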

So, when a node is not shared by node editors, there is a technical reason.

The split happened because the project is to create nodes for everything (modifiers, particles, constraints…). So the idea was to make it possible to have several node editors for different upcoming tasks, rather than a strong belief that the existing editors dedicated to the final render look needed to be separated.

There are technical reasons that can explain those absences.
Masks are vector shapes. You need a render engine able to handle vector textures for that.
Procedural textures are computed on the fly. They don’t have pixels to transform. You need to use math and mapping nodes for that.
But maybe there is room for texel filters.
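To make the point about procedural textures concrete, here is a minimal pure-Python sketch (the names `checker` and `rotated` are illustrative, not Blender API): a procedural texture is a function of coordinates, so "transforming" it means transforming the input mapping with math, not moving pixels.

```python
import math

def checker(x, y, scale=4.0):
    """A procedural 'texture': a pure function of coordinates.
    There is no pixel grid to crop, rotate, or blur."""
    return (math.floor(x * scale) + math.floor(y * scale)) % 2

def rotated(tex, angle):
    """'Rotating' a procedural texture means rotating the input
    coordinates (the mapping), not shifting any stored pixels."""
    c, s = math.cos(angle), math.sin(angle)
    return lambda x, y: tex(c * x - s * y, s * x + c * y)

plain = checker
quarter_turn = rotated(checker, math.pi / 2)

# The texture is defined everywhere, so it can be sampled at any
# resolution on demand, with no fixed pixel data behind it.
sample = [[plain(i / 8, j / 8) for i in range(8)] for j in range(8)]
```

This is also why pixel filters like Blur do not map cleanly onto procedural textures: there are no neighboring texels to average until the texture is baked to an image, which is where the idea of "texel filters" mentioned above would come in.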

In fact, they have been continuously enhanced since the start of the 2.8 refactor.
Many nodes have been expanded, many have been created and several will be added in future releases.
https://developer.blender.org/T72337

The same texture nodes used by Cycles and EEVEE should become usable everywhere in future releases.
https://developer.blender.org/T54656

Thanks for the input, really helpful information.

I might as well try to get involved in these projects since I have many ideas on the subject, as mentioned earlier.