Improve Cycles/EEVEE For Procedural Content Creation

This thread is for my GSoC 2019 project “Improve Cycles/EEVEE For Procedural Content Creation”. You can find the proposal here and the wiki page here.

This thread serves as a discussion hub for the project and a place where I share important information like project progress and documentation.


Weekly reports can be found in the reports thread.


This checklist provides a general overview of my progress against the deliverables outlined in the proposal:

  • [x] Dynamic socket hiding.
  • [x] Variable inputs for Mapping node.
  • [x] More vector math operations.
  • [x] Clamp node.
  • [x] Map Range node.
  • [x] White Noise.
  • [x] Improve and introduce more noise types.
  • [ ] Wider access to properties.
  • [ ] Spline info node.

Community Bonding

My project is not monolithic in nature; it relies on the needs of end-users. So it is essential that all users engage in the discussion to reach the most satisfactory result.

Procedural Textures

The first point of discussion on our agenda is procedural textures. Please read the following discussion document and weigh in with your opinion in the thread.

Mapping Node And Vector Socket

The following document describes my initial plan for the Mapping node, along with some possible improvements to vector sockets.

Thanks! And feel free to contact me in any way.


Spline info in the shading area is a great idea. What about an input from the compositor itself just like image textures? I would find it very helpful to be able to use masking along with other things inside the shading network. Thank you and good luck with your project!

I don’t think that this is something we will be considering in this project. But thanks for your input!

I think there should be a good solution for creating procedural scratches. By that I mean longer, single lines that seem to follow the surface (I’m not sure how feasible the latter is).
While effects like edges with lots of small scratches can easily be created using the current noise textures, this method breaks down when trying to create long single ones, because you would have to find a way to mask out your scratch. Maybe I’m jumping to conclusions too quickly, but I haven’t seen a good procedural solution for that in Blender yet.

This could also be useful for creating rivers, rain streaks on windows, or lightning.
Here are some quick references.
There should be parameters for controlling thickness and detail.
Also note how these seem to taper towards their ends.

I have no expertise in maths, so I’m not sure how feasible this all is…
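Roughly, what I imagine could be sketched like this (pure Python, all names made up; just a sketch of the taper idea, not a proposal for the actual implementation): build a mask from the distance to line segments, with a taper factor thinning each scratch towards its ends.

```python
import math

def segment_distance(px, py, ax, ay, bx, by):
    # Distance from point P to segment AB, and the normalized position
    # t in [0, 1] along the segment (used for tapering the ends).
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy), t

def scratch_mask(px, py, segments, thickness=0.01):
    """Return 1.0 where a scratch covers the point, 0.0 elsewhere."""
    for seg in segments:
        d, t = segment_distance(px, py, *seg)
        # The taper factor 4*t*(1-t) is 1 mid-segment and 0 at both
        # ends, so the scratch thins out towards its tips.
        if d < thickness * 4.0 * t * (1.0 - t):
            return 1.0
    return 0.0
```

Jittering or noising the segment endpoints would then give the wandering, surface-following look from the references.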

Your idea about adding 4D noise is fantastic! Being able to loop 2D noise patterns and evolve 3D noise patterns sounds great, especially for the mograph community.
Replacing Perlin noise with OpenSimplex sounds good too, but please make sure that the scale settings of both roughly match (for compatibility reasons).
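For reference, the looping trick works by mapping time onto a circle in the two extra dimensions of a 4D noise, so t = 0 and t = period sample the exact same point. A minimal sketch (the `noise4` below is just a smooth placeholder I made up, standing in for a real 4D simplex noise):

```python
import math

def noise4(x, y, z, w):
    # Placeholder smooth 4D field, standing in for real simplex noise.
    return (math.sin(1.7 * x + 2.3 * y)
            + math.sin(1.9 * z - 1.3 * w)
            + math.sin(0.5 * x + 2.1 * z + 0.7 * w)) / 3.0

def looping_noise2d(x, y, t, period=1.0, radius=1.0):
    """2D noise that loops seamlessly in time.

    Time is mapped onto a circle in the two extra noise dimensions,
    so t = 0 and t = period sample the same 4D point.
    """
    angle = 2.0 * math.pi * (t / period)
    return noise4(x, y, radius * math.cos(angle), radius * math.sin(angle))
```

The circle radius controls how much the pattern evolves over one loop.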

Other things that would come in pretty handy:

  • Adding color input sockets to the color ramp node
  • I don’t know what the current situation on the asset manager etc. is, but managing node groups for tiling, blur, and texture displacement is pretty cumbersome. Maybe there is a way to present them to the user as a node, but handle them as a node group internally?
  • I think adding triangle and hexagon patterns to the checkerboard node or to the brick node would be nice for creating futuristic scenes and bathrooms
  • There should probably be a separate node for vertex color input instead of the Attribute node

I would be excited to see more of the Blender Internal noise textures brought to Cycles/EEVEE.
Also, an (animatable) seed value for the procedural textures.


I see there will be some improvements to spline texturing. I wonder if adding support for curve UV transform mapping could be in the scope of this project? This Blender limitation was described here.
I know it is not directly related to shader nodes, but it would still make texturing curves way easier.


Let’s say I have a UV-mapped image, and I somehow want to move the resulting mapped texture in camera coordinates, to fake a 2D edge. Or say I have a texture mapped in UV space and I want to add noise to the vector to fake a blur, but in global space, so it always has a constant blur radius. I don’t know how that would be possible with the current shading nodes, or if it’s possible at all. But the concept of having a coordinate system itself be mapped to another coordinate system seems useful.

The fact that textures require a vector coming before them is also limiting when making node groups that act as filters, like blur, parallax, or custom normal mapping. If you want to use such a filter with other textures, you have to duplicate the datablock and go inside to change the texture. Textures cannot be inputs of these filters, coming into the node group from outside, because the filter IS the vector, and the vector has to come before the texture. The concept of having a mapped texture that can then be mapped again after the fact could be reused here.

Basically, being able to move a patch of color that you see on screen, from any input, to another part of the screen using any other coordinates after the fact, or at least making that easier, would be great. Maybe it would require pixel displacement. I hope I’m explaining myself well and this makes sense.
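The constant-radius blur part could be sketched as averaging jittered samples, with the jitter radius converted from world units to UV units (plain Python; `texture` is any callable standing in for a texture node, and all names are made up):

```python
import random

def blurred_sample(texture, u, v, world_scale, radius, samples=16, seed=0):
    """Fake a blur by averaging texture samples at jittered UVs.

    The jitter radius is given in world units and divided by the local
    UV-to-world scale, so the apparent blur radius stays constant no
    matter how the UVs are stretched.
    """
    rng = random.Random(seed)
    uv_radius = radius / world_scale
    total = 0.0
    for _ in range(samples):
        du = rng.uniform(-uv_radius, uv_radius)
        dv = rng.uniform(-uv_radius, uv_radius)
        total += texture(u + du, v + dv)
    return total / samples
```

In node terms this is exactly the "add noise to the vector before the texture" trick, just with the offset scale derived from world space.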


This may seem like a trivially obvious point, but it seems like the single most common use case for procedural texturing is the fast creation of realistic materials, preferably without the need for UV unwrapping or manual texture painting. Most of the time this means static (i.e. non-animated) materials that attempt to emulate some kind of surface imperfections or variations - scratches, grease, dents, mould, fabrics, stucco textures, etc.

This means that (with some exceptions) procedural textures that look too uniform or too artificial will be of limited use to most users, either because they are incapable of producing natural-looking patterns, or because they require too much manual tweaking to do so.

I am making this point because I think it is important not to get too caught up in the purely mathematical side of the process - a cool noise that just looks like cool noise is of less practical use (for most users) than a noise that can actually be used to emulate something in the real world, be it marble, clay, or chipped paint.

I assume you’re already aware of this, but I would also direct your attention to this comment thread where users have made some suggestions:


My $0.02: I wish Blender had better upstream node caching during interactive slider editing, for better interactivity. For small trees it’s great, but when dragging a slider in a largish tree, the performance gets pretty bad. Because of this, I avoid the sliders and end up typing in the values instead.

The old compositing software from Softimage called Eddie used to have little color-coded progress bars on each node to show whether a node was being read from a cache or evaluated. It meant you could easily cache all upstream nodes at the input connection and get an interactive slider drag with only the downstream nodes being evaluated. If you were editing higher up in the tree, you could watch the nodes switch color while waiting for a viewport update.

Editing nodes now has that same delay, but I never know whether the tree is completely done evaluating or not.
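The caching scheme described above boils down to per-node caches plus dirty flags that propagate downstream, so only nodes after the edited one get re-evaluated. A tiny sketch (the `Node` class and its names are my own, not Blender's):

```python
class Node:
    """Minimal node with a cached output and a dirty flag."""

    def __init__(self, func, *inputs):
        self.func = func
        self.inputs = list(inputs)
        self.outputs = []
        for n in inputs:
            n.outputs.append(self)
        self.cache = None
        self.dirty = True
        self.evals = 0  # evaluation counter, to show caching at work

    def mark_dirty(self):
        # A change here invalidates this node and everything downstream.
        self.dirty = True
        for n in self.outputs:
            n.mark_dirty()

    def evaluate(self):
        if not self.dirty:
            return self.cache  # read from cache, no recomputation
        self.evals += 1
        self.cache = self.func(*[n.evaluate() for n in self.inputs])
        self.dirty = False
        return self.cache
```

With this, dragging a slider on a downstream node would leave all upstream caches untouched, which is exactly the interactive behavior being asked for.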


Not sure how feasible each of the following is, but they would definitely be useful:

  • Blurring / sharpening of textures.
  • Access to previous / next pixels to apply proper kernel operations.
  • Randomized circular gradient spots (like inverted Voronoi without overlap).
  • Procedural scratch / line noise (since stretched noise never looks good in all dimensions).
  • Fractal tree-like procedural noise.
  • Allow editing of nodes that aren’t used in the output without attempting to re-render the scene.


It might be too much for a summer, but I would love to see a curvature shader to identify concave and convex features of an object. The Bevel shader might be a good start for that.
Also, a more advanced ambient occlusion shader where one could specify a direction for leaking effects would help a lot.



I’d love to see loop nodes. My usage would be for randomizing texture projection, creating splashes/dents/whatever, to get something like this:

A few nodes come to mind:

  • Multi-Image Node: Load a folder of images and spread them over objects / mesh islands. A good example is MultiTexture Map (MultiTexture - CG-Source). It could double as a node that’s able to randomize color / values by mesh island.

  • Use Vertex Weight in Attribute node: Currently only Vertex Color is supported, would be great to have Vertex Weights supported as well.

  • Ambient Occlusion improvements: Add support for directed AO; this allows for things like grime dripping down, for example. Similar to the options in the VrayDirt shader. (How To Add Ambient Occlusion with VRayDirt For PhotoRealistic Materials)

  • Texture Unification: This is probably outside of the scope of this project, but figured I’d mention it anyway. Replace current texture solutions with one shared texture system. That way textures can be used in modifiers and shading at the same time. This allows for complex interaction between procedural modeling and shading.

If I think of more, I’ll add them later. :slight_smile:


A few years ago, Disney created SeExpr to create procedural stuff like noise or geo…
It can be a good reference for procedural content :slight_smile: (and it’s open source :wink: )


I suppose this doesn’t fall within the project, but for me, the main things I need for Blender to work as a material generator are mainly:

  • Viewport modes to see roughness, metallic, base color, … inside Look Dev.
  • Improvements in the interface and UX for baking maps; currently nobody knows how to use it properly, it overwrites maps, and it doesn’t export or save automatically.
  • New bake maps, like roughness, metallic, … I cannot make a lot of maps with Blender if I cannot export them to my project.
  • Allow baking curvature, also in real time in Cycles, as Brecht mentioned a while ago.
  • The ability to preview a particular point inside the node tree in the viewport.
  • An option on the normal map to invert the green channel.

Of course, noises and transform textures, and especially a WARP node.


The bake workflow really needs an overhaul… Great points, but I think that’s not really his focus…

I don’t know; improvements to Cycles and EEVEE include improvements to the viewport and the bake interface.

But those things are the base for making a material creator.

What would be valuable is the ability to create your own expressions. This could be a node that allows text input and the creation of your own inputs and outputs. You could then input, for example:
outputA = inputA * pow(inputB, 2) + 1
This would of course be restricted to the existing math operations. This way, instead of creating huge networks of Math nodes, a single node with an expression would be enough.

Adding inputs and outputs could work like it currently does with node groups - in the sidebar (with additional functionality to control the type).

An example would be Renderman’s PxrSeExpr.
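Just to illustrate the idea (not Blender code): Python’s `ast` module can safely evaluate such a restricted expression against named inputs, whitelisting only math operators and a few functions:

```python
import ast
import math
import operator

# Whitelisted operators and functions; anything else is rejected.
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv,
        ast.Pow: operator.pow, ast.USub: operator.neg}
_FUNCS = {'pow': math.pow, 'sin': math.sin, 'cos': math.cos,
          'sqrt': math.sqrt, 'min': min, 'max': max}

def eval_expr(expr, inputs):
    """Evaluate a restricted math expression against named inputs."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](walk(node.operand))
        if isinstance(node, ast.Call):
            return _FUNCS[node.func.id](*[walk(a) for a in node.args])
        if isinstance(node, ast.Name):
            return inputs[node.id]
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError(f"disallowed syntax: {type(node).__name__}")
    return walk(ast.parse(expr, mode='eval'))
```

An expression node could parse the text like this once and then either interpret it per sample or compile it down to a chain of Math node operations.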


Thumbs up for 4D noise :+1: This is also very useful for creating tiling 2D textures. I did some initial tests, see: texturing - How to make tileable procedural noise texture? - Blender Stack Exchange

Also note that even 6D noise would be useful if you want something that tiles in both x/y and time, e.g. a looping animated 2D tiling water texture.
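The x/y-tiling case works by mapping each of u and v onto its own circle in a 4D noise domain (looping time on a third circle is what then pushes it to 6D). A sketch, with a smooth placeholder standing in for real 4D simplex noise:

```python
import math

def noise4(x, y, z, w):
    # Placeholder smooth 4D field, standing in for real simplex noise.
    return (math.sin(1.7 * x + 2.3 * y)
            + math.sin(1.9 * z - 1.3 * w)
            + math.sin(0.5 * x + 2.1 * z + 0.7 * w)) / 3.0

def tiling_noise2d(u, v, scale=1.0):
    """2D noise that tiles seamlessly in both u and v.

    Each of u and v is mapped onto its own circle, so (u, v) and
    (u + 1, v) (or (u, v + 1)) land on the same 4D point.
    """
    a = 2.0 * math.pi * u
    b = 2.0 * math.pi * v
    return noise4(scale * math.cos(a), scale * math.sin(a),
                  scale * math.cos(b), scale * math.sin(b))
```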

By the way, maybe you already saw this:


I have some suggestions; let me know what you think about them. (English isn’t my native language, so thanks for your consideration and for taking the time to read this :slight_smile: )

  1. “ReMap Node”: I don’t mean the one (or one similar to the one) that already exists; I want to be able to use vector manipulation in the middle of the node tree, not just at its start. For this, I suggest a node with two sockets, color and vector; this node would take the texture that has been generated and change it into a new texture with a default “generated” mapping. Until now, the way to accomplish that effect is making a node group, but in my opinion, my way would create a more efficient and flexible node tree.
  2. “Numerical View Node”: We already have the view node that converts any kind of information into a readable one by using an Emission node. What I suggest is a view node that takes the rounded value of the input texture and writes that value on the mesh in the viewport. The problem is that this is not very informative about the exact value, so I suggest the node have a pixelated option in which you can control how small each pixel is, and for each pixel the viewport would write its value. I posted this in RCS, but due to my poor English, I think nobody understood what I was suggesting (I know, kind of a sore loser; anyway, I still have the mock-up).
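The ReMap idea in point 1 amounts to treating one texture’s output as new coordinates for sampling another, mid-tree. A rough sketch, with plain Python callables standing in for texture nodes (all names made up):

```python
def remap(base_texture, coord_texture, u, v):
    """Sample base_texture at coordinates produced by coord_texture.

    coord_texture plays the role of the ReMap node's vector socket:
    its output becomes a fresh "generated"-style coordinate for
    base_texture, without wrapping everything in a node group.
    """
    new_u, new_v = coord_texture(u, v)
    return base_texture(new_u, new_v)
```

Chaining several `remap` calls would then give vector manipulation at any depth of the tree, which is exactly what the node group workaround tries to emulate.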