This is where I’m coming into Blender dev from. I use it professionally as a side gig, but I have a rebellious streak a mile wide, and when something doesn’t work, or could be better in a way I don’t see anyone else noticing, I get this fiendish little urge to rally the troops for it, or at the very least poke around to see how I could do it myself (usually ineffectively).
For example, the baking workflow needs some serious love, but while that’s happening, being able to edit datablocks directly would both make baking far easier and open up more pathways for doing it… and it would also lead naturally to an internal media browser, which is an idea I’ve had since I compared the NLE and viewports to DaVinci Resolve, Premiere, Godot, and Unreal: they all have a panel just for browsing imported data. I’m mentally exploring it as I work on other, smaller issues.
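For anyone curious what I mean by “browse imported data”: everything in a .blend is already exposed as datablocks on `bpy.data`, so a first-pass inventory of what a media browser panel would surface is only a few lines of Python. A rough sketch, nothing official:

```python
import bpy

# Everything in a .blend hangs off bpy.data as datablock collections;
# a media browser panel would essentially surface this inventory in the UI.
for attr in dir(bpy.data):
    coll = getattr(bpy.data, attr)
    # Datablock collections (images, materials, actions, ...) are
    # bpy_prop_collections; this check skips methods and plain attributes.
    if isinstance(coll, bpy.types.bpy_prop_collection) and len(coll):
        print(f"{attr}: {len(coll)} datablock(s)")
        for block in coll:
            print(f"  {block.name} (users: {block.users})")
```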
I use Blender for 3D printing! I have setups that let me use it like CAD software with booleans, modeling software for the basic blocking, and animation software for testing mechanics! There’s a lot of potential once rigging nodes become more robust!
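To give an idea of the CAD-style setup: the core of it is just non-destructive Boolean modifiers, which you can drive from the UI or from Python. A minimal sketch (the “Bracket” object name is made up):

```python
import bpy

def boolean_cut(target, cutter, operation='DIFFERENCE'):
    """Attach a non-destructive Boolean modifier to `target`,
    using `cutter` as the operand."""
    mod = target.modifiers.new(name="CAD Cut", type='BOOLEAN')
    mod.object = cutter
    mod.operation = operation      # 'DIFFERENCE', 'UNION', or 'INTERSECT'
    cutter.display_type = 'WIRE'   # keep the cutter visible but out of the way
    return mod

# Example: punch a bolt hole through a (hypothetical) part named "Bracket".
target = bpy.data.objects["Bracket"]
bpy.ops.mesh.primitive_cylinder_add(radius=0.002, depth=0.05)
hole = bpy.context.active_object
boolean_cut(target, hole)
```

Because the modifiers stay live, you can keep nudging the cutters around like CAD features instead of committing destructive edits.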
Exactly this. ArmorPaint was looking promising until the developer abandoned it to work on their game engine instead (still salty about that…). This is why dedicated “layered materials” was such a hotly requested feature, even though it’s technically already possible with just some mix nodes. I’m also a proponent of bringing back texture nodes, though I’d mix them with, or angle them toward, the painting, UV, and baking workflows.
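For anyone who hasn’t tried the mix-node version of “layers”: each layer is literally a mix node with a mask plugged into its factor. A rough sketch, written against the legacy MixRGB node (newer builds would use the generalized Mix node instead; the noise texture here is just a stand-in for a painted mask):

```python
import bpy

mat = bpy.data.materials.new("LayeredPaint")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]         # created automatically by use_nodes

base = nodes.new('ShaderNodeRGB')       # layer 1: base paint
base.outputs[0].default_value = (0.2, 0.3, 0.8, 1.0)
rust = nodes.new('ShaderNodeRGB')       # layer 2: rust
rust.outputs[0].default_value = (0.45, 0.2, 0.08, 1.0)
mask = nodes.new('ShaderNodeTexNoise')  # stand-in for a painted layer mask

# The "layer" itself: a mix node whose factor is the layer mask.
mix = nodes.new('ShaderNodeMixRGB')
links.new(mask.outputs['Fac'], mix.inputs['Fac'])
links.new(base.outputs[0], mix.inputs['Color1'])
links.new(rust.outputs[0], mix.inputs['Color2'])
links.new(mix.outputs['Color'], bsdf.inputs['Base Color'])
```

Every additional layer is another mix node chained in by hand, which is exactly why this gets unwieldy and why a dedicated layer system was so hotly requested.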
Yeah, my OG idea for bringing back the texture node system was as a dedicated, optimized system for generating procedural noise as assets that could be used for painting on UVs, applying to volumes, or serving as fields for attributes and physics to react to… but right now, I do think something like Substance Painter would be best. Blender has a fantastic toolset for making procedural materials, but its system for simulating texture data (dynamic paint) has an odd combination of intended uses and vertex paint features.
If texture nodes come back, we could do things like simulate liquids collecting in pockets and gaps to make rust, then bake the result, or let the simulation run live during the scene for timelapses, stains, or even soaking fabric. It could also be implemented as a set of geometry nodes that produces a texture output, though I would prefer it write to texture data, not geometry data. It would be like the ever-present “pointiness or AO for procedural materials” workflow (see the sketch below), but far more useful and controllable, because fidelity would be based on texel density rather than geometry density, and physics could give the material more lifelike aging.
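For contrast, here’s the current geometry-based version of that workflow: pointiness fed through a ColorRamp to mask the wear. It’s Cycles-only, and its fidelity depends entirely on how dense the mesh is, which is exactly the limitation I mean:

```python
import bpy

mat = bpy.data.materials.new("EdgeWear")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

geo = nodes.new('ShaderNodeNewGeometry')  # has the Pointiness output (Cycles only)
ramp = nodes.new('ShaderNodeValToRGB')    # ColorRamp to isolate edges/crevices
# Pointiness hovers around 0.5, so the stops have to be pulled in tight
# before it becomes a usable wear mask.
ramp.color_ramp.elements[0].position = 0.45
ramp.color_ramp.elements[1].position = 0.55

links.new(geo.outputs['Pointiness'], ramp.inputs['Fac'])
links.new(ramp.outputs['Color'], bsdf.inputs['Base Color'])
```

A texture-node version would replace that pointiness input with actual simulated texture data at whatever texel density you bake at, no dense mesh required.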