The Tyranny of Node-Graph Orientation

I work in several different programs that use Nodes.

Nodes are awesome. I love nodes. I spend so much time going back and forth between Blender, Notch, and TouchDesigner that I’m even starting to not mix up my shortcuts. Me and nodes are like this :crossed_fingers: My opinions on nodes are hella-informed.

Anyways, every now and then I find myself in a program like Pd, Houdini, or Nuke and am delighted at being able to have vertical graph sections. In the case of something like Nuke, it’s even more incredible because the nodes don’t intrinsically have an in side and an out side. You can freely arrange them in any sort of shape you’d like, with predictably mixed results:

One of my favorite things about Blender’s UI is the ability to quickly create layouts that accelerate my workflow. There have been many times when a vertical graph would have saved me some space and visual clutter, but the shoelace effect caused by the fixed routing orientation scares me off.

I don’t know what sort of UI design review might be happening for the nodes system at present, but it would be lovely to add ‘orientation agnostic’ to the list of potential objectives for a redesign.

Thank you!


You might have a look at this thread:

I thought about it. The main reason I didn’t is that I feel my rant is more about the node UI itself, irrespective of the system it currently provides an interface to.

But your point is taken. I’ll post a link to this and a terser note about graph orientation.

Currently, you can make a vertical graph, but it requires using Reroute nodes.
So, it is slow.
I suppose that a switch to change node orientation from left/right to up/down is a valid request.

An article has just been published about UI.

This issue does not seem to have been taken into account.

It looks like particle nodes are supposed to have left-to-right data flow and top-to-bottom lists.
So it seems that, for particles, inverting this to top/bottom data flow and left/right lists would not reduce the need for a large node editor.

For the rest, that is valid. For material or compositing nodes, that could make sense right now.
But if this UI change is targeted at the second milestone of the “Everything Nodes” project, the UI devs have time to improve the design of their proposal.


Blender’s nodes have the advantage that you can see text for each input and output socket, and that you can provide an unlimited number of sockets.

Vertical node systems typically impose a restricted list of sockets and don’t allow you to see text or values for each input or output.

A node like this can’t really work in a reasonable way in a vertical node tree system:

The fact that you can adjust values on the node itself is also a fairly unique advantage of Blender’s nodes. Many other node implementations make it so that users have to go somewhere else to adjust the settings for each node.

As for the ‘tyranny’ argument, all node trees can become complex, vertical or horizontal. The way to combat that is to improve features like Node Groups, so that you can encapsulate things better, rather than having 1000 nodes in one node tree.


Hi @billrey,

I was considering this point about the interaction fields and exposed pins, and wonder if it may be kind of a liability masquerading as a feature.

It does a great job of providing you with controls and useful info when you’re up close and working with only a few nodes on your screen, but it creates a lot of visual noise when you zoom out to get an overview of the larger network. Even when the node you pointed out is collapsed to potentially reduce visual complexity, you wind up with this:

That’s a whole lot of not very useful visual data.

It’s nice to be able to get to most controls (I’d love it if the ‘match movie length’ button were added to the Image node in the shader graph). I don’t think it’s that critical that each property have a pin icon directly next to it. If node links were made by dragging a cable onto the field you want to control instead of the adjacent pin, the user would have a larger target for making connections, and the pin could appear on the nearest edge of the node with a tooltip-style label instead.

A nice side effect of this might even be that minimized nodes minimize in a more useful way.


Agreed. From a UX perspective, I don’t think I would call “having an infinite number of inputs that you can see all at the same time” an advantage per se.

There’s probably a better way to handle the UX for large nodes like this. Consider all of the nodes we currently have and I think we can quickly come to the conclusion that the Principled BSDF is an outlier- most nodes are not that input heavy. Add to that the fact that all of the inputs are almost never used simultaneously (most people use a handful of inputs to make a basic dielectric PBR material) and you have an edge case on top of an edge case.

Perhaps there is some solution where all of the inputs are visible in the N panel and can be ‘exposed’ on the node based on the user’s intent? A checkbox next to the input name in the N panel seems a bit heavy-handed and awful, but I’m just rambling off some ideas here :slight_smile:

Either way, I don’t really have a horse in this race; I generally prefer left-to-right layouts because I have more experience with them, but from the few times I’ve used a vertical layout I could see the appeal to some folks. All that said, if we had the option to change the direction, I certainly wouldn’t complain.


I think another factor to consider in node graph orientation is display shape. Monitors are now much wider than they are tall, so you can view far more nodes in a horizontal orientation. This wasn’t much of a factor when monitors and TVs had a 4:3 aspect ratio.

Most users tend to have unique workspace panel layouts, based on task and preferences. It’s unrealistic to assume that the node graph will be in a wide pane just because it’s on a wide screen.

As Billrey mentioned, some nodes in Blender only work in the current layout, and other render engines’ nodes are like this as well. I find almost everything in Blender’s node editor better than the material nodes in any other program. (But that’s not the exact topic here.)

The fact that we’re looking at other node-oriented programs that aren’t quite about shading, but rather compositing, tells a lot. Anyway, I still think we should look at the advantages those layouts provide. I imagine many math-driven setups might be easier to understand and visualize with a less fixed layout, so we should still consider this. An example of a material that looks more readable would be nice, to let us see the potential better.

PS: Although this goes beyond the OP’s title, I recall some game engines have nice features that could be implemented; maybe @BartekMoniewski (on Twitter) can tell you more.

Every modern operating system also allows you to easily rotate a display to make it taller than it is wide, so the reverse logic could be applied. Walking through my studio, I see a LOT of vertical monitors. I personally think it’s weird, but it is popular.


A nice side effect of this might even be that minimized nodes minimize in a more useful way.


There’s probably a better way to handle the UX for large nodes like this. Consider all of the nodes we currently have and I think we can quickly come to the conclusion that the Principled BSDF is an outlier- most nodes are not that input heavy. Add to that the fact that all of the inputs are almost never used simultaneously (most people use a handful of inputs to make a basic dielectric PBR material) and you have an edge case on top of an edge case.

If anyone wasn’t aware of this option already:

You can hide unused node sockets. When I’m done working with nodes, I always tend to just select all nodes and use the “Toggle Hidden Node Sockets” option via quick hotkeys.
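For anyone curious what that option boils down to conceptually: as a toy sketch (plain Python with made-up `Socket`/`Node` classes and a hypothetical `hide_unused_sockets` helper, not Blender’s actual `bpy` API), hiding the unused sockets is just one pass over each node’s sockets:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Socket:
    name: str
    is_linked: bool = False  # does a noodle connect here?
    hide: bool = False       # hidden in the graph view?

@dataclass
class Node:
    name: str
    inputs: List[Socket] = field(default_factory=list)
    outputs: List[Socket] = field(default_factory=list)

def hide_unused_sockets(nodes: List[Node]) -> None:
    """Hide every socket with no connection; linked sockets stay visible."""
    for node in nodes:
        for sock in node.inputs + node.outputs:
            sock.hide = not sock.is_linked

# A Principled-style node: many inputs, only two actually linked.
node = Node(
    "Principled BSDF",
    inputs=[Socket("Base Color", is_linked=True),
            Socket("Roughness", is_linked=True)]
           + [Socket(f"Unused {i}") for i in range(10)],
    outputs=[Socket("BSDF", is_linked=True)],
)
hide_unused_sockets([node])
visible = [s.name for s in node.inputs if not s.hide]
print(visible)  # ['Base Color', 'Roughness']
```

The point is that the heavy, input-laden node collapses down to only the sockets that actually carry connections, which is exactly why the toggle cleans up zoomed-out views so well.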

And I completely agree with @billrey here. I’ve seen Nuke workflows and they seem like a hot mess to me, and everyone on the Blender team is opposed to floating panel windows that open up when accessing node sockets in a program like Nuke. I don’t hate the vertical aspect of it, just not fond of the execution.

However, I am quite fond of the VFX graph system implemented in the Unity game engine.

@billrey I’m aware that there is no need to pour much valuable time and effort into catering to everyone’s opinions on how this system should be implemented, but if there were a way to intelligently reroute node connections so they automatically line up in a pleasant manner, it would improve the readability of node trees.

A focus toggle so that when you select any particular node/group, all the incoming and outgoing connections are highlighted, darkening the backdrop and creating a stark contrast.

Sort of what happens when you group nodes.


Absolutely - there are many improvements to be made. Better auto-arrangement is one of them.

Unity’s blocks make sense for collecting things in a list - useful for our particle nodes.


I was not aware of that option and will certainly be using it in the future.

I think some of my OP is being interpreted as clamoring to be able to turn my graph vertical, when I really just mean that, ideally, I could orient the flow of my graph in any direction I choose. Moving objects around during visual programming is a central part of the concept/flow loop for me. Blender’s left-to-right node pin-out creates an implicit bias for graph layout. That’s not a crisis that needs immediate resolution (dramatic terms like ‘Tyranny’ notwithstanding), but I don’t think it’s something that should be accepted/defended just because it’s already built that way.

Nuke doesn’t have floating panels for accessing node info, in my experience. One of the rather nice things about it is that node parameter sheets are all docked into a stack on the right.

It’s kind of like an ‘N’ panel that shows multiple nodes’ properties. Panels can be added and arranged to suit different tasks.
edit: Yes, I know, that’s a pic of Natron…

Unity has one of the sexiest graphs out there. Another personal favorite for me is Luna:


I am joining this discussion, as there is more development towards node based workflows (geometry nodes, particle nodes, animation nodes) and also multiple addons which use node graphs.

A change towards a vertical design is necessary: in large node trees, a horizontal layout creates a much longer setup than a vertical one and is harder to work with. For example, a small setup of 10 nodes arranged horizontally is already quite long, while the same 10 nodes arranged vertically take less space. Scale that to 50 or 100 nodes and it makes a big difference in day-to-day work.

The Nuke screenshot above is a good example of this, as it has nodes running both ways, which is needed in real situations. To get this benefit, the design should also follow Nuke, with nodes used just to access their properties (which live in a separate panel) rather than keeping properties inside nodes, which can make nodes very large when they have many options. Small nodes without parameters allow a larger overall view while working, since there is no need to zoom in on individual nodes to change parameters. Nodes don’t need to display their inputs all the time, only when hovering the cursor over a node; possible output/input connections could be shown as a pop-up on mouse hover.

This should be the standard way, but it wouldn’t be limited to vertical graphs, as users could also create horizontal ones if they prefer. I actually think the current way and the proposed way could coexist via a node editor option to choose between “compact” and “old” node graphs, just changing which visual style is used: “compact” being the Nuke style described above, and “old” just like Blender has now, with bigger nodes that also display parameters and inputs.

Also, everyone who has used Nuke knows that its node graph’s superiority is largely down to how convenient it is to work with. Everything is really easy to select and connect. Blender and some other software have very small selectable areas and connection dots. Blender works quite well even when zoomed out, but output/input dots could “pop out” more clearly when hovering the mouse over them. Framing nodes and reroute dots also work much better in Nuke: to move a node out of a frame in Blender, the user needs a shortcut to disconnect the node from the frame, while in Nuke the user can just drag nodes in or out of the frame. Likewise, reroute dots in Blender are fixed in place, while in Nuke the dots are directly movable and can simply be dragged to a new spot, and they have a much larger selectable area. Nuke also does a good job of automatically aligning nodes when they get close enough horizontally or vertically to other nodes, creating clean, understandable setups. Nuke’s node graph might look uglier, but it is very user-friendly.

A lot of comparison to Nuke, but it has the best node graph there is. Houdini uses two types of node graphs: Nuke style in general, but its shaders are similar to Blender’s. So a combination of the two styles is also possible.


I disagree. Take UE4, for example: the average gameplay blueprint is easily over 100 nodes, and with a competent developer (not a spaghetti artist) it is quite easy to follow. I’m not saying that vertical layouts are bad, but your main point, that a vertical design is the only possible way to handle extremely complex node networks, is just… well… wrong.


Sure. Maybe that part of the text suggests quite strongly that I want only a vertical option. Actually, I think it is necessary to have the option for it, which is difficult with the current node graph. The mention in this thread’s first post of Nuke nodes not having fixed in and out sides is relevant here. I would also like an overall improved node graph, with small improvements to the handling of nodes, frames, and dots. Maybe this thread’s title (not mine), together with the quoted part of my text, made it sound a bit harsh.

As an example, I have a Blender material setup with more than the 100 nodes I mentioned, and I can’t zoom out enough to see it in full, as for some reason Blender’s node graph zoom is restricted to some maximum amount. This is just one example affecting big setups in general. If my node setup used smaller nodes, I could fit the whole thing in one view by keeping it very neat, better vertically for my liking, but also horizontally. :slight_smile:

Though I guess now that I’ve mentioned the restricted maximum zoom-out, someone will educate me and tell me how to get infinite zoom in Blender.

I work in blueprint.

Massive horizontal graphs are just as much of a pain there as they are elsewhere. Usually we wind up managing them by breaking them up into functions, macros, or custom event chains. In some ways this may actually encourage you to modularize your code more aggressively, but to @SakuP’s point, it’s just nice to be able to arrange things the way they’re laid out in your head instead of being forced to lay them out left to right.

I was actually looking at some of the UI workshop notes on nodes, and it seems there is some notion of using vertical linkages for certain types of relationships between node objects. This would be kind of like TouchDesigner, where data links are laid out horizontally while hierarchical relationships are wired vertically.

I feel like the freedom of arrangement is definitely a value-add in systems like Nuke and Houdini, but I don’t actually like how either of those systems has chosen to depict the ‘in’ and ‘out’ pins on nodes. So I see it as a UI design conundrum. I would love to see what @billrey and the other UI folks working on Blender might come up with if the layout of node graphs weren’t considered a given. :slight_smile:

As far as cribbing ideas from other apps goes, I wouldn’t even say it’s my highest priority.
When I’m working in Blender node trees, the two features I most often try to activate by muscle memory are ‘wiggle-out’ (Houdini) and ‘drag off a pin for autocomplete’ (UE).


I think the main reason graphs are horizontal is that it allows all pins to have their names right next to them. I agree that horizontal graphs are way too long, but imagine a node like the Principled BSDF shader with its input pins stacked on top of the node, without labels next to them. I’d lose my sanity as soon as I looked at it.

And the next thing: a lot of nodes are bigger vertically than horizontally; they are tall, not wide, which means stacking them vertically won’t help and would make things even worse.

In my opinion, nodes should get smarter and smaller.
You can hide unconnected pins (Ctrl+H). What about node properties that don’t have pins: should they be displayed in the node graph, or should they be visible only in the sidebar when the node is selected? The interesting thing is that this is already present in Blender’s node editor; it just needs some good defaults and “automagic”.

The bulkiness of nodes seems more fundamental to me. We kind of gave up abstraction for the sake of clarity and went back from writing code line by line to designing hardware-like printed-circuit-board layouts, but with nodes. And planning a node layout can be a huge overhead.


As of now, the design of graphs is 100% WYSIWYG, meaning that the entire graph is exactly what it looks like and all of the node properties are editable. There is strong coupling between how nodes look and how the graph works.

The trap here is that as nodes get ever bigger, you have to find ways to manage space more efficiently, for example with grouping or frames, which are handy ways to pack things. You end up making the information more compact and concise so that things are easier to understand from a high-level view.

In the picture, this setup is known to experienced users and immediately understood. It is a totally high-level approach using only collapsing and hiding. However, at some point, if you try to remember what you did there, or read a graph someone else made, you end up expanding everything and restructuring it so it can be understood.

Since it takes too many steps to do so, no one ends up compacting their nodes this much; they just take the graph as it is and resign themselves to lots of scrolling, or perhaps split the viewport many times to examine different parts of the graph.


So a new approach will have to be found to make things work in both views: a Default view as it is now, and a Compact view that shows the summary. In some sense, the Compact view would have a lot of potential to be aligned either vertically or horizontally, according to what each user prefers.
