Where Are Blender's Axes Defined?

Perfect example; look at the relation between the axes. If you put X horizontal to the right in front view, you would see Y running up and Z coming OUT of the plane. You have just indicated the way the axes are defined in mathematics, thank you! :slight_smile:

So where should I start looking to have this changed?

What you’re describing here is coordinate system handedness, which is separate from which axis is chosen as the up axis. Also note that Cycles uses a left-handed coordinate system: X to the right, Y up, Z pointing INTO the screen.

Front view in 3D math space.

What you describe is a top view. In math, when you draw a 2D graph, it’s really like drawing on the floor, not on the wall.

So where should I start looking?

Everywhere. It’s literally everywhere in the code; there isn’t just one place where it’s defined. You could spend a full year changing how the coordinates are defined and fixing the resulting bugs, OR you could spend one week getting used to a different coordinate system. Note that if you pick the former, you will make the majority of established users very upset when their files stop working and they have to relearn the coordinate system.

It IS a perfect example, because the way it is drawn on the Wolfram site is exactly how it is in Blender. Notice that the Z axis (as described in the text on the Wolfram site) is pointing vertically, and it uses the same handedness as Blender.

Just a side note: H also supports the Z-up option, and it seems it uses the “world” transform approach, where there is a master root that defines the orientation. It’s easy to see that effect by playing with the preference: when switching between Y-up and Z-up, the objects actually get rotated; they don’t maintain their place in space, they maintain their coordinates.
Overall, I think the best approach is to adapt to each application. I use both Unity and Unreal, and each one uses a different up axis as well.

There you have it, @hewi. Perhaps the way to make this work is to basically create a World orientation object. It would be a child of the current Z-up coordinate system, while the objects of the scene(s) would be children of the World orientation object. Then, whenever you choose “Global” as your transform orientation, objects would actually be transformed relative to this world orientation object. As long as you give the user controls for rotation and axis mirroring, they could have any orientation: Y-up, Z-up, or Y-at-a-42.7°-pitch-up would all be possible. The view grid and navigation gizmo would have to reflect this user orientation.

There will still be many places in the code that assume z-up, but maybe this could be a smart approach.
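Just to make the idea concrete, here is a rough user-level sketch of it in Python, using today’s API and a made-up empty called “WorldOrientation” (this is not how a real built-in feature would be implemented, since the grid, gizmo and transform orientations would all have to follow along):

```python
import math
import bpy

# A user-level approximation of the "World orientation object" idea described
# above: one empty ("WorldOrientation" is a made-up name) acts as the root,
# every top-level object becomes its child, and rotating the root changes
# which world axis the children treat as "up".
root = bpy.data.objects.new("WorldOrientation", None)  # object with no data = empty
bpy.context.scene.collection.objects.link(root)

for obj in list(bpy.context.scene.objects):
    if obj is not root and obj.parent is None:
        obj.parent = root

# Rotating the root +90° around X sends the children's local +Y to world +Z,
# so everything under the root effectively lives in a Y-up convention.
root.rotation_euler = (math.radians(90.0), 0.0, 0.0)
```

Note that the objects visibly rotate with the root, which matches the behaviour described above for switching the up-axis preference in other applications.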

Code can be full of Z-up assumptions, from physics to even geometry-editing code. An example from another app with a name starting in 3 and ending in x: capping (generating a mesh to fill a closed spline) assumes that local object space is oriented along Z, so trying to cap a rectangle (or any other closed spline) would not work if it wasn’t projected onto XY. This was awful when working with lines.
But yes, this invisible root node would be the “easy” solution, almost like the Scene Collection having a transform.

There may be a master root, but from this description it’s not obviously the case. Rotating the viewport camera rather than every object in the scene would have the same effect.

I don’t think it would be easy if you take into account all indirect effects of this. The mismatch between actual world space coordinates and those presented to the user would need to be accounted for in various places. Changing viewport navigation and cameras seems more isolated.

That’s why I say “easy”, from an outside perspective. I know that coding-wise this will be a nightmare.

Would something like this ever be considered for inclusion in master, anyway?

Why not? It is clearly something that users want, as this topic arises several times a year, always with the same discussion. As long as the submitted code uses good primitives that keep the code clear and make it obvious how to write future code in an “axis-choice aware” way, I suspect it would be accepted. But as we keep saying in this thread: this is a hard project, and I have my doubts anyone would want to spend the (potentially) months necessary to do it properly.

That may indeed be what happens behind the scenes. Since it’s something I don’t use, I never tried to understand it or dive into the inner workings.

Often requested, but hardly ever justified as far as I can see. It’s pretty much always a matter of habit, i.e. something that anyone can get used to. It’s not like anything is actually broken because of having a left-handed coordinate system with Z as the up axis. Exporters take care of the conversion if necessary, etc.

(Unless I misread something?)

Yes, exporters can handle the conversion, so it is just a matter of making sure that they all do (including ones written by external addon developers), and, as you say, having the user get used to the differences.
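For reference, this is roughly what that conversion looks like on the exporter side. A minimal sketch assuming Blender’s bpy_extras.io_utils.axis_conversion helper; the -Z forward / Y up target here is just an example convention, not tied to any particular format:

```python
import bpy
from bpy_extras.io_utils import axis_conversion

# Build a matrix that remaps Blender's convention (Y forward, Z up) to an
# example Y-up target convention, roughly what many game engines expect.
conv = axis_conversion(
    from_forward='Y', from_up='Z',
    to_forward='-Z', to_up='Y',
).to_4x4()

obj = bpy.context.active_object
# Pre-multiplying the world matrix expresses the object's transform in the
# target (Y-up) space without touching anything in the scene itself.
exported_matrix = conv @ obj.matrix_world
print(exported_matrix)
```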

I am not a user (much), and therefore the defense of this request doesn’t fall on me. As a developer, I like your line of reasoning. But we should understand the plight of people who spend most of their time in one app and then flit in and out of Blender. If the axis systems are different, I imagine the cognitive jarring as one goes back and forth is disturbing and maybe even hurts productivity. (Do people who deal with left-handed coordinate systems vs. right-handed ones get used to their objects looking mirrored in Blender? Or do they put on a Mirror modifier while working in Blender?)

You could argue that this is similar to keymap differences. People have to get used to them too. But at least for those, Blender has tried to make them configurable, and users are asking for the same flexibility about axis choice.

All of which is to say: if I thought this were an easy thing to do, I might consider doing it myself. But it is definitely not an easy thing to do.

Right-handed :wink: If it were left-handed, a lot more compatibility issues would arise.

I’m confused; I held my left hand up and it seemed to me like it was oriented the same way as Blender’s: X positive going to the right, Y positive going away from me, and Z positive going up?
Oh, is it pointing the index finger up? I was doing it like I was pointing a gun, index going forward.

@Howard_Trickey sure, I am taking no particular side. It just seemed like an enormous amount of work to solve something that’s just a matter of habit. But then again some people probably have to work with more different conventions than I do, and at the end of the day, I can imagine it being tiresome, switching between handednesses like that.

No, the index finger is used for the Z direction, the thumb is Y, and the middle finger is X.

Edit: that’s how I use it anyway, and that works. You can do different configurations, but they still need to match this figure in https://en.wikipedia.org/wiki/Right-hand_rule

It depends on how you are configuring your hand and which finger you are identifying with each of the axes. I like to use this method to determine the handedness: point your thumb in the +Z direction and curl the rest of your fingers. If you then rotate your hand in the direction of the curl, you want to be going from the +X axis to the +Y axis. If that happens with the left hand, you are in a left-handed system; if with the right hand, then a right-handed system.
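The finger-free version of the same check: a basis (X, Y, Z) is right-handed exactly when (X × Y) · Z > 0. A tiny sketch with mathutils (Blender’s own math module):

```python
from mathutils import Vector

def handedness(x, y, z):
    """Return 'right-handed' if (x cross y) points along z, else 'left-handed'."""
    return "right-handed" if x.cross(y).dot(z) > 0 else "left-handed"

# Blender's world axes: +X right, +Y into the screen (in front view), +Z up.
print(handedness(Vector((1, 0, 0)), Vector((0, 1, 0)), Vector((0, 0, 1))))  # right-handed

# The convention described earlier in the thread for Cycles' internal space:
# +X right, +Y up, +Z into the screen. Swapping which axis plays which role
# flips the sign of the triple product, hence the different handedness.
print(handedness(Vector((1, 0, 0)), Vector((0, 0, 1)), Vector((0, 1, 0))))  # left-handed
```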