Where Are Blender's Axes Defined?

It’s quite funny that even using your hand as reference to define handedness leaves different options :slight_smile:

As Paul said, the real nightmare is finding out how to handle a differently-handed system. Pretty much everywhere in the code where a cross product is calculated and the code then relies on which direction that result points – that is code that will not work if the system has the other handedness. Just to pick a random example from code I wrote in bevel:

/* Perpendiculars to the edge directions, lying in the planes given by
 * norm_v1 / norm_v2. Which of the two possible sides the cross product
 * picks is exactly where the handedness convention sneaks in. */
cross_v3_v3v3(norm_perp1, dir1, norm_v1);
cross_v3_v3v3(norm_perp2, dir2, norm_v2);
normalize_v3(norm_perp1);
normalize_v3(norm_perp2);

/* Two points on each offset line: the vertex moved sideways by the
 * offset amount, and that point moved again along the edge direction. */
copy_v3_v3(off1a, v->co);
madd_v3_v3fl(off1a, norm_perp1, e1->offset_r);
add_v3_v3v3(off1b, off1a, dir1);
copy_v3_v3(off2a, v->co);
madd_v3_v3fl(off2a, norm_perp2, e2->offset_l);
add_v3_v3v3(off2b, off2a, dir2);

An axis-aware fix to this code would be to replace either the cross products or the madd/add calls with functions that do one thing for left-handed systems and another for right-handed ones.
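
To make that concrete, a handedness-aware cross product helper could look roughly like this (a minimal sketch with an invented flag, not existing Blender API; the componentwise formula stays the same, only an interpretation-preserving flip is added):

#include <stdbool.h>

/* Hypothetical: some global or per-scene setting saying which
 * handedness the data is meant to be interpreted in. */
bool use_left_handed = false;

/* The componentwise cross product formula is identical in both systems;
 * what changes is its geometric meaning. Negating the result for a
 * left-handed system keeps "which side the offset goes" consistent. */
void cross_v3_v3v3_handed(float r[3], const float a[3], const float b[3])
{
  r[0] = a[1] * b[2] - a[2] * b[1];
  r[1] = a[2] * b[0] - a[0] * b[2];
  r[2] = a[0] * b[1] - a[1] * b[0];
  if (use_left_handed) {
    r[0] = -r[0];
    r[1] = -r[1];
    r[2] = -r[2];
  }
}

That extra branch is also exactly the kind of per-call test that causes the small performance hit mentioned below.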

Also, wherever a BMFace is made, the ordering of the vertices will have to change for a differently-handed system.

This sort of thing probably happens thousands of times in Blender’s code. It feels like a non-starter to me to try to identify them all and put in axis-aware fixes. There would also likely be a small performance hit, because those axis-aware functions now have to do a test. A more promising approach, I think, would be to keep all coordinates and faces as they are – assuming a right-handed system – and only change things on display and export (and probably API usage too): transforming coordinates before showing them in the viewport or rendering, finding all the display boxes that reveal underlying coordinates and transforming those before display, and transforming coordinates before delivering them to API functions.
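
As a minimal sketch of that boundary conversion (assuming the internal data stays right-handed Z-up and the target convention is, say, left-handed Y-up; the function name is made up):

/* Swapping Y and Z maps internal Z-up onto Y-up and, because it is a
 * reflection, also flips the handedness. Apply only at the boundary:
 * drawing, numeric display fields, exporters, API output. */
void coord_internal_to_external(float r[3], const float co[3])
{
  r[0] = co[0];
  r[1] = co[2]; /* internal Z (up) becomes external Y (up) */
  r[2] = co[1]; /* internal Y becomes external Z */
}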

I was missing the fact that each finger is supposedly associated with a particular axis! But even knowing that, there is indeed a little margin of error in the way the hand is held. I was looking at this picture, which seems to be alone in showing the hand held like a gun; all the others show hands pointing up. Aaaaanyway!

Wouldn’t inverting the X part of the vector work? It seems so simple it’s probably not quite right, haha.

The question to ask is: at what level should the two types of handedness be handled? Some operations on 3D models can be done independently of the handedness of the coordinate system, for example merging two vertices at their center – that’s simply (v_a + v_b)/2. But if you want to rotate a vertex v around an axis A by +45 degrees (say), then it will rotate in one direction in a left-handed system and in the opposite direction in a right-handed system. So even though the underlying math for the rotation is the same, the outcome has a different meaning. This already influences things like how mouse movement translates into a rotation: in a left-handed coordinate system the rotation angle needs to be the negative of the value in the right-handed system.
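
A self-contained sketch of that sign flip (not Blender API): rotating around Z under the two interpretations differs only in the sign of the angle you feed in.

#include <math.h>

/* Rotate the 2D point p around the Z axis by `angle` radians.
 * With the right-handed convention a positive angle turns
 * counter-clockwise when looking down the +Z axis; to get the same
 * on-screen motion under a left-handed interpretation, pass -angle. */
void rotate_around_z(float r[2], const float p[2], float angle)
{
  const float c = cosf(angle);
  const float s = sinf(angle);
  r[0] = c * p[0] - s * p[1];
  r[1] = s * p[0] + c * p[1];
}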

Another example is the polygon winding order Howard mentioned: if, in a right-handed system, you define the front side of a polygon as the side from which you see the vertices in counter-clockwise order (which is what Blender uses), then you would need a clockwise order in a left-handed system to get the same result. So you can’t really separate the math from the interpretation based on the handedness of the coordinate system, which makes it tricky: the code might need to handle lots of cases where handedness comes into play, and you would need two versions of certain operations.
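
That link between winding and the front side is just an edge cross product, e.g. (a minimal sketch):

/* Geometric normal of triangle (v0, v1, v2) from its vertex order.
 * Swapping v1 and v2 reverses the winding and exactly negates the
 * result, which is why the "front" side depends on both the winding
 * convention and the handedness used to interpret the cross product. */
void tri_normal(float n[3], const float v0[3], const float v1[3], const float v2[3])
{
  float e1[3], e2[3];
  for (int i = 0; i < 3; i++) {
    e1[i] = v1[i] - v0[i];
    e2[i] = v2[i] - v0[i];
  }
  n[0] = e1[1] * e2[2] - e1[2] * e2[1];
  n[1] = e1[2] * e2[0] - e1[0] * e2[2];
  n[2] = e1[0] * e2[1] - e1[1] * e2[0];
}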

So there’s lots of subtle effects involving handling coordinate system handedness. Luckily, most 3D modeling software uses the same handedness, i.e. right-handed. The different up vectors between models made in different software will then only cause a 90 degree model rotation around X when importing, so that’s fairly easy to fix.
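
That fix-up is the usual Y-up to Z-up rotation, which in code is just a component shuffle (a sketch, assuming both systems are right-handed):

/* Rotate +90 degrees around X: right-handed Y-up (e.g. glTF)
 * to right-handed Z-up (Blender). Handedness is unchanged. */
void yup_to_zup(float r[3], const float co[3])
{
  r[0] = co[0];
  r[1] = -co[2];
  r[2] = co[1];
}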

But in general it would be an improvement to have a common set of transformation actions that can be performed during importing and exporting, to handle changing the up axis, coordinate system handedness, and maybe even unit conversion (e.g. inches to cm). Currently, some importers expose explicit transform options (like OBJ), while others implicitly change the up axis (like glTF, which defines Y as up and gets transformed to Z-up during import).
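
Conceptually such a shared mechanism could be as small as one matrix plus a scale applied to every coordinate on import or export; a hedged sketch (the names are invented, this is not an existing Blender or importer API):

/* One generic import/export transform: an axis remap (which can also
 * mirror, if the handedness differs) plus a unit scale factor. */
typedef struct IOTransform {
  float mat[3][3];
  float scale; /* e.g. 2.54f for inches -> cm */
} IOTransform;

void io_transform_apply(const IOTransform *t, float r[3], const float co[3])
{
  for (int i = 0; i < 3; i++) {
    r[i] = t->scale * (t->mat[i][0] * co[0] +
                       t->mat[i][1] * co[1] +
                       t->mat[i][2] * co[2]);
  }
}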

The way I always learned in school / university was: thumb = X, index finger = Y, middle finger = Z. Then it doesn’t matter how you rotate your hand, the relation of the axes to each other (handedness) stays the same.

Thanks, I see how widespread a problem it can be…
@Zsolt_St yes, that seems to disambiguate it!

Welp, looks like it’s time for us all to move to Switzerland

:yum:

Sh*t, they’re Y-up!

The Netherlands should make a Z-up currency in retaliation!

Actually it’s rather ambiguous. That coordinate system points into space at an angle. You could easily turn the hand a bit to make either Y or Z up.

As everybody else in this (scientific) world
:wink:

But if you redefine Y as -Y you are good to go!

X up or bust.

Our country is so flat we only need two dimensions :wink:

I’d say the same applies to Estonia, but then we do have the Big Egg Mountain and the Small Egg Mountain (note: both are glorified hills)… :sweat_smile: