Get Hex (Gamma Corrected) color

I am sorry, but that is not entirely true.

  1. They are not garbage values; they are simply three integer values written one after another in hexadecimal format, e.g. FFFFFF being 255, 255, 255 (pure white).
  2. Hexadecimal format is not archaic. It is an industry-standard format for displaying binary data: a byte, ranging in value from 0 to 255, can be represented with just two hexadecimal digits, e.g. 7F for 127 (see the sketch after this list). So, hexadecimal is used a lot by programmers.
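
To illustrate the point, here is a minimal Python sketch of how a hex triplet maps to three byte values (the FFFFFF and 7F values are the examples from above; the helper name is mine):

```python
def hex_to_rgb_bytes(hex_str):
    """Split a 6-digit hex triplet into three 0-255 integers."""
    hex_str = hex_str.lstrip("#")
    return tuple(int(hex_str[i:i + 2], 16) for i in (0, 2, 4))

print(hex_to_rgb_bytes("FFFFFF"))  # (255, 255, 255) -- pure white
print(int("7F", 16))               # 127 -- one byte fits in two hex digits
```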

What you are describing has nothing to do with hex; it relates to color space differences. Blender, as far as I understand, uses a linear color space for its color pickers. The displayed hex value is the picker's linear color converted to the sRGB color space. The hex string is therefore useful for quickly copying and exchanging the color between applications, because most of them expect an sRGB value.
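
To make that concrete, here is a sketch of the standard piece-wise sRGB encoding applied to a linear channel before it is printed as hex (this is the textbook sRGB transfer function; whether Blender uses exactly this curve or a plain gamma approximation is an assumption here):

```python
def linear_to_srgb(c):
    """Encode one linear channel (0.0-1.0) with the standard sRGB curve."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def linear_rgb_to_hex(r, g, b):
    """Format a linear RGB triple as the sRGB hex string most apps expect."""
    return "".join(f"{round(linear_to_srgb(c) * 255):02X}" for c in (r, g, b))

print(linear_rgb_to_hex(1.0, 1.0, 1.0))  # FFFFFF
print(linear_rgb_to_hex(0.5, 0.5, 0.5))  # BCBCBC -- linear mid grey is not 808080
```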

The reason it should be exposed to the API is that, for example, when writing an importer/exporter for a game format, one might need to import some colors. And Blender expects them to be in linear color space, so they have to be converted beforehand with the reverse of Brecht's formula from the post above. Additionally, when mapping values from GLSL or other shaders to Blender's nodes, the input colors have to be converted back to the sRGB color space, since the shader math is meant to work in that space, not in linear.
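
As a sketch of that import path, assuming the reverse formula meant above is the standard sRGB decode (`hex_to_linear` is a hypothetical helper name, not an existing API):

```python
def srgb_to_linear(c):
    """Decode one sRGB channel (0.0-1.0) back to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def hex_to_linear(hex_str):
    """Parse an sRGB hex triplet into linear floats suitable for Blender properties."""
    hex_str = hex_str.lstrip("#")
    srgb = (int(hex_str[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    return tuple(srgb_to_linear(c) for c in srgb)

print(hex_to_linear("BCBCBC"))  # ~(0.503, 0.503, 0.503) -- back to linear mid grey
```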

In fact, when writing my own addon to support a custom game format, I had to deal with these issues a lot, as they are not clearly documented anywhere. There is no information specifying the color space used by the color pickers (e.g. the vertex paint color picker somehow uses sRGB, while the rest, including custom ones, seem to use linear).
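
For example, a custom color property (which, per the above, appears to expect linear values) would be fed like this. This is only an illustrative sketch: the property name is made up, and it reuses the hypothetical `hex_to_linear` helper from earlier.

```python
import bpy

# Illustrative custom property; subtype='COLOR' draws a color picker in the UI.
bpy.types.Object.game_color = bpy.props.FloatVectorProperty(
    name="Game Color",
    subtype='COLOR',
    size=3,
    min=0.0,
    max=1.0,
)

# A hex color read from a game file is sRGB and must be decoded to linear first;
# assigning the raw sRGB values would make the picker show a too-bright color.
obj = bpy.context.object
obj.game_color = hex_to_linear("BC6A2F")  # hex_to_linear from the sketch above
```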
