When images are saved into a file, there are two ways to store the color data: with or without gamma.
Utility - sRGB - Texture means that the color data in the file is stored with gamma. This is the case for most common image formats (PNG, JPG, TIFF, WEBP, TGA, etc.).
Utility - Linear - sRGB means that the color data in the file is stored without gamma. This is the case for more specialized image formats (HDR, EXR, float TIFF, etc.).
That’s all it is: it’s just based on how the image stores its colors. Blender needs to know which encoding the image uses, so it can properly convert it into ACES. This is also necessary for Filmic (which is why you need to select sRGB, Linear, or Non-Color when using Filmic).
If you don’t know what the image format is, you can look it up. The “HDR format” column tells you whether the file format is Linear (Yes) or sRGB (No).
That’s correct, it just gives you the raw data without transforming it (which is equivalent to Non-Color in Filmic).
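To make this concrete, here’s a minimal pure-Python sketch of what those three options imply for a stored pixel value. The function names are mine, not Blender’s; the sRGB decode is the standard piecewise formula:

```python
def srgb_decode(v):
    """Standard sRGB piecewise decode: encoded value -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def mark_as(colorspace, v):
    """Hypothetical sketch of what each texture-colorspace option implies."""
    if colorspace == "sRGB":        # "Utility - sRGB - Texture": stored with the sRGB curve
        return srgb_decode(v)
    if colorspace == "Linear":      # "Utility - Linear - sRGB": already linear, nothing to undo
        return v
    if colorspace == "Non-Color":   # raw data (normal maps, masks): passed through untouched
        return v
    raise ValueError(colorspace)

# A mid-gray pixel stored in an 8-bit PNG (~0.5 encoded) is ~0.214 in linear light:
print(round(mark_as("sRGB", 0.5), 3))
```

Note that “Linear” and “Non-Color” do the same thing to the numbers here; the difference is that Non-Color also tells the pipeline the values aren’t color at all, so no further color transforms should ever be applied to them.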
Let’s not use the term “gamma”. It is used ambiguously in different contexts, and it often confuses both the people using it and the people hearing it. I bet many of you don’t really know what the term “gamma” originally means.
The term gamma was originally just a variable in a power law function. Taken out of that power-law context, gamma doesn’t mean anything other than a Greek letter.
For example, say I write a math function f(x) = x + 1, and then I refer to the function as “x” wherever I use it, without ever mentioning the original function. Do you see the madness now?
The problem with using “gamma” without its context is that people now use the term while having no idea what a power law function is. They don’t actually understand what gamma stands for, so they use the word to mean whatever they think it means.
“With or without gamma”, what does that even mean? How can a power law function be without its variable? This is what happens when you use the word “gamma” nowadays: ambiguous meanings.
I guess what you mean is “an image can be saved with a linear encoding or a non-linear encoding”. Note that not all non-linear encodings are power law functions. For example, sRGB has two standard functions, one for encoding and one for display. (The original intention was to create a mismatch between encoding and display to darken the image for flare compensation, but this approach is somewhat outdated; the new-school approach is to use the same function in both directions as a “no operation”. This is covered in the Chris Brejon article I linked above.)

So yes, there are two sRGB transfer functions: one is a pure power law with exponent 2.2, the other is a piecewise function that darkens the shadows. The piecewise function cannot be described by a pure power law, so if you try to describe it using “gamma”, you will be completely wrong.
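To make the mismatch concrete, here’s a quick sketch comparing the standard piecewise sRGB encoding to a pure 2.2 power law (just the published constants; nothing Blender-specific):

```python
def srgb_piecewise_encode(c):
    """The standard piecewise sRGB encoding (linear -> encoded)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def pure_power_encode(c, gamma=2.2):
    """A pure power law: f(c) = c^(1/gamma)."""
    return c ** (1 / gamma)

# Near black the two diverge noticeably: the piecewise curve is a straight
# line there, so describing sRGB as "gamma 2.2" is an approximation that
# breaks down in the shadows while staying close around mid-gray.
for c in (0.001, 0.01, 0.18):
    print(c, round(srgb_piecewise_encode(c), 4), round(pure_power_encode(c), 4))
```

At c = 0.001 the two encodings differ by a factor of roughly three, while around 18% gray they nearly agree, which is exactly why calling the piecewise function “gamma 2.2” is wrong in the shadows.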
So guys, don’t say “gamma”; use proper words like “power law function”, “transfer function”, or “non-linear encoding” instead, depending on what you really mean.
Also, sadly, Blender still has the “Gamma” node and the “Gamma” setting in the Color Management panel under this ambiguous name. I hope one day they get renamed to a “Power Law” node and an “Inverse Power Law” setting in the CM panel to avoid the ambiguity. (And yes, they are inverses of each other. I don’t think people realize it, because they mostly use the term “gamma correction” to refer to both. You can see how confusing this has become.)
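The “they are inverses” point is just the math of power laws; a tiny sketch (generic math, not Blender’s actual node code):

```python
def power_law(x, gamma):
    """f(x) = x^gamma"""
    return x ** gamma

def inverse_power_law(x, gamma):
    """f(x) = x^(1/gamma), the inverse of the power law above."""
    return x ** (1.0 / gamma)

# Chaining the two is a no-op (up to floating-point error), which is why
# calling both directions "gamma correction" hides the fact that they
# cancel each other out.
x = 0.25
roundtrip = inverse_power_law(power_law(x, 2.2), 2.2)
print(abs(roundtrip - x) < 1e-9)
```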
AFAIK you are correct about that behavior; one thing though: that behavior is not what “Linear sRGB” means.
The colorspace option in Blender’s texture node has a full name: “Source Image’s Colorspace”. It’s a setting for you to mark what the original image was encoded in. OCIO then transforms the image from the colorspace you marked it as into the “scene linear” colorspace specified by the OCIO config. So “Linear sRGB” means the original image was encoded with the sRGB primaries but a linear transfer function. The conversion itself is just what OCIO does.
This is not really Filmic nor ACES, this is OCIO. That’s why in the beginning of the thread when people said something like “ACES means you take images from different colorspaces and unify them in the same working colorspace”, I told them this is the approach already with just OCIO.
I would prefer the term “encoding” rather than “format”.
Jesus, why is everyone fighting over a colorspace? Stop trying to convince people that ACES is some dark scam, that only the enlightened can see the hideous truth it hides (gamut clipping), and that Troy is the ultimate savior for coming up with Filmic to map it correctly, eh?
We don’t care.
If you work in the VFX industry, one of the workflows requires matching the CG gray ball to the on-set gray ball, which means I have to load an image as a reference in Blender and have a proper colorspace to pick from, not some weird workaround to cope with this. It wouldn’t even take much effort, and the devs would do something if people hadn’t started this “holy war” over color spaces. I just want to get the job done without the saturation and value being flung around just because I used the color wheel. HONORING THE OCIO. That’s it, nothing deep.

And I really do appreciate the effort Troy puts into raising awareness that ACES is not what people think it is; thanks to him, I actually understand its limitations, how to work around them, and even when NOT to use it. But that’s not what the OP asked, nor what anyone who wants ACES asked. A proper implementation means either HARDCODE IT, just like the colorspace for the color wheel is hardcoded in, OR… hear me out… honor the OCIO config in every aspect that matters. That doesn’t mean rewriting the whole compositor to fix this (which would help, btw); it just means making Blender more usable when OCIO configs other than Filmic are in use. And if they want to be blatant about it like Pablo, just say it. But in the name of Ton, don’t break the momentum of pushing Blender toward being a well-rounded industry tool.
I don’t think the OP agrees with you on that front
OK, joke aside. We are talking about image formation (forming an image from open-domain electromagnetic radiation data) and how people didn’t really want what they thought they wanted (which is why Chris asked people what they meant when they said they want ACES).
If you believe that’s what we are talking about, you didn’t get what I wrote then.
Filmic skews too! Therefore
This is not correct.
What we want is to raise awareness of the matter, so that more people are aware of the cons instead of blindly following the trend.
As to how Filmic skews, I think we have a perfect example:
Eary, I really relate to your mentality; knowing the technical aspects of art is a crucial step to leveling up. But knowing the technical debts of a piece of software in order to compensate for a problem does not solve the problem. I understand you completely, but what you don’t realize is that these color skews with bright lights are minuscule compared to the industry at large, and can mostly be fixed in post at least; even ACES has a couple of look transforms to fix some of them. Apart from that, could you teach me about this stuff in depth, including the calculations and whatnot, if I can contact you somehow?
Jesus, man, you’re regressing back to square one. Listen to me.
CAMERAS… LOTS OF CAMERAS. OK, just hear me out: ARRI, RED, URSA Mini, Canon ---------> IDT. Then… the CG renders, with textures, HDRIs, and whatnot being in ACES ------> ACEScg. I don’t care if it’s just another linear space, I don’t care if it’s just a fancy Rec. 2020, I don’t care if they lied about it being a system, and I don’t care that it has clipping issues. I just want a clean pipeline that can be documented and is also future-proof (don’t quote this and go on about ACES being a lie; that’s not the point).
AFTER THAT… footage + CG + color checker = happy client. The client doesn’t want to hassle with anything other than what they know; the client wants the VFX delivered back in ARRI Log, Rec. 709, sRGB, or whichever of the sheer number of colorspaces they’re comfortable with. I SEND IT through that. Period. It doesn’t take any hidden-layer analysis to use this system. No one cares about minor issues that aren’t dominant in common production renders; as far as I’m concerned, most problems arise when working with extreme light sources. No one asked ACES to be perfect, and from camera sensors to RAW to computer files to display devices to the eye, there are way too many variables to get bogged down by the level of precision you expect of ACES just because it calls itself an encoding system.

This is Blender’s forum. What most people are asking, even if wrongly phrased, is: let’s not have s****y color management, and it doesn’t need to come from die-hard ACES fans or Troy worshippers to implement this. The fact that the reference image clips everything to 8-bit is a deal-breaker, let alone having a color wheel with hardcoded Rec. 709 baked in that goes crazy at the slightest touch. This is a problem you’ll only ever encounter if you ARE IN PRODUCTION, like actually trying to change your life with this, and as the guy said, playing with “the big boys” requires the use of ACES. Stop citing that article over and over like it’s the bible of color management. I’m not saying it’s wrong, but that defeats the whole purpose, and it gives the devs a lot of wiggle room to slack off on stuff like this. If Blender had an enterprise version like Red Hat for Linux, anyone would buy it in the blink of an eye just to get decent color management and a bit of a standard workflow. And since you are sooo into defending Filmic, I would love to know which industry or studio you work for that uses Filmic for VFX I/O.
Do you have any idea how easy this is to fix in any software that is aware of pixels and hue shifts? You get these comments from creative directors or whoever is in charge, and I change the light source to match what I want. Just because I used a specific color from my color wheel and it didn’t show me that exact same color at a higher brightness doesn’t give me a reason to ditch a whole workflow, especially over something as childish as this, man. Come on, man, for someone speaking this technically, you should know better. I feel like I’m pushing this toward something it was never intended to be, so I’d love to end my discussion here, brother, much love. I swear one of these days someone is going to say “Then use Maya” inside a Blender forum, lol.
Cool. Where is the Canon Log format? Where are the P3 formats? Where is ADX? Have you ever been in a production where you had to be sure YOU COULD, IF YOU HAD TO?
Man, I get it; that’s not the point. You’re providing a painkiller for a critical condition, treating the symptom instead of the cause. Anyway, please forgive my raised tone and language. I hope you know how frustrating this gets under deadlines and whatnot, and having to do manual work just because something obvious wasn’t implemented in an open-source software, even after you contribute what you can, is a bit depressing, to be honest.
Again, I am not defending Filmic; I even went ahead and gave you the Halloween Spider demo file as an example to show you Filmic’s flaw.
What I replied to you was basically anchored around what the OP said about wanting a workflow to take footage from these colorspaces, linearize them to the scene-linear space of ACEScg, and then deliver the result as ACEScg EXRs. If that’s not what you want, then maybe you can see that many of you don’t actually want the same thing when you all say “we want ACES”.
Stop listening only to yourself and read what I’m writing with an open mind. It’s not about delivering in ACEScg using an OCIO file; it’s about fixing Blender’s color management. What would prevent me from using the current ACES configs if the only thing I wanted was to deliver in ACEScg?
If that’s the case, we are actually on the same side here: a native ACES implementation is not going to improve Blender’s color management; it’s the other way around. Fixing Blender’s color management is what will make the use of configs like TCAMv2 or ACES practical.
We are in agreement here.
It has never been a Filmic vs ACES thing. It has never been.
That’s not the case; a stripped-down ACES config is only 25 MB uncompressed, 5 MB compressed.
If it’s not included by default in Blender, then the user has to download a 1,931 MB repo, which takes 16 minutes even on a very good connection and would be very difficult on a poor one (most people in the world don’t have good internet). Then they have to manually strip out the things they don’t need, otherwise the UI overflows.
Of course the user has no idea about how to do any of that, so they’ll have to spend hours learning, as opposed to just… changing a setting. Most artists are not very technical, they just want things to be easy.
Lots of other tools (Maya, Arnold, Mari, Resolve, Unreal, Unity, BabylonJS, etc.) support ACES out of the box; it’s a simple setting. But for some strange reason, people in the Blender community think that’s a crazy thing to do.
It’s like trying to argue that “Blender shouldn’t support FBX and OBJ; that would just bloat the Blender binary. People should manually install a 1.9 GB file if they want those things.”
It’s like you actively want to make things difficult and drive people away. That’s a great way to ensure that Blender is never taken seriously and that other, non-free software continues to dominate. That’s really not the attitude the Blender community should have.
The attitude Blender should have is: “How can we make the best software we can for our users? How do we make things easier and more productive, so our users waste less time? How can we be better than our competitors? How can we ensure that open source software wins over closed source software, for the sake of bettering the entire world?”