Even in sRGB they are broken.
Yes, agreed 100%. I think the official OCIO answer on the “color_picking” role would be: “we removed it from the next ACES OCIO configs because its behavior differs too much between applications.” Which, in my opinion, was not necessarily the right move.
I would have preferred that we settle on defining it precisely and make sure that all developers are on the same page. This is the current definition:
colors in a color-selection UI can be displayed in this space, while selecting colors in a different working space (e.g.
The ambiguous word here is “displayed”. I asked in one of the OCIO meetings to replace it with “set” or “defined”. But instead, they updated the docs with this warning:
Unfortunately there is a fair amount of variation in how applications interpret OCIO roles. This section should be expanded to try and clarify the intended usage.
So I guess it is up to the Blender users/devs to implement it the way they prefer. If I can give you any advice, here is the link for developers for MixingHelpers in C++.
And here is the mockup I did and presented at a couple of OCIO meetings some months ago :
I thought it would help clarify some concepts about color UI (not specifically in Blender, but for software in general).
Thank you Chris.
Now let’s have a drink.
You’re welcome! Anytime!
But you’re the one paying…
Nothing is wrong with asking for a better and more solid implementation of OCIO/ACES, as is done in other apps. Other users in this topic have already mentioned the rocky experience of trying to set up an ACES workflow in Blender. I am personally more interested in integration with other apps and in the studio workflows where this kind of thing is asked of me. Arguing about the look issues and why it sucks is not something that matters at the moment for my use case.
For reference, the roles used by Blender are documented here:
It’s not clear to me how using rendering for the working space like Maya would work, or what we should use scene_linear for when it is different from the rendering space.
In general I don’t believe in making a distinction between the working color space and rendering color space. If you’re already having to deal with the complexity and confusing consequences of those being different, you might as well go all the way and use spectral rendering.
The color_picking role is one we could try to make more compatible, but both the current OpenColorIO definition and the proposed one are unclear to me. However, my understanding is that Blender uses it the same way as the OpenColorIO MixingHelpers, that is, as the “approximately perceptually uniform space” mentioned in the docs.
I think the main thing we are missing in our color picker is a distinction between different types of colors (albedo, radiance, UI element), as well as options to use a different view or look than the scene for albedo and radiance type colors.
I should also mention that shader nodes like Blackbody, Wavelength and Sky Texture do not have their color space hardcoded; they use the aces_interchange role to figure out the appropriate chromaticities. Although there are some things we should improve there for better results.
It’s quite tricky to be compatible with other apps; we’d almost need to add options to make reading the config compatible with Maya, Houdini, etc. Or we could support e.g. blender_scene_linear overrides that can be added to a config. Ideally, though, the OpenColorIO definitions would be clarified and followed by all apps.
I think the biggest issue when using ACES is that you would need to convert all color textures and colors to match the color space. Hence there needs to be a way to globally influence the color space of a scene without adjusting each texture and shader; doing that by hand would be overkill.
Also, standard colors (RGB values) need to be floating point, not 8-bit. That is a big change for Blender, but nearly all other commercial software made that change years ago. The software I have worked with (3ds Max, Maya, C4D and Modo) all has global color management and support for LUTs in shaders and rendering. I miss that in Blender.
For example, in Blender you can set each bitmap texture to a different LUT, but a scene default is missing. So while in other software all textures are treated with a standard LUT defined by your scene settings (sRGB for color textures and Linear for the rest, with normal maps as an exception), Blender wants you to set everything manually. Good luck if you have a scene with 100 shaders and want to use ACES… Maybe someone will write an add-on, but I think this should be done differently.
Actually you don’t. We use ACES in Blender. When you import your texture, you just need to set it to sRGB texture and that’s it.
Ok, yeah, maybe I was wrong. But you would still have to adjust standard RGB colors, as they have no LUT.
You simply set the textures to the appropriate color space:
Utility - sRGB - Texture (if it’s RGB)
Utility - Linear - sRGB (if it’s an HDR or EXR)
Utility - Raw (for roughness maps, normal maps, etc.)
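If anyone wants to script this assignment for a big scene, the mapping above is just a lookup. A minimal sketch in Python; the dictionary keys are made-up labels for this sketch, and only the value strings are actual colorspace names from the ACES config (this is not a Blender API):

```python
# Map texture kinds to ACES 1.x "Utility" colorspace names, as listed above.
# The keys ("color_8bit", etc.) are hypothetical labels; only the values are
# real colorspace names from the ACES config.
ACES_TEXTURE_SPACES = {
    "color_8bit":  "Utility - sRGB - Texture",  # sRGB-encoded color maps (PNG, JPG)
    "color_float": "Utility - Linear - sRGB",   # linear color maps (HDR, EXR)
    "data":        "Utility - Raw",             # roughness, normal maps, masks
}

def colorspace_for(texture_kind: str) -> str:
    """Return the ACES colorspace name to assign for a given texture kind."""
    if texture_kind not in ACES_TEXTURE_SPACES:
        raise ValueError(f"unknown texture kind: {texture_kind!r}")
    return ACES_TEXTURE_SPACES[texture_kind]

print(colorspace_for("data"))  # -> Utility - Raw
```

An add-on doing bulk assignment would essentially loop over image nodes and apply such a table.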
This is exactly like setting Non-Color in Filmic. You can use exactly the same textures that you currently use with Filmic, no conversion needed. Your workflow is exactly the same; you do everything exactly as in Filmic (except with a very different look, of course!)
You can even use 8-bit PNG, though in that case you might get some clipping. This isn’t specific to ACES, it’s a limitation within Blender itself.
There are some issues, such as with the Blackbody node, the sky texture, the color picker, etc., but overall ACES works just fine in Blender. Of course improvements to the OCIO integration should be made, but OCIO is currently functional in Blender.
The real issue is the setup: in order to use ACES, you have to download a zip file which is several gigabytes, then edit the config.ocio file to remove the unneeded color spaces, then configure the OCIO environment variable.
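For the last step, setting the environment variable itself is a one-liner; the path below is only a placeholder for wherever you extracted the config:

```shell
# Point OCIO-aware apps (Blender, Nuke, etc.) at the ACES config file.
# The path is a placeholder; adjust it to where your config.ocio lives.
export OCIO="$HOME/aces_1.2/config.ocio"
echo "$OCIO"
```

Launch Blender from the same shell so it inherits the variable.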
The goal is that using ACES in Blender should be just as easy as using Filmic: you just select ACES from the dropdown list, no setup required. That definitely isn’t true today, but hopefully it will be that easy in the future.
Oh lord, am I glad I always find these types of threads super late and don’t have to waste my time unavoidably engaging with people enjoying the sights from Mount Dunning-Kruger. (Some of these “experts” even teach from there!) lol.
Just here to say, because it seems this is news to some: there’s an “All views” version of Filmic-Blender that works seamlessly in any software that supports OCIO. I use it in Nuke, Houdini, Vray, Resolve, Krita, Affinity, etc., and in all cases, like loading any config.ocio in any software, it’s just a couple of clicks. This has existed for years. I don’t know what people at the beginning were talking about with “upcoming versions” of it or it being “too hard to use in Nuke”. Here is the link to it:
No changes from the original Filmic, just all the 1D contrast LUTs for the looks merged into separate views, to account for all the software that doesn’t have a place for “Looks” in its UI. We made it and Troy released it three years ago, yet people keep making convoluted tutorials and false complaints about using Filmic outside Blender.
I’m not gonna engage in this pretense of a discussion on “Filmic vs ACES” as if that made any sense.
Just gonna say I’m surprised a thread on this topic can go on this long without anybody mentioning OCIO nodes for the compositor! (Something that would actually improve quality of life for Blender’s compositing workflow and simplify interoperability beyond it.) Hey, you could even load an ACES2065-1 AP0 encoded file with one of those and make it work with Filmic from Blender’s CM!
I am glad that you are able to use Filmic in other apps easily but this topic is not about how to use Filmic in other apps.
It’s called a forum; it was in answer to comments brought up in the thread. I am glad you’re able to play policeman of off-topic situations, but this is a thread about Filmic and ACES, so please stay on target.
This is not really off topic, it is directly related to the first post.
I never really understood the difference between Linear and Raw, but thanks to this topic I’m starting to get an inkling of understanding…
- Linear sRGB means the data is transformed into the currently active ‘internal’ rendering colorspace, whatever that may be. So even though it’s linear the primaries and intensities can be different.
- Raw means just take the value as is.
Did I understand this correctly?
Sorry for the semi-offtopic post. I don’t often have the possibility to have this many colornerds look at my questions
When images are saved into a file, there are two ways to store the color data: with or without gamma.
Utility - sRGB - Texture means that the color data in the file is stored with gamma. This is common for most image formats (PNG, JPG, TIFF, WEBP, TGA, etc.)
Utility - Linear - sRGB means that the color data in the file is stored without gamma. This is for more specialized image formats (HDR, EXR, TIFF float, etc.)
That’s all it is; it’s just based on how the image stores the colors. Blender needs to know which image format it is, so it can properly convert it into ACES. This is also necessary for Filmic (which is why you need to select Non-Color when using Filmic).
If you don’t know what the image format is, you can look it up. The “HDR format” column tells you whether the file format is Linear (Yes) or sRGB (No).
That’s correct, it just gives you the raw data without transforming it (which is equivalent to Non-Color in Filmic).
This I know I’m not that clueless :-D. The confusion for me was the difference between ‘linear’ and ‘raw’ formats.
I’m not sure exactly what the difference is, but I do know that Linear does some extra transformations, whereas Raw / Non-Color don’t do anything.
Let’s not use the term “gamma”. It is used ambiguously in different contexts, and it often confuses both the people using the term and the people hearing it. I bet many of you don’t really know what the term “gamma” originally means.
The term “gamma” was originally a variable: the exponent in a power-law function. Taken out of that context, gamma by itself doesn’t mean anything other than a Greek letter.
For example, imagine I write a math function f(x) = x + 1, and then refer to the function as “x” wherever I use it, without ever mentioning the original function. Do you see the madness now?
The thing about using “gamma” without its context is that people now use the term while having no idea what a power-law function is. Therefore people don’t actually understand what gamma really stands for, and they use the term to mean whatever they think it means.
“With or without gamma”, what does that even mean? How can a power-law function be without its variable? This is what happens when you use the word “gamma” nowadays: ambiguous meanings.
I guess what you mean is “an image can be saved with a linear encoding or a non-linear encoding”. Note that not all non-linear encodings are power-law functions. For example, sRGB has two standard functions, one for encoding and one for display. (The original intention was to create a mismatch between encoding and display to darken the image for flare compensation, but this approach is somewhat outdated; the new-school approach is to use the same function for both, making a “no operation”. This is covered in the Chris Brejon article I linked above.) So yeah, there are two sRGB transfer functions: one is a pure power-law function with exponent 2.2, the other is a piecewise function that darkens the shadows. The piecewise function cannot be described by a pure power-law function, so if you try to refer to it with a single gamma value, you will be completely wrong.
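To make that concrete, here is a quick sketch (plain Python, using the standard IEC 61966-2-1 constants) comparing the piecewise sRGB encoding with a pure 2.2 power law; they visibly disagree in the shadows:

```python
def srgb_encode(c: float) -> float:
    """Piecewise sRGB encoding (linear -> encoded), per IEC 61966-2-1."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def gamma22_encode(c: float) -> float:
    """Pure power-law encoding with gamma = 2.2."""
    return c ** (1.0 / 2.2)

# Compare the two curves at a few linear values, dark to mid.
for v in (0.001, 0.01, 0.18, 0.5):
    print(f"linear {v:>5}: piecewise {srgb_encode(v):.4f} vs 2.2 power {gamma22_encode(v):.4f}")
```

Near black the two differ by a factor of three or more, which is exactly why a single gamma number cannot describe the piecewise curve.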
So guys, don’t use “gamma”; use proper words like “power-law function”, “transfer function” or “non-linear encoding” instead, depending on what you really mean.
Also, sadly, Blender still has the “Gamma” node and the “Gamma” setting in the Color Management panel under this ambiguous name. I hope one day they get changed into a “Power Law” node and an “Inverse Power Law” setting in the CM panel to avoid all the ambiguity. (And yes, they are inverses of each other; I don’t think people realize it, because they mostly use the term “gamma correction” to refer to both. You can see how confusing it is now.)
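On the “they are inverses” point, the math is just this (plain Python, nothing Blender-specific; 2.2 is an arbitrary example exponent):

```python
GAMMA = 2.2  # example exponent, any positive value works the same way

def power(v: float, exponent: float) -> float:
    """Generic power-law transfer function for v in [0, 1]."""
    return v ** exponent

# Applying the 1/gamma power and then the gamma power round-trips the value:
encoded = power(0.18, 1.0 / GAMMA)  # "gamma encode" (brightens mid-gray)
decoded = power(encoded, GAMMA)     # the inverse: back to the original
print(encoded, decoded)
```

Both directions get called “gamma correction” in casual usage, even though one undoes the other.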
AFAIK you are correct about that behavior; one thing, though, is that the behavior is not what “Linear sRGB” means.
The colorspace option in Blender’s texture node has a full name: “Source Image’s Colorspace”. It is a setting for you to mark what the original image was encoded in. OCIO will then transform the image from the colorspace you marked to the “scene linear” colorspace specified by the OCIO config. Therefore “Linear sRGB” means the original image was encoded with sRGB primaries and a linear transfer function. The conversion itself is just what OCIO does.
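That “mark the source, convert to the working space” flow can be sketched in a few lines. Assume, purely for illustration, a scene-linear working space with the same primaries as sRGB, so only the transfer function changes and Raw is a pass-through (a real OCIO config also converts primaries, and the colorspace names here are placeholders):

```python
def srgb_decode(c: float) -> float:
    """Piecewise sRGB decoding (encoded -> linear), per IEC 61966-2-1."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# What the colorspace mark on a texture conceptually selects. Simplified:
# a real config transforms primaries too, not just the transfer curve.
TO_SCENE_LINEAR = {
    "sRGB": srgb_decode,    # undo the non-linear encoding
    "Linear": lambda c: c,  # already linear, nothing to do
    "Raw": lambda c: c,     # data channels, passed through untouched
}

def to_scene_linear(pixel, marked_as):
    """Convert one RGB tuple from its marked colorspace to scene linear."""
    transfer = TO_SCENE_LINEAR[marked_as]
    return tuple(transfer(c) for c in pixel)

print(to_scene_linear((0.5, 0.5, 0.5), "sRGB"))  # encoded mid-gray -> ~0.214 linear
```

This is also why, under this simplification, “Linear” and “Raw” behave the same on color channels: the difference is intent (color vs. data), which matters once primaries enter the picture.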
This is not really Filmic or ACES, this is OCIO. That’s why, at the beginning of the thread, when people said something like “ACES means you take images from different colorspaces and unify them in the same working colorspace”, I told them this is already the approach with plain OCIO.
I would prefer the term “encoding” rather than “format”.