Filmic as default is hurting the experience for people

I mean this in the specific context of people who blame color management for their images not looking right.

Or in other words, if someone puts up a side-by-side example where they just turned off color management, and claims that this alone makes their image look better, then clearly they need to acquire more competence in this area to understand what’s going on.

4 Likes

Then teach them to fix the lighting in their video compositing; don’t just whinge that Blender isn’t 100% compatible with newbies, please.

I imagine that, if you save your render as EXR, then it doesn’t matter what colour-management “look” you choose: the saved pixel values will be exactly the same.

You should be able to confirm this in an image editor that can handle deep floating-point pixels, like GIMP.

It seems to me that there are a lot of misunderstandings in this thread. I can’t say I understand it all, but after reading about this topic for some time now, I think I can clarify some things for you.

Before I write everything, I would like to link a rather long article, but I guarantee it is worth the read, by Chris Brejon: OCIO, Display Transforms and Misconceptions - Chris Brejon

And then about Filmic vs “Standard”

First of all, a showcase image showing why Filmic should be the default and why “Standard” should be avoided at all costs:


What does white mean? What did you mean when you said “white looks grey”?
Troy mentioned in his Hitchhiker’s Guide to Digital Colour that sometimes the simplest question gets overlooked, and it snowballs into other misconceptions.

In this context, I assume you mean R=G=B=1. Here comes the problem: does it really mean white? What is white, exactly? Again, think about it if you haven’t; if you overlook this, you will miss a lot more.

The answer is, in current Blender’s case, R=G=B means it is using the Rec.709 colorspace’s white point, D65 (6504 K color temperature, sort of like one of the many “hue” standards for different kinds of white). (By the way, this meaning of white would change once Cycles becomes spectral.) Note that it does not have to be 1: R=G=B can be any number ranging from 0 to infinity (this is why it is called open domain, or scene referred), and it would still be D65. Therefore, in an open-domain scene-linear context, white and grey are the same color at different emission intensities.

However, our monitors cannot emit infinite amounts of light. You can give the Nishita sky texture in Blender the real world’s sun strength, but can your monitor emit the same power as the sun? Can your monitor alone brighten an entire side of the Earth? Obviously not. So, as a color management system, something needs to be done about it. The “Standard” view transform was never designed to handle this problem; the standard sRGB EOTF was only ever supposed to deal with the range from 0 to 1, not 0 to infinity. Therefore the “Standard” view transform never worked for open-domain renderers like Cycles and Eevee.
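To make the clipping concrete, here is a minimal sketch in plain Python (the function and values are illustrative, not Blender’s actual implementation) of what happens when scene-linear values above 1 go through the standard sRGB encoding:

```python
def srgb_oetf(v: float) -> float:
    """Standard sRGB encoding, defined only for 0..1 input.
    Anything above 1.0 has to be clipped first."""
    v = min(max(v, 0.0), 1.0)  # this clip throws away all open-domain data
    if v <= 0.0031308:
        return 12.92 * v
    return 1.055 * v ** (1 / 2.4) - 0.055

# Scene-linear intensities 1, 4 and 16 all encode to exactly the same
# display value -- every bit of detail above 1.0 is gone.
for scene_value in (0.18, 1.0, 4.0, 16.0):
    print(scene_value, "->", round(srgb_oetf(scene_value), 4))
```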

What does Filmic do differently?
In Filmic Blender’s documentation we have this quote from Troy:

Filmic does two things:

  1. It compresses the scene referred linear radiometric energy values down to the display / output referred range. This aspect is known as a transfer function or tone mapping. The shape of the Filmic Base Log with a contrast aesthetic roughly emulates a photographic film curve.
  2. It compresses the gamut for high intensity values. As colour ratios increase in intensity, highly saturated ratios tend to be resistant to transfer function compression, which results in peculiar feeling imagery with some regions feeling appropriately over-exposed and others “lingering” behind. Filmic considers all colour values fair game, and attempts to blend colours into a consistent output that matches our learned expectations from film emulsion-like media.

For the first point, Troy made this image back in the day:


From: cycles render engine - How to get accurate colors with Filmic Blender - Blender Stack Exchange

I believe that this image by Troy is enough to understand the first point, also known as “tonemapping” (the name is actually misleading, because it has nothing to do with “tone”, but with intensity).
Filmic takes an open-domain intensity of around 16 and maps it back to the 0-to-1 range for sRGB encoding. Therefore, if you want display-referred white from Filmic, you need to use R=G=B=16, not 1.
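To illustrate that, here is a toy log-style shaper in plain Python. The constants are illustrative, not Blender’s real Filmic Base Log curve; the one property it shares is that mid grey stays inside the range and a scene intensity of roughly 16 lands on display 1.0 instead of clipping fifteen intensities earlier:

```python
import math

# Toy log shaper in the spirit of Filmic's Base Log (numbers illustrative):
MID_GREY = 0.18
STOPS_BELOW = 10.0   # how many stops under mid grey the curve covers
STOPS_ABOVE = 6.5    # 0.18 * 2**6.5 ~= 16.3, the "around 16" above

def toy_filmic_log(scene: float) -> float:
    """Map open-domain scene intensity to a 0..1 display value."""
    if scene <= 0.0:
        return 0.0
    stops = math.log2(scene / MID_GREY)          # exposure relative to grey
    t = (stops + STOPS_BELOW) / (STOPS_BELOW + STOPS_ABOVE)
    return min(max(t, 0.0), 1.0)                 # clamp only at the extremes

for v in (0.18, 1.0, 16.3):
    print(v, "->", round(toy_filmic_log(v), 3))
```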

The second point is a bit harder to understand, but let’s break it down. Look at the graph above made by Troy. Look at the sRGB transfer output and compare it with the original scene values’ RGB ratios: see how the ratios changed because the out-of-range channels are clipping.

This is the root cause of the skewing. The bottom line is that you cannot have a monitor as bright as the sun; you just cannot. A simple tonemapper may extend the 0-to-1 range to something larger, but it still has an upper limit, and when values hit that limit, they will still skew. Therefore, at least for the foreseeable future, this clipping problem will stay with us for a long time, if not forever. But as a color management system, something needs to be done about it.
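To see the skew in isolation, here is a minimal sketch (plain Python, illustrative numbers) of how per-channel clipping changes the ratios between channels, which the eye reads as a hue shift:

```python
def clip_channels(rgb):
    """Naive per-channel clip to the 0..1 display range."""
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

# A warm orange at two exposures. The scene-linear ratios are identical
# (8:4:1 in both cases), so the *color* is the same, only brighter.
dim    = (0.4, 0.2, 0.05)
bright = (4.0, 2.0, 0.5)

print(clip_channels(dim))     # (0.4, 0.2, 0.05) -- ratios preserved
print(clip_channels(bright))  # (1.0, 1.0, 0.5)  -- the 2:1 R:G ratio became
                              # 1:1, so the orange skews toward yellow
```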

The first step in solving a problem is to study it. Note that this part is covered in the Chris Brejon article I linked above, so if you are interested, please go read it.
The skewing problem has been dubbed the “Notorious Six” by Troy and others; I don’t know who came up with the name first, but that is what they call it. Here is why:


Here is a so-called “color sweep”: a range of different colors, increasing in intensity from left to right. This is the easiest way to see the “Notorious Six”. As the intensity increases, the individual RGB channels hit their “1” walls, the ratios between the three channels skew, and the wide range of colors collapses into only six colors.
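If you want to convince yourself where the “six” comes from, here is a hedged sketch in plain Python (using the standard colorsys module, nothing from Blender): push a full hue sweep to a very high intensity, clip each channel, and count the distinct colors that survive:

```python
import colorsys

# Sweep hue at a very high scene intensity, clip each channel to 0..1,
# then collect the distinct results. Almost every hue collapses onto a
# fully saturated corner of the RGB cube: red, yellow, green, cyan,
# blue, magenta -- the "Notorious Six".
survivors = set()
for i in range(360):
    r, g, b = colorsys.hsv_to_rgb(i / 360.0, 1.0, 500.0)  # intensity 500
    clipped = tuple(round(min(c, 1.0), 2) for c in (r, g, b))
    survivors.add(clipped)

print(len(survivors), "distinct clipped colors")  # -> 6
print(sorted(survivors))
```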
And here is Filmic:

What Filmic did was, instead of letting the colors skew according to the output device’s encoding, map them to more sane colors on purpose; in layman’s terms it “desaturates” them, though the more accurate term would be intensity-based gamut mapping. The mapping’s ground truth is called “chromaticity linear” (in layman’s terms, the destination of the mapping stays in the same “hue” as the original intent). Note that Filmic is not perfect, though: you can still see the “Notorious Six” before it goes to white. So it is definitely not perfect, but at least it tries to deal with the problem.
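As a toy sketch of that idea (this is not Filmic’s actual math, just the general shape of intensity-based desaturation): the closer a color’s intensity gets to the scene value that should display as white, the more it is blended toward the achromatic axis, so all three channels arrive at white together:

```python
def toy_gamut_compress(rgb, white_point=16.0):
    """Toy intensity-based desaturation (not Filmic's real curve):
    the brighter the color relative to the scene value that should
    display as white, the more it is pulled toward grey, so channels
    converge together instead of clipping one by one."""
    peak = max(rgb)
    if peak <= 0.0:
        return (0.0, 0.0, 0.0)
    amount = min(peak / white_point, 1.0)   # 0 = leave alone, 1 = white
    achromatic = peak                       # grey target with the same peak
    return tuple(c + (achromatic - c) * amount for c in rgb)

# A saturated red at increasing exposure drifts smoothly toward white
for exposure in (1.0, 4.0, 16.0):
    rgb = (1.0 * exposure, 0.1 * exposure, 0.1 * exposure)
    print([round(c, 2) for c in toy_gamut_compress(rgb)])
```

These values would still go through the tone curve from the first point afterwards; what matters is that the channels converge within the same hue family instead of clipping one by one into one of the six corners.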

Conclusion: Filmic should be the default instead of the “Standard” inverse sRGB EOTF, because it was designed to deal with the open-domain scene-linear data that Cycles and Eevee generate, while the “Standard” inverse sRGB EOTF was not.

23 Likes

I believe the thread makes a good point. However, the terms of the standard preferences, the technical specifications of the rendered picture, are not exactly absolute. You can’t be sure which settings are the ideal ones to choose.

However, since the render settings are customizable, it would be easier to chunk these concepts into “Render Profiles”, so that with only one selection you would be ready to go.

For the sake of example, as of now I can identify three very distinct render styles, with equivalent examples:

  • NPR: anime, cartoon, guilty gear, grease pencil style
  • 3D Animation: pixar, disney, toy story, sintel, sprite fright
  • Photorealism: iron man, avengers, tears of steel

So at the far ends you have NPR and Photorealism, but the middle zone is kind of neutral, so perhaps you would go for Filmic but pump up the environment lighting and give the materials some emission so things look clear. This is a place for lots of creativity and experimentation. But indeed, for the other choices, the settings to pick are very clear.

1 Like

5 posts were split to a new topic: Filmic and ACES

TL;DR: Filmic as default is fine, mostly because the templates for Grease Pencil, Video Editing, etc. switch to “Standard” as needed. Blender does need to give the user more control over colour information, though.

I can see the original poster’s point of view and understand how it could be confusing for new users trying to render NPR artwork and seeing everything look desaturated and grey. On the flip side, new users are going to be watching tutorials to learn, and a lot of Grease Pencil tutorials tell viewers to switch to Standard. It shouldn’t really be a problem; it’s just part of the process of learning new software.

For all those saying Filmic is “correct”: well, that’s “incorrect” from my point of view. Use cases are different for each user; sometimes Filmic isn’t what’s needed, even for experienced users and photorealistic renders. I tend to render out EXR whenever I can, comping/editing in a different application. Some of these renders are photorealistic, but I switch to “Standard” because I know the comp won’t be done in Blender and they don’t have access to the Filmic colour profile. Yes, you can load the Filmic OCIO config in other applications, but where’s the reverse of this for Blender? If the pipeline is using a different colour profile, we should be able to import OCIO files into a project to match the comp/edit/grade/rest of the pipeline. At the minute we’re flying a bit blind, especially with VFX work, where the CG should match the footage as closely as possible.

I’ve been following Troy on Twitter for a long time now. It’s clear he knows his stuff in regards to colour. For him, the creator of Filmic, to call colour management in Blender a dumpster fire over and over again makes me think that Blender should give colour management priority and tackle the issue once and for all.

8 Likes

As I understand it, the whole point of EXR is that the data is always linear, regardless of whatever colour profile you might attach. So there is no need to invert a colour profile; simply (re)interpret the pixel data according to whatever profile you want.

You should be able to check this in an image editor that can natively cope with floating-point pixel components, like GIMP.

It seems some misconceptions continue. Maybe my previous post was not clear enough, so here I am writing more.

No, it has nothing to do with the rendering styles you listed here. It only comes down to one question:

“Am I working with data that is ‘0 to Infinity’ open domain/scene referred, or ‘0 to 1’ closed domain/display referred?”

If you are 100% sure everything you are working with is within 0 and 1, and you will not use any operation that pushes the data beyond 1, like changing the exposure, then you are safe to use the “Standard” sRGB inverse EOTF, because you are pretty much working in the closed domain in this project. Otherwise, as is the case most of the time you work in Blender, you will be dealing with open-domain data with no upper limit, so you should use Filmic or TCAMv2 or whatever, just not the “Standard” sRGB inverse EOTF.
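If you are unsure which situation you are in, you can simply scan a render for values above 1. A sketch assuming the OpenEXR Python bindings and NumPy are installed; “render.exr” is a placeholder path:

```python
import OpenEXR
import Imath
import numpy as np

# If any channel exceeds 1.0, the data is open domain and the "Standard"
# sRGB inverse EOTF cannot display it faithfully.
exr = OpenEXR.InputFile("render.exr")  # placeholder path
float_type = Imath.PixelType(Imath.PixelType.FLOAT)

for name in ("R", "G", "B"):
    data = np.frombuffer(exr.channel(name, float_type), dtype=np.float32)
    flag = "-- open domain!" if data.max() > 1.0 else ""
    print(name, "max:", float(data.max()), flag)
```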

I would not say Filmic is “correct” because:

But using the Standard sRGB inverse EOTF on your open-domain EXRs is 100% certainly wrong. As I said in my previous post, sRGB was not designed for data ranging from 0 to infinity. In the era when sRGB was created, the open-domain/scene-linear workflow was not a thing yet, so it was never considered at all. It is completely wrong to use the sRGB inverse EOTF to view an EXR saved from Blender, 100% wrong.

As for “where’s the reverse of this for Blender”, I don’t quite understand what you mean. If you use EXR format, here is the answer:

So if your software down the pipeline uses the Standard sRGB inverse EOTF to view your imported open-domain EXR, it’s wrong. Either you have the wrong setting (as you are aware, you could have loaded the Blender OCIO config), or that software is simply not capable of dealing with open-domain data at all. If that’s the case, I suggest you abandon that software and find an alternative. Software with a hardcoded sRGB inverse EOTF view transform was never designed to work with your EXR files. Go find software that supports loading an OCIO config.
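For completeness, any OCIO-capable application can point at Blender’s shipped config and get the exact same views. A sketch assuming the PyOpenColorIO (v2) bindings; the path is a placeholder for wherever your Blender installation keeps its colormanagement folder:

```python
import PyOpenColorIO as OCIO

# Load Blender's OCIO config in an OCIO-capable tool and list the views
# available on the default display (Filmic should be among them).
config = OCIO.Config.CreateFromFile(
    "/path/to/blender/datafiles/colormanagement/config.ocio")  # placeholder
display = config.getDefaultDisplay()
print("Display:", display)
print("Views:", list(config.getViews(display)))
```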

4 Likes

Slightly off-topic maybe, but I want to check whether what I learned from reading this topic is correct:

To correctly interpret a linear EXR file, you still need to know which primaries are used, so you still need some color profile information. Or are the primaries for an EXR file set in stone in the EXR standard? Do I understand this right, or am I missing some subtle detail?

I’m no colour expert, so please correct me if I’m wrong. With the “where’s the reverse for Blender” question, I’m mostly thinking about VFX work. If you shoot RAW, you’re going to have a LUT in your editing software to interpret the high-dynamic-range images as something viewable on your RGB monitor (I guess similar to Filmic). Being able to bring the raw footage into Blender and apply the correct LUT to just that part of the image (if it’s set as the camera background or in a test comp) should be an available option. Applying the LUT to the rendered image would be useful too, but I’m guessing here you’ll tell me that Blender’s scene-referred linear workspace colours are different from the RAW images from a camera, and that this shouldn’t be done.

As for using Standard because I know that Filmic isn’t available in the comp: thinking about this more, it does make sense to provide the OCIO config for Filmic to the compositor and ask them to use it for the renders. It would be like using the correct LUT for your camera.

2 Likes

The point is, the whole industry was sRGB (no Filmic) and moved on to ACES. So ACES is the new standard. It’s not easy to convert to ACES, as nearly all textures, including game-related ones, are in sRGB, and with 8-bit color inputs in Blender it’s even worse.
So Filmic just ruins it even more, as you do not really see the results you would get in other render engines. After a long career, my advice is to have these two standards as the defaults. Currently Blender ships without ACES support, so that should also be changed.
One of the first things I did in Blender was turn off the Filmic LUT. It’s advice from professionals, not a duty.

4 Likes

ACES isn’t the standard, and it’s not even the most commonly used configuration. It’s basically just an adopted pipeline (colour management system, arguably), just as Filmic currently is in Blender. And Filmic doesn’t ruin your renders; that’s a hoax. Filmic is actually as good as ACES and ARRI K1S1, just not a common name in the VFX industry.

Well, I think everyone has a different picture of that based on their work experience. The people around me would shake their heads at your comment, but that’s totally OK with me. It all depends on what you are doing. I see no one using Filmic, or who has used it, hence for me it’s more of a problem that I remove in order to get the result I’m used to. That’s one of the aspects of our job: people use software for different purposes. So for anyone in the game industry this discussion makes no sense at all.
But I can say that everyone here working with Redshift or V-Ray on animations (not stills) is using ACES right now. Hence it’s a standard if you do jobs in the film and advertisement business. But again, you might see it from a different position, and I wouldn’t say you’re wrong. But I guess there is no paper that analyses the whole industry.

2 Likes

I was just getting set up to do this, but it turns out that the Resolve LUTs aren’t fully working :confused: It’s all well and good to tell people they should be doing things a certain way, but it’s often not practical or possible… I’m guessing this is a limitation on the Resolve side of things, but I’d wager a lot of editing/compositing apps have similar limitations, and it’s a bit ridiculous for me to tell clients what software they should be working in to support my renders :expressionless: https://github.com/sobotka/filmic-resolve

This is where you get it wrong.

LUTs are lookup tables, but you’re not solving the tonemapping with them. What you do in Resolve is the following:

  1. Load the linearized image as EXR.
  2. Use Filmic/ACES/whatever to tonemap it (that is, convert the linear HDR data to something displays can show and handle).
  3. Then apply LUTs or looks at will.

If you want to mimic Filmic, you need to take into account that it uses Filmic Log, so when you load Filmic, to get the same result as in Blender, you use the same settings.

And then you proceed. This way of working requires a full understanding of OCIO and what’s going on. It’s not as simple as loading a LUT and calling it a day.

2 Likes

Nope, definitely not as simple as loading a LUT and calling it a day. Thanks for the details, very much appreciated.

This is true. According to the Technical Introduction to OpenEXR, there is a chromaticities attribute that defines how to interpret the channels. This is given in the usual CIE chromaticity coordinates. In particular, it is possible for the channels to represent CIE XYZ values directly.

One implication is that, regardless of what the channels actually represent, according to this convention the pixel components are always linear with respect to CIE XYZ space.
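For reference, here is a small sketch of checking that attribute (it assumes the OpenEXR Python bindings; the file name is a placeholder):

```python
import OpenEXR

# Read the chromaticities attribute, if present. Per the OpenEXR docs,
# when it is absent the channels are assumed to use Rec.709 primaries
# with a D65 white point.
header = OpenEXR.InputFile("render.exr").header()  # placeholder path
chroma = header.get("chromaticities")
if chroma is None:
    print("No chromaticities attribute: assume Rec.709 primaries, D65 white")
else:
    print("Primaries/white point:", chroma)
```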

1 Like

I like the idea of being able to create render presets that could be saved and loaded. Or is this possible already?

As far as I know, there isn’t one; however, it could be created as an addon. If there were a good list of all the settings needed for a few cases, it would be feasible to include them.

In some of my scripts I switch a few render settings like this; very simple stuff actually, but it saves a bit of time: `bpy.data.scenes["Scene"].view_settings.view_transform = 'Standard'`
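Building on that, here is a minimal sketch of what such a profile switcher could look like (the profile names and setting groupings are hypothetical; `view_transform` and `look` are real bpy properties, and 'Standard', 'Filmic' and 'Medium High Contrast' are values current Blender ships):

```python
import bpy

# Hypothetical "render profiles" mapping a name to colour-management
# settings; extend the dicts with samples, engine, etc. as needed.
RENDER_PROFILES = {
    "NPR": {"view_transform": "Standard", "look": "None"},
    "Photorealism": {"view_transform": "Filmic", "look": "Medium High Contrast"},
}

def apply_render_profile(scene: bpy.types.Scene, name: str) -> None:
    """Apply one of the illustrative profiles to a scene."""
    profile = RENDER_PROFILES[name]
    scene.view_settings.view_transform = profile["view_transform"]
    scene.view_settings.look = profile["look"]

apply_render_profile(bpy.data.scenes["Scene"], "Photorealism")
```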

For example, since Filmic alters the viewport colors, it makes color sampling with the eyedropper give wrong values.