The point of the test is to make sure the darker skin is not too orange (“beefy”) and the lighter skin is not too pale. My version’s darker skin is arguably a little too orange; I might need to work on that. But you get the point.
Don’t modify your settings between the two tests; use the same one for both the light saber and the skin tone. ARRI and my version are both going through one unchanged view transform, so changing your settings between tests would make the comparison unfair.
Second: Abney effect compensation.
Look at the blue lights in your background; don’t they look purple?
It’s because of the Abney effect:
This means a user of your image formation algorithm will see the blue light looking a bit purple as it gets brighter. So now we need to purposefully shift the color a little bit to retain the sensation of blueness. The blue light in the light saber scene can be a good testing EXR, but here is another one:
Note the Abney effect is not just about blue; we need to shift all three primaries.
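Here is a minimal sketch of that “shift all three primaries” idea, in the spirit of the inset/rotation approach, but with made-up placeholder numbers rather than anyone’s actual matrices: pull the primaries slightly toward each other before the per-channel curve, then map back afterwards.

```python
# Hedged sketch: the INSET matrix and the tonescale are illustrative
# placeholders, not AgX's (or anyone's) production values.
import numpy as np

def tonescale(x):
    # Placeholder per-channel S-curve; any sigmoid works for the demo.
    return x / (1.0 + x)

# Hypothetical inset: each primary gains a little of the other two,
# which both compresses the gamut and bends the path that bright,
# saturated colors take on their way to white.
INSET = np.array([
    [0.90, 0.07, 0.03],
    [0.04, 0.92, 0.04],
    [0.03, 0.05, 0.92],
])
OUTSET = np.linalg.inv(INSET)

def form_image(rgb):
    rgb = np.asarray(rgb, dtype=np.float64)
    rgb = rgb @ INSET.T    # shift all three primaries
    rgb = tonescale(rgb)   # per-channel curve in the inset space
    return rgb @ OUTSET.T  # restore the original primaries

# A hot blue light; tuning INSET changes how much it skews toward
# purple versus desaturating while keeping the blue sensation.
print(form_image([0.05, 0.05, 8.0]))
```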
Use this one instead; I’ve improved it so the sweep is now always perceptually at the same brightness, whether the saturation is cranked right up or turned off altogether.
I have the feeling that a lot of the complaints people have about Filmic are caused by being used to the artifacts the Standard transform (i.e. plain clipping and maybe a bit of gamma) causes. A bit like people preferring the ‘warmer’ sound of vinyl over CD (which is likewise caused by the limitations of transferring a signal from an unbounded domain to a bounded domain in a certain way).
There’s not really anything wrong with that, of course. I like vinyl a lot myself. But I do think it’s useful to realize that you might just like the look of RGB clipping.
Also, there are people like @3di who just prefer doing their tonemapping in the compositor. That’s a valid preference, but not necessarily the best default setting.
@3di oops sorry, meant to reply to the topic. Not to you specifically.
If you disable the normalize inside the group, these settings appear to give very nice results on the colour chart. My eyes are getting tired, but I think I’ve avoided the troublesome 6:
I am going to bed soon as well, but here are some goals for your next attempt:
Skin tone: make sure the same setting works for both the skin tone and the sweep
Abney effect compensation
“Wider gamut rendering”: try to achieve an acceptable result without breaking the above tests (a rough sketch of one piece of this follows below). Here is what your current approach yields with a BT.2020 sweep:
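To make that last goal concrete, here is a minimal sketch (not the setup under test, and the compression step is just one hypothetical option) of why a BT.2020 sweep is hard on a BT.709-based working space: saturated BT.2020 colors land outside the gamut as negative components, and plain per-channel clipping of those negatives is exactly what skews hue.

```python
# The conversion matrix is the standard BT.2020-to-BT.709 one
# (ITU-R BT.2087); the compress() step is an illustrative placeholder.
import numpy as np

BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def to_bt709(rgb2020):
    return np.asarray(rgb2020, dtype=np.float64) @ BT2020_TO_BT709.T

# A saturated BT.2020 green goes negative in BT.709:
print(to_bt709([0.0, 1.0, 0.0]))  # ~ [-0.588, 1.133, -0.101]

LUMA = np.array([0.2126, 0.7152, 0.0722])  # BT.709 luminance weights

def compress(rgb):
    # Desaturate toward the pixel's own luminance, just enough to lift
    # negative channels to zero (hypothetical gamut compression).
    rgb = np.asarray(rgb, dtype=np.float64)
    y = float(rgb @ LUMA)
    m = float(rgb.min())
    if m >= 0.0 or y <= 0.0:
        return np.maximum(rgb, 0.0)
    t = y / (y - m)
    return y + t * (rgb - y)

print(compress(to_bt709([0.0, 1.0, 0.0])))
```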
Yes, you’re right about it; you really have a great understanding and sum up the thread instead of judging people with things like “a photograph can’t have whites”, “a photograph can’t be like that”, etc.
So I just keep saying I’d like some LOOKS added on top of Filmic that can give us that clipping, gamma, or some brightness explosions, so we can do everything from Filmic and add a little bit of those Standard artifacts to Filmic via additional LOOKS. That way people can see Filmic can do that too, through secondary transforms in the form of LOOKS. People keep saying it’s wrong when we want that control in Filmic, to have a little of those sRGB-style artifacts or imperfections (not too much), even though they add realism, like flaws do.
although there are some messy leftovers in this one
Since the recent introduction of the Convert Colorspace node, this is actually possible, although there were tons of pitfalls in learning how to do it correctly.
I’d love it if Blender had OCIO nodes for this reason. Just load the individual functions that various OCIO transforms and looks use into nodes you can plug in, mix and match, and adjust on the fly. You wouldn’t necessarily even need the various Filmic looks; you could instead use a single look that does it all.
You can already replicate all of that in the compositor with a lot of work, but it’s much slower than a 3D LUT approximation would be. The above node setup was completely unusable before the full-frame compositor; now it’s at least reasonably performant, but adding too many things after it still breaks down. It also takes a whole lot more know-how than just dropping in premade functions.
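For what it’s worth, OCIO itself can already do both of those jobs outside the compositor. A minimal PyOpenColorIO (v2) sketch, assuming a config that defines “Linear” and “Filmic sRGB” colorspaces, as Blender’s bundled one does:

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")

# Exact per-pixel application of a colorspace-to-colorspace transform:
proc = config.getProcessor("Linear", "Filmic sRGB")
cpu = proc.getDefaultCPUProcessor()
print(cpu.applyRGB([0.5, 0.2, 0.1]))

# Bake the same transform into a .cube file (the 3D LUT approximation
# mentioned above). For scene-linear input you'd also want a shaper
# space via setShaperSpace(), or the sparse lattice will band badly.
baker = OCIO.Baker()
baker.setConfig(config)
baker.setInputSpace("Linear")
baker.setTargetSpace("Filmic sRGB")
baker.setFormat("resolve_cube")
baker.setCubeSize(33)
with open("filmic.cube", "w") as f:
    f.write(baker.bake())
```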
To me that’s not an issue with any given view transform though. It’s an issue with the current state of Blender’s feature set for the purpose of color management.
Quite a few issues seem to at least theoretically be on the agenda in https://developer.blender.org/T68926, but even once that is completed I think more could be done.
It’d also be neat to get some matrix manipulation functionality. That way you could easily work in completely custom color spaces as needed; the Convert Colorspace node only works with presets. I’d like to specify my own custom color spaces and have Blender come up with the matrices.
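Here is a minimal sketch of what such a node could compute internally; this is the textbook derivation of an RGB-to-XYZ matrix from primaries and white point (the chromaticities below are sRGB/D65, used only as a sanity check):

```python
import numpy as np

def xy_to_XYZ(x, y):
    # Chromaticity (x, y) to XYZ at unit luminance Y = 1.
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(r_xy, g_xy, b_xy, white_xy):
    P = np.column_stack([xy_to_XYZ(*r_xy),
                         xy_to_XYZ(*g_xy),
                         xy_to_XYZ(*b_xy)])
    W = xy_to_XYZ(*white_xy)
    S = np.linalg.solve(P, W)  # scale primaries so (1,1,1) maps to white
    return P * S               # RGB -> XYZ

M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
print(M)                 # ~ the familiar sRGB RGB->XYZ matrix
print(np.linalg.inv(M))  # XYZ -> RGB

# A custom-to-custom conversion is then just inv(M_dst) @ M_src.
```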
But anyways, all of that is very in the weeds. If you want to build your own color formation chain, you absolutely can. But most people just want something that “works”, and Filmic, for most situations, works better than Display Native. And AgX, for pretty much all situations, works better than Filmic, especially in the most recent iterations.
I actually don’t mind having the Abney effect from blue if it gives more variation and detail. Is it physically accurate to reduce the Abney effect on blue? Isn’t it normal to have it? Also, is it the same thing as the Flourish version I read about on BA?
Abney is a perceptual effect. This isn’t about physical accuracy per se. The “physically accurate” thing to do would be to render spectrally, and then have a monitor that is able to accurately reproduce those spectra in each pixel, at any intensity.
Color Management is about getting something that makes sense in lieu of that “real” thing, given hardware limitations: We only get to use three colors rather than fine spectral control, and at a limited maximum intensity.
It turns out that linearly changing such a three-light system’s intensities does not translate into constant hues. I.e. we have, among others, the Abney effect.
His result really looks like ARRI, maybe even closer to it than AgX?
Maybe it only needs that dark-tone spreading or leaking from left to right on blue, red, and green, so it’ll be more contrasty, have more depth, and be more like ARRI. I liked it.
Don’t mind my terms about physical accuracy; I just meant perceptual accuracy. What I’m asking is: if this were the real world and we put a blue filter on a light, would we see it as purple or blue when it’s as bright as in the example, or would it have a little bit of purpleness? Or, what I’m really trying to ask: are we trying to make lights more predictable by reducing the Abney effect, so people get the renders they expect when using blue for a light?
This is really less about how it’s “in real life” as that is inherently an ill-posed problem given hardware limitations. It’s more about predictability and expectations.
That said, it’s a subtle effect and you gotta be careful to not over-correct. FWIW, while mostly an improvement, I don’t think the current iteration of AgX necessarily gets it quite right.
It’s tricky to evaluate though.