Blender Support for ACES (Academy Color Encoding System)

That’s a slightly outdated technique.
For starters, it’s a workaround. Fusion is like the After Effects of Resolve; you don’t want to make a Fusion comp for every file you input just to circumvent bad color management. You expect to be able to work reasonably with the files without firing up Fusion unless you need advanced tracking, chroma keying, or full motion graphics. It’s slow and cumbersome to work with. This is just a workaround.
And this also works only in an unmanaged colorspace. That means Color Management is turned off.
With Color Management turned on, this technique doesn’t work either. The image breaks just like before.

The problem is with the Filmic files, inside a Color Managed environment.

Anyway, my point is that I vote yes on having ACES inside Blender, and it happens that someone already did it, so it must not be that difficult to integrate. I guess. But I honestly don’t know.

3 Likes

You should be able to right-click the file in Resolve and bypass color management, then use a Color Transform node to bring it into the flavor of ACES you’re using, then at the end of the pipeline another Color Transform to bring it out of ACES. The input settings should be 709 and Linear (first Color Transform node), and for the last Color Transform out of ACES you might want to set the gamut mapping to saturation compression if going to a smaller color space.

edit: you won’t get the same Filmic look, because it is not baked into the EXR

edit 2: using ACEScc gives me issues, but ACEScct seems to work. Will have to look into this more
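For what it’s worth, the node chain above can also be sketched outside Resolve with PyOpenColorIO against an ACES OCIO config. This is a minimal sketch only; the config path and colour space names are assumptions based on the aces_1.2 OCIO config, not anything Resolve itself exposes:

```python
# Minimal sketch of the node chain above using PyOpenColorIO (OCIO v2).
# The config path and colour space names are assumptions based on the
# aces_1.2 OCIO config; adjust them to whatever config you actually use.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_1.2/config.ocio")  # hypothetical path

# First Color Transform node: linear Rec.709 -> ACEScct working space.
to_working = config.getProcessor("Utility - Linear - Rec.709",
                                 "ACES - ACEScct").getDefaultCPUProcessor()

# Last Color Transform node: ACEScct -> a display/output encoding.
to_output = config.getProcessor("ACES - ACEScct",
                                "Output - Rec.709").getDefaultCPUProcessor()

pixel = to_working.applyRGB([0.18, 0.18, 0.18])  # grading would happen in between
print(to_output.applyRGB(pixel))
```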

Resolve isn’t terrific with linearized data, as it began life as a closed domain app, not one built around open domain light data. To properly ingest open domain light data, a DCTL is required, as LUTs, the sole method of import on the non-Studio version of Resolve, are a tad destructive.
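A minimal sketch of why a plain LUT is destructive for open domain data: a LUT only samples a bounded input range, so any light value above that range is clamped before the transform even applies. The toy curve here is an arbitrary stand-in, not any real Resolve LUT:

```python
# Toy 1D LUT applied to open domain values. The LUT samples only the
# [0, 1] domain, so 16.0 collapses to the same output as 1.0; that is
# the destructiveness referred to above for open domain light data.
import numpy as np

lut = np.linspace(0.0, 1.0, 33) ** (1 / 2.4)  # arbitrary toy "display" curve

def apply_lut(x, lut):
    xi = np.clip(x, 0.0, 1.0) * (len(lut) - 1)  # clamp: here the data dies
    lo = np.floor(xi).astype(int)
    hi = np.minimum(lo + 1, len(lut) - 1)
    t = xi - lo
    return lut[lo] * (1.0 - t) + lut[hi] * t

print(apply_lut(np.array([0.5, 1.0, 16.0]), lut))  # 1.0 and 16.0 are identical
```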

Resolve also doesn’t support OpenColorIO yet. Support may be coming. This is exactly why Filmic Log was included, so apps that are closed domain can work with the post image formed data. Export to Filmic Log and grade from there in Resolve.
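For anyone wanting to try that route, a minimal sketch of switching the view transform from Blender’s Python console; the enum names assume the Filmic config Blender ships, so treat them as placeholders if yours differs:

```python
# Minimal sketch: set a scene's view transform to Filmic Log, so that
# display-referred exports (PNG/TIFF) bake the log encoding Resolve can
# then grade from. Enum names assume Blender's shipped Filmic config.
import bpy

scene = bpy.context.scene
scene.view_settings.view_transform = 'Filmic Log'
scene.view_settings.look = 'None'  # contrast gets applied later, via LUT
print(scene.view_settings.view_transform)
```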

As for the endless ACES discussions, it’s quite funny. ACES isn’t much of a colour management system, as is trivially demonstrable; compare an HDR output to SDR and you’ll see pretty quickly that labelling it a “colour management system” is rather hilarious.

If the software supports OpenColorIO, it works. This is why Filmic works and is shipped in Renderman 24; Renderman 24 supports OpenColorIO.

Feel free to crawl the many informed examples on ACES Central that demonstrate rather clearly how ACES is a rather poorly designed approach that does not work. Maybe ACES 2.0 will, but until then, it’s a bit of a joke.

1 Like

This approach gives a better starting point. It’s a pity that the CT has to be applied manually and that it bypasses Color Management, but it works better. I’ll run some tests to see if it holds up in grading.
Still, ACES may be supported just to make studios happy, and give Blender more relevance.
Again, I’m no fan of ACES, I just don’t think the endogamic approach of “Blender is so much better than anything else that we don’t care about silly standards” does Blender any good.

Do we want Blender to have a chance of being a Big Player for the Big Studios? Then add ACES instead of complaining about how screwed ACES is.
Do we want Blender to be used by architects? Then add lumen units for lights. It doesn’t matter if us nerds think watts per steradian is a more accurate unit because of irradiance. Commercial lights are specified in lumens, and architects need to set lights in lumens… (I get how lumens work compared to irradiance, let’s not get into that discussion too, it’s just an example).
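For illustration only, a back-of-the-envelope sketch of the lumen-to-watt mapping such a feature would need; the luminous efficacy of a real lamp depends on its spectrum, so the 177 lm/W figure is an assumed value, while 683 lm/W is the fixed theoretical ceiling at 555 nm:

```python
# Hypothetical lumens -> radiant watts helper, the kind of mapping a
# "lumen units for lights" feature would need internally. 177 lm/W is
# an assumed efficacy for a warm white LED spectrum; 683 lm/W is the
# theoretical ceiling for monochromatic 555 nm light.
MAX_LUMINOUS_EFFICACY = 683.0  # lm/W, monochromatic 555 nm

def lumens_to_radiant_watts(lumens, efficacy_lm_per_watt=177.0):
    return lumens / efficacy_lm_per_watt

print(lumens_to_radiant_watts(800.0))  # a typical "60 W equivalent" bulb, ~4.5 W
```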

Blender has taken giant steps towards being a more serious 3D software; let’s continue in that direction.
Or it can stay a toy for nerds and hobbyists.
A toy with solid science, though.

2 Likes

Please closely read what I’ve typed here.

For starters, there are close to zero applications, let alone Blender, that are colour managed. No really, it’s a deep and dark rabbit hole. The upside is that some Blender developers are pretty savvy on the pixel management front, as well as being acutely aware of breakage points.

Second, wider gamut working spaces come with huge problems with respect to the quality of imagery. Do some legwork and research, and one will find very quickly that the imagery from wider gamut rendering looks horrible. It’s a complex domain. When I was developing the earliest iterations of Filmic, I experimented with using a BT.2020 working space. And guess what? The downsides were so huge that I couldn’t release it in good faith.

Could I have slapped a curve on wide gamut content and screamed out in marketing speak GOOMBA ENCODING SYSTEM! Yes. But could I release it in good faith with people trusting it? Hell no.

So again:

ACES uses OpenColorIO, and it is trivial to integrate it if required.
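As a concrete illustration: Blender honors the OCIO environment variable at startup, so an ACES config can be swapped in without replacing the colormanagement folder. A minimal sketch, with placeholder paths:

```python
# Sketch: point Blender at an ACES OCIO config via the OCIO environment
# variable it reads at startup. Paths are placeholders for your setup.
import os
import subprocess

env = dict(os.environ, OCIO="/path/to/aces_1.2/config.ocio")  # hypothetical path
subprocess.run(["blender", "--background", "scene.blend", "-f", "1"], env=env)
```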

Maintaining that sloppy system, which comes with a large learning curve and generates imagery that is fundamentally broken by default, is not a good design choice.

Feel free to look at the endless complaints online of software that has force rammed that broken system down image makers’ throats and see how it’s going. Plenty of folks are pissed.

So in summary:

  1. Blender isn’t fully managed yet. Many pieces of software are far behind Blender on this front, so it’s not a specific knock on Blender.
  2. ACES is convoluted and sloppy. It really is a shit storm of a system for people looking to generate renders.
  3. ACES generates utterly asstastic looking broken work by default. As someone who has played the role of support person for a large number of people and smaller studios, this is not an overstatement. It comes with far more problems than people think.
  4. ACES does not work as a colour management system. Feel free to have anyone who thinks it does tackle the issues head on. TL;DR: Slapping a dumb curve on wide gamut primaries isn’t a system. It ■■■■■■■ sucks. Literally any and all values that escape the output gamut are device dependent (see the sketch just below this list). That means that it’s not managed, as is easily verifiable comparing HDR to SDR output.
  5. Forming an image from light data is a complex process, and one of active research.
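A quick way to see point 4 concretely, as a sketch assuming the colour-science Python package and its string-based colourspace lookup: a pure AP1/ACEScg primary has no in-gamut BT.709 representation, so the encoded values escape [0, 1], and every display clips the excess its own way:

```python
# Sketch: convert a pure ACEScg (AP1) red to BT.709. Assumes the
# colour-science package and its string-based colourspace names.
import numpy as np
import colour

acescg_red = np.array([1.0, 0.0, 0.0])
bt709 = colour.RGB_to_RGB(acescg_red, "ACEScg", "ITU-R BT.709")
print(bt709)  # components fall outside [0, 1]; the excess is device dependent
```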
2 Likes

( @troy_s )
I ran a few tests.
Filmic EXR vs ACES EXR, exact same scene, same frame rendered in Blender 3.0, Cycles CPU.
Placed into Resolve, set to ACEScct Color managed.
And I found only very subtle differences in performance between Filmic and ACES.

Bypassing the input transform and using a CT in the first node gave terrible results, unusable…
But setting the input transform to ACEScg and no CT gave very usable results, both for Filmic and ACES.

The image was designed to stress the system a bit, with an HDR background set to 100 intensity and a few emitter textures set to 500, to maximize the dynamic range, plus a Macbeth chart for some color reference.
Exposure was compensated in Blender to look well exposed in the viewer.

This is the plain 8 bit Blender render:

Filmic set to bypass plus Color Transform set to Rec709/Linear:

Trying to correct offset down destroys the image:

Curiously (and just so the developers have another interesting fact at hand), if the gamma is set to some Log profile, the data falls into place… (and it’s NOT Filmic Log, it’s plain Filmic). Any log profile has a similar effect, with some minor differences:

Using the ACES version with bypass and color transform yields a very similar result, here:

Now, this is the Filmic EXR, just set to ACEScg input transform, and no CT node at all:


And the ACES, same settings:

But both could be recovered easily using just the offset and color matching the chart for balance.
Filmic:

And ACES:

There’s no discernible difference in performance between the two images.

And pushing the image further down never breaks it; both fade gracefully to black without any artifacting. All values are kept inside Rec709 at all times, except in the extreme darkening, but that is expected due to the lack of saturated blacks in Rec709.

The winning ACES workflow for both Filmic and ACES is to just set the input to ACEScg, and balance and correct the image as needed.

That’s it so far for the tests in Resolve/ACES; I’ll do more tests in a DaVinci Wide Gamut space to see how it goes there.

1 Like

What exactly is a Filmic EXR?

An .EXR exported with the default Filmic Color Transform selected.
The ACES .EXRs are exported from a version of Blender that has the colormanagement folder replaced with one made for the ACES workflow.

The Filmic color transform does not apply at all if you export an EXR.
The saved EXR will be an HDR scene-linear sRGB image and has nothing to do with Filmic.
Someone correct me if I’m wrong though.

1 Like

Yes and no…
Yes, it looks like both images behave exactly the same. But then I don’t know exactly how I managed to export ACES, or how NOT to export ACES, because none of the images respond as expected when treated as sRGB or Rec709 Linear, yet they react well when treated as ACEScg colorspace and gamma.

There is no “Filmic EXR” nor “ACES EXR”. Blender saves the light data directly from the render buffer. That means that if the values are transformed to AP1 using an ACES config, the light is in open domain AP1 in the EXR. For Filmic, the light data is an open domain BT.709.
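One way to see this concretely, as a sketch assuming the OpenEXR Python bindings: the file stores raw float light data, and colourimetry is at best an optional header attribute that readers are free to ignore:

```python
# Sketch: peek at an EXR header. The pixels are raw floats; colourimetry,
# if present at all, is only an optional "chromaticities" attribute.
# "render.exr" is a placeholder path.
import OpenEXR

exr = OpenEXR.InputFile("render.exr")
header = exr.header()
print(header.get("chromaticities"))  # often None: the numbers carry no meaning
```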

I had a massive argument that EXRs must be managed for ingestion and encoding about a decade and a bit ago, and one developer thought they were right. They were wrong.

This is wrong.

The light data would be AP1, and “proper ACES” will expect AP0, so the chain is wrong.

This would be the correct approach to tag the light data as AP1, but totally incorrect for Filmic. The proper Filmic chain cannot be achieved in Resolve without starting from Filmic log, which bakes the gamut compression into the encoding.

Both results are a sloppy mess.

Not at all. Pretty clear to see how the imagery ends up nightmare fuel.

Filmic is very simple for use in Resolve; export in Filmic Log and then apply the contrast curves as required via LUT.

There is no “plain Filmic”. Filmic Log is and has always been the proper encoding for just this reason, and the design in Blender is a hack.

This is, uh… heavy chroma-laden mixtures contain no complementary light. BT.709 and literally any colour space have them, because it has nothing to do with the colour space. What you might be seeing is gamut clipping and skew and broken output when using wider gamut garbage.

4 Likes

Thanks, that actually helped a lot to understand some things.

The link to the LUTs for those who want them: GitHub - sobotka/filmic-resolve: Filmic Resolve Cube LUTs

2 Likes

Yeah, I’ve re-read the manual. So all EXRs are Linear Rec709, no matter the color science. To have the transform impact the output, the export should be a 16-bit PNG. Tried that and it’s true, but lots of latitude and color data is lost. Still better than 8-bit.

But both images ended up exactly the same.

What is wrong exactly?
I’m talking about the color science selected for the whole project in Resolve. It has nothing to do with AP0/AP1. You choose either ACEScc or ACEScct; both are almost equal, but cct has a log curve approaching the blacks, and cc is full linear.
You may select AP0 or AP1 as the input transform for each individual clip.

But both EXRs behave exactly the same. Trying to import plain Blender EXRs as 709/Linear failed catastrophically. I don’t know why. I just present the facts.

I couldn’t find “sloppy mess” in the CIE website glossary. I don’t understand what you mean. I’m talking about this image:


I don’t see a mess there. I see no clipping, the blown-out highlights are blown in a similar way they would be in a camera, the shadows have detail and color, and there are no weird color shifts. The warmer colors are due to balancing the whites to the chart, which sits in the shadows and is cooler than the rest of the image.
The results I see are by far the best I’ve been able to get from Blender into Resolve with Color Management enabled.
If you have a workflow that gets better images than this, please tell me and I’ll try it.
All I’m trying to do here is to find a proper way to integrate my beloved Blender into the Resolve workflow I use, which so far has given me lots of headaches.
Tell me what you see that’s so wrong in the mapping in this image, or in the scopes.
I know that the composition is not especially beautiful, and the modelling is sloppy, but it was made just to test the dynamic range.

But to have Filmic Log I have to export 16-bit PNGs, right? Or what format? TGA maybe? Because EXRs get exported as linear no matter what transform you select. And slapping a LUT on top of a PNG… I can try it. I don’t like LUTs because they sometimes introduce artifacts and clip any value above and beyond, so they limit highlight and shadow recovery; I prefer to find the right transform function for each source. And it seems that I found one that works.

I don’t understand. In the view transform you can select “Filmic” and “Filmic Log”.
By plain Filmic I refer to Filmic.

It’s funny that you get angry because I said that these finally worked as expected.

The only big question for me is why the EXRs worked OK tagged as ACEScg and broke using Rec709/Linear. But that’s just a question. I just wanted to make Blender EXRs work in Resolve.

Thank you! I’ll try those.

No.

Rendering engines are mostly agnostic about the math. They crunch numbers. If you feed an encoding to OCIO and say that it is “FooRGB”, then use it as an emission, that is FooRGB as a render.

There are a few remaining hard coded paths, but by and large, Cycles etc. can render with anything, because “rendering” is just dumb math based on the assumptions of what the rendering engine is fed.

TL;DR: It’s the audience that creates the meaning for the most part.

No.

There are two components to lights:

  1. The colour of the lights.
  2. The range / domain of the lights.

All rendering engines now typically generate zero to infinity emissions. However, the “colour” of the lights is another axis.

If one says “latitude”, the dynamic range of all contemporary rendering engines is zero to infinity, even if they use three colourless lights!

So there’s no difference in the latitude across colour spaces from a light transport vantage.

Because you are missing the fact that you, the audience member, are defining the encoding. If you say some arbitrary buffer is BT.709, it magically is! But in terms of absolute chromaticities relative to the light mixtures? Completely wrong.

This is a longer discussion, which I am happy to try and explain.

In this specific example, shadows aren’t more or less “saturated”. A display can only display a specific range of chroma, and any values that end up negative or beyond 100% display emission are clipped with respect to gamut. Distortions can happen due to quantization as well!

The long and short of it is that absolutely zero of the intended light mixture makes it out of the display with wider gamut rendering. While this is a deep rabbit hole, each and every pixel that is a gamut excursion is device / display dependent. That means:

  1. It’s the “wrong” light mixture with respect to the intended ratios in the render.
  2. It’s a different light mixture between displays, violating the very ground truth of pixel management.
  3. It leads to distortions and broken light mixtures in terms of aesthetics, as per 1.

You are presenting “facts” as you believe you understand them.

Filmic as Filmic Log, the way I designed it and intended, is the proper path in Resolve. Resolve is a dumpster fire in terms of contemporary approaches, but can be coaxed to behave properly, assuming one understands the lower level workings.

You aren’t looking very hard. Look at the skews as the deeper chroma ascends to display maximum.

Use simple synthetic imagery like a flat chart, and jack exposure, and it will be glaringly apparent.

Then compare against the low level primaries, versus complementary light mixtures inside that gamut.

Don’t look at the scopes. Look at the imagery. Use your thought process. And if you are going to use metrics, use CIE metrics to validate what you see in the imagery.

TIFF works great. All of Blender’s integer encodings sadly have broken alpha though.

I assure you, as the author, that “plain Filmic” as it exists in Blender is not “plain”. The “plain”, if there ever was one, was the base encoding I designed the entire system around - Filmic Log. My configuration behaves slightly differently to what was implemented in Blender, and for good reason.

I’m not angry at all. The errors and mistakes that happen are repeated, which is why I try to spend the time helping folks get a handle on some rather slippery ideas.

At the bottom of this mess are some really simple ideas that I am 100% confident everyone can grasp. It just takes a bit of approaching it repeatedly from different peoples’ vantages to help get folks to see them.

An encoding is what someone says it is. It’s just dumb numbers in the parcel!

1 Like

@dracoroot7 @troy_s
Here’s another test of the same workflow. This time the Blender scene is set up with more standard irradiance values, to reflect a more real-world situation: the HDR environment Strength is set to 1, and the emitting Voronoi chips are at 10.
This image was generated by the regular Blender OCIO, no files modified.
The EXR image is usable out of the box in Resolve (ACES 1.1 cct), setting its input as ACEScg.
It’s slightly more saturated, probably due to the difference in the primaries used by Blender and ACEScg.

Blender png (filmic high contrast):

And the EXR placed in Resolve, no grading, balancing or correction, just the input set to ACEScg, exported as a still:

And this is a realtime capture of some pushing and pulling of the offset first, and then the primaries.
I see the image behaving beautifully; not even Alexa footage is able to hold pushing latitude like this. I don’t see anything broken. Only the saturation is a little higher than expected.

(This is for people like me who want to integrate renders into a Resolve color managed workflow and have had problems)

2 Likes

This is a closed domain “image”, ready for consumption, and not really a decent grading entry point. Filmic Log should be the entry point here, especially for adjustments like exposure, which on nonlinear pure log encoded data, is an addition / subtraction / offset.

This is also wrong.

Blender can only dump linear open domain light data to an EXR. Loading an EXR and instructing the software that the encoding is non linearly encoded such as ACEScct is flatly wrong.

What are you trying to achieve? An offset on linearized light is a radically different math operation to an offset on nonlinearly encoded log data, which is again different to a nonlinear, ready-for-consumption final image encoding.

It is important to understand the differences.
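To make the difference concrete, a tiny sketch with a generic log2 shaper (not Filmic Log or any specific camera curve): the same one-stop exposure change is a multiply on linear light but a plain offset on log-encoded data, so applying an “offset” to the wrong state does a very different thing:

```python
# One stop of exposure: a multiply on linear data, an offset on log data.
# The log2 shaper here is generic, not Filmic Log or any camera curve.
import numpy as np

linear = np.array([0.05, 0.18, 1.0, 8.0])

plus_one_stop_linear = linear * 2.0                  # exposure on linear light
log2 = np.log2(np.maximum(linear, 1e-6))             # generic log encoding
plus_one_stop_log = log2 + 1.0                       # the same stop, as an offset

print(np.allclose(2.0 ** plus_one_stop_log, plus_one_stop_linear))  # True
```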

I think you are gravely mistaken.

Again:

  1. Use a sweep pattern, using the working space primaries (a minimal sketch follows below this list).
  2. Do a proper exposure adjustment.
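A minimal sketch of such a sweep chart, assuming a row per working-space primary (plus complements) swept across an exposure ramp is what’s intended; how the float buffer is written out (EXR or otherwise) is left to whatever pipeline is being tested:

```python
# Sweep chart sketch: working-space primaries and their complements,
# each swept across a -6..+6 stop exposure ramp around middle grey.
# Feed the resulting float buffer through the view transform under test.
import numpy as np

H, W = 64, 512
stops = np.linspace(-6.0, 6.0, W)
ramp = 0.18 * (2.0 ** stops)                       # exposure sweep of middle grey

primaries = [(1, 0, 0), (0, 1, 0), (0, 0, 1),      # R, G, B
             (1, 1, 0), (0, 1, 1), (1, 0, 1)]      # complements
rows = []
for p in primaries:
    row = np.zeros((H, W, 3), np.float32)
    row[:] = np.array(p, np.float32)[None, None, :] * ramp[None, :, None]
    rows.append(row)

chart = np.concatenate(rows, axis=0)               # (6*H, W, 3) open domain chart
print(chart.shape, chart.max())
```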

Radically different results, different domains, and random judgement are not helping anyone here.

The chroma distortions are indeed part of the screw ups.

Watch the trajectory in the 1931 plot and you’ll see the chroma distortions / curvature.

Renders composite in two ways, depending on one’s view of light data.

In a large number of cases, the composite should happen on open domain, linear light. That means that Resolve can only work with significant hoop jumping. Fusion standalone works great, but Fusion inside Resolve has some nonlinear assumptions due to Resolve’s history as a closed domain / display referred tool.

1 Like

So? You mean that in plain (unmodified) Blender I can export EXRs in a nonlinear space and choose the colorspace? How is that done?

If you try to grade a 16-bit PNG you will run into issues, because not all the latitude data got encoded in the file and something got either clipped or rounded. That’s the issue I was talking about; I wasn’t questioning the theoretically infinite radiance that Cycles may be able to handle. My point is that grading 16-bit files is not ideal if you can export the full range of data in an EXR.
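A toy sketch of that loss, assuming a straight 16-bit integer quantisation of display-referred values: everything the transform leaves above 1.0 clips, and everything else lands on one of 65536 steps:

```python
# Why grading a 16-bit integer export loses latitude: values above 1.0
# clip, and the rest is rounded onto 65536 steps. Toy sketch.
import numpy as np

open_domain = np.array([0.001, 0.18, 1.0, 4.0, 60.0])

encoded = np.clip(open_domain, 0.0, 1.0)      # integer formats cannot exceed 1.0
encoded = np.round(encoded * 65535) / 65535   # 16-bit quantisation
print(open_domain - encoded)                  # 4.0 and 60.0 collapse to 1.0
```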

Mmm, nope. I cannot arbitrarily decide the encoding, and you know that.
I MAY define arbitrarily whatever combination of colorspace and gamma reads the image. But that image was encoded using ONE combination of those primaries, white points and transfer functions.
And to be able to use the image in a color managed environment I need to be able to specify those, precisely (ideal) or approximately (not ideal at all, but usually better than display-baked 8-bit footage).
According to the Blender Manual, the right encoding for EXRs, whatever the transfer function, should be Rec709/Linear gamma. But if I use that, the image breaks. That’s my whole point. It really breaks.
So, completely clueless about how the image should be interpreted, I ran lots of trial and error until some combination held up better. These are my findings, nothing more.

I think you are missing the point here. I know that a 100-nit TV cannot output 4000 nits, just as I know that in a CMYK printed book you can’t have the kind of 0xFF0000 red you may get on a 1998 CRT monitor. But still, you take a photo of that monitor, and SOME red makes it to the printed page. It’s to make all those obvious transformations nicer that we mess with the rabbit hole of Color Management.
I’m trying to find a way to map the beautiful internals of Cycles rendering elegantly into Resolve Color Management, while preserving most, or ideally all, of the original data to leave room for corrections.

I don’t see any out-of-place “skew”.
If you are talking about the visible diagonal in the waveform near the highlights, that’s a color cast, and a gradient, caused by the “outdoor” environment image that casts a bluish light on the scene from a definite direction (the sun); hence the gradient, seen as a slope in the waveform.
And if you are talking about how that slope changes its angle, that has to do with how Resolve manages Offset and primaries in color managed environments, both ACES and DWG. Resolve “pins” both the black and white clipping points and grades in between. That causes a nonlinear response to offsetting the image. That’s a feature, not a bug, and it’s one of the best selling points of the color managed environments. In unmanaged spaces, highlights just clip.

Why? The scopes are wonderfully useful. For a colorist. Or a DP at least.

I do. But monitors can be deceiving, you know? I think you know that too.
That’s what the scopes are for.

I already do that too. Thank you anyway for the tip.

I don’t know how to do that, besides looking at the CIE chromaticity scope to spot chroma clipping / out-of-gamut pixels.
You do that. And the color scientists. And the monitor calibrators.
I just want to use Blender renders in Resolve seamlessly. That seems more complicated than it should be (and that is the whole point of this thread, I understand: to have a predictable profile for reading renders in other apps).

Trying that right now. But it seems they are 16-bit maximum too; I’ll check how they hold up…
(Anyway, I think it’s a pity to lose the EXRs; it’s an awesomely flexible format, and they support arbitrary channels and layers… that’s unbeatable.)

I was just referring to Filmic vs Filmic Log.
The word “plain” was just to differentiate it from Log, and from the Blender build with the ACES-modified OCIO… I might have used “vanilla” or whatever. Don’t get lost in the choice of words.

And at the top is an unclear workflow to color manage the EXRs, and/or an unsupported ACES workflow. I know that you hate ACES by now, but that is the appeal: if your app supports ACES, you set the output or input to the right ACES profile and forget about all this.
Sometimes devs forget that not all of us can become rocket scientists just to film a video about a rocket. Most people who hit this wall will just say “Blender is broken, don’t use it in our pipeline, please”.

Again, no. We need the dumb numbers to preserve the look of the image outside Blender.

All the OCIO Filmic integration with Blender is awesome and opened a whole world of possibilities. I’ve used it since the days you had to replace the folder, just like I did now to add ACES support.
But with all the complexity these color and gamut issues have gained lately, and with all the big players asking for consistent color management, the path to processing Blender renders should be clearer and hack-free.
Maybe it’s just one page in the manual with clear, user-proof instructions. If the TIFF workflow is the right one, or there’s some way to use EXRs, it would be nice to have an “Integrating Blender renders into Color Managed environments” page in the manual. Once the workflow is clear, any of us “users” can contribute it from a “user” point of view.

These are some of the Netflix guidelines for color management and VFX. If ACES is such broken crap, you should write an email to Netflix and tell them, because they haven’t noticed. Or maybe they did, but the industry needs SOME standard so desperately that anything will do, even ACES. Or someone should write a new one. But in the meanwhile, it’s ACES.

2 Likes

That’s a 500-intensity emitter; of course it will display overly saturated!!
It’s a totally unreal situation, put there to test how the image handles the limits.
The saturation I referred to was the overall saturation. Slight, but noticeable. Those artifacts are completely expected.

Of course I know this workflow is not ideal. I’m trying to arrive at one.
You tell me how to place EXRs in Resolve then. The Rec709/Linear route doesn’t work.

Are you kidding? I’m not trying to achieve anything, just to spot clipping and weird shifts.

No. I need to see situations closer to what I deal with. The devs should do that and tell us how to deal with the colorspace.

I think that’s the way Resolve handles its color management.
This is the same pushing but with Alexa test footage.
[disclaimer: this is just a push and pull of the data, not intended to achieve anything]

But of course, Resolve is a crappy piece of amateur software, nothing to take seriously, just like ACES, Netflix, Animal Logic, the Academy, and everything else outside Blender. And especially the users, who are so ignorant that they don’t know the basic math needed to write a color management engine from scratch, so they don’t deserve to use it either.

I’m done with the tests. If I do the DWG tests I’ll keep them to myself; I see there’s no point in finding a solution here.
The answer is simple: we get no official support for ACES in Blender.
Bye

2 Likes

It’s not “overly saturated”, it’s skewed and the wrong light mixture, and completely fscked up tonality-wise.

False.

Film handled all of this far more gracefully. In the end, the demonstration is littered with out of gamut values and every single one will render differently display to display.

False.

This is digital RGB lowest-common-denominator garbage. Think sRGB Mark II.

See above; Resolve isn’t really a compositing package, and it began life as an output referred grading package. As I said, it is possible to jump through hoops to get EXRs handled decently such as via the OFX plug-in colour transforms, etc. but it requires knowing what one is doing.

You misunderstood what I was saying.

An offset applied to radiometric-like linear data versus nonlinear log encoded data versus image nonlinear data will behave radically differently.

Figuring out what, specifically one is trying to test, means unifying the operations in question so they behave identically. For log encodings versus display linear versus open domain linear light, it requires acknowledgment of the implicit differences of state.

Sadly, in order to work in a managed approach, it requires more than mere software. One using the system needs to understand it.

sigh.

Animal Logic uses Baselight by the way… just sayin’. They also didn’t render Lego using the ACES you are using, and not even the earlier 0.X ACES. Feel free to reach out to Mr. Fry for specifics, as he was one of the key folks involved at that time.

In the end, your appeals to authority aren’t going to help you understand what is going on, nor how to maintain control over your workflow. I assure you it takes a hell of a lot more effort than simply assuming all software behaves the same, or that some magical piece of software can help you.

I speak for no one but myself with the following statement:

Read the fsck up and try to understand what I’ve typed.

ACES is supported, thankfully, so meatheads like you can use a system that doesn’t work properly and they can’t tell.

In the end it doesn’t matter at all because you have zero idea what the hell you are looking at.