Blender Support for ACES (Academy Color Encoding System)

Thank you! I’ll try those.

No.

Rendering engines are mostly agnostic about the encoding; they just crunch numbers. If you feed an encoding to OCIO, say that it is “FooRGB”, and then use it as an emission, the render is FooRGB.

There are a few remaining hard coded paths, but by and large, Cycles etc. can render with anything, because “rendering” is just dumb math based on the assumptions of what the rendering engine is fed.

TL;DR: It’s the audience that creates the meaning for the most part.

No.

There are two components to lights:

  1. The colour of the lights.
  2. The range / domain of the lights.

All rendering engines now typically generate zero to infinity emissions. However, the “colour” of the lights is another axis.

If one says “latitude”, the dynamic range of all contemporary rendering engines is zero to infinity, even if they use three colourless lights!

So there’s no difference in the latitude across colour spaces from a light transport vantage.

Because you are missing the fact that you, the audience member, are defining the encoding. If you say some arbitrary buffer is BT.709, it magically is! But in terms of absolute chromaticities relative to the light mixtures? Completely wrong.
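A rough numpy sketch of this idea (purely an illustration, using the commonly published approximate RGB-to-XYZ matrices for BT.709 and ACES AP1) shows the same triplet of numbers landing at different absolute chromaticities depending on what the audience declares it to be:

```python
import numpy as np

# The same RGB triplet, interpreted under two different sets of primaries.
rgb = np.array([0.25, 0.50, 0.75])

# Approximate RGB-to-XYZ matrices: BT.709 (D65) and ACES AP1 (~D60).
BT709_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
AP1_TO_XYZ = np.array([
    [ 0.6624542, 0.1340042, 0.1561877],
    [ 0.2722287, 0.6740818, 0.0536895],
    [-0.0055746, 0.0040607, 1.0103391],
])

# Identical numbers in the parcel, entirely different absolute chromaticities.
print("declared BT.709 ->", BT709_TO_XYZ @ rgb)
print("declared AP1    ->", AP1_TO_XYZ @ rgb)
```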

This is a longer discussion, which I am happy to try and explain.

In this specific example, shadows aren’t more or less “saturated”. A display can only display a specific range of chroma, and any values that end up negative or beyond 100% display emission are clipped with respect to gamut. Distortions can happen due to quantization as well!

The long and short of it is that absolutely zero of the intended light mixture makes it out of the display with wider gamut rendering. While this is a deep rabbit hole, each and every pixel that is a gamut excursion is device / display dependent. That means:

  1. It’s the “wrong” light mixture with respect to the intended ratios in the render.
  2. It’s a different light mixture between displays, violating the very ground truth of pixel management.
  3. It leads to distortions and broken light mixtures in terms of aesthetics, as per 1.
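A rough numpy sketch of such a gamut excursion (the AP1-to-BT.709 matrix below is an approximation that includes chromatic adaptation) shows how a saturated ACEScg mixture has no valid BT.709 equivalent, and how a naive clip destroys the intended ratios:

```python
import numpy as np

# Approximate ACEScg (AP1) to linear BT.709 matrix, chromatic adaptation included.
AP1_TO_BT709 = np.array([
    [ 1.70505, -0.62179, -0.08326],
    [-0.13026,  1.14080, -0.01055],
    [-0.02400, -0.12897,  1.15297],
])

# A saturated AP1 light mixture with no BT.709 equivalent.
acescg_pixel = np.array([0.05, 0.60, 0.10])

bt709 = AP1_TO_BT709 @ acescg_pixel
print("linear BT.709:", bt709)                        # the red channel goes negative
print("naively clipped:", np.clip(bt709, 0.0, None))  # intended ratios are destroyed
```

What each display does with that negative component is up to the device, which is exactly the display dependence described above.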

You are presenting “facts” as you believe you understand them.

Filmic as Filmic Log, the way I designed it and intended, is the proper path in Resolve. Resolve is a dumpster fire in terms of contemporary approaches, but can be coaxed to behave properly, assuming one understands the lower level workings.

You aren’t looking very hard. Look at the skews as the deeper chroma ascends to display maximum.

Use simple synthetic imagery like a flat chart, and jack exposure, and it will be glaringly apparent.

Then compare against the low level primaries, versus complementary light mixtures inside that gamut.

Don’t look at the scopes. Look at the imagery. Use your thought process. And if you are going to use metrics, use CIE metrics to validate what you see in the imagery.

TIFF works great. All of Blender’s integer encodings sadly have broken alpha though.

I assure you, as the author, that “plain Filmic” as it exists in Blender is not “plain”. The “plain”, if there ever was one, was the base encoding I designed the entire system around - Filmic Log. My configuration behaves slightly differently to what was implemented in Blender, and for good reason.

I’m not angry at all. The errors and mistakes that happen are repeated, which is why I try to spend the time helping folks get a handle on some rather slippery ideas.

At the bottom of this mess are some really simple ideas that I am 100% confident everyone can grasp. It just takes a bit of approaching it repeatedly from different peoples’ vantages to help get folks to see them.

An encoding is what someone says it is. It’s just dumb numbers in the parcel!

@dracoroot7 @troy_s
Here’s another test of the same workflow. This time the Blender scene is set up with more standard irradiance values, to reflect a more real-world situation: the HDR environment is set to 1 in Strength, and the emitting Voronoi chips are at 10.
This image was generated by the regular Blender OCIO, no files modified.
The EXR image is usable out of the box in Resolve (ACES 1.1 cct), setting its input as ACEScg.
It’s slightly more saturated, probably due to the difference in the primaries used by Blender and ACEScg.

Blender PNG (Filmic High Contrast):

And the EXR placed in Resolve, no grading, balancing or correction, just the input set to ACEScg, exported as a still:

And this is a real-time capture of some pushing and pulling: the offset first, and then the primaries.
I see the image behaving beautifully; not even Alexa footage is able to hold latitude pushes like this. I don’t see anything broken. Only the saturation is a little higher than expected.

(This is for people like me who want to integrate renders into a Resolve color-managed workflow and have had problems.)

This is a closed domain “image”, ready for consumption, and not really a decent grading entry point. Filmic Log should be the entry point here, especially for adjustments like exposure, which on nonlinear, purely log encoded data is an addition / subtraction / offset.

This is also wrong.

Blender can only dump linear, open domain light data to an EXR. Loading an EXR and instructing the software that the encoding is nonlinear, such as ACEScct, is flatly wrong.
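A quick sketch of why that mis-tag breaks the image. The constants below are paraphrased from the published ACEScct specification, purely for illustration:

```python
import math

def acescct_to_linear(y):
    """Approximate ACEScct decode, per the published ACEScct spec."""
    if y <= 0.155251141552511:
        return (y - 0.0729055341958355) / 10.5402377416545
    return 2.0 ** (y * 17.52 - 9.72)

# Blender's EXR values are already scene linear. Telling the software they are
# ACEScct means it runs the decode above on data that was never encoded:
for already_linear in (0.18, 0.5, 1.0):
    print(already_linear, "->", acescct_to_linear(already_linear))
# Middle grey (0.18) collapses to roughly 0.01, while 1.0 explodes to roughly 223.
```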

What are you trying to achieve? An offset on linearized light is a radically different math operation to an offset on nonlinearly encoded log data, which is again different to an offset on a nonlinear, ready-for-consumption final image encoding.

It is important to understand the differences.

I think you are gravely mistaken.

Again:

  1. Use a sweep pattern, using the working space primaries.
  2. Do a proper exposure adjustment.

Radically different results and domains, plus random judgement, are not helping anyone here.

The chroma distortions are indeed part of the screw ups.

Watch the trajectory in the 1931 plot and you’ll see the chroma distortions / curvature.

Renders composite in two ways, depending on one’s view of light data.

In a large number of cases, the composite should happen on open domain, linear light. That means that Resolve can only work with significant hoop jumping. Fusion standalone works great, but ResolveFusion has some nonlinear assumptions due to Resolve’s history as a closed domain / display referred tool.
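A minimal sketch of the difference, using the standard sRGB encoding as a stand-in for the display nonlinear state; the values are purely illustrative:

```python
import numpy as np

def srgb_encode(x):
    # Standard sRGB OETF, applied per channel.
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * np.power(x, 1.0 / 2.4) - 0.055)

def over(fg, bg, alpha):
    # A plain "over" composite.
    return fg * alpha + bg * (1.0 - alpha)

fg = np.array([1.0, 0.0, 0.0])   # red
bg = np.array([0.0, 1.0, 1.0])   # cyan
alpha = 0.5                      # e.g. the soft edge of a blurred shape

# Composite on open domain linear light, encode for display last:
linear_first = srgb_encode(over(fg, bg, alpha))
# Composite on already display-encoded values, as a display referred tool would:
display_first = over(srgb_encode(fg), srgb_encode(bg), alpha)

print("linear comp:", linear_first)    # roughly [0.735, 0.735, 0.735]
print("display comp:", display_first)  # [0.5, 0.5, 0.5] -- a visibly darker blend
```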

So? You mean that in plain (unmodified) Blender I can export EXRs in a nonlinear space and choose the colorspace? How is that done?

If you try to grade a 16 bit PNG you will run into issues, because not all of the latitude data got encoded in the file and something got either clipped or rounded. That’s the issue I was talking about; I wasn’t questioning the theoretically infinite radiance that Cycles may be able to handle. My point is that grading 16 bit files is not ideal if you can export the full range of data in an EXR.
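(A minimal sketch of that point, with made-up values: any open domain value above 1.0 has to be clipped or scaled before an integer quantization, whereas half float EXRs keep it.)

```python
import numpy as np

# Open domain render values can be anything from 0 upward.
radiance = np.array([0.18, 1.0, 4.0, 500.0], dtype=np.float32)

# A 16 bit integer encoding only holds a normalized 0..1 range,
# so anything above 1.0 is clipped (or scaled) before quantizing.
png16 = np.round(np.clip(radiance, 0.0, 1.0) * 65535).astype(np.uint16)

# Half float (EXR) keeps the values, up to ~65504, with no clipping.
exr_half = radiance.astype(np.float16)

print("16 bit integer:", png16)    # 4.0 and 500.0 are gone
print("half float:   ", exr_half)  # the open domain values survive
```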

mmm, nope. I cannot arbitrarily decide the encoding, and you know that.
I MAY arbitrarily define whatever combination of colorspace and gamma is used to read the image. But that image was encoded using ONE combination of primaries, white point and transfer function.
And to be able to use the image in a color-managed environment, I need to be able to specify those, precisely (ideal) or approximately (not ideal at all, but usually better than display-baked 8 bit footage).
According to the Blender Manual, the right encoding for EXRs, whatever the transfer function, should be Rec709/Linear gamma. But if I use those, the image breaks. That’s my whole point. It really breaks.
So, completely clueless about how the image should be interpreted, I ran lots of trial and error until some combination held up better. These are my findings, nothing more.

I think you are missing the point here. I know that a 100 nit TV cannot output 4000 nits, just as I know that in a CMYK printed book you can’t have the kind of 0xFF0000 red you may get on a 1998 CRT monitor. But still, you take a photo of that monitor, and SOME red makes it to the printed page. Making all those obvious transformations nicer is why we mess with the rabbit hole of color management.
I’m trying to find a way to map the beautiful internals of Cycles rendering elegantly into Resolve Color Management, while preserving most, or ideally all, of the original data to leave room for corrections.

I don’t see any out of place “skew”.
If you are talking about the visible diagonal in the waveform near the highlights, that’s a color cast and a gradient, caused by the “outdoor” environment image that casts a bluish light on the scene from a definite direction (the sun); hence the gradient, seen as a slope in the waveform.
And if you are talking about how that slope changes its angle, that has to do with how Resolve manages Offset and primaries in color-managed environments, both ACES and DWG. Resolve “pins” both the black and white clipping points and grades in between. That causes a nonlinear response to offsetting the image. That’s a feature, not a bug, and it is one of the best selling points of the color-managed environments. In unmanaged spaces, highlights just clip.

Why? The scopes are wonderfully useful. For a colorist. Or a DP at least.

I do. But monitors can be deceiving, you know? I think you know that too.
That’s what the scopes are for.

I already do that too. Thank you anyway for the tip.

I don’t know how to do that, besides looking at the chromaticity CIE scope to spot chroma clipping/out of gamut pixels.
You do that. And the color scientists. And the monitor calibrators.
I just want to use Blender renders in Resolve seamlessly. That seems more complicated than it should be (and that is the whole point of this thread, I understand, to be able to have a predictable profile to read renders in other apps).

Trying that right now. But they are 16 bit maximum too, it seems; I’ll check how they hold up…
(Anyway, I think it’s a pity to lose the EXRs; it’s an awesomely flexible format, and they support arbitrary channels and layers… that’s unbeatable.)

I was just referring to Filmic vs Filmic Log.
The word “plain” was just to differentiate it from Log, and from the one in the Blender with the ACES-modified OCIO… I may have used “vanilla” or whatever. Don’t get lost in the choice of words.

And at the top of it is an unclear workflow to color-manage the EXRs, and/or an unsupported ACES workflow. I know by now that you hate ACES, but that is its appeal: if your app supports ACES, you set the output or input to the right ACES profile and forget about all this.
Sometimes devs forget that not all of us can become rocket scientists just to film a video about a rocket. Most people who hit this wall will just say “Blender is broken, don’t use it in our pipeline please”.

Again, no. We need the dumb numbers to preserve the look of the image outside Blender.

All the OCIO Filmic integration with Blender is awesome and opened up a whole world of possibilities. I’ve used it since the days when you had to replace the folder, just as I did now to add ACES support.
But with all the complexity these color and gamut issues have gained lately, and with all the big players asking for consistent color management, the path to process Blender renders should be clearer and hack-free.
Maybe it’s just one page in the manual with clear, user-proof instructions. If the TIFF workflow is the right one, or there’s some way to use EXRs, it would be nice to have an “Integrating Blender renders into color-managed environments” page in the manual. Once the workflow is clear, any of us “users” can contribute it from a “user” point of view.

These are some of the Netflix guidelines for color management and VFX. If ACES is such broken crap, you should write an email to Netflix and tell them, because they haven’t noticed. Or maybe they did, but the industry needs SOME standard so desperately that anything will do, even ACES. Or someone should write a new one. But in the meanwhile, it’s ACES.

That’s a 500 intensity emitter; of course it will display overly saturated!!
It’s a totally unreal situation put there to test how the image handles the limits.
The saturation I referred to was the overall saturation. Slight, but noticeable. Those artifacts are completely expected.

Of course I know this workflow is not ideal. I’m trying to arrive at one.
You tell me how to place EXRs in Resolve then. The Rec709/Linear option doesn’t work.

Are you kidding? I’m not trying to achieve anything, just to spot clipping and weird shifts.

No. I need to see situations closer to what I deal with. The devs should do that and tell us how to deal with the colorspace.

I think that’s the way Resolve handles its color management.
This is the same pushing, but with Alexa test footage.
[disclaimer: this is just a push and pull of data, not intended to achieve anything]

But of course, Resolve is a crappy piece of amateur software, nothing to take seriously, just like ACES, Netflix, Animal Logic, the Academy and everything else outside Blender. And especially the users, who are so ignorant that they don’t know the basic math needed to write a color management engine from scratch, so they don’t deserve to use it either.

I’m done with the tests. If I do the DWG tests I’ll keep them to myself; I see there’s no point in finding a solution here.
The answer is simple: we get no official support for ACES in Blender.
Bye.

It’s not “overly saturated”, it’s skewed and the wrong light mixture, and completely fscked up tonality-wise.

False.

Film handled all of this far more gracefully. In the end, the demonstration is littered with out of gamut values and every single one will render differently display to display.

False.

This is digital RGB lowest-common-denominator garbage. Think sRGB Mark II.

See above; Resolve isn’t really a compositing package, and it began life as an output referred grading package. As I said, it is possible to jump through hoops to get EXRs handled decently such as via the OFX plug-in colour transforms, etc. but it requires knowing what one is doing.

You misunderstood what I was saying.

An offset applied to radiometric-like linear data versus nonlinear log encoded data vs image nonlinear will behave radically differently.
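A small sketch of the three cases. The log range below is hypothetical, purely to show the shape of the operations:

```python
import math

MID_GREY = 0.18
MIN_STOPS, MAX_STOPS = -10.0, 6.5   # hypothetical log range, for illustration only

def to_log(x):
    """A generic normalized log2 encoding around middle grey."""
    stops = math.log2(max(x, 1e-10) / MID_GREY)
    return (stops - MIN_STOPS) / (MAX_STOPS - MIN_STOPS)

x = 0.18  # middle grey, scene linear

# +1 stop of exposure on radiometric-like linear data is a multiply:
x_plus_one = x * 2.0

# On the log encoding, that same adjustment is a pure offset of 1/(range in stops):
print(to_log(x_plus_one) - to_log(x))     # == 1 / 16.5
print(1.0 / (MAX_STOPS - MIN_STOPS))

# Adding a constant to a display encoded value is neither of the above,
# and will not track exposure at all.
```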

Figuring out what, specifically, one is trying to test means unifying the operations in question so they behave identically. For log encodings versus display linear versus open domain linear light, that requires acknowledging the implicit differences of state.

Sadly, in order to work in a managed approach, it requires more than mere software. One using the system needs to understand it.

sigh.

Animal Logic uses Baselight by the way… just sayin’. They also didn’t render Lego using the ACES you are using, and not even the earlier 0.X ACES. Feel free to reach out to Mr. Fry for specifics, as he was one of the key folks involved at that time.

In the end, your appeals to authority aren’t going to help you understand what is going on, nor how to maintain control over your workflow. I assure you it takes a hell of a lot more effort than simply assuming all software behaves the same, or that some magical piece of software can help you.

I speak for no one but myself with the following statement:

Read the fsck up and try to understand what I’ve typed.

ACES is supported, thankfully, so meatheads like you can use a system that doesn’t work properly and they can’t tell.

In the end it doesn’t matter at all because you have zero idea what the hell you are looking at.

I’ll respond just to this, and ignore all the personal attacks and name-calling.
Too much mystery here. Either it works or it doesn’t.
Tell me exactly what I should select in the Color Transform. It doesn’t have that many options.
Bypass the input transform in the color management. Then in the OFX I have:
Input Color space:________
and Input Gamma: _______
The other options are generally a matter of taste, but if you have recommendations, feel free to make them.

And you said the TIFFs are the way to go. What is the input transform for TIFFs then? Because TIFFs are no longer linear; they get affected by the transform.

It shouldn’t be that difficult to just outline a working workflow. Without name-calling or insults, if possible.
And, last, if ACES is supported, how do I export an ACEScg linear file then?

The OFX node allows one to take a linear EXR of Resolve’s “supported” encodings and change them as required. So if you are Arri LogC / AWG, the “Input Colour Space” would be AWG, and the poorly termed “Gamma” would be Arri LogC.

But again, that will only manually load the footage correctly, and you would be responsible for taking the footage to the “Timeline Colour Space”. Resolve only works on display linear at best, and that is based on a BT.709 assumption.

You can easily test and verify this by testing a comp of a blurred red shape over a fully emissive cyan background. As of last testing, only BT.709 as a working timeline space linearizes to display linear for compositing.

Fusion standalone is far easier here. But again, they are two different applications with different assumptions.

Filmic Log TIFFs are indeed nonlinearly encoded, but that is fine because Resolve doesn’t exactly work linear. It’s just part of its history. For a composite, if the goal is to composite footage, Fusion standalone would be the way to go, and you can simply linearize Filmic Log to linear using the inverse transform and an OCIO node.

I can only post and try to explain things so many times. At some point either someone says “I still don’t understand” and I keep trying to figure out what that is, or they hand wave and make appeals to authority. I am, as I’ve said and demonstrated countless times, happy to try and help the former case.

If you install the canonized ACES configuration, with the appropriate Blender tweaks, the output of an EXR would be linearized AP1 primaries. That is manageable from within any OCIO enabled compositor etc. And of course you would be left with what amounts to an unmanaged output because plenty of your pixels would be out of gamut. And then on top of that, you also end up with posterized messes due to the overall design.

If the work is Filmic based, because Blender doesn’t have fully managed file encoding, the sole option would be Filmic Log in a TIFF or like format, and then linearize the 16 bit TIFF in the compositor using the inverse transform via OCIO.

The latter works and has worked for over half a decade. The contrasts are easily applied downstream on the Filmic Log.
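A rough PyOpenColorIO sketch of that last linearization step, assuming the OCIO v2 Python bindings and the colorspace names from Blender’s bundled Filmic config (“Filmic Log” and “Linear”); the config path is a placeholder:

```python
import PyOpenColorIO as OCIO

# Point this at Blender's bundled OCIO config (placeholder path; adjust for your install).
config = OCIO.Config.CreateFromFile("/path/to/blender/datafiles/colormanagement/config.ocio")

# The inverse transform: Filmic Log back to scene linear.
processor = config.getProcessor("Filmic Log", "Linear")
cpu = processor.getDefaultCPUProcessor()

# Decode a single Filmic Log pixel back to open domain linear light.
print(cpu.applyRGB([0.5, 0.5, 0.5]))
```

In practice the same processor (or an OCIO node configured the same way) would be applied to the whole decoded 16 bit TIFF buffer inside the compositor.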

Ok, we may have the source of misunderstandings here. It looks like you have a very old picture of Resolve and its color science. The latest versions of Resolve work internally in another colorspace that’s not display related, nor based on 709 primaries: it’s called DaVinci Wide Gamut / DaVinci Intermediate, for the colorspace and the “transfer function” (to avoid calling it gamma and making you angrier…) respectively. It’s Blackmagic’s attempt to respond to ACES with their own color-managed environment, and –of course– they claim it’s far better than anything humanity has seen so far in terms of CM. I lack the knowledge to judge or back that claim; I just know whether I am able to use it effectively in production or not. I still couldn’t. All standard camera footage gets mapped very nicely, but I ran into all sorts of issues with digital 3D footage, Fusion comps (although I recently arrived at a working solution to map the Fusion comps) and (the worst) effects, plugins and powernodes unaware of the new workflow.
There’s a small brochure outlining it here (I bet there’s more in-depth data someplace else):

I assume that you will not like DWG/DI either… just by linear extrapolation of what you think of all color management, but what I want to do is get the EXRs there. Or to ACES, but maybe that’s not possible, from what you say.

And you tested the compositing?

Give it a try with the red blurred square and the cyan background. You’ll find it has a very peculiar behaviour in 17, and is very non-uniform in response.

As best as the folks who I know who have tested it, it is a very convoluted backend that doesn’t behave consistently, or at the very least, is far from straightforward.

But again, feel free to try it with imagery that demonstrates linear versus nonlinear compositing, as that’s the best method.

I still stand by that it is far easier to control and have access to OpenColorIO nodes with Fusion Standalone.

btw, I installed this version of ACES, using the content of the 1.1 folder.

I don’t know if, as of today, that’s the version you approve of. I guess it’s the one people (meatheads?) out there are using.

You mean something like this? (That’s two Solids, precomposed, and blurred in the Color page.)
In a Davinci Wide Gamut project:


The same test in an unmanaged YRGB, Rec709/2.4, old school / old science Resolve project:


I have to say that to my meathead, untrained, gamma-corrected, unworthy and flat-earther eyes, the Wide Gamut version looks way better, both in the preview and in the scopes. Wrong answer? Does this give some clue as to how to place linear EXRs the right way?

Jason (JTheNinja) had a variation with the appropriate Blender needs, including the appropriate RGB to XYZ matrix and proper coefficients. I can’t remember where it is.

The DWG version I believe composites in display linear correctly, like their older BT.709 YRGB chain did.

I believe though that not all operations flip flop. Again, it’s erratic. I also don’t believe that their pseudo ACES chain works correctly, but that was tested in the early betas of 17.

I would test everything against a ground truth like Nuke or Fusion before assuming it works properly. It was inconsistent in the betas, operation depending.

Also note that Resolve is per channel, so if one wants to hold the chromaticities, the “versus” tools are the most useful options in Resolve. Otherwise it’s skew city. Baselight, on the other hand, doesn’t operate in the RGB stimulus domain, and is far more properly managed.

Thank you, I’ll search for it. The one I have installed definitely does not export linearized AP1 EXRs.

They stated last year that they “Fixed incorrect matrix values”. It may be working better by now.

It should. Assuming you feed it AP1 lights, it will spit out AP1 light buffers. But again, gamut problems and tonality mishaps abound.

I am pretty sure it is due to their internal pipeline, which is being retrofitted, hence there are plenty of kinks and warps and weirdness. Hopefully BM won’t discontinue Fusion standalone, as ResolveFusion just isn’t anywhere near useable yet.


One image is out of Blender with the official ACES config, set to ACES and 709 for the look,
and the other is from Resolve. The Resolve one has no adjustments to saturation or exposure.

Resolve color management was set to ACEScc with AP1, and the output set to 709.

The EXR input transform was set to ACEScg.

Choosing ACEScg as the input transform seems to give me an exact match (no OFX, no bypass).

Using the Color Transform OFX with AP1 and linear gives me a great match too (under Advanced, only turn on white adaptation), but not exact.

Just tinkering with this because dealing with all the problems helps me figure stuff out.

Edit: using an uncolormanaged workflow and just using the OFX node to handle conversions, I pretty much got the results I wanted by setting the tone mapping option from DaVinci to Luminance (I need to test different images). (Of course, all of this is with an image rendered with the ACES config.)

Did you have any luck finding it? I did find the tweets, but not the actual config. I’m still trying to look for it; I’m not even sure whether Jason ever actually posted his config.

There’s no “winning”; log encodings are not suitable for display. A log encoding is a light encoding, which means it must be decoded appropriately and prepared for display.

Imagine looking at the bitstream of an MP3 using a text editor, and saying that some song “wins”. That’s what you have done here.

In terms of “correct” outputs, using the canonized rendering, there’s another rabbit hole there. See the trending to cyan outside the window? That’s not “winning”.

@troy_s He’s not using a log encoding, he’s showing a render of his scene (which is intended to be displayed).

However, for direct rendering ACEScc is wrong; he should be using ACEScg instead.