Thoughts on making Cycles into a spectral renderer

Hi @smilebags, a rough question: what is the final goal of this work you’re doing? I mean, when in the future you have all the spectral nodes in place, a robust implementation, and roughly the same render times… are you going to submit a patch to BF to make it land in master? Should it be an experimental option, or just an alternative to the regular RGB engine? Or could it be just a checkmark that says “work in spectral mode”? I’d like to have an insight into the future of this project (if there is one).
Thank you anyway, following this thread is entertaining :woman_juggling:

6 Likes

I’m keeping the idea of merging to master open, though I won’t make big compromises just to get it merged. The goal is to have a solid spectral renderer implementation, and it does seem like merging is a possibility.

I’m confident the performance impact shouldn’t be too dramatic; it’s just a lot of work to convert everything over, which is very hard to fit into my spare time.

As for how it would be exposed: due to how Cycles is built, having it as a runtime toggle would be a lot of work, and would mean a lot of maintenance work in the future. If it is merged, it would likely be the only mode. I don’t see this being a big issue, as everything will continue to work as expected; if you were unaware, you probably wouldn’t notice you were using a spectral engine at all.

I would like to keep this conversation open about whether having this merged is something BF devs are interested in. There are more considerations to merging this than just ‘does it work’; namely, it means future developers of Cycles need to be able to make sense of a spectral renderer. It isn’t a major mindset shift, but it is something to consider.

Thanks for asking. It’s an important question to discuss, especially as it seems progress is speeding up lately.

7 Likes

There might be occasional good reasons for having a checkbox. Ideally that’s how it would work, I think.
However, spectral as the default, if the performance impact isn’t too big, will likely be a good idea. Certain edge cases are just handled much more gracefully with it.

I just thought of something: Since the blackbody node gives a spectral output, I was curious how the split RGB node handles it.
Turns out you get a pinkish grey material doing that, regardless of which color channel you use (it even happens if you use all three) and regardless of the color temperature.

So clearly it falls back to some sort of strange default. I wonder what it is.

However, I have a rough proposal of how to handle this in a reasonable manner:

Basically, you already have three spectra which define a red, a green, and a blue light source in sRGB.
Conveniently, they sum to 1, so an easy solution might be to literally just multiply an arbitrary spectrum once with each of those sRGB extreme spectra.

This isn’t a perfect generalization. For instance, if you put the color sRGB green (0 1 0) through this process, it will end up having some contribution in both red and blue. But it’s the simplest somewhat reasonable thing I could think of.
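That projection could be sketched like this. This is only a toy sketch: the Gaussian “basis spectra” here are made-up stand-ins for the actual sRGB primary spectra, renormalised so they sum to 1 at every wavelength as described above.

```python
import numpy as np

# Wavelength samples across the visible range (nm).
wl = np.linspace(380.0, 730.0, 36)

def bump(center, width):
    """A made-up Gaussian bump; a stand-in for a real sRGB basis spectrum."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical red/green/blue basis spectra, renormalised so that,
# like the real sRGB basis spectra, they sum to 1 at every wavelength.
r_basis, g_basis, b_basis = bump(620.0, 40.0), bump(545.0, 40.0), bump(465.0, 40.0)
total = r_basis + g_basis + b_basis
r_basis, g_basis, b_basis = r_basis / total, g_basis / total, b_basis / total

def spectrum_to_rgb(spectrum):
    """Project an arbitrary spectrum onto the three basis spectra."""
    norm = np.sum(spectrum)
    if norm == 0.0:
        return (0.0, 0.0, 0.0)
    return tuple(float(np.sum(spectrum * basis)) / norm
                 for basis in (r_basis, g_basis, b_basis))

# The caveat from above: even a pure 'green' spectrum picks up some red
# and blue, because the basis spectra overlap.
r, g, b = spectrum_to_rgb(g_basis)
print(r, g, b)  # g dominates, but r and b are not zero
```

A nice side effect of the bases summing to 1 everywhere is that the three projected values always sum to 1 for any normalised input, so overall energy is preserved.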

Apart from that, however, really what ought to happen is two clearly different color swatches, one for RGB color and one for spectral color.
In fact, in principle, it could be interesting to have even more such options though that’s not the focus of this branch: For instance, how about a CMYK color model? You’d definitely also want to distinguish that from RGB colors.
Similarly, RGB in different color spaces or using different underlying spectra (say, optimized for reflectance vs. illumination etc.) should probably also be distinguished somehow. Although we can’t just have infinitely many inputs so clearly that’d require some deeper UI changes.

But anyway, all of that is of course leading too far for the moment. At least for now, spectral data could be converted back to approximate RGB data using the above-mentioned procedure.

It would be nice if that was achievable, but due to how Cycles is built that would be very difficult to do.

The Separate RGB node will just get the intensities of the first 3 wavelengths in the ray, and because there’s no correlation between the wavelengths and the channels they end up in, you get a noisy ‘equal energy’ colour. That’s what that pinkish tone is.
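A quick toy simulation (the spectrum and sampling scheme here are hypothetical, not Cycles code) shows why this averages out to an ‘equal energy’ colour: when each ray’s wavelengths are drawn independently, every channel sees the same distribution of spectral values, so all three converge to the same grey.

```python
import random

random.seed(7)

def spectrum(wl):
    """A fictional smooth emission spectrum, peaking in the red."""
    return max(0.0, 1.0 - abs(wl - 650.0) / 200.0)

n_rays = 20000
acc = [0.0, 0.0, 0.0]
for _ in range(n_rays):
    # Each ray carries its own randomly chosen wavelengths, so which
    # wavelength lands in which 'channel' slot is arbitrary.
    wavelengths = [random.uniform(380.0, 730.0) for _ in range(3)]
    for ch in range(3):
        acc[ch] += spectrum(wavelengths[ch])

avg = [a / n_rays for a in acc]
print(avg)  # all three 'channels' converge to the same grey value
```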

This limitation should only exist with spectral colours though; regular colour values should still be triplets until used in a BSDF, so you should be able to keep doing regular RGB math with them. If that’s not what’s happening, then I’ll just need to fix it.

There might be some process that could give you RGB values from a spectrum, but it seems counter-productive. I’m not going to focus on that until a compelling use case comes up.

1 Like

Other colors absolutely work. And that does make sense as a simple fallback for spectral conversion, but makes none for an artist, I think. I understand it’s not a focus, but ideally it should probably just do nothing.
Oh, or I suppose another thing that could be done is to simply add up the spectral contributions and derive the actual sRGB color from that. In that case the spectra should be split correctly, so that any given color ends up being the correct thing in sRGB as well.
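For what it’s worth, that sum-then-convert approach is roughly what a spectrum-to-sRGB conversion looks like. Here’s a rough sketch using the multi-lobe Gaussian fits to the CIE 1931 colour matching functions from Wyman, Sloan & Shirley (2013) and the standard XYZ-to-linear-sRGB matrix; the wavelength grid and the normalisation choice are my own assumptions.

```python
import numpy as np

wl = np.arange(380.0, 731.0, 5.0)

def g(x, mu, s1, s2):
    """Piecewise Gaussian lobe used in the Wyman et al. CMF fits."""
    s = np.where(x < mu, s1, s2)
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

# Multi-lobe Gaussian approximations of the CIE 1931 colour matching
# functions; close enough for a demo.
xbar = (1.056 * g(wl, 599.8, 37.9, 31.0)
        + 0.362 * g(wl, 442.0, 16.0, 26.7)
        - 0.065 * g(wl, 501.1, 20.4, 26.2))
ybar = (0.821 * g(wl, 568.8, 46.9, 40.5)
        + 0.286 * g(wl, 530.9, 16.3, 31.1))
zbar = (1.217 * g(wl, 437.0, 11.8, 36.0)
        + 0.681 * g(wl, 459.0, 26.0, 13.8))

# Standard XYZ -> linear sRGB matrix.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def spectrum_to_srgb(spectrum):
    """Integrate the spectrum against the CMFs, then convert to sRGB."""
    xyz = np.array([np.sum(spectrum * cmf) for cmf in (xbar, ybar, zbar)])
    xyz /= np.sum(ybar)  # normalise so a flat spectrum has Y = 1
    return XYZ_TO_SRGB @ xyz

rgb = spectrum_to_srgb(np.ones_like(wl))  # equal-energy spectrum
print(rgb)
```

Amusingly, an equal-energy spectrum comes out slightly pink in linear sRGB (the sRGB white point is D65, not equal-energy), and a negative channel in the result is exactly the “escaped the gamut” situation discussed further down the thread.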

Honestly, the reason I wanted it was essentially just to get Blender’s own take on how it interprets blackbody materials as RGB, and to then directly compare those spectra, as opposed to the roundabout way I did it before, by using another thing to tell me what the values ought to be.
But it’s not that important.

Really, eventually (not now) it would be cool to have a variety of nodes such as:

- spectral normalization (make the largest wavelength be 1)
- inversion (literally just take 1 - spectrum)
- reversion (exchange highest and lowest wavelengths)
- clamping
- shift (make it look slightly more reddish or bluish - could be interesting for emulating, say, a photonic Doppler effect)
- what amounts to an equalizer as it works in music (for bass-boosting and such, equivalent to red-boosting here)

Basically, simple spectral equivalents to any given common RGB operation on one hand, and a number of music-inspired manipulations on the other.

But of course that shouldn’t be the priority right now.
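On a fully sampled spectrum, most of those manipulations are one-liners. A sketch (the binning and the test spectrum are arbitrary choices for illustration, not anything from the branch):

```python
import numpy as np

# A spectrum discretised over the visible range.
wl = np.linspace(380.0, 730.0, 36)                 # 10 nm steps
spec = np.exp(-0.5 * ((wl - 600.0) / 50.0) ** 2)   # a reddish bump

def normalise(s):
    """Scale so the largest sample is 1 (easy on a fully sampled array;
    harder at render time, when only a few wavelengths are seen per ray)."""
    return s / np.max(s)

def invert(s):
    """Literally just 1 - spectrum."""
    return 1.0 - s

def revert(s):
    """Exchange highest and lowest wavelengths (mirror the spectrum)."""
    return s[::-1]

def shift(s, nm):
    """Slide the spectrum along the wavelength axis by whole bins, for a
    crude red/blue (Doppler-like) shift. Note: wraps around at the ends."""
    step = wl[1] - wl[0]
    return np.roll(s, int(round(nm / step)))

red_shifted = shift(normalise(spec), 20.0)
print(wl[np.argmax(red_shifted)])  # peak moved from 600 nm to 620 nm
```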

Spectral utility nodes are on the list of things to add. The only one of those that probably won’t happen is normalisation: while we conceptually think of the spectrum as continuous, at render time the engine is only looking at X wavelengths at a time. Knowing the maximum takes either solving for it from first principles (by knowing all the equations that went into the spectrum) or finding it implicitly by stepping through the spectrum at a small interval. Neither is practical, and this is the same reason a ‘normalise texture’ node doesn’t exist.

All the other things you discussed should be pretty trivial with some basic tools like spectrum add/multiply/divide and a spectrum curve node. These things are definitely very valuable to have and will be some of the first nodes I add once I figure out how to do it.

1 Like

On that note, I’m attempting to duplicate the ‘RGB Curves’ node. I’ve got it compiling, but the node is called ‘Unknown’ in the UI and I receive this traceback when attempting to add it:

Traceback (most recent call last):
  File "C:\Users\chris\blender-dev-env\build_windows_x64_vc16_Release\bin\Release\2.90\scripts\startup\bl_operators\node.py", line 135, in invoke
    result = self.execute(context) 
  File "C:\Users\chris\blender-dev-env\build_windows_x64_vc16_Release\bin\Release\2.90\scripts\startup\bl_operators\node.py", line 122, in execute
    self.create_node(context) 
  File "C:\Users\chris\blender-dev-env\build_windows_x64_vc16_Release\bin\Release\2.90\scripts\startup\bl_operators\node.py", line 92, in create_node
    node = tree.nodes.new(type=node_type) 
RuntimeError: Error: Node type ShaderNodeSpectrumCurve undefined


location: <unknown location>:-1

My development branch for this is available here, so if anyone has any experience with this, I’d love some advice on what might fix it. This is the diff of what I’ve done to get to this point.

I don’t think spectral values should be passed between nodes; see my comments earlier in this topic.

This is incompatible with Eevee, OSL, MaterialX, etc, and problematic for importance sampling. Instead there can be emission and BSDF nodes that support generating spectral values directly.

I can definitely understand the incompatibility with Eevee and OSL; I don’t know anything about MaterialX, so I’ll take your word for it. I don’t understand why it is incompatible with importance sampling, but I don’t know the details of how importance sampling is implemented - I suspect I’m missing some key information there.

Is the current implementation of spectral blackbody already a problem?

There are algorithms that rely on being able to importance sample the wavelengths after shader evaluation, or to evaluate/sample closures multiple times with different wavelengths.

We don’t want shader node setups where this doesn’t work. The blackbody node implementation is a problem in that sense.
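To illustrate what importance sampling the wavelengths after shader evaluation can buy (a toy sketch, not how Cycles implements it): if the renderer can query the shader’s spectral value at any wavelength, it can pick wavelengths with probability proportional to that value and divide by the PDF, concentrating samples where the energy is. The spectrum and response curve below are made up.

```python
import random

random.seed(1)

# A discretised spectrum a shader could report, plus a sensor response
# we want to integrate it against (both made up for illustration).
values = [0.1, 0.4, 1.0, 0.7, 0.2]      # shader's spectral intensity per bin
response = [0.3, 0.8, 1.0, 0.8, 0.3]    # e.g. a sensitivity curve
total = sum(values)
exact = sum(v * r for v, r in zip(values, response))  # the target integral

def sample_bin():
    """Pick a bin with probability proportional to the shader's value;
    return (index, pdf) for the unbiased estimator."""
    u = random.random() * total
    acc = 0.0
    for i, v in enumerate(values):
        acc += v
        if u <= acc:
            return i, v / total
    return len(values) - 1, values[-1] / total

# Monte Carlo estimate: E[value * response / pdf] equals the exact sum.
n = 50000
estimate = 0.0
for _ in range(n):
    i, pdf = sample_bin()
    estimate += values[i] * response[i] / pdf
estimate /= n
print(exact, estimate)  # the estimate converges to the exact sum
```

Sampling proportional to the spectrum gives far less variance than picking wavelengths uniformly, but it requires being able to re-evaluate the shader’s spectrum for arbitrary wavelengths, which is the door that opaque spectral values flowing through the node graph would close.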

Okay. I don’t understand the issue, as the closures are already being evaluated at 8 wavelengths per ray, but I’m sure it’ll become clear when I run into it in the code. The only way I can see this being an issue is when the PDF depends on the channel (in RGB; the wavelength in a spectral engine), which seems unlikely.

Does ‘there are algorithms’ mean that Cycles is using such algorithms and I won’t be able to create a robust spectral renderer with Cycles without significant changes, or that such algorithms exist and you don’t want to lock yourself out from being able to use them in the future?

Mostly it’s about not locking ourselves out of being able to implement algorithms in the future, either to reduce noise in spectral rendering or to implement more advanced light sampling and global illumination algorithms.

And compatibility.

Okay, that makes sense. I can see the advantage of having that as an option for future optimisations. My primary concern is that if there is no way to directly create and modify spectra, then working with it would be unnecessarily complex. Each of the examples in @kram1032’s comment would need to be developed independently and would only work in very specific situations. I guess that’s the cost of being 100% compatible with an RGB rendering engine.

Plenty of what you are observing is anchored around the idea of colour spaces, which isn’t nearly as “smooth” as some of the assumptions suggest.

I’d point out that HSV and such pickers are really just random ass made-up formulas that are more or less RGB agnostic. What they do not do is divide up the RGB in a perceptually uniform fashion; it’s impossible as no such model exists to do so. Thinking HSV holds some sort of useful meaning would be a mistake. It’s simply a relative method to divide up a random set of RGB primaries.

Further, working around assumptions of sRGB / BT.709 within a spectral rendering engine is foolish. I’d consider the current approach a goofy stop-gap, as the whole point of a spectral engine is to be, well… spectral, eventually discarding the ridiculous limitations of RGB path tracing encoding models.

It’s really, really, really easy to bump into gamut mapping issues with spectral, as has been demonstrated here. Fastest check is to sample the values via a right click in the image viewer. If a negative value appears, congratulations, you’ve managed to see a spectral result escape the gamut.

1 Like

Yeah, I already said that plenty of these results are out of gamut.
We’ll never get completely away from some sort of RGB-ish color picking, seeing as how that’s gonna be necessary for easy-to-use artist workflows. However, it would certainly be possible to allow various models for various situations.

Though even going out of gamut, I don’t think higher absorption, given a properly normalized spectrum, ought to ever go towards infinity.

Remember that it wasn’t that long ago that people had to learn to use RGB sliders. I always caution against “easy to use” as it’s a bit of a tautological trap. Learning how to use RGB sliders was an actual thing, and I’d claim that the generation since never learned what they meant again.

Not really sure what “infinity” means here. Can you clarify?

Sure, having some sort of helpful GUI for spectral input, perhaps with a fancy little preview of how that spectrum actually behaves, could make that very tractable.
However, doing that for textures is gonna be more difficult. People will want to put image textures into place. And it’s not like your average camera actually captures several binned wavelengths to derive a spectrum from. Also, I’d imagine that’s prohibitive in terms of memory usage for now. (That’ll go away eventually though, I fully expect)

The values towards the right there exploded. Like, they’d go to, I don’t know, 1456781554 and -268465456, ridiculous values like that per channel. In fact they eventually also do that for blackbody.
That being said, we’re talking about ridiculously high absorption coefficients here. So I’m not too surprised that some sort of weirdness is starting to show up there, out of gamut or not.

Different issue, which is already more or less solved in terms of work produced in well defined colour spaces. In terms of camera material, that’s a rabbit hole unto itself, and wound up with gamut mapping and a plethora of other goofy rubbish.

It is still worth figuring out why it’s breaking down specifically. If it is quantisation, that’s one matter, and the values should probably be clamped to a sane limit. If it’s something else, it would be good to know what it is.

Here’s the file if you want to check it out. There is also an RGB based material as a fake user so you can easily switch that out too if you want.