Thoughts on making Cycles into a spectral renderer

I took a quick look at Mitsuba 2, very interesting indeed. The software architecture behind it is cool, I’d love to take a look in more detail at some point.

2 Likes

I have made some fairly in-depth comments and commentary about the code so far on my GitHub repo, if anyone is interested in looking at the code. It is intended for @polyrhythm who has expressed interest in helping with the project, but anyone else is free to read and give any advice they might have.

4 Likes

What a nice summary of what you have done so far! Good luck with that Next Big Hurdle :smiley:

1 Like

Thanks, it’s going to be a big hurdle indeed, but it’ll make such a difference to the engine. I feel like that’s really the point at which I could call it a ‘spectral rendering engine’.

5 Likes

Yeah, before this writeup I didn’t quite understand what you meant, but now I see the problem. This needs to happen. If I read that right, I think this might also be a big reason why this is currently so much slower than regular Cycles. It probably does the RGB-to-spectral conversion far more often than it would need to if this part were implemented, right?

Yep, you’re right, although I’m sure there are plenty of other reasons it is slow :joy: It is very poorly optimised right now, so there are going to be some huge inefficiencies in how things are done. That being said, there are probably more conversions happening than necessary.

Not sure if everyone in the thread was aware that Mitsuba 2 dropped, and it is a tremendous bombshell of work, directly related to spectral rendering given its architecture.

https://rgl.epfl.ch/publications/NimierDavidVicini2019Mitsuba2

8 Likes

It’s amazing they’ve achieved a material model which is entirely ‘retargetable’ to either spectral or RGB. I think essentially what happens is that the RGB mode just uses the spectral materials at 3 fixed wavelengths. If not, I’m very interested to find out how they handle calculations in a non-spectral primaries colour space.
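
Purely to illustrate my guess (all names and numbers below are made up; this is not Mitsuba code), the same material evaluation routine could serve both modes if the set of wavelengths is the only thing that changes:

```python
import math

def eval_lambertian(reflectance, wavelengths):
    """Evaluate a Lambertian BSDF at each given wavelength (nm).

    `reflectance` maps wavelength -> albedo in [0, 1]; the same code
    serves both modes, only the wavelength set differs.
    """
    return [reflectance(w) / math.pi for w in wavelengths]

# Hypothetical smooth reflectance curve, for illustration only.
def greenish(w):
    return 0.8 * math.exp(-((w - 540.0) / 60.0) ** 2)

# "RGB mode": the material evaluated at 3 fixed wavelengths.
rgb_mode = eval_lambertian(greenish, [600.0, 550.0, 450.0])

# "Spectral mode": the very same material evaluated at many sampled wavelengths.
spectral_mode = eval_lambertian(greenish, [380.0 + 10.0 * i for i in range(41)])
```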

1 Like

It is absolutely, uh… a ‘maybe three people in the world could do this’ sort of work. Side note, one of the lead developers is Wenzel Jakob… and that last name should ring a bell for those who follow this thread.

It really is a nuclear bomb which probably should have everyone saying “Why don’t we just give up and go to Mitsuba 2 with Enoki?”

1 Like

Having a good Mitsuba 2 <-> Blender integration would be awesome. I wonder how it performs on very large scenes and in terms of raw resolving speed compared to Cycles.

2 Likes

Watching that video, along with the magical Enoki component, makes me think there’s probably nothing that can even come close. It’s beyond next-generation stuff.

2 Likes

Please do not be deterred. Mitsuba is a tool for doing research into rendering, while Cycles is a production renderer. Just learn from the one and apply it to the other.

2 Likes

Yep, it certainly is full of very interesting research. I’m not going to give up on adding spectral rendering to Cycles, but it certainly seems that Cycles isn’t the perfect target for advanced spectral rendering techniques, at least in its current state.

4 Likes

Have a look at luxcorerender.org. LuxCoreRender has good Blender integration and is also GPL. Its features are on par with Cycles.

3 Likes

Jumping in the conversation here. For reference, I’m the author of Psychopath, which is a spectral renderer. That doesn’t mean I always know what I’m talking about, of course. I have a lot left to learn myself. So take what I say with a grain of salt. But I have at least some experience.

Yeah, that’s definitely going to be an issue. However, there are ways you can work around it to some extent.

For example, if you want to be able to specify blackbody emission, make a dedicated blackbody emission BSDF node, where the input is the temperature. (Or it could even just be an option in the existing emission BSDF node with a dropdown menu or something. But still, effectively a separate type of emission node.)
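
As a rough sketch of the underlying math (plain Python, not Cycles code), such a node only needs the temperature as input, because Planck’s law gives the entire spectrum:

```python
import math

H = 6.62607015e-34   # Planck constant (J*s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def blackbody_radiance(wavelength_nm, temperature_k):
    """Planck's law: spectral radiance (W * sr^-1 * m^-3) at one wavelength."""
    lam = wavelength_nm * 1e-9  # nm -> m
    return (2.0 * H * C * C) / (lam ** 5) \
        / (math.exp(H * C / (lam * KB * temperature_k)) - 1.0)

# A 3000 K emitter sampled every 10 nm across the visible range.
spectrum = [blackbody_radiance(w, 3000.0) for w in range(380, 781, 10)]
```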

Solutions like that let you get 80% (or more) of what anyone will ever actually care about.

The ideal situation, of course, would be for all BSDF nodes to take spectral inputs instead of color inputs, and any tristimulus values would be upsampled in the shading system. But I don’t think that’s realistic to expect to happen any time soon. IIRC (could be wrong?) even OSL assumes an RGB model, and the industry is basically standardizing around that. So outside of maintaining a completely separate shading workflow from the entire rest of the industry (e.g. like Weta does), you basically don’t have any other options.

And in practice I don’t think it’s really that bad to do things that way. Practically speaking, the vast, vast majority of color data going into a renderer is going to be tristimulus anyway. And I don’t think that’s likely to change any time soon. So it’s better to handle the few cases where it most matters (e.g. light emitters) explicitly, and let the rest be handled by spectral upsampling.
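
For what it’s worth, here is a toy illustration of what “spectral upsampling” means, using made-up Gaussian basis spectra rather than any real renderer’s method (real schemes such as Smits or Jakob & Hanika are much more careful about round-tripping back to the original RGB and staying within [0, 1]):

```python
import math

def gaussian(w, center, width):
    return math.exp(-((w - center) / width) ** 2)

def upsample_rgb(r, g, b, wavelengths):
    """Turn an RGB triple into a smooth reflectance spectrum by blending
    three basis spectra weighted by the RGB values (toy method)."""
    return [
        r * gaussian(w, 610.0, 50.0)
        + g * gaussian(w, 545.0, 50.0)
        + b * gaussian(w, 465.0, 50.0)
        for w in wavelengths
    ]

wavelengths = [380.0 + 5.0 * i for i in range(81)]       # 380-780 nm
spectrum = upsample_rgb(0.9, 0.4, 0.1, wavelengths)      # an orange-ish colour
```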

I would love to see a proper spectral shading workflow take over at some point, of course. The more you learn about this stuff, the more you realize that RGB is just wrong (artistic choice notwithstanding, of course). But that’s not the situation we’re in right now.

6 Likes

I’ve kept an eye on Psychopath for a while now; it’s what convinced me I could write a little raytracer in the browser.

You’re right, I feel like at this point (considering the complexity of interfacing with OSL) the best approach is to have dedicated spectral nodes, since light sources tend to differ the most between spectral and RGB anyway. That seems within reach for Cycles, just a few more hurdles to get over (volume and SSS code, then creating the initial spectral nodes). Maybe a blackbody node and an ‘SPD loader’ node are all that’s needed initially. Effects such as thin film interference could be handled by a dedicated node. That gives most of the benefit with much less work than a full rewrite would require.
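
Roughly speaking, all an ‘SPD loader’ node would have to do is something like this (sketch only; the two-column file format and the 5 nm bins are just assumptions):

```python
def load_spd(path):
    """Read a simple two-column text file: wavelength_nm intensity."""
    samples = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            w, v = line.split()
            samples.append((float(w), float(v)))
    samples.sort()
    return samples

def resample(samples, wavelengths):
    """Linearly interpolate a measured SPD onto the renderer's wavelength bins."""
    out = []
    for w in wavelengths:
        if w <= samples[0][0]:          # clamp below the measured range
            out.append(samples[0][1])
        elif w >= samples[-1][0]:       # clamp above the measured range
            out.append(samples[-1][1])
        else:                           # find the bracketing pair and lerp
            for (w0, v0), (w1, v1) in zip(samples, samples[1:]):
                if w0 <= w <= w1:
                    t = (w - w0) / (w1 - w0)
                    out.append(v0 + t * (v1 - v0))
                    break
    return out

# Hypothetical usage: a measured lamp spectrum resampled to 5 nm bins.
# spd = resample(load_spd("lamp.spd"), [380.0 + 5.0 * i for i in range(81)])
```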

3 Likes

Yeah, I think it’s probably the best way to go. It’s not what I plan to do with Psychopath… but Psychopath is a toy for playing with ideas that I find interesting, not something that people are depending on for production rendering.

Also, the more I think about it, the more I’m even doubting my own assertion about what the “ideal situation” would be. I think it’s a complex topic, and will require time and experimentation to figure out how something like a spectral shading workflow should work. Whereas “everything is RGB until you hit the BSDFs” is pretty obvious in terms of what the trade-offs and limitations are. And it integrates well with industry standards.

Oh, thanks! I’m always a little surprised when it turns out people know about it, ha ha. It’s just a toy, and isn’t actually useful for much of anything. But it’s certainly a project I enjoy hacking on. :slight_smile:

5 Likes

What is a spectral renderer?

The short version is that, instead of using three colors to illuminate a scene (typically light sources that correspond to Red, Green, and Blue in the sRGB color space), you use a continuum of pure spectral colors (think rainbow).

There are a bunch of possible future benefits (once it is implemented in full; there are lots of ways in which this is still very much WIP), but already now, in certain high-saturation situations, light sources actually contribute to appearance where they previously wouldn’t have, and those situations also show higher contrast (as my most recent tests above showed quite clearly, I think).

I’m a complete layman in regards to the subject, but shouldn’t spectral renderers provide natural chromatic dispersion/aberration and maybe iridescence to materials (instead of having to emulate the effect, e.g. glass dispersion using multiple IOR-variant nodes)?
I don’t see most people using completely saturated lamps and colors in their scenes, so they wouldn’t notice a difference with a spectral renderer, but I can see people noticing the difference if there’s natural dispersion in their glass shaders, or if it becomes possible to create a realistic iridescent bubble shader, a realistic Compact Disc shader, a burnt iridescent car exhaust shader, etc., without having to fake it.
Shouldn’t we also be comparing these cases? E.g. a regular Cycles diamond (IOR 2.418) vs. a spectral Cycles diamond, etc.
Or am I completely off-base and off-topic? If so, I’m sorry.

Edit: I see that over on Blender Artists the effects I mentioned were already being demonstrated by smilebags.

Assuming the BxDFs are written to support it, which currently none are.

Spectral makes a difference across all indirect bounces. The spectra sharpen when indirect spectral compositions hit similar spectral compositions, resulting in more saturated values, as well as less energy loss when similar spectra hit each other. Quite different compared to RGB.
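
A tiny numerical illustration, with a made-up reflectance spectrum over eight coarse wavelength bins:

```python
def bounce(spectrum, reflectance, bounces):
    """Multiply a light spectrum by the same reflectance at every bounce."""
    for _ in range(bounces):
        spectrum = [s * r for s, r in zip(spectrum, reflectance)]
    return spectrum

light = [1.0] * 8                                          # flat white light
red_paint = [0.02, 0.02, 0.02, 0.05, 0.2, 0.7, 0.9, 0.9]   # reflects mostly long wavelengths

after_three = bounce(light, red_paint, 3)
# The strong bins stay strong (0.9^3 ~ 0.73) while the weak ones collapse
# (0.2^3 = 0.008), so the colour grows more saturated with each bounce.
# Collapsing those spectra to RGB before multiplying gives a different,
# usually duller, result.
```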

4 Likes

Once all the existing BxDFs are converted to support spectral rendering, I will look at ways of extending them to show spectral phenomena without having to fake it. You are exactly right, @Evandro_Costa: in a spectral engine, all these effects can be handled by the engine so we don’t have to fake them, and we will end up with much more natural and realistic results.
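
To give one concrete example of what “handled by the engine” means: dispersion falls out naturally once the IOR is allowed to vary with wavelength, for instance via the Cauchy approximation (the coefficients below are illustrative values, roughly a dense flint glass, not data for any specific material):

```python
def cauchy_ior(wavelength_nm, a=1.728, b=0.01342):
    """Cauchy's approximation: n(lambda) = A + B / lambda^2, lambda in micrometres."""
    lam_um = wavelength_nm / 1000.0
    return a + b / (lam_um ** 2)

# In a spectral path tracer each ray carries a wavelength, so refraction
# simply uses n(lambda) for that ray and the rainbow appears by itself.
for w in (450.0, 550.0, 650.0):
    print(w, cauchy_ior(w))
```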

14 Likes