Thoughts on making Cycles into a spectral renderer

Right. Sounds like a job for various BSDFs, each with their own purpose and, perhaps, an attached material library.

There are still gonna be plenty of parameters to fiddle with. For instance, Roughness or Density aren’t gonna go away because of this. And some of these BSDFs will introduce new parameters. For instance, you could have a proper coating BSDF, where a thin film of some material lies on top of another, and that’s likely gonna have a coating thickness parameter.
A very similar thing will also apply to soap films (i.e. bubbles).

All that rainbowy goodness will be there, supported out of the box.

That being said, implementing such BSDFs is gonna be a new challenge. I think it’s first gonna be necessary to have sufficient compatibility with the plans for Blender Master for this to be merged.

Although there might be a slight chicken-and-egg problem there?
As far as I understood Brecht, he doesn’t want direct user access to spectra at all for a variety of reasons I don’t fully understand.

Do you think something like a complex IoR shader could be within the scope of the current project? It seems to me that tends to be one of the first things that gets implemented in spectral renderers, so I can only assume it’s relatively doable? And it would presumably be something that could actually land in Master.

1 Like

The math behind these BSDFs is of course some work for a developer, but from the user’s perspective, selecting a thickness and a few IoR values is a lot nicer than having to do multiple-bounce wave interference and trigonometry in a node-tree - hence spectral being simpler.
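
To make that concrete, here’s a minimal sketch (Python, not actual Cycles code) of the kind of per-wavelength math such a coating or soap-film BSDF would evaluate internally - a single non-absorbing dielectric film, s-polarisation only, with all the names made up for illustration:

```python
import numpy as np

def thin_film_reflectance(wavelength_nm, thickness_nm, n_film, n_above=1.0, n_below=1.5, cos_theta=1.0):
    """Reflectance of a single dielectric thin film on a substrate, per wavelength.

    s-polarisation only, real (non-absorbing) indices; just an illustration of the
    interference math, not Cycles code.
    """
    # Snell's law gives the cosines inside the film and in the substrate.
    cos_film = np.sqrt(1.0 - (n_above / n_film) ** 2 * (1.0 - cos_theta ** 2))
    cos_below = np.sqrt(1.0 - (n_above / n_below) ** 2 * (1.0 - cos_theta ** 2))

    # Amplitude reflection coefficients at the two interfaces (Fresnel, s-polarisation).
    r12 = (n_above * cos_theta - n_film * cos_film) / (n_above * cos_theta + n_film * cos_film)
    r23 = (n_film * cos_film - n_below * cos_below) / (n_film * cos_film + n_below * cos_below)

    # Phase accumulated over one round trip through the film.
    phi = 4.0 * np.pi * n_film * thickness_nm * cos_film / wavelength_nm

    # Airy summation of the infinitely many internal bounces.
    r = (r12 + r23 * np.exp(1j * phi)) / (1.0 + r12 * r23 * np.exp(1j * phi))
    return np.abs(r) ** 2

# A 400 nm coating on a glass-like substrate, sampled across the visible range:
wavelengths = np.linspace(380.0, 730.0, 36)
reflectance = thin_film_reflectance(wavelengths, thickness_nm=400.0, n_film=1.33)
```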

I think that’s the case, yes. I’m hoping that a change we’ve made recently (isolating all the spectral nodes in their own group) might help convince him that it’s alright - there are going to be inevitable compatibility problems with OSL and Eevee, and the best we can do is make it obvious what is ‘out of bounds’ if you want an OSL- and Eevee-compatible material. With this new approach that’s quite simple: avoid the nodes in the spectral group and everything will continue to work as expected.

Complex IoR is entirely possible, and probably quite easy. It could be another node in the spectral group (or, if there’s a suitable Eevee and OSL fallback, maybe a regular node). The thing with a node like this is it’s definitely more on the ‘scientist’s utility’ side of rendering, and might be somewhat unintuitive for new users.

2 Likes

I dunno, the details of how it works might be, but there are hundreds of everyday materials available in the form of complex IoR tables, and if your goal is to, say, render gold or silver coins, it’s pretty much trivial for new users to go “oh, I want this material”, plug it in, and be done. Obtaining the data from scratch is hard, but in a lot of cases that work has already been done thoroughly, and plugging in that data ought to be trivial.
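
For what it’s worth, going from a tabulated complex IoR to a spectral reflectance is a tiny bit of math. A rough Python sketch (the numbers in the table are placeholders, not real measured data - a real table would come from somewhere like refractiveindex.info):

```python
import numpy as np

def conductor_reflectance_normal(n, k):
    """Fresnel reflectance at normal incidence from a complex IoR (n + i*k)."""
    return ((n - 1.0) ** 2 + k ** 2) / ((n + 1.0) ** 2 + k ** 2)

# Hypothetical table of (wavelength nm, n, k); placeholder values for illustration.
table = np.array([
    [450.0, 1.40, 1.88],
    [550.0, 0.42, 2.35],
    [650.0, 0.16, 3.15],
])

wl, n, k = table.T
reflectance = conductor_reflectance_normal(n, k)

# Resample onto whatever wavelengths the renderer asks for:
wl_query = np.linspace(450.0, 650.0, 21)
reflectance_query = np.interp(wl_query, wl, reflectance)
```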

6 Likes

Found another source of spectra to perhaps try
https://omlc.org/spectra/index.html
This one has a bunch of spectra that are gonna be interesting for rendering proper skin. Stuff like fat, blood, water, and melanin.
Some of them, like the blood spectra (for both oxygen-rich and oxygen-poor blood), are given as molar extinction coefficients in litres per mole per centimetre, so you still have to multiply by a concentration and a path length, and they probably need to be converted to transmission spectra as well.
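
If I understand the units right, that conversion is just the Beer-Lambert law. A small Python sketch (the extinction and concentration values here are placeholders, not taken from the tables):

```python
import numpy as np

def transmission_from_molar_extinction(epsilon, concentration, path_length_cm):
    """Beer-Lambert law: T(lambda) = 10^(-epsilon(lambda) * c * l).

    epsilon: molar extinction coefficient in L / (mol * cm), as in the omlc.org tables
    concentration: molar concentration in mol / L
    path_length_cm: path length in cm
    """
    return np.power(10.0, -epsilon * concentration * path_length_cm)

# Placeholder extinction values for two wavelengths (not the measured data):
epsilon = np.array([3200.0, 290.0])
T = transmission_from_molar_extinction(epsilon, concentration=2.3e-3, path_length_cm=0.1)
```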

Clear seawater according to this source, from 100m to 0m, if my understanding of the absorption node is correct. Interesting to see the slight hue shift from a pale greenish colour near the surface to a deep purple-blue at 100m. This is without scattering, so actual values might differ slightly.

(oceanic water, most transparent)

Edit: It might be a coincidence but you can almost see the exact colours from the photograph in the render :smile:

6 Likes

Assuming it’s correct that Blender’s Absorption node expects transmission spectra, I just spent a bunch of time converting the blood (to be more precise, hemoglobin), fat, and water spectra into transmission ones, giving me these results:

If I got that right, the left (vertical) axis is percent blood oxygen, with oxygen levels decreasing as you go up. Without oxygen, the blood is much darker and clearly more bluish.
From left to right, absorption density increases exponentially, starting at 0 and going up to 999.
(Note that this is in arbitrary units. I picked ones where the top and bottom ends were both at least somewhat reasonable in size, so I could still put them in as a curve. It turns out blood has wildly different absorption levels at different wavelengths - you can see that in the ~400-500nm range, which is almost 0, and the extended range at long wavelengths, which is almost 1. Either way, that density value doesn’t have much meaning except in relative terms.)

The material looks like this:

Here is a similar setup for fat, also taken from that website (only the x axis varies here; density goes from 0 to 99. Also, unfortunately, the data only starts at 430nm).

Here is the same thing but for (pure) water. Density ranges from 0 to 999. Note that almost all of this is out of gamut.

The blood, fat, and water curves are based on this spreadsheet:

(I have slightly adjusted the curve for blood so it’s more even between the highest and lowest values in the transmission spectrum, making it so the lowest values aren’t quite so low. The result is almost identical though.)

7 Likes

Indeed. And hopefully, with the Asset Manager project being somewhat of a priority, shipping a default set of realistic materials copied directly from such tables as a library wouldn’t be entirely out of the question.

Also, a question: is this project mainly about colors, or about light transport/tracing as well? I.e. would playful renders like this be possible: link (this is 1 source of light bounced or focused 7 times)?

1 Like

That link is crazy. Am I seeing light refracting through a volume? (seeing as the contours of the beam seem to bend)

I don’t think the volume is refracting the light considering the shadows cast by the lens are still straight.
It’s probably just that the lens doesn’t focus all the light perfectly in one point and therefore the refracted light doesn’t form a cone. The Wikipedia page on spherical aberration illustrates this quite well.

2 Likes

This branch doesn’t modify the light transport algorithms at all, so unfortunately it won’t help with rendering scenes with caustics.

4 Likes

I’m experimenting with a change to extend the wavelength sampling range out to 360-830nm, but I’m not going to push it until I also develop wavelength importance sampling, due to the slightly worse noise performance compared to the more restricted 380-730nm range. In some preliminary tests it could be useful in some edge cases, but in most situations the difference is negligible, hence waiting for wavelength importance sampling before putting it up for everyone.
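
For anyone curious what wavelength importance sampling amounts to: conceptually it’s just inverse-CDF sampling over some importance curve instead of picking wavelengths uniformly. A rough Python sketch (the curve and names here are made up for illustration, not what the branch will actually use):

```python
import numpy as np

# Hypothetical importance curve over the extended range. In practice it might be
# derived from the luminous-efficiency function or the observer/camera response;
# these numbers are placeholders.
wavelengths = np.linspace(360.0, 830.0, 48)
importance = np.exp(-0.5 * ((wavelengths - 555.0) / 80.0) ** 2) + 0.01  # keep strictly positive

# Build a cumulative distribution by trapezoidal integration, then normalise.
segment_areas = 0.5 * (importance[1:] + importance[:-1]) * np.diff(wavelengths)
cdf = np.concatenate([[0.0], np.cumsum(segment_areas)])
total = cdf[-1]
cdf /= total
pdf = importance / total  # probability density, needed to weight each sample

def sample_wavelength(u):
    """Map a uniform random number u in [0, 1) to a wavelength via the inverse CDF."""
    lam = np.interp(u, cdf, wavelengths)
    return lam, np.interp(lam, wavelengths, pdf)

lam, p = sample_wavelength(0.37)  # the path contribution then gets divided by p
```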

3 Likes

Isn’t the wavelength sampling range something that should/could be configurable from the UI? Okay, most artistic stuff will be in the visible range, but other (more scientific) applications would benefit from some range extension (e.g. false-coloured galaxies, robot or animal vision, etcetera).

Ideally, yes, that would be nice to have, though it does come with its own set of complexities. Really, that would fit into a larger project of allowing custom spectral response curves per channel and allowing for more (or fewer) than 3 channels in the render output.

This would also allow for matching camera response curves or emulating animal vision etc., and, as you said, things like false-colour representations of spectral data.
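
Conceptually that output stage is just integrating the spectrum against each channel’s response curve. A tiny Python sketch with a made-up two-channel ‘sensor’ (placeholder curves, not real data):

```python
import numpy as np

def spectrum_to_channels(wavelengths, spectrum, response_curves):
    """Integrate a spectrum against per-channel response curves.

    response_curves has shape (num_channels, num_wavelengths). The usual RGB
    pipeline is the special case where these are the CIE colour-matching
    functions followed by an XYZ-to-RGB matrix.
    """
    delta = np.gradient(wavelengths)               # wavelength step per sample
    return response_curves @ (spectrum * delta)    # one value per channel

# Hypothetical two-channel 'sensor' (placeholder curves, not real measurements):
wl = np.linspace(380.0, 730.0, 36)
sensor = np.stack([
    np.exp(-0.5 * ((wl - 480.0) / 40.0) ** 2),   # short-wavelength channel
    np.exp(-0.5 * ((wl - 620.0) / 50.0) ** 2),   # long-wavelength channel
])
channels = spectrum_to_channels(wl, np.ones_like(wl), sensor)
```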

It introduces new challenges with the importance sampling too, since the system would need to be able to determine the ‘output influence’ of each wavelength, which is probably a rather challenging thing to do in such a flexible system.

This is a project for a later date, and it would really depend on user demand since it has some pretty big implications for how the render output system works.

3 Likes

Branch updates:

  • updated Blender
  • added more operations to “Spectrum Math” node
  • float values are now automatically converted to flat spectrum
  • renamed current blackbody node to “Spectral Blackbody”
  • old “Blackbody” node is back

Note: muting the “Spectrum Math” node is buggy right now.

13 Likes

Okay. Sounds like there should be some “future project” page on this? Such a page would list “advanced” features, like:

  • multi-band (texture) input
  • multi-band output
  • monochromatic input
  • monochromatic output
  • custom camera response/sensitivity
  • fluorescence/phosphorescence
  • polarization
  • monochromatic input could probably happen as soon as the node is converted
  • monochromatic output would be covered by multi-band output as well (just select a single band)
  • fully correct fluorescence requires a new BSDF that allows spectral distributional input (i.e. for every wavelength of input, what’s the output distribution)
  • polarization seems like a very different beast to me. It would, of course, also be cool to have, to get fully correct bounce light or those extremely cool shots of, like, water through a polarizing filter, but for most applications I suspect it wouldn’t matter that much unless we got another integrator that can properly handle caustics - which would probably be a huge undertaking, akin to writing a whole new render engine rather than just updating Cycles.

Yep, you’re right. Polarisation would be a whole project on its own, definitely not just another thing to be added along with spectral rendering. If you want to use a rendering engine for scientific research @Ivo, Mitsuba 2 is a great choice - it supports a lot more technical features than Cycles and is a great way to learn about raytracing too.

I’ve been thinking about fluorescence a bit, and I think it’s going to be a real challenge to offer a ‘fully flexible’ solution to it, since the input isn’t even just a curve panel, but more like a 3D mesh of probability. There are plenty of technical and user-experience issues to solve in that. Other features (and performance) are a much higher priority at the moment.

3 Likes

A 3D mesh? You wouldn’t have to include a 3rd physical dimension for this. A heat-map sort of deal would be enough, and also visually clearer: basically, instead of a vector of values, you end up with a matrix.

But that’s just the UI side. The actual rendering side is different of course, and unless there is already some nice algorithm (perhaps some variation of Hero Wavelength Sampling) out there, you’d probably need a new way to “do it right”.

Yep, the technical challenges aside, what I was getting at with ‘3D’ there is exactly what you’ve mentioned: for each wavelength we need to be able to represent a distribution. Heatmap (which works well for visualising 3-dimensional data but is somewhat difficult to input), 3D mesh, a curves panel with a ‘timeline’ below it - regardless of how it’s implemented, it’s a lot of information to input, so I’d need to look at whether there are any degrees of freedom which could be removed without it feeling limiting.
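
For reference, the usual way this gets represented is a reradiation (Donaldson) matrix. A minimal Python sketch of the data layout, with placeholder values rather than a measured material:

```python
import numpy as np

# A reradiation ("Donaldson") matrix M: entry M[j, i] says how much light absorbed
# at excitation band i is re-emitted in band j. The diagonal holds the ordinary
# (non-shifted) reflectance. Values below are placeholders, just to show the shape.
num_bands = 8
M = np.diag(np.full(num_bands, 0.5))
M[5, 2] = 0.2   # some energy absorbed in band 2 re-emitted at the longer band 5

incident = np.ones(num_bands)   # incident spectrum, one value per band
outgoing = M @ incident         # outgoing spectrum includes the fluoresced energy
```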

I’d argue that, even with the current spectral curves, ideally you’d want what amounts to a material library. Hardly anybody will want to fiddle with spectra directly. The curve version is just about doable, but filling in an entire matrix correctly is gonna be tough, no matter what. (Though, fwiw, many of the tools you’d need for this, especially in heat-map form, are already in Blender: it’d be akin to painting a monochrome texture.)

Really, if you want a proper way to fiddle with spectra that aren’t just measured and thus extracted from nature, you’d (eventually) want a whole mode centered around this, with plenty of tools to help you visualize the results in various scenarios.
Like, effectively I’d imagine a route where, in the shader node, you only ever use either upsampled RGB values or specific spectra from a large library. And you could add to that library in a variety of ways. Stuff like:

  • drag and drop a spectral data file into Blender
  • build such a file yourself with external tools (and then import it)
  • build it yourself using a spectrum maker that could be in Blender

I’d guess that Everything Nodes, once that happens, would pave the way to make this kind of idea more flexible. But it’d be a new area to maintain, of course. Still, meaningfully building spectra with special, interesting properties that you can’t just obtain by measuring something is gonna require actual tools with powerful previews. Otherwise you’ll have a very hard time generating a specific shade that initially looks one way and then, after a bunch of bounces, looks another - especially once fluorescence is actually considered, if that ever happens.

Btw, on that front, one rather easy thing that would be nice to have is what amounts to a spectral gamma correction, i.e. simply taking spectra to a power. This is useful because any absorption spectrum with unclear units is equivalent to a whole family of transmission spectra, each obtained by taking one candidate transmission spectrum to a specific power (since T = exp(-αd), rescaling the unknown density or thickness just raises T to a power).
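
A rough sketch of what such a node would do, and why it’s equivalent to rescaling the unknown density/thickness (Python, names made up for illustration):

```python
import numpy as np

def spectral_pow(transmission, k):
    """'Spectral gamma': raise a transmission spectrum to a power, per wavelength.

    Since T(lambda) = exp(-alpha(lambda) * d), we have T**k = exp(-alpha * (k * d)),
    so this is equivalent to scaling the (unknown) path length or concentration by k.
    """
    return np.power(np.clip(transmission, 0.0, 1.0), k)

# Placeholder transmission curve; a pale-looking spectrum becomes much more
# saturated once it is taken to a higher power:
T = np.array([0.95, 0.90, 0.97, 0.60, 0.85])
T_saturated = spectral_pow(T, 8.0)
```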

For instance, in the data set I converted, if you try out the spectra there as reflection spectra, they’ll all look really pale. You’d have to take them to some power before the actual colors you’d likely want to go for start showing.

I specifically normalized them such that the topmost and bottommost extreme values are roughly equally far away from the extremes of 0 and 1, in order to hopefully minimize floating-point issues and make them easier to put in as a curve. But as reflection spectra, that just doesn’t suffice. Like, the ones for blood will end up looking a pale orangish yellow (though, interestingly and nicely, the shadows, where multiple bounces happen, end up becoming red).