Implementing gradient-index refraction (atmospheric refraction)

A common need for me is to render images as they would appear under the influence of atmospheric refraction. This effect makes distant objects like mountains appear higher than they really are, and is caused by the vertical variation of the refractive index of air, which makes light rays bend down towards the earth. Being able to simulate such an effect with Blender is a useful technical tool, although I concede it won’t be that useful for artists.

I’m writing this post to give an idea of what I want to implement for myself, while staying open to input that makes things align better with the goals of Cycles, so that the functionality may be merged and benefit others besides myself, if that’s relevant at all.

I have already made myself a basic raytracer/shader for this, written in Python for Blender, which works okay, albeit very slowly. To improve the speed, I’m interested in implementing it in a faster engine like Cycles, which uses OptiX/CUDA.

To trace the ray in the atmosphere, one needs to solve two coupled ordinary differential equations.
The integration method I’ve implemented is a fourth-order Runge-Kutta integrator. Instead of shooting a single ray from the camera, it marches several finite ray segments in sequence that together trace out a curved path, until the path intersects a scene object, at which point the ordinary shading operations of the raytracer take place.
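For concreteness, the two coupled ODEs I’m referring to come from the gradient-index ray equation d/ds(n dr/ds) = ∇n written in first-order form: with T = n dr/ds, one gets dr/ds = T/n and dT/ds = ∇n. Below is a minimal Python sketch of that RK4 marching, not my actual prototype code; `n_of` and `grad_n_of` are placeholder callbacks for whatever atmosphere model is plugged in.

```python
import numpy as np

def rk4_ray_step(r, T, ds, n_of, grad_n_of):
    """One RK4 step of the gradient-index ray equations.

    State: position r and "optical direction" T = n * dr/ds, where
        dr/ds = T / n(r)
        dT/ds = grad n(r)
    n_of(r) returns the scalar refractive index, grad_n_of(r) its gradient.
    """
    def deriv(r, T):
        return T / n_of(r), grad_n_of(r)

    k1r, k1T = deriv(r, T)
    k2r, k2T = deriv(r + 0.5 * ds * k1r, T + 0.5 * ds * k1T)
    k3r, k3T = deriv(r + 0.5 * ds * k2r, T + 0.5 * ds * k2T)
    k4r, k4T = deriv(r + ds * k3r, T + ds * k3T)

    r_new = r + ds / 6.0 * (k1r + 2 * k2r + 2 * k3r + k4r)
    T_new = T + ds / 6.0 * (k1T + 2 * k2T + 2 * k3T + k4T)
    return r_new, T_new

def trace_curved_ray(origin, direction, ds, max_dist, n_of, grad_n_of):
    """March finite ray segments up to max_dist, yielding each segment's
    endpoints so the caller can intersect them with the scene."""
    r = np.asarray(origin, dtype=float)
    T = n_of(r) * np.asarray(direction, dtype=float)  # direction assumed unit length
    travelled = 0.0
    while travelled < max_dist:
        r_prev = r
        r, T = rk4_ray_step(r, T, ds, n_of, grad_n_of)
        travelled += ds
        yield r_prev, r
```

Each yielded straight segment is then tested against the scene exactly like an ordinary ray, and shading happens at the first hit.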

The variables affecting the curving are the refractive index (a scalar) and its gradient (a vector). These in turn vary with factors like temperature and pressure, which are both determined by altitude. Since this effect is only substantial in the atmosphere, it makes sense to me to implement it in the renderer itself instead of relating it to any data connected to Blender objects. To do this, the renderer must be configured with some notion of which direction is “up”, and at what level the atmospheric gradient starts.

Here is the control panel for my Python prototype renderer. (My renderer assumes the atmosphere starts at z=0 and that z represents altitude.) The user can choose the integration step size and maximum ray distance. A sub-panel contains settings for adjusting the atmosphere. These settings are essentially a calculator which takes in surface temperature, pressure, humidity, wavelength, temperature gradient (i.e. the lapse rate) and some other factors, and then calculates the refractive index and its gradient, which are used for rendering.
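As a rough illustration of what such a calculator boils down to (these are not the exact formulas or inputs of my panel): assuming a linear lapse rate, hydrostatic pressure, and the common dry-air refractivity approximation n − 1 ≈ 77.6·10⁻⁶ · P/T (P in hPa, T in kelvin), ignoring humidity and dispersion, n(z) and dn/dz could be computed like this:

```python
# Simplified standard-atmosphere constants (assumptions, not my panel's exact inputs)
G = 9.80665        # gravitational acceleration, m/s^2
R_AIR = 287.05     # specific gas constant of dry air, J/(kg*K)

def refractivity(P_hpa, T_kelvin):
    """Dry-air refractivity n - 1, using the common 77.6e-6 * P/T approximation."""
    return 77.6e-6 * P_hpa / T_kelvin

def atmosphere_n(z, T0=288.15, P0=1013.25, lapse_rate=-0.0065):
    """Refractive index at altitude z (m), assuming a linear temperature profile
    T(z) = T0 + lapse_rate * z and hydrostatic pressure for that profile
    (lapse_rate must be nonzero for this form of the barometric formula)."""
    T = T0 + lapse_rate * z
    P = P0 * (T / T0) ** (-G / (R_AIR * lapse_rate))
    return 1.0 + refractivity(P, T)

def atmosphere_dn_dz(z, h=1.0, **kwargs):
    """Vertical gradient of n via a central finite difference (step h metres)."""
    return (atmosphere_n(z + h, **kwargs) - atmosphere_n(z - h, **kwargs)) / (2.0 * h)
```

In the ray-marching sketch above, `n_of` and `grad_n_of` would simply evaluate these at the altitude of the current point.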

I have gotten Blender to build with CUDA support. I will try to get OptiX to build as well. I’ll start digging into the codebase to find the code path for raytracing and where it makes sense to put this stuff.

Good tips and ideas are appreciated 🙂


I’d implement this (or any Cycles feature, really) on the CPU backend first. Most of the code is shared between the CPU/OSL/CUDA/OpenCL backends, but CUDA takes forever to build, so you’d be more productive ignoring the GPU kernels in the beginning.


This is pretty specialized and not super likely to end up in Cycles master.

Maybe, if it can be implemented as a volume rendering closure so that it integrates well with other features. I guess that’s what it is fundamentally, but I’m not sure if there is an efficient formulation as part of a typical volume rendering algorithm.


@ParticularDynamics You can probably get away with an array of horizontal planes with a glass shader and cranked-up light path bounces. Here’s Mt. Monkey (not stretched!) and straight (!) beams in such a setup. Of course, there are artifacts, and this setup is not correct in all cases.
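For anyone who wants to try that workaround from Python, here is a hedged sketch of the setup as I understand it: a stack of large planes, each with a Glass BSDF whose IOR steps slightly between layers, plus raised transmission bounces. The layer count, spacing and IOR values below are made up purely for illustration.

```python
import bpy

# Illustrative values only: 20 layers, 5 m apart, IOR stepping slightly with height
num_layers = 20
layer_spacing = 5.0
base_ior = 1.0003
ior_step = -0.000005  # refractive index decreases with altitude

scene = bpy.context.scene
scene.cycles.max_bounces = 128
scene.cycles.transmission_bounces = 128  # rays must be able to pass through every layer

for i in range(num_layers):
    z = i * layer_spacing
    bpy.ops.mesh.primitive_plane_add(size=10000.0, location=(0.0, 0.0, z))
    plane = bpy.context.active_object

    mat = bpy.data.materials.new(name=f"AtmosphereLayer_{i}")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    nodes.clear()

    # Glass BSDF with a slightly different IOR per layer, wired straight to the output
    glass = nodes.new("ShaderNodeBsdfGlass")
    glass.inputs["IOR"].default_value = base_ior + i * ior_step
    output = nodes.new("ShaderNodeOutputMaterial")
    links.new(glass.outputs["BSDF"], output.inputs["Surface"])

    plane.data.materials.append(mat)
```

The transmission bounce count needs to be at least the number of layers a ray can cross, otherwise rays terminate early and the background goes dark.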

I’d say that ray bending is a real thing; it can even be demonstrated at home with a cool experiment showing ray bending due to a sugar concentration gradient in a fish tank:
(video: LaserBend)

Thanks. Volumes seem like the most idiomatic way to do it. I’ll have to study how volume rendering works and see whether it is plausible to implement it this way. Is there a writeup somewhere about the architecture of the renderer?

I found this https://blender.community/c/rightclickselect/Zbdbbc/ feature request, which is related to what I’m thinking about. A challenge of solving it in a generalized way is choosing the integration step size so it properly tracks the disturbances in the refractive index field, unless this is a property the volume can keep track of itself before rendering. The step size must be appropriate for the scale of the disturbances: for instance, over a burning fire the scale is a few centimetres, while in the atmosphere as a whole it is thousands of metres. One possible heuristic is sketched below.
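One way to make the step size scale-aware (just a thought experiment, not something Cycles currently does as far as I know) is to limit the relative change of n per step, clamped between minimum and maximum step lengths:

```python
import numpy as np

def adaptive_step(r, direction, n_of, grad_n_of,
                  rel_tol=1e-6, min_step=0.01, max_step=1000.0):
    """Choose a step size so the refractive index changes by at most rel_tol
    (relative) along the current ray direction.  Purely illustrative."""
    n = n_of(r)
    # Rate of change of n along the ray direction
    dn_ds = abs(np.dot(grad_n_of(r), direction))
    if dn_ds < 1e-30:
        return max_step  # field is locally uniform along this direction
    step = rel_tol * n / dn_ds
    return float(np.clip(step, min_step, max_step))
```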

That aquarium example is exactly what happens in the atmosphere. A limitation of using several layers/planes is that if the ray travels within a single layer there is no refraction at all, whereas with a continuous gradient that direction is exactly where the light bends the most.

The Cycles code documentation is here, but there is not much about volume rendering in particular.
https://wiki.blender.org/wiki/Source/Render/Cycles

The concepts are pretty standard and similar to what is explained here:
https://pbr-book.org/3ed-2018/Volume_Scattering/Volume_Scattering_Processes

Cycles automatically sets the volume step size, and this could be influenced by shader nodes and their parameters if needed.

I could not figure out a clean way to do the job in a volume shader, partly because I don’t understand enough about how they work. But I managed to add basic functionality to the non-branched kernel_path_integrate in kernel_path.h. For my purposes this will be fine.

Here is an example showing a tower and distant hills that appear substantially higher and more “loomed up” than their scene positions would suggest (orange outlines):

I’ve also been able to mock up the GUI by editing the Python code.
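In case it is useful to someone, here is a standalone sketch of the kind of panel/property code involved; the property names and ranges are hypothetical, and it is written as add-on-style registration rather than the way the bundled Cycles UI scripts are actually structured:

```python
import bpy

class AtmosphereSettings(bpy.types.PropertyGroup):
    # Hypothetical property names, for illustration only
    enabled: bpy.props.BoolProperty(name="Atmospheric Refraction", default=False)
    step_size: bpy.props.FloatProperty(name="Step Size", default=100.0, min=0.001)
    max_distance: bpy.props.FloatProperty(name="Max Ray Distance", default=100000.0, min=0.0)
    surface_temperature: bpy.props.FloatProperty(name="Surface Temperature (K)", default=288.15)
    surface_pressure: bpy.props.FloatProperty(name="Surface Pressure (hPa)", default=1013.25)
    lapse_rate: bpy.props.FloatProperty(name="Lapse Rate (K/m)", default=-0.0065)

class RENDER_PT_atmosphere(bpy.types.Panel):
    bl_label = "Atmospheric Refraction"
    bl_space_type = 'PROPERTIES'
    bl_region_type = 'WINDOW'
    bl_context = "render"

    def draw(self, context):
        settings = context.scene.atmosphere_settings
        layout = self.layout
        layout.prop(settings, "enabled")
        layout.prop(settings, "step_size")
        layout.prop(settings, "max_distance")
        layout.prop(settings, "surface_temperature")
        layout.prop(settings, "surface_pressure")
        layout.prop(settings, "lapse_rate")

def register():
    bpy.utils.register_class(AtmosphereSettings)
    bpy.utils.register_class(RENDER_PT_atmosphere)
    bpy.types.Scene.atmosphere_settings = bpy.props.PointerProperty(type=AtmosphereSettings)

def unregister():
    del bpy.types.Scene.atmosphere_settings
    bpy.utils.unregister_class(RENDER_PT_atmosphere)
    bpy.utils.unregister_class(AtmosphereSettings)
```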

What I now need to do is to pass the relevant parameters from my GUI panel into my kernels. It seems I need to put the info in the KernelGlobals struct for the kernel to access it, but I have not been able to find the proper place where the KernelGlobals struct is initialized with whatever parameters the user has set. Got any pointers on this?


Is there any update on this? I’m really keen on this feature 😃

Was there any progress on this topic in the last months?
I would be very interested in how you got your implementation working. Generally, I would like to achieve something similar:
I implemented a simple raytracer for volumes with arbitrary gradients (to render gradient lenses/eyes/fluids of different density, gravitational effects, etc.), which can run on the CPU but also supports CUDA. It would be very nice to get that running inside Blender, but I have no experience with the Cycles code, so I would appreciate any help or insight into your code.