I searched through the Blender roadmap, but I don't see GPU tessellation and displacement anywhere.

So no one has even thought about it, I guess.

I think the way it's implemented in modern games (and Maya) is pretty good.

In combination with Eevee it could do wonders for previs and rendering, but also for important workflow tasks like matching a character's feet to displaced ground.

Just imagine what animated displacement maps could do for visual design, on top of the realistic graphics qualities for game design and animation.

Similar to …

OpenSubdiv GPU acceleration

… it could sit at the end of the mesh pipeline and act as a "rendering effect" with NO Blender data attached to it (just the input from the displacement output in the shader node tree).

That's also how it's natively implemented on the GPU itself: it's basically a specialised post-processing hardware shader stage.

Nevertheless, because the GPU implementation supports both bend and standard displacement with full-scene tessellation at reasonable performance cost, and there would be no data to read back into Blender, it wouldn't really pose a performance threat; it would make Blender 100× more powerful for several tasks.

So implementing it would encompass:

  1. the Material Output node

  2. the Eevee master shader implementation

  3. (maybe a displacement modifier that must sit at the end of the modifier stack, like the OpenSubdiv implementation mentioned above; but I think the shader approach is better and can be tuned to get parity between Cycles and the viewport/Eevee)

I guess implementing a test case in Eevee isn't that hard, since the shader code isn't difficult to write and there are several GLSL examples on the internet.
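To make the idea concrete, here is a minimal sketch of what such GLSL code usually looks like in other engines: a tessellation control stage that sets the subdivision level, and an evaluation stage that samples a height map and displaces along the interpolated normal. All names (`displacement_map`, `tess_level`, etc.) are illustrative assumptions, not Blender or Eevee code; the two stages below are separate shader files.

```glsl
// --- tessellation control shader (one file) ---
// Assumed: triangle patches; `tess_level` chosen by the app,
// e.g. from camera distance for adaptive tessellation.
#version 410 core
layout(vertices = 3) out;

uniform float tess_level;

void main()
{
    // Pass the control points through unchanged.
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
    if (gl_InvocationID == 0) {
        gl_TessLevelOuter[0] = tess_level;
        gl_TessLevelOuter[1] = tess_level;
        gl_TessLevelOuter[2] = tess_level;
        gl_TessLevelInner[0] = tess_level;
    }
}

// --- tessellation evaluation shader (separate file) ---
#version 410 core
layout(triangles, equal_spacing, ccw) in;

uniform sampler2D displacement_map;   // height map, red channel used
uniform float displacement_scale;
uniform mat4 mvp;

in vec2 uv_in[];      // per-control-point UVs from the vertex stage
in vec3 normal_in[];  // per-control-point normals

void main()
{
    // Interpolate position, UV and normal across the patch using
    // the barycentric coordinates provided by the tessellator.
    vec3 bc = gl_TessCoord.xyz;
    vec4 pos = bc.x * gl_in[0].gl_Position
             + bc.y * gl_in[1].gl_Position
             + bc.z * gl_in[2].gl_Position;
    vec2 uv = bc.x * uv_in[0] + bc.y * uv_in[1] + bc.z * uv_in[2];
    vec3 n  = normalize(bc.x * normal_in[0]
                      + bc.y * normal_in[1]
                      + bc.z * normal_in[2]);

    // Displace the new vertex along the normal by the sampled height.
    float h = texture(displacement_map, uv).r;
    pos.xyz += n * h * displacement_scale;

    gl_Position = mvp * pos;
}
```

The key point for the proposal: all the generated geometry lives only on the GPU between these stages; nothing is read back, which is why there would be no Blender-side data to manage.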

Where are the master shaders for the Eevee viewport located in the code tree?

What do you guys think?

Do you think this is important and a good addition?
Is there something I'm overlooking (like selection, which runs through OpenGL, I guess)?
Any other problems?
Has anyone ever tried it already?

Thanks for listening.


The end user could work with ultra-low-poly assets, with tessellation/displacement active at all times throughout the creation pipeline …
Big scenes have the potential to become very light, since most geometry is moved out of Blender and into the underlying OpenGL code.
Previs would be solved once and for all.


I'm not sure if I understand your request correctly, but I think that is already in Blender. You need to enable experimental features first. It has been part of the experimental features for three years now, I think …

Or do you mean an Eevee implementation?

He means Eevee; it's been in Cycles for a long time, along with adaptive subdivision.

But pretty much every realtime engine out there tessellates on the GPU, except Eevee.
