Artist Members for Rendering Module

As mentioned in this blog post, we are looking for artist members to join the Rendering module.

In practice this means you would do one or more of these things:

  • Join the weekly rendering meetings if possible, or read the notes. We’ll start putting tasks that artists can help with in the meeting notes.
  • Follow this forum and get involved in topics, helping communicate what’s going on in development and answering questions.
  • Test new features when they are submitted for code review, and get involved in design tasks. You’d join the Render & Cycles and/or Eevee & Viewport project as a member and receive email notifications whenever tasks or code reviews tagged with the project are created or updated. Then you can give feedback, suggest improvements to the functionality or UI/UX, etc.
  • Create or gather demo .blend files for new features, to help test the feature and to show in release notes, demo videos, Blender Today, etc.
  • Help write release notes when features are committed. Developers will write a basic text explaining the changes; artists can add images and expand the text to present the feature well.
  • Improve the Render section of the manual, and help document new features there.

Metin Sevin (@MetinSeven) and Jonatan Mercado already volunteered, and I think both would be great to have on board.

As for tasks that we can use help with right now:

  • New Eevee Depth of Field was committed but is not in the release notes yet. Especially helpful would be a demo file and render demonstrating how the new algorithm improves results.
  • Cycles has improved SSS random walk sampling, demo images comparing old and new would be helpful.
  • Cycles point cloud rendering is being worked on, this could use a good demo file and render, created with geometry nodes.

If you’ll have me on the team, count me in. I’d be glad to help with Cycles in everything I can :slight_smile:


:+1: I’d like to help with these two tasks. Just to make sure: both of these are 2.93 features, aren’t they, so there’s some time to create the demo scenes?

You’re welcome to join the team.


Yes, they are 2.93 features and there is time.


Regarding this:

  • Cycles point cloud rendering is being worked on, this could use a good demo file and render, created with geometry nodes.

Can we use an Alembic file of a sand simulation to show this? It would be generated in different software (no need to mention which one), which is why I ask.
We already use Geometry Nodes to render sand with Cycles, and I think that case could be perfect for this.

Another possible option, depending on how point clouds behave, could be foam with Mantaflow. Right now it’s very problematic, and I imagine point clouds would help with this, possibly exporting the foam to Alembic for the time being, since I’m not sure particles are baked to OpenVDB, and OpenVDB still doesn’t have motion vectors, I think. But I can check with Sebbas to find the best option.


I’d be pleased to work on an Eevee DOF comparison.


Either is fine. Using an Alembic file generated in other software is fine; the purpose is to demonstrate the rendering.


:+1: Then I’ll go for the Cycles Random Walk SSS comparison, and leave the Eevee DOF comparison up to you.


That sounds awesome, Metin! I can’t wait to see your results.

@brecht I’m sorry if I don’t know the proper steps to report an issue, but as soon as I try to bake indirect lighting with Eevee in the 2.93 alpha version of Blender, it crashes. That will be a big obstacle to creating a good scene. What should I do?



You can report a bug on the tracker.


Done! Thanks!

I would not mind joining for showcasing new VR stuff once it’s solidified / merged.

I have been able to do quite a bit using XR_Action + upbge

The depsgraph seemed to fight me in the viewport, though, last I tried.

XR actions and UPBGE are not part of the Rendering module, and VR only to the extent that it is about rendering. So we’re not really looking for this type of demo for this module.

I found a current limitation working with GN and Point Clouds: it seems I cannot generate points from the vertices of another geometry.

I can distribute points on a geometric object; however, if I try to use vertices as points, or as sources of points, that’s not possible right now. I cannot create a point per vertex, nor convert a vertex cloud into a point cloud, which limits a lot of what I can do dynamically with Point Clouds for the time being, because I cannot load an Alembic file and use it as a source.

However, what I can do is instance an object on every vertex of the Alembic file, but that alone would kill the real benefit of the point rendering capabilities of the Point Cloud, am I right?

Today I’ll test if the point cloud receives materials or not and I’ll post a bug if it does not.

Meeting notes regarding artists module members:

New task to keep track of open tasks (I plan to create a new one for each release):


I think you can convert a Mesh object with Object > Convert > Point Cloud.

Instancing would indeed lose any benefit of point cloud rendering.
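For reference, the menu conversion above also has a scripting equivalent. This is a minimal sketch, assuming a build with the experimental point cloud object enabled; the `'POINTCLOUD'` target value is an assumption about that build, so check the enum in your version:

```python
# Sketch only: runs inside Blender's Python console, not standalone.
import bpy

# Assumes the mesh to convert is the active object.
obj = bpy.context.active_object

# Scripted equivalent of Object > Convert > Point Cloud
# ('POINTCLOUD' is assumed to be available in this build's target enum):
bpy.ops.object.convert(target='POINTCLOUD')
```

As noted, this conversion is a one-off on the current evaluated geometry, not a modifier, so it won’t track an animated Alembic cache per frame.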

Yes, the problem with that approach is that it’s not useful for Alembic, I think; since the conversion is static, it won’t work as a modifier turning every frame into a point cloud.

Sorry, I missed a word in my last sentence: if you use instancing, then point cloud rendering will not be efficient.