Open Mesh Effect branch (prev. Houdini Engine modifier for Blender)

I read your blog post. I think you should approach this the way RenderMan’s RSL was adopted by many renderers: just copy the Houdini Engine API’s format for defining which ops are used to create and modify the mesh. That way, people who have been using Houdini Engine will already be familiar with it, and people could move their effects from one platform to the other with little or no modification.

Again, I really like this idea and hope it takes off.


I am a bit concerned about just copying the Houdini API because of the aforementioned never-ending Google v. Oracle argument. IIRC, at the moment, as crazy as it sounds, an API can be copyrighted, according to the judges in that case.

That being said, you raise a good point. We have to think about attracting developers, and people who ended up using the Houdini API are potentially those targeted by Open Mesh Effect. So, even if we remain on the OpenFX base, we should definitely provide some open source utility code to adapt code meant for the Houdini API to Open Mesh Effect. Thanks for the suggestion!

A heads-up about the Open Mesh Effect integration into Blender:


Looks very promising. You need some examples and people would be so hyped.

An example of linking with vcglib, the core of MeshLab:

This will be an example of an actual Open Mesh Effect plugin that is GPL-licensed.
If you have ideas for modifiers that you miss in Blender, feel free to suggest them. :wink:


About the examples: it would be nice to have a video of something getting done (or started) in Houdini or MeshLab, then showing the process of how it arrives in Blender. TBH, I don’t get it yet. But I’d love to see and understand.

As for the new modifier: what I’ve seen asked for the most is Remove Doubles and a circular option for the Array modifier, but the devs are considering those already:


I’m not sure I get it either. From what I’ve gathered, some things don’t make much sense.

For example:

If you mix and match Open Mesh Effect and built-in Blender modifiers in the same stack, what happens?

From my current understanding, this would mean you can no longer move your mesh between applications, so you would have to rewrite all of Blender’s modifiers (current and future ones), and those of other DCCs as well, as mesh effect plugins. That is a very prohibitive workflow.

I don’t see Autodesk transferring their modifiers or node systems to work like that.

The only workable approach would be if the whole thing became a wrapper around Houdini.

That means you have to have a Houdini license.
At that point, it’s no longer an open standard, because the only way to use it fully depends on proprietary software? And if you mix it with built-in procedural effects from the DCC itself, the whole thing falls apart?

As I see it, this basically means it is either a wrapper for Houdini, or a roundabout way of giving an unofficial, shared way of writing C plugins for Blender.
It’s not going to allow sharing procedural content without Houdini, or only with very big limitations.

Correct, I should work on that. I’ve only made quick previews for Twitter so far; I will work on a longer format with more didactic narration :wink:

It does work, like any modifier. The convex hull example, for instance, stacks the OpenMeshEffect modifier after an Armature modifier. I could have added a Subsurf afterwards with no problem.

Nope. OpenMeshEffect and regular modifiers play well together. It is already not possible to move a mesh from one DCC to another without either deleting or applying the modifiers (i.e. burning them into the geometry, which cannot be undone). This does not change at first, but ultimately could if all modifiers are OpenMeshEffect ones. So the worst case is the status quo, and the best case is better interoperability.

The very point of my last example with VCG was to show that OpenMeshEffect enables new workflows completely independent from Houdini. VCG is a GPL library; using it in Blender through my branch requires nothing from Houdini.

I fear you’re attributing intentions to me that I don’t have. First, as seen earlier, this can remain totally independent from Houdini if one wants/needs. Actually, this is a strong requirement, because I want to keep open the possibility of proposing alternatives to Houdini. As for seeing this as a workaround for writing C plugins for Blender, well, that’s not wrong. But how is this bad news for Blender? It will remain unofficial until it becomes official (if it ever does), as with any feature that goes through a proper submission and code review pipeline.

You’re both right about a point: I should make this explanation video. :wink:


So as far as I understand, OpenMeshEffect would be an open API for modifiers. And it would allow introducing Houdini Engine, MeshLab operations, and custom modifiers (including paid ones).

People I’d love to hear feedback from

Have you got any feedback already?
I don’t see any reply about OME here, only earlier ones. Have you contacted them via IRC/mailing lists?
I’m concerned that your whole work might be rejected :s

Remove Doubles, Mesh Join, UV Unwrap, UV Translate, UV Pack, Mesh Translate, Bisect, something like the Bevel After Bevel add-on, something like the Tissue add-on, OpenVDB Remesh, Fracture, Mark Sharp, Asymmetric Mirror, and so on.
But most of them are on the way, or will be doable as modifier nodes (I guess).
Anyway, how would this connect to “modifier nodes”?

By the way, would it be possible to embed a mesh effect into the .blend file? So e.g. I create an effect and just send the file to my co-worker?
Whenever I use some custom build that allows me to do something, I just cannot send the file to someone to continue working on it, because I would have to send them my special build, and that’s annoying.


I don’t see any of this as bad, so no, no bad intentions LOL

That’s exactly what I meant: for interoperability you have to have everything inside OpenMeshEffect, which means you can’t use any native tools without breaking the whole thing… Besides, all DCC apps would need to become little more than GUIs for OpenMeshEffect plugins. It’s a huge undertaking to build every function of every DCC app as an OpenMeshEffect. Basically, all this functionality would need to be part of the standard, otherwise you can’t call it a standard, I think?

I didn’t say it is a bad thing. Sorry if you thought I was doing so…
Also, maybe once Blender becomes “Blendini” it can become the core for sharing procedurals.

I just don’t think this will be feasible as an open standard, because it will have big limitations. That does not mean it’s a bad thing :slight_smile:

How is your interoperability now?

I mean… can you use several modifiers in Blender and send the result to, let’s say… C4D?

The idea is that IF something happens to interoperability, it’s an improvement; in any other case, it stays as it has been until now.

I don’t think this is meant to improve interoperability unless other software includes OME support. Even so, that wouldn’t be the main goal. For instance, OpenFX, which has been widely adopted, doesn’t grant better interoperability between the compositors that support it. All it means is that someone can develop an OpenFX addon and it will work in both Nuke and Fusion.

What it will mean for Blender is that compiled add-ons can be written for Blender without going through the approval process. Right now, compiled modifiers have to be part of the Blender codebase. That wouldn’t be the case with OME. I think there are gray areas of the licensing that will probably need to be addressed, though.


@myway880 @JuanGea I agree with your prudence about interoperability, but @DanPool summarized perfectly why I am still motivated and confident that this will be useful. The point about compositing software is relevant: Natron is not able to open a Nuke scene, nor is Fusion able to read a Flame file, even though they might consist only of OpenFX plug-ins.

The confusion might come from the fact that there are two different notions of interoperability at play here. The first kind is interoperability between plug-ins and hosts: it is what makes it possible to write a plug-in that will automatically work in all DCCs supporting Open Mesh Effect. The interoperability of scenes is a longer-term goal, one that will have to deal with USD.

Oops, I admit I over-interpreted a bit, my bad ^^


Oh, I was not questioning your idea; on the contrary, I was supporting it, making the point that current interop between apps is as bad as it can get even IF this doesn’t bring anything new to the table in that regard :slight_smile:


@Elie I’m interested in contributing. I use Blender and Houdini together, I’ve been wanting to make a Godot module for Houdini Engine, and I like coding procedural mesh effects a lot.

A plug-in standard for mesh effects is a great solution, because I want to make an effect and be able to use it in Blender, Godot, After Effects, or any DCC that has 3D features. A similar technology that works like this, besides OpenFX, is VST. You create a single virtual instrument plugin and it works in every DAW nowadays. It can have a generic parameter interface or even its own UI.

Btw, Godot is open source and has support for C and C++ plug-ins with GDNative. It would be a good option for the second host you mentioned on your blog.


This is excellent, not because we get the ability to use Houdini modifiers, but because it provides an API for modifier creation. That really opens things up IMO. Of course, performance is an important consideration, but this looks really nice so far.


Hello @hammer! If you want to contribute, I think the easiest and best documented thing to start with is to follow the guide for writing Open Mesh Effect plug-ins. Tell me if anything is lacking! You can test your OME plug-in using the preview builds of the OpenMeshEffect branch.

Then, if you feel like it, working on an integration with Godot would be amazing. You can start by copying the intern/openmesheffect folder; except for the blender subdirectory, it is independent from Blender. This would be a way to populate the guide for writing hosts.

P.S.: Good catch about the VSTs :wink:


This looks intriguing. Would this potentially allow for something like an aaOcean implementation via OFX, rather than having to mess around with the internals of Blender?

How do I set parameters? I’m trying to use "parameterSuite->paramSetValue(widthHandle, 15)" in createInstance to set a default value for the parameter, but it doesn’t work.

The process regarding parameters is the following:

  1. The Describe action is called, listing the parameters
  2. The RNA/UI is built with a widget for each parameter
  3. The Instance is created
  4. Parameters are copied from UI to instance
  5. Effect is cooked

So it is important that the default values of parameters are known at step 2. OpenFX provides a mechanism for this, with the kOfxParamPropDefault property.

// In the Describe action, declare the parameter and retrieve its property set
OfxPropertySetHandle paramProps;
parameterSuite->paramDefine(parameters, kOfxParamTypeInteger, "Width", &paramProps);
// Set the default value (component index 0) to 15
propertySuite->propSetInt(paramProps, kOfxParamPropDefault, 0, 15);

That being said, I haven’t implemented it yet. So you can either open an issue so that I remember (I can do it quickly) or give it a shot; I think it is an easy fix. The relevant code is around openmesheffect/blender/intern/mfxModifier.c#L452.