I created a LOD system that also acts as a proxy system and can optionally back up your mesh data. It works as a mesh-data exchanging system, available as a new panel in Properties > Mesh Data > Level of Detail. I will do a little presentation video tomorrow.
You can choose which LODs to display in the active viewport, the rendered view, or the final render, each individually.
The LOD system uses properties stored within your object and object-data properties, which means that once you have created a LOD system for your object, the LOD data will stick with it when you copy or append your object to another file, for example. It even travels to users who don't have Lodify installed yet.
You can automatically search for LODs according to their names, if you use a naming scheme like "Suzanne LOD 0", "Suzanne LOD 1", "Suzanne LOD 2" or something similar.
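The name-based search could be sketched like this (a hypothetical illustration, not Lodify's actual code): given a base name, collect all mesh names that follow the "Base LOD n" scheme and return them ordered by LOD index.

```python
import re

def find_lods(base_name, mesh_names):
    """Return mesh names matching '<base_name> LOD <n>', sorted by n.
    Hypothetical helper illustrating the naming convention above."""
    pattern = re.compile(re.escape(base_name) + r"\s*LOD\s*(\d+)$", re.IGNORECASE)
    matches = []
    for name in mesh_names:
        m = pattern.match(name)
        if m:
            matches.append((int(m.group(1)), name))
    return [name for _, name in sorted(matches)]

names = ["Suzanne LOD 1", "Cube", "Suzanne LOD 0", "Suzanne LOD 2"]
print(find_lods("Suzanne", names))
# → ['Suzanne LOD 0', 'Suzanne LOD 1', 'Suzanne LOD 2']
```

Inside Blender, `mesh_names` would come from `bpy.data.meshes.keys()`.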
You can use this LOD system as a data-backup management system if you need to store your mesh data while working on destructive hard-surface modeling, for example. You will find two 'backup' operators in the menu next to the LOD list.
Note that each LOD can have its own materials, as material data is stored per mesh and not per object.
Keep in mind that pointers are considered data-users. You might want to clean leftover pointers from deleted objects with the 'cleanse data-block' operator.
If you are animating via modifiers (bones, for example), the LOD system will work perfectly with your animation, assuming the vertex groups are all assigned correctly for each level of detail (that's why it's easier to create your LODs from the final model, as the vertex groups will automatically be carried over when you simplify or decimate the topology). Shape-key animation is not compatible with the LOD system.
Regarding the rendered-view automatic mesh-data switching:
Note that Lodify will try to update the mesh data on each Blender internal update signal (called a depsgraph update), so if you use a custom shortcut or pie menu that doesn't send those depsgraph updates, you might just want to click anywhere in the viewport to trigger a new one. (The default header shading buttons will work 100% of the time.)
The addon was not made with linking from external blends in mind.
render.interface_lock will be enabled by this addon; this can be disabled in the options (but expect some crashes if you do).
About the addon's Python code:
This addon basically acts as a big mesh-data exchanging system, where mesh data is exchanged according to booleans stored in UI lists.
As I'm drawing inside the object mesh-data panel and constantly switching the active mesh, the 'lod_original' pointer is used as a constant and is stored in the object properties of all mesh-data owners. If you need to work with the Lodify API, always use this constant when it is not None; object.data is simply not reliable (-> 'true_mesh_data').
All the LOD switching is done via a function on each depsgraph update (-> 'analyse_and_exchange_data'). The code analyses the UI lists of all objects; if a list exists and a boolean is filled, the mesh data is exchanged or restored accordingly.
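The exchange loop described above could be sketched roughly as follows. This is a minimal hypothetical sketch, not Lodify's actual implementation: the property name `lod_items` and the helper `pick_lod_mesh` are invented for illustration, and only the pure decision logic runs outside Blender.

```python
def pick_lod_mesh(lod_items, context_key):
    """Return the mesh name whose boolean is set for this context
    ('viewport', 'rendered' or 'final'), or None to restore the original.
    Hypothetical helper mirroring the per-object UI list described above."""
    for item in lod_items:
        if item.get(context_key):
            return item["mesh"]
    return None

# Pure-logic usage (runs anywhere):
items = [
    {"mesh": "Suzanne LOD 0", "rendered": True},
    {"mesh": "Suzanne LOD 2", "viewport": True},
]
print(pick_lod_mesh(items, "viewport"))  # → Suzanne LOD 2

# Inside Blender, a depsgraph handler would use it roughly like this:
try:
    import bpy
    from bpy.app.handlers import persistent

    @persistent
    def analyse_and_exchange_data(scene, depsgraph=None):
        for obj in bpy.data.objects:
            obj_items = obj.get("lod_items")  # hypothetical property name
            if not obj_items:
                continue
            target = pick_lod_mesh(obj_items, "viewport")
            if target and obj.data.name != target:
                obj.data = bpy.data.meshes[target]  # the actual swap

    bpy.app.handlers.depsgraph_update_post.append(analyse_and_exchange_data)
except ImportError:
    pass  # bpy only exists inside Blender
```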
There is a severe Blender crash: while in rendered view, if data is changed from a function inside a depsgraph handler, Blender will instantly crash back to the desktop. To counter that, if the user is in rendered view and changes the rendered boolean, the view is toggled back and forth. You can experience this bug for yourself if you delete 'toggle_shading' in lines 825 and 827.
(I did my best so that the addon respects Blender's native UI.)
Hello, today I tried this addon and it is great. This is one of the reasons why I love Blender (because of people like you): it has an awesome community that keeps developing the program and making it better. I have a suggestion on how to make this even better, though I'm not sure how difficult it would be: simple LOD presets like a box or a plane. But it is great as it is, really handy for display performance.
This is really awesome. It's very easy to set up and works great as a proxy system for using low-resolution meshes in the viewport and high-resolution meshes in the render.
Is it possible to automate the LOD based on a user-configurable distance to the camera? This would allow us to do large vistas where objects close to the camera get rendered with LOD0 models and those further away get LOD3 or LOD4.
Thanks for posting! Hoping to see this develop into a full LOD system!
** Also, just realized that you’re the author of the Scatter tool. We purchased that on the Blender Market. Fantastic tool. Now makes sense why this proxy tool looks so familiar.
Yeah, it's auto-generating the mesh by using a driver on the Decimate modifier, so I wouldn't recommend using it on e.g. a character, where you want to preserve certain geometry.
Awesome addon! View-based LOD selection would be very easy to implement using handlers: loop over all the objects with LOD enabled and select the appropriate LOD level based on the distance to the viewport camera. Could be a great improvement!
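The suggestion above could be sketched as a simple threshold lookup (a hypothetical illustration, not part of Lodify): map the camera distance to a LOD index via user-configurable cutoffs.

```python
def lod_from_distance(distance, thresholds):
    """Return the LOD index for a given camera distance.
    thresholds[i] is the max distance at which LOD i is still used;
    beyond the last threshold the coarsest LOD is returned.
    Hypothetical helper for the distance-based selection idea."""
    for index, limit in enumerate(thresholds):
        if distance <= limit:
            return index
    return len(thresholds)

# e.g. LOD0 up to 10 units, LOD1 up to 25, LOD2 up to 60, LOD3 beyond
print(lod_from_distance(5.0,  (10, 25, 60)))   # → 0
print(lod_from_distance(40.0, (10, 25, 60)))   # → 2
print(lod_from_distance(99.0, (10, 25, 60)))   # → 3
```

Inside a depsgraph handler, the distance could be computed as `(obj.matrix_world.translation - camera.matrix_world.translation).length`.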
Yeah, but how do you handle instances from particle systems?
Distance-based LOD switching could be extremely useful for creating forests, for example; the only problem is that the current hair instancing solution doesn't allow for much flexibility.
You can't really do that inside one particle system through Python; it has to be homogeneous in terms of the objects it displays. However, you can still set correct LODs if the particle emitter itself is far away from the camera, using its global "bounding box" (which you need to calculate yourself), defined by how far the particles spread from the emitter. So, in the case of having several particle systems on screen, that would help choose LODs for each of them. But proper distance-based LODs are not possible in terms of a smooth transition of particle instances from the camera to the far clipping plane, sadly.
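The per-emitter approximation described above could be sketched like this (all names hypothetical): treat the particle system as a sphere around the emitter and pick one LOD for all of its instances based on the camera's distance to the nearest point of that sphere.

```python
import math

def emitter_lod(camera_pos, emitter_pos, spread_radius, thresholds):
    """Pick one LOD index for a whole particle system.
    The emitter plus its particle spread is approximated as a sphere;
    the distance used is from the camera to the sphere's nearest point.
    Hypothetical sketch of the workaround described above."""
    d = math.dist(camera_pos, emitter_pos)
    nearest = max(0.0, d - spread_radius)
    for index, limit in enumerate(thresholds):
        if nearest <= limit:
            return index
    return len(thresholds)

# Emitter 100 units away, particles spreading 20 units around it:
print(emitter_lod((0, 0, 0), (100, 0, 0), 20.0, (10, 50, 90)))  # → 2
```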
Added new batch-operations functionality: batch enable/disable the LOD system on assets, or change status by LOD name ending.
Added a new mandatory mode to lock the interface while Cycles is rendering. It seems that exchanging mesh data mid-render causes unexpected behavior; Cycles really doesn't like mesh data being swapped while it is working…
Couldn't a gradient with an object group solve that? So the darker the gradient step, the different model it would use. But thinking about it, we can't control them like that. We could set up different particle systems split by distance, but that would mean lots of work if you have a big variation in models. I guess Everything Nodes could do this job, as it can work with real objects.