I am an environment and archviz artist, the author of the "Scatter" addon, and I have a lot to share about everything that concerns vegetation scattering in Blender.
So can we talk about the future of particle "hair" here? @jacqueslucke
Or, really, anything related to the future of scattering within Blender.
I don't know if hairs are going to be separated from the emitter after all, whether there's going to be a new particle hair node, or whether everything will be mixed into one node editor. It's not clear.
We should have a dedicated development period for scattering.
Scattering is how people use the particle system the most within Blender, and there has been a clear lack of features in this area for many years. Users keep asking for improvements and nothing gets done. That's why I made Scatter: it's an attempt at rethinking the current workflow and bringing improvised new features in Python that should really have been done in C in the first place.
I have no clue about hairs and furs; that's not my domain. The same goes for particle nodes and all things motion design: not my domain either, and way too complex for just scattering things around.
Therefore, everything in my posts below will concern scattering.
In the posts below I'll share thoughts about flaws, ideas, and improvements.
I'll try to restructure everything once I've gathered most of the important ideas.
Well, I'm not sure about hair yet, but as for scattering I know the team has remarked that the Array modifier, and indeed any other modifier that instances geometry, is inefficient, and that duplifaces are preferred. I'm pretty sure that in the future, with Everything Nodes, we will get more options. Specifically, I'm hoping for something similar to Houdini's approach with its "Scatter" node and "Copy to Points" node.
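To make the Houdini idiom mentioned above concrete, here is a minimal, dependency-free Python sketch of the two-step workflow: a "scatter" pass that samples area-weighted random points on triangles, then a "copy to points" pass that stamps a transform on each point. The function names are illustrative only, not a real Blender or Houdini API.

```python
import random

def scatter_on_triangles(triangles, count, seed=0):
    """Sample `count` random points on a triangle soup, area-weighted."""
    rng = random.Random(seed)
    def area(tri):
        (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
        ux, uy, uz = bx - ax, by - ay, bz - az
        vx, vy, vz = cx - ax, cy - ay, cz - az
        nx = uy * vz - uz * vy
        ny = uz * vx - ux * vz
        nz = ux * vy - uy * vx
        return 0.5 * (nx * nx + ny * ny + nz * nz) ** 0.5
    weights = [area(t) for t in triangles]
    points = []
    for _ in range(count):
        a, b, c = rng.choices(triangles, weights=weights)[0]
        # Uniform barycentric sampling (sqrt trick avoids corner clustering).
        r1, r2 = rng.random(), rng.random()
        s1 = r1 ** 0.5
        u, v, w = 1.0 - s1, s1 * (1.0 - r2), s1 * r2
        points.append(tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3)))
    return points

def copy_to_points(points, scale=1.0):
    """Return one (location, scale) instance transform per scattered point."""
    return [(p, scale) for p in points]
```

The key design point is the separation: distribution (where points land) and instancing (what gets copied there) are independent stages you can swap or tweak without touching the other.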
Without enough display options, visualizing complex biomes is almost impossible, or at least not comfortable at all. You cannot expect Blender users to tweak particle system parameters blindly with bounding boxes.
So here are display options that need to be implemented:
“display as proxy” in > properties editor > object > viewport display options
“display as point cloud” in > properties editor > object > viewport display options
“display as decimated” in > properties editor > object > viewport display options
For sure we could use a modifier to do so (like I did in the addon), but this feature is so important that we should have native support for camera clipping and distance culling within each particle system. A simple camera-clip toggle with threshold options, plus a distance-culling toggle with distance options, would do the trick easily.
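The distance-culling toggle described above is conceptually tiny; here's a pure-Python sketch of the idea (not Blender code, just the math a native implementation would run per instance). Working with squared distances avoids a square root per particle.

```python
def distance_cull(points, camera_pos, max_distance):
    """Keep only instance positions within max_distance of the camera."""
    md2 = max_distance * max_distance  # compare squared distances, no sqrt
    kept = []
    for p in points:
        d2 = sum((p[i] - camera_pos[i]) ** 2 for i in range(3))
        if d2 <= md2:
            kept.append(p)
    return kept
```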
There's no way to use Cycles' powerful nodes for particle distribution. It's a huge flaw, because Cycles offers very powerful tools for making procedural distribution maps that the texture editor simply doesn't have. Also, in most cases, particle distribution is directly linked to the terrain material: certain colors in the terrain material will never have particles (like rocks, snow, water, etc.).
Maybe a new "material to weightmap" modifier could do the job: from material data to a bitmap (I think there is already a shader for that), then from the bitmap to vertex weights.
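The last step of that pipeline (bitmap to vertex weights) could look like the following pure-Python sketch: a baked grayscale image, represented here as rows of 0..1 floats, is sampled at each vertex's UV with nearest-neighbour lookup. This is only an illustration of the data flow, not an existing Blender modifier.

```python
def bitmap_to_weights(bitmap, uvs):
    """Sample a grayscale bitmap (list of rows of 0..1 floats) at per-vertex
    UV coordinates, returning one weight per vertex."""
    h = len(bitmap)
    w = len(bitmap[0])
    weights = []
    for u, v in uvs:
        # Nearest-neighbour lookup, clamped to the image bounds.
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        weights.append(bitmap[y][x])
    return weights
```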
There's no way to keep particles from intersecting each other, either within the same particle system or across multiple particle systems. It's really problematic: in most cases, you don't want a scattered asset sitting inside another one.
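One way such a non-intersection option could work internally is greedy dart-throwing: accept a candidate point only if it keeps a minimum distance from every point already accepted. A pure-Python sketch (O(n²), a real implementation would use a spatial grid; the name and signature are my own, not a Blender API):

```python
def reject_overlaps(points, min_dist):
    """Greedily keep points that are at least min_dist from all kept points,
    a crude way to stop scattered instances from intersecting."""
    md2 = min_dist * min_dist
    kept = []
    for p in points:
        if all(sum((p[i] - q[i]) ** 2 for i in range(3)) >= md2 for q in kept):
            kept.append(p)
    return kept
```

Running the same pass over the concatenated points of several particle systems would also handle the cross-system case mentioned above.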
The current problem with the Vertex Weight Proximity modifier is that we can only operate on one vertex group at a time.
So if we want X objects to subtract weight from, let's say, 5 particle systems, 5 modifiers will be needed (if we want to keep a different vertex-group influence per particle system, of course, and it is really important to do so).
Having a modifier that can subtract multiple vertex groups at a time would be much more useful, and maybe more performance-friendly.
Of course, a solution easier for users than this modifier could be implemented (a node form?).
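The multi-group subtraction proposed above amounts to one pass over the weights instead of a modifier stack. A minimal sketch, treating each vertex group as a plain list of per-vertex floats (illustrative only, not the Blender modifier API):

```python
def subtract_vgroups(base, masks, clamp=True):
    """Subtract several vertex-group weight lists from a base weight list
    in a single pass, instead of stacking one proximity modifier per mask."""
    out = list(base)
    for mask in masks:
        for i, m in enumerate(mask):
            out[i] -= m
    if clamp:
        # Vertex weights live in [0, 1], so clamp the result.
        out = [min(1.0, max(0.0, x)) for x in out]
    return out
```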
This seems more logical and gives more control over the randomness. Right now it's a bit confusing, with "phase" being Z and "rotation" being XY.
Another confusing thing is that there are two options for controlling the scale of the particles: either the scale ratio or the hair length. Only one is necessary, and it should be a 1:1 ratio by default.
There's a texture mixing calculation type available in the code but not exposed to users?
Because it’s something completely different.
I guess @Alberto is talking about frustum culling, where geometry that lies outside the camera's viewing frustum gets culled before rendering. You have to think about what you are saving, and when. Anything that happens before rendering itself can cause a lot of workload, and that won't get better with frustum culling: if you are processing hundreds of thousands of particle positions, you can't cull anything before the position and/or dimension is final, so the computations before rendering, not the rendering itself, can be the bottleneck. On the other hand, if you put fewer elements into the scene with the camera check you implemented, and Blender's culling then skips anything outside the camera, you are saving multiple times: you calculated fewer elements, you generated fewer, and by changing the camera view even less remains on screen.
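For readers unfamiliar with the term, the per-point test behind frustum culling is small. A sketch in camera space (camera at the origin, looking down -Z, symmetric perspective frustum with a horizontal field of view); this is the geometric idea only, not how Blender implements it:

```python
import math

def in_frustum(point, fov_deg, near, far, aspect=1.0):
    """Test a camera-space point against a symmetric perspective frustum.
    fov_deg is the horizontal field of view; aspect = width / height."""
    x, y, z = point
    depth = -z  # camera looks down -Z, so visible points have z < 0
    if not (near <= depth <= far):
        return False
    half_w = math.tan(math.radians(fov_deg) / 2.0) * depth
    half_h = half_w / aspect
    return abs(x) <= half_w and abs(y) <= half_h
```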
Maybe a bug? → the representation of particles in the viewport is inaccurate.
First case: the rotation is set to random and the display count to 1%, but somehow at 1% the particle rotations are not random. So using the display % is actually a bad way of previewing what will happen with the particle distribution.
Second case: having two textures influencing density/length will make the particle representation in the viewport totally inaccurate when changing values. An edit-mode refresh is needed to make it precise.
PERFORMANCE ISSUES, of course.
Lots of particle instances slow Blender down; other software can handle instances much, much better at a much higher count!
It seems that the outline overlay drastically slows Blender down at high particle counts. Something is wrong with this design; maybe particles should be excluded from the outline overlay.
No float support for the particle display %?
When working on a really dense grass field, 1% isn't enough; the ability to go below 1% is necessary in a lot of cases.
Huge problem for animation: any modifier that influences the particle density vertex group will change the seed of the particles. I tried this with every Emission > Source > distribution option; nothing works, the particle seed changes every time.
The particle seed in the viewport is unpredictable.
Sometimes, when changing values in the particle parameters, textures, or modifiers, the particles displayed in the viewport are not in the same place they will be in the final render.
This is a big deal for important objects in your scenes, like trees and big foliage.
I couldn't identify the cause; the "viewport seed" just behaves strangely.
The only workaround for this annoying 'bug' (?) is to constantly toggle edit mode on/off or toggle viewport visibility on/off.
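One general technique that avoids this class of problem (I'm not claiming it's how Blender should fix it internally, just illustrating the principle) is to derive each particle's random values from a stable particle id via a hash, rather than from a global stream whose order shifts when counts or modifiers change:

```python
import hashlib

TWO_PI = 6.283185307179586

def stable_rotation(particle_id, seed=0):
    """Deterministic pseudo-random rotation in [0, 2*pi), keyed on a stable
    particle id, so edits elsewhere can't reshuffle existing particles."""
    digest = hashlib.sha256(f"{seed}:{particle_id}".encode()).digest()
    frac = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return frac * TWO_PI
```

With this scheme, particle 5 always gets the same rotation for a given seed, whether the system emits a hundred particles or a million.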
This program is just magic: it can show millions of objects, billions of particles, live in the rendered viewport.
I don't understand how they achieve this.
Having the Blender 2.8x viewport be far better optimized for instances and particles is a must for any environment artist who is using, or would like to use, Blender.