Automatically fake-user everything that does not reside in the 3D View. Everything except:
Objects, Meshes, Armatures, Cameras, Curves, Collections, Lights, Lattices, Light Probes, Metaballs, Paint Curves, Point Clouds, Simulations, Speakers, Volumes.
Users usually expect deleting objects from the 3D Viewport to be synonymous with deleting them from the Blender file.
Completely remove the fake-user button from the UI. The button would no longer exist anywhere next to any datablock selector.
In the Outliner’s Blend File view, add an icon indicator for any datablock that has 0 users, so users can overview, manage, and clean them up quickly.
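The proposed rule is simple enough to sketch in a few lines. This is just a toy policy function, not actual Blender code; the collection names mirror `bpy.data` attribute naming, and the exempt set is my reading of the list above.

```python
# Toy sketch of the proposal: every datablock type that cannot be placed
# into a 3D scene gets an automatic fake user; scene-placeable types keep
# the current garbage-collected behaviour. NOT real bpy code.

SCENE_PLACEABLE = {  # exempt from auto fake user, per the list above
    "objects", "meshes", "armatures", "cameras", "curves", "collections",
    "lights", "lattices", "lightprobes", "metaballs", "paint_curves",
    "pointclouds", "simulations", "speakers", "volumes",
}

def gets_auto_fake_user(collection_name: str) -> bool:
    """True if datablocks in this collection should never be auto-purged."""
    return collection_name not in SCENE_PLACEABLE
```

So under this rule `gets_auto_fake_user("materials")` or `gets_auto_fake_user("node_groups")` would be true, while meshes and objects stay purgeable via normal viewport deletion.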
Really, just take a while to think about how it would be in practice. Materials, geometry nodes, textures, etc. would simply have no fake-user button, so you would have that safe, cozy feeling that they will always be around until you decide to clean them up manually. And all the 3D objects (stuff that can be placed into a 3D scene) would also always be there unless you delete them from the viewport.
If you take a moment to think about it, you realize how redundant the fake-user button is. The only case for it to exist is the rare one where you know well in advance that the material/node group or whatever you are creating is for sure just temporary/testing stuff. But even if you are just slightly unsure, you’d already be reluctant to disable the fake user on it.
And once it truly becomes unused, you will not be unchecking the fake-user button, because there’s nowhere in the UI to uncheck it, as the datablock isn’t used anywhere.
It takes a while of contemplation, but that contemplation almost always leads to the realization that the fake-user button just doesn’t make sense.
Hmm, interesting. And do you think it would pay off to use a workaround such as putting an object using that mesh into a disabled collection, if what you got in return was just never having to care about the fake user at all?
My point here is that Blender is the only 3D DCC which has the concept of storing a mesh without it existing in the 3D scene. And users of all those other DCCs don’t really seem to be limited by that in any way.
Yes. Absolutely. I’ve been saying this earlier as well: all the data users can manage as first-class citizens should have a better way of being managed. I am not sure the hidden collection would be the way to go, though. The scene view should be as focused as possible. I would rather see the “Blend File” or “Orphaned Data” views in the Outliner gain more mass-management capabilities instead…
But there have been many more discussions about this topic in the other threads, already. I think those are all valid.
And for the record: in the case of the mesh file I would totally expect having to take care of it manually. This was really just saying that ‘unexpected cases’ may not actually be as unexpected for everybody, as long as there remains a way to manage and replace datablocks. They should not be hidden so much that we get the opposite of what we have right now. I think they need to be manageable, accessible, and switchable on objects. They need to be less hostile to users than they are right now, and they need default behaviours for creation. Everything beyond that is an individual case which probably needs scripting anyway.
Well, if you are for example a 3D concept artist doing some heavy-density sculpt sketching, you may create/copy/delete a heavy 1M+ poly mesh maybe even several dozen times in a single hour. If the unused mesh datablocks weren’t progressively purged, you’d constantly end up with .blend files dozens of gigabytes large. And if you are a 3D concept artist, you want to just focus on creating your art. You don’t want to stop every 30 minutes and switch your attention to chores just to avoid running out of HDD space, or wait a while for the Blender file to save because it got huge.
Sure. Absolutely agree. As I said: meshes are expected to be automatically marked as deletable and gone by exit at the latest. I just think the ability to manage these blocks when needed has to remain somehow.
Fake user does make sense in that it is a way of marking datablocks for keeping which the user would otherwise need to track manually, or which would usually not be saveable at all. Again: that Blender actually has a way of exposing this is a good thing. It allows for a few tricks that are simply not possible in other software. It’s the management aspect that is the problem, not the existence.
Most other software, for example, only lets you keep everything or delete everything unassigned. That sucks. A way to actually mark things for keeping is good here. Having the option to assign mesh data to other objects is occasionally super nice. I sometimes use it to sneakily switch meshes in existing hierarchies without destroying or rebuilding everything in the process. There are very, very useful cases which are simply not possible like this in other software. We really just need to make sure we shift the importance and management aspects.
But since I am repeating myself now, I’ll just shut up.
The forum already reminded me that the two of us are mostly talking exclusively to each other already.
I’ve been so used to Blender’s way of doing things that it didn’t even register with me that it will delete unused datablocks on save. Now I realize I do a bunch of actions and rituals inside the program to avoid this kind of situation.
Yeah, I could not agree more. I’m already managing datablocks myself by purging and deleting them from the Outliner; I definitely don’t want Blender to surprise-delete my materials on save.
This is the actual problem that needs solving. And accepting data loss because the management is clumsy is IMO not the right solution to that problem.
Meh. If the cleaning process were easier, this wouldn’t be much of a problem, and IMO that is much preferable to accidentally losing material node trees you’ve spent hours on. But I could see the value of having implicit GC for lightweight stuff like lights and empties and the like. Keeping things consistent would be better, though, I think.
So, there should be better cleanup tools. Having a central list of all your unused datablocks with some statistics would be quite handy anyway.
Maybe it could be as simple as just never implicitly running the GC and only run it when requested.
You may want to take a step back a bit. Most of us are just discussing, not arguing. At least that’s how I perceive it. Chill debates are usually debates about things people don’t care about. As soon as people care about something, it’s obvious the discussion will be vigorous, but I don’t see anything even remotely wrong about the tone of the discussion thus far.
Data management is one of the most crucial Blender topics of the past decade. I see no reason to chill out a bit, because chilling out is not the approach that ever gets anything moving.
If you don’t have anything to contribute to the topic, then just please let us talk.
Some of these issues are inter-related, but that does not mean all of them require the same solution. It doesn’t mean that a universal solution that solves all potential issues is required.
The core issue is this: Blender deletes materials (for example) without any direct action by the user. This is not acceptable.
I believe this flaw originates with the fact that a material/action/whatever is treated by Blender as a property. Properties that are not used are discarded. This was a design flaw, as it was applied to things in ways that it should not have been, and the subsequent approach of garbage collection and “purge on save” is a punitive result. We cannot undo that old decision, but nonetheless the result of it is not OK.
When the user creates 4 cubes, lights, or bones - and deletes them - yes, Blender silently throws them in a trash can without telling the user “I’ll throw this away later.” That’s acceptable, because the user deleted the items.
Materials are not managed this way. If the user creates 10 materials and doesn’t assign them to objects at a given moment (or REMOVES a material from the only object using it) - which is a normal workflow in other software - Blender will throw them away unless you tell it not to. I didn’t hit DELETE - it just got burned. The comparison with “well, what about a deleted mesh or light” isn’t valid.
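The mechanism being criticized here can be modeled in a few lines. This is a deliberately simplified toy model of Blender’s ID user counting, not actual bpy code; `DataBlock` and `purge_on_save` are invented names.

```python
class DataBlock:
    """Toy model of a Blender ID: a real-user count plus the fake-user flag."""
    def __init__(self, name):
        self.name = name
        self.users = 0              # objects/materials/etc. referencing this ID
        self.use_fake_user = False  # the opt-in "please keep me" flag

def purge_on_save(blocks):
    """Model of purge-on-save: anything with zero users and no fake user is gone."""
    return [b for b in blocks if b.users > 0 or b.use_fake_user]

mat = DataBlock("GoldTrim")
mat.users = 1   # assigned to a single object
mat.users = 0   # user unassigns it from that object; DELETE was never pressed
print([b.name for b in purge_on_save([mat])])  # the material silently vanishes
mat.use_fake_user = True
print([b.name for b in purge_on_save([mat])])  # only the opt-in flag keeps it
```

The point of the model: the material’s survival hinges entirely on `users > 0` at save time, so unassigning is effectively deleting, unless the user remembered to set the flag in advance.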
Blender is the only 3D software I’ve ever used which treats materials in this manner. It’s not logical, it’s not rational, and it’s poor design to force a user to tell the program - “I made this, and YES I WANT IT. Do not burn it!”
There are various issues with fake user beyond this, but those issues do not ALSO have to be solved in order to correct this flaw. Previous debates have fallen heavily into the Nirvana fallacy, and it’s time to stop treating it as a brick wall.
Because it’s a unique and wonky design. No new user would know what it is about, and it would become just another one of those off-putting, weird Blender things that scare new users away and annoy existing ones.
To put it simply, 3D DCC file data management has absolutely no business having a learning curve. As soon as you need to spend any substantial amount of time learning how the file data management works, and/or any substantial amount of mental energy using it, it means it’s just a straight-up wrong solution.
The separate garbage file also contradicts collaboration, since you would not be able to delete any of those files: a possible pinch of useful data would be mixed with all the garbage there, so they would probably just stay forever, growing pretty fast with incremental file versioning across multiple projects going in parallel.
I am not sure about the proposed solutions.
They either assume an oppressive working environment built around a pile of garbage (like save-all-IDs), or mix garbage with useful data, making it unmanageable (like auto-fake), or make the data-management system incredibly messy to handle (like external GC or customization), or something with even worse consequences, like all of the above. And every one of them violates collaboration conditions on massive projects with thousands of materials and hundreds of UDIMs per asset, inherent in modern industry requirements.
None of those solutions takes into account the conditions the initial system was designed for: it starts working properly under heavy, garbage-filled data-management conditions (collaboration plus imported data, as during project work), but is questionable under more sterile conditions with almost no trash around (personal use plus vanilla data, as during asset making).
I guess solving this problem requires a single dedicated person who is familiar in practice with all the production data-management complexity levels and existing data-management systems, who will collect all the contexts together and spend, say, a year finding a proper solution, with proper production testing before implementation, so we don’t have to guess whether we would even survive the proposed solution.
The task is very difficult, and the topic is too divisive. No developer would want to spend a year redesigning the data system only to then be rejected by half of the community.
It is not going to happen.
The only way is if the devs make a decision ignoring a portion of the community. But for that to happen they must be interested in the issue, and at the moment I do not see them caring.
Auto-faking non-viewport data is the least divisive option (since it is optional) and is also trivial to implement.
We have basically three immediate options:
1. Add a system settings option to auto-fake-user all datablocks upon creation, and add an import toggle. This should be twofold: auto-save on creation in the current file, and auto-save on import from external non-Blender files. Blend files should probably just copy the imported setting, if applicable.
2. Ignore it again for the next 2-5 years and occasionally rediscuss it with the same result.
3. Redesign the whole damn thing. This would probably have to be done by the core devs, because it touches a very core part of Blender’s design and would take a long time.
I would love (3) but can currently only see (1) happening, if changes happen at all. Personally, I would also be happy with that, because at least it’s something, and it retains the possibility to set the auto settings individually for users/companies who need them a certain way for their workflow.
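Option (1) boils down to one decision per new datablock, which could be prototyped as a preference plus a tiny policy function. A hedged sketch only: the setting names `auto_fake_on_create` and `auto_fake_on_import` are invented, and the actual wiring into Blender (handlers, importer hooks) is left out.

```python
# Toy decision function for option (1): should a fresh datablock get
# use_fake_user? Setting names here are hypothetical, not real Blender prefs.

def wants_fake_user(origin: str, prefs: dict) -> bool:
    """origin: 'created'  - made in the current file
               'imported' - brought in from an external non-blend file
               'linked'   - appended/linked from another .blend file"""
    if origin == "created":
        return prefs.get("auto_fake_on_create", False)
    if origin == "imported":
        return prefs.get("auto_fake_on_import", False)
    return False  # 'linked': just copy the flag the source .blend already had

# A studio that never wants silent deletion would enable both toggles:
studio_prefs = {"auto_fake_on_create": True, "auto_fake_on_import": True}
```

Keeping the `linked` case as a pass-through matches the “Blend files should probably just copy the imported setting” point above.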