Add option "Auto Add Fake User" to user preferences

Indeed… I think @1D_Inc 's input on this thread, and many others, can be dismissed wholesale, as they have very specific requirements and visions for Blender that don’t match up with anyone else’s. The volume of their comments creates a false impression of shared vision, when in this case they are in a massive (and possibly single-person) minority.

5 Likes

I just share our experience at a higher production level, where you start facing such kinds of problems.
Every company faces this, otherwise they would not grow nasty and expensive supervising departments whose goal is to force users to adjust their data in much the same way it is done in Blender, and to punish them for not doing so.
After all, it was not for nothing that we preferred to switch to Blender after all these games with Max’s and Maya’s system design. It worked in practice.

(Not sure how we are supposed to be against AgX, taking into account all the effort we put into initiating its development by describing Filmic’s issues.)

You know, we could just as easily talk about autosaving.
Ultimately, you just need a storage file from which nothing will ever disappear and which will remain on your disk.
And of course, in the end it will become a landfill, like any of our archives, into which we always put something aside and think about dealing with it later.

Your company’s difficulties in managing your staff’s work practices were not due to Max failing to delete unassigned materials; they were a failure of your staff to keep their files clean. Your staff’s past inability to maintain a clean workflow isn’t something that the rest of the userbase should have to work around.

If a company were to have a design mandate of “no mesh over 100,000 faces”, then it is up to that company to hold production to that standard… not demand that the software delete extra geometry when the file is saved.

If you don’t want your files to contain 4,000 unassigned shaders - then tell your staff to clean up their files. Or leave the auto-fake preference off, and Blender will still do this for you.

If your staff ever upgrades beyond 2.79, tell them that use of Auto-Fake is forbidden. If it’s still a major problem, dock a paycheck or put an employee on unpaid leave. If outside collaborators don’t follow your guidelines, then have a discussion; they can meet your requirements, or end the collaboration. People generally follow policy when they have a solid financial incentive to do so.

1 Like

Sounds extremely oppressive. You must have quite a talent for supervision…
All of this is something we actually tried to avoid.

The 4k materials are not made by our people; they are usually accumulated while working with imported data (stock assets, BIM, file-format converters, etc.). This is why there is a certain difference between managing vanilla and imported data. The industry is a messy place.

you just need a storage file from which nothing will ever disappear

@modmoderVAAAA That could be incredibly hard to manage, I guess. I would probably prefer to pack everything rather than dig through such a sewer.

Okay… stop it. You’re getting personal. Both of you.

On the topic, I also agree that Blender should…
a) … not call the datablocks ‘Fake User’ but something more descriptive to the outside.
b) … turn on autosave for most users depending on the data. Set presets in the options (‘strict’, ‘default’, ‘save all (not recommended)’).
c) … implement an auto-purge manager at some point.

If we get a more solid rework: cool. Long overdue and appreciated. Until then (or maybe forever), not deleting attributional data is simply necessary.
I can actually see that the data-management aspect of this is important and that auto-purging does make a difference. And despite vehemently disagreeing with @1D_Inc, I will actually make a point for his usual reasoning.
When unassigning data that is not needed any more, it needs to get marked as “not used”, because there is no easy way to access it afterwards. So it would stay in the file unnoticed by most, because they never learned about the fake user system. And from there on out it will bloat the file. That is just true and we all know it.
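To make the mechanism concrete, here is a toy Python sketch of the behaviour being discussed: a datablock with a user count that silently vanishes on save unless its fake-user flag is set. This is a simplified stand-alone model, not Blender’s actual internals or the bpy API; the `Datablock` class and `save()` function are made up for illustration.

```python
# Toy model of Blender-style user counting (illustrative only,
# NOT the real implementation or the bpy API).
class Datablock:
    def __init__(self, name):
        self.name = name
        self.users = 0              # real users: objects referencing this data
        self.use_fake_user = False  # the "fake user" protection flag

def save(datablocks):
    """Write out a block only if it has real users or a fake user."""
    return [db for db in datablocks if db.users > 0 or db.use_fake_user]

mat = Datablock("WallMaterial")
mat.users = 1   # assigned to one object
mat.users = 0   # object deleted / material unassigned
# Without a fake user, the unassigned block silently vanishes on save:
print([db.name for db in save([mat])])   # []
mat.use_fake_user = True
print([db.name for db in save([mat])])   # ['WallMaterial']
```

The point of the sketch: nothing warns the user at the moment `users` drops to zero; the loss only happens later, at save time.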

It still leads to a false solution. Important data is perceived as unnecessary because it is not treated as important. The data is inaccessible and difficult to manage because Blender evolved that way, and this is one of the areas that got dragged along over the last 15+ years without many improvements.
Blender neither has a material library/editor that works independently of the models in the scene, nor a real, dedicated central place where all the other datablocks in the file can be managed and sifted through. And I mean a workspace that is meant for management, not just observation with minor managing functionality like the Blender file browser currently offers.

Some data simply is still important even though it may be invisible at the moment. There is a distinction. Why are meshes perceived as gone after deleting them from the scene? Because the file management browser is the prominent and direct way to work with them. Because we assume them to be gone from the root of the scene. They are the star of the show, and when they are gone we want them gone. We see them in the outliner, and even if they are hidden or temporarily disabled as a collection, we still know they are there because they are logically treated as such.
Consequently, as soon as we delete an object from the outliner we also perceive it as gone. We intuitively accept that data as gone. And it should be, despite Blender still keeping the datablock in memory for now. It’s a dead mesh walking unless the user seriously wants it otherwise.

With other data there is a dissonance. My two favorite examples are animations and materials. Neither of these has a true dedicated editor, yet to most people working with 3D software of any kind, both are logically perceived as “something I create as an attribute, or as a library to apply and mix and match later” - because they are. Animations can be used as clips for characters even if not currently needed. They can be exported to game engines despite not being used on any mesh in the scene. They are f’in important!! (Yes - I lost data like this.) Materials are attributes. They are libraries that can be applied to objects later on, even if there are no objects currently in the scene. They are functional blocks.

Deleting this kind of data is simply not right, no matter how you turn it. The most important aspect of this whole discussion is the definition of which data should be treated as important. And I think the most important thread in this regard is this one:

because this is the discussion of what data is currently perceived as less important than it should be.

As a secondary step, if the devs or anyone from the community wants to, there should be a discussion about better library management for this data as well. But maybe the asset browser will evolve to get this functionality, or become a dedicated separate workspace for the current file. This is a different discussion - important, but not right here or now.

3 Likes

Not sure about the need for shades of grey.
A data-management policy has to be simple to work properly.
It’s either personal use or collaboration, so you save either everything or nothing.
Shades of grey, especially customizable ones, could just bring more confusion.

I disagree for all the above-mentioned reasons, but I am sure at this point I cannot change your mind about the subject anyway. :man_shrugging:

1 Like

The problem with unassigned data is slightly different.
The problem is not that it bloats the file by itself.
When a user stores some important data unassigned (in case the system is designed to allow that), it mixes with the other (useless) unassigned data, so you can’t purge such files during collaboration - because you would shred someone’s important but unassigned data.
This is why the fake user is supposed to be set manually - it allows separating useful unassigned data from the messy pile of useless unassigned data.
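A small sketch of the purge pass this implies: unassigned blocks are only safe to delete if nobody has flagged them manually. Again, this is a toy stand-alone model, not Blender’s real orphan-purge code; the data and the `purge()` helper are invented for illustration.

```python
# Sketch of a collaboration-safe purge: unassigned data is removed
# only when it carries no manual protection flag. Toy model, not bpy.
def purge(blocks):
    """Split (name, users, fake_user) records into kept and removed."""
    kept, removed = [], []
    for name, users, fake in blocks:
        (kept if users > 0 or fake else removed).append(name)
    return kept, removed

blocks = [
    ("HeroSkin",    0, True),   # unassigned, but manually marked important
    ("ImportJunk1", 0, False),  # leftover from a BIM/stock import
    ("ImportJunk2", 0, False),
    ("Floor",       2, False),  # actually assigned to two objects
]
kept, removed = purge(blocks)
print(kept)      # ['HeroSkin', 'Floor']
print(removed)   # ['ImportJunk1', 'ImportJunk2']
```

The manual flag is what makes the split decidable: without it, `HeroSkin` and `ImportJunk1` are indistinguishable to any automatic cleanup.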

Also, I am not sure how a centralized data editor would help to retrieve useful data from among 4k of useless data.

But a data monitor would be nice to have for sure - for example, we wrote such a monitor for textures in order to have the ability to check their status, size, or address (which material/node they belong to).
Very useful, taking into account the blend-file packing ability.

2 Likes

Thing is - in principle, most people here absolutely agree with you there. Unneeded data should be shredded religiously, and management of this kind of data is important. It’s just the definition of “useless” where most users will differ.

Quick naive question … why are scripts set to auto-save?
They surely aren’t used after they are executed, are they?

Yes, it’s this kind of explicit and direct management that should be the way to go. (Which is not to say that I’d be against a trash bin.)

It’s not about keeping everything stored in the blend file - sure, there is temporary or no-longer-needed data. But the current behaviour invites accidents that happen because someone didn’t take action, e.g. only thought of marking some data as persistent after a change.

It’s one thing if intentional actions lead to data loss, but quite another if inaction does.

Imo the core system is ok, it’s just that:

  • For anything the user creates, the fake user should be added by default.
  • The name ‘fake user’ should be changed to ‘guard’ or ‘protect’ or something like that, with the tooltip ‘save even when unused’.
  • The options to list unused datablocks should be expanded. Maybe some setting/filter in the outliner’s ‘Blender File’ view to only show unused data.

Not sure how outliner deletion could help clarity.
Take an empty scene with three cubes carrying materials AB, BC, and AC.
Deleting two of them will leave at least one material with no users, regardless of where they are deleted from - outliner or viewport. If it were handled differently, it would be a mess.
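The scenario above can be counted explicitly. This is a toy model (plain Python, not bpy) with made-up object names, just to show that deleting two of the three cubes always strands at least one material:

```python
# Three cubes with material pairs AB, BC, AC (toy model, not bpy).
from collections import Counter

objects = {"Cube1": {"A", "B"}, "Cube2": {"B", "C"}, "Cube3": {"A", "C"}}

def user_counts(objs):
    """Count how many objects reference each material."""
    return Counter(m for mats in objs.values() for m in mats)

del objects["Cube1"], objects["Cube2"]   # delete two cubes
counts = user_counts(objects)
orphans = {m for m in "ABC" if counts[m] == 0}
# Only Cube3 ({A, C}) remains, so B is left with no users:
print(orphans)   # {'B'}
```

By symmetry, whichever two cubes are deleted, exactly one material ends up with zero users - and it does not matter whether the deletion came from the outliner or the viewport.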

Well, at this point I think that the developers’ original idea - writing unused IDs, with the manual fake user as the delimiter between useful and useless unused data - is far from perfect and has predictable issues, but it is still better than any auto-fake concept in general, because auto-fake provides no ability to set that useful/useless delimiter.

If there were a list of all data that is unused by anything in the scene, deleting it all would be a one-click action. Collaboration cleanup problem solved.

And will the person who kept a couple of materials there - ones they spent several hours making but decided to assign tomorrow (the lazydodo case) - appreciate that?

A material’s state shouldn’t depend on the existence of objects using it. And after one deletes all objects that use it, e.g. a dialog could ask for confirmation whether the material should be deleted or kept.

There was a proposal to show a warning splash before closing the file.

You have a point there, but still I think the situation is preferable to the ‘auto deletion’ we have now, which bites people in the ass all the time.

There should be a way to turn off GC for any user-generated datablock, while still keeping it for auto-generated ones.

Yeah, cool, but that’s still different from immediate and direct feedback or verification.

Not sure it would be a smooth solution - stock data is mostly made manually by users, so its data would get stuck in projects. Also, modifying imported data would be a corner case. (You import a model with generic materials and set them up properly, but they are not protected since they were imported.)