Right now our studio is in a massive collaboration with a dozen other architectural studios (together we are going to build up a 70 km² area).
All the other studios send us their data: some of them send zips with half of their hard drives dumped in there, some send SketchUp files with materials split up by polygons.
Technically, we are the only studio able to handle all this mess, simply because we switched to Blender in time.
People are extremely bad at data management, especially in a world where deadlines and automatic data generation exist.
This is why Blender is a CG industry survival tool; from our observations, the industry desperately needs it.
I understand this is a challenge, and it sounds like you guys have done some impressive work to deal with it.
But some of your collaborators just sound like slobs. I'm not saying they aren't talented or productive. I've worked with artists who have a whole new level of OCD about their desk cleanliness, and I've worked with talented artists whose office looks like a tornado went through it.
Just because one group of people can't work in a clean manner doesn't mean everyone else should have to wear a hazmat suit.
I empty the trash can on my computer on a regular basis. I've also seen people whose trash can was so full that they were running out of hard drive space, because they couldn't decide whether or not to throw something away. Sometimes they would just buy another hard drive.
I would be furious if, in the next version of my operating system, Apple or Microsoft decided to arbitrarily throw away older untouched files on my computer to make sure that I don't run out of disk space.
"SmartClean identified 127 fonts and 987 images that you haven't used in 2 years. We've burned them to make sure your system stays fast and efficient. Happy computing!"
I just want to emphasize that Blender's ability to pack external data into project files is a big deal compared to the zip-packing solutions of its competitors, which suit archive storage better than an active project workflow.
For example, the inability to pack external data usually locks texture folders during collaboration: you can't manage a folder (remove or rename its members) because something in it could be in use somewhere, so anything that lands there stays there forever. It is a data management problem similar to the one discussed here, just in an external form.
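For reference, packing is also scriptable; a minimal sketch using Blender's Python API (these operators exist, though running them as a batch step over incoming files is my own assumption about the workflow):

```python
import bpy

# Pack every external resource (textures, sounds, ...) into the .blend,
# so collaborators receive one self-contained file.
bpy.ops.file.pack_all()
bpy.ops.wm.save_mainfile()

# Packed resources can later be written back out next to the .blend;
# 'USE_LOCAL' unpacks into a local directory.
bpy.ops.file.unpack_all(method='USE_LOCAL')
```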
Data management is rather a mathematical problem.
That specific diff, like the other attempts made in this area, looks like an attempt to crack a math problem with a wrench.
It utilizes a system that was designed for a different purpose.
And all those tools will still be there if we implement the "autosave active for (these) datablocks" option and a list like the one proposed in the first post. The only thing that would change for now is that people who want the current workflow unaltered go into the options and deactivate every autosave checkbox there.
I stand by it:
Put a list of curated individual checkboxes for autosave into System Settings
Add a preset box with presets for "strict datablock management" and "cautious datablock management", with strict being the current settings and cautious being a default-active shield for datablocks that are commonly assumed to be autosaved - especially materials and animations (IMO).
Maybe add it to the first-time startup dialog, since it seems important - together with "RCS or LCS", what Space does, and how Blender should manage data (strict/cautious).
Putting it into the startup screen raises a bit of awareness of how important it is for people who don't know. It also shows that it is something worth checking out, and if you don't set it, it's your own fault if Blender autodeletes stuff.
The only thing I am not so sure about is the import options. Those should probably remain on Fake User inactive, since otherwise every importer would need an update to in- or exclude material autosave by default. And it would need to be coupled with the system settings, which … seems too complex.
And then some time in the future the UI devs should think about expanding the data management from the outliner. But until then this sounds like a good solution to me. It changes nothing about the actual functionality of datablocks in Blender. It shows that it's an important setting in the startup. It can be changed after the fact. Import would not be touched by this - only user-created datablocks. If materials are changed on imported meshes, these (by default) remain deletable.
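To make the shape of this concrete, here is a rough sketch of what the preset plus checkboxes could look like if prototyped as add-on preferences; every property name here (`datablock_preset`, `autosave_materials`, ...) is hypothetical, not an existing Blender setting:

```python
import bpy

class DataManagementPrefs(bpy.types.AddonPreferences):
    bl_idname = __name__

    # Hypothetical per-type autosave checkboxes.
    autosave_materials: bpy.props.BoolProperty(
        name="Autosave Materials", default=True)
    autosave_actions: bpy.props.BoolProperty(
        name="Autosave Actions", default=True)

    # "Strict" = current behavior, "Cautious" = shield common datablocks.
    datablock_preset: bpy.props.EnumProperty(
        name="Data Management",
        items=[
            ('STRICT', "Strict", "Current behavior: unused datablocks are purged"),
            ('CAUTIOUS', "Cautious", "Keep commonly expected datablocks"),
        ],
        default='CAUTIOUS',
    )

    def draw(self, context):
        col = self.layout.column()
        col.prop(self, "datablock_preset")
        col.prop(self, "autosave_materials")
        col.prop(self, "autosave_actions")

def register():
    bpy.utils.register_class(DataManagementPrefs)
```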
If you are a new user seeing this screen for the first time, then "Data Management: Strict" is not going to help you. In fact it's going to discourage new users. Most of them won't understand what it means, and the minority who bother googling it will be discouraged from using Blender, because their very first impression will be "WTF? How am I supposed to trust a piece of software which can't even get saving data right?"
It's just yet another example of the ridiculous situation we are in. Saving the data a user creates in digital content creation software should never be something the user controls or has to configure. It should just happen. It should not be a setting, and it definitely should not be a choice on some sort of startup screen.
We as Blender users have become numb to the ridiculousness of what's happening. But imagine you were trying out, say, a new 2D image editor to replace Photoshop. Imagine you ran it for the first time and it popped up a startup screen with a checkbox saying "Do not delete layers on file close" - and that checkbox was even off by default. Imagine how frustrated and reluctant you would feel on realizing the software even dares to ask you this question, let alone makes the wrong choice for you by default.
The problem is different.
It is the logical flaw of using a single system for different purposes.
Let's say some software has UV seams but no sharp edges (the 3ds Max example).
The absence of sharp edges can technically be worked around by using seams sometimes as UV seams and sometimes as sharp edges, but that kind of system design creates the same logical flaw: cases that require sharp edges will not have proper UV seams, and vice versa - cases that require UV seams will not have proper sharp edges. Such a system will generate predictable, heavy production conflicts.
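By contrast, Blender stores the two meanings as independent edge flags, which is exactly what avoids that conflict; a minimal illustration (the object name is hypothetical, attribute names per the 2.8/3.x API):

```python
import bpy

mesh = bpy.data.objects["Hull"].data  # hypothetical object name
edge = mesh.edges[0]

# Two independent flags on the same edge - one system per purpose.
edge.use_seam = True        # UV unwrapping boundary
edge.use_edge_sharp = True  # shading split, independent of the seam
```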
The fake user system is supposed to be used as a bookmark that marks out important data, and bookmarking every page destroys the original bookmarking ability, since those purposes are mutually exclusive.
This is why the proposed solution fits better at something like the add-on level, for local purposes, rather than as official core-level data management policy, which is supposed to be free of direct and predictable logical flaws.
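For clarity, the bookmark in question is the `use_fake_user` flag on a datablock; a one-off example (the material name is hypothetical):

```python
import bpy

# Bookmark one deliberately important material so purge keeps it,
# even while nothing in the scene uses it.
bpy.data.materials["HeroSkin"].use_fake_user = True

# Flagging *every* material this way would make the flag meaningless -
# the "bookmark on every page" problem described above.
```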
This issue has been known for a couple of decades; after all those years, you would not expect the best we have got to be the solution of an add-on I designed, tested and abandoned more than 15 years ago because of all the same predictable data management issues.
There were also a couple of similar implementations with quite a low level of interest around them:
The functionality this PR proposes has already been available all those years.
The discussion about how data is managed is a valid one. I assume you have read the posts.
I assume you have read this thread. This is not about implementing an entirely new system; this is about giving an option until a new system can happen.
Simply switching everything to "now everything is saved" will at the very least wreck existing workflows and - speaking of trust - will also cause trust issues among the millions of existing users who might suddenly have to rebuild entire workflows without an option to go back.
And I also disagree that people will not trust Blender just because they have to read about the architecture of the program they are about to use. If anything, the explanation should just be very prominently accessible, concise and easy to understand.
And if you have a better idea - please propose it.
(edit) Or, to give a practical example of how this works if you simply eliminate the option altogether:
Any material that is on a mesh - imported, created, whatever - is now being saved.
So you use the very practical (but sometimes buggy) Material management add-on to mass-replace some materials and create a few new ones for, say, 50-60 objects in your scene.
What now? If you want to get rid of them, your options are the following:
Create a dummy object and select every material from the dropdown list to check if it's saved or not.
Use the purge function, which now does nothing anymore because everything is saved by default.
Write a script that simply sets the "fake user" of any unassigned material to off so you can use purge again - effectively eliminating the entire idea and going back to the result of the current workflow. Just that now you have to write a script (see the sketch after this list) - something that will surely alienate just as many people.
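For the record, that script is short; a minimal sketch against the current Blender Python API (the purge operator exists, though in some versions it wants to be run from an Outliner context; the selection logic is my own):

```python
import bpy

# Drop the fake-user bookmark from materials nothing really uses,
# then purge the now-orphaned datablocks.
for mat in bpy.data.materials:
    # users counts the fake user itself, so users <= 1 with the flag set
    # means the material has no real assignment left.
    if mat.use_fake_user and mat.users <= 1:
        mat.use_fake_user = False

# Recursively remove all orphaned datablocks.
bpy.ops.outliner.orphans_purge(do_recursive=True)
```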
Exaggerated? True. Still, this describes a substantial part of the users affected by a permanent change.
And then they have to learn about the datablock structure all over again anyway.
A script on Blenderartists or Stack Exchange is not a "given solution".
"Ships with Blender and has at least been tested a bit before release" is a given solution.
Of course it is given.
There has just never been any interest around it.
A couple of people probably use it, but I have never met them.
Here is the add-on's lifecycle:
A user switches to Blender and makes local projects - loses data, installs the add-on, becomes happy - then realizes that making local projects is a salary limitation, switches to massive projects - faces massive trash problems, realizes that uncle Ton is wiser than he thought - removes the add-on, becomes happy.
This is how I got that experience myself.
Well, it is concentrated experience. It may be unpleasant, but at least it is truthful.
Anyway, if you want, you can protect your data that way today. The ability is already provided in Blender.
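That protection is indeed scriptable right now; a minimal sketch of a save handler that bookmarks every material before each save (the `save_pre` handler exists in the bpy API; whether you actually want this blanket behavior is exactly the debate above):

```python
import bpy
from bpy.app.handlers import persistent

@persistent
def keep_materials(_filepath):
    # Before every save, bookmark all materials so none get purged.
    # Note this is the "bookmark every page" behavior criticized earlier.
    for mat in bpy.data.materials:
        mat.use_fake_user = True

bpy.app.handlers.save_pre.append(keep_materials)
```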
Just have a proper patch that saves all the datablocks users expect to be saved, without even asking or making it an option. Then simply have a better tool to clean them up manually - one already exists and is waiting for approval.