Subscribe an add-on to OnClose and OnOpen events

Hello, I am trying to write an add-on that keeps track of the time spent in every file.
(Before I continue: does this add-on already exist?)
Right now the documentation I have is this page:
https://docs.blender.org/api/latest/bpy.app.handlers.html?highlight=handle#module-bpy.app.handlers
I can see that there is no OnClose event, but I can use bpy.app.handlers.save_pre. Do I have to load the script as persistent, or load it for every file?
PS: Is there a way to save a Python class into the file, i.e. to serialize it into the .blend at save time? Right now I have to rely on an external SQLite database.
Thanks
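A minimal sketch of the save_pre direction: a handler decorated with `@persistent` stays registered across file loads, and data written into an ID property on the scene gets saved inside the .blend itself. The `TimeEntry` class and the `time_log`/`time_spent` property names here are made up for illustration; the import guard just lets the serialization logic run outside Blender too.

```python
import json

try:
    import bpy
    from bpy.app.handlers import persistent
except ImportError:            # outside Blender: stub things out for testing
    bpy = None
    def persistent(f):         # no-op stand-in for bpy.app.handlers.persistent
        return f

class TimeEntry:
    """Hypothetical per-file record we want to store inside the .blend."""
    def __init__(self, seconds=0.0):
        self.seconds = seconds

    def to_json(self):
        return json.dumps({"seconds": self.seconds})

    @classmethod
    def from_json(cls, text):
        return cls(json.loads(text).get("seconds", 0.0))

@persistent                    # without this, the handler is dropped on file load
def save_pre_handler(dummy):
    # ID properties on the scene are written into the .blend on save,
    # so no external database is needed for simple data.
    entry = TimeEntry(bpy.context.scene.get("time_spent", 0.0))
    bpy.context.scene["time_log"] = entry.to_json()

if bpy is not None:
    bpy.app.handlers.save_pre.append(save_pre_handler)
```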

It's definitely possible; I built essentially the same thing for my studio. Unfortunately I can't share the scripts, but I can give you a high-level idea of how you'd go about doing it:

  • you need to write the data as a custom property on the scene, e.g. context.scene.time_spent
  • you should use a persistent depsgraph update handler to track time: store the time of the last update, compute the delta since then, and add that elapsed time to the running total.
  • depending on your needs, you may want a 'timeout' so the timer doesn't just keep ticking while Blender is idle or the user is staring at their monitor thinking about what to do next. Our script is set up so that the work timer goes idle after 60 seconds of inactivity: we detect this by looking at the delta time since the last depsgraph update, and if it's greater than 120 seconds we zero out the elapsed time for that update tick.
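The bullet points above might look roughly like this in code. This is a sketch, not the poster's actual script: the 120-second idle cutoff and the `time_spent` property name come from the post, while the handler signature and `time.monotonic()` bookkeeping are assumptions. The import guard lets the timing logic run outside Blender.

```python
import time

try:
    import bpy
    from bpy.app.handlers import persistent
except ImportError:            # outside Blender: stub things out for testing
    bpy = None
    def persistent(f):
        return f

IDLE_CUTOFF = 120.0            # seconds; ticks longer than this count as idle
_last_update = None

def elapsed_since_last(now, last, cutoff=IDLE_CUTOFF):
    """Return the work time to credit for this tick: the delta since the
    previous update, or 0.0 if the user was idle longer than the cutoff."""
    if last is None:
        return 0.0
    delta = now - last
    return delta if delta <= cutoff else 0.0

@persistent
def track_time(scene, depsgraph=None):
    global _last_update
    now = time.monotonic()
    # Add this tick's delta to the running total stored on the scene.
    scene["time_spent"] = scene.get("time_spent", 0.0) + \
        elapsed_since_last(now, _last_update)
    _last_update = now

if bpy is not None:
    bpy.app.handlers.depsgraph_update_post.append(track_time)
```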

Anyone reading this might wonder why we have such a "draconian time tracking script" for our studio, but the reason is actually pretty innocuous and maybe even benevolent: we have a custom autosave system that ticks autosave based on actual time worked on a file, not elapsed clock time, and it caches off historic snapshots of the file when the autosave happens. It's pretty cool, because when an artist finishes a scene they can step backward in 10-minute intervals all the way back to when they started. It's like "time machine" for Blender :slight_smile:
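A hedged sketch of how a worked-time autosave like that could decide when to snapshot: the 10-minute interval comes from the post, but the snapshot-naming scheme and the `copy=True` save (which writes a copy without retargeting the open file) are assumptions about how one might implement it, not the studio's actual code.

```python
import os

try:
    import bpy
except ImportError:            # outside Blender: only the pure logic runs
    bpy = None

SNAPSHOT_INTERVAL = 600.0      # 10 minutes of *worked* time, per the post

def snapshot_index(worked_seconds, interval=SNAPSHOT_INTERVAL):
    """Which snapshot slot this amount of worked time falls into."""
    return int(worked_seconds // interval)

def maybe_snapshot(worked_seconds, last_index):
    """Return (new_index, should_save): save when we cross an interval."""
    idx = snapshot_index(worked_seconds)
    return idx, idx > last_index

def save_snapshot(idx):
    # copy=True saves a snapshot file while leaving the current file open.
    base, ext = os.path.splitext(bpy.data.filepath)
    bpy.ops.wm.save_as_mainfile(filepath=f"{base}.snap{idx:03d}{ext}", copy=True)
```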


How do you deal with files that are many gigabytes? Saving huge files while work is in progress can be really frustrating.

Well, the short answer is that we just don't. For the type of work we do, Blender is nearly unusable beyond a certain scene size… on our largest files (which are ~500 MB) the object-mode undo takes a few seconds, and we notice that Access Violation crashes happen far more frequently. We try to split up our work so it makes sense to use smaller scenes wherever possible (using linked data, instances, etc.). Our average .blend is probably less than 200 MB, and we find the performance/stability tradeoff is pretty good there. Also worth noting that our studio machines are all Threadrippers with NVMe drives, so I'm sure that helps too.
