Undo Performance Must Be Addressed

I was talking about some test files to compare performance between computers, focused on the main problem with undo, not a random scene that performs poorly because of OpenSubdiv (which of course can also be a problem).

But it was just a suggestion; if people prefer to keep talking about random timings or different scenes, no problem.

I think it’s a good idea. I’d suggest something simple -

I’m testing it on a cube with a subdivision modifier set to 6, applied, then duplicated 16 × 16 × 6 times. That gives me a quick scene with 1536 objects, ~75M tris, no modifiers.

It takes 7 seconds to undo a simple translate on one of the objects, on a 6-core Xeon E5-1650 v3 @ 3.5 GHz with 64 GB of RAM. (I’d be happy to use any scene, or to test it on a couple of different workstations/OSes; it would be good to see what makes a difference.)

Same scene in Maya and undo is instant, not even a hint of delay.

Could you share the blend file?

Sure, give me a few; I’m on a closed network, need to re-create it on my laptop.

Scratch that, the file I’m getting is huge: 5 GB uncompressed, 1.6 GB compressed. It’s faster to recreate:

Just create a cube, add a subdivision modifier with 6 levels, apply it.

Duplicate it 1536 times (16 in X, 16 in Y, 6 in Z, for example), done (see the script sketch below).
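For reproducibility, here is a rough Blender Python sketch of that recipe. The object names, spacing, and grid layout are my own assumptions, not an agreed benchmark.

```python
# Rough sketch of the recipe above (run from the Text Editor,
# or: blender --python make_undo_benchmark.py).
import bpy

SUBDIV_LEVELS = 6      # Subdivision Surface levels, applied to the mesh
GRID = (16, 16, 6)     # 16 x 16 x 6 = 1536 cubes
SPACING = 3.0          # distance between duplicates

# Base cube with an applied level-6 Subdivision Surface (~49k tris).
bpy.ops.mesh.primitive_cube_add(location=(0.0, 0.0, 0.0))
base = bpy.context.active_object
mod = base.modifiers.new(name="Subsurf", type='SUBSURF')
mod.levels = SUBDIV_LEVELS
mod.render_levels = SUBDIV_LEVELS
# Note: Blender 2.8x also needs apply_as='DATA' in this call.
bpy.ops.object.modifier_apply(modifier=mod.name)

# Duplicate with independent mesh data (like Shift+D, not linked Alt+D
# copies), so the scene really grows to ~75M triangles.
collection = bpy.context.collection
for x in range(GRID[0]):
    for y in range(GRID[1]):
        for z in range(GRID[2]):
            if (x, y, z) == (0, 0, 0):
                continue  # the original cube is the first instance
            dup = base.copy()
            dup.data = base.data.copy()
            dup.location = (x * SPACING, y * SPACING, z * SPACING)
            collection.objects.link(dup)

print("Objects in scene:", len(bpy.data.objects))
```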

Just tried it on an [email protected] with 64 GB of RAM; undo took 13 seconds.

It makes absolutely no sense to have any goals in seconds here. Making the current undo solution faster is not the way to go. The current undo system simply saves the entire scene at each step, and then reloads the entire scene on each undo. Even if reloading the entire scene were as optimized as possible, it would still be a bad solution. The proper undo solution is storing data deltas.

So, for example, if you have a giant office building and move one chair on one floor 1 meter to the right, undo should not store the entire scene and then, on undoing, reload the entire office building with thousands of pieces of furniture inside; it should just take that one chair and set its location transform back to the previous value. So when moving the chair one meter, for example from XYZ (20, 10, 0) to (20, 11, 0), you’d store the (20, 10, 0) location and, on undo, revert to that value.

You can easily test that in current Blender. Open a very large scene where undo takes very long, select an object, write down its XYZ position, then move it, and instead of undoing, manually type the XYZ coordinates you wrote down back in. Now you have performed a manual undo that did not take ages to complete.
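As a toy illustration of that idea (this is not Blender’s actual undo code, and the object name "Chair" is hypothetical), a delta-style undo step for the chair move could be as small as this:

```python
# Toy sketch of a delta-based undo step (not Blender's real undo system).
import bpy

class MoveStep:
    """Remembers one object's location before an edit and can revert it."""
    def __init__(self, obj):
        self.obj_name = obj.name
        self.old_location = tuple(obj.location)

    def undo(self):
        obj = bpy.data.objects.get(self.obj_name)
        if obj is not None:
            obj.location = self.old_location

chair = bpy.data.objects["Chair"]   # hypothetical object name
step = MoveStep(chair)              # stores (20, 10, 0)
chair.location.y += 1.0             # the move: now (20, 11, 0)
step.undo()                         # back to (20, 10, 0), no scene reload
```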

So no, having any goals in seconds does not make sense here, because it’s not about optimizing the current undo system to be faster, but about completely redoing it from scratch.


It was not a defense of the status quo …
I was pointing out that there are others who are worse off …
let’s say there was a lot of sarcasm in that comment
^^

Looks like this is the case. No matter if it’s lots of objects or one very high-poly one, bigger scene size = longer undo.

Seems like a very brute-force solution. In this case, from a non-coder’s perspective, it looks like a complete rewrite would be needed. It’d be interesting to hear how complex such an endeavor would be, or whether there are any suggested workarounds: linking scenes, whatever.

For production environments this is currently a no-go; I literally find myself not undoing things and instead recreating the previous state manually. I’d be interested to hear from the Spring team in regards to best practices.

A database that records all the modified parameter steps would suffice … and when we undo, simply write that previous parameter back from the database …
No need for large resources, no need for large amounts of memory …
On paper it would seem very simple …
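Building on the toy sketch earlier in the thread, such a "database" could be little more than a stack of (datablock name, property, previous value) entries. A minimal sketch of that idea (my own illustration, limited to simple top-level object properties; "Cube" is a hypothetical object name):

```python
# Toy "parameter database" sketch (my own illustration, not Blender code).
# Only handles simple top-level object properties such as location/scale.
import bpy

undo_db = []  # stack of (object_name, property_name, previous_value)

def record(obj, prop_name):
    """Store the current value of a property before it gets modified."""
    undo_db.append((obj.name, prop_name, tuple(getattr(obj, prop_name))))

def undo_last():
    """Write the most recently recorded value back onto the object."""
    name, prop_name, value = undo_db.pop()
    obj = bpy.data.objects.get(name)
    if obj is not None:
        setattr(obj, prop_name, value)

cube = bpy.data.objects["Cube"]   # hypothetical object name
record(cube, "location")          # remember the old transform
cube.location.x += 2.0            # the edit
undo_last()                       # restore it from the "database"
```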

Even large memory wouldn’t be an issue; stock-standard workstations at ILM came with 128 GB of RAM 5-6 years ago, beefier rigs with 768 GB or more. There’s only so much you can do with storage speed, but that wouldn’t be an issue either.

I’d be more than happy to invest what I’ve saved on Maya subscriptions in hardware and the occasional Blender donation, if that would solve these problems.

Anything would be better than what we have, let’s hope it gets addressed.

A database is just ID addresses and numbers to be recorded … practically nothing.

The best thing you can do for something like this is to write a script that generates the scene. That way there’s an error-free, consistent standard for people to benchmark against. Here’s a video series that shows how easy it is to populate a scene with simple objects.

IIRC, the line of code to do with the cursor placement has to be updated slightly to work with 2.8, but aside from that most of the code for the entire series is platform-agnostic.
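To go with a generator script, a rough way to time undo from a script might look like the sketch below. This is my own sketch, not an official benchmark, and timings measured this way may not exactly match interactive undo.

```python
# Rough undo-timing sketch: move the active object, push an undo step,
# then time how long bpy.ops.ed.undo() takes in the current scene.
import time
import bpy

obj = bpy.context.active_object
obj.location.x += 1.0                                 # the edit to undo
bpy.ops.ed.undo_push(message="benchmark translate")   # record an undo step

start = time.perf_counter()
bpy.ops.ed.undo()
print("Undo took %.2f seconds" % (time.perf_counter() - start))
```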

I think it comes down to scene size: no matter what you do, if your scene is getting big, undo is going to be slow, because Blender reads every datablock when undoing. It doesn’t matter too much whether it’s many small objects or one big one; the same happens with a single 50-million-poly cube.

Since there’s a task on it in the long-term plans, I guess the best thing we can do is lean back and wait.

The ticket mentions linked libraries as being exempt from the global undo problem; I’m not sure what that means exactly, but it didn’t make a difference for me when linking a large file into a small one. I would be really interested to hear from companies doing production work about best practices in this regard.


That’s what I was trying to say, referencing the ticket regarding “Optimized per-datablock global undo”.

Seeing that it has been marked as a long-term project, I’d imagine it has to be a change to the core, similar to the edit performance issues.

Honestly, I can work around it in production using whichever packages suit me best, but with more studios adopting or expressing interest in using Blender, it would be very helpful to see how they’re addressing such issues (which seem like a showstopper at first glance).

For now the only solution is to follow the advice given by Dr. Sybren A. Stüvel and to make sure people know about that page.


What the developers have said previously was that a new undo system will affect big chunks of Blender and will take some time to get coded because of it, but to my knowledge they have never given any indication that the entirety of Blender needs to be rebuilt from scratch. Where have you got this information from?

Good to know, thank you for the link.

Actually, much of the time when I’m working with Blender it’s with single heavy geos or bakes, no modifiers, nothing. Not much I can do about that.

It seems like a very cheap way of handling undo (admittedly, looking at the Blender file’s data structure and the fast save/load times, it’s the first thing I’d have resorted to as well). I hope they come up with something more production-friendly.

Just the other day I was working with a 20.2 GB Maya file. Saving/loading time was a pain, but neither editing nor undo was an issue.

This is false. I have tried it. Undo is currently so broken that even if you link the complex geometry from a different blend file, undo performance will be as slow as if the geometry were present in the currently open blend file itself.

Undo performance really should be the main and only priority now, as there are no workarounds or fixes for it, and it limits the usability of Blender to only trivial scenes, which in turn keeps damaging Blender’s reputation among professional users.


Listening to the last Blender Today, it seems it is a high priority.

I’m sure some developers are more motivated than others to dig into this (what seems to be a) mammoth project, but with more studios backing or looking into using Blender, I really hope this and edit mode get some much-needed attention.

I’m happy to use Blender on freelance projects, but I wouldn’t be confident to introduce it at the company I’m working at, mostly because of undo and some other performance issues.

That being said, I’m as happy as the next guy seeing new features and improvements every week; honestly, I’m having a hard time deciding whether I want a fun new toy to play with or a tool to use at work.
