Hi folks -
I’m really interested in UI automation for testing and was wondering whether any automation frameworks out there have successfully automated the Blender user interface. I don’t see many resources on this and would like to attempt it. I think UI automation would be beneficial, as it would exercise real user/artist workflows and actions.
There’s some internal tooling around this subject that you may be interested in and can see in this merged PR and the related commits. The main entry point is here.
As for other non-custom solutions, I’m unaware of any efforts in this area. AFAIK, things like Selenium or Playwright wouldn’t really be applicable due to their focus on web browsers.
I’ve recently been investigating event simulation from Python for potential use with Grease Pencil tools. It’s not a full test yet, but the principle works.

Running Blender with `--enable-event-simulate` makes it possible to fire input events from Python. The `easy_keys` module is a utility that sets up various things to help with that. It’s used inside a larger `ui_simulate` framework, but I wasn’t happy with the structure that framework forces scripts to use (everything goes through `run.py`), so I’m using a standalone script.
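To make the idea concrete, here is a minimal sketch of what a standalone script can look like. It assumes Blender was launched with `--enable-event-simulate`, which enables `Window.event_simulate`; the `key_tap_events` / `tap_key` helpers are hypothetical names for illustration, not part of Blender’s API:

```python
# Sketch: firing input events from Python inside Blender.
# Requires launching Blender with:  blender --enable-event-simulate
# (Window.event_simulate raises an error without that flag.)
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

def key_tap_events(key):
    """Build the (type, value) pairs for a single key press + release."""
    return [(key, 'PRESS'), (key, 'RELEASE')]

def tap_key(window, key):
    """Send a press/release pair for `key` through event simulation."""
    for event_type, value in key_tap_events(key):
        window.event_simulate(type=event_type, value=value)

if bpy is not None:
    # Pick the first open window and tap a key in it, e.g. 'A'
    # (select all, if the cursor is over the 3D viewport).
    win = bpy.context.window_manager.windows[0]
    tap_key(win, 'A')
```

Separating the event list from the code that fires it makes the sequence easy to inspect or log before replaying it, which helps when debugging why a simulated workflow diverges from a manual one.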