I’m new to posting on this forum, so please accept my apologies if I’ve put this in the wrong place and let me know where it should go.
I have been working with Blender for the past two years. It's an integral part of our pipeline and we love it. We're using it as a major part of a VFX pipeline for feature films. We work with multiple VFX facilities, each with its own proprietary pipeline, and data gets shared back and forth on a daily basis in various formats.
We use Blender as our primary renderer, so all of the data that gets shared back and forth will, at some point, be brought into Blender and rendered.
On our last project we were mostly passing Alembics into Blender, and I came across two I/O issues that I believe are major problems for anyone working in a VFX pipeline:
I've tested both of these issues with Blender 3.2 and they are still present.
1. Animated focal length is ignored. When importing a camera with an animated focal length via Alembic, Blender ignores the focal length animation. The only format that does import an animated focal length is FBX; however, I have yet to successfully import an FBX camera from Maya into Blender with the correct orientation. No matter what I set it to in Maya, one axis or another comes in inverted. I worked around this in our pipeline by importing (via Python) both an Alembic camera and an FBX camera, and using the focal length from the FBX camera to drive the Alembic camera's focal length.
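For anyone hitting the same problem, the workaround can be sketched roughly like this in Blender's Python API (bpy). The file paths and object names ("shotCam_abc", "shotCam_fbx") are placeholders for whatever your importers actually produce, and this is a minimal sketch rather than our production code:

```python
# Sketch: import the same camera twice (Alembic for the transform, FBX for
# the focal length) and copy the FBX camera's animated "lens" F-curve onto
# the Alembic camera. Names and paths below are placeholders.

def copy_keys(src_keys, insert):
    """Copy (frame, value) keyframe pairs onto a target F-curve via insert()."""
    n = 0
    for frame, value in src_keys:
        insert(frame, value)
        n += 1
    return n

try:
    import bpy  # only available when running inside Blender

    bpy.ops.wm.alembic_import(filepath="/path/to/shot.abc")
    bpy.ops.import_scene.fbx(filepath="/path/to/shot.fbx")

    abc_cam = bpy.data.objects["shotCam_abc"].data  # placeholder name
    fbx_cam = bpy.data.objects["shotCam_fbx"].data  # placeholder name

    # Find the animated focal length on the FBX camera data-block.
    src = fbx_cam.animation_data.action.fcurves.find("lens")

    # Create an action on the Alembic camera and copy the keys across.
    abc_cam.animation_data_create()
    action = bpy.data.actions.new("lens_from_fbx")
    abc_cam.animation_data.action = action
    dst = action.fcurves.new("lens")
    copy_keys(((kp.co.x, kp.co.y) for kp in src.keyframe_points),
              dst.keyframe_points.insert)
except ImportError:
    pass  # running outside Blender; nothing to do
```

In practice you'd also want to match interpolation settings on the copied keyframes, but even this simple copy gets the animated focal length onto the camera that has the correct orientation.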
2. Custom attributes are ignored. Adding a custom attribute is an essential part of any VFX workflow, especially when dealing with data from multiple sources, since it lets you pass non-standard information into Blender. Currently, when I add a custom attribute to an object in Maya and export it as an Alembic, that attribute is, as far as I can tell, nowhere to be found once the file is imported into Blender.
I'm currently exploring USD, and I've already discovered that importing a camera via USD has the same issue as importing it via Alembic. I had hoped that USD would solve the problem, given how important cameras are in a VFX pipeline.
Is there any work being put into solving these two fundamental issues any time soon? Things as they stand are very much limiting what we can do with Blender in a VFX pipeline.