Unified Data Import/Export API

Blender Data Exchange Management

The idea would be to have a universal way of getting the data from Blender and giving it to Blender via the Python API.

This is a stub and ideas and comments are welcome, as well as describing current issues that it could fix.

I’m emphasizing the stub nature here because this is just how I currently see and understand the idea (I don’t have a lot of experience with the API, so any input is welcome); I could be totally wrong too :grin:

Collaborative document on the topic. (empty)

General idea

We could have a single data structure, to avoid having to browse the data tree to get an object. This data structure would be well defined, specifying a format for a specific use. This could then be extended (step 2) with format templates, simplifying data manipulation for recurring cases.
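As a purely hypothetical sketch (none of these names exist in the real API — `Data`, `Material`, `getData` are invented here to match the use cases below), the wrapper could be a single entry-point object exposing typed lookups, so a script never browses the data tree itself:

```python
# Hypothetical sketch of the proposed wrapper; all names are invented.

class Material:
    def __init__(self, name, textures):
        self.name = name
        self.textures = textures  # already resolved, whatever the node setup

class Data:
    """One entry point for everything; lookups by name, not tree browsing."""
    def __init__(self, materials):
        self._materials = {m.name: m for m in materials}

    def getMaterial(self, name):
        return self._materials[name]

def getData():
    # A real implementation would build this from Blender's current data;
    # here it is just a stub with one material.
    return Data([Material("Material.001", ["diffuse.png", "normal.png"])])
```

With such a structure, the use cases below become one-liners.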

The Data

This would be an object describing all the data currently open in Blender. As if the data were saved and you read the file back, except here it’s all Python objects, easy to access and modify quickly from scripts and add-ons.

Use Cases

This is a starting point to improve upon:

  • Get Material.001’s textures:

      textures = getData().getMaterial("Material.001").textures
  • Get Material.001’s baked texture:

      Material001_texture = getData().getMaterial("Material.001").texture
  • Get Material.001:

      Material001 = getData().getMaterial("Material.001")

    This would be a Material object with methods to manage formats and so on.

  • Get the animation action ‘jump’:

      Jump = getData().getAction("jump")

From the description so far this seems to be the same thing as the bpy.data API that we already have. Which problem are you trying to solve?
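For comparison, the lookups above read roughly like this with today’s bpy.data. This is a sketch that only runs inside Blender, hence the local imports; the texture lookup assumes a node-based (Cycles) material, and the function names here are invented:

```python
def get_material_images(name):
    """Images referenced by a material's node tree.

    Roughly what the proposed getData().getMaterial(name).textures would
    return. Assumes the material uses nodes; bpy only exists inside
    Blender, so it is imported lazily.
    """
    import bpy  # only available inside Blender
    mat = bpy.data.materials[name]
    return [node.image for node in mat.node_tree.nodes
            if node.type == 'TEX_IMAGE' and node.image is not None]

def get_action(name):
    """Roughly the proposed getData().getAction(name)."""
    import bpy  # only available inside Blender
    return bpy.data.actions[name]
```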

The idea would be to have a unified way to access all the data (bpy.data does not always give access to everything), and to be one layer removed from the actual structure of the content (i.e. to have implicit access to the baked shader so it can be exported to another format).

The solution is not to create a whole new thing. I’m just not referencing anything existing, so as to put no constraints on the potential solution; in the end, it might be some modifications to bpy.data to fill the gaps, or similar. Extending bpy_extras.io_utils could be a solution too, depending on what’s needed. With this thread, I wanted to gather some needs, to know what to extend :slight_smile: .


And there would be something else too: easing the conversion of data for import/export. That should belong more in bpy_extras.io_utils, I guess.

Is merging this module into bpy an idea for the future?

This thread talks about the different ways of accessing data through the API:

About the API’s design: it looks like the idea is to have it be access only, without data manipulation.

It all depends on the design, but I see both as conceivable: the latter (with data manipulation) would make it easier to write scripts, but the former would be cleaner (straight access, with manipulation left to the C operators).

So why not create a second level (a higher-level ‘framework’ wrapping the API) with those algorithms? (Restructuring all the utilities around bpy.)

That would keep everything clean, maintain compatibility with scripts already written (since the API remains unaltered), avoid a big change, and make scripting more straightforward (avoiding re-implementation of features each time a script is created)!

So, from tool ‘scripts’, to a structured module wrapping the API. Why not?

The API does provide read and write access to data, and is quite comprehensive for importers and exporters. There are hundreds of Python scripts using it for that purpose.

It’s fine to brainstorm about API improvements of course, and certainly there’s room for improvement. But any breaking changes or new additions should be done to solve concrete problems, and be evaluated by how well they solve those problems, and so far this doesn’t seem to identify any.

Could you give some examples of code in the current API which is currently difficult/awkward, then an example how a different API would help here?

Note that I can think of some too, I just like to see what you had in mind.

Otherwise API discussions get too abstract, it could be argued we already have a “Unified Data Import/Export API” so I’m not sure what you’re proposing.

With all the getXxx() functions, it looks more like a Java API than Python…


I’m currently having network problems (hence the delay, sorry), so I can’t follow the discussion right now, but I’ll try to give a quick explanation of the context in which this topic was created.
Several discussions (on IRC) were about issues with manipulating data through the API, for example manipulating shaders to export them to the glTF format (extracting and baking the textures). In the meantime, there was a discussion about issues manipulating animation data. We came to think about gathering different points of view and experiences, to see whether there is a recurring pain point that could be fixed (by code or documentation; it seems the methods aren’t always easy for users to find).
I took on creating the thread, to get input from other people. I’m not currently encountering any issues myself (I’m more here to help work on the solution), but Kupoman, Gaia and bzzploink (on #blendercoders), with whom I was discussing this point, may be able to give more background.
The aim is to know whether there is something to improve, and what we could do to improve it in the best way possible.

But we weren’t going to start changing anything right now, just gathering points of view to see whether there is something there or not.

That’s it! :slight_smile:

@Quetzal2 shaders are one area this definitely makes sense.

For the FBX importer I wrote a utility module shader_cycles_compat that handles creating a node tree from diffuse, specular, bump, etc. values and textures.

A module to do the reverse would be imprecise, but possible. If someone’s done this already, I’m not aware of it.
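A starting point for that reverse direction might look like the sketch below. It is duck-typed and deliberately lossy (not shader_cycles_compat or any existing module); the attribute names (`is_linked`, `links`, `from_node`, `default_value`, `image`) mirror Blender’s socket/link API, but only the trivial one-link-back case is handled:

```python
def extract_input(socket):
    """Return ('value', v) or ('image', img) for one shader input socket.

    Follows the first link back from the socket; if it reaches a node
    carrying an image (e.g. an Image Texture node), report the image,
    otherwise fall back to the socket's default value. Anything more
    complex than a direct texture link is where the imprecision starts.
    """
    if socket.is_linked:
        node = socket.links[0].from_node
        if getattr(node, "image", None) is not None:
            return ("image", node.image)
    return ("value", socket.default_value)
```

A full module would need to recognize common node patterns (mix nodes, color ramps, bump/normal setups) and bake everything else.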

Not sure how something like this is needed for animation data (would need some examples).

I wouldn’t call this a unified API; there are just some utility modules it would be nice to have for doing higher-level operations on Blender data, for the times scripts aren’t concerned with the details.

The bpy_extras module is intended for this purpose. Note that I didn’t include cycles_shader_compat module there because it’s only used by one add-on and not well tested for general purpose tasks.

Indeed, I’m not convinced that we need a new API specifically for IO, but rather some helpers for common issues (maybe even in C for performance-critical cases), and generic wrappers.

For animations, an easy way to get baked data would be useful. A fair part of the FBX exporter currently deals with this; most of that code is about getting valid baked values for every frame, and then simplifying them.
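The “simplifying” half of that work can be sketched in pure Python. This is an assumption about the general approach (drop per-frame samples that are recoverable by linear interpolation within a tolerance), not the FBX exporter’s actual code:

```python
def simplify_baked(values, tol=1e-6):
    """Reduce one-sample-per-frame data to (frame, value) keys.

    Keeps only the samples needed to reproduce `values` by linear
    interpolation within `tol`: a key is kept when the value predicted
    from the last kept key and the next sample is off by more than tol.
    A much-simplified stand-in for an exporter's key reduction.
    """
    if len(values) <= 2:
        return list(enumerate(values))
    keys = [(0, values[0])]
    for i in range(1, len(values) - 1):
        f0, v0 = keys[-1]
        span = (i + 1) - f0
        # Predict frame i by interpolating from the last kept key to i+1.
        predicted = v0 + (values[i + 1] - v0) * (i - f0) / span
        if abs(predicted - values[i]) > tol:
            keys.append((i, values[i]))
    keys.append((len(values) - 1, values[-1]))
    return keys
```

Constant or linear channels collapse to their two endpoint keys, which is exactly the case that bloats naively baked animation data.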

The difficulty here is finding the right point up to which generic code can be used; given all the specificities of each format, you’ll always have to write some code of your own to adapt the data anyway.


Thank you all for your feedback.

I’ll thus be more selective and work on the shader case for now. I’m starting to write some tests, to see how that helper (in C) could be structured to be as generic as possible, and easily adaptable to a specific use case.


A quick overview of the concept:

And here is the example file:

Does this seem right to you?

I’m not sure what to use to calculate those textures.

When the input node is of type BSDF, I could bake the different textures (color, roughness, …) with the baking system, right? I was studying how the baking system works (from the code and some documentation), but I’m not sure how to use it. If you have any link to documentation about how to use it, or to a line of code using it, that’d be awesome :slight_smile: . As Internal is going away, and I’m interested in baking Cycles materials anyway, I’m looking at Cycles’ baking feature and trying to make sense of what is Internal-only and what is shared, because it seems like part of the baking pipeline is shared. Right now, I’m not seeing how I could use it to bake the wanted textures.

And when the input is only a color, do I really need to use the bake feature? There might be a quicker way of getting it (and is there anything in Blender that I should reuse for this, or can I implement it myself?).
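For what it’s worth, driving a Cycles bake from Python generally looks like the sketch below. It only runs inside Blender (hence the lazy bpy import), the function name and setup are my own, and the easy-to-miss part is that an Image Texture node must be the active node so Cycles knows where to bake; the operator itself is `bpy.ops.object.bake` with a `type` and `pass_filter`:

```python
def bake_diffuse_color(obj, image_name="baked_color", size=1024):
    """Bake the diffuse color of obj's active material into a new image.

    Only runs inside Blender with Cycles as the render engine, and with
    `obj` selected. Sketch only: error handling and node cleanup omitted.
    """
    import bpy  # only available inside Blender
    img = bpy.data.images.new(image_name, width=size, height=size)
    mat = obj.active_material
    tex_node = mat.node_tree.nodes.new('ShaderNodeTexImage')
    tex_node.image = img
    mat.node_tree.nodes.active = tex_node  # Cycles bakes into the active image node
    bpy.context.view_layer.objects.active = obj
    # Bake only the color component of the diffuse pass (no lighting):
    bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'})
    return img
```

For a plain-color input, skipping the bake and filling a small image with the socket’s default value would indeed be much quicker.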

The steps for now are as (roughly) follows:

  • create the node sub-trees for each input, in order to get one node tree per input of the selected node (the root node itself is not included in any sub-tree)
  • for each sub-tree:
    • check the type of the related input
    • compute the texture (bake it with cycles or something else)
    • add the texture to the list
  • return the list of textures
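The steps above can be sketched as follows. This is duck-typed pseudocode in Python: `bake` stands in for “bake the sub-tree with Cycles”, `from_color` for the quicker plain-color case, and the sub-tree extraction itself is elided:

```python
def textures_for_node(root_node, bake, from_color):
    """Collect one texture per input of `root_node`, following the steps
    above: for each input socket, check what feeds it and convert that
    into a texture. Sockets only need `is_linked` and `default_value`,
    so this runs on stand-in objects as well as on real Blender nodes.
    """
    textures = []
    for socket in root_node.inputs:
        if socket.is_linked:
            # A whole node sub-tree feeds this input: bake it.
            textures.append(bake(socket))
        else:
            # Just a constant value/color: convert it directly.
            textures.append(from_color(socket))
    return textures
```

Keeping the bake and color-conversion steps as injected callables is one way to keep the generic walk separate from the format-specific parts.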

Thank you for your attention! :slight_smile:

During the summer break, I have continued to learn the structure of the code and the tools available to achieve this goal. I realized as well that there are already “bake to …” options that match exactly the textures I was looking to create. I thus simplified the process.

However, being quite new to the code, I wasn’t able to implement it directly; there are still too many code sections and tools that I don’t know.
Since September, I haven’t been able to allocate as much time to it as I wanted, and the work stalled.

I have seen what has been merged into 2.8 on this subject, and, as I become more familiar with the codebase, I might be able to contribute to this effort specifically.

This thread is now far too broad and not up to date with the current status of the glTF exporter. I think creating specific threads on specific issues or themes would be better. I will thus go back to smaller tasks, to get to know the codebase better, and if I can help on this topic, I’ll get back in contact with the developer in charge to see what I can do.

Thank you for your help, time and attention.