The idea would be to have a universal way of getting data out of Blender and feeding data back into Blender via the Python API.
This is a stub: ideas and comments are welcome, as are descriptions of current issues it could fix.
I am emphasizing the stub nature here because this is just how I currently see and understand the idea (I don't have a lot of experience with the API, so any input is welcome), and I could be totally wrong.
We could have one data structure that avoids having to browse the data tree to get at an object. This structure would be explicitly defined, specifying a data format for a specific use. In a second step, this could extend to format templates, simplifying data manipulation for recurring cases.
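To make the idea a little more concrete, here is a purely hypothetical sketch (the `ExportView` class and its methods are invented names, not an existing or proposed API) of the difference between browsing the data tree and querying one defined structure:

```python
import bpy

# Today: walk the data tree by hand for each thing an exporter needs.
mesh_objects = [ob for ob in bpy.data.objects if ob.type == 'MESH']

# Hypothetical unified structure: one object exposing the same data in a
# format defined for a specific use (all names below are invented).
class ExportView:
    def __init__(self):
        self.objects = {ob.name: ob for ob in bpy.data.objects}
        self.mesh_objects = [ob for ob in bpy.data.objects if ob.type == 'MESH']

    def users_of_material(self, material):
        """Example of a 'format template' style query:
        all mesh objects that use the given material."""
        return [ob for ob in self.mesh_objects
                if any(slot.material is material for slot in ob.material_slots)]
```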
The Data
This would be an object describing all the data currently open in Blender. It would be as if the data had been saved and you were reading the file back, except here everything is a Python object, easy to access and modify quickly from scripts and add-ons.
The idea would be to have a unified way to access all the data (bpy.data does not always give access to everything), and to be one layer removed from the actual structure of the content (e.g. implicit access to the baked shader so it can be exported to another format).
The point is not to create a whole new thing. I am just not referencing anything existing, so as not to constrain the potential solution; in the end it might simply be some modifications to bpy.data to fill the gaps. Extending bpy_extras.io_utils could be a solution too, depending on what is needed. With this thread, I wanted to gather some needs so we know what to extend.
About the API's design: it looks like the idea is to have it provide access only, without data manipulation.
It all depends on the design, but I see both as conceivable. The latter (with data manipulation) would make it easier to write scripts, while the former would be cleaner (straight access and C operator manipulation).
So why not create a second level (a higher-level "framework" wrapping the API) with those algorithms, restructuring all the utilities around bpy?
That would keep everything clean, maintain compatibility with scripts already written (since the API itself remains unaltered), avoid a big change, and make scripting more straightforward (avoiding re-implementation of features every time a script is created)!
So, from tool "scripts" to a structured module wrapping the API. Why not?
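As an illustration of what "wrapping the API" could mean in practice, here is a minimal sketch of the kind of helper such a module might contain (the function name is invented, and it assumes the 2.8x depsgraph API):

```python
import bpy

def triangulated_mesh_data(obj, depsgraph):
    """Hide the evaluated-object / temporary-mesh boilerplate that exporters
    tend to re-implement: return modifier-applied vertices and triangles."""
    ob_eval = obj.evaluated_get(depsgraph)
    me = ob_eval.to_mesh()
    me.calc_loop_triangles()
    verts = [v.co.copy() for v in me.vertices]
    tris = [tuple(t.vertices) for t in me.loop_triangles]
    ob_eval.to_mesh_clear()
    return verts, tris

# Usage from a script or add-on:
# depsgraph = bpy.context.evaluated_depsgraph_get()
# verts, tris = triangulated_mesh_data(bpy.context.object, depsgraph)
```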
The API does provide read and write access to data, and is quite comprehensive for importers and exporters. There are hundreds of Python scripts using it for that purpose.
It's fine to brainstorm about API improvements of course, and certainly there's room for improvement. But any breaking changes or new additions should be made to solve concrete problems and be evaluated by how well they solve them, and so far this proposal doesn't identify any.
Could you give some examples of code in the current API which is difficult/awkward, then an example of how a different API would help here?
Note that I can think of some too; I'd just like to see what you had in mind.
Otherwise API discussions get too abstract; it could be argued we already have a "Unified Data Import/Export API", so I'm not sure what you're proposing.
(I am currently having network problems, hence the delay, sorry, so I can't follow the discussion right now.) But I'll try to give a quick explanation of the context in which this topic was created.
Several discussions (on IRC) were about issues with manipulating data through the API, for example manipulating shaders in order to export them to the glTF format (extracting and baking the textures). Around the same time, there was a discussion about issues manipulating animation data. We came to think about gathering different points of view and experiences to see whether there is a recurring pain point that could be fixed (by code or by documentation; it seems the right methods aren't always easy for users to find).
I took on creating the thread to get input from other people. I'm not currently encountering any issues myself (I'm mostly here to help work on a solution), but Kupoman, Gaia and bzzploink (on #blendercoders), with whom I was discussing this point, may be able to give more background.
The aim is to find out whether there is something to improve, and how we could best improve it.
But we weren't about to start changing anything right now, just gathering points of view to see whether there is something there or not.
@Quetzal2 shaders are one area where this definitely makes sense.
For the FBX importer I wrote a utility module, cycles_shader_compat, that handles creating a node tree from diffuse, specular, bump, etc. values & textures.
A module to do the reverse would be imprecise, but possible. If someone's done this already, I'm not aware of it.
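For readers not familiar with that module: this is not its actual code, just a minimal sketch of the kind of node-tree construction it handles (building a Cycles node tree from plain values and an optional texture):

```python
import bpy

def simple_diffuse_material(name, color=(0.8, 0.8, 0.8, 1.0), image=None):
    """Build a minimal Cycles node tree from a diffuse color and optional texture."""
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    out = nodes.new('ShaderNodeOutputMaterial')
    bsdf = nodes.new('ShaderNodeBsdfDiffuse')
    bsdf.inputs['Color'].default_value = color
    links.new(bsdf.outputs['BSDF'], out.inputs['Surface'])

    if image is not None:
        tex = nodes.new('ShaderNodeTexImage')
        tex.image = image
        links.new(tex.outputs['Color'], bsdf.inputs['Color'])
    return mat
```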
Not sure how something like this would be needed for animation data (that would need some examples).
I wouldn't call this a unified API; these are just some utility modules that would be nice to have for doing higher-level operations on Blender data, for the times scripts are not concerned with the details.
The bpy_extras module is intended for this purpose. Note that I didn't include the cycles_shader_compat module there because it's only used by one add-on and not well tested for general-purpose tasks.
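For context, bpy_extras.io_utils already covers chores like axis conversion and the file-selector boilerplate for export operators. A minimal example of both (the operator below is only a placeholder skeleton, not a real exporter):

```python
import bpy
from bpy_extras.io_utils import ExportHelper, axis_conversion

# Axis conversion is one exporter chore io_utils already handles:
global_matrix = axis_conversion(to_forward='-Z', to_up='Y').to_4x4()

class ExportSomething(bpy.types.Operator, ExportHelper):
    """Placeholder export operator skeleton (idname/label are made up)."""
    bl_idname = "export_scene.something"
    bl_label = "Export Something"
    filename_ext = ".txt"

    def execute(self, context):
        # ExportHelper provides self.filepath and the file selector.
        with open(self.filepath, 'w', encoding='utf-8') as f:
            f.write("# placeholder export\n")
        return {'FINISHED'}
```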
Indeed, I'm not convinced that we need a new API specifically for IO, but rather some helpers for common issues (maybe even in C for performance-critical cases), and generic wrappers.
For animations, an easy way to get baked data would be useful. A fair part of the FBX exporter currently deals with this; most of that code is about getting valid baked values for every frame and then simplifying them.
The difficulty here is finding the right point up to which generic code can be used: given all the specificities of each format, you'll always have to write some custom code to adapt the data anyway.
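To illustrate the pattern being described (this is not the FBX exporter's actual code, just a rough sketch of "evaluate every frame, then simplify", with an arbitrary tolerance):

```python
import bpy

def bake_object_locations(obj, frame_start, frame_end, scene=None):
    """Evaluate the scene at every frame and record the object's world location."""
    scene = scene or bpy.context.scene
    baked = []
    for frame in range(frame_start, frame_end + 1):
        scene.frame_set(frame)
        baked.append((frame, obj.matrix_world.translation.copy()))
    return baked

def simplify(samples, tolerance=1e-6):
    """Drop samples that are (nearly) identical to the previously kept one."""
    kept = samples[:1]
    for frame, loc in samples[1:]:
        if (loc - kept[-1][1]).length > tolerance:
            kept.append((frame, loc))
    return kept
```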
I'll thus be more selective and work on the shader case for now. I am starting to write some tests and to see how that helper (in C) can be structured to be as generic as possible and easily adaptable to a specific use case.
I'm not sure what to use to compute those textures.
When the input node is of type BSDF, I could bake the different textures (color, roughness, ...) with the baking system, right? I have been studying how the baking system works (from the code and some documentation), but I'm not sure how to use it. If you have a link to documentation on how to use it, or to a line of code using it, that would be awesome. (Since Internal is going away, and I'm interested in baking Cycles materials anyway, I am looking at Cycles' baking feature and trying to work out what is Internal-only and what is shared, because it seems part of the baking pipeline is shared.) Right now I'm not seeing how I could use it to bake the wanted textures.
And when the input is only a color, do I really need to use the bake feature? There might be a quicker way of getting those values (and is there anything in Blender that I should reuse for this, or can I implement it myself?).
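I am not certain of the correct setup myself, but from what I can tell on the Python side the Cycles bake is driven through bpy.ops.object.bake, which bakes into the image of the active Image Texture node. A rough sketch of the two cases (linked input → bake to an image, plain value → read it directly); the helper name, image size and bake type are arbitrary assumptions:

```python
import bpy

def input_to_texture_or_value(obj, material, socket, size=1024):
    """Rough sketch: assumes a Cycles scene, with `obj` active and `material` assigned."""
    if not socket.is_linked:
        # Plain value: no baking needed, just read it.
        return socket.default_value

    # Linked sub-tree: bake it into an image assigned to the active Image Texture node.
    image = bpy.data.images.new(material.name + "_" + socket.name, size, size)
    nodes = material.node_tree.nodes
    bake_node = nodes.new('ShaderNodeTexImage')
    bake_node.image = image
    nodes.active = bake_node

    bpy.context.view_layer.objects.active = obj
    # NOTE: for an arbitrary input, the sub-tree usually has to be temporarily
    # wired to an Emission shader first so an EMIT bake captures it; that
    # rewiring step is omitted here.
    bpy.ops.object.bake(type='EMIT')
    nodes.remove(bake_node)
    return image
```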
The steps for now are (roughly) as follows (a rough outline in code follows the list):
create the node sub-trees for each input, in order to get a node tree per input of the selected node (the root node itself is not included in any sub-tree)
for each sub-tree:
check the type of the related input
compute the texture (bake it with Cycles or something else)
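A rough outline of those steps in code (names are placeholders; the actual texture computation, whether by baking or otherwise, is left out):

```python
def gather_root_inputs(root_node):
    """First pass over the steps above: walk each input of the selected (root)
    node and decide whether it has a sub-tree that needs a computed texture,
    or is a plain value that can be used directly."""
    needs_texture, plain_values = [], {}
    for socket in root_node.inputs:
        if socket.is_linked:
            needs_texture.append(socket)   # a sub-tree feeds this input: bake/compute it
        elif hasattr(socket, "default_value"):
            plain_values[socket.name] = socket.default_value
    return needs_texture, plain_values
```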
During the summer break, I continued to learn the structure of the code and the tools available to achieve this goal. I also realized that there are already "bake to …" options that match exactly the textures I was looking to create, so I simplified the process.
However, being quite new to the code, I wasn't able to implement it directly, and there are still too many code sections and tools that I don't know.
Since September, I haven't been able to allocate as much time to it as I wanted, and the work stalled.
I have seen what has been merged into 2.8 on this subject and, as I become more familiar with the codebase, I might be able to contribute to this effort specifically.
This thread is now far too broad and not up to date with the current status of the glTF exporter. I think creating specific threads on specific issues or themes would be better. I will therefore go back to smaller tasks to get to know the codebase better, and if I can help on this topic, I'll get back in contact with the developer in charge of it to see what I can do.