[SOLVED] Materials implementation for game modding plugin

A little background
I am creating an add-on that lets the user import and export models in a format called “OpenFormats” for the video game Grand Theft Auto V. A program called OpenIV allows you to browse the game’s files and export them as decrypted “OpenFormats” files, and it can also convert them back into the correct format so that the game can read them. OpenIV ships with a massive XML file containing an organized list of shader parameters; each parameter is simply either a collection of float values or an image path. What I need in Blender is some sort of material editor where, for each material on an object, the user can select a shader defined in the XML file (I would do some sort of parsing of the XML file, but that doesn’t matter for this question). These shaders would do nothing to the object in the scene; they would just be used by the exporter. The material editor’s data would also have to be saved in the .blend file somehow.

TL;DR
I need a way for the user to set certain options for each material of an object and have those options saved in the .blend file. These options would have no effect on how Blender renders the object; they would just be used by my custom exporter. The one exception is a diffuse map (present in many of the shaders from the XML file), which does need to affect rendering, as I want users to be able to see the textures they apply.

Here is an example of what it might look like from a similar plugin for 3ds max: https://gyazo.com/e44f91a46f8efcbbfa50d8c601960861
Here is the XML file containing the shaders (had to cut off a good portion but should be enough to get the gist of it): https://pastebin.com/1nnT5Yyg

What I’ve thought of doing so far

  • Could I have some custom property on each object that is a dictionary storing all of the options? I have considered that approach, but found it difficult to figure out how to keep the dictionary in sync. For instance, how could I update the dictionary when the name of a material changes?

  • Is there some way I could dynamically create my own custom shaders from the XML file? For instance, my add-on would, at runtime, create a large number of shaders from the XML file. The only “real” value these shaders would ever have would be a diffuse map; other than that, the values would just be floats or vectors that do nothing in Blender but are used by the exporter.
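One way to sidestep the sync problem from the first bullet is to store the options on the Material datablock itself (in Blender, via a `bpy.types.PropertyGroup` attached to `bpy.types.Material` with a `PointerProperty`, or via ID custom properties), so the options follow the material through renames and are saved inside the .blend. A minimal pure-Python sketch of the read side, with made-up option names:

```python
# Hypothetical per-material options the add-on might store. In a real add-on
# these would live in a bpy.types.PropertyGroup assigned to bpy.types.Material
# through a PointerProperty, so they are saved with the .blend file and never
# go out of sync when the material is renamed.
SHADER_OPTION_DEFAULTS = {
    "shader_name": "default",    # preset chosen from the parsed XML
    "specular_intensity": 1.0,   # example float parameter for the exporter
    "diffuse_map": "",           # image path; the one option affecting preview
}

def options_for_material(material) -> dict:
    """Read stored options off a material, falling back to defaults.

    Blender ID datablocks expose custom properties through .get(), so this
    works on a real bpy Material as well as on a plain dict.
    """
    return {key: material.get(key, default)
            for key, default in SHADER_OPTION_DEFAULTS.items()}
```

Because the data hangs off the material datablock rather than a side dictionary keyed by name, renaming the material needs no bookkeeping at all.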

Also, please bear in mind that I have no Blender modeling experience, but I do have a lot of experience programming. My purpose in creating this plugin is so I (and many others in the community) could use Blender for modding GTA V, and it seems I am the only one who is willing to do it. Thanks!

Custom properties lend themselves well to being used by other exporters like glTF. That said, you are mostly limited to the Principled BSDF, and the properties themselves won’t really do anything for previewing.

If you wanted to imitate the interface and look of the game’s shaders: you could also create custom node group(s) (wrapping the Principled BSDF, or whatever setup you want, with custom features and dummy sockets as necessary) that expose whatever shader parameters the game uses as sliders and colors. When you export a material, scan whether one of your game’s shader node groups is plugged into the surface output, and for each field either read the slider/color or check whether an image texture is plugged in. Default Blender shaders can also be checked for a best match, which is especially helpful when working with imported assets. Supporting the Principled BSDF in one way or another, regardless of target application, is generally best practice when writing a Blender exporter.
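The export-time scan could look roughly like this sketch, duck-typed against Blender’s node API (the attribute names `type`, `is_linked`, `links`, `default_value` are real; the `GTA_` group-name prefix is an invented convention):

```python
def find_surface_node(node_tree):
    """Return the node plugged into a Material Output's Surface input."""
    for node in node_tree.nodes:
        if node.type == 'OUTPUT_MATERIAL':
            surface = node.inputs['Surface']
            if surface.is_linked:
                return surface.links[0].from_node
    return None

def read_game_shader(node_tree):
    """If a GTA_* node group drives the surface, collect its parameter values.

    For each input socket: a linked Image Texture contributes its image path,
    otherwise the socket's default value is used. Returns None when the
    material does not use one of our shader groups (the fallback case).
    """
    node = find_surface_node(node_tree)
    if node is None or node.type != 'GROUP':
        return None
    if not node.node_tree.name.startswith('GTA_'):
        return None
    params = {}
    for sock in node.inputs:
        if sock.is_linked and sock.links[0].from_node.type == 'TEX_IMAGE':
            params[sock.name] = sock.links[0].from_node.image.filepath
        else:
            params[sock.name] = getattr(sock, 'default_value', None)
    return node.node_tree.name, params
```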

If your game uses an ubershader or just a handful of shaders, you can likely implement the node group(s) for them manually in another .blend file that you link into the working file (manually, via Python, or by using it as an application template). If your game’s ubershader has some costly feature like parallax occlusion mapping, consider splitting that into its own node group. If your game has many permutations which mix costly effects, then breaking them up into smaller per-feature node groups, which can be wired together in another node group via Python to form the whole shader on demand, might be a good option.

If your game doesn’t use ubershaders but has a bunch of manually defined shader variants, most of which share the same features (e.g. TexturedUnlit, Unlit, TexturedBumpSpec, TextureBlendBumpSpec, etc.), then consider boiling the problem down to a single ubershader in Blender with feature bits which are used to select the appropriate target shader.
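A toy illustration of those feature bits: boolean toggles (exposed as sliders on the Blender-side ubershader group) map to one of the game’s fixed variants at export time. The variant names come from the examples above; the mapping itself is invented:

```python
def select_target_shader(textured: bool, bump: bool,
                         spec: bool, lit: bool = True) -> str:
    """Pick the game-side shader variant from blender-side feature toggles.

    Only a few combinations exist as real variants; anything else falls back
    to a safe default.
    """
    features = (textured, bump, spec, lit)
    table = {
        (True,  False, False, False): "TexturedUnlit",
        (False, False, False, False): "Unlit",
        (True,  True,  True,  True):  "TexturedBumpSpec",
    }
    return table.get(features, "Textured" if textured else "Default")
```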

Worst case scenario: if your game uses some shader-graph setup, or its variants cannot be easily captured by “switches” (implemented as sliders) in a node group plus a handful of variants, then you might have a significantly bigger problem on your hands.

Hopefully this helps you get on the right track.

Hi, thanks for the post. I am not sure which category the shaders fall in (I have never worked with shaders before). It sounds to me like your second method would make the most sense, as it has to be done programmatically. Have you checked out the Pastebin of the XML file I sent? I also have just taken all of the “ShaderParameters” and dumped them into this text file with no duplicates: https://pastebin.com/x08QZqij In the XML they are re-used a lot. So, looking at the file I just sent, and the XML, would it make sense for each Item under ShaderParameter to be a node socket, and for each ShaderPreSet to be a group of those sockets? Am I understanding this right?

I can glean a little context from the paste link, but it seems to be a much larger blob of every parameter that might get passed to a shader, even those that don’t make sense at a material level (like view/player position). Simply dumping the XML contents into a generated list editor would not make a good material editor. The only place a generic solution like that would make sense is either a constantly changing product (which I assume this isn’t) or something with too many parameters and variants to replicate.

If your game doesn’t use ubershaders but has a bunch of manually defined shader variants, most of which share the same features (e.g. TexturedUnlit, Unlit, TexturedBumpSpec, TextureBlendBumpSpec, etc.), then consider boiling the problem down to a single ubershader in Blender with feature bits which are used to select the appropriate target shader.

Based on the docs you linked, it seems GTA V takes this kind of approach: there are shaders for different surface types, with some shader types just being stripped-back or feature-swapped versions of others. So, for example, the variants could be consolidated into one or more node groups, and at export time the exporter could choose which target shader to use based on what features are used in the setup.

Blender’s way of handling materials is centered around a unified node graph, which is far different from the “select a shader, set parameters” approach taken by Max, Maya, and even Houdini. Blender just gives the user a sandbox of nodes to play around with and expects the renderer to respect their intention to the best of its ability. This is honestly a bit of a painful spot where Blender might not play as well with others.

Before you go further, it is a good idea to separate the concept of “materials” from “shaders”: materials are the artist-friendly description of a surface that exists in Blender, while shaders/parameters describe to the engine how to render that surface. The artist doesn’t need to know, nor care, about every ShaderParameter that can be passed to the shader; they just want to choose what texture to use, set specularity or bump mapping, or make it look like cloth. When the exporter comes to parse a material, it is the exporter’s responsibility to infer the surface properties set up in Blender and translate them into the best-fitting ShaderPreSet, with the ShaderParameters set to reflect the material.
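That translation step might be sketched like this, going from an artist-level description (already inferred from the Blender material) to an engine-level record. All preset and parameter names here are invented for illustration, not taken from the actual XML:

```python
def material_to_preset(material_desc: dict) -> dict:
    """Translate an inferred material description into an export record.

    material_desc is the artist-facing view (texture paths, a specular
    slider); the output picks a preset and fills only the ShaderParameters
    that preset actually uses.
    """
    preset = "normal_spec" if material_desc.get("bump_map") else "default"
    params = {"DiffuseSampler": material_desc.get("diffuse_map", "")}
    if material_desc.get("bump_map"):
        params["BumpSampler"] = material_desc["bump_map"]
    # Pack the scalar slider into the float parameter the preset expects.
    params["SpecularIntensity"] = material_desc.get("specular", 0.5)
    return {"preset": preset, "parameters": params}
```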

Again, though, Blender’s material system is just a sandbox of nodes that can be configured any way the artist wants, so deeply interrogating an arbitrary node graph to understand these properties would be painful and fruitless. Most exporters therefore respect the Principled BSDF if it is plugged straight into the surface output (since this is the most basic starting material), define a few custom shaders/extensions as node groups, and leave the rest as undefined behavior that is ignored and falls back to a default material. This is the recommended approach, as the Principled BSDF is itself a PBR ubershader supporting many common features, which makes it a good starting point. If you still need toggles for technical things (buffer access, skinning, etc.), you can put them in a custom panel in the material properties tab, similar to how shadow/displacement/blending settings are defined for Eevee and Cycles.

So the ShaderPreSets would likely need to be broken down and plotted out by feature. Once those features are identified, they can be composed into one or more “ubershader” or “extension” node groups, where features can be selected with “use” sliders and the inputs are based on the ShaderParameters that are relevant to artists.

This probably seems like a lot to take in, especially if you aren’t familiar with shaders or Blender, but I would recommend taking a step back from the problem and looking at how other exporters, including the official Khronos glTF exporter, map Blender’s nodes to their specification or problem domain.

It’s not going to be a quick and painless automated process but in the end the exporter would be much higher quality and more enjoyable for users to work with when done properly.

You explained this perfectly, I understand what you’re saying now. I definitely agree this is a better approach and would offer a far better user experience, improving on previous modding tools that have been made.

This probably seems like a lot to take in, especially if you aren’t familiar with shaders or Blender, but I would recommend taking a step back from the problem and looking at how other exporters, including the official Khronos glTF exporter, map Blender’s nodes to their specification or problem domain.

Definitely a good idea. I have enough info to start now, but before I do I am going to learn how to actually use the shader editor myself as well as view some other exporters to solidify my understanding. This is going to take much longer than I had originally anticipated, but will be well worth it.

Thanks for the insight, this wouldn’t be possible without your help!

Some of the built-in exporters are a little neglected when it comes to material export. I tried to remaster the OBJ exporter a while back with PBR support, but due to things shifting around and code updates breaking compatibility, it fell through the cracks: https://developer.blender.org/D8868 Hopefully linking it gives some insight. Also, “node_shader_utils.py” is a bit hit-and-miss; it might be a good starting point for inferring some properties, but I would recommend accessing the node graph with your own methods. Despite the tech debt in this area, I hope this helps.

Thank you for mentioning your patch. I am currently working on getting Ankit’s GSoC 2020 OBJ I/O in C++ project ready for master; the hope is that we can deprecate the Python importer and exporter, perhaps in Blender 3.0. I am worried about material import and export, especially how to test it. Do you have any recommendations (even better, test files and a way of telling whether the results are as expected)? Also, can you recommend the best written material on “this is the current way materials should be specified in OBJ files”?

When making the patch I quickly found out that the MTL ecosystem is a mess: honestly, nobody has really followed the standard for over 25 years, and broken materials became the norm. So, in crafting my patch to modernize MTL with the proposed PBR extension (to support most of the Principled BSDF), I tried to follow the intention of the 1995 spec as the most canonical version of the format. Where there was a need to expand the format (e.g. introducing subsurface scattering), I carefully documented and proposed a solution which other implementations could follow. It felt like the best way forward was to put your foot down and hit the reset button: anything found in the wild outside the spec would require the user to enable compatibility flags on the importer. The goal was to generate valid MTL files that encode a nice fallback for apps that don’t support PBR, and never to produce malformed output just to accommodate bad behavior from other apps.

For example: MTL has no normal map support without the formally proposed PBR extension keyword “norm”, so everyone had a different solution; the most popular one crammed the normal map into the “bump” property. This is wrong: “bump” is reserved for grayscale bump maps. It is just one of many inconsistencies from countless exporters and hand-written MTL files that have piled up over the years, many of them poorly documented. As a result, I set out to download every file I could and try to figure out each one’s specific issue. Very old versions of Maya swapped the ambient and diffuse values, and for a while this became standard practice… that got fixed at some point. Houdini just threw in the towel and didn’t support MTL at all. Blender as of 2.8x is a serious offender in this department, as it shoved metalness into the slot intended for reflection cubemaps, mangled ambient values, etc., in order to quickly transition to PBR without jumping to the PBR extension.
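A minimal illustration of the distinction, as a hedged sketch of one material using the proposed PBR extension keywords (“norm”, “map_Pr”, “map_Pm”); the material name and filenames are made up:

```
newmtl painted_metal
Kd 0.80 0.80 0.80
map_Kd painted_metal_albedo.png
# Grayscale height information belongs in "bump" per the original spec:
bump painted_metal_height.png
# A tangent-space normal map uses the proposed PBR extension keyword instead:
norm painted_metal_normal.png
map_Pr painted_metal_rough.png
map_Pm painted_metal_metal.png
```

Apps that ignore the PBR extension still read a sensible diffuse fallback from the `Kd`/`map_Kd` lines, which was the whole point of never emitting malformed output.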

When working on my patch I compared the generated files against a test bench I made using the tinyobj importer, as this is the most commonly used up-to-date C++ OBJ importer with the widest feature set. For testing the import of OBJ/MTL files I just downloaded a bunch of OBJ files off the internet (here is a nice grab bag) and messed around with other software to inspect their output for weirdness. Not pleasant, but I gained a decent amount of insight from it.

I’ve since been working more on geometry nodes (volume point distribution), but if you would like any further feedback on working with MTL, I would be glad to offer whatever help I can on blender.chat.

Thanks for the detailed response. I may take you up on the offer of a blender chat some time in the future.
