I was talking with the people in the xr channel on blender.chat about controller support in the mixed reality branch. I am interested in developing for Blender but have little experience with the Blender source, so there are probably a lot of newbie questions. I figured they are better placed here.
I got a custom controller representation working with gizmos in Blender, but it does not show up in VR. As @muxed-reality explained to me, this is due to the missing context in the offscreen draw loop. I tried registering the draw function of my gizmo with draw_handler_add in the setup function of my gizmo group:

args = (self.gizmo, context)
self._handle = bpy.types.SpaceView3D.draw_handler_add(self.gizmo.draw, args, 'XR', 'POST_VIEW')

but this raises "TypeError: draw() takes 2 positional arguments but 3 were given". This was probably a little naive.
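My understanding of the TypeError: draw_handler_add simply invokes the callback as callback(*args), and since self.gizmo.draw is a bound method, Python already supplies the gizmo as self, so my extra argument makes three. Here is a minimal plain-Python sketch (no bpy, class and helper names made up for illustration) that reproduces the mismatch:

```python
class Gizmo:
    # Stand-in for bpy.types.Gizmo: once draw is accessed as a bound
    # method, it only expects one further argument (context).
    def draw(self, context):
        return f"drawing with {context}"

def call_handler(callback, args):
    # Simplified model of how a draw handler invokes its callback.
    return callback(*args)

gizmo = Gizmo()

# Passing (gizmo, context) to the bound method fails: bound draw()
# already carries self, so it receives 3 arguments instead of 2.
try:
    call_handler(gizmo.draw, (gizmo, "ctx"))
except TypeError as exc:
    print(exc)  # draw() takes 2 positional arguments but 3 were given

# Passing only (context,) matches the bound method's signature.
print(call_handler(gizmo.draw, ("ctx",)))  # drawing with ctx
```

So presumably the args tuple should contain only the context, not the gizmo itself.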
Another solution would be to just add controller objects to the scene and update their positions in Python. Would one write a modal operator for this? If so, what would be the event that starts the operator?
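For context, this is roughly the modal-operator approach I have in mind: an untested sketch that assumes a timer event drives the updates, that the scene contains objects named "ControllerL"/"ControllerR" (made-up names), and that XrSessionState exposes grip poses via controller_grip_location_get; I'm not sure that last part applies to this branch.

```python
import bpy

class VIEW3D_OT_xr_controller_sync(bpy.types.Operator):
    """Sketch: copy XR controller poses onto two scene objects."""
    bl_idname = "view3d.xr_controller_sync"
    bl_label = "Sync XR Controller Objects"

    _timer = None

    def modal(self, context, event):
        if event.type == 'TIMER':
            session = context.window_manager.xr_session_state
            if session is None:
                # XR session ended, stop updating
                self.cancel(context)
                return {'CANCELLED'}
            for idx, name in enumerate(("ControllerL", "ControllerR")):
                obj = context.scene.objects.get(name)
                if obj is not None:
                    # Assumed API: grip-pose location per controller index
                    obj.location = session.controller_grip_location_get(context, idx)
        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        self._timer = wm.event_timer_add(1 / 60, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

    def cancel(self, context):
        context.window_manager.event_timer_remove(self._timer)
```

What I don't know is how to kick this off automatically when the VR session starts, hence the question about the triggering event.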
The last solution would be to create custom models in wm_xr_draw_controllers(), but this seems a little hacky.
As I understand it, the preferred option would be to use the XR_MSFT_controller_model extension, but that only works for Windows Mixed Reality at the moment. Did I miss anything?
I apologize if these are really basic questions, but I’m also willing to learn.