GSoC 2020: OpenXR Actions (VR interaction) Weekly Reports

  • I added a default action set (and some other things that go along with it). There is admittedly very little documentation on all things OpenXR, so this was harder to set up than I’d have liked, but I think I’ve done it properly.

  • Experimented with Python support and with doing some basic things with the action set. None of this is in the latest commit; I need to figure out how to do this properly, not just for testing.

  • Dealt with an unfortunate bug. For some reason, I started getting serious stuttering in VR every few seconds. As it turns out, this was not something I introduced (I had suspected a memory issue of some kind) and also happens in another application. It seems to be an Nvidia driver issue where the GPU thinks it’s not being utilised and repeatedly enters and leaves a power saving state. I still don’t know why it happens but I’ve found a workaround.

If anyone else has experienced this, please let me know, as Googling hasn’t given me any help.

  • I added some debug code that isn’t quite up to par to try
    and track this bug down (should this be committed in some form?).

I think next week will be more productive. I have a much greater understanding of the GHOST helper library, the wm/xr code, and how the parts fit together, though I need to talk with my mentors and other experienced Blender contributors about the right way to do things.


Week 2

  • Created a C++ API / abstraction for interacting with the OpenXR action system and a system for managing the various aspects of OpenXR (e.g. handles).

At present, actions and action sets are stored in maps with unique string IDs. For now, there is always a default action set with the ID “default_action_set”. One important consideration is that at present a “GHOST_XrAction” does not map one-to-one to an “XrAction”, since there is a separate “GHOST_XrAction” for each subaction as well. I am considering simply not using subactions as a potential alternative.

Various things can be done from the API (creating actions, getting action state, action spaces, creating haptic feedback, …).

Some other things:

  • Created a system for managing and applying bindings

  • Finished creating the default action set

  • Worked on the C API

Next Steps

  1. Finishing the C API
  2. Making some important design decisions I’m still unsure about (with input from more experienced developers)
  3. Using the C API inside Blender, with the goal of drawing controllers

Short update, since I haven’t gotten the graphics stuff working yet. That should be done very soon, though.

Week 3

  • Did some rewriting
  • Fixed a lot of issues with the C++ GHOST API.
  • Got the C API working (the current commit only has one function enabled, though).
  • Worked on a system for managing bindings for various device types (needs testing, since I only have access to an Oculus headset).
  • Some other stuff
  • Worked on the graphics (should have a build up soon that uses immediate mode drawing).

Next Steps

  • Getting controllers to draw (should be done very soon, and with a build that people can test).
  • Implementing some basic controls.

Meant to post this earlier, but twice it seems to have stayed a draft.

Week 4

  • Misc changes to C++ and C GHOST API.
  • With some effort, got basic rendering of a placeholder shape in the position of the controllers to work.
  • Close to getting a more advanced rendering system to work.

I’m having a weird OpenXR bug that I can’t yet fix, which delayed the promised build, but ideally there will be a testable build today.

Next Steps

  • Finish off controller rendering (part of this is to figure out a way to render the controller with geometry).
  • Start work on either the Python API or actual functionality for the controllers.


Week 5

Quick update.

I’m still working on the more advanced graphical stuff, and on fixing another OpenXR issue, but I’ve finally committed the build with the basic controller rendering demo if anyone wants to test it out.

I’ve received feedback from my mentors that I haven’t been committing enough and my weekly reports have technically been late, so the next weekly report will be in around four days (Sunday Australia time). By then I expect to have completed the rest of the controller rendering task, and can do a better weekly report. As such I’m leaving this weekly report a bit short.

– Peter Klimenko


Hi, I’m very interested in your project! I regularly use the BlenderXR version from Marui, but the ideal solution would be direct integration in Blender, which has not yet been usable without controllers.

I’d like to test. Where do I find your builds?

Hi. I was depressed and sick with what I thought was coronavirus (“just” bronchitis), which led to a ~2-week lapse in communication. I was committing but not pushing, which led to people losing faith in my project. This was my fault, but I was still working hard on my GSoC project.

Progress Report #1 - Milestone 1 (Controller Drawing)
I’ve gotten controllers to draw, with a line extending along the orientation. This took over two weeks of work, with countless bugs and annoyances to deal with. Getting anything to draw at all took a while, and then I had to sort out issues with my transformations. Once something was finally drawing, though, the rest was fairly easy.

Progress Report #2 - Milestone 2 (Python API)
Lately, I’ve been working on the Python API. I’ve been looking at RNA (creating RNA structs for the various GHOST_OpenXr objects), creating functions, and making a custom app handler specifically for OpenXR session creation (a post-creation event).

Learning how the RNA aspect of Blender works has been a particular challenge, but I feel I’ve been very productive over the past two weeks. I’ve understood how all the parts fit together, and I think I’m about halfway through milestone 2.


Glad to see you are back!

Hi Peter, thanks for the development you are doing, and of course to the rest of the developers.
Finally I was able to get my “HTC VIVE PRO EYE” working with the Blender VR inspector.
I wanted to tell you that I am at your disposal if you need to run tests on this device; just tell me what to try or do.
Interestingly, the HTC VIVE PRO EYE has “eye tracking” technology, which the rest of the VR devices are starting to implement.
It uses SRanipal as its software, which may be useful for your development in the future. So I am at your entire disposal to do the tests you need.
A big hug!!!


What is the status on this?