XR Actions branch question

As far as I know the XR Actions branch is still being developed. I tested it and it seems like a great start, however setting up basic actions right now is kind of messy, not because the system is bad, but because it takes a lot of time to set up basic things.

Is there some config file with basic actions already set up, like teleport/movement, move/rotate/scale, or accessing menus (if that last one is even possible)?

I’m not sure who to mention here, so allow me to mention you, @dfelinto, to forward this to whoever is responsible, or in case you know the answer :slight_smile:

Thanks.

I do not think we should require this kind of setup. We could allow custom action sets via the preferences or as developer UI. But the default experience should “just work” and already have the supported actions (selection, transformations, navigation, haptic feedback, …) preconfigured for the different OpenXR compatible controllers.

@julianeisel I think I expressed the question in the wrong way.

My question is precisely whether the current branch has some kind of defaults, because I tried it and it has no defaults at all. That’s why I asked: there are no preconfigured actions in the branch :slight_smile:

Should I see default actions already there?

I didn’t look at that part of the patch recently, but AFAIK there are no such defaults. But what I’m saying is, there should be, at least if this goes into a release.

@muxed-reality will know more :slight_smile:

I agree, those defaults should be present or it will be discouraging for the users :slight_smile:

I’ll wait until such defaults are present, because right now it would be super time-consuming to configure anything by hand :slight_smile:

Thanks for your feedback. I agree that default actions should be present and will be necessary for a release.

I didn’t get around to adding any defaults yet but will do it in the future. It should just involve extending the add-on (no changes necessary to Blender itself).

I’ve been enjoying playing around with this build, and was able to make some basic actions. I am having an issue with something I’m sure is a simple mistake on my part. I’m trying to make an action that makes the thumbstick left/right control frame backward/forward respectively.

With the setup in the attached image, both left and right directions advance one frame forward. What do I need to change so thumbstick pressed left goes a frame backward and thumbstick pressed right goes forward?

Thanks for your feedback, and glad to hear that you are enjoying the branch so far.

Actually it is not a mistake on your part but an oversight on mine. In the future I will add the option to treat the positive/negative axes of a thumbstick/trackpad as separate buttons, but currently I am busy making changes for the patch review.

In the meantime, you can either use separate inputs or create a custom operator and assign it to the thumbstick axis. Then in the custom operator’s invoke() function you can listen for an event of type XR_ACTION that will be received when the action is triggered. This event will have an “xr_state” property, which gives the float value (-1.0 to 1.0) of the input and you can use this to call the screen.frame_offset operator with different arguments on execute().
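
For illustration, a minimal sketch of such an operator could look like this (untested; the operator name is made up for this example, and the exact event attributes are assumed from the description above):

import bpy

class XRFrameOffsetOperator(bpy.types.Operator):
    bl_idname = "screen.xr_frame_offset"
    bl_label = "XR Frame Offset"

    def invoke(self, context, event):
        # The XR action event carries the input's float value in
        # event.xr_state (-1.0 to 1.0), as described above.
        if event.type == 'XR_ACTION':
            delta = 1 if event.xr_state > 0.0 else -1
            bpy.ops.screen.frame_offset(delta=delta)
        return {'FINISHED'}

bpy.utils.register_class(XRFrameOffsetOperator)

You would then assign screen.xr_frame_offset to the thumbstick action instead of screen.frame_offset.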

The xr-actions branch now has default actions, since they were requested in the patch review.

However, these are not present in the old 2.92 experimental build so for now you will need to build the branch from source to use them.

Hi @muxed-reality! Thanks a lot for the effort! I’m super excited about this add-on. I have tried the compiled version, setting up some simple actions, and I’m super hyped about the possibilities :astonished:. For this reason I have compiled the xr-actions-D9124 branch; the procedure finishes OK and Blender runs perfectly, but I can’t see the default actions :confused: I’m pretty sure I’m doing something wrong, but I don’t know what it could be.

- To build this version, is the correct procedure to check out the xr-actions-D9124 branch and run “make full”?
- Is it necessary to do anything more inside Blender?

Thanks a lot for any help

Hi @Santox, very sorry for the late reply. Didn’t notice your message until now.
If you are building from source, in addition to checking out the xr-actions-D9124 branch of blender, you also need to check out the xr-actions-D9124 branch of the addons submodule.

Alternatively, there is now an updated build for 2.93 here: https://builder.blender.org/download/branches/

Also, here are a few steps to get started:

Hi @muxed-reality! Thanks a lot for the answer. Yes, some weeks ago I found the new compiled xr-actions 2.93 version and downloaded it. I have only played with it a little, but I’m absolutely amazed by the possibilities. Currently I don’t have much time, but I think in a couple of weeks I will have more, and my intention is to check that version in depth. Thank you very much for your work. I will be back here to report…

Greets!

I just tested the latest build of the XR Actions branch.

I’m impressed by how well it works!

@muxed-reality

I used it with an Oculus Quest 2 with Oculus Link (with cable) at 120Hz and it’s a pleasure!

I miss having at least the tools at hand, and at the very least being able to enter other modes and use the selected tool to do some basic sculpting; that doesn’t seem to be possible yet, am I right?

Anyways, WELL DONE! :slight_smile:

Thanks! Glad that you enjoy it :slightly_smiling_face:

Yes, you’re correct that using the tools and modes other than Object Mode is not currently supported. The focus for the time being is scene navigation, with some basic object selection/transform for scene arrangement applications.

However, once all this is (hopefully) incorporated into master, the next milestone for the Virtual Reality project will be supporting the paint/sculpt/draw tools, at least at a basic level.

That’s awesome :smiley:

Thanks!

Hi @muxed-reality and other VR Blender enthusiasts.

I am using the current XR Dev branch build to prototype a VR collaboration session.
It’s all about the simplest possible navigation for our users.

Therefore I placed:

  • 6 landmarks around the car (standing positions)
  • 4 landmarks inside the car (seat positions)

The landmarks are populated with avatars sitting inside and standing around the car, to provide a visual representation and a raycast target…
With the Vive controllers the user can now point at an avatar and gets teleported to the standard positions, around the car and inside the car.
This works great and is the easiest navigation for first-time users.
It also provides standard teleport positions for discussing the car design from predefined locations.

Question 1:

All the avatars are inside one collection, and I need to make them visible only while pulling the Vive trigger.
Pull trigger: the collection with avatars is set visible.
Release trigger: teleport to the selected landmark avatar and set the collection with avatars invisible again.

Do you perhaps have a minimal script example of how I could set the visibility of a collection via a Vive trigger-pull action (index finger on the controller)?
I have tried hard.

Question 2:

Using the motion tracking, I attached simple Suzannes to both hands and the head,
so that the user is visible in the scene.

Then I use Ubisoft Mixer to connect two Blender sessions.
Starting VR on both scenes.
Now the avatars of the user connected via Mixer (Suzannes at the hand and head positions) should be visible and move, so that you can see where your collaboration partner is placed in the scene and discuss the interior and exterior design. The hand visualization alone is very helpful here to see what the other user is pointing at.

This is the simplest VR collaboration mode: see where your connected partner is and where they are pointing with their hands. Speech is delivered via an MS Teams or Skype session.

Is something like this planned to be supported by default on your side?
Ubisoft Mixer is currently not maintained anymore.
I am just searching for a maintained connection implementation with this minimal sync of hand and head positions.
Any tips on how to achieve this would be great.

Thanks.

Hi, thanks for trying out the XR Dev branch.

Here’s a minimal operator for setting collection visibility:

import bpy

class ShowCollectionOperator(bpy.types.Operator):
    bl_idname = "scene.collection_show"
    bl_label = "Show (or Hide) Collection"

    name: bpy.props.StringProperty(
        name="Collection Name",
        default=""
    )
    show: bpy.props.BoolProperty(
        name="Show",
        default=True
    )

    def execute(self, context):
        col = bpy.data.collections.get(self.name)
        if col is None:
            return {'CANCELLED'}

        col.hide_viewport = not self.show
        return {'FINISHED'}

bpy.utils.register_class(ShowCollectionOperator) 

After running this in the script editor, load the default VR action maps, then copy the teleport action. For the new action, replace the operator with scene.collection_show and change a few more properties as shown below:

Copy this new action (teleport_show_collection) and repeat for the hide collection case:

Note: The names of the new actions are intentional, since collection hiding needs to happen after the teleport raycast. To ensure this, just prefix the new names with the action they should follow (in this case teleport).

In response to your second question, there currently aren’t any plans to support networked VR collaboration in Blender natively. However, if any developer would like to contribute, it would be more than welcome.

Thanks a lot for this detailed help and the code snippet.
I will upload the project with the avatar-based teleport to GitHub in the next few days.

I will also try to sync the controller and head positions per Blender session via Multi-User from Swann Martinez.

I use the EEVEE Racer scene from katarn66 and integrated some controllers and a headset as Blender Assets with CC Attribution.

Here is a little simulated preview of how the VR collab should look when finished.

The green user sees the yellow user (head position and both hand positions) in Blender 1.
The yellow user sees the green user (head position and both hand positions) in Blender 2.

Hi @slumber,

With the new motion tracking feature I am now able to move the head position and hands in a VR session.

I also did some initial tests to sync two or more users.

Could you give me some tips on the easiest and most stable way to sync the head and both hand positions with your Multi-User add-on?

Here is the simulation to illustrate what would be needed…

Blender 1 (yellow): head / left hand / right hand (master)
Blender 2 (green): head / left hand / right hand (joined the session)

Only the positions of the head / left hand / right hand should be synced.
All other Multi-User features could be deactivated.
In the current Multi-User implementation only the camera wireframe is synced.
I set the delay to 0; the synced movement speed would be fast enough.

Is there an easy way to add the positions of my head / left hand / right hand to the sync?

Hi!
Thank you very much for exploring collaborative VR in Blender and for your feedback on the initial tests!

The multi-user python API provide a function update_user_metadata(repository, the_custom_data_dictionnary) to sync any user-related metadata (code example).

Once published by a client, these data can then be read on each client to from the field session.online_users[usernames].medatata[your data fields](code example)

Since the multi-user add-on doesn’t have dev documentation yet, I can help you implement it. Can you tell me precisely which data fields to synchronize? (Would the viewer_pose_location, viewer_pose_rotation, and controller_location attributes be enough?)
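
For illustration, a rough sketch of publishing and reading those fields could look like this (untested; the import path, the repository handle, and the iteration details are assumptions; only update_user_metadata() and session.online_users come from the description above):

import bpy
# The import path is an assumption; adjust to wherever the multi-user API lives.
from multi_user import session

def publish_xr_pose(repository):
    # Read the local head pose from Blender's VR session state.
    xr = bpy.context.window_manager.xr_session_state
    if xr is None:
        return
    metadata = {
        'viewer_pose_location': list(xr.viewer_pose_location),
        'viewer_pose_rotation': list(xr.viewer_pose_rotation),
    }
    # Publish the data so other clients can read it.
    session.update_user_metadata(repository, metadata)

def read_remote_xr_poses():
    # Each connected client exposes the metadata it last published.
    for name, user in session.online_users.items():
        print(name, user.metadata.get('viewer_pose_location'))

The controller positions could be published the same way once the corresponding attributes are settled.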

This could be part of an experimental branch of the multi-user :slight_smile: