Ubisoft Animation Studio - VR development discussion

Introduction

Ubisoft Animation Studio (UAS) focuses on production, previz and artistic workflow research. To boost the creativity of our artists, we are developing ground-breaking tools such as VR applications.

Context

Today we have a fully functional VR app in Unity. Users can scout the scenes, and place and animate cameras. A scene can also be built from scratch using primitives and Blender meshes. Finally, we have developed a 3D paint module. Everything is connected to Blender via a network bridge, a Python add-on we developed to synchronize Blender with other DCCs (Unity in our case).

We chose this solution because, back in early summer last year, VR development in Blender was just beginning and we had no certainty that it would be ready for our March deadline.

Now things have evolved and VR in Blender is officially supported. We are re-evaluating what it would mean for us to port our Unity VR software to Blender. We need to know whether Blender gives us all the tools we need for custom UIs, collision with objects, etc.

We have a small R&D team (@Plop @grosglob @nfauvet) currently dedicated to VR previz tools, but with only a small time budget for the Blender exploration. We are not full-time on this (yet).

Prototype

What we have prototyped:

  • Oculus Touch controller support: we can read the pose of the two controllers, as well as the joystick and button states. We have exposed this in Python.

  • World transformation: we inject an intermediate “world” matrix into the view matrix to be able to move, rotate and scale the world relative to the view.

  • We have implemented a “bi-manual” navigation mode: grabbing the world with the two controllers, we can move/rotate/scale it naturally (see the sketch after this list). This is done in another Python add-on we developed (to avoid modifying the current one for the moment).

  • We have fixed issues with the Oculus (already merged in 2.83)

  • We have fixed an issue with the culling algorithm when using asymmetric frustums (for VR).
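To give an idea of the math behind the bi-manual grab, here is a minimal sketch of one way the per-frame delta transform can be computed with mathutils. This is a simplified illustration under our own assumptions, not the exact code of our add-on:

```python
from mathutils import Matrix

def bimanual_delta(prev_l, prev_r, cur_l, cur_r):
    """Compute a delta transform from two controller positions.

    prev_*/cur_* are mathutils.Vectors: the left/right controller positions
    at the previous and current frame. Returns a 4x4 Matrix to combine with
    the injected "world" matrix.
    """
    prev_mid = (prev_l + prev_r) * 0.5
    cur_mid = (cur_l + cur_r) * 0.5

    prev_axis = prev_r - prev_l
    cur_axis = cur_r - cur_l

    # Scale: ratio of the distances between the two controllers.
    scale = cur_axis.length / max(prev_axis.length, 1e-6)
    # Rotation: quaternion that maps the previous axis onto the current one.
    rot = prev_axis.rotation_difference(cur_axis)

    # Rotate/scale around the previous midpoint, then translate to the new one.
    return (Matrix.Translation(cur_mid) @
            rot.to_matrix().to_4x4() @
            Matrix.Scale(scale, 4) @
            Matrix.Translation(-prev_mid))
```

The resulting matrix is combined with the injected world matrix each frame while both grips are held.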

Features needed

  • Controller support: today our company uses the Oculus Quest and Rift S, so we need to support the Oculus Touch devices. We also need to design a generic system that can be extended to other devices.

  • World manipulation: we need to navigate (translate/rotate/scale) inside our 3D scenes, without adding any explicit parent object in the scene.

  • 3D manipulators: we need to draw custom gizmos and objects without affecting the scene (controllers, UIs, etc.).

  • Intersection/Collision testing: in VR, we found that the best way to grab objects is to touch them (no “ray”), therefore we have to find a way to test intersections between the controllers and scene objects. We also need ray collision/intersection for selecting unreachable objects.

  • Real-time physics: it would be great for manipulation if we could, for example, throw objects around to quickly populate a scene, in real time. We also need physics for a few navigation modes (FPS, teleport).

  • Real-time particle fx: We need to be able to see smoke, fire, rain in real-time for previsualization purposes.

  • Render-to-texture: We need realistic camera feedback displayed in place in the scene, for multiple cameras.

  • Tools: Grease Pencil in VR, i.e. the ability to paint in 3D.

  • Performance: We need to be able to navigate medium-sized, unoptimized scenes in real time, at VR frame rates (70-90 Hz).

If all these conditions are met, we will probably dive into Blender full time.

Questions

We have started prototyping the following items in Blender:

  • We implemented the low-level OpenXR controller manager (Oculus only) and wrapped it in Python. We are not sure we did it the right way: the Python wrapping seems very verbose and tedious. Is there a faster way to expose internal properties to the Python level?

  • To implement world manipulation, we created a “fake” world transform that corresponds to a world matrix combined directly with the camera view. We stored its components in wmXrSessionState, which may not be the best place, and we also needed to initialize it somewhere.

  • For 3D manipulators, we have seen examples where drawing is done in Python add-ons using OpenGL (see the sketch after this list). Is this good practice? Can we use Blender’s Workbench/Eevee engines to draw Blender objects/meshes without adding them to the scene, or do we have to implement our own little OpenGL render engine to draw custom meshes?

  • What can we use to traverse the scene and test intersections? We saw that there is a raycast function. Is it fast enough to intersect with the whole scene every frame? We also saw that the Bullet Physics 2 library is present in the external dependencies. Is there a physics representation of the whole scene somewhere, or do we have to build it all ourselves?

  • Does Blender have facilities to render a camera to a texture in real time?

  • BUG: We noticed that Grease Pencil does not behave correctly when we scale the model-view matrix: the strokes seem to keep their screen-space dimensions, becoming huge when we scale the world down.
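Regarding the Python/OpenGL drawing mentioned in the 3D-manipulators item above, here is a minimal sketch of the kind of approach we have seen, using the gpu module and a viewport draw handler (the coordinates and color are made up):

```python
import bpy
import gpu
from gpu_extras.batch import batch_for_shader

# A made-up wireframe "controller" gizmo: three short axis lines at the origin.
coords = [
    (0, 0, 0), (0.1, 0, 0),
    (0, 0, 0), (0, 0.1, 0),
    (0, 0, 0), (0, 0, 0.1),
]

# '3D_UNIFORM_COLOR' is the built-in shader name in the 2.8x API.
shader = gpu.shader.from_builtin('3D_UNIFORM_COLOR')
batch = batch_for_shader(shader, 'LINES', {"pos": coords})

def draw():
    shader.bind()
    shader.uniform_float("color", (1.0, 0.5, 0.0, 1.0))
    batch.draw(shader)

# Draw on top of the scene in the 3D viewports, without adding any object.
handle = bpy.types.SpaceView3D.draw_handler_add(draw, (), 'WINDOW', 'POST_VIEW')
# Later: bpy.types.SpaceView3D.draw_handler_remove(handle, 'WINDOW')
```

This draws on top of the scene without adding any object, but we do not know yet whether it is the recommended way, or whether it carries over to the VR view.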

@julianeisel This is it. What do you think about it? Can you shed some light on our questions?

Update:

We created a new branch called “xr-world-navigation” in the master repository and also in the add-ons repository (we modified the viewport_vr_preview.py add-on).

It contains what I called the prototype: bi-manual world navigation using the Oculus Touch controllers.

Happy to contribute.

Daily Win64 build available on Graphicall.

Thank you very much for providing this daily build; it will be very helpful for gathering feedback.

To access the new features, we used the current “VR Scene Inspection” add-on.

In this first build, we implemented:

  • A standard world navigation system in VR, accessible by holding both Oculus Touch grip buttons.

  • Oculus Touch controller support. Controller data is accessible in Python through the bpy.context.window_manager.xr_session_state structure. Note that the VR session has to be started first.

Example: trigger = bpy.context.window_manager.xr_session_state.left_trigger_value

  • World manipulation through bpy.context.window_manager.xr_session_state.world_location, .world_rotation and .world_scale (see the usage sketch below).
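As a usage sketch, assuming the VR session is running and assuming these prototype properties are writable from Python, a script can poll the controllers and nudge the world transform like this:

```python
import bpy

state = bpy.context.window_manager.xr_session_state
if state is not None:
    # Left trigger value exposed by the prototype branch (0.0 to 1.0).
    trigger = state.left_trigger_value

    # World transform exposed by the branch: location, rotation, scale.
    loc = list(state.world_location)
    if trigger > 0.5:
        loc[2] += 0.1  # nudge the world up by 10 cm while the trigger is held
        state.world_location = loc  # assuming the property is writable
```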

Hey Nick!

Check out the work the UPBGE team has done!

Most of your wishlist is already done :smiley:

Somebody on BlenderArtists indicated that you guys might be interested in multi-user editing as well, so I figured I’d point out Verse 2.0:

https://verse.github.io/

Back in the day, before Blender 2.5x, there were functional builds of verse-blender that users could use to collaboratively edit the same scene in Blender over the internet, as long as separate objects were edited.

> Real-time physics: it would be great for manipulation if we could, for example, throw objects around to quickly populate a scene, in real time. We also need physics for a few navigation modes (FPS, teleport).

Blender’s physics is tied to its animation clock. You’d need to decouple them to make this happen. There’s a task for this already, but it’s pretty far off (since Everything Nodes is being developed, with particle systems as the first target).

You probably already know about this, but if you haven’t seen it, you should look at the documentation for the upcoming physics system: https://wiki.blender.org/wiki/Source/Nodes Here’s the task.
I think you’ll want to contact both Jacques Lucke and Dr. Sybren Stüvel for this, since it touches on both of their modules. I believe it is Sebastián Barschkis who owns Physics.

I’m sure you already knew about this, but I hope this is helpful either way.

Thanks Jacob.
I checked it out very quickly. It seems to be a fork of an old version of Blender, coming with a new version of the BGE. So, if I guess correctly, it does not work with the current version of Blender and comes with its own Blender version.
We are working with artists who build scenes for CGI; we are not trying to build a game (well, the VR part feels like a game).
What we would like is to have these features in the latest version of Blender, and to be able to upgrade to the next version of Blender as soon as the art department decides they need it.
From my point of view, I don’t think UPBGE is the right tool for us.

Thanks, we’ll check it out :slight_smile:
It could really help our sync efforts.

Thanks, Joseph, for all the info.
We know we are asking for a lot on this topic. Real-time physics is a game-engine feature, and Blender is not a game engine. It simulates physics quite fast, but often has to cache results, and, like you said, it is tied to the timeline.
We’ll keep an eye on the future physics system. In the meantime, I think we’ll just have to use raycasts for the UI, from Python, if anything like that is available.

There might be other ways around that in the meantime. A few years ago, this gem was presented at the Blender Conference (fast forward to ~12:16). The artist in this video created an interactive cloth simulator entirely in Python!

Unfortunately, he doesn’t share any details about the code, except that he used NumPy to accelerate it.

Python does have raycasting features. ray_cast() is a method of the Scene object: https://docs.blender.org/api/current/bpy.types.Scene.html#bpy.types.Scene
Without doing some research, I can’t tell you how to get the matrix of the current viewport. In a pinch, you can lock the camera to the view and use the camera’s matrix.
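As a rough sketch of what that could look like from a VR controller (the origin and direction are placeholders you would get from your own tracking code, and the exact ray_cast signature depends on the Blender version):

```python
import bpy

def pick_object(origin, direction, max_dist=100.0):
    """Cast a ray into the scene and return (object, hit_location), or (None, None).

    origin and direction are world-space mathutils.Vector values, e.g. taken
    from a VR controller pose (placeholders here).
    """
    scene = bpy.context.scene
    # In 2.8x the first argument is a view layer; newer versions (2.91+)
    # take an evaluated depsgraph instead.
    hit, location, normal, face_index, obj, matrix = scene.ray_cast(
        bpy.context.view_layer, origin, direction.normalized(), distance=max_dist)
    return (obj, location) if hit else (None, None)

# Hypothetical usage with a controller pose:
# obj, hit = pick_object(controller_location, controller_forward)
```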

Thanks for the pointer to the raycast, we’ll probably use it.
When I talk about physics, there are two underlying concepts (and I should not use the word “physics” for both):

  • collision detection / raycasts, used for UI and Object/controllers collisions
  • physics simulation to populate scenes by dropping objects, and navigation modes requiring real-time collision detection of rigid bodies.

I think we’ll have difficulties with the simulation part.

But the collision problem could already be solved with the tools in place. If there is a raycast, there may be an acceleration structure over the whole scene to speed up the process.
The same structure could be used to test intersections between the controller meshes and the scene meshes.
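For the controller-versus-mesh case, here is a minimal sketch of what we have in mind, building BVH trees in world space with mathutils and bmesh (the object names are just examples):

```python
import bpy
import bmesh
from mathutils.bvhtree import BVHTree

def world_space_bvh(obj, depsgraph):
    """Build a BVH tree of an object's evaluated mesh, in world space."""
    bm = bmesh.new()
    bm.from_object(obj, depsgraph)
    bm.transform(obj.matrix_world)  # move the geometry into world space
    tree = BVHTree.FromBMesh(bm)
    bm.free()
    return tree

depsgraph = bpy.context.evaluated_depsgraph_get()
# Example object names: "Controller" would be a mesh following the Touch
# controller pose, "Suzanne" any scene mesh we want to grab.
controller_tree = world_space_bvh(bpy.data.objects["Controller"], depsgraph)
target_tree = world_space_bvh(bpy.data.objects["Suzanne"], depsgraph)

# overlap() returns pairs of intersecting face indices; non-empty means "touching".
touching = bool(controller_tree.overlap(target_tree))
```

We would still have to measure whether rebuilding the trees every frame is fast enough at VR frame rates.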

Anyway, I’ll take a look at the raycast, thanks again.

@nfauvet you can check how Animation Nodes works; it has its own execution loop and is not tied to the animation timer, or at least it seems so.
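For what it’s worth, a plain modal operator driven by a window-manager timer gives a similar update loop that is independent of animation playback; a minimal sketch (the operator name and tick rate are arbitrary):

```python
import bpy

class WM_OT_vr_tick(bpy.types.Operator):
    """Update loop that runs independently of animation playback (sketch)."""
    bl_idname = "wm.vr_tick"   # arbitrary example name
    bl_label = "VR Tick (modal timer demo)"

    _timer = None

    def modal(self, context, event):
        if event.type == 'ESC':
            context.window_manager.event_timer_remove(self._timer)
            return {'CANCELLED'}
        if event.type == 'TIMER':
            # Per-tick work goes here: poll controllers, run a small sim, etc.
            pass
        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        self._timer = wm.event_timer_add(1 / 72, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

bpy.utils.register_class(WM_OT_vr_tick)
```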

Hi,
I think a lot of the features you are requesting will be part of a future Blender interactive engine. At the moment they are not available, but you can mitigate this with:

  • Animation Nodes, which, as pointed out before, has its own execution loop
  • UPBGE. The current version under development (0.3 alpha) works with the latest version of Blender, uses Eevee as its renderer, has real-time physics, render-to-texture, Grease Pencil rendering, intersection/collision tests and SDL controller support, and makes it easy to build a navigation template. You can grab it from here: https://mega.nz/folder/t9EEFSaS#JPiOPSInCZyU-SW_-rhEOQ

Having said that, I apologize for stating the above, since I believe that a site for the development of Blender is not the place to promote another program.

In case you haven’t discovered it, this helps with rendering to a texture:

https://docs.blender.org/api/blender2.8/gpu.html#rendering-the-3d-view-into-a-texture
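Roughly, the approach from that page boils down to the sketch below: render the scene from a camera into a GPUOffScreen inside a draw handler (the resolution and the use of the active scene camera are just placeholders):

```python
import bpy
import gpu
from gpu_extras.presets import draw_texture_2d

WIDTH, HEIGHT = 512, 256   # placeholder resolution
offscreen = gpu.types.GPUOffScreen(WIDTH, HEIGHT)

def draw():
    context = bpy.context
    scene = context.scene

    # Render the scene from the active camera into the offscreen buffer.
    view_matrix = scene.camera.matrix_world.inverted()
    projection_matrix = scene.camera.calc_matrix_camera(
        context.evaluated_depsgraph_get(), x=WIDTH, y=HEIGHT)
    offscreen.draw_view3d(
        scene, context.view_layer, context.space_data, context.region,
        view_matrix, projection_matrix)

    # Blit the result into the viewport as a 2D image; in practice the
    # offscreen.color_texture handle can be used wherever the camera
    # feedback is needed.
    draw_texture_2d(offscreen.color_texture, (10, 10), WIDTH, HEIGHT)

handle = bpy.types.SpaceView3D.draw_handler_add(draw, (), 'WINDOW', 'POST_PIXEL')
# Later: bpy.types.SpaceView3D.draw_handler_remove(handle, 'WINDOW')
```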

I hope your evaluation will lead you to Blender.

Thank you!
This is exactly the kind of thing we are looking for.

Thanks, in my quick overview I had not looked at the alpha, so I only saw that it was based on Blender 2.79.
There are certainly things to pick up from it if we have to handle everything ourselves in C.
