Are there enough features in the API to support 3D haptics devices?

There are a few of these devices. They provide not only 3D input but also force feedback and haptics.

[image: omega.6 haptic device]

video example: (embedded video not preserved)

While these systems are a bit different mechanically, the output data they provide and the input they expect over USB are the same.

Sculpting programs that use VR controllers typically shoot a virtual laser beam from the controller onto the 3D model's surface, so the controller acts like a laser pointer and relies on raycasting (example: https://www.youtube.com/watch?v=jnqFdSa5p7w ). Devices like the one in the image above, by contrast, actually let you "feel" the current shape and sculpt on it rather than working like a laser pointer, because they provide real force feedback and haptics. So treating these devices like a VR or NDOF controller won't make sense.
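To make the "laser pointer" model concrete, here is a minimal sketch of the raycasting it relies on, assuming a spherical stand-in for the model surface (the function name and scene setup are illustrative, not any real sculpting program's API):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest hit point of a ray on a sphere, or None.

    This is the core of the laser-pointer model: the controller's pose
    gives a ray origin and direction, and the cursor is placed wherever
    that ray first meets the model surface.
    """
    # Vector from the sphere center to the ray origin.
    oc = [o - c for o, c in zip(origin, center)]
    # Quadratic coefficients; direction is assumed unit-length, so a == 1.
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the surface entirely
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0.0:
        return None  # surface is behind the controller
    return [o + t * d for o, d in zip(origin, direction)]

# Controller at (0, 0, 5) pointing down -Z at a unit sphere at the origin:
hit = ray_sphere_hit([0.0, 0.0, 5.0], [0.0, 0.0, -1.0],
                     [0.0, 0.0, 0.0], 1.0)
# The cursor lands on the near side of the sphere, at (0, 0, 1).
```

The key point is that only the ray matters: the controller's actual 3D position relative to the surface plays no role, which is exactly what a haptic stylus would need to change.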

With a mouse or stylus, the cursor in the 3D view is assumed to always be "on" the surface of the 3D model, and with NDOF controllers it is still assumed to be on the surface as long as the controller is pointing at it. This behavior won't work well at all with the devices above. So I'm wondering whether the current API allows changing or bypassing this behavior, or designing around it somehow, so that using a true haptic 3D stylus makes sense.
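For contrast, a haptic stylus works the other way around: the cursor is simply the stylus tip's real 3D position, and when the tip penetrates the surface a restoring force is sent back to the device. Below is a minimal sketch of that penalty-based force computation, again using a sphere as a stand-in surface; the stiffness value is a rough assumption, and any real device would expose its own read-position/set-force calls:

```python
import math

STIFFNESS = 800.0  # N/m; rough ballpark for a stiff virtual surface

def haptic_force(tip, center, radius, k=STIFFNESS):
    """Spring force pushing the stylus tip back out of a virtual sphere.

    If the tip is outside the surface, no force is applied. If it has
    penetrated, the force is proportional to the penetration depth and
    directed along the outward surface normal, so the user "feels" the
    shape instead of pointing a laser at it.
    """
    offset = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(v * v for v in offset))
    if dist >= radius or dist == 0.0:
        return [0.0, 0.0, 0.0]  # tip outside the surface (or degenerate)
    depth = radius - dist                 # how far the tip has sunk in
    normal = [v / dist for v in offset]   # outward surface normal
    return [k * depth * n for n in normal]

# Tip 1 mm inside a 10 cm sphere: force pushes outward along +X,
# with magnitude roughly k * depth = 800 * 0.001 = 0.8 N.
f = haptic_force([0.099, 0.0, 0.0], [0.0, 0.0, 0.0], 0.1)
```

In a real device driver this function would run in a high-rate servo loop (haptic devices typically expect forces updated around 1 kHz), which is another way these devices differ from event-driven mouse/NDOF input.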
