Hands in XR

Hands are used for gestures and provide a more immersive experience, delivering a more natural feel when interacting with a 3D scene, whether in VR or in a real-world setting through MR.

Hand Gestures

Only supported for the Varjo XR-3.

This section describes the supported VRED hand gestures for interacting with your digital designs. Your hand movement is tracked as you tap parts of a model with your index finger to initiate touch events, select tools from the xR Home Menu, or interact with HTML5 HMI screens. Use these gestures to navigate scenes without the need for controllers.

Note:

The virtual hands will automatically adjust to match your real-world ones.

Touch Interactions

Use the tip of your index finger (on either hand) to touch and interact with scene content, such as the xR Home Menu, objects with touch sensors, or HTML5 HMI screens.

xR Home Menu

To open and close the xR Home Menu, use the same gesture.

xR Home Menu

Move your index finger to the palm of your other hand. This opens the menu. Now you can use your index finger to select an option from the menu. Repeat the initial gesture to close the xR Home Menu.

xR Home Menu

For information on the xR Home Menu and its tools, see xR Home Menu.

Tip:

If your hands are in front of a virtual object, but are occluded by it, enable Hand Depth Estimation.

Teleport

To teleport, you will need to initiate the teleporter, orient the arc and arrow, then execute a jump to the new location. When finished, terminate the teleporter.

Initiating and Terminating Teleport

To initiate and terminate teleporting, use the same gesture.

Using one hand, tap the back of the other hand. This initiates the teleport from the tapped hand. Now orient the arc and teleport. Repeat this gesture to terminate the teleporter.

Initiate Teleport

Teleport Orientation

While the teleport arc is displayed, rotate your wrist to orient the teleporter. Now teleport.

Teleport orientation

Teleporting

With the hand not currently displaying the teleport arc, pinch your index finger and thumb together. This accepts the teleport arc location and orientation, executing the teleport. When finished, terminate the teleport to exit the tool.

Teleporting

For information on teleporting, see Teleporting.

Laser Pointer

To use the Laser Pointer, you will need to initiate, execute, then terminate it.

Initiating and Terminating the Laser Pointer

To initiate and terminate the Laser Pointer, use the same gesture.

Point your thumb, index, and middle fingers out with your palm facing the camera. Now, use the laser pointer to point at things or trigger interactions with scene content. Repeat this gesture to terminate the Laser Pointer.

Initiating and Terminating the Laser Pointer

Using the Laser Pointer

Use your index finger to point at scene content.

Pointing with the Laser Pointer

Pinch your index finger and thumb together to trigger an interaction with scene content, such as selecting tools from the xR Home Menu, activating a touch sensor, or interacting with HTML5 HMI screens. When finished, terminate the Laser Pointer to exit the tool.

Using the Laser Pointer

Hands in VR

VRED comes with one set of standard hands, which support joints/bones with pre-skinned vertices for hand representation. This makes for smoother transitions between hand poses. Gestures are driven by an HMD-specific controller setup.

An X-Ray material is used for the hands to improve the experience.

VR hands

To enter VR, select VR in the xR Home Menu.

Using Hands in VR

When activating the Oculus Rift or OpenVR mode, one hand appears per tracked controller.

Note:

VRED uses the OpenVR 1.12.5 SDK.

You can use the hands to interact with WebEngines and touch sensors in the scene. Only the index finger can be used for interaction. When the finger touches interactable geometry, a left mouse press/move/release is emulated for WebEngines and touch sensors on that geometry.

When using a script with vrOpenVRController, the hands disappear automatically. You can re-enable them with controller.setVisualizationMode(Visualization_Hand).
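For example, here is a minimal sketch of re-enabling the hand visualization from the Script Editor. The controller name "Controller0" is an assumption; adjust it to match your setup.

    # Get the scripting interface for a tracked controller.
    # "Controller0" is an assumed name; change it to match your setup.
    controller = vrOpenVRController("Controller0")
    # Show the hand visualization again after scripting hid it.
    controller.setVisualizationMode(Visualization_Hand)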

For example scenes with touch sensors and WebEngines to interact with, see File > Open Examples.

How to Make Nodes Interactable

Nodes in the scene are not interactable by default. All nodes you want to interact with need to be made interactable with the Python command setNodeInteractableInVR(node, True). See the Python documentation (Help > Python Documentation) for more details. Enter the command in the Script Editor and press Run. The setNodeInteractableInVR command is not persistent and needs to be executed every time you load a scene. We recommend saving the scene with the script, so the nodes are interactable the next time you open the scene.
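For example, here is a minimal sketch that makes a single node interactable. The node name "Dashboard" is a placeholder; replace it with a node from your Scenegraph.

    # Look up a node by name; "Dashboard" is a placeholder node name.
    node = findNode("Dashboard")
    # Allow VR hands and controllers to trigger this node's touch sensors
    # or WebEngine content.
    setNodeInteractableInVR(node, True)

Run this in the Script Editor after each scene load, since the setting is not persistent.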

VR Hand Poses

Besides using your fingers and hands for pressing and touching buttons on the HTC VIVE or Oculus Touch Controllers, use them to communicate with others.

Use the following poses:

Tip:

If you have a Varjo XR-3, there are gestures for interacting with your digital designs. See Hand Gestures.

Customization for VR Hands

Use Python scripting and HTML5 to create custom setups. You can find script files in the VRED Examples folder (C:\ProgramData\Autodesk\VREDPro-11.0\Examples).

Several example scripts for custom hand setups are included there.

Hands in MR

Currently, only Varjo XR HMDs are supported for MR.

In MR (mixed reality), VRED replaces the VR X-Ray material hands with your own.

Hands in MR

To enter MR, select MR in the xR Home Menu.

Be aware that there are differences in the supported functionality between the XR-1 and XR-3 in MR.

Connecting Hand Trackers with Python

Only available for Varjo XR-3 users.

To see how to connect hand tracking using Python, select File > Open Examples and navigate to the vr folder, where you'll find externalHandTracking.py. It provides an example of how to set up hand tracking.

The returned object is of type vrdTrackedHand.

Use Tracked Hands in VR

Only available for Varjo XR-3 users.

Use the Virtual Reality preferences to set the default behavior for hand tracking in VR. Choose Varjo Integrated Ultraleap for hand tracking in MR, or Custom to use other hand tracking devices. To adjust any offset between the tracked hands and the hands rendered in VR, use Translation Offset and Rotational Offset.

Setting up Hand Tracking for Other Devices

For the Custom Tracker option, you must provide all the tracking data to VRED's Python interface. How to do this varies from device to device; however, if you can access the tracking data from a Python script, the data needs to be set into the vrdTrackedHand objects returned by the methods used. This requires the transformation data of the tracked hand and/or the individual finger joints (see the externalHandTracking.py example file for how this works).

For testing, set the corresponding preferences, load the script, and enter VR. You may have to modify the script by changing values for hand and/or joint transforms to understand how everything works.
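As a rough outline of what such a script does, here is a hypothetical sketch. The helper readPoseFromMyTracker and the setter setTransform are placeholder names, not actual VRED API calls; externalHandTracking.py shows the real methods to use.

    # Hypothetical sketch of a custom-tracker update loop. The helper
    # readPoseFromMyTracker() and the setter setTransform() are placeholders,
    # not real VRED API; see externalHandTracking.py for the actual calls.
    def updateTrackedHands(leftHand, rightHand):
        # leftHand and rightHand are the vrdTrackedHand objects returned
        # when hand tracking was set up.
        for hand, side in ((leftHand, "left"), (rightHand, "right")):
            # Read the current pose from your device (device-specific code).
            position, rotation = readPoseFromMyTracker(side)  # placeholder
            # Push the hand transform into VRED's tracked-hand object.
            hand.setTransform(position, rotation)             # placeholder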

Hand Depth Estimation

Currently, only Varjo XR HMDs are supported for MR.

The Hand Depth Estimation tool detects your real-world hands in the mixed reality video and shows them in front of a virtual object, if they are closer than the object. When disabled, the rendering from VRED will always occlude the real hands, even if the hands are closer than the rendered object.

Find this option in the xR Home Menu. To set a default behavior for this option, visit the Virtual Reality preferences > HMD tab > Varjo section.

Hand Depth Estimation OFF (left); Hand Depth Estimation ON (right)

Displaying Your Hands in Mixed Reality

Currently, only Varjo XR HMDs are supported for MR.

Use the Hand Depth Estimation tool to display your hands while in mixed reality.

  1. While wearing a connected Varjo HMD that supports MR, select View > Display > Varjo HMD in VRED.

  2. Press the Menu button on your controller to access the xR Home Menu.

    Hand Depth Estimation OFF

  3. Press the Hand Depth Estimation icon to see your real-world hands in MR.