Hands in VR and MR
Hands in VRED are fully rigged and posable, with preset gestures, for a more realistic and natural experience. They provide better visual and haptic feedback. Use them to interact with a 3D scene, as well as with HTML5 content within that scene. Explore HMI designs directly in VR using these hands.
If you are having problems with hand tracking, see Troubleshooting Hand Tracking in the Hands for MR section.
Hand Gestures
This section provides information about the supported VRED hand gestures used to interact with your digital designs. Your hand movement is tracked as you tap parts of a model with the tip of your index finger to initiate touch events, select tools from the VR Menu, or interact with HTML5 HMI screens. Use these gestures to navigate scenes without the need for controllers.
The virtual hands will automatically adjust to match your real-world ones.
Hands for VR
To enter VR, select (the VR feature) in the VR Menu.
VRED comes with one set of standard hands, which support joints/bones with pre-skinned vertices for hand representation. This makes for a smoother transition between hand poses. Gestures are driven by an HMD-specific controller setup.
An X-Ray material is used for the hands to improve the experience. As of 2025.2, VR devices and hands are rendered in raytracing modes with color corrections, such as tonemapping and gamma, applied.
Using Hands in VR
When activating the Oculus Rift or OpenVR mode, one hand appears per tracked controller.
If you are using a Varjo HMD for hand tracking in VR, see Use Tracked Hands in VR.
VRED uses the OpenVR 1.12.5 SDK.
You can use it to interact with WebEngines and touch sensors in the scene. Only the index finger can be used for interaction. When the finger touches interactable geometry, a left mouse press/move/release is emulated for WebEngines and touch sensors on that geometry.
When using a script with vrOpenVRController, the hands disappear automatically. You can enable them again with controller.setVisualizationMode(Visualization_Hand).
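As a minimal sketch, assuming the controller object is created as in the deprecated OpenVR examples (the name "Controller0" is illustrative):

# Creating a vrOpenVRController object hides the standard hands.
controller = vrOpenVRController("Controller0")
# Re-enable the hand visualization for this controller.
controller.setVisualizationMode(Visualization_Hand)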
When working in OpenVR and in Oculus Rift mode:
For example scenes with touch sensors, see File > Open Examples > deprecated_VR_examples > OculusTouchExample.vpb. To see what this file is doing, check out (Deprecated) Oculus touch example. However, for better touch examples, see the following:
- Implementation of a custom device interaction
- Combine a custom and a default device interaction
- Scale geometry in VR by using controllers
- Print the current finger position on the touchpad
When working in OpenVR:
For example scenes with touch sensors and WebEngines to interact with, see File > Open Examples > deprecated_VR_examples > VR-hands-webengine-buttonVibration-openvr.vpb. Touch the calculator buttons with the index finger. The calculator material is connected to a web engine. Touching the web engine geometry with the index finger triggers mouse events on the website and a controller vibration. The calculator web engine has JavaScript code that sends a requestVibration() Python command to the VRED WebInterface when a button on the website is hit. This triggers haptic feedback (vibrations) on the connected controller device.

For example scenes with trackers, see File > Open Examples > deprecated_VR_examples > OpenVRTracker.vpb. To see what this file is doing, check out (Deprecated) OpenVR Tracker. However, for an updated example, see Add a child node to the node of a VR Tracker. In this script, a box is added as a child to the node of a VR tracker.
Ultraleap Hand Tracking in OpenXR
Please note that an Ultraleap driver above 5.17 is required.
Download and install the Ultraleap software from https://leap2.ultraleap.com/gemini-downloads/.
Run the Ultraleap software and use their control panel to ensure hand tracking works.
In VRED, select Edit > Preferences > Extended Reality > Interaction and do the following:
- In the General section, set Default Visualization to Hands.
- In the Hand Tracking section, enable Use Tracked Hands in VR and set Tracker to Custom.
At the bottom of the dialog, click Apply and Save.
In the View menu, select Display > OpenXR HMD to see tracked hands in VR.
Known Issues
In OpenXR, VR controllers/wands appear as hands; therefore, four hands are displayed in the scene (two for the wands and two for the tracked hands).
- Workaround: Power off the VR controllers.
There might be runtime conflicts preventing hand tracking from working in VRED.
- Workaround: Try reducing the number of OpenXR runtimes installed on the PC, keeping only the one you need.
How to Make Nodes Interactable
Nodes in the scene are not interactable by default. All nodes you want to interact with need to be made "interactable" with the Python command setNodeInteractableInVR(node, True). See the Python documentation (Help > Python Documentation) for more details. Enter the command in the Script Editor and press Run. The setNodeInteractableInVR command is not persistent and needs to be executed every time you load a scene. We recommend saving the scene with the script, so the nodes are interactable the next time you open the scene.
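For example, a minimal sketch (the node name "Button_Geo" is a placeholder for a node in your own scene):

# Find the node you want to touch in VR and make it interactable.
node = findNode("Button_Geo")
setNodeInteractableInVR(node, True)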
VR Hand Poses
Besides using your fingers and hands for pressing and touching buttons on the HTC VIVE or Oculus Touch Controllers, use them to communicate with others.
Use the following poses:
- Forefinger to point and touch objects
- Thumbs up/down to communicate if something is good or bad
- Open hand for waving
Customization for VR Hands
Use Python scripting and HTML5 to create custom setups. You can find script files in the VRED Examples folder (C:\ProgramData\Autodesk\VREDPro-<internalVersion>\examples\deprecated_VR_examples).
Here are some of the added scripts:
To enable haptic feedback (controller vibration) when a hand touches an interactable node:
- VR-hands-vibrate-openvr.py - For OpenVR mode
- VR-hands-vibrate-oculus.py - For Oculus Rift mode

To change the hand color / hit point color:
- VR-hands-color-openvr.py - For OpenVR mode
- VR-hands-color-oculus.py - For Oculus Rift mode

To interact from a distance (instead of directly touching something) by pointing at nodes in the scene with the index finger. A sphere appears at the hit point. When the "Pointing" pose is activated (Grip button pressed, Index trigger button not pressed), touch sensors and web engines can be executed.
- VR-hands-pointing-openvr.vpb - For OpenVR mode
Hands for MR
To enter XR, select (the MR feature) in the VR Menu.
In MR, VRED replaces the VR X-Ray material hands with your own.
If you want to change this behavior and re-enable VR Hands in MR, click here for an article that explains how to do that with a custom Python Script.
Be aware, there are differences in the supported functionality for the XR-1 and XR-3 in MR.
The XR-1 supports hand depth estimations and marker tracking.
The XR-3 supports hand depth estimations, marker tracking, and hand tracking (gestures). This means hand gestures can be used for scene interaction in place of controllers. Since VRED is replacing the standard hands which support joints/bones with your real-world hands, you can use Python to connect hand trackers for scene interaction.
Tip: Use the Extended Reality > Interaction preferences to set default behaviors for hand tracking, teleporting, and marker tracking, and the Extended Reality > HMD preferences for your HMD.
Connecting Hand Trackers with Python
To see how to connect hand tracking using Python, select File > Open Examples and navigate to the vr folder, where you'll find externalHandTracking.py. This provides an example of how to set up hand tracking.
- For the left hand, use vrDeviceService.getLeftTrackedHand()
- For the right hand, use vrDeviceService.getRightTrackedHand()

The returned object is of type vrdTrackedHand.
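As a minimal sketch, these calls return the objects that a custom tracking script feeds with data (see externalHandTracking.py for a complete setup):

# Fetch the tracked hand objects; both are of type vrdTrackedHand.
leftHand = vrDeviceService.getLeftTrackedHand()
rightHand = vrDeviceService.getRightTrackedHand()
print(leftHand, rightHand)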
Disabling Tracked Hand Gestures
We added two Python commands: one for enabling or disabling the detection of any of the gestures in vrHandTypes.HandTrackingGesture when using tracked hands in XR, and another for checking if a gesture is enabled. To block users from accidentally activating interactions through hand gestures, disable the gestures. To allow users to activate VRED's standard interactions, like showing the VR Menu or activating the pointer or teleport, enable them.

vrImmersiveInteractionService.setHandTrackingGestureEnabled(gesture, isLeftHand, enable)
vrImmersiveInteractionService.getHandTrackingGestureEnabled(gesture, isLeftHand) -> bool
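As a minimal sketch (the enum member used here is an assumption; check vrHandTypes.HandTrackingGesture in the Python documentation for the gestures available in your version):

# Hypothetical gesture member; replace with an actual member of vrHandTypes.HandTrackingGesture.
gesture = vrHandTypes.HandTrackingGesture.Menu
# Disable this gesture for the left tracked hand.
vrImmersiveInteractionService.setHandTrackingGestureEnabled(gesture, True, False)
# Verify the current state.
print(vrImmersiveInteractionService.getHandTrackingGestureEnabled(gesture, True))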
Select Edit > Script Editor.
Copy and paste the following code into the Script Editor:
interactions = vrDeviceService.getInteractions()
for i in interactions:
    print("Deactivating: " + str(i.getName()))
    vrDeviceService.deactivateInteraction(i.getName())
Click Run to execute. You will see the following terminal output:
Deactivating: GestureDetection
Deactivating: GroundCalibration
Deactivating: Place
Deactivating: Pointer
Deactivating: Teleport
Deactivating: Tools Menu
Deactivating: Tooltips
Note: The above deactivates all interactions. If only Teleport needs to be deactivated, use vrDeviceService.deactivateInteraction("Teleport").
Use Tracked Hands in XR
Use the Extended Reality > Interaction preferences to set the default behavior for hand tracking in VR. Choose Varjo Integrated Ultraleap for hand tracking in VR or XR, or Custom to use other hand tracking devices. To adjust any offset between the tracked hands and the hands rendered in VR, use Translation Offset and Rotation Offset.
Setting up Hand Tracking for Other Devices
For the Custom Tracker option, you must provide all the tracking data to VRED's Python interface. How to do this can vary from device to device; however, if you can access the tracking data via Python script, the data needs to be set into the vrdTrackedHand objects returned by the methods described above. This requires the transformation data of the tracked hand and/or the different finger joints (see the externalHandTracking.py example file for how this works).
For testing, set the corresponding preferences, load the script, and enter VR. You may have to modify the script by changing values for hand and/or joint transforms to understand how everything works.
Troubleshooting Hand Tracking
Some issues are due to preference settings (Edit > Preferences > Extended Reality).
- For hand tracking, in the Interaction tab > General section, ensure Default Visualization is set to Hands. In the Hand Tracking section, ensure Use Tracked Hands in VR is enabled and Tracker is set to Varjo Integrated Ultraleap.
- For activating the creation of collision objects, in the Interaction tab > General section, ensure Create Collision Objects is enabled.
- For setting hand offset, use Translation Offset and Rotational Offset.
Here are some common hand tracking issues and how to resolve them.
If you cannot see your hands in VR, check the following:
- Does hand tracking work in the Varjo Base software? If not, restart Varjo Base.
- Are your preferences set up correctly?
- Are Ultraleap runtimes installed and running? Hand tracking only works in VRED if the Varjo runtime alone provides the hand tracking data.
- Is Varjo HMD selected in View > Display? If OpenVR HMD is selected, hand tracking is not supported and won't work.
- Does hand tracking use the network ports? If these are used by another application, it could cause a conflict. We recommend using a tool like TCPView to check port availability and usage. The default port is 12345, but it can be changed by adding a client_config.json file to ProgramFiles\Varjo\varjo-handtracking\Ultraleap.
If touch doesn't work, check that the creation of collision objects is active in the preferences.
If hands don't work in MR, but do in VR, check the following:
- Does the open VR Menu move with the hand? Try to carefully touch some menu items. This might require some practice.
- Does the position of the VR Menu look offset on your real hands, in comparison to your virtual hands in VR? If so, correct the offset in the preferences. See Translation Offset and Rotational Offset in the Hand Tracking section.
Hand Depth Estimation
Detects your real-world hands in the mixed reality video and shows them in front of a virtual object, if they are closer than the object. When disabled, the rendering from VRED will always occlude the real hands, even if the hands are closer than the rendered object.
Find this option in the VR Menu. To set a default behavior for this option, visit the Extended Reality > HMD preferences > Varjo section.
Hand Depth Estimation OFF | Hand Depth Estimation ON
---|---
![]() | ![]()
Depth Testing for Varjo MR
We've added support for depth testing with Varjo mixed reality headsets. Now, depth occlusion scenarios are possible, using Depth Estimation when in mixed reality.
To use depth testing, do the following:
- In the Extended Reality > HMD preferences > Eye Tracking section, enable Eye Tracking.
- In the Varjo section, enable Depth Estimation.
- Click Apply and Save.
Displaying Your Hands in Mixed Reality
Use the Hand Depth Estimation tool to display your hands, while in mixed reality.
You can set VRED's default behavior to open in MR and have Depth Estimation enabled using the Extended Reality > HMD > Varjo options.
Wearing a connected Varjo HMD that supports XR, select View > Display > Varjo HMD in VRED.
Press the Menu button on your controller to access the VR Menu.
Press (the MR feature) to switch to mixed reality.
Press (Hand Depth Estimation) to see your real-world hands, activating depth occlusion in mixed reality.
Teleport
To teleport, you will need to initiate the teleporter, orient the arc and arrow, then execute a jump to the new location. When finished, terminate the teleporter.
Initiating and Terminating Teleport
To initiate and terminate teleporting, use the same gestures.
Using one hand, tap the back of the other hand. This initiates the teleport from the tapped hand. Now, orient the arc and teleport. Repeat this gesture to terminate the teleporter.
Teleport Orientation
Rotate your wrist, while the teleport arc is displayed, to orient the teleporter. Now teleport.
Teleporting
With the hand not currently displaying the teleport arc, pinch your index finger to thumb. This accepts the teleport arc location and orientation, executing the teleport. When finished, terminate the teleport to exit the tool.
For information on teleporting, see Teleporting.
Laser Pointer
To use the Laser Pointer, you will need to initiate, execute, then terminate it.
Initiating and Terminating the Laser Pointer
To initiate and terminate the Laser Pointer, use the same gestures.
Point your thumb, index, and middle fingers out with your palm facing towards the camera. Now, use the Laser Pointer to point to things or trigger interactions with scene content. Repeat this gesture to terminate the Laser Pointer.
Using the Laser Pointer
Use your index finger to point at scene content.
Pinch your index finger and thumb together to trigger an interaction with scene content, such as selecting tools from the VR Menu, activating a touch sensor, or interacting with HTML5 HMI screens. When finished, terminate the Laser Pointer to exit the tool.