XR Enhancements and Improvements

VRED 2022 adds Varjo MR marker placement and preferences, hand gestures, hand depth estimation, and out-of-the-box object placement in XR and VR via the VR Menu. We also added a Python interface for implementing a custom version of marker tracking; its functions can be found in the vrImmersiveInteractionService documentation.

Varjo Markers

Only available for Varjo XR HMD users.

Marker icon Use a visible marker, known as the origin marker, for easy placement of objects in a mixed reality scene. The coordinate system is automatically synced; however, your nodes need to be prepared before entering MR.

Marker with coordinate system

Use Marker icon from the VR Menu to enable supported HMDs to detect markers. Once the HMD recognizes a marker, the content of any node tagged with VarjoMarker is moved to the marker's location. The marker's ID number must be set as a separate tag.

Marker displaying the ID number

Note:

Tag syntax is case-sensitive.

Multiple markers can be combined into a single marker. This improves tracking stability.

Combined markers into a single marker

For example, multiple nodes are tagged with VarjoMarker, but each has a different ID number tag. The marker tracking system detects the markers and assigns a confidence value to each, ranging from 0.0 to 1.0. If the confidence is high enough, the node content is moved to the marker's location. For more about confidence values, see the Min Marker Confidence Virtual Reality preference.
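The placement behavior described above can be sketched in plain Python. This is an illustrative stand-in, not the VRED API; inside VRED the vrImmersiveInteractionService drives this, and all names here (detected_markers, tagged_nodes, MIN_MARKER_CONFIDENCE) are hypothetical.

```python
# Illustrative sketch of the marker placement logic described above.
# NOT the VRED API; all names here are hypothetical stand-ins.

MIN_MARKER_CONFIDENCE = 0.5  # assumed threshold for this sketch

def place_tagged_nodes(detected_markers, tagged_nodes):
    """Move node content to a marker's location when the marker is
    detected with sufficient confidence.

    detected_markers: {marker_id: (position, confidence)}
    tagged_nodes:     {marker_id: node_name}, i.e. nodes tagged
                      VarjoMarker plus a numeric ID tag
    Returns {node_name: position} for every node that was placed.
    """
    placements = {}
    for marker_id, (position, confidence) in detected_markers.items():
        if confidence < MIN_MARKER_CONFIDENCE:
            continue  # low-confidence detections are ignored
        node = tagged_nodes.get(marker_id)
        if node is not None:
            placements[node] = position
    return placements
```

For example, a marker with ID 211 detected at (1.0, 0.0, 0.5) with confidence 0.9 would move the node tagged 211 to that position, while a detection with confidence 0.2 would be ignored.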

Preparing Nodes for Markers

Only available for Varjo XR HMD users.

Follow these steps to prepare your nodes before entering MR. Note that objects that won't fit in your room will need to be scaled down before entering MR.

Example scene tag node structure for markers

  1. Add a tag and rename it VarjoMarker.

  2. Drag and drop an object node onto this tag to assign it.

  3. Create another tag and rename it to the number printed on the Varjo marker.

    Marker in a scene with close up of number

  4. Drag and drop the same object node onto this tag.

Using Markers

Only available for Varjo XR HMD users.

Nodes from your scene must be tagged before continuing. If this has not been done, see Preparing Nodes for Markers.

  1. Wearing a connected Varjo HMD that supports MR, in VRED, select View > Display > Varjo HMD. The objects that were tagged with the VarjoMarker and number tags appear in MR.

  2. Open the VR Menu with your controller and select Enable Markers.

    Enable Markers

    The geometry jumps directly onto the marker.

    Tip:

Disable Enable Markers in the VR Menu to stop any jiggling of the object that may occur.

Hands in MR

Currently, only Varjo XR HMDs are supported for MR.

In MR, VRED replaces the VR X-Ray material hands with your own.

Hands in MR

To enter MR, select MR (the MR feature) in the VR Menu.

Be aware, there are differences in the supported functionality for the XR-1 and XR-3 in MR.

Connecting Hand Trackers with Python

To see how to connect hand tracking using Python, select File > Open Examples and navigate to the vr folder, where you'll find externalHandTracking.py. This file provides an example of how to set up hand tracking.

The returned object is of type vrdTrackedHand.

Hand Depth Estimation

Currently, only Varjo XR HMDs are supported for MR.

Hand Depth Estimation icon Detects your real-world hands in the mixed reality video and shows them in front of a virtual object, if they are closer than the object. When disabled, the rendering from VRED will always occlude the real hands, even if the hands are closer than the rendered object.

Find this option in the VR Menu. To set a default behavior for this option, visit the Virtual Reality preferences > HMD tab > Varjo section.
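The per-pixel decision behind this option can be modeled as a simple depth comparison. This is only an illustrative model of the behavior the option describes, not VRED's actual compositor.

```python
# Illustrative model of hand depth estimation compositing.
# Not VRED's implementation; it only mimics the rule described above:
# a real-hand pixel is shown only if it is closer than the rendered pixel.

def composite_pixel(hand_depth, render_depth, estimation_enabled):
    """Decide which source wins for one pixel.

    hand_depth:   distance to the real hand at this pixel (None if no hand)
    render_depth: distance to the rendered virtual surface
    """
    if estimation_enabled and hand_depth is not None and hand_depth < render_depth:
        return "real hand"
    return "virtual object"
```

With estimation disabled, the virtual object always wins, which matches the occlusion behavior described above.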

Hand Depth Estimation OFF Hand Depth Estimation ON

Displaying Your Hands in Mixed Reality

Currently, only Varjo XR HMDs are supported for MR.

Use the Hand Depth Estimation tool to display your hands while in mixed reality.

  1. Wearing a connected Varjo HMD that supports MR, in VRED, select View > Display > Varjo HMD.

  2. Press the Menu button on your controller to access the VR Menu.

    Hand Depth Estimation OFF

  3. Press Hand Depth Estimation icon to see your real-world hands in MR.

Hand Gestures

This section covers the hand gestures VRED supports for interacting with your digital designs. Your hand movement is tracked as you tap parts of a model with your index finger to initiate touch events, select tools from the VR Menu, or interact with HTML 5 HMI screens. Use these gestures to navigate scenes without the need for controllers.

Note:

The virtual hands will automatically adjust to match your real-world ones.

Touch Interactions

Use your index finger’s tip (on either hand) to touch and interact with scene content, such as the xR Home Menu, objects with touch sensors, or HTML 5 HMI screens.

xR VR Menu

To open and close the xR Home Menu, use the same gestures.

VR Menu

Move your index finger to the palm of your other hand. This opens the menu. Now you can use your index finger to select an option from the menu. Repeat the initial gesture to close the VR Menu.

VR Menu gesture

For information on the VR Menu and its tools, see VR Menu.

Tip:

If your hands are in front of a virtual object, but are occluded by it, enable Hand Depth Estimation.

Teleport

To teleport, you will need to initiate the teleporter, orient the arc and arrow, then execute a jump to the new location. When finished, terminate the teleporter.

Initiating and Terminating Teleport

To initiate and terminate teleporting, use the same gestures.

Using one hand, tap the back of the other hand. This initiates the teleport from the tapped hand. Now orient the arc and teleport. Repeat this gesture to terminate the teleporter.

Initiate Teleport

Teleport Orientation

Rotate your wrist, while the teleport arc is displayed, to orient the teleporter. Now teleport.

Teleport orientation

Teleporting

With the hand not currently displaying the teleport arc, pinch your index finger and thumb together. This accepts the teleport arc location and orientation, executing the teleport. When finished, terminate the teleport to exit the tool.

Teleporting

For information on teleporting, see Teleporting.

Laser Pointer

To use the Laser Pointer, you will need to initiate, execute, then terminate it.

Initiating and Terminating the Laser Pointer

To initiate and terminate the Laser Pointer, use the same gestures.

Point your thumb, index, and middle fingers out with your palm facing toward the camera. Now, use the laser pointer to point at things or trigger interactions with scene content. Repeat this gesture to terminate the Laser Pointer.

Initiating and Terminating the Laser Pointer

Using the Laser Pointer

Use your index finger to point at scene content.

Pointing with the Laser Pointer

Pinch your index finger and thumb together to trigger an interaction with scene content, such as selecting tools from the VR Menu, activating a touch sensor, or interacting with HTML 5 HMI screens. When finished, terminate the Laser Pointer to exit the tool.

Using the Laser Pointer

Place Tool

Only available in MR mode.

Note:

When in a collaborative XR session, the positions of objects are synchronized for all users.

Place tool icon Use the Place tool for placing objects on, and moving them along, the ground of the scene, using the laser pointer. These objects must have an assigned scene tag called Place.

The Place tool uses an approach similar to Teleport for rotating objects; that is, rotating your wrist. You can snap the rotation to 10° increments by pressing the controller's Grip button. Use this when trying to align one object with another. It is also possible to place objects atop one another.
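The 10° snapping while the Grip button is held amounts to rounding the wrist angle to the nearest step. A minimal sketch, with hypothetical names, for illustration only:

```python
def snap_angle(angle_deg, grip_pressed, step=10.0):
    """Snap a wrist rotation to the nearest `step` degrees while the
    controller's Grip button is held; otherwise pass the angle through.
    Illustrative only; VRED handles this internally."""
    if not grip_pressed:
        return angle_deg
    return round(angle_deg / step) * step
```

For example, a wrist angle of 47.3° snaps to 50° while the Grip button is held, making it easier to align one object with another.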

Preparing Nodes for Placement

Only available in MR mode.

To pick and place objects in MR, you must first prepare the nodes. If objects are too large or small for your room, they will need to be scaled to an appropriate size before entering MR.

image of an object with Place scene tag

  1. Create a tag and rename it Place.
  2. In the Scenegraph, drag and drop an object node onto this tag.

Using Place

Only available in MR mode.

Nodes from your scene must be tagged before continuing. If this has not been done, follow the instructions in Preparing Nodes for Placement.

Placing an object in MR

  1. Wearing a connected Varjo HMD that supports MR, in VRED, select View > Display > Varjo HMD. The objects that were tagged with Place appear in MR.

  2. Open the VR Menu by pressing the menu button on your controller, then select Place tool (Place).

    Place icon in the VR Menu

    When highlighted orange, the tool is enabled. When gray, the tool is disabled. An indicator appears above your controller stating what is active and picked.

  3. Point to an object and continually squeeze the trigger to pick and move it along the virtual ground.

  4. Rotate your wrist to rotate the object around the up axis. This action is similar to that of the Teleport tool.

    Tip:

    Squeeze the Grip button to snap to 10 degree angle increments. Use this to help with the alignment of your object to another.

Hand Tracking Preferences

Only available for Varjo XR-3 users.

We added interaction preferences to the Virtual Reality preferences for enabling hand tracking and setting translational and rotational offsets for tracked hands.

Tip:

Enabling hand tracking in the preferences automatically activates it for VR or MR.

Interaction Preferences

Use Tracked Hands in VR

Only available for Varjo XR-3 users.

Sets the default to always track hands in VR when enabled. If Tracker is set to Varjo Integrated Ultraleap, hand tracking is also enabled in XR for supported HMDs.

Tracker

Only available for Varjo XR-3 users.

Sets the default system used for hand tracking.

Translation Offset

Sets the default translational tracking offset for the hands. Use this to correct any offset between the tracked hands and the hands rendered in VR.

Rotational Offset

Sets the default rotational tracking offset for the hands. Use this to correct any offset between the tracked hands and the hands rendered in VR.
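What these two preferences correct can be sketched as a pose transform: rotate the tracked position by the rotational offset, then add the translational offset. The sketch below reduces the rotation to a single yaw angle for brevity; it is an illustration of the idea, not VRED's actual math.

```python
import math

def apply_hand_offset(position, translation_offset, yaw_offset_deg):
    """Apply a translational offset and a rotational offset (reduced here
    to one yaw angle) to a tracked hand position.
    Illustrative stand-in for the Translation/Rotational Offset
    preferences; not VRED's implementation."""
    x, y, z = position
    a = math.radians(yaw_offset_deg)
    # rotate about the vertical axis, then translate
    rx = x * math.cos(a) - z * math.sin(a)
    rz = x * math.sin(a) + z * math.cos(a)
    tx, ty, tz = translation_offset
    return (rx + tx, y + ty, rz + tz)
```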

Marker Tracking

The marker tracking system detects each marker and assigns a confidence value to it, ranging from 0.0 to 1.0.

For example, a marker with a value of 0.9 means there is 90% confidence in the correctness of the marker's position and ID.

Min Marker Confidence

Sets the default minimum value used by the marker tracking system to determine whether a detected marker position is correct.

Varjo Preferences

We added Varjo HMD-specific preferences to the HMD tab for setting the default native foveated rendering state, the default viewing mode, and hand depth estimation.

Varjo Preferences

Native Foveated Rendering

Requires Eye Tracking to be enabled.

In the HMD tab > Varjo section, use to set the default state for how things in the periphery are rendered. When enabled, peripheral resolution (image quality) is reduced; however, areas tracked by your eye are still rendered at high resolution. This improves performance in scenes with compute-intensive materials, and when using real-time antialiasing. For more information on foveated rendering and the different settings, see Custom Quality.

Default Mode

In the HMD tab > Varjo section, use to set the default viewing mode for a Varjo HMD. If you always work in mixed reality, set this to MR.

Hand Depth Estimation Preference

In the HMD tab > Varjo section, use to set the default state for real-world hands in MR. When enabled, it detects your real-world hands in the mixed reality video and shows them in front of a virtual object, if they are closer than the object. When disabled, the rendering from VRED will always occlude the real hands, even if the hands are closer than the rendered object.

Hand Depth Estimation OFF Hand Depth Estimation ON

Python for Markers

These are the functions for the Python interface for implementing a custom version of marker tracking. They can be found in the vrImmersiveInteractionService documentation.

createMultiMarker

vrImmersiveInteractionService.createMultiMarker(multiMarkerName, markerNames, markerType) Creates a multi marker by averaging the pose of multiple regular markers.
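Conceptually, a multi-marker averages the poses of its member markers, which is why combining markers improves tracking stability. A minimal sketch of the position part (illustrative only; createMultiMarker does this, plus orientation handling, inside VRED):

```python
def average_marker_positions(positions):
    """Average the positions of several detected markers into one pose,
    the idea behind combining markers for tracking stability.
    Illustrative stand-in; not the service's internals."""
    n = len(positions)
    return tuple(sum(p[i] for p in positions) / n for i in range(3))
```

Noise in any single marker's detection is damped in the average, so the combined pose jitters less than each individual marker.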

getDetectedMarkers

vrImmersiveInteractionService.getDetectedMarkers(markerType) Gets all detected markers of a given type.

getMarker

vrImmersiveInteractionService.getMarker(name, markerType) Gets a marker that has already been detected.

getMinMarkerConfidence

vrImmersiveInteractionService.getMinMarkerConfidence() See also: setMinMarkerConfidence.

setMinMarkerConfidence

vrImmersiveInteractionService.setMinMarkerConfidence(confidence) Sets the minimum marker confidence. When markers are detected with a lower confidence they will be ignored. Markers that are already known to the system will not be updated, if the updated data has a lower confidence.

markersDetected

vrImmersiveInteractionService.markersDetected(markers) This signal is triggered when new markers are detected with a confidence equal to or higher than the minimum marker confidence.

markersUpdated

vrImmersiveInteractionService.markersUpdated(markers) This signal is triggered when already detected markers are updated with data whose confidence is equal to or higher than the minimum marker confidence.
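The detected/updated/ignored behavior described for setMinMarkerConfidence and the two signals can be modeled in a few lines of plain Python. This is a hypothetical stand-in for the service, written only to show when each signal would fire; inside VRED you would connect handlers to the real signals instead.

```python
class MarkerTrackerModel:
    """Plain-Python model of the signal behavior documented above.
    Not the VRED service: it only mimics when markersDetected and
    markersUpdated would fire, given a minimum marker confidence."""

    def __init__(self, min_confidence=0.5):
        self.min_confidence = min_confidence
        self.known = {}         # marker_id -> last accepted confidence
        self.detected_log = []  # markers that would fire markersDetected
        self.updated_log = []   # markers that would fire markersUpdated

    def report(self, marker_id, confidence):
        """Feed one detection into the model; returns what happened."""
        if confidence < self.min_confidence:
            return "ignored"    # below threshold: never reported
        if marker_id in self.known:
            self.known[marker_id] = confidence
            self.updated_log.append(marker_id)
            return "updated"    # already known: markersUpdated fires
        self.known[marker_id] = confidence
        self.detected_log.append(marker_id)
        return "detected"       # new marker: markersDetected fires
```

For example, with a minimum confidence of 0.5, a first sighting of marker 210 at 0.9 is "detected", a later sighting at 0.7 is "updated", and any sighting at 0.3 is "ignored".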