Part 4: VR

In this last part, you will learn how to use your scene in virtual reality.

Video captions: Hello and welcome to this tutorial for VRED. My name is Christopher, and, in this video, I will give you an introduction to interactions in VR, how to use the various tools that are provided out of the box, and how to navigate your scenes.

Starting VR

VRED supports a wide range of headsets out of the box. A popular headset in the professional space is, for example, the HTC Vive Pro series. There is also support for StarVR headsets, which integrate eye tracking to enable foveated rendering. And not to forget Varjo headsets, which, besides eye tracking, also integrate native hand tracking. This means you can navigate and interact with your scene using hand gestures alone.

But really, any headset that is compatible with OpenVR or SteamVR should work with VRED. Just make sure you have all the necessary drivers installed and you should be good to go. For this tutorial, I'm using the Valve Index headset, which runs on SteamVR.

VR mode can be started by selecting View in the menu bar, then Display, and then OpenVR HMD, or another option that fits your headset. VR mode starts from the current camera position in your scene. You can exit it by selecting Standard Display again.
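If you prefer scripting, you can trigger the same switch from VRED's Script Editor. The following is a minimal sketch assuming the setDisplayMode() function and the VR_DISPLAY_* constants of VRED's Python API; verify the exact names against the documentation of your VRED version.

    # Sketch: toggling VR mode from the Script Editor (Python).
    # Assumes setDisplayMode() and the VR_DISPLAY_* constants of
    # VRED's Python API -- check your version's documentation.

    # Enter VR, equivalent to View > Display > OpenVR HMD:
    setDisplayMode(VR_DISPLAY_OPEN_VR)

    # Later, return to the desktop view
    # (View > Display > Standard Display):
    setDisplayMode(VR_DISPLAY_STANDARD)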

Controller Overview

Because VRED supports different kinds of headsets, there are also different kinds of controllers. In this case, I'm using the Valve Index controllers, which have a different button layout than those of other manufacturers. In production settings, classical controllers are widely adopted, but it's also possible to control your scene with hand gestures using the new Varjo headsets. Regardless of the controller's form factor, however, three main interactions are always supported: the pointer, the teleporter, and access to the VR menu.

The pointer is the equivalent of a mouse click in VR. You can point at the object you want and pull the trigger to activate the attached interaction. The pointer is also used to navigate the VR menu. On any controller, the pointer should be activated by the trigger button, that is, the button near your index finger. Since version 2023, you can adjust the pointer settings in the preferences and set the pointer beam width as well as the hit point size, which can be useful in large environments.

When in hand visualization mode, you can also interact with objects, just like with the pointer, by simply reaching out and tapping them with your index finger. With Varjo headsets that support hand tracking, you can activate hand gesture interaction in the preferences. For a complete reference of all available hand gestures, have a look at the documentation.

The teleporter is mapped to the touchpad or control sticks of your controllers. Just touching the pad initiates the teleport tool, which shows you where you will teleport to. Pressing the touchpad then actually sets off the teleport. But more on that later.

The VR menu can be opened by pressing one of the buttons on your controllers. As every controller is different, you have to test which one works for your setup. The Valve Index controllers, for example, use the B button for this. The VR menu is attached to your hand or controller and offers access to tools, variant sets, and so on. We will cover all the available tools in a moment.

When in VR, there are basically three ways to get around in your scene. First, the most obvious one: walking. Depending on the physical space you have at your setup, this is the most natural way of exploring your models. Even if you only have a few meters to move, it really helps you immerse yourself in the virtual world. In general, your virtual world will most of the time be far larger than your physical space, but most VR headsets show visual indicators when you leave your safe space while walking around.

The second way of moving around is by changing the viewpoint with the VR menu. This will instantly teleport you to the location of the viewpoint. In the VR menu, you can select which attributes of a viewpoint should be applied; for example, you can choose whether the orientation of the viewpoint should be used.

Lastly, the teleportation tool helps you navigate scenes in VR when you either don't have enough physical space to walk around, or when the scene or the distance to travel is much larger than your physical area.

The teleport tool is active by default, and how to use it depends on your VR headset and controllers. In this case, I'm using the Valve Index controllers, which let me trigger the teleporter with the touchpad. By touching the pad with my thumb, a teleport indicator is shown that highlights the area and the direction I will be teleported to. I can turn my wrist to adjust the direction I'm facing. When I press the touchpad, the teleport is executed. Most controllers use this two-step approach to teleporting. When I try to teleport out of range, the teleporter indicates this by turning red.

I can adjust the behavior of the teleporter either in the VR menu or in the VR preferences. In the VR menu, I have the option to teleport either on the ground plane or onto geometry. When you select teleportation on the ground plane, you can, for example, teleport into a car and explore the interior. The ground plane can also be calibrated with the “Calibrate Ground” option that you find in the VR menu.

In the preferences it is also possible to adjust the maximum teleportation distance.

The VR Tools

The VR menu offers some tools that can be useful for exploring the scene. There is a flashlight that lets you shine light into dark parts of your models. There is also a measure tool that shows you the distance between two points. You select the first point by holding and releasing the pointer; after pointing at a second location in your scene, the measurement is shown.

This option allows you to switch between hand and controller visuals. For some people, the hand representation is more comfortable, while for others, controllers might feel more natural. Here, you have the option to choose.

This menu allows you to select viewpoints that are defined on your camera. As mentioned earlier, you can define which properties of a viewpoint are applied. For some users, it might be more comfortable to keep the current viewing direction to prevent disorientation. For each viewpoint, you can define whether it should be shown in the VR menu.

Very similar to the viewpoint menu is the variant set menu. This menu shows all variant sets and variant set groups you have defined in your scene. You can toggle geometry switches, change materials and environments, and do practically everything else you can do with variant sets. By default, all variant sets are shown in the VR menu, but, as with viewpoints, you can specify whether a variant set group or a single variant set is included.

Of course, it’s also possible to attach scripts to your variant sets. This makes it possible to include scripted tools or to change render settings while in VR. In this case, I can reduce the real-time antialiasing quality when the performance is no longer comfortable and thereby increase my frame rate.
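Such a script uses the same Python API that is available in VRED's Script Editor. As a minimal sketch: selectVariantSet() is part of VRED's Python API, while the variant set names below are hypothetical placeholders for sets defined in your own scene.

    # Minimal sketch of a script attached to a variant set.
    # selectVariantSet() is a VRED Python function; the variant
    # set names are hypothetical placeholders.
    selectVariantSet("Interior_Leather_Black")  # chain other variant sets
    selectVariantSet("Rims_21_Inch")
    print("Variant combination applied")        # log to the terminal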

Touchpoints

As seen in a previous video on interactions by Christian, touch sensors can be used to turn objects in your scene into buttons. These buttons can trigger variant sets and therefore pretty much anything you want. In VR, touch sensors work exactly the same way: you can select interactive objects with the pointer of your controller and trigger the linked action.

Touch sensors can be created by opening “Interaction” in the menu bar, where you find the Touch Sensors dialog. You can drag any node from the Scenegraph there to create a touch sensor. When you then drag a variant set onto that new entry, you have created a button that activates this variant set. This makes it very easy to create buttons that change materials on a material switch or cycle through geometry variants.

In VR, it can be problematic to use a node with a lot of geometry in its child components as a touch sensor. This is because, at startup, VRED builds a collision model for every touch sensor, which can take a long time for nodes with lots of geometry.

To get better performance, you can instead use a proxy geometry, built from a single sphere or box shape, that acts as the touch sensor. If you use a lot of touch sensors in your scene, this can significantly improve your scenes' startup time.

You attach the proxy shape to the node that contains your actual hit target. Then apply a material that is 100% transparent to this proxy, so that it is invisible. You also have to change the preferences so that it is possible to select transparent objects. Then define your touch sensor as usual and link it to a variant set.

VRED offers many VR tools out of the box and supports a wide variety of headsets, both professional and consumer hardware. The included tools make it possible to quickly start presentations in VR.

If you are interested in developing your own VR tools, the script examples might help you. They are available in the menu bar under File > Open Examples and show different ways of adding custom tools to the VR menu.
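As a rough sketch of what such a tool can look like, the snippet below is modeled on those script examples. It assumes the vrImmersiveUiService API that newer VRED versions use for VR menu tools; the signal name and the variant set name are assumptions, so check the script examples and the Python documentation of your version for the exact names.

    # Sketch of a custom VR menu tool, modeled on VRED's script
    # examples. vrImmersiveUiService and the signal name below are
    # assumptions -- verify against your VRED version.

    def onMyToolClicked():
        # Placeholder action: trigger a hypothetical variant set.
        selectVariantSet("Exterior_Red")

    myTool = vrImmersiveUiService.createTool("MyTool")  # internal name
    myTool.setText("My Tool")                           # label shown in the VR menu
    myTool.signal().clicked.connect(onMyToolClicked)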

I hope you enjoyed this overview of interactions in VR. Thank you for watching and see you next time!