Reality XR Game Glove Setup

Objective

This knowledge base article shows you how to set up the Reality XR Game Glove for use with the SteamVR (OpenVR) and OpenXR runtimes. You will learn how to calibrate your gloves for gesture recognition and hand animation to get the most accurate interactions in VR apps and games.

The Reality XR Train and Reality XR Game gloves are interchangeable hardware; however, the XR Train and XR Game apps are different software. The XR Train app is designed for companies developing custom applications for enterprise training use cases.
This guide assumes you have already connected your headset to your PC and have SteamVR installed if using OpenVR, or are using Meta Quest Link or Virtual Desktop to wirelessly stream PCVR games and apps to your Meta Quest headset.
It also assumes you have an XR app or game already set up and configured with either the OpenXR or SteamVR (OpenVR) runtime.

Required Materials

Software

The software below is required for this guide, except items marked as optional:

Description              | Version
StretchSense XR Game App | 0.1.1

Hardware

This guide supports the following hardware options:

StretchSense Gloves                 | Image
StretchSense Reality XR Game Gloves | image-20250326-012615.png
StretchSense Studio and Pro Fidelity gloves are not compatible with the StretchSense OpenXR driver.

First Time Setup

XR Game App Setup

  1. (optional) If you plan to run OpenVR apps and games in SteamVR, ensure that SteamVR is already installed on your PC and that Steam is installed at C:\Program Files (x86)\Steam. You can skip this step if you are using the Meta Quest Link (Oculus) or Virtual Desktop (VDXR) OpenXR runtimes.

  2. Install the XR Game app.

  3. Run the XR Game app.

Connecting Reality XR Game Gloves

  1. Turn on your Reality Gloves.

  2. In the XR Game app, open the Settings menu.

    Screenshot 2025-04-11 124339.png
  3. The BLE glove connection dropdown menus will appear, where you can select and connect your gloves:

    Screenshot 2025-05-05 120405.png
  4. Assign the detected Reality Gloves to their respective hands:

    1. Left BLE Glove: Reality Glove (L)

    2. Right BLE Glove: Reality Glove (R)

In some environments with a lot of Bluetooth interference, it may take some time for the gloves to show up in the list. If they are not showing up, leave the gloves turned on and try closing the Settings menu and reopening it to force the list to refresh.
  5. Close the Settings menu.

  6. You will be prompted to calibrate the gloves for the first time.

In an environment where multiple Reality Gloves are detected, we recommend only turning on the desired pair during initial connection. The XR Game app will remember the BLE ID of the glove pair for future sessions. If you switch between glove sizes, you will need to manually select the new pair in the list, so you may want to physically label each glove with a tag containing the last 4 digits of its BLE ID.

Using Your Reality XR Game Gloves

Reality XR Game Glove Calibration Explained

If mounting trackers directly to the glove, use the included Reality Glove mounting clip, 1/4” screw, and spacer, and follow the instructions in the printed Quick Start Guide that came with your gloves. You may need to adjust the rotation of the trackers if your virtual hands appear backwards. It is important to attach the tracker prior to calibration, as it will affect the sensor values during normalization and possibly button gesture training and operation.
If you are using HTC Wrist Trackers or the Meta Quest Controller strap, skip this step.

Before Reality Gloves can be used in XR games, applications, and VMC streaming, they must be calibrated. The calibration process is divided into two parts: the articulation calibration, which calibrates your hand animation data, and the gesture control calibration, which calibrates the button and joystick emulation.

Glove Articulation Calibration Process

Always calibrate and train with the trackers mounted to your gloves. This ensures the calibration accurately reflects the additional weight and pull from the mount and trackers.
  1. Once you have connected your gloves, you will be presented with the screen below. Click the Preview Animation button to preview the calibration routine.

    Screenshot 2025-05-05 121316.png
  2. Click Start Capture and follow along with the example animation with your hand. Note that keeping in time with the example animation is important for getting a good calibration.

    hand4.gif
We recommend re-calibrating at the start of each session to compensate for changes in how the gloves sit on your hands.

Glove Gesture Control Calibration Process

Training a Button Gesture

When binding each button gesture, rotate your wrist to train the model to expect movement. This will make the gesture easier to detect while your hand is moving.

  1. First, we will train an Idle hand gesture to act as a baseline gesture. If this is your first calibration, the Idle gesture will be automatically selected. Form a resting hand pose, gently rotate your wrist, and click Bind Pose to capture the idle gesture. Remember to keep rotating your wrist until the training process is complete.

    Screenshot 2025-06-05 200205.png

    Idle.png

     

    Once captured, the button icon will turn from dark grey to white, indicating data has been captured. The Bind Pose text on the button will also change to Tune. Tuning is the process of adding more data to the machine learning model. The more data you give it, the stronger the calibration.

    Screenshot 2025-05-05 120734.png
  2. Do this for all the buttons you need on both hands. For best results, we recommend leaving buttons untrained if your XR app or game does not use them.

    Screenshot 2025-05-05 120712.png

Once you have bound a pose, we recommend tuning it to add additional data to the model. The gesture control system can detect incredibly subtle differences between poses, but this also means that if you don’t make the exact pose, the system may not trigger. By tuning a pose you add additional data to the model and broaden the range of the pose.

The way you form a gesture or pose will change with context: in tense situations your hands will be more tense, but if you’ve been gaming for hours your hands may take a more relaxed or tired pose. To ensure a reliable experience, account for this when you Tune your gloves: tune data for a tight or tense pose, and also for a loose or relaxed pose.
You will find links to lists of recommended gestures at the bottom of this page. This is particularly important for non-intuitive gesture outputs, like Menu, where it’s important to find a pose you won’t accidentally trigger.

Refining your Gesture using Tune or Idle

Gestures are additive, so once you’ve bound a gesture you can click Tune to capture additional data. This will add more training data to the original capture to improve the button’s gesture detection. It is important that any data added is close to the original gesture trained. Adding dissimilar data will lead to poor results.

Screenshot 2025-05-05 122826.png

You can also decrease the range of detection by training the Idle gesture while making a hand pose that is similar, but not intended to activate a button output. You are in effect training the system when not to trigger an event.
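To make this additive behaviour concrete, here is a minimal sketch, assuming a hypothetical 1-nearest-neighbour classifier over 10-channel stretch-sensor vectors (the XR Game app’s actual model is not public): Bind Pose and Tune both simply append labelled samples, and Idle samples captured near a button pose shrink that button’s effective range.

```python
# Minimal sketch of an additive gesture model. The classifier, channel
# count, and sample values are assumptions for illustration only.
import math

class GestureModel:
    def __init__(self):
        self.samples = []  # (label, sensor_vector) pairs; purely additive

    def bind_or_tune(self, label, vector):
        # Bind Pose and Tune both just append more labelled data.
        self.samples.append((label, list(vector)))

    def classify(self, vector):
        # The label of the closest training sample wins.
        label, _ = min(self.samples, key=lambda s: math.dist(s[1], vector))
        return label

model = GestureModel()
model.bind_or_tune("Idle", [0.1] * 10)  # baseline resting pose
model.bind_or_tune("A", [0.8] * 10)     # Bind Pose for button A
model.bind_or_tune("A", [0.7] * 10)     # Tune: broadens A's range
model.bind_or_tune("Idle", [0.5] * 10)  # Idle near-miss: narrows A's range
print(model.classify([0.55] * 10))      # -> "Idle" instead of triggering A
```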

Screenshot 2025-04-11 121346.png

If you accidentally train or tune a button gesture incorrectly, you can press and hold the delete button at the top of the button dashboard to erase the previously trained data from the model, just for that button.

The fastest way to train a robust gesture model that needs minimal tuning between sessions is as follows:

  1. Train the model.

  2. Take the gloves off, then put them back on.

  3. Test the gesture.

  4. If the gesture doesn’t trigger the correct button output, add more training data by tuning the gesture.

  5. Repeat this process until the gesture works consistently without needing more data.

Once your model reaches this point, your gloves will require very little setup time in future sessions. However, we still recommend testing the model at the start of each session to check if any tuning is needed.

Adjusting Grip and Trigger Sensitivity

While most buttons are binary, the Grip and Trigger buttons output both a scalar and a binary value with an adjustable activation threshold. They are trained the same way as other buttons: the scalar output is a continuous value indicating the level of activation, and the binary output signifies whether the activation level has exceeded the user-defined activation threshold.
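The exact mapping from pose to output is internal to the app, but the sketch below illustrates the relationship between the scalar value, the sensitivity slider, and the activation threshold; the formula and parameter names are assumptions for illustration only.

```python
# Sketch of mapping a scalar grip/trigger activation to a binary button
# state. The sensitivity slider and the white threshold line exist in the
# app; this formula is an assumed illustration, not the real algorithm.

def grip_outputs(raw_activation, sensitivity=1.0, threshold=0.75):
    """raw_activation: 0.0 (open hand) .. 1.0 (full grip)."""
    # A higher sensitivity makes the bar climb faster for the same pose.
    scalar = min(1.0, raw_activation * sensitivity)
    pressed = scalar >= threshold  # binary state crosses the white line
    return scalar, pressed

print(grip_outputs(0.6, sensitivity=1.0))  # (0.6, False): under threshold
print(grip_outputs(0.6, sensitivity=1.5))  # (0.9, True): slider raised
```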

  1. For the grip and trigger buttons, click the Settings icon while the button is selected and use the sensitivity slider to adjust how quickly the button reaches the activation threshold (indicated by the white line in the bar).

Screenshot 2025-05-05 132205.png
Screenshot 2025-05-05 132220.png
  2. Find a sensitivity value that gives a good smooth range of movement for the grip and trigger sliders, while consistently allowing the yellow bar to pass beyond the activation threshold.

Screenshot 2025-05-05 132257.png
At any time during gesture training, you can delete a gesture you’re not happy with. This will remove the previously trained data for that gesture and allow you to start fresh without affecting other trained gestures.

Training Multi-Button Gestures

After training the base set of gestures you can train button combinations to allow buttons to be pressed at the same time (e.g. Grip + Trigger). In the case of Grip + Trigger, this allows held virtual objects to be activated while held in the hand (e.g. a trigger to drive a power tool).
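Conceptually, each trained gesture (single or combo) maps to the set of emulated buttons it presses, which is why Grip + Trigger can coexist with Grip and Trigger on their own. A minimal sketch, with hypothetical labels standing in for the app’s internal model:

```python
# Sketch of how single-button and multi-button gestures coexist: each
# trained label maps to the set of emulated buttons it presses. The labels
# and mapping are hypothetical stand-ins for the app's internal model.

COMBO_OUTPUTS = {
    "Idle":         set(),
    "Grip":         {"grip"},
    "Trigger":      {"trigger"},
    "Grip+Trigger": {"grip", "trigger"},  # trained in multi-select mode
}

def pressed_buttons(gesture_label):
    # Unknown labels press nothing, like an untrained gesture.
    return COMBO_OUTPUTS.get(gesture_label, set())

# Holding a virtual power tool (Grip) while squeezing its trigger:
print(sorted(pressed_buttons("Grip+Trigger")))  # ['grip', 'trigger']
```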

  1. Start training multiple buttons by clicking the multi-select checkbox in the top left corner of a hand’s button gesture pane.

    1. While in this mode, you can click multiple buttons to add them to the selection. Or you can uncheck multi-select to go back to single button training mode:

      Screenshot 2025-04-10 162949.png
  2. Each selected button will highlight white, and the selected buttons will appear in the top pane (as below):

    Screenshot 2025-05-05 131808.png

     

  3. Hit the Bind Pose button to train that multi-button gesture. A list of trained multi-button gestures will appear at the bottom of the pane.

    Screenshot 2025-05-05 131935.png

     

  4. Later on, you can click the combo in the list to further Tune it or delete it:

    Screenshot 2025-05-05 132007.png
For best performance with the Grip + Trigger button combination, once the combo is trained, exit multi-button mode and tune Grip and Trigger individually. Also adjust the sensitivity for the Grip and Trigger buttons separately if they are not quite reaching their thresholds when both are activated.

Once you have completed both calibration processes, be sure to test inside your application or a SteamVR game.

Disabling Gesture Control Output

For some use-cases where gesture detection is only needed periodically and where the hand animation will drive interaction (e.g. direct touches with the index finger), you may wish to disable the button gestures and leave the hand articulation visible.

Use the disable button gesture output feature while streaming or live performing, especially when handling props or typing on a keyboard, for maximum stability.
  1. Click the Disable Controls button and click Bind Pose.

    Screenshot 2025-05-05 120846.png
  2. Whenever you make the Disable Controls gesture and hold it for 3 seconds, it will disable button emulation output to SteamVR/OpenVR or OpenXR and only send hand articulation data (see the timing sketch after this list).

  3. An audible tone will play in the XR Game app when the toggle is made, to indicate the toggle was successful.

  4. The process is the same to re-enable controls, just hold the gesture for 3 seconds.
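The hold-to-toggle behaviour can be pictured as a simple per-frame timer, as in this sketch; the 3-second hold matches the steps above, while the polling structure and names are assumptions.

```python
# Sketch of hold-to-toggle timing, assuming the app polls the gesture
# classifier every frame. HOLD_SECONDS matches the documented 3 s hold;
# everything else is illustrative.
import time

HOLD_SECONDS = 3.0

class DisableControlsToggle:
    def __init__(self):
        self.output_enabled = True
        self.hold_started = None  # timestamp when the gesture first appeared

    def update(self, gesture_active, now=None):
        now = time.monotonic() if now is None else now
        if not gesture_active:
            self.hold_started = None      # releasing resets the timer
        elif self.hold_started is None:
            self.hold_started = now       # gesture just started
        elif now - self.hold_started >= HOLD_SECONDS:
            self.output_enabled = not self.output_enabled  # play tone here
            self.hold_started = None      # require a fresh 3 s hold
        return self.output_enabled

toggle = DisableControlsToggle()
for t in (0.0, 1.0, 2.0, 3.1):            # simulated frame timestamps
    enabled = toggle.update(gesture_active=True, now=t)
print(enabled)  # False: button output disabled after a 3.1 s hold
```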

Pinky Touch Helper Gesture

Screenshot 2025-05-23 164120.png

VRChat users can train a pinky touch helper gesture. When this gesture is performed, it will snap the animation output into a pinky touch pose, which is compatible with VRChat’s built-in gesture system. The following actions are triggered by this gesture in VRChat:

  • Left Hand Pinky Touch - Toggles VRChat’s gesture system on and off. We recommend setting this to off, as VRChat’s system can interfere with the Reality XR Game controller emulation.

  • Right Hand Pinky Touch - Toggles VRChat’s floating radial action menu. This menu is positioned in world space and can be navigated using laser pointer controls, making it ideal if you have trained and set the XR Game joystick to D-Pad mode but still want to toggle avatar features or summon and control the VRChat camera for stills and video streaming.

We recommend training this by pressing your thumb to your distal knuckle, as pictured. This makes the gesture easily repeatable.

Pinky Touch

pinky-touch.jpg

Calibrating the Digital Joystick

The D-Pad allows discrete calibration of the joystick Up / Right / Down / Left buttons with individual gestures, and allows calibration of a single direction if only one is required (e.g. Up for activating the teleport raycaster).

For the fastest setup and ease of use, we recommend controlling direction of travel by looking where you want to travel and using the StretchSense D-Pad joystick to move forwards. This minimizes the controls a user needs to learn and provides a reliable and intuitive experience.

An analog/scalar joystick is available; however, it takes longer to set up and tune, so we only recommend it for advanced users and where necessary. For information on the analog/scalar joystick, see the left-hand sub-menu.

Any gesture can be used to control any D-Pad direction. We recommend using a “horns” gesture for all joystick directions, as it is not a pose you will accidentally trigger mid-session. The position of the thumb can then be used to control the direction of travel.
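Internally, you can think of each trained D-Pad direction as snapping the emulated joystick to a fixed axis value. The sketch below assumes a conventional (x, y) axis layout; this is an illustration, not the app’s documented behaviour.

```python
# Sketch of discrete D-Pad gestures mapping to emulated joystick axes.
# Gesture labels follow the guide; the (x, y) values are an assumed
# controller convention (x: left/right, y: forward/back).

DPAD_AXES = {
    "Neutral": (0.0, 0.0),
    "Up":      (0.0, 1.0),   # e.g. the "horns" pose, thumb over knuckle
    "Down":    (0.0, -1.0),
    "Left":    (-1.0, 0.0),
    "Right":   (1.0, 0.0),
}

def joystick_axes(dpad_label):
    # Untrained directions never fire, so they fall back to neutral.
    return DPAD_AXES.get(dpad_label, (0.0, 0.0))

print(joystick_axes("Up"))       # (0.0, 1.0): smooth locomotion forward
print(joystick_axes("Neutral"))  # (0.0, 0.0)
```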

To calibrate the D-Pad joystick:

  1. Click the Joystick center circle (neutral joystick) to select it.

  2. To bind the neutral joystick position, make a “horns” gesture and ensure your thumb is sitting over the knuckle, as in the ‘Neutral / Center’ image below, and click the Bind button to train the gesture.

  3. Repeat the above steps to bind for the Up/Forward joystick direction.

Neutral / Center | Up / Forward
d-pad-center.jpg | d-pad-up.jpg

Training Additional Directions

joystick-training2.gif

If needed, train these additional directions for controlling backwards and strafing/turning movement for smooth locomotion.

Using the recommended poses below allows you to treat the up/down and center directions as if the top of your middle finger was a touchpad. For the left/right directions, pivot your thumb left and right to avoid accidental activations. Remember to swap the left/right thumb direction when training on the opposite hand so the interaction feels intuitive:

Down           | Left           | Right
d-pad-down.jpg | d-pad-left.jpg | d-pad-right.jpg
Train the up / down D-Pad directions on your left hand for movement and the left / right D-Pad directions for rotating left and right. In many apps it is customary to bind teleport ray activation to the up direction on one hand.
You can also use the D-Pad to train additional button gestures that can be mapped to SteamVR controller binding actions for more complex XR apps.
It is currently not possible to re-bind the D-Pad to other game functions in OpenXR apps not running in the SteamVR runtime. We hope to add more flexibility in the future. In the meantime, as an OpenXR app developer you can set your own custom bindings within Unity’s Input System or the equivalent in Unreal Engine.

We recommend configuring your in-game controls to perform smooth locomotion for forwards/backwards movement using the left hand’s up/down D-Pad directions. You will have greater precision by using your entire body to turn in game while using StretchSense gloves, but you can optionally train the left or right joystick direction on your right hand if you don’t want to physically turn or prefer to play seated.

Where possible it is always best to minimize the number of controls as this reduces the cognitive load on new users.

Working with Trained Models

Saving a Trained Model

Once you have trained a model for a hand:

  1. Click Save.

    Screenshot 2025-04-11 121203.png
  2. You will be prompted to save the button gesture model and hand articulation animation training data as a single DAT file.

  3. Give the model a descriptive name (e.g. v0.1.0 Grab Trigger) to help you remember which XR Game app version and buttons the model was trained on when loading it next time.

Save numbered or descriptive versions if you are testing various gestures and need to easily roll back to a previous model.
You can save and load models optimized for different SteamVR games and SteamVR controller binding profiles. Keep these organized in different folders under C:\Users\USERNAME\AppData\LocalLow\StretchSense\XR Game.

It’s important to remember to save your model after Training or Tuning your gestures.

Loading a Previously Trained Model

When you start the XR Game app, the model you last used for gestures and articulation (hand animation) will load automatically on startup.
  1. Perform a calibration for both hands as described in Reality Glove Calibration.

  2. To use a different model than the one loaded automatically, click the Load button in the top menu and select a previously saved model on disk.

    Screenshot 2025-04-11 121246.png

Recommended Gestures

You can set any gesture as any button, but it’s important to use gestures you won’t trigger by accident. For first-time users, it can be useful to follow our recommended gesture-button configurations rather than figuring everything out yourself.

  • Recommended Button Gestures: If your game or application already supports one or more of the A, B, Trigger, Grip, and Menu/System buttons, see this configuration guide.

Changelog

Changelog | Version | Publish Date (YYYY/MM/DD)
Updated to reflect the combining of the normalisation and articulation calibration processes | v1.0.5 | 2025/07/04
Updated tuning section wording and removed recommended gestures for standalone applications. | v1.0.4 | 2025/06/25
Minor fix for down D-Pad gesture. Changed from “index finger” to “middle finger” in the text description. | v1.0.3 | 2025/06/17
Added D-Pad joystick instructions, did a pass to remove duplication and simplify language, moved Tuning section further up the page to sit below the gesture calibration process instructions | v1.0.2 | 2025/06/05
Add new normalization animation | v1.0.1 | 2025/05/27
Initial Version | v1.0 | 2025/05/23