Week 5: Gesture, Locomotion & Spatial UI

Virtual, Augmented and Spatial Computing

Week 5 Overview

  • Gesture design and recognition
  • Locomotion techniques in VR
  • Spatial UI design principles
  • Diegetic vs non-diegetic interfaces
  • Comfort and accessibility

Gesture Design

What makes a good XR gesture?

  • Distinct — not confused with natural movement
  • Comfortable — low fatigue, repeatable
  • Discoverable — findable without instruction
  • Reversible — easy to cancel

Gesture Vocabulary

  • Pinch — select, confirm
  • Open palm — cancel, stop
  • Point — navigate, indicate
  • Two-hand spread — scale object
  • Two-hand rotate — rotate object
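
The two-hand spread gesture maps naturally to a scale factor: the ratio of the current hand separation to the separation when the gesture began. A minimal Unity sketch (the hand-anchor Transforms and the BeginScale trigger are assumptions, wired up elsewhere):

```csharp
using UnityEngine;

public class TwoHandScale : MonoBehaviour
{
    [SerializeField] Transform leftHand;   // tracked hand anchors, assigned in the Inspector
    [SerializeField] Transform rightHand;
    [SerializeField] Transform target;     // object being scaled

    float startDistance;
    Vector3 startScale;

    // Call when both hands begin the spread gesture (trigger is an assumption).
    public void BeginScale()
    {
        startDistance = Vector3.Distance(leftHand.position, rightHand.position);
        startScale = target.localScale;
    }

    void Update()
    {
        if (startDistance <= 0f) return;
        // Scale factor = current hand separation / separation at gesture start.
        float factor = Vector3.Distance(leftHand.position, rightHand.position) / startDistance;
        target.localScale = startScale * factor;
    }
}
```

Scaling from the gesture-start baseline (rather than frame-to-frame deltas) keeps the mapping stable and makes the gesture easy to cancel by releasing both hands.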

Gesture Recognition in Unity

// Example: detect an index-finger pinch via the Meta XR SDK's OVRHand,
// which exposes pinch strength directly. (The XR Hands package exposes
// joint poses instead; there, pinch is derived from thumb-index distance.)
using UnityEngine;

public class PinchDetector : MonoBehaviour
{
    [SerializeField] OVRHand rightHand;   // assign the right OVRHand in the Inspector
    const float PinchThreshold = 0.9f;
    bool wasPinching;

    void Update()
    {
        float strength = rightHand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        bool isPinching = strength > PinchThreshold;
        if (isPinching && !wasPinching)
            OnPinch();                    // fire once when the pinch starts
        wasPinching = isPinching;
    }

    void OnPinch() { /* select / confirm */ }
}

Locomotion in VR

Locomotion is the #1 cause of simulator sickness

The Core Problem

  • Physical body stays still
  • Virtual body moves
  • Vestibular system detects mismatch → nausea

Locomotion Techniques

  • Teleportation — comfort ★★★★★, immersion ★★★ — general VR
  • Smooth locomotion — comfort ★★★, immersion ★★★★★ — games, training
  • Room-scale — comfort ★★★★★, immersion ★★★★★ — small spaces
  • Arm-swinging — comfort ★★★★, immersion ★★★★ — active experiences
  • Redirected walking — comfort ★★★★★, immersion ★★★★★ — large spaces (research)

Teleportation Design

  • Arc indicator — shows landing zone
  • Valid/invalid zones — colour-coded feedback
  • Fade transition — reduces disorientation
  • Orientation control — let user choose facing direction
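
The fade transition above can be sketched as a short fade-to-black, move, fade-in sequence. This assumes a full-screen black overlay with a CanvasGroup on the rig's camera canvas (the overlay setup is an assumption, not part of any teleport component):

```csharp
using System.Collections;
using UnityEngine;

public class TeleportFade : MonoBehaviour
{
    [SerializeField] CanvasGroup fadeOverlay;  // full-screen black overlay image
    [SerializeField] float fadeTime = 0.15f;   // short fades feel snappier

    public IEnumerator FadeTeleport(Transform rig, Vector3 destination)
    {
        yield return Fade(0f, 1f);     // fade to black
        rig.position = destination;    // move the rig while the screen is dark
        yield return Fade(1f, 0f);     // fade back in
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            fadeOverlay.alpha = Mathf.Lerp(from, to, t / fadeTime);
            yield return null;
        }
        fadeOverlay.alpha = to;
    }
}
```

Moving the rig only while the screen is fully black is what reduces disorientation: the user never sees the viewpoint jump.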

Unity Implementation

  • TeleportationProvider + TeleportationArea
  • XRRayInteractor with teleport mode enabled
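
A sketch of how those pieces connect in code (normally this wiring is done in the Inspector; component and property names follow XR Interaction Toolkit 2.x and may differ across versions):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class TeleportSetup : MonoBehaviour
{
    [SerializeField] GameObject floor;  // a walkable surface in the scene

    void Start()
    {
        // The rig needs one TeleportationProvider to execute teleport requests.
        var provider = gameObject.AddComponent<TeleportationProvider>();

        // The ray interactor aims the teleport; a projectile curve
        // gives the familiar arc indicator.
        var ray = gameObject.AddComponent<XRRayInteractor>();
        ray.lineType = XRRayInteractor.LineType.ProjectileCurve;

        // Surfaces that accept teleports get a TeleportationArea, which
        // forwards landing requests to the provider.
        var area = floor.AddComponent<TeleportationArea>();
        area.teleportationProvider = provider;
    }
}
```
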

Smooth Locomotion Comfort

If using smooth locomotion:

  • Vignette — narrow FOV during movement reduces sickness
  • Speed limit — keep below 3 m/s for most users
  • Snap turning — 30–45° increments instead of smooth rotation
  • Comfort mode toggle — always offer teleport as alternative
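
The vignette guideline above can be sketched as a speed-driven aperture: full field of view at rest, narrowing toward a minimum as speed approaches the cap. The material's "_Aperture" property name is an assumption for whatever vignette shader is in use:

```csharp
using UnityEngine;

public class ComfortVignette : MonoBehaviour
{
    [SerializeField] Material vignetteMaterial; // vignette shader material (assumed)
    [SerializeField] float maxSpeed = 3f;       // matches the suggested 3 m/s cap
    [SerializeField] float minAperture = 0.6f;  // narrowest visible aperture while moving

    Vector3 lastPosition;

    void LateUpdate()
    {
        // Estimate rig speed from frame-to-frame displacement.
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        // Full view at rest, minAperture at (or above) the speed cap.
        float t = Mathf.Clamp01(speed / maxSpeed);
        vignetteMaterial.SetFloat("_Aperture", Mathf.Lerp(1f, minAperture, t));
    }
}
```
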

Spatial UI Design

The Problem with Flat UI in XR

  • 2D panels feel out of place in 3D space
  • Fixed HUD elements cause eye strain (vergence-accommodation conflict)
  • Menus that follow the user are annoying

Three UI Attachment Strategies

  • World-locked — attached to the scene (e.g. a sign, or a button on a wall)
  • Body-locked — attached to the user, with lag (e.g. a wrist menu)
  • Head-locked — fixed to the camera (e.g. a crosshair or HUD)
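
The "with lag" part of body-locked UI is usually a lazy follow: the panel eases toward a point in front of the user instead of being rigidly parented to the camera. A minimal sketch (the head Transform is assumed to be the main camera):

```csharp
using UnityEngine;

public class LazyFollowUI : MonoBehaviour
{
    [SerializeField] Transform head;       // main camera transform
    [SerializeField] float distance = 0.6f;
    [SerializeField] float smoothing = 4f; // higher = snappier follow

    void LateUpdate()
    {
        // Target a point in front of the head, tracking yaw only so the
        // panel does not bob with small head movements.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 target = head.position + forward * distance;

        transform.position = Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```

The easing is what distinguishes this from a head-locked HUD: the panel stays reachable without rigidly tracking every head motion.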

Diegetic vs Non-Diegetic UI

  • Diegetic — exists in the world (e.g. a health bar on the character's chest)
  • Non-diegetic — overlaid on the world (e.g. a traditional HUD)
  • Spatial — in the 3D world but not part of its fiction (e.g. floating damage numbers)
  • Meta — outside the world (e.g. a pause menu)

Best practice for VR: Prefer diegetic UI — it maintains immersion and avoids vergence issues.

Wrist Menu Pattern

A common and effective body-locked UI pattern:

  1. User raises wrist (palm up)
  2. Menu appears on inner wrist
  3. User selects with other hand
  4. Wrist lowered = menu dismissed

Advantages: Discoverable, low fatigue, natural gesture
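
The palm-up trigger in steps 1 and 4 can be sketched as a dot-product test between the palm normal and the direction to the head. The wrist anchor and menu object are assumptions, and which local axis is the palm normal varies by SDK:

```csharp
using UnityEngine;

public class WristMenu : MonoBehaviour
{
    [SerializeField] Transform wrist;   // tracked wrist/palm anchor
    [SerializeField] Transform head;    // main camera
    [SerializeField] GameObject menu;   // UI panel on the inner wrist

    void Update()
    {
        // Palm-up check: does the palm normal point toward the head?
        Vector3 toHead = (head.position - wrist.position).normalized;
        float facing = Vector3.Dot(wrist.up, toHead); // wrist.up assumed to be the palm normal

        // cos(45°) ≈ 0.7, so this accepts roughly a 45-degree cone; tune per device.
        menu.SetActive(facing > 0.7f);
    }
}
```

In practice the threshold is often hysteretic (a higher value to show than to hide) so the menu does not flicker at the boundary.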

Accessibility in XR Interaction

  • Seated mode — all interactions reachable from chair
  • One-handed mode — full functionality with one controller
  • Comfort presets — locomotion, FOV, speed options
  • Text size — minimum 24pt at 1m viewing distance
  • Colour contrast — don’t rely on colour alone

Lab Preview

This week’s lab:

  • Implement teleportation in Unity
  • Build a wrist menu
  • Compare diegetic vs non-diegetic UI approaches
  • Test on Quest 2 and Vive Pro

Key Takeaways

  • Gesture design requires careful attention to comfort and discoverability
  • Locomotion is the biggest comfort challenge in VR — always offer teleport
  • Spatial UI should prefer diegetic approaches
  • Accessibility is not optional — design for seated, one-handed, and comfort-sensitive users
