Week 5 Lecture Notes: Gesture, Locomotion & Spatial UI

Virtual, Augmented and Spatial Computing

1 Overview

Week 5 builds on the input fundamentals from Week 4 to address three interconnected design challenges: gesture vocabularies, locomotion comfort, and spatial UI design. These are the areas where XR design diverges most sharply from conventional UX practice.


2 Gesture Design

2.1 Principles of Good Gesture Design

Gesture design in XR is constrained by both the capabilities of tracking systems and the ergonomics of the human hand. A gesture vocabulary should be:

Distinct: Each gesture should be clearly different from others in the vocabulary, and from natural resting hand positions. Ambiguous gestures cause false positives.

Comfortable: Gestures that require sustained tension (e.g., holding a spread-finger pose) cause rapid fatigue. Prefer gestures that can be performed and released quickly.

Discoverable: Users should be able to find gestures through exploration or minimal instruction. Gestures that require memorisation of a specific hand shape are problematic for casual users.

Reversible: Every gesture should have a clear cancel or undo path. Users need to feel safe experimenting.

2.2 Standard Gesture Vocabulary

Most XR platforms have converged on a small set of standard gestures:

Gesture               | Trigger                        | Common Use
----------------------|--------------------------------|--------------------
Index pinch           | Index + thumb contact          | Primary select
Middle pinch          | Middle + thumb contact         | Secondary action
Open palm             | All fingers extended           | Cancel / stop
Point                 | Index extended, others curled  | Navigate / indicate
Two-hand pinch-spread | Both hands pinch then separate | Scale
Two-hand rotate       | Both hands pinch and rotate    | Rotate object

2.3 Custom Gesture Recognition

For custom gestures beyond the standard vocabulary, Unity’s XR Hands package provides a pose detection system. Key considerations:

  • Define poses using joint angle thresholds, not absolute positions
  • Test across multiple hand sizes (children, large hands, small hands)
  • Provide visual coaching for non-standard gestures
  • Log false positive rates during testing
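As a sketch of the joint-angle approach, the Python below detects a pinch and a point pose from simplified hand data. The function names, joint representation, and thresholds are all illustrative; the actual XR Hands API is C# and exposes richer per-joint data.

```python
import math

def is_pinch(thumb_tip, index_tip, max_dist_m=0.02):
    """Pinch = thumb and index fingertips within ~2 cm.
    Positions are (x, y, z) in metres; the threshold is illustrative."""
    return math.dist(thumb_tip, index_tip) <= max_dist_m

def is_point(curl_deg, extended_max=30.0, curled_min=60.0):
    """Point = index nearly straight while the other fingers are curled.
    curl_deg maps finger name to mean flexion angle in degrees, which is
    what makes the pose hand-size independent (angles, not positions)."""
    return (curl_deg["index"] <= extended_max and
            all(curl_deg[f] >= curled_min for f in ("middle", "ring", "little")))
```

Because the poses are defined as angle and distance ranges rather than exact values, the same definitions transfer across hand sizes; the ranges themselves should still be tuned against logged false positive rates, as the list above suggests.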

3 Locomotion

3.1 The Vestibular Conflict

Simulator sickness (cybersickness) occurs when visual motion signals conflict with vestibular (inner ear) signals. The vestibular system detects:

  • Linear acceleration
  • Rotational acceleration
  • Gravity direction

When the visual system shows movement that the vestibular system does not detect, the brain interprets this as potential poisoning (an evolutionary response) and triggers nausea.

Key insight: It is not speed that causes sickness — it is acceleration. Constant velocity movement is less sickening than acceleration/deceleration.

3.2 2.2 Locomotion Techniques in Detail

3.2.1 Teleportation

The most comfortable locomotion technique. The user points to a destination and is instantly transported. The visual discontinuity is handled by a brief fade or blink.

Design details:

  • Arc trajectory indicator (parabolic, not straight) feels more natural
  • Colour-code valid (green) and invalid (red) landing zones
  • Allow the user to set facing direction before confirming
  • Fade duration: 100–200ms (shorter = less disorienting)
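A minimal sketch of how the parabolic arc can be sampled: integrate a projectile path from the controller until it crosses the floor plane. The initial speed, time step, and flat-floor assumption are illustrative; engines typically provide this projection built in.

```python
def teleport_arc(origin, direction, speed=8.0, gravity=9.81, dt=0.02, max_steps=200):
    """Sample a parabolic arc from the controller pose.

    origin: (x, y, z) controller position in metres
    direction: unit vector along the controller's forward axis
    Returns (points, landing); landing is where the arc crosses y = 0,
    or None if the arc never reaches the floor within max_steps.
    """
    x, y, z = origin
    vx, vy, vz = (speed * c for c in direction)
    points = [(x, y, z)]
    for _ in range(max_steps):
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        vy -= gravity * dt          # gravity bends only the vertical component
        points.append((x, y, z))
        if y <= 0.0:                # crossed the floor: candidate landing zone
            return points, (x, 0.0, z)
    return points, None
```

The landing point would then be validated against the navigable area and colour-coded green or red before the user confirms.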

3.2.2 Smooth Locomotion

Continuous movement via thumbstick. High immersion, higher sickness risk.

Comfort mitigations:

  • Vignette: Dynamically narrow the FOV during movement. Reduces peripheral motion that triggers sickness.
  • Speed cap: 2–3 m/s maximum for general audiences
  • Snap turning: 30° or 45° increments instead of smooth rotation. Eliminates rotational vestibular conflict.
  • Acceleration curve: Ease in/out rather than instant full speed
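Two of these mitigations are simple enough to sketch directly; the FOV fractions, deadzone, and angles below are illustrative defaults, not platform values.

```python
def vignette_fov(speed_mps, max_speed=3.0, open_fov=1.0, closed_fov=0.6):
    """Narrow the visible FOV fraction linearly with movement speed:
    fully open at rest, tightest at the speed cap."""
    t = min(max(speed_mps / max_speed, 0.0), 1.0)
    return open_fov - t * (open_fov - closed_fov)

def snap_turn(yaw_deg, stick_x, increment=45.0, deadzone=0.7):
    """Rotate in discrete increments once the thumbstick passes the
    deadzone, instead of rotating smoothly."""
    if stick_x > deadzone:
        return (yaw_deg + increment) % 360.0
    if stick_x < -deadzone:
        return (yaw_deg - increment) % 360.0
    return yaw_deg
```

A real implementation would also latch the snap until the stick returns to centre, so one flick produces exactly one turn.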

3.2.3 Room-Scale

Physical walking within the tracked play area. Most comfortable and immersive, but limited to the physical space available. On Quest 2/3, the Guardian system defines the safe zone.

3.2.4 Arm-Swinging

User swings arms as if walking; system translates this into forward movement. More comfortable than thumbstick because it involves physical movement that partially matches the visual signal.

3.3 Comfort Settings Best Practice

Always provide a comfort settings menu with at minimum:

  • Locomotion type toggle (teleport / smooth)
  • Vignette on/off
  • Snap turn angle (15°, 30°, 45°, 60°)
  • Movement speed slider
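A sketch of how that menu's state might be held and validated; the field names and defaults are illustrative choices, not a platform API.

```python
from dataclasses import dataclass

@dataclass
class ComfortSettings:
    """Minimal comfort menu state, mirroring the list above."""
    locomotion: str = "teleport"   # "teleport" or "smooth"
    vignette: bool = True
    snap_turn_deg: int = 45        # one of 15, 30, 45, 60
    move_speed_mps: float = 2.0    # capped at 3 m/s for general audiences

    def __post_init__(self):
        if self.locomotion not in ("teleport", "smooth"):
            raise ValueError("locomotion must be 'teleport' or 'smooth'")
        if self.snap_turn_deg not in (15, 30, 45, 60):
            raise ValueError("snap turn angle must be 15, 30, 45 or 60 degrees")
        # Clamp rather than reject, so a bad saved value still loads safely
        self.move_speed_mps = min(max(self.move_speed_mps, 0.5), 3.0)
```

Validating at construction time means a corrupted or out-of-range saved preference can never put the user into an uncomfortable state.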


4 Spatial UI Design

4.1 Why Flat UI Fails in XR

Traditional 2D UI (panels, windows, HUDs) creates several problems in XR:

Vergence-accommodation conflict: The eyes converge on a UI panel at its rendered depth (e.g., 2m) while accommodating (focusing) to the headset’s fixed optical focal plane; in AR the conflict worsens when the user then refocuses on a real object at, say, 0.5m. The mismatch causes eye strain.

Depth inconsistency: A flat panel floating in 3D space looks artificial and breaks immersion.

Head-locked discomfort: UI that moves exactly with the head (like a traditional HUD) is uncomfortable — the eye cannot rest because the target never stops moving.

4.2 UI Attachment Strategies

World-locked UI: Attached to a fixed position in the scene. The user must move or look to interact with it. Best for environmental information (signs, labels, control panels).

Body-locked UI (with lag): Follows the user but with a slight delay and damping. Feels more natural than head-locked. The wrist menu is the canonical example.
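The “slight delay and damping” can be implemented as frame-rate-independent exponential smoothing toward the anchor point; the smoothing constant below is an illustrative choice.

```python
import math

def damped_follow(ui_pos, anchor_pos, dt, smoothing=5.0):
    """Move the UI a fraction of the remaining distance toward its
    body-anchored target each frame. Using 1 - exp(-k * dt) makes the
    lag independent of frame rate; a larger k means a tighter follow."""
    alpha = 1.0 - math.exp(-smoothing * dt)
    return tuple(u + alpha * (a - u) for u, a in zip(ui_pos, anchor_pos))
```

Called once per frame, the panel trails the body smoothly instead of snapping rigidly to the head, which is what makes it feel more natural than head-locked UI.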

Head-locked UI: Moves exactly with the head. Use sparingly — only for critical persistent information (e.g., a small battery indicator in the corner).

4.3 Diegetic UI Design

Diegetic UI exists within the world of the experience. It is the gold standard for immersive VR because it:

  • Eliminates vergence-accommodation conflict (rendered at world depth)
  • Maintains immersion
  • Provides spatial context

Examples:

  • Health displayed as a glowing bar on the character’s chest armour
  • Inventory shown as a physical backpack the user opens
  • Settings accessed via a physical control panel in the environment
  • Notifications displayed as floating text attached to world objects

4.4 The Wrist Menu Pattern

The wrist menu has become a standard pattern across XR platforms (used in Meta’s system UI, HoloLens, and many applications):

  1. User raises wrist with palm facing up
  2. Trigger: wrist rotation beyond ~60° from neutral
  3. Menu appears on inner wrist surface
  4. User selects items with opposite hand (ray or direct)
  5. Wrist lowered → menu dismissed

Design considerations:

  • Keep menu items large (minimum 2cm × 2cm)
  • Limit to 6–8 items maximum
  • Use icons + text labels
  • Provide haptic feedback on open/close
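Steps 2 and 5 of the pattern reduce to a threshold on wrist roll. The sketch below adds hysteresis (separate open and close angles, an assumption beyond the list above) so the menu does not flicker when the wrist hovers near the trigger angle.

```python
class WristMenu:
    """Wrist-menu visibility driven by wrist roll angle, in degrees."""

    def __init__(self, open_deg=60.0, close_deg=40.0):
        self.open_deg = open_deg    # palm rotated up past this: show menu
        self.close_deg = close_deg  # rotated back below this: dismiss
        self.visible = False

    def update(self, wrist_roll_deg):
        if not self.visible and wrist_roll_deg >= self.open_deg:
            self.visible = True
        elif self.visible and wrist_roll_deg < self.close_deg:
            self.visible = False
        return self.visible
```

The gap between the two thresholds means small tracking jitter around 60° cannot rapidly toggle the menu, which would otherwise be visually jarring.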


5 Accessibility

XR accessibility is an emerging but critical area. Key considerations:

5.1 Physical Accessibility

  • Seated mode: All interactions must be reachable from a seated position. Test with a chair.
  • One-handed mode: Full functionality with one controller/hand. Map all critical actions to single-hand input.
  • Reduced motion mode: Disable smooth locomotion, limit visual effects.

5.2 Perceptual Accessibility

  • Text size: Minimum 24pt at 1m viewing distance (approximately 2.4° visual angle)
  • Colour contrast: WCAG AA minimum (4.5:1 ratio). Never use colour as the only differentiator.
  • Audio captions: Provide text alternatives for all audio cues.
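Visual-angle targets like the one above can be checked with the standard formula theta = 2·arctan(h / 2d), where h is the rendered physical height and d the viewing distance; as a reference point, a 2.4° target at 1m corresponds to roughly 42mm of physical height.

```python
import math

def visual_angle_deg(height_m, distance_m):
    """Visual angle subtended by an object of physical height height_m
    viewed from distance_m: theta = 2 * atan(h / (2 * d))."""
    return math.degrees(2.0 * math.atan(height_m / (2.0 * distance_m)))
```

Because the angle depends on rendered physical height, text that meets the target at 1m falls below it if the panel is pushed further away without rescaling.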

5.3 Cognitive Accessibility

  • Consistent interaction patterns: Don’t change how things work between scenes
  • Undo support: Allow reversal of actions
  • Time limits: Avoid timed interactions; if unavoidable, allow extension

6 Self-Check Questions

  1. What is the vestibular conflict and why does it cause simulator sickness?
  2. Why is teleportation more comfortable than smooth locomotion?
  3. What is the difference between diegetic and non-diegetic UI?
  4. Describe the wrist menu pattern and explain why it works well in VR.
  5. Name three accessibility considerations for XR interaction design.

7 References

  • Jerald, J. (2015) The VR Book: Human-Centered Design for Virtual Reality. ACM Books.
  • Bowman, D.A. et al. (2004) 3D User Interfaces: Theory and Practice. Pearson.
  • Oculus Design Guidelines: developer.oculus.com/design
  • Microsoft Mixed Reality Design: learn.microsoft.com/windows/mixed-reality/design
  • XR Access Initiative: xraccess.org
  • Kemeny, A. et al. (2020) “Getting over motion sickness.” Communications of the ACM, 63(5), 91–97.