Week 4: Input Systems & Interaction Fundamentals

Virtual, Augmented and Spatial Computing

Week 4 Overview

  • Input modalities in XR
  • Controller-based interaction
  • Hand tracking fundamentals
  • Gaze and eye tracking
  • Designing for input constraints

The Input Problem in XR

“In XR, the interface is the world — input must feel natural, not learned.”

  • Desktop: keyboard + mouse (2D, precise)
  • Mobile: touch (2D, direct)
  • XR: 3D, embodied, spatial — no single dominant paradigm

Input Modalities

| Modality      | Examples           | Strengths          | Weaknesses          |
|---------------|--------------------|--------------------|---------------------|
| Controller    | Quest 2, Vive Pro  | Precise, haptic    | Breaks immersion    |
| Hand tracking | Quest 3, HoloLens 2| Natural, no device | Fatigue, occlusion  |
| Gaze          | Pico Neo Eye       | Hands-free, fast   | Privacy, Midas Touch|
| Voice         | All platforms      | Accessible         | Noisy environments  |
| Body/pose     | Full-body tracking | Expressive         | Setup complexity    |

Controller Interaction

  • Ray casting — pointing and selecting at distance
  • Direct grab — near-field object manipulation
  • Haptic feedback — vibration as confirmation signal
  • Button mapping — trigger, grip, thumbstick, face buttons
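At its core, ray casting is a ray-geometry intersection test. A minimal Python sketch for spherical targets, with the vector maths written out for clarity — an engine would use its built-in physics raycast instead:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the first hit, or None.

    origin, direction, and center are (x, y, z) tuples; direction is unit length.
    """
    # Vector from ray origin to sphere centre
    oc = tuple(c - o for o, c in zip(origin, center))
    # Project oc onto the ray direction to find the closest approach
    t_closest = sum(a * b for a, b in zip(oc, direction))
    if t_closest < 0:
        return None  # target is behind the controller
    # Squared distance from the sphere centre to the ray
    d2 = sum(a * a for a in oc) - t_closest * t_closest
    if d2 > radius * radius:
        return None  # ray misses the target
    return t_closest - math.sqrt(radius * radius - d2)

# A 10 cm target 3 m straight ahead is hit by a forward-pointing ray
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 3), 0.1))  # ~2.9 m
```

The returned distance matters in practice: it is where the selection cursor is drawn and where a grabbed object is attached.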

Unity XR Interaction Toolkit

  • XRRayInteractor — ray-based selection
  • XRDirectInteractor — near-field grab
  • XRGrabInteractable — objects that can be picked up
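The toolkit's division of labour — interactors drive selection, interactables respond to it — can be sketched language-agnostically. A minimal Python stand-in (class and method names echo the toolkit's concepts but are illustrative, not its actual C# API):

```python
class Interactable:
    """Stand-in for an XRGrabInteractable-style object."""
    def __init__(self, name):
        self.name = name
        self.held_by = None

    def on_select_entered(self, interactor):
        self.held_by = interactor  # e.g. attach the object to the hand

    def on_select_exited(self, interactor):
        self.held_by = None        # e.g. release with current velocity


class RayInteractor:
    """Stand-in for an XRRayInteractor: selects whatever its ray hits."""
    def __init__(self, name):
        self.name = name

    def select(self, interactable):
        interactable.on_select_entered(self)

    def release(self, interactable):
        interactable.on_select_exited(self)


cube = Interactable("cube")
right_hand = RayInteractor("right-hand ray")
right_hand.select(cube)
print(cube.held_by.name)  # right-hand ray
```

The point of the split is reuse: the same interactable works unchanged whether it is selected by a ray at distance or grabbed directly near-field.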

Hand Tracking

  • Skeletal model — 26 joints tracked per hand
  • Pinch gestures — index + thumb = primary select
  • Pose recognition — custom gesture detection
  • Limitations: occlusion, lighting, fatigue
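Pinch detection typically reduces to thresholding the thumb-tip-to-index-tip distance from the skeletal model. A Python sketch with hysteresis so the gesture does not flicker near the threshold (the 2 cm / 4 cm values are illustrative, not platform constants):

```python
import math

PINCH_ON = 0.02   # metres: fingertips closer than this begin a pinch
PINCH_OFF = 0.04  # metres: release threshold (hysteresis avoids flicker)

def update_pinch(thumb_tip, index_tip, was_pinching):
    """Return the new pinch state from two fingertip positions (x, y, z)."""
    d = math.dist(thumb_tip, index_tip)
    if was_pinching:
        return d < PINCH_OFF  # stay pinched until fingers clearly separate
    return d < PINCH_ON
```

Hysteresis matters here because tracked joint positions are noisy — with a single threshold, a fingertip jittering around 2 cm would register as a rapid series of pinches and releases.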

Platforms

  • Quest 2/3: built-in hand tracking (no controller required)
  • HoloLens 2: hand tracking is the primary input modality
  • Snap Spectacles: limited gesture support

Gaze & Eye Tracking

  • Gaze dwell — look at target for N milliseconds to activate
  • Gaze + gesture — look to target, pinch to confirm
  • Foveated rendering — render high detail only where user looks
  • Midas Touch problem — without a separate confirmation step, everything you look at risks activating
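Gaze dwell can be sketched as a per-frame timer that resets whenever the gaze leaves the target — that reset is what keeps casual glances from triggering the Midas Touch problem. A minimal Python sketch (the 800 ms default is illustrative):

```python
class DwellSelector:
    """Activate a target only after gaze rests on it for `dwell_ms`."""
    def __init__(self, dwell_ms=800):
        self.dwell_ms = dwell_ms
        self.target = None
        self.elapsed = 0.0

    def update(self, gazed_target, dt_ms):
        """Call once per frame; returns the target if it just activated."""
        if gazed_target != self.target:
            # Gaze moved to a new target (or to nothing): restart the timer
            self.target = gazed_target
            self.elapsed = 0.0
            return None
        if self.target is None:
            return None
        self.elapsed += dt_ms
        if self.elapsed >= self.dwell_ms:
            self.elapsed = 0.0  # require a fresh dwell before re-activating
            return self.target
        return None
```

Gaze + gesture replaces this timer with an explicit pinch-to-confirm, trading the dwell delay for one small hand action.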

Pico Neo Eye

  • Eye tracking for both interaction and analytics
  • Useful for attention heatmaps in research/evaluation

Designing for Input Constraints

Gorilla Arm Effect

  • Holding arms up for extended periods causes fatigue
  • Design for low-effort, short-duration interactions

Fitts’ Law in 3D

  • Target size and distance still matter
  • Minimum interactive target: ~2 cm at arm's length
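Fitts' Law makes the size/distance trade-off concrete. Using the Shannon formulation of the index of difficulty, ID = log2(D/W + 1), a Python sketch (the a and b constants are purely illustrative — in practice they are fitted from user data):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.2, b=0.1):
    """Predicted selection time in seconds.

    a and b are illustrative regression constants, not measured values.
    """
    return a + b * index_of_difficulty(distance, width)

# Doubling target width at the same reach lowers ID, hence predicted time
print(index_of_difficulty(0.60, 0.02))  # 2 cm target at 60 cm
print(index_of_difficulty(0.60, 0.04))  # same reach, twice the width
```

The same relationship is why distant ray-cast targets need to be physically larger than near-field ones: what the hand "sees" is the angular size, and angular size shrinks with distance.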

Feedback Triad

  1. Visual — highlight, glow, cursor
  2. Audio — click, chime, spatial sound
  3. Haptic — vibration pulse (controllers only)

Interaction States

Every interactive object should have clear states:

Default → Hover → Selected → Activated → Released
  • Default: object at rest
  • Hover: user is near/pointing at object
  • Selected: user has grabbed/chosen object
  • Activated: action is executing
  • Released: object returns to default or new state
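The state flow above can be enforced with a small transition table, so an object can never jump straight from Default to Activated and its feedback always matches its visible state. A minimal Python sketch:

```python
# Allowed transitions for the interaction lifecycle above
TRANSITIONS = {
    "default":   {"hover"},
    "hover":     {"default", "selected"},
    "selected":  {"hover", "activated"},
    "activated": {"released"},
    "released":  {"default"},
}

class InteractionObject:
    def __init__(self):
        self.state = "default"

    def transition(self, new_state):
        """Move to `new_state` if the lifecycle allows it; illegal jumps
        (e.g. default -> activated) are rejected."""
        if new_state not in TRANSITIONS[self.state]:
            return False
        self.state = new_state
        return True
```

Each successful transition is also the natural place to fire the feedback triad — e.g. a highlight on entering Hover, a click plus haptic pulse on entering Selected.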

Lab Preview: Controller & Hand Tracking

This week’s lab:

  • Implement ray-cast selection in Unity
  • Add hand tracking support (Quest 3)
  • Compare interaction feel between modalities
  • Document observations in lab journal

Assessment 2 Reminder

Case Study due: Week 6

  • Choose an XR experience to analyse
  • Apply interaction design frameworks
  • Build a low-fi prototype of an improved interaction
  • Evaluate with at least 2 users

Key Takeaways

  • XR has no single dominant input paradigm — design for the device
  • Controllers offer precision; hands offer naturalness
  • Gaze is powerful but requires careful design to avoid Midas Touch
  • Every interaction needs visual, audio, and (where possible) haptic feedback
  • Design for fatigue — arms down, short interactions

References

  • LaViola, J.J., Kruijff, E., McMahan, R.P., Bowman, D.A. and Poupyrev, I. (2017) 3D User Interfaces: Theory and Practice, 2nd ed.
  • Bowman, D.A., Kruijff, E., LaViola, J.J. and Poupyrev, I. (2004) 3D User Interfaces: Theory and Practice
  • Fitts, P.M. (1954) ‘The information capacity of the human motor system in controlling the amplitude of movement’, Journal of Experimental Psychology
  • Unity XR Interaction Toolkit documentation: docs.unity3d.com