Week 4: Input Systems & Interaction Fundamentals
Virtual, Augmented and Spatial Computing
Week 4 Overview
- Input modalities in XR
- Controller-based interaction
- Hand tracking fundamentals
- Gaze and eye tracking
- Designing for input constraints
Controller Interaction
- Ray casting — pointing and selecting at distance
- Direct grab — near-field object manipulation
- Haptic feedback — vibration as confirmation signal
- Button mapping — trigger, grip, thumbstick, face buttons
Unity XR Interaction Toolkit components:
XRRayInteractor — ray-based selection
XRDirectInteractor — near-field grab
XRGrabInteractable — objects that can be picked up
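The core of ray-based selection is a ray-primitive intersection test: cast a ray from the controller and pick the nearest hit object. A device-agnostic sketch (not Unity API code; targets are modelled as spheres for simplicity):

```python
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    name: str
    center: tuple   # (x, y, z) world position
    radius: float

def ray_pick(origin, direction, targets):
    """Return the nearest sphere hit by the ray, or None.

    `direction` must be normalized; uses the standard
    ray-sphere intersection test.
    """
    best_hit, best_t = None, math.inf
    for s in targets:
        # Vector from ray origin to sphere centre
        oc = tuple(c - o for c, o in zip(s.center, origin))
        proj = sum(a * b for a, b in zip(oc, direction))  # closest approach along ray
        if proj < 0:
            continue  # sphere is behind the controller
        # Squared distance from sphere centre to the ray
        d2 = sum(a * a for a in oc) - proj * proj
        if d2 > s.radius ** 2:
            continue  # ray misses the sphere
        t = proj - math.sqrt(s.radius ** 2 - d2)  # distance to entry point
        if t < best_t:
            best_hit, best_t = s, t
    return best_hit
```

In Unity, XRRayInteractor performs this test for you via physics raycasts; the sketch shows the underlying geometry.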
Hand Tracking
- Skeletal model — 26 joints tracked per hand
- Pinch gestures — index + thumb = primary select
- Pose recognition — custom gesture detection
- Limitations: occlusion, lighting, fatigue
- Quest 2/3: built-in hand tracking (no controller required)
- HoloLens 2: primary input modality
- Snap Spectacles: limited gesture support
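Pinch detection typically reduces to the distance between the tracked thumb-tip and index-tip joints, with separate engage/release thresholds so the gesture doesn't flicker. A minimal sketch (threshold values are assumptions, not platform constants):

```python
import math

PINCH_ON = 0.02   # metres; pinch engages below this fingertip gap (assumed)
PINCH_OFF = 0.04  # metres; pinch releases above this (hysteresis)

class PinchDetector:
    """Detects a pinch from thumb-tip and index-tip positions.

    Hysteresis (separate on/off thresholds) prevents the gesture
    from flickering when the gap hovers near a single threshold.
    """
    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        gap = math.dist(thumb_tip, index_tip)
        if self.pinching:
            if gap > PINCH_OFF:
                self.pinching = False
        elif gap < PINCH_ON:
            self.pinching = True
        return self.pinching
```

Platform SDKs (e.g. Meta's hand tracking API) expose pinch strength directly, but the thresholding logic above is what sits underneath.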
Gaze & Eye Tracking
- Gaze dwell — look at target for N milliseconds to activate
- Gaze + gesture — look to target, pinch to confirm
- Foveated rendering — render high detail only where user looks
- Midas Touch Problem — everything you look at activates
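Dwell selection is the standard mitigation for the Midas Touch problem: a target only activates after the gaze rests on it for a sustained interval. A per-frame sketch (dwell duration is a tunable assumption, often in the 300–1000 ms range):

```python
class DwellSelector:
    """Activates a target after gaze has rested on it for dwell_ms."""

    def __init__(self, dwell_ms=600):
        self.dwell_ms = dwell_ms
        self.current = None
        self.elapsed = 0.0

    def update(self, gazed_target, dt_ms):
        """Call once per frame; returns the target to activate, or None."""
        if gazed_target != self.current:
            self.current = gazed_target   # gaze moved: restart the timer
            self.elapsed = 0.0
            return None
        if gazed_target is None:
            return None
        self.elapsed += dt_ms
        if self.elapsed >= self.dwell_ms:
            self.elapsed = 0.0            # fire once, then require a fresh dwell
            return gazed_target
        return None
```

Note how any glance away resets the timer, so incidental looking never triggers an action.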
Pico Neo Eye
- Eye tracking for both interaction and analytics
- Useful for attention heatmaps in research/evaluation
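An attention heatmap is just gaze samples binned into a grid over time. A sketch assuming normalized (0–1) gaze coordinates and an arbitrary grid size (real eye trackers report gaze in their own coordinate frame):

```python
from collections import Counter

class GazeHeatmap:
    """Accumulates normalized gaze samples into a coarse grid."""

    def __init__(self, cols=8, rows=8):
        self.cols, self.rows = cols, rows
        self.counts = Counter()

    def add_sample(self, x, y):
        # Clamp into the grid so edge samples land in the last cell
        col = min(int(x * self.cols), self.cols - 1)
        row = min(int(y * self.rows), self.rows - 1)
        self.counts[(col, row)] += 1

    def hottest_cell(self):
        """Grid cell with the most gaze samples, or None if empty."""
        return self.counts.most_common(1)[0][0] if self.counts else None
```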
Interaction States
Every interactive object should have clear states:
Default → Hover → Selected → Activated → Released
- Default: object at rest
- Hover: user is near/pointing at object
- Selected: user has grabbed/chosen object
- Activated: action is executing
- Released: object returns to default or new state
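The lifecycle above can be sketched as a small state machine with an explicit transition table. Event names here are illustrative, and real toolkits add cancel/abort paths:

```python
# Allowed transitions for the interaction lifecycle
TRANSITIONS = {
    "Default":   {"hover": "Hover"},
    "Hover":     {"unhover": "Default", "select": "Selected"},
    "Selected":  {"activate": "Activated", "release": "Released"},
    "Activated": {"release": "Released"},
    "Released":  {"reset": "Default"},
}

class InteractableState:
    def __init__(self):
        self.state = "Default"

    def fire(self, event):
        """Apply an event; illegal events leave the state unchanged."""
        self.state = TRANSITIONS[self.state].get(event, self.state)
        return self.state
```

Making illegal transitions no-ops (rather than errors) keeps the object robust when input events arrive out of order, which is common with noisy tracking.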
Lab Preview: Controller & Hand Tracking
This week’s lab:
- Implement ray-cast selection in Unity
- Add hand tracking support (Quest 3)
- Compare interaction feel between modalities
- Document observations in lab journal
Assessment 2 Reminder
Case Study due: Week 6
- Choose an XR experience to analyse
- Apply interaction design frameworks
- Build a low-fi prototype of an improved interaction
- Evaluate with at least 2 users
Key Takeaways
- XR has no single dominant input paradigm — design for the device
- Controllers offer precision; hands offer naturalness
- Gaze is powerful but requires careful design to avoid Midas Touch
- Every interaction needs visual, audio, and (where possible) haptic feedback
- Design for fatigue — arms down, short interactions
References
- LaViola et al. (2017) 3D User Interfaces: Theory and Practice, 2nd ed.
- Bowman et al. (2004) 3D User Interfaces: Theory and Practice, 1st ed.
- Fitts, P.M. (1954) — original Fitts’ Law paper
- Unity XR Interaction Toolkit Docs: docs.unity3d.com