Week 4 Lab: Controller & Hand Tracking Interaction

Virtual, Augmented and Spatial Computing

1 Lab Overview

Duration: 2 hours
Hardware: Quest 2, Quest 3, HoloLens 2
Software: Unity 2022 LTS, XR Interaction Toolkit 2.x
Deliverable: Lab journal entry + Unity scene (uploaded to VLE)


2 Learning Objectives

By the end of this lab you will be able to:

  • Implement ray-cast and direct interaction in Unity using XR Interaction Toolkit
  • Enable hand tracking on Quest 3
  • Compare the interaction feel of controller input versus hand tracking
  • Document interaction design observations systematically

3 Part 1: Ray-Cast Interaction (45 min)

3.1 Setup

  1. Open your Unity project from Week 1 (or create a new URP project)
  2. Ensure XR Interaction Toolkit is installed via Package Manager
  3. Add the XR Origin (Action-based) prefab to your scene

3.2 Task 1.1 — Add Interactable Objects

  1. Create 3 primitive objects (cube, sphere, cylinder) in the scene
  2. Add XRGrabInteractable component to each
  3. Add Rigidbody component to each (enable gravity)
  4. Position the objects at different distances from the origin: 0.5 m, 1.5 m and 3 m (a scripted alternative to steps 2-4 is sketched after this list)
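If you prefer to script this setup rather than add the components by hand, a minimal sketch is below. It assumes XR Interaction Toolkit 2.x (the component names come from that package); MakeGrabbable is an illustrative name for this lab, not a toolkit class. Attach it to each primitive.

    // Minimal sketch: makes a primitive grabbable at runtime.
    // Assumes XR Interaction Toolkit 2.x.
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class MakeGrabbable : MonoBehaviour
    {
        void Awake()
        {
            // Step 3: Rigidbody with gravity so released objects fall
            var rb = gameObject.AddComponent<Rigidbody>();
            rb.useGravity = true;

            // Step 2: XRGrabInteractable lets interactors select and move it
            gameObject.AddComponent<XRGrabInteractable>();
        }
    }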

3.3 Task 1.2 — Configure Ray Interactor

  1. Select the Right Hand controller object in the XR Origin hierarchy
  2. Add XRRayInteractor component
  3. Add XRInteractorLineVisual component (shows the ray)
  4. Set Max Raycast Distance to 5 m (this can also be set from code, as sketched below)
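The same configuration can be applied from code. The sketch below assumes XRRayInteractor is already on the controller object (step 2); ConfigureRay is an illustrative name.

    // Minimal sketch: sets the ray length from code instead of
    // the Inspector. Assumes XR Interaction Toolkit 2.x.
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class ConfigureRay : MonoBehaviour
    {
        void Start()
        {
            // Step 4: limit ray-cast selection to 5 m
            GetComponent<XRRayInteractor>().maxRaycastDistance = 5f;
        }
    }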

3.4 Task 1.3 — Test & Observe

Build and deploy to Quest 2. Test the following:

Test                       | Observation
---------------------------|------------
Select object at 0.5 m     |
Select object at 1.5 m     |
Select object at 3 m       |
Move object while selected |
Release object             |

Record in your lab journal: Which distance felt most natural? What visual feedback helped most?
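If timestamps would help your journal entries, the sketch below logs grab and release events through the toolkit's selectEntered and selectExited events. SelectionLogger is an illustrative name, not part of the toolkit; attach it to each interactable.

    // Minimal sketch: logs select/release events to help fill in
    // the observation table. Assumes XR Interaction Toolkit 2.x.
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class SelectionLogger : MonoBehaviour
    {
        void Awake()
        {
            var interactable = GetComponent<XRGrabInteractable>();
            interactable.selectEntered.AddListener(
                args => Debug.Log($"{name} selected at t={Time.time:F2}s"));
            interactable.selectExited.AddListener(
                args => Debug.Log($"{name} released at t={Time.time:F2}s"));
        }
    }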


4 Part 2: Direct (Near-Field) Interaction (30 min)

4.1 Task 2.1 — Add Direct Interactor

  1. Add XRDirectInteractor component to the Right Hand controller
  2. Add a small sphere collider to the controller (set as trigger)
  3. Disable the Ray Interactor temporarily (the sketch after this list shows the same setup in code)
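A scripted version of these three steps, assuming XR Interaction Toolkit 2.x; DirectGrabSetup is an illustrative name, and the 0.1 m collider radius is an assumption to tune, not a toolkit default. Attach it to the Right Hand controller object.

    // Minimal sketch: sets up near-field grabbing on the controller.
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class DirectGrabSetup : MonoBehaviour
    {
        void Awake()
        {
            // Step 2: a small trigger sphere defines the grab volume
            var col = gameObject.AddComponent<SphereCollider>();
            col.isTrigger = true;
            col.radius = 0.1f;  // assumed ~10 cm; tune to taste

            // Step 1: the direct interactor grabs whatever overlaps it
            gameObject.AddComponent<XRDirectInteractor>();

            // Step 3: disable the ray interactor while testing direct grab
            var ray = GetComponent<XRRayInteractor>();
            if (ray != null) ray.enabled = false;
        }
    }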

4.2 Task 2.2 — Test Direct Grab

Deploy and test:

  • Walk up to each object and grab it directly
  • Try picking up and placing objects precisely
  • Try stacking objects

Record in your lab journal: How does direct grab compare to ray cast? When would you prefer each?


5 Part 3: Hand Tracking on Quest 3 (30 min)

Note: This part requires a Quest 3 headset. If unavailable, use the Quest Hand Tracking Simulator in Unity.

5.1 Task 3.1 — Enable Hand Tracking

  1. In Unity, go to Edit → Project Settings → XR Plug-in Management
  2. Ensure OpenXR is selected for Android
  3. Add Hand Tracking feature under OpenXR features
  4. In the XR Origin, enable the Hand Tracking prefab (a runtime check that the subsystem is active is sketched below)
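To confirm hand tracking is actually running on the headset, the sketch below queries the hand subsystem at startup. It assumes the XR Hands package (com.unity.xr.hands) is installed alongside the OpenXR feature; HandTrackingCheck is an illustrative name.

    // Minimal sketch: reports whether a hand-tracking subsystem is
    // running. Assumes the XR Hands package (com.unity.xr.hands).
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    public class HandTrackingCheck : MonoBehaviour
    {
        void Start()
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            Debug.Log(subsystems.Count > 0
                ? "Hand-tracking subsystem found"
                : "No hand-tracking subsystem; check OpenXR features");
        }
    }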

5.2 Task 3.2 — Pinch Interaction

  1. Replace the controller interactors with the XRHandInteractorsSetup prefab
  2. Configure the pinch threshold (default: 0.7)
  3. Deploy to Quest 3 (a manual pinch check you can use for debugging is sketched after this list)
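For debugging, you can also detect a pinch manually by measuring the thumb-tip to index-tip distance. The sketch below assumes the XR Hands package; the 0.02 m distance threshold is an illustrative assumption and is not the same quantity as the toolkit's 0.7 normalised pinch strength above.

    // Minimal sketch: detects a right-hand pinch from joint poses.
    // Assumes the XR Hands package; 0.02 m is an assumed threshold.
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    public class PinchCheck : MonoBehaviour
    {
        XRHandSubsystem hands;

        void Start()
        {
            var list = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(list);
            if (list.Count > 0) hands = list[0];
        }

        void Update()
        {
            if (hands == null || !hands.rightHand.isTracked) return;

            var thumb = hands.rightHand.GetJoint(XRHandJointID.ThumbTip);
            var index = hands.rightHand.GetJoint(XRHandJointID.IndexTip);

            // Pinch when fingertips are closer than the assumed threshold
            if (thumb.TryGetPose(out Pose t) && index.TryGetPose(out Pose i)
                && Vector3.Distance(t.position, i.position) < 0.02f)
            {
                Debug.Log("Pinch detected");
            }
        }
    }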

Test the following:

Gesture                   | Works? | Notes
--------------------------|--------|------
Pinch to select (near)    |        |
Pinch to select (far ray) |        |
Two-hand grab             |        |
Sustained pinch hold      |        |

6 Part 4: Comparative Reflection (15 min)

Complete the following comparison table in your lab journal:

Criterion           | Controller (Ray) | Controller (Direct) | Hand Tracking
--------------------|------------------|---------------------|--------------
Ease of learning    |                  |                     |
Precision           |                  |                     |
Fatigue after 5 min |                  |                     |
Immersion feel      |                  |                     |
Best use case       |                  |                     |

6.1 Reflection Questions

  1. Which input method would you choose for a museum exhibit aimed at first-time XR users? Why?
  2. Which would you choose for a surgical training simulation? Why?
  3. What feedback (visual/audio/haptic) made the biggest difference to interaction confidence?

7 Submission

Upload the following to the VLE by the end of the week:

    • Completed observation tables
    • Comparative reflection table
    • Answers to reflection questions
    • At least 2 screenshots from your Unity scene

8 Troubleshooting

Problem                    | Solution
---------------------------|---------
Ray not visible            | Check that XRInteractorLineVisual is on the same GameObject as XRRayInteractor
Objects fall through floor | Add a plane with a collider to the scene
Hand tracking not detected | Ensure good lighting; remove gloves; check that the OpenXR Hand Tracking feature is enabled
Build fails for Android    | Check that Android Build Support is installed via Unity Hub