Week 4 Lab: Controller & Hand Tracking Interaction
Virtual, Augmented and Spatial Computing
1 Lab Overview
Duration: 2 hours
Hardware: Quest 2, Quest 3, HoloLens 2
Software: Unity 2022 LTS, XR Interaction Toolkit 2.x
Deliverable: Lab journal entry + Unity scene (uploaded to VLE)
2 Learning Objectives
By the end of this lab you will be able to:
- Implement ray-cast and direct interaction in Unity using XR Interaction Toolkit
- Enable hand tracking on Quest 3
- Compare the interaction feel of controller vs hand tracking
- Document interaction design observations systematically
3 Part 1: Ray-Cast Interaction (45 min)
3.1 Setup
- Open your Unity project from Week 1 (or create a new URP project)
- Ensure XR Interaction Toolkit is installed via Package Manager
- Add the XR Origin (Action-based) prefab to your scene
3.2 Task 1.1 — Add Interactable Objects
- Create 3 primitive objects (cube, sphere, cylinder) in the scene
- Add an XRGrabInteractable component to each
- Add a Rigidbody component to each (enable gravity)
- Position objects at different distances: 0.5m, 1.5m, 3m from the origin
3.3 Task 1.2 — Configure Ray Interactor
- Select the Right Hand controller object in the XR Origin hierarchy
- Add an XRRayInteractor component
- Add an XRInteractorLineVisual component (shows the ray)
- Set Max Raycast Distance to 5m
3.4 Task 1.3 — Test & Observe
Build and deploy to Quest 2. Test the following:
| Test | Observation |
|---|---|
| Select object at 0.5m | |
| Select object at 1.5m | |
| Select object at 3m | |
| Move object while selected | |
| Release object | |
Record in your lab journal: Which distance felt most natural? What visual feedback helped most?
4 Part 2: Direct (Near-Field) Interaction (30 min)
4.1 Task 2.1 — Add Direct Interactor
- Add an XRDirectInteractor component to the Right Hand controller
- Add a small sphere collider to the controller (set as trigger)
- Disable the Ray Interactor temporarily
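The steps above can also be done in a small setup script, again assuming XR Interaction Toolkit 2.x. The 0.1m radius is an illustrative starting value, not a prescribed one:

```csharp
// Sketch: Task 2.1 in code. Attach to the Right Hand controller object
// instead of adding the components by hand.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class DirectInteractorSetup : MonoBehaviour
{
    void Awake()
    {
        // The trigger collider defines the grab volume around the hand
        var grabVolume = gameObject.AddComponent<SphereCollider>();
        grabVolume.isTrigger = true;
        grabVolume.radius = 0.1f; // ~10 cm reach; tune to taste

        gameObject.AddComponent<XRDirectInteractor>();

        // Disable the ray interactor temporarily so the two don't compete
        var ray = GetComponent<XRRayInteractor>();
        if (ray != null) ray.enabled = false;
    }
}
```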
4.2 Task 2.2 — Test Direct Grab
Deploy and test:
- Walk up to each object and grab it directly
- Try picking up and placing objects precisely
- Try stacking objects
Record in your lab journal: How does direct grab compare to ray cast? When would you prefer each?
5 Part 3: Hand Tracking on Quest 3 (30 min)
Note: This part requires a Quest 3 headset. If unavailable, use the Quest Hand Tracking Simulator in Unity.
5.1 Task 3.1 — Enable Hand Tracking
- In Unity, go to Edit → Project Settings → XR Plug-in Management
- Ensure OpenXR is selected for Android
- Add Hand Tracking feature under OpenXR features
- In the XR Origin, enable the Hand Tracking prefab
5.2 Task 3.2 — Pinch Interaction
- Replace the controller interactors with the XRHandInteractorsSetup prefab
- Configure the pinch threshold (default: 0.7)
- Deploy to Quest 3
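To see what the pinch threshold corresponds to physically, you can log the thumb-to-index distance with the XR Hands package (com.unity.xr.hands). This is a sketch that assumes hand tracking is already enabled as in Task 3.1; the script name is illustrative:

```csharp
// Sketch: logging pinch distance via the XR Hands subsystem, to relate
// the 0.7 pinch threshold in Task 3.2 to actual finger separation.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class PinchLogger : MonoBehaviour
{
    XRHandSubsystem hands;

    void Update()
    {
        if (hands == null)
        {
            var list = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(list);
            if (list.Count > 0) hands = list[0];
            else return; // no hand-tracking subsystem running yet
        }

        var right = hands.rightHand;
        if (!right.isTracked) return;

        // Distance between thumb tip and index tip approximates pinch strength
        if (right.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out var thumb) &&
            right.GetJoint(XRHandJointID.IndexTip).TryGetPose(out var index))
        {
            float d = Vector3.Distance(thumb.position, index.position);
            Debug.Log($"Thumb-index distance: {d:F3} m");
        }
    }
}
```

Watch the logged distance while pinching slowly: noting where selection triggers helps you answer the feedback question in the reflection section.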
Test the following:
| Gesture | Works? | Notes |
|---|---|---|
| Pinch to select (near) | | |
| Pinch to select (far ray) | | |
| Two-hand grab | | |
| Sustained pinch hold | | |
6 Part 4: Comparative Reflection (15 min)
Complete the following comparison table in your lab journal:
| Criterion | Controller (Ray) | Controller (Direct) | Hand Tracking |
|---|---|---|---|
| Ease of learning | | | |
| Precision | | | |
| Fatigue after 5 min | | | |
| Immersion feel | | | |
| Best use case | | | |
6.1 Reflection Questions
- Which input method would you choose for a museum exhibit aimed at first-time XR users? Why?
- Which would you choose for a surgical training simulation? Why?
- What feedback (visual/audio/haptic) made the biggest difference to interaction confidence?
7 Submission
Upload to VLE by end of week:
- Your Unity scene
- Completed observation tables
- Comparative reflection table
- Answers to reflection questions
- At least 2 screenshots from your Unity scene
8 Troubleshooting
| Problem | Solution |
|---|---|
| Ray not visible | Check XRInteractorLineVisual is on the same object as XRRayInteractor |
| Objects fall through floor | Add a plane with a collider to the scene |
| Hand tracking not detected | Ensure good lighting; remove gloves; check OpenXR feature is enabled |
| Build fails for Android | Check Android Build Support is installed in Unity Hub |