Week 7 Lab: AR Foundations
AR Foundation Setup and Tap-to-Place Prototype
Lab Overview
In this lab, you will build a simple augmented reality prototype using Unity and AR Foundation. The goal is to understand how a device detects surfaces in the real world and how virtual content can be placed meaningfully into that space.
Learning Objectives
By the end of this lab, you should be able to:
- Set up a Unity project for AR development.
- Install and configure the required AR packages.
- Detect real-world planes using AR Foundation.
- Implement a basic tap-to-place interaction.
- Reflect on how spatial anchoring affects usability and presence.
Required Software and Hardware
Software
- Unity 2022 LTS
- AR Foundation package
- XR Plug-in Management
- OpenXR Plugin
Hardware
- Compatible mobile AR device, or
- Quest 3 with pass-through workflow where applicable, or
- Lab-supported AR hardware as directed
Part 1: Project Setup
Create a New Project
- Open Unity Hub.
- Create a new 3D (URP) project.
- Name the project Week7_AR_Foundations.
Install Packages
Open Window > Package Manager and install:
- XR Plug-in Management
- AR Foundation
- Platform-specific AR support packages as required in the lab (for example, the ARCore XR Plugin for Android or the ARKit XR Plugin for iOS)
Configure XR
In Project Settings:
- Enable the appropriate XR plug-in for your target platform.
- Confirm build support for the target device.
- Check camera permissions if deploying to a mobile device.
Part 2: Plane Detection
AR systems use computer vision and motion tracking to detect flat surfaces such as floors, desks, and walls.
Task
- Add an AR Session to the scene.
- Add an XR Origin (called AR Session Origin in AR Foundation 4.x and earlier).
- Add the following components where appropriate:
  - ARPlaneManager
  - ARRaycastManager
- Enable plane visualisation so that detected surfaces are visible during testing.
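While testing, it can help to confirm that plane detection is actually firing. The sketch below is one possible approach (not a required part of the lab): a small component, here called PlaneDetectionLogger, that subscribes to the ARPlaneManager's planesChanged event and logs each newly detected plane.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional debugging helper: logs each plane as AR Foundation detects it.
// Attach to any GameObject and assign the scene's ARPlaneManager in the Inspector.
public class PlaneDetectionLogger : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args.added contains only planes detected since the last frame.
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, size {plane.size}");
    }
}
```

Watching these logs alongside the plane visualisation makes it easier to judge how quickly different surfaces are picked up.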
Checkpoint
By this stage, you should be able to move the device around a room and see detected surfaces appear.
Part 3: Tap-to-Place Interaction
Now you will allow the user to place a virtual object onto a detected plane.
Task
Implement a script that:
1. Detects a screen tap or selection event.
2. Uses ARRaycastManager to raycast against detected planes.
3. Instantiates a prefab at the hit position.
4. Aligns the object naturally with the surface.
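The steps above can be sketched roughly as follows. This is one possible solution, not the only one; it assumes the legacy Input Manager (Input.GetTouch) rather than the newer Input System, and the class and field names (TapToPlace, placedPrefab) are illustrative.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: place a prefab where a screen tap hits a detected plane.
// Assign raycastManager and placedPrefab in the Inspector.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject placedPrefab;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // 1. Detect a screen tap.
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // 2. Raycast against detected planes only.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // 3 & 4. The hit pose sits on the plane and is oriented with it,
            // so instantiating at that pose aligns the object with the surface.
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Note that hits is ordered by distance, so hits[0] is the nearest plane the ray crossed.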
Suggested Prefab
Use a simple object such as:
- a cube,
- a chair,
- a plant, or
- a branded 3D object relevant to your project idea.
Part 4: Test and Evaluate
Test your prototype in at least two different physical environments.
Questions to Consider
- How quickly does the device detect planes?
- Does object placement feel stable?
- What happens in poor lighting?
- Does the object appear convincingly anchored in space?
Deliverable
Submit the following at the end of the session:
- A screenshot or short capture of your AR prototype in use.
- Your Unity project files as instructed by the lecturer.
- A short reflection (150–200 words) on what worked well and what limitations you noticed.
Extension Task
If you finish early, try one of the following:
- Allow the user to reposition the object after placement.
- Add a second prefab choice.
- Add simple scaling or rotation controls.
- Experiment with light estimation or occlusion if supported.
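For the repositioning extension, one simple approach is to keep a reference to the placed object and move it on each tap instead of instantiating a duplicate. A hedged sketch (field names are illustrative; it assumes the same raycast setup as the tap-to-place task):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: first tap places the object; later taps move it to the new hit pose.
public class TapToReposition : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject placedPrefab;

    private GameObject placedObject;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(Input.GetTouch(0).position, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (placedObject == null)
                placedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                placedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```

Reusing one instance also makes it easier to compare placement stability across surfaces, since you are observing the same object throughout.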
Reflection Prompt
Write briefly about the relationship between technical tracking quality and user experience. For example, how does unstable placement affect presence, trust, or task success in AR?