Week 1: Introduction to Spatial Computing

Virtual, Augmented and Spatial Computing

Welcome

BSc in Immersive Digital Media — Year 3

We are at the beginning of a new era of computing. The interface is no longer a screen. It is the world.

Who is this module for?

You already know how to:

  • build interactive digital experiences
  • work with 3D tools and game engines
  • think about users and design

This module asks you to apply all of that in three-dimensional space.

What we will cover

  • XR hardware and systems
  • Perception and presence
  • Interaction design
  • Environment and content design
  • AR systems
  • Spatial computing
  • Evaluation and critique

How the module runs

Each week:

  • 1-hour lecture
  • 2-hour lab
  • 2-hour practical

Assessment

#  Assessment           Type        Weight  Due
1  Continuous Lab Work  Continuous  30%     Ongoing
2  XR Case Study        Coursework  30%     Week 6
3  Individual Project   Project     40%     Week 12

Technologies we use

  • Unity and XR Interaction Toolkit
  • Lens Studio
  • Meta Quest 2 and Quest 3
  • HTC Vive Pro
  • Pico Neo Eye
  • HoloLens 2
  • Snap Spectacles 2022
  • Display Glasses

What is XR?

XR stands for Extended Reality.

It is an umbrella term covering:

  • Virtual Reality (VR)
  • Augmented Reality (AR)
  • Mixed Reality (MR)
  • Spatial Computing

Virtual Reality

The user is placed inside a fully synthetic environment.

  • no view of the real world
  • complete sensory substitution (visual, audio, haptic)
  • examples: Meta Quest, HTC Vive Pro

Augmented Reality

Digital content is overlaid on the real world.

  • the real world remains visible
  • digital objects appear to exist in physical space
  • examples: HoloLens 2, Snap Spectacles, Pokémon GO

Mixed Reality

A spectrum between fully real and fully virtual.

  • digital and physical objects coexist and interact
  • examples: HoloLens 2, Meta Quest 3 with passthrough

Spatial Computing

A broader paradigm shift in how we interact with computers.

  • computing moves into the environment
  • interfaces are no longer confined to screens
  • examples: Apple Vision Pro, HoloLens 2

The Reality-Virtuality Continuum

Milgram and Kishino (1994) described a spectrum:

Real Environment -> Augmented Reality -> Augmented Virtuality -> Virtual Environment
  (fully real)        (mostly real)        (mostly virtual)        (fully virtual)

Most modern XR devices sit somewhere in the middle.
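The four points of the continuum form an ordered scale, which can be captured in a minimal Python sketch (class and member names are illustrative, not from any XR SDK):

```python
from enum import IntEnum

class RealityVirtuality(IntEnum):
    """Milgram and Kishino's continuum, ordered from fully real to fully virtual."""
    REAL_ENVIRONMENT = 0
    AUGMENTED_REALITY = 1
    AUGMENTED_VIRTUALITY = 2
    VIRTUAL_ENVIRONMENT = 3

# IntEnum makes the ordering comparable, so we can ask where
# one experience sits on the spectrum relative to another:
assert RealityVirtuality.AUGMENTED_REALITY < RealityVirtuality.AUGMENTED_VIRTUALITY
```

Encoding the continuum as an ordered type, rather than four unrelated labels, mirrors the point of the diagram: AR and augmented virtuality are positions on one axis, not separate categories.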

Why does this matter now?

  • hardware has become affordable and accessible
  • processing power has reached mobile XR
  • spatial computing is entering mainstream use
  • industry demand for XR skills is growing rapidly

Key industry moments

  • 2012: Oculus Rift Kickstarter
  • 2016: Consumer VR launches (Rift, Vive, PSVR)
  • 2019: Quest 1 — standalone VR
  • 2021: Meta rebrands around the metaverse
  • 2023: Apple Vision Pro announced
  • 2024: Spatial computing enters enterprise and consumer markets

Hardware in this module

Device           Type             Primary Use
Quest 2          VR               Primary student HMD
Quest 3          VR/MR            Mixed reality and passthrough
Vive Pro         VR               High-fidelity comparison
Pico Neo Eye     VR               Eye tracking
HoloLens 2       MR               Spatial computing
Snap Spectacles  AR               Lightweight AR
Display Glasses  Passive display  Comparison

6DOF vs 3DOF

3DOF: tracks rotation only (pitch, yaw, roll)

6DOF: tracks rotation AND position (x, y, z)

6DOF allows the user to physically move through space.

All primary devices in this module are 6DOF.
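The distinction can be made concrete with a small Python sketch (class and field names are illustrative, not any headset SDK's API): a 3DOF pose carries orientation only, while a 6DOF pose adds a position that changes as the user physically moves.

```python
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    """Rotation-only head pose: orientation as (pitch, yaw, roll) in degrees."""
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

@dataclass
class Pose6DOF(Pose3DOF):
    """Adds positional tracking (x, y, z in metres): physical movement is reflected."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

# A 3DOF headset can only report that the user turned their head;
# a 6DOF headset also reports that they stepped 1.5 m forward.
seated    = Pose3DOF(yaw=90.0)
roomscale = Pose6DOF(yaw=90.0, z=1.5)
```

Note that `Pose3DOF` simply has no position fields: on rotation-only hardware, walking across the room produces no change in the reported pose at all.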

Inside-out vs outside-in tracking

Inside-out: cameras on the headset track the environment

Outside-in: external sensors track the headset

Quest 2 and Quest 3 use inside-out tracking. Vive Pro uses outside-in (base stations).

This week in the lab

  • Set up Unity XR project
  • Install XR Interaction Toolkit
  • Deploy a basic scene to Quest 2
  • Test locomotion and object interaction

This week in the practical

Build a minimal XR scene:

  • a simple environment
  • basic locomotion
  • one interactive object

Key questions to carry through the module

  • What makes an XR experience feel real?
  • What makes an interaction feel natural?
  • How do we design for a body in space?
  • What are the ethical responsibilities of XR designers?

Further reading

  • Jerald, J. (2015) The VR Book. ACM Books.
  • LaViola et al. (2017) 3D User Interfaces (2nd ed.)
  • Milgram, P. and Kishino, F. (1994) A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems.

Summary

  • XR covers VR, AR, MR, and spatial computing
  • The reality-virtuality continuum describes the spectrum
  • Hardware has matured rapidly
  • This module combines theory, critique, and practice
  • Assessment is continuous, case study, and project

Next week: XR Hardware and Systems