Week 1 Lecture Notes: Introduction to Spatial Computing

1 Lecture

2 Introduction

This module sits at the intersection of technology, design, and human experience. By the time you finish it, you should be able to think critically about immersive systems, build functional XR prototypes, and evaluate them against established principles of human-computer interaction.

This first week establishes the conceptual landscape. Before we can design or build anything, we need a shared vocabulary and a clear understanding of what XR actually is, where it came from, and where it is going.


3 Defining the XR Spectrum

The term XR (Extended Reality) is an umbrella that covers several related but distinct technologies. Understanding the differences between them is essential because each imposes different design constraints and affords different possibilities.

3.1 Virtual Reality (VR)

In a VR system, the user’s visual field is entirely replaced by a synthetic environment. The headset blocks out the physical world and presents a stereoscopic 3D image that responds to the user’s head movement. High-quality VR systems also include spatial audio, hand controllers with haptic feedback, and room-scale tracking.

The defining characteristic of VR is sensory substitution — the system replaces real sensory input with synthetic input. The quality of this substitution determines how convincing the experience feels.

3.2 Augmented Reality (AR)

AR systems overlay digital content onto the user’s view of the real world. This can be achieved through:

  • optical see-through displays (the user looks through a transparent lens with digital content projected onto it — HoloLens 2, Snap Spectacles)
  • video see-through displays (cameras capture the real world and the combined image is displayed — Quest 3 passthrough, mobile AR)
  • projection-based AR (digital content is projected directly onto real surfaces)

The defining characteristic of AR is that the real world remains visible and the digital content must be registered accurately to physical space.

3.3 Mixed Reality (MR)

Mixed reality is a term used in different ways by different organisations. In its most precise sense (following Milgram and Kishino’s taxonomy), MR refers to any point on the reality-virtuality continuum between fully real and fully virtual. In practice, it is often used to describe systems where digital and physical objects interact — for example, a virtual ball that bounces off a real table.

The HoloLens 2 and Meta Quest 3 are both marketed as mixed reality devices.

3.4 Spatial Computing

Spatial computing is a broader paradigm that describes computing systems that understand and interact with three-dimensional space. Rather than confining computation to a screen, spatial computing embeds it in the environment. This includes XR devices but also extends to robotics, autonomous vehicles, and smart environments.

Apple’s Vision Pro is positioned explicitly as a spatial computing device rather than a VR or AR headset, reflecting a deliberate shift in how the industry frames these technologies.


4 The Reality-Virtuality Continuum

Milgram and Kishino (1994) proposed a taxonomy of mixed reality visual displays that remains influential today. They described a continuum from fully real environments to fully virtual environments, with augmented reality and augmented virtuality occupying the middle ground.

Real Environment → Augmented Reality → Augmented Virtuality → Virtual Environment

  • Real Environment: fully real
  • Augmented Reality: mostly real, some virtual
  • Augmented Virtuality: mostly virtual, some real
  • Virtual Environment: fully virtual

This framework is useful because it reminds us that the distinction between VR and AR is not binary. Many modern devices can operate at different points on this continuum depending on their mode of use.


5 A Brief History of XR

Understanding the history of XR helps contextualise the current moment and avoid repeating past mistakes.

5.1 Early Foundations (1960s–1990s)

  • 1965: Ivan Sutherland describes the “Ultimate Display” — a room in which the computer can control the existence of matter
  • 1968: Sutherland builds the first HMD, the Sword of Damocles
  • 1987: Jaron Lanier coins the term “virtual reality”
  • 1992: The CAVE (Cave Automatic Virtual Environment) is developed at the University of Illinois
  • 1994: Milgram and Kishino publish their taxonomy of mixed reality

5.2 The First Consumer Wave (2012–2016)

  • 2012: Oculus Rift Kickstarter campaign raises $2.4 million
  • 2014: Facebook acquires Oculus for $2 billion
  • 2016: Consumer launches of Oculus Rift, HTC Vive, and PlayStation VR
  • 2016: Pokémon GO demonstrates mass-market AR

5.3 Standalone and Spatial Computing (2019–present)

  • 2019: Oculus Quest (later renamed Meta Quest) launches — standalone VR without a PC
  • 2021: Meta rebrands from Facebook, announces metaverse strategy
  • 2022: Meta Quest Pro introduces full-colour passthrough for mixed reality
  • 2023: Apple announces Vision Pro at WWDC
  • 2024: Spatial computing enters enterprise and consumer markets at scale

6 Degrees of Freedom

A fundamental concept in XR hardware is degrees of freedom (DOF), which describes how many independent axes of movement a tracking system can detect.

6.1 3DOF

A 3DOF system tracks rotation only:

  • pitch (nodding)
  • yaw (shaking head)
  • roll (tilting head)

3DOF systems cannot detect the user moving through space. They are suitable for 360-degree video viewing but not for interactive VR.

6.2 6DOF

A 6DOF system tracks both rotation and position:

  • pitch, yaw, roll (rotation)
  • x, y, z (position in space)

6DOF allows the user to physically move through the virtual environment. All primary devices in this module support 6DOF tracking.
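To make the distinction concrete, here is a minimal Python sketch (purely illustrative, not tied to any headset SDK) of the data a 3DOF tracker reports compared with a 6DOF tracker:

    from dataclasses import dataclass

    @dataclass
    class Pose3DOF:
        # Rotation only: the system knows which way the head is facing,
        # but not where it is in the room.
        pitch: float  # nodding up/down, in degrees
        yaw: float    # turning left/right, in degrees
        roll: float   # tilting side to side, in degrees

    @dataclass
    class Pose6DOF(Pose3DOF):
        # Adds position: physically walking or leaning now moves the
        # viewpoint through the virtual environment.
        x: float  # metres, left/right
        y: float  # metres, up/down
        z: float  # metres, forward/back

A 360-degree video player only needs the three rotation values; an interactive room-scale experience needs the extra three position values as well.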


7 Tracking Technologies

7.1 Inside-Out Tracking

Cameras mounted on the headset observe the environment and use computer vision to determine the headset’s position and orientation. This approach requires no external hardware and is used by the Meta Quest 2, Quest 3, and HoloLens 2.

Advantages:

  • no external setup required
  • portable and flexible

Limitations:

  • can struggle in low-light or featureless environments
  • tracking volume limited by camera field of view

7.2 Outside-In Tracking

External sensors (base stations or cameras) track the headset and controllers. The HTC Vive Pro uses this approach with SteamVR base stations.

Advantages:

  • highly accurate and reliable
  • large tracking volume possible

Limitations:

  • requires setup and calibration
  • not portable
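Both approaches answer the same question (where is the headset, and which way is it facing?); they differ only in where the sensing happens. The short Python sketch below is purely illustrative (the class and method names are invented, not from any real SDK) and makes that shared contract explicit:

    from typing import Protocol, Tuple

    Pose = Tuple[float, float, float, float, float, float]  # x, y, z, pitch, yaw, roll

    class HeadsetTracker(Protocol):
        # Whatever the sensing approach, the renderer just needs a
        # fresh 6DOF pose every frame.
        def read_pose(self) -> Pose: ...

    class InsideOutTracker:
        """Headset-mounted cameras observe the room and computer vision
        estimates the pose: no external hardware, but it needs adequate
        light and visual features to lock onto."""
        def read_pose(self) -> Pose:
            raise NotImplementedError  # would run a visual-inertial tracking pipeline

    class OutsideInTracker:
        """External base stations observe the headset: accurate and
        reliable, with a large possible tracking volume, but it needs
        setup and calibration and is not portable."""
        def read_pose(self) -> Pose:
            raise NotImplementedError  # would combine base-station measurements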


8 The Hardware in This Module

You will have access to a range of devices throughout this module. Each offers different capabilities and design constraints.

8.1 Meta Quest 2

The primary student HMD. Standalone, 6DOF, inside-out tracking, touch controllers with haptics. Widely used in industry and education. The baseline platform for most lab and practical work.

8.2 Meta Quest 3

Adds full-colour passthrough for mixed reality, improved processing, and slimmer form factor. Useful for exploring the boundary between VR and MR.

8.3 HTC Vive Pro

A PC-tethered headset with outside-in tracking via SteamVR base stations. Higher visual fidelity than the Quest 2. Useful for understanding the trade-offs between standalone and tethered systems.

8.4 Pico Neo Eye

Includes integrated eye tracking hardware. Enables gaze-based interaction and foveated rendering. Used in Week 5 for eye tracking exploration.

8.5 HoloLens 2

Microsoft’s mixed reality headset. Optical see-through display, hand tracking, gaze tracking, voice input. The primary platform for spatial computing work in Week 10.

8.6 Snap Spectacles 2022

Lightweight AR glasses with a small display. Used in Week 9 alongside Lens Studio for AR lens development.

8.7 Display Glasses

Passive display glasses for comparison and demonstration purposes.


9 Why XR Design is Different

Designing for XR requires a fundamentally different mindset from designing for screens.

9.1 The body is the interface

In XR, the user’s physical body — their hands, head, gaze, voice, and movement — becomes the primary input device. Design must account for physical ergonomics, fatigue, and the limits of human movement.

9.2 Space is the canvas

Rather than arranging elements on a flat surface, XR designers work in three-dimensional space. Objects have position, scale, and orientation. The user can move around them, look at them from different angles, and interact with them physically.
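One way to see the shift is in the basic unit a designer manipulates. On a screen it is a 2D rectangle; in XR it is something like the transform sketched below (illustrative Python with invented names, not any particular engine's API):

    from dataclasses import dataclass

    @dataclass
    class Transform:
        # The minimal description of an object placed in space rather than
        # on a screen: where it is, how it is turned, and how big it is.
        position: tuple[float, float, float] = (0.0, 0.0, 0.0)  # metres in world space
        rotation: tuple[float, float, float] = (0.0, 0.0, 0.0)  # pitch, yaw, roll in degrees
        scale: tuple[float, float, float] = (1.0, 1.0, 1.0)

    # e.g. a virtual label two metres in front of the user at roughly
    # eye height, tilted slightly towards them
    label = Transform(position=(0.0, 1.6, -2.0), rotation=(-10.0, 0.0, 0.0))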

9.3 Presence changes behaviour

Research has shown that people behave differently in immersive environments than they do in front of screens. They respond emotionally to virtual stimuli, feel genuine discomfort in threatening virtual situations, and can develop real skills through virtual practice. This makes XR a powerful tool — and a significant ethical responsibility.

9.4 Comfort is a design constraint

Unlike screen-based interfaces, XR can cause physical discomfort — motion sickness, eye strain, neck fatigue, and disorientation. Good XR design treats comfort as a first-class design requirement, not an afterthought.


10 The Module in Context

This is a third-year module. You are expected to bring critical thinking, not just technical skill. Throughout the module, you will be asked to:

  • analyse existing XR experiences, not just build new ones
  • justify your design decisions with reference to theory and evidence
  • evaluate your work honestly, including its limitations
  • consider the ethical implications of immersive design

11 Self-Check Questions

  1. What is the difference between VR, AR, and MR?
  2. Where does spatial computing fit in relation to XR?
  3. What does the reality-virtuality continuum describe?
  4. What is the difference between 3DOF and 6DOF tracking?
  5. What are the advantages and limitations of inside-out tracking?
  6. Name three devices available in this module and describe one unique capability of each.
  7. Why is comfort a design constraint in XR?
  8. How does the body function differently as an interface in XR compared to traditional computing?

12 Further Reading

  • Jerald, J. (2015) The VR Book: Human-Centered Design for Virtual Reality. ACM Books. Chapters 1 and 2.
  • LaViola, J. et al. (2017) 3D User Interfaces: Theory and Practice (2nd ed.). Addison-Wesley. Chapter 1.
  • Milgram, P. and Kishino, F. (1994) A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12).
  • Sutherland, I. (1965) The ultimate display. Proceedings of IFIP Congress.
  • Meta Quest developer documentation: https://developer.oculus.com