Week 3 Lecture Notes: Perception and Presence

1 Introduction

This week we move from hardware to the human. Understanding how the brain constructs the experience of being somewhere is fundamental to XR design. Every design decision — from frame rate to spatial audio to locomotion technique — has its roots in human perception.

The central concept this week is presence: the subjective feeling of being in a place. Presence is what makes XR powerful. It is also what makes it potentially harmful if designed carelessly.


2 Presence and Immersion

2.1 Defining Presence

Presence is one of the most studied and debated concepts in XR research. At its simplest, it is the feeling of “being there” — the sense that you are actually in the virtual environment rather than observing it from outside.

Mel Slater, one of the leading researchers in this area, distinguishes two components of presence:

Place Illusion (PI) is the strong illusion of being in a place even though you know you are not. It is supported primarily by the quality of the sensorimotor contingencies — the relationship between your movements and the changes in the sensory display. If you turn your head and the visual scene updates correctly, your brain interprets this as evidence that you are in a real place.

Plausibility Illusion (Psi) is the illusion that what is happening is really happening. It is supported by the responsiveness of the environment — if you reach out and touch a virtual object and it responds, your brain interprets this as evidence that the event is real.

Slater argues that both PI and Psi are necessary for full presence. A highly immersive display that shows a completely unresponsive environment will produce PI but not Psi. A responsive environment on a low-quality display may produce Psi but not PI.

2.2 Immersion as an Objective Property

Immersion is often used interchangeably with presence, but they are distinct. Immersion is an objective, measurable property of the technology — it describes how completely the system replaces the user’s sensory input with synthetic input.

A system with a wide field of view (FOV), high resolution, low latency, and spatial audio is more immersive than one with a narrow FOV and no audio. But a user may feel more present in the less immersive system if the content is more compelling and responsive.

This distinction has important design implications. You cannot guarantee presence by maximising immersion. You must also design content that supports plausibility.


3 Perceptual Modalities

The brain integrates information from multiple sensory systems simultaneously. In XR, understanding each modality helps you design experiences that support presence and avoid discomfort.

3.1 Visual Perception

Vision is the dominant sense in most XR systems. The visual system provides information about:

Stereopsis: The brain uses the slight difference in the images received by each eye (binocular disparity) to compute depth. HMDs present slightly different images to each eye to simulate this. The effectiveness of stereopsis depends on the accuracy of the interpupillary distance (IPD) setting.

Motion parallax: As you move your head, objects at different distances move at different rates across your visual field. This is a powerful depth cue that is well-supported by 6DOF tracking.

Accommodation and vergence: As discussed in Week 2, the vergence-accommodation conflict is a significant limitation of current HMDs.

Peripheral vision: The peripheral visual field is important for detecting motion and maintaining spatial awareness. A narrow FOV that cuts off peripheral vision reduces immersion and can cause disorientation.
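The strength of the stereopsis cue described above falls off rapidly with distance, which is why binocular disparity matters most for near-field interaction. As an illustrative sketch (the 63 mm IPD and the distances are assumed example values), the vergence angle between the two eyes' lines of sight can be computed from the IPD and fixation distance:

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating a
    point straight ahead at the given distance. A larger angle means
    a stronger binocular disparity cue."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

ipd = 0.063  # ~63 mm, a commonly cited average adult IPD
for d in (0.5, 1.0, 2.0, 10.0):
    print(f"{d:5.1f} m -> {vergence_angle_deg(ipd, d):.2f} deg")
```

At half a metre the angle is several degrees; at ten metres it is a fraction of a degree, so at that range the brain relies more on motion parallax and pictorial cues than on stereopsis.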

3.2 Auditory Perception

The auditory system contributes significantly to presence. Research has shown that high-quality spatial audio can increase presence scores substantially, and that audio that does not match the visual scene can break presence rapidly.

The brain localises sounds using:

Interaural Time Difference (ITD): The difference in the time it takes for a sound to reach each ear. Sounds from the left arrive at the left ear slightly before the right ear.

Interaural Level Difference (ILD): The difference in the loudness of a sound at each ear. The head creates an acoustic shadow that attenuates sounds on the far side.

Spectral cues: The shape of the outer ear (pinna) modifies the frequency content of sounds in ways that depend on the direction of the source. These spectral cues are captured in the head-related transfer function (HRTF) and are particularly important for distinguishing sounds above and below the listener.
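The ITD cue above can be approximated with Woodworth's classic spherical-head model, which treats the head as a rigid sphere of fixed radius. A minimal sketch (the head radius and speed of sound are assumed typical values, not parameters from any particular audio engine):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average head radius (~8.75 cm)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def itd_seconds(azimuth_deg: float) -> float:
    """Woodworth's spherical-head approximation of the interaural
    time difference for a distant source at the given azimuth
    (0 deg = straight ahead, 90 deg = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> {itd_seconds(az) * 1e6:6.1f} us")
```

The model predicts a maximum ITD of roughly 650 microseconds for a source directly to one side, which matches the commonly cited range for human listeners.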

3.3 Proprioception

Proprioception is the sense of the position and movement of your own body, provided by receptors in muscles, tendons, and joints. It tells you where your limbs are without looking at them.

In VR, proprioception can conflict with visual information. If you see your virtual hand in a position that does not match where your real hand is, this creates a conflict that can be disorienting. This is why accurate hand tracking and controller tracking are important for natural interaction.

3.4 The Vestibular System

The vestibular system, located in the inner ear, detects:

  • linear acceleration (moving forward, backward, up, down)
  • rotational acceleration (turning)
  • the direction of gravity

Vestibular-visual conflict is the primary cause of simulator sickness in VR. When the visual system detects motion (because the virtual environment is moving) but the vestibular system detects none (because the user is stationary), the brain registers a sensory conflict. One influential explanation, the poison hypothesis, holds that this response evolved to protect against ingested neurotoxins that cause hallucinations, for which sensory conflict was a reliable warning sign. The result is nausea.

The perception of self-motion induced by visual stimulation alone is called vection.

3.5 Haptic Perception

The haptic system encompasses touch (cutaneous sensation) and force (kinaesthetic sensation). It provides information about:

  • surface texture
  • temperature
  • compliance (how much an object deforms under pressure)
  • weight and inertia

Current XR systems provide very limited haptic feedback — primarily vibration from controller motors. The absence of rich haptic feedback is a significant limitation for experiences that involve object manipulation, surface interaction, or social presence (the feeling of being with another person).


4 Proximal and Distal Stimuli

This distinction, drawn from perceptual psychology, is useful for understanding how XR works.

The distal stimulus is the actual object or event in the world — a tree, a voice, a surface.

The proximal stimulus is the pattern of energy that reaches the sensory receptor — the pattern of light on the retina, the pressure wave at the eardrum, the pressure on the skin.

The brain never has direct access to the distal stimulus. It only has access to the proximal stimulus, and it must infer the distal stimulus from that. This inference is based on prior experience, context, and the integration of multiple sensory signals.

XR works by manipulating the proximal stimulus. We cannot put a real tree in front of the user, but we can create a pattern of light on the retina that is consistent with the presence of a tree. If the proximal stimulus is convincing enough, the brain will infer the presence of the tree — and the user will feel present in a forest.

This framing helps explain why presence is not simply a function of display quality. The brain is not just checking whether the image is photorealistic. It is checking whether the pattern of sensory input is consistent with being in a real place. Consistency across modalities, responsiveness to movement, and plausible cause-and-effect relationships are all important.


5 Simulator Sickness

5.1 Definition and Symptoms

Simulator sickness is a form of motion sickness that occurs in simulated environments. Symptoms include:

  • nausea and stomach discomfort
  • disorientation and dizziness
  • eye strain and headache
  • fatigue
  • pallor and sweating in severe cases

Symptoms typically develop during or after use and can persist for some time after the headset is removed.

5.2 Causes

Latency is the delay between the user’s head movement and the corresponding update of the display. Even small amounts of latency (above approximately 20 milliseconds) can cause discomfort. Modern HMDs use asynchronous timewarp and reprojection techniques to minimise perceived latency.

Low frame rate causes judder — a stuttering of the image during movement. This is both visually unpleasant and a significant cause of sickness. Maintaining a consistent frame rate at the display's refresh rate (72 Hz or higher on most standalone HMDs) is essential.
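The relationship between refresh rate and the render budget is simple arithmetic, but it is worth making concrete: a missed frame means the previous image is displayed twice, roughly doubling the effective motion-to-photon delay for that frame. A small sketch (the refresh rates listed are common HMD values used for illustration):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (72, 90, 120):
    budget = frame_budget_ms(hz)
    # A dropped frame shows the previous image twice, so the user
    # sees roughly double the per-frame delay for that interval.
    print(f"{hz:3d} Hz -> {budget:5.2f} ms budget, "
          f"~{2 * budget:5.2f} ms if a frame is dropped")
```

Note that a single dropped frame at 72 Hz already pushes the displayed interval to about 28 ms, well above the ~20 ms latency threshold mentioned above.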

Vection — visual motion without vestibular motion — is the primary cause of sickness during smooth locomotion in VR. The visual system detects movement, but the vestibular system does not, creating a conflict.

FOV mismatch occurs when the visual motion in the display does not match the expected motion for the user’s real-world movement. This can occur with incorrect camera settings or locomotion implementations.

5.3 Mitigation Strategies

Teleportation eliminates vection by moving the user instantaneously rather than continuously. It is the most effective comfort option for locomotion.

Vignetting reduces the peripheral visual field during movement, which reduces the vection signal. Many VR games offer this as a comfort option.

Snap turning rotates the user’s view in discrete steps rather than continuously, eliminating rotational vection.
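Vignetting and snap turning can both be expressed in a few lines. The sketch below is illustrative only — the FOV limits, maximum speed, and 30-degree turn step are assumed example values, not settings from any particular engine:

```python
def vignette_fov_deg(speed_m_s: float,
                     full_fov_deg: float = 100.0,
                     min_fov_deg: float = 50.0,
                     max_speed_m_s: float = 3.0) -> float:
    """Linearly narrow the visible FOV as locomotion speed increases,
    reducing the peripheral vection signal. Values are illustrative."""
    t = min(speed_m_s / max_speed_m_s, 1.0)
    return full_fov_deg - t * (full_fov_deg - min_fov_deg)

def snap_turn(yaw_deg: float, direction: int, step_deg: float = 30.0) -> float:
    """Rotate the view in one discrete step (+1 = right, -1 = left),
    avoiding continuous rotational vection entirely."""
    return (yaw_deg + direction * step_deg) % 360.0

print(vignette_fov_deg(0.0))   # stationary: full FOV
print(vignette_fov_deg(3.0))   # at max speed: narrowed FOV
print(snap_turn(350.0, +1))    # wraps past 360
```

In practice the vignette is usually eased in and out over a few frames rather than applied instantly, but the speed-to-FOV mapping is the core idea.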

Frame rate maintenance is critical. Intermittent drops below the target frame rate are more disorienting than a consistently lower, but stable, frame rate.

User control — allowing users to choose their preferred locomotion method and comfort settings — is important because susceptibility to simulator sickness varies significantly between individuals.


6 Health Effects and Ethics

6.1 Physical Health Effects

Beyond simulator sickness, extended XR use can cause:

  • Eye strain and fatigue from the vergence-accommodation conflict and sustained focus at a fixed distance
  • Neck and shoulder strain from the weight of the headset and sustained head positions
  • Post-VR disorientation — a brief period of disorientation after removing the headset, particularly after extended use
  • Social isolation — extended use of fully immersive VR removes the user from their physical social environment

6.2 Ethical Responsibilities

The power of presence creates ethical responsibilities for XR designers.

Disclosure: Users should be informed of potential health risks before using XR systems, particularly for extended sessions.

Comfort options: Experiences should provide comfort options (teleportation, vignetting, session time limits) rather than forcing users into uncomfortable configurations.

Accessibility: XR experiences should be designed to be accessible to users with different physical abilities, sensory sensitivities, and susceptibilities to simulator sickness.

Manipulation: The sense of presence can be exploited to create powerful emotional responses. This raises questions about the ethics of using XR for persuasion, advertising, or behaviour change. Designers should consider whether their use of presence is in the user’s interest.


7 Measuring Presence

Several standardised questionnaires have been developed to measure presence:

The Presence Questionnaire (PQ) (Witmer and Singer, 1998) measures presence across several dimensions including involvement, sensory fidelity, and interface quality.

The Igroup Presence Questionnaire (IPQ) (Schubert et al., 2001) measures spatial presence, involvement, and realness.

These tools are useful for comparing different design decisions and for evaluating the effectiveness of XR experiences. You will use them in Week 11 when we cover evaluation methods.
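Scoring such questionnaires typically amounts to averaging Likert-scale responses within each subscale. The sketch below uses made-up item IDs and subscale groupings purely for illustration — the real PQ and IPQ define their own items, anchors, and reverse-scored questions, which you should consult before using them in Week 11:

```python
from statistics import mean

# Hypothetical item-to-subscale mapping for illustration only;
# real instruments (PQ, IPQ) specify their own item sets.
SUBSCALES = {
    "spatial_presence": ["q1", "q2", "q3"],
    "involvement":      ["q4", "q5"],
    "realness":         ["q6", "q7"],
}

def score(responses: dict[str, int], scale_max: int = 7) -> dict[str, float]:
    """Average 1..scale_max Likert responses within each subscale."""
    for item, value in responses.items():
        if not 1 <= value <= scale_max:
            raise ValueError(f"{item} out of range: {value}")
    return {name: mean(responses[i] for i in items)
            for name, items in SUBSCALES.items()}

print(score({"q1": 6, "q2": 5, "q3": 7,
             "q4": 4, "q5": 5, "q6": 6, "q7": 6}))
```

Subscale averages let you compare design variants (for example, teleportation versus smooth locomotion) on spatial presence specifically, rather than on a single aggregate score.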


8 Self-Check Questions

  1. What is the difference between presence and immersion?
  2. What are Slater’s two components of presence?
  3. What is vection and why does it cause sickness?
  4. What is the vestibular system and how does it contribute to simulator sickness?
  5. What is the difference between proximal and distal stimuli?
  6. Name three design strategies that can reduce simulator sickness.
  7. What ethical responsibilities do XR designers have regarding health effects?
  8. How does spatial audio contribute to presence?

9 Further Reading

  • Slater, M. (2009) Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B, 364(1535), pp. 3549–3557.
  • Witmer, B. and Singer, M. (1998) Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3), pp. 225–240.
  • Jerald, J. (2015) The VR Book, Chapters 4 and 5.
  • Kolasinski, E. (1995) Simulator sickness in virtual environments. US Army Research Laboratory Technical Report ARI-TR-1027.
  • LaViola, J. (2000) A discussion of cybersickness in virtual environments. ACM SIGCHI Bulletin, 32(1).