Reimagining Industrial Training with XR

The project aimed to understand the real needs of training creators, define user requirements, and test a prototype using a no-code XR platform. The goal was to evaluate the practicality, usability, and effectiveness of XR in safety-critical environments.

University of Nottingham - JLR

User Researcher

July '24 - Sep '24

Problem Statement

Considering the practical challenges training creators face in usability, scalability, and content authoring, this research addresses the question -

How can XR be effectively designed to support industrial training workflows?

Why this research matters

Many industrial training processes can't be practiced repeatedly due to high risk, cost, or complexity.

According to the 70-20-10 learning model, 70% of learning comes from hands-on experience, yet current training methods often rely on passive formats like manuals or lectures.

XR provides a safe way to simulate real-world scenarios, but adoption is hindered by tool complexity, high development cost, and hardware limitations.

Many training teams lack the technical resources or experience to build XR content from scratch.

This research focuses on designing a low-barrier XR training tool that meets the needs of training creators and is feasible to implement in real environments.

Research Goals

Identify User Requirements of Industrial Training Creators

Develop and Test a Prototype

Evaluate Usability and Effectiveness

Recommend Improvements and Future Research

Research Process

Step 1

Literature Review

Reviewed 33 academic and industry sources related to XR training and usability.

Identified 11 key design challenges -

Integration with existing workflows

Modular content structure

Instructional clarity and feedback

Ease of use and onboarding

Scalability and cost-efficiency

Inclusivity and accessibility

Device comfort and usability

No-code/low-code authoring

Performance tracking support

Resistance to XR adoption

Realistic user pacing and flow

Step 2

Expert Interviews

Interviewed 7 experts from academia and industry.

Semi-structured interviews, tailored to each participant's background.

The interviews were designed to encourage open yet focused discussion on our research interests, with the primary objective of addressing the following questions -

How do academic collaborations and industry insights inform user experiences with XR technologies?

How effective are the existing training methods & what are the challenges?

How can AR/VR enhance training effectiveness?

Themes discovered from the interviews -

Step 3

Consolidating User Requirements

22 URs identified

Prioritised based on relevance, feasibility, and impact -

8 Critical (Must-Have)

11 High Priority

4 Medium Priority

Step 4

Prototyping

Tool Selection

Compared Unity, ZapWorks, Torch AR, and Reality Composer.
Chose Reality Composer due to -

Rapid prototyping capabilities

Real-time AR scene editing

Compatibility with widely available iOS devices

Low learning curve for non-technical users

Prototype Key Features

Drag-and-drop UI for object interaction

Task segmentation with guided AR scenes

Visual overlays and instructional audio

Completion indicators for objectives
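The task segmentation and completion indicators above can be sketched as a simple data model. This is an illustrative sketch only, not the prototype's actual implementation (Reality Composer manages scenes and behaviours internally); the class names, step titles, and asset paths are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingStep:
    """One segment of a guided AR training task (hypothetical model)."""
    title: str
    instruction_audio: str   # path to the narration clip for this step
    overlay_asset: str       # visual overlay shown in the AR scene
    completed: bool = False

@dataclass
class TrainingModule:
    """A reusable, ordered sequence of steps - the 'modular' unit."""
    name: str
    steps: list[TrainingStep] = field(default_factory=list)

    def complete_step(self, index: int) -> None:
        self.steps[index].completed = True

    def progress(self) -> float:
        """Fraction of steps completed, driving a completion indicator."""
        if not self.steps:
            return 0.0
        return sum(s.completed for s in self.steps) / len(self.steps)

# Hypothetical module mirroring the equipment-shutdown test scenario
shutdown = TrainingModule("Equipment shutdown", [
    TrainingStep("Isolate power", "audio/step1.mp3", "overlays/switch.usdz"),
    TrainingStep("Lock out panel", "audio/step2.mp3", "overlays/lock.usdz"),
])
shutdown.complete_step(0)  # shutdown.progress() is now 0.5
```

Because each module is self-contained, the same steps could in principle be reordered or reused across different training flows, which is the modularity the design aims for.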

Step 5

Testing the Prototype

Usability Testing Results

I tested the prototype with 6 participants - 5 academic researchers & 1 industry expert.
Scenario - Completing an equipment shutdown using the XR training prototype.

100% task completion

5/6 completed the task without help

Average task time: 4 minutes 28 seconds

Average error rate: 1.8 per user

SUS Score: 66.25 (Marginal Usability; from 3 users)

NASA-TLX Score: 45.8 (Moderate Mental Load)
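The SUS score reported here comes from the standard 10-item questionnaire: odd-numbered (positively worded) items contribute their rating minus 1, even-numbered (negatively worded) items contribute 5 minus their rating, and the total is multiplied by 2.5 to give a 0-100 score. A minimal sketch of that calculation (the ratings below are hypothetical examples, not the study's raw data):

```python
def sus_score(responses):
    """Compute the System Usability Scale score for one respondent.

    `responses` is a list of 10 ratings on a 1-5 scale, in
    questionnaire order (items 1, 3, 5, 7, 9 are positively worded;
    items 2, 4, 6, 8, 10 are negatively worded).
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    total = 0
    for i, rating in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...): contribution = rating - 1
        # Even-numbered items (index 1, 3, ...): contribution = 5 - rating
        total += (rating - 1) if i % 2 == 0 else (5 - rating)
    return total * 2.5  # scale the 0-40 raw total to 0-100

# Hypothetical ratings for three respondents (for illustration only)
users = [
    [4, 2, 4, 2, 4, 2, 4, 3, 3, 2],
    [4, 3, 3, 2, 3, 2, 4, 2, 4, 3],
    [3, 2, 4, 3, 4, 2, 3, 2, 3, 2],
]
average = sum(sus_score(u) for u in users) / len(users)
```

A score in the mid-60s sits just below the commonly cited SUS benchmark of 68, which is why it is reported as marginal usability here.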

Feedback Highlights

Visual instructions were helpful (4/6)

Onboarding flow was confusing (3/6)

Navigation became easier after initial familiarisation (5/6)

Outcomes

What Worked

Tasks were broken down clearly and were easy to follow.

Visual guidance improved confidence and accuracy.

The modular design allowed for reuse across different training flows.

What Didn’t

Users wanted more guidance when making mistakes.

Onboarding was not intuitive for first-time users.

Lacked real-time analytics for instructors or evaluators.

If you'd like to discuss the research process and outcomes, feel free to reach out.

© 2025 Gaurav Sinha
