ARTUR – Augmented Reality

Training Utility for Radiography

Design, development and usability testing of an

educational tool for radiographic patient positioning

Master Thesis

For attainment of the academic degree of

Master of Science in Engineering (MSc)

in the Master Programme Digital Healthcare

at St. Pölten University of Applied Sciences

by

Christoph Kamp, BSc 1810756802

First advisor: Dr. Markus Wagner, BSc MSc

St. Pölten, 03.06.2020

Declaration

I declare that I have developed and written the enclosed Master Thesis completely

by myself and have not used sources or means without declaration in the text. Any

thoughts from others or literal quotations are clearly marked. This work was not

used in the same or in a similar version to achieve an academic grading or is being

published elsewhere.

.................................................. ................................................

Place, Date Signature

St. Pölten, 03.06.2020

Preface

My interest in the possibilities of virtual training in the education of radiographers,

especially in the training of patient positioning for plain radiography, was sparked

by my involvement as a lecturer for plain radiographic imaging in a radiographer

degree programme. From personal experience, I saw demand for time and space

independent training of radiographic positioning. With augmented reality becoming

more available and accessible, I wanted to explore possibilities to enable

augmented reality supported training of radiographic positioning.

I want to thank my advisor Dr. Markus Wagner, BSc MSc for his valuable input and

feedback during the writing of this thesis.

I would also like to thank my parents, Sonja & Günther for encouraging and

supporting me throughout my whole life and enabling me to pursue my passions.

Special thanks go to my dear girlfriend Martina, who supported and motivated me

during the whole master programme. Thank you for your patience and

encouragement!

Abstract

Today, most of the teaching and training of radiographic positioning is done either

via simulation-based roleplay training performed in laboratory exercises, or in a

clinical setting at internships. Supported by academic studies at the university,

simulation-based training and clinical training are critical parts of the education of

radiographer students. Virtual training environments have been shown to provide similar

effects to practical training, while being more cost effective and allowing students

to study according to their own schedule and to focus on their personal academic

requirements.

To evaluate the feasibility and usability of an augmented reality supported tool for

the training of radiographic positioning, ARTUR was developed and tested as part

of this thesis. ARTUR is an augmented reality application using the Microsoft

HoloLens mixed reality headset to project holograms into the user’s space. These

holograms of body parts, positioned for plain radiographic procedures, act as

references for students to practice radiographic patient positioning.

ARTUR was developed utilizing a user centred design approach. Functional

requirements were set by a focus group. These requirements aided in the

development and usability testing of a working prototype. Following user tests,

semi-structured expert interviews were held to gather qualitative data for the

evaluation of the prototype.

Overall, users gave mostly positive feedback. All users were able to complete the

testing scenario without additional assistance. No major usability issues were

identified, leading to the assumption that augmented reality supported training of

plain radiographic procedures is a viable concept.

Kurzfassung

The education of radiographers is founded on academic teaching as well as clinical internships and practical exercises in laboratory environments. Especially in the field of plain projection radiography, the practical training of students plays a central role.

Studies have already shown that virtual learning environments can achieve effects similar to practical training. Moreover, they are more cost effective and enable students to train on their own, independently of timetable and location.

To support future radiographers in their education, this thesis developed a prototype of an augmented reality application for training patient positioning in plain projection radiography (ARTUR). Using a Microsoft HoloLens mixed reality headset, it projects holograms of body parts in correct patient positioning into the user's space. These holograms are intended to serve learners as references to reproduce.

A user centred design approach was pursued for the development of ARTUR, with the functional requirements of the application defined by means of a focus group. To evaluate its usability, user tests as well as guided semi-structured expert interviews were conducted.

ARTUR was rated predominantly positively by the test users. In particular, it was highlighted that the user tests could be carried out without additional assistance. The test users identified no major usability issues. These assessments support the assumption that augmented reality positively influences and supports the training of plain radiographic positioning.

Table of Contents

Declaration II

Preface III

Abstract IV

Kurzfassung V

Table of Contents VI

1 Introduction 1

1.1 Research Questions 2

1.2 Structure & Methods 2

2 Theoretical Background & State of the Art 4

2.1 Radiographic Positioning 4

2.1.1 Importance of Radiographic Positioning in Plain Radiography 5

2.1.2 Radiographic Positioning - Terminology 7

2.2 Augmented Reality 11

2.2.1 Microsoft HoloLens 11

2.3 3D Scanning 13

2.3.1 Structured Light Scanners 14

2.4 State of the Art in Simulation-Based Training in Radiography Education 15

3 Design and Development of ARTUR 25

3.1 User Needs & Context of Use 27

3.1.1 Focus Group Setup 27

3.1.2 Focus Group Interview 28

3.1.3 Focus Group Findings 31

3.2 Task Analysis 33

3.3 Functional Requirements 35

3.4 Development of ARTUR 36

4 Evaluation of ARTUR 47

4.1 Expert Interview Setup 48

4.2 Expert Interview Findings 50

4.3 Evaluation Results 53

5 Discussion 57

5.1 Contributions 60

6 Conclusion 61

References 63

List of Figures 69

List of Tables 71

Appendix 72

A. Declaration of Consent 72

B. Focus Group Interview Guideline 73

C. Abridged Transcript - Focus Group Interview 74

D. Testing Guideline for the Test Users 78

E. Expert-Interview Guideline 79

F. German Transcripts of the Semi-Structured Expert Interviews 80

a. German Interview with Participant One 80

b. German Interview with Participant Two 84

c. German Interview with Participant Three 88

G. English Transcripts of the Semi-Structured Expert Interviews 92

a. English Interview with Participant One 92

b. English Interview with Participant Two 96

c. English Interview with Participant Three 99

1 Introduction

Practical application of theoretically acquired knowledge helps students deepen

their understanding and can increase their motivation to study [1].

Today, most of the teaching and training of radiographic positioning is done either

via simulation-based roleplay training performed in laboratory exercises or at

internships in a clinical setting. Supported by academic studies at the university,

simulation-based training and clinical training are critical parts of the education of

radiographer students [2].

Simulation-based training is well established in many health care professions,

with surgery, emergency medicine and radiological procedures being just a few

examples [3]. Like real-life simulation-based training, virtual simulation is

already in use in a variety of health care applications. In particular, virtual

reality (VR) platforms that simulate numerous surgical procedures or interventional radiography

supplement existing resources such as lectures, books or e-learning resources. In

addition, three-dimensional (3D) education environments can improve the

contextualization of learning and increase the student’s engagement [4]. Virtual

training environments enable learners to study without the presence of a teacher,

trainer or supervisor, who are needed when training in a laboratory or clinical

setting. This enables students to study according to their own schedule and

concentrate on their academic requirements [5].

Despite these advantages of virtual training environments, augmented reality (AR)

enabled training simulations are far from being mainstream in health care

education. Currently, only a few AR-enabled educational options are available.

Since the supply of readily available software is low and the initial cost of

AR hardware and software is rather high compared to traditional learning media,

the added value of AR-supported training is often overlooked [6].

1.1 Research Questions

To explore the possibilities of AR supported practical training in healthcare

education, the main goal of this thesis is to design and evaluate a prototype for an

AR enabled simulation-based training method for patient positioning in plain

radiography. Through the development and evaluation of this tool, the following

research question and its two sub questions will be answered:

Main Research Question:

How can virtual learning environments support radiographers in learning

radiographic positioning?

Sub Question One:

How can a visual aid to assist users in training the positioning process be

realized?

Sub Question Two:

How do users evaluate the design and usability of the visual positioning aid?

1.2 Structure & Methods

For all empirical research in this thesis, ethics approval was granted by the state

of Lower Austria.

To introduce the topics relevant to this thesis, radiographic

positioning, the principles of AR and three-dimensional (3D) scanning are covered

in chapter 2. The literature researched for this chapter comprises textbooks

and scientific publications. If given, specific technical data of the devices used to

realise this thesis is derived from the manufacturer’s websites or technical

documentation.

To give an insight into current scientific developments around virtual and AR

training simulation, chapter 2.4 evaluates scientific writings dealing with those

topics. To ensure the currency of the information, only articles and studies

published in reputable journals after 2015 are used to analyse the current state of

the art in the training of radiographic positioning.

To help answer the main research question, as well as the first sub question, a

focus group of four teaching radiographers will meet and discuss the possibilities

of AR assistance for radiographic positioning training. In addition, the findings of

the focus group interview will assist in the user centred design of the prototype of

the AR simulation environment ARTUR, which will be developed as part of this

thesis. The qualitative data, gathered through the focus group interview, will be

analysed to help set task requirements and (non-) functional requirements for the

prototype of ARTUR [7, pp. 204–206].

The design and development of ARTUR will be documented in chapter 3. The

application will be developed for a Microsoft HoloLens 1 mixed-reality headset

utilizing the Unity 3D Engine.

The evaluation of the application’s usability will be accomplished through user tests

and semi-structured expert interviews with two radiographers who have at least

five years of professional experience and one expert on AR/VR applications with

at least three years of experience with AR. Each test user will do an individual test

run of a training session with ARTUR, guided by a testing guideline. Following the

test, individual semi-structured interviews will be held, recorded and transcribed.

To allow for a structured analysis of the gathered qualitative data, the content of

the transcripts will be thematically categorised for statements relevant to the

feasibility and usability of ARTUR [7, pp. 198–200]. Any usability issues identified

by the test users will be rated on the scale of severity for usability problems by

Nielsen [8].

The findings of the literature research, the focus group interview, the user centred

design of ARTUR and the results of the usability tests are then discussed and

conclusions drawn. In addition, limitations of this thesis and an outlook on future

possibilities of AR in radiographer education as well as possible future additions to

ARTUR are discussed.

2 Theoretical Background & State of the Art

This chapter gives an introduction to the topics relevant to this master thesis.

Since ARTUR will be an AR training application for radiographic positioning, basic

principles of plain radiography and the importance of adequate positioning are

discussed in the following subchapter 2.1. To get an insight into the underlying

technology of ARTUR, AR and the Microsoft HoloLens are also explained in

subchapter 2.2. The 3D models of various body parts used for the AR holograms

were acquired with a structured light 3D scanner, the principles of which will also

be explained in subchapter 2.3. The current role of simulation-based training in

healthcare and especially radiographic training will be explored in subchapter 2.4.

2.1 Radiographic Positioning

Soon after the discovery of x-rays and their presentation to the Physical-Medical

Society in Würzburg in 1896 by Wilhelm Conrad Röntgen († 1923), the potential of

ionizing radiation in medicine was explored [9]. One of the pioneers of radiology,

Guido Holzknecht († 1931), a doctor who established the radiology department at

the Vienna General Hospital in the early 20th century, was among the first to

urge for the standardization of radiographic imaging. The smallest number of

standardized projections which would still allow proper evaluation of the

radiographic images had to be found and described [10].

The WHO released a revision of its Manual of Radiographic Technique in 2003,

which illustrated radiographic positioning techniques and stated that “The range of

basic radiographic techniques described is sufficient to enable more than 90% of

the problems diagnosable through radiography to be routinely examined” [11, p.

7]. Each position was described in written form and illustrated via several

photographs of a patient in a certain position, relative to the x-ray equipment. In

addition to the patient’s position, technical properties, such as recommended size

of film cassettes and exposure parameters, were listed for each position, a format

which is still in use by many textbooks for radiographic positioning today [12], [13].

2.1.1 Importance of Radiographic Positioning in Plain Radiography

Radiographers are required to display high levels of accuracy when positioning

patients; even slight over-rotations or misalignments can result in poor image

quality and complicate the diagnosis of a radiographic image. In addition,

radiographers are obligated to protect patients from unnecessary radiation

exposure, which ideally requires radiographic images to be acquired on the first

try [14].

Despite advancements in diagnostic imaging, like computed tomography (CT)

or magnetic resonance imaging (MRI), which generate three-dimensional

images and therefore enable accurate diagnosis of pathologies, plain radiography

remains an essential diagnostic tool, mainly because it is a simple, cheap and

accessible technique which exposes patients to relatively low doses of radiation

while still providing valuable clinical information [15].

Due to its nature, plain radiography generates two-dimensional summary images

of three-dimensional bodies. The patient's body part represents an obstacle for the

x-rays: tissues with greater thickness, density and atomic number absorb more

of the passing x-rays. The remaining x-rays, which leave the patient's body,

are then detected by a digital detector. The detected intensity distribution of the

x-rays on the detector results in a plain radiograph, where larger amounts of

x-rays on the detector result in darker pixels on the image – hence why bones are

displayed as white and air as black [16].
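The attenuation behaviour described above is captured by the Beer-Lambert law, I = I0 · exp(−μx). As a minimal sketch with purely illustrative, non-clinical attenuation coefficients, this is why thicker and denser tissue leaves fewer x-rays to reach the detector and therefore appears brighter on the image:

```python
import math

def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
    """Fraction of a monoenergetic x-ray beam transmitted through a
    homogeneous absorber (Beer-Lambert law: I/I0 = exp(-mu * x))."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative attenuation coefficients (1/cm), NOT clinical values:
MU_BONE, MU_SOFT_TISSUE = 0.5, 0.2

# Through 2 cm of material, bone transmits fewer x-rays than soft
# tissue, so it casts a brighter (whiter) shadow on the radiograph:
print(transmitted_fraction(MU_BONE, 2.0) < transmitted_fraction(MU_SOFT_TISSUE, 2.0))  # True
```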

Figure 1: Frontal and lateral wrist x-ray in two typical projections for a musculoskeletal

radiographic image. When acquiring images of the wrist, two projections, angled 90° to

one another, are necessary [17].

This principle entails the danger that pathologies with lower density could be

obscured by an overlaying anatomical structure with higher density. Thus, multiple

views of the same body part are needed. These different projections of the same

anatomical structure ensure that no pathologies or fractures are obscured. To

minimize the radiation exposure of the patients, most plain radiographic studies of

the musculoskeletal system are comprised of two projections, angled 90° to one

another [16]. An example of a plain radiographic image pair is shown in Figure 1.

In addition, standard radiographic positioning allows radiologists to describe

findings on radiographs in relation to anatomic structures. Standard positions

ensure comparability between multiple instances of radiographic images. Berlin

[18] has argued that a radiologist's accuracy in reporting radiographs cannot be

greater than the quality of the radiographs presented for interpretation. Proper

radiographic positioning is essential for high quality radiographs.

2.1.2 Radiographic Positioning - Terminology

To learn and compare radiographic positioning techniques, specific terminology is

used to describe patient positions, x-ray beam positioning or direction and size of

the effective radiation field.

Since commonly used location designations, such as above or below, change

depending on the position of the human body, anatomical directional terms are

used. These terms are based on the anatomical position: a person

standing upright, eyes straight ahead, hands hanging beside the body with

the palms facing forward and the feet parallel. The anatomical directional terms

are defined relative to this position and are independent of the person's actual

position [19, p. 5].

The most commonly used anatomical directional terms are:

• Anterior: in front of, front

• Posterior: behind, towards the rear

• Superior: above, over

• Inferior: below, under

• Distal: away from the body center

• Proximal: closer to the body center

• Dorsal: towards the back

• Ventral: towards the stomach

• Lateral: towards the side, away from the mid-line

• Medial: towards the mid-line, away from the side

• Cranial: towards the head

• Caudal: towards the bottom

In addition, directions in the human body can be defined by planes:

• Sagittal plane: a plane which runs through the body from front to back, it

divides the body into left and right regions.

• Coronal plane: a vertical plane which runs through the body from side to

side, it divides the body into front and back regions.

• Transverse plane: a horizontal plane which runs through the body, it

divides the body into upper and lower regions [19, pp. 6–7].

Figure 2: Sagittal, coronal and transverse plane, illustrated on a human standing in the

anatomical position [20].

Figure 2 illustrates a human standing in the anatomical position and the three main

planes: the transverse plane, the coronal plane and the sagittal plane.

To define the suitable x-ray beam direction for a certain radiograph, the path of the

beam through the patient's body is described. For a chest x-ray where the patient

is positioned with their back towards the x-ray tube, the terminology would be chest

x-ray posterior-anterior, since the beam enters the body at the back (posterior) and

leaves it at the front (anterior). This tells the radiographer how to position the x-ray

equipment relative to the patient, without the need of further instructions. In

addition, the designation of the patient’s position and the path of the x-ray beam

allow radiologists to distinguish left and right on the summary image.
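The naming scheme is purely compositional: the beam-path designation is the entry surface followed by the exit surface. A hypothetical helper (not taken from the ARTUR code base) makes this explicit:

```python
def projection_name(entry: str, exit_surface: str) -> str:
    """Compose the beam-path designation from the surface where the
    x-ray beam enters the body and the surface where it leaves it.
    Illustrative helper only; the term list is a small subset."""
    valid = {"anterior", "posterior", "lateral", "medial", "cranial", "caudal"}
    if entry not in valid or exit_surface not in valid:
        raise ValueError("unknown anatomical directional term")
    return f"{entry}-{exit_surface}"

# A chest x-ray with the patient's back towards the x-ray tube:
print(projection_name("posterior", "anterior"))  # posterior-anterior
```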

Since plain radiography projects three-dimensional objects onto a two-dimensional

plane and x-rays leave the x-ray tube in a cone-like shape, objects positioned

further from the vertical central beam appear more distorted. To project the object

of interest true to original, the central beam must be positioned on top of this object.

To aid in positioning the cone beam, a light field with a crosshair indicating

the central beam is projected onto the patient. This light field also indicates the size of the cone beam

and can be collimated to the size of the object. The desired position of the central

beam on the patient’s body is determined by either visible or palpable anatomical

structures. The central beam is positioned on or near these structures [13, pp. 21–

22].

To perform a radiographic study, additional technical parameters need to be

known. The suitable format of the detector is given in cm width and height.

Common detector sizes are 18/24 cm, 24/30 cm and 35/43 cm, which are still based

on analogue film formats.

Most radiographic equipment allows users to select either a small or a large focal

point. The size of the focal point determines the spatial resolution of the resulting

image and the resilience of the x-ray tube. A smaller focal point results in higher

resolution; a larger focal point allows the use of higher energies and therefore the

imaging of thicker or denser objects [13, p. 28].

If high energy x-rays are used, or if the object is dense, a higher percentage of

scattered x-rays emerge from the patient’s body. Since these scattered rays lower

the contrast of radiographic images, anti-scatter grids can be used to filter the

scattered rays before they hit the detector. These anti-scatter grids are used for

objects which are thicker than 11 cm [21, p. 36].

If the detector is placed inside the x-ray table, the system's automatic exposure

control (AEC) can be used to optimize the exposure of the acquired image.

Utilizing an ionization chamber, the AEC measures the amount of radiation leaving

the patient and stops the exposure as soon as enough radiation for adequate

exposure is detected [13, p. 556].
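The AEC principle can be sketched as a simple integration loop that stops the exposure once a target dose behind the patient is reached. All quantities below are arbitrary illustration units, not clinical values:

```python
def aec_exposure_time(dose_rate_per_ms: float, threshold: float) -> int:
    """Simplified AEC model: integrate the dose measured behind the
    patient and stop the exposure once the target is reached.
    Returns the exposure time in milliseconds (illustration only)."""
    accumulated, t_ms = 0.0, 0
    while accumulated < threshold:
        accumulated += dose_rate_per_ms
        t_ms += 1
    return t_ms

# A denser patient lets less radiation through per millisecond,
# so the AEC keeps the tube switched on for longer:
print(aec_exposure_time(0.5, 10) < aec_exposure_time(0.2, 10))  # True
```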

The Focus Detector Distance (FDD) is a crucial value which influences the

projection properties and requires adaptation of exposure values if it is changed from

the norm. It is defined as the distance in a straight line between the x-ray tube’s

focal point and the detector and is given in centimeters. For most radiographic

positions, the FDD is between 105 and 110 cm [13, pp. 22–24].
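The FDD also determines the geometric magnification of the projection. As a minimal sketch of the standard point-source geometry (not a formula stated in the cited textbook passage), the magnification is the FDD divided by the focus-to-object distance:

```python
def magnification(fdd_cm: float, object_detector_cm: float) -> float:
    """Geometric magnification of an object projected by a point
    source: M = FDD / (FDD - object-to-detector distance)."""
    focus_object_cm = fdd_cm - object_detector_cm
    if focus_object_cm <= 0:
        raise ValueError("object must lie between focus and detector")
    return fdd_cm / focus_object_cm

# At a typical FDD of 110 cm, an object 10 cm above the detector
# is magnified by 10 %:
print(round(magnification(110, 10), 2))  # 1.1
```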

The voltage in kilovolts (kV) is the first exposure value; it determines the energy, or

quality, of the produced x-rays. The higher the voltage, the higher the energy of the

x-rays produced. Typical voltage values in radiographic imaging are between 40 and

120 kV. Thicker and denser objects require higher voltage.

The tube current-time product, given in milliampere-seconds (mAs), is

the second exposure value. The tube current determines the quantity of

x-rays produced. Thicker and denser objects require a higher tube current. If an

AEC is used, the current-time product is regulated automatically [22, p. 152].
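Because x-ray intensity falls with the square of the distance, a changed FDD requires the mAs to be rescaled to keep the detector exposure constant. A sketch of this standard exposure maintenance rule, with illustrative values:

```python
def adjusted_mas(mas: float, fdd_old_cm: float, fdd_new_cm: float) -> float:
    """Exposure maintenance: x-ray intensity obeys the inverse-square
    law, so the mAs must scale with (new FDD / old FDD)^2 to keep the
    detector exposure constant."""
    return mas * (fdd_new_cm / fdd_old_cm) ** 2

# Moving the tube from 105 cm to 150 cm roughly doubles the required mAs:
print(round(adjusted_mas(2.0, 105, 150), 2))  # 4.08
```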

The following Table 1 shows how the patient position, the central beam's position

and the technical properties are typically conveyed [13, p. 84].

Table 1: Clinical and technical examination parameters for patient positioning and

equipment adjustment. Example parameters for a radio-ulnar os scaphoideum

radiography [13, p. 84].

Patient Position: Place the patient's wrist flat in the middle of the detector and raise the ulnar side of the patient's arm and hand by 45°.

Central Beam: Place the central beam in the middle between the thumb's base joint and the radius and shift it 1 finger width to median.

Detector Size: 18/24 or 24/30 cm

AEC: Disabled

FDD: 105 cm

Focus Size: Small

Voltage: 50–60 kV

Tube Current: 1.3–2 mAs
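In software, a parameter set like the one in Table 1 could be represented as a simple record. The following structure and its field names are hypothetical, intended only to illustrate how such positioning protocols might be encoded for a training application; they are not part of ARTUR:

```python
from dataclasses import dataclass

@dataclass
class PositioningProtocol:
    """Hypothetical container mirroring the layout of Table 1."""
    name: str
    patient_position: str
    central_beam: str
    detector_sizes_cm: tuple
    aec_enabled: bool
    fdd_cm: int
    focus_size: str
    voltage_kv: tuple        # (min, max)
    tube_current_mas: tuple  # (min, max)

scaphoid = PositioningProtocol(
    name="Radio-ulnar os scaphoideum",
    patient_position="Wrist flat in the middle of the detector, "
                     "ulnar side of arm and hand raised by 45 degrees",
    central_beam="Middle between thumb's base joint and radius, "
                 "shifted 1 finger width to median",
    detector_sizes_cm=("18/24", "24/30"),
    aec_enabled=False,
    fdd_cm=105,
    focus_size="small",
    voltage_kv=(50, 60),
    tube_current_mas=(1.3, 2.0),
)
print(scaphoid.fdd_cm)  # 105
```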

Any learning material for radiographic positioning, whether conventional media like

textbooks or e-learning resources, needs to communicate instructions in this terminology

to students. Possibilities on how to communicate these parameters with the help

of AR will be explored in chapters 2.4 and 3.

2.2 Augmented Reality

Augmented reality describes different display technologies which can overlay or

combine alphanumeric, symbolic or graphical information with the real-world view

of the user. Mostly AR is associated with visual augmentation, which means that

computer graphics are combined with real world images. From heads up displays

in cars, to mobile games, the use of AR already arrived in the general public.

Although the spread of smartphones and tablets has increased the interest of

consumers and industry in AR, it has not yet reached its full potential, which is

likely to grow with increasing mobile computing power [23].

Much attention, especially for professional applications, is geared towards

wearable AR eyewear with optical see-through displays. These head-mounted

devices use depth sensors to scan and model the user's environment

and register computer graphics into the real-world space. Such devices are

available from a variety of manufacturers, the most prominent ones being the

HoloLens line by Microsoft. Other examples are the Magic Leap 1 by Magic Leap

or the discontinued Meta headsets by Metavision. Both the HoloLens 1 and 2, as

well as the Magic Leap, are spatial computing units which perform all computing

operations on the device and therefore do not need to be tethered to a personal

computer (PC) or another external computing device [24, pp. 10–11].

Since ARTUR was developed for the Microsoft HoloLens system, the next

subchapter deals with the functional principle of the Microsoft HoloLens device.

2.2.1 Microsoft HoloLens

The Microsoft HoloLens is a fully self-contained holographic computer, running

Microsoft Windows 10. Its design combines machine vision with pico projection

displays. Although Microsoft did not release specifics on how the projection is

accomplished in detail, it is known that the HoloLens uses additive projection to

display holograms. The HoloLens 1 features a 32-bit 1 GHz Intel Atom processor

with 4 logical CPU cores, 1 GB of DDR3 memory and 64 GB of storage. A custom

processor, called the holographic processing unit (HPU), is responsible for spatial

and location-based processing, which allows reliable interaction between

holograms and the real world [25].

Spatial mapping provides detailed information about real-world surfaces in the

vicinity of the user wearing the HoloLens. In order to perform consistent

spatial mapping of the real-world environment, the HoloLens is equipped with an

array of sensors. An inertial measurement unit, six cameras, four microphones and

an ambient light sensor constantly scan the environment to keep holograms

steady. Spatial mapping also allows holograms to physically interact with real-

world objects; for example, users can place holograms on the surface of a table

[26].

Users can interact with the HoloLens 1 in a few different ways. The device reacts

to certain hand gestures, the user’s gaze or voice input:

• Gaze

Gaze is the first form of user input which works like a cursor on a PC.

Instead of manipulating a mouse, the cursor of the HoloLens is manipulated

through the user’s head movements. If the user wants to select an item,

they must look at that item.

• Bloom

The Bloom hand gesture opens the start menu of the HoloLens. It is

performed by bringing all the fingers of one hand together and then opening the

hand, while holding the hand in front of the user. An illustration on how the

Bloom gesture is performed is shown in Figure 3.

Figure 3: Bloom gesture to open the main menu.

• Air Tap

To select an item, the user must gaze at that item and perform the Air Tap

gesture. To Air Tap, the index finger of one hand must point up, then

be lowered and raised again. The Air Tap gesture is illustrated in Figure 4.

Figure 4: Air Tap gesture to select items.

• Voice

Alternatively, users can gaze at an object and say “select” to select an item

without the need to use Air Tap [27].
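Under the hood, the gaze cursor amounts to intersecting a ray from the user's head pose with a surface known from spatial mapping. The following is a plain vector-math sketch of that idea, not the HoloLens API, which performs this internally:

```python
def gaze_hit_point(head_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the user's gaze ray with a plane (e.g. a wall found
    by spatial mapping) to find where the cursor should be rendered.
    Pure ray-plane intersection; illustration only."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the plane
    t = dot([p - h for p, h in zip(plane_point, head_pos)], plane_normal) / denom
    if t < 0:
        return None  # the plane lies behind the user
    return tuple(h + t * d for h, d in zip(head_pos, gaze_dir))

# Looking straight ahead (+z) at a wall 2 m away, head at 1.7 m height:
print(gaze_hit_point((0, 1.7, 0), (0, 0, 1), (0, 0, 2), (0, 0, -1)))  # (0.0, 1.7, 2.0)
```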

In November 2019, Microsoft released the HoloLens 2 as a direct successor to the

HoloLens 1 device. While working on similar principles as the HoloLens 1, the new

device has received a substantial boost in processing power, a greater field of view

(FOV) and additional interaction possibilities, which should allow for a more immersive

AR experience [28]. Unfortunately, at the time of writing the HoloLens 2 was not

yet available in Austria.

2.3 3D Scanning

3D scanners are devices which analyse real-world objects in order to create digital 3D models of them. Depending on the area of application, a variety of technologies can be used. Fundamentally, 3D scanning technologies can be divided into two categories: contact and non-contact scanners. Contact scanners touch an object to measure its dimensions; they have limited range and are suited to smaller objects, but can take precise measurements of complicated shapes. Non-contact scanners use either lasers or light-emitting diodes (LED) to capture the object. They allow larger objects to be scanned but can be less precise, especially if the object moves or the lighting conditions are not ideal [29, pp. 20–21].


The 3D scans used in ARTUR were acquired using a non-contact Artec Eva

structured light 3D scanner [30]. Therefore, the principle of structured light

scanners will be described in the following subchapter.

2.3.1 Structured Light Scanners

Structured light scanners capture sequences of images with various patterns of visible light projected onto the scanned object's surface. This creates a full 3D point cloud of the object's geometry. For accurate results, it is crucial that the scanned object remains stationary during the scan process.

Structured light scanners are available as stationary or handheld devices; they typically combine scanning hardware with scanning software.

• Hardware

Structured light scanners use a projection light source, usually a blue or white LED. The light source projects multiple parallel light beams onto the scanned object. A camera then captures images of the different light patterns reflected by the object.

• Software

The software operates the scanner, processes the collected data and displays the digital model of the scanned object. Typically, the software offers post-processing options to edit the captured models.

The structured light 3D scanning technique has both advantages and disadvantages. Among the advantages, these scanners can generate high-resolution data with a great level of detail, so even small features of objects can be captured. The range of raw data quality of currently available handheld scanners is illustrated in Table 2. Another advantage is that many structured light 3D scanners can capture the object's appearance and map the texture accurately onto the model.

The major disadvantages are the scanners' range and speed. Often multiple scans from multiple angles are required, which takes time. Depending on the object's surface, structured light scanners can also have difficulty capturing the object; transparent or very reflective objects cannot be captured by the device's cameras [31].
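The depth measurement behind structured light scanning rests on triangulation: because the pattern is known, the scanner can identify which projected stripe illuminates a given camera pixel, and the depth of the surface point follows from the geometry of the two rays. The sketch below uses a deliberately simplified 2D model (camera at the origin looking along z, projector offset by a baseline along x); it is an illustration of the principle, not the Artec Eva's actual calibration model.

```python
import math

def triangulate_depth(baseline_m, cam_angle, proj_angle):
    """Depth of a surface point from structured-light triangulation.

    Simplified 2D geometry: the camera sits at the origin looking along z,
    the projector is offset by `baseline_m` along x. `cam_angle` is the
    angle of the camera ray and `proj_angle` the angle of the projected
    stripe, both measured from the optical (z) axis, in radians.
    From the camera: x = z * tan(cam_angle); from the projector:
    x = baseline - z * tan(proj_angle). Solving for z:
    """
    return baseline_m / (math.tan(cam_angle) + math.tan(proj_angle))

# A 10 cm baseline and two 7,1° ray angles place the point at roughly
# 0,4 m, i.e. within the Artec Eva's working distance.
z = triangulate_depth(0.1, math.radians(7.1), math.radians(7.1))
print(round(z, 3))
```

A real scanner repeats this computation for every decoded pixel-stripe correspondence, yielding the dense point cloud described above.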


Table 2: Range of raw data quality of handheld scanners. Displaying the range of 3D

surface sampling density, 3D point accuracy, the working distance and typical data

acquisition speeds for current handheld 3D scanners [31].

3D Surface Sampling Density: 0,1 – 0,5 mm
3D Point Accuracy: 0,05 – 0,1 mm
Working Distance: 0,2 – 1,2 m
Data Acquisition Speed: 1 – 4 Million Points / Second

With a 3D point accuracy of 0,1 mm, a 3D surface sampling density of 0,5 mm, a working distance of 0,4 – 1 m and a data acquisition speed of 2 million points per second, the Artec Eva is well suited to capture 3D models of human body parts in radiographic positions in sufficient quality for an AR application [30].

Due to their flexibility and ability to capture complex three-dimensional scenes,

handheld structured light 3D scanners are widely used in a multitude of

applications, such as industrial design, arts and healthcare [31, pp. 51–53].

2.4 State of the Art in Simulation-Based Training in Radiography Education

Training of radiographic positioning is a crucial part of the clinical education of radiography students. Even though contact with real-life patients provides students with optimal conditions to develop their skills, clinical environments cannot offer students safe learning surroundings. In addition, patients are entitled to receive the best possible and safest treatment available, which makes the use of simulated learning environments in student education essential [32].

Although clinical education, where students perform radiographic examinations on patients under the supervision of registered radiographers in a clinical setting, is recognized as an important core of radiography education [2], [33], [34], clinical placements are not universally available. Students often find the transition from academic studies to clinical practice challenging, which hinders the formation of a clear link between theory and practice [35], [36], [37].

As a possible factor for difficulties in clinical education, Botwe et al. [38] researched

the impact of theory-practice gaps on radiography students undergoing clinical

placements. A theory-practice gap is described as the discrepancy between


content theoretically taught in a classroom and the actual clinical practice. The

quality of clinical training depends on a multitude of factors, amongst others on the

supervisor’s level of capability and experience, the workload at the teaching

hospital and the relationship between the supervisor and the trainee.

Inexperienced or incompetent supervisors, high workloads and poor relationships

between students and supervisors are causes for increased theory-practice gaps.

The study investigated issues caused by theory-practice gaps in the most common radiographic procedure, chest radiography, among 26 third- and fourth-year radiography students. After completing academic studies and clinical practice for chest radiography, the students took part in a quantitative study utilizing a descriptive survey. The answers to a questionnaire containing closed- and open-ended questions were categorized to enable quantitative analysis of the results. The findings were described using descriptive statistics.

Figure 5: The impact of theory-practice gaps on students [38].

The study’s results on the negative influences of theory-practice gaps on students,

as shown in Figure 5, are consistent with those in other literature [39], [40], [41].

They identify confusion, poor performance in clinical and practical exams, the

inability to practice the theory and a loss of interest in the subject as major negative

impacts of theory-practice gaps. To close the theory-practice gap, the study suggests suitable preparation of students for clinical placements, periodic training for supervising radiographers on the students' needs, and effective communication between the educational and imaging departments [38].


To adequately prepare students for their clinical education, as well as to ensure

patient safety, simulation-based trainings in a laboratory setting are commonly

used in radiographic degree programmes worldwide [34], [35], [42]. Simulation-based trainings help students develop clinical skills in a safe and stress-free environment, with fewer time constraints and less perceived pressure from observation [14]. Simulation-based

training tries to emulate real patient interaction, anatomic regions or clinical tasks.

The term simulation can be applied to a variety of methods, like role-play, virtual systems, simulated patients or simulated environments. Based on the degree of realism a simulation can recreate, simulations are divided into low-, intermediate- and high-fidelity simulations. Low-fidelity simulations offer limited realism and interactivity; they commonly include role-play training of simple clinical scenarios or replications of patients in the form of anatomic phantoms. Intermediate-fidelity

simulations are more realistic and include the use of manikins with simulated

heartbeat or breath sounds or mock radiographic equipment to simulate more

complex situations. High-fidelity simulations feature a realistic experience, for

example life-size manikins which mimic human physiological responses and react

to the trainee’s actions. A widely adopted approach in radiographic education is

the utilization of low-fidelity role-play scenarios, where patient positioning is trained

with test patients (actors or students) on real x-ray equipment. And the use of

manikins and anatomical body parts to practice the fundamentals of radiographic

projection [34], [35].

Kong et al. [34] researched the impact of low-fidelity simulation-based training on the development of clinical knowledge and confidence in first-year radiography students. To measure the students' knowledge gain after completing role-play training and x-ray phantom imaging on a life-size manikin, pre- and post-simulation tests focusing on radiographic examinations of the abdomen, pelvis and hip joint were conducted. Both tests contained eight questions regarding the execution of the radiographic examinations. In the

role-play training scenario, students acted as radiographers or patients to simulate

a whole radiographic procedure. The radiographer received a request form

containing mock patient data and clinical notes and had to carry out the procedure

accordingly. The radiographer was required to greet the patient and give him or

her instructions on the forthcoming examination. After that, the patient had to be

positioned appropriately, the body had to be palpated for superficial landmarks to

position the central beam and correct exposure values had to be chosen. For

radiation protection reasons no actual radiographic images of the test patients

were acquired. To simulate the resulting radiographic images, a life-size phantom

was positioned in the same way as the test patient and radiographic images of the


phantom were taken. This allowed students to analyse the radiographic projections

on resulting images without exposing their peers to harmful radiation.

Figure 6: Percentage of correct responses pre- and post-test for the eight question topics: anatomy, x-ray tube alignment, radiation protection, cassette placement, clinical indications, patient positioning, exposure factors (kV) and exposure factors (mAs) [34].

All 55 participants of the study completed the pre- and post-simulation tests. As

illustrated in Figure 6, all questions on the post-test showed an increase of correct

responses. Correct answers for questions four, five, six and eight increased

significantly. With a mean total score of 16,01 ± 0,31 points on the post-test, a significant increase over the pre-test mean of 13,52 ± 0,316 points was observed (p < 0,001). In accordance with previous relevant studies, the authors

concluded that low-fidelity simulations can significantly increase students’ clinical

knowledge of radiographic examinations and radiographic projection [34].
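Pre/post comparisons of this kind are commonly evaluated with a paired t-test on the score differences. The sketch below illustrates the computation using only the Python standard library; the scores are invented for illustration and are not the study's raw data.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """t-statistic for paired samples, computed on (post - pre) differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Illustrative scores for six students (not the study's data):
pre = [13, 14, 12, 15, 13, 14]
post = [16, 16, 15, 17, 16, 16]
t = paired_t(pre, post)
print(round(t, 2))  # 11.18
```

With n − 1 degrees of freedom, a t-statistic this large corresponds to a p-value well below 0,001, which is the kind of result reported by Kong et al.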

While low-fidelity role-play exercises and inanimate radiographic phantoms, which lack physiological characteristics, can still support understanding of the principles of radiographic projection, a certain degree of realism is needed to keep students from learning skills without considering the clinical context in which those skills are applied [34].

The immersion and realism provided by high-fidelity simulations are essential to situated learning and help transfer knowledge and skills into real-world scenarios [43]. A situated learning approach is based on the theory that learning is embedded in activity and context [44]. To measure the impact of realism in simulations in the education of radiographers, Shiner et al. [36] recreated second- and third-degree burns by applying moulage to a test patient. Moulage is the



application of make-up to create mock injuries in simulation scenarios, which increases the participants' immersion in the experience. To further increase the realism of

the simulation, the test patient was situated in a hospital bed at the teaching

hospital and medical equipment was placed around the patient. 28 third-year

radiographer students were invited to partake in the study and observe the burns

patient, who was not communicative and visibly in pain. The students did not know that the patient was not real but in fact one of the researchers in moulage. The second researcher initiated a discussion with the students, exploring the complex care needs of a burns patient, imaging requirements, infection control procedures, communication necessities and further systemic pathologies experienced in the simulation scenario. After the exercise was finished, the simulation was disclosed to the students. The researchers collected data from all 28 participants with a

mixed methods questionnaire, combining quantitative and qualitative approaches.

Video footage analysis of the simulation and focus groups were used to evaluate

the exercise and to gain insight into the students' impressions. To assess the students' knowledge gain regarding the challenges of radiographic imaging of burn patients through the simulation, pre- and post-simulation questionnaires containing four questions were completed by all 28 participants.

Figure 7: Pre-simulation test results [36].

The four questions in the pre- and post-simulation questionnaires were answered on a Likert scale. As illustrated in Figure 7 and Figure 8, a positive shift in the answers to the questionnaire can be observed. In particular, questions regarding communication with burn victims and the understanding of the radiographer's role in the burns unit were answered more positively after the simulation.


This correlates with the qualitative data collected by the researchers in the post-simulation focus group interviews.

Figure 8: Post-simulation test results [36].

In addition, the qualitative data of the study agrees with other literature stating that theory-practice transfer works best when the exercise is closely linked to real-world scenarios, which supports the theory that realistic and immersive training environments promote the transfer of knowledge into real situations [36].

While hands-on simulation training, like role-play scenarios or phantom studies, is an effective tool to transfer theoretically acquired knowledge into practice, delivering it requires coordination of resources such as trainers and laboratories. Those laboratories need to be equipped with radiographic equipment, all of which adds to the expenses necessary to realize hands-on simulation training. Virtual simulation enables students to review the content at any time convenient to them and without being bound to the laboratory, thereby enabling cost- and time-efficient exercise. Like hands-on simulation, virtual training environments increase student engagement and help transfer theoretical skills into a clinical setting [4].

Shanahan [33] researched the perspective of radiography students on using a

virtual radiography simulation. The pilot study introduced a virtual radiography

simulation, in the form of the computer-based software Projection-VR, now known

as Shaderware. The simulation acted as support for the development of pre-clinical

technical skills in the second year of an undergraduate radiography programme.

To evaluate the students’ perceived confidence and skill development and possible


usability issues the students might have encountered, the study utilized a mixed questionnaire containing Likert-scale and open-ended questions. The survey was completed by 82 of 86 invited students.

Table 3: Student perception on the usability of the Shaderware software (n = 82) [33].

Question | Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
I could access Shaderware outside my scheduled lab | 28 | 40 | 12 | 1 | 1
I could control the equipment as I needed when using Shaderware | 20 | 53 | 8 | 1 | 0
I liked using Shaderware | 26 | 39 | 11 | 6 | 0
Shaderware is easy to use | 13 | 55 | 9 | 5 | 0
Technical problems made using Shaderware difficult | 3 | 14 | 24 | 36 | 5
The weekly Shaderware laboratory worksheets were easy to follow | 44 | 36 | 1 | 0 | 1

As illustrated in Table 3, the overall perception of the simulation's accessibility and the students' ability to use the software as needed was positive. Since Shaderware is computer-based software controlled via keyboard and mouse, users with little experience in using computer software could encounter more difficulties; however, the participating students' self-evaluated confidence in using computers was rather high, with 92% of students describing themselves as confident or moderately confident in using computer-based simulations. In

addition, the evaluation of the questionnaire suggests that the use of virtual

simulation offers students the opportunity to develop clinical skills and build

confidence through the possibility of repeating exercises as necessary. Overall,

student feedback in the study suggests that virtual simulation environments add

value to their learning experience. Despite virtual simulation having beneficial

effects on the development of students’ skills, the introduction of virtual simulation

technology into an educational programme can have its issues. Hardware must be

compatible with the planned system or must be purchased especially for use with

the simulation environment. Educators might also underestimate the level of


guidance and training needed by the users before the virtual simulation can be

utilized efficiently [33].
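Likert counts such as those in Table 3 are typically condensed into agreement percentages by summing the two agreement columns and dividing by the number of respondents. A small sketch (the function name is illustrative):

```python
def agreement_pct(counts):
    """Share of respondents agreeing, in percent.

    counts: Likert response counts ordered
    [strongly agree, agree, neither, disagree, strongly disagree].
    """
    return 100 * (counts[0] + counts[1]) / sum(counts)

# "I could access Shaderware outside my scheduled lab" (Table 3):
print(round(agreement_pct([28, 40, 12, 1, 1]), 1))  # 82.9
```

Applied to each row of Table 3, this reproduces the broadly positive picture described above.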

Being computer-based and controlled through keyboard and mouse input, Shaderware offers a relatively low-fidelity training environment. While, as in traditional real-life simulation, patients can be positioned and radiographic images can be acquired, the preparation of an examination or patient interaction cannot be simulated. To research the impact of a more immersive, higher-fidelity simulation and compare it to computer-based software, Sapkaroski et al. [45] developed a virtual-reality-based software called CETSOL VR Clinic and integrated it into the curriculum of a radiography programme. Student perception of the virtual reality application was acquired using a Likert-scale questionnaire [33], [45].

CETSOL VR Clinic is an immersive virtual reality radiography simulation. With the software, the 79 students participating in the study were able to complete a variety of radiographic examinations in a virtual reality environment. It featured dynamic patient interaction with voice recognition in simulated clinical scenarios. The user's progress was logged online, which enabled users and trainers to remotely track the training progress. In their study, Sapkaroski et al. [45] had two cohorts of

students complete virtual radiographic training. The first group (n=43) used

Shaderware, a traditional computer-based simulation. The second cohort (n=49)

utilized CETSOL VR Clinic in their virtual training sessions. Both groups answered

the same Likert-scale questionnaire with questions regarding their perception of the impact of virtual simulation on their radiographic education. Coinciding with other studies, the results of the questionnaires indicate that the use of virtual simulation aids students in their training of radiographic procedures [46], [47]. Comparing the results of the two cohorts, the higher immersion through the dynamic patient interaction and communication of CETSOL VR Clinic improved the students' perceived knowledge of equipment and patient positioning at a higher rate than the computer-based simulation [45].

Virtual simulation is used in radiological disciplines other than plain radiography. Radiotherapy training with the Virtual Environment for Radiotherapy Training (VERT) has been shown to reduce complications in clinical placements and to potentially reduce the burden on supervisors and students in clinical training [48]. Utilizing a service provided by NETRAD and the software LabShare, real CT scanners can be operated by students remotely. This enables a time- and space-independent, high-fidelity simulation with real equipment. A study by Lee et al. [49] has shown that such a simulation has a similar impact on the learning of core knowledge as hands-on simulation on a local machine [49].


At the time of writing, AR simulation for the professional training of radiographers was not yet established. However, in a study, Balian et al. [50] reviewed the feasibility of an AR simulation in cardio-pulmonary resuscitation (CPR) training. 51 healthcare professionals who had to recertify their CPR training were asked to participate in the AR study. While they performed chest compressions on a manikin, AR holographic images of a cardiovascular system were projected in front of the user. The hologram provided visual feedback according to the user's performance, with blood flow to vital organs either increasing or decreasing depending on the quality and rate of chest compressions on the manikin. If the user deviated from the recommended compression rate, additional audible feedback was given. The user's performance was displayed in the form of a CPR quality score after the session. In addition to performance metrics, the participants' satisfaction with the training experience was surveyed using a four-point Likert scale questionnaire. The

sensor data showed that most participants used adequate chest compression technique, which indicates that AR training environments support effective execution of CPR. These findings are consistent with other research concerning the value of audio-visual feedback in training situations [51], [52]. In addition, the participants' evaluation of the AR experience was positive. As illustrated in Figure 9, the users found the audio-visual feedback useful in assisting effective performance of chest compressions during CPR training.

Figure 9: User evaluation of the AR CPR system [50].

Over 90% of participants would like to use the AR CPR simulation in future

trainings. With most users feeling comfortable using the AR system, the study

proposes that AR training environments are effective educational tools [50].


Further areas of application of AR in healthcare include anatomy teaching, surgical training, patient education and implant design. An example is RadHa by Sira Medical, a radiology software with holographic augmentation, which allows radiographic datasets to be projected as three-dimensional holograms onto a real-world background. This visualisation helps surgeons understand complicated anatomy before a surgery or create roadmaps of anatomical structures during surgeries, allowing for quicker, safer and more efficient procedures. The technology can also be used to educate patients about their surgeries or other planned measures [53].

Summary: While clinical practice offers students a great overall learning

environment, clinical placements do not provide safe learning surroundings for

students and patients [32]. Students often find it difficult to transfer theoretically acquired knowledge into practice, which leads to confusion, poor performance in clinical tasks, inability to practice the theory and ultimately loss of interest in the subject [38]. To prepare students for their clinical practice, simulation-based

laboratory trainings are commonly used [35]. Simulations enable students to

develop clinical skills in a safe and stress-free environment [14]. Various studies

have shown that role-play simulation in a laboratory setting can significantly

increase students’ clinical knowledge of radiographic examinations and

radiographic projection [34]. Despite the knowledge gain provided by low-fidelity

role-play trainings, a certain degree of immersion can only be accomplished by

simulations with higher fidelity [36]. While effective learning tools, practical hands-on trainings provided in a laboratory setting by a trainer are cost- and time-intensive [4]. As a time- and space-independent alternative for high-fidelity training, virtual simulation environments provide similar benefits to hands-on simulation in a

laboratory [49]. Virtual simulation is already in use in various disciplines of

radiographic education [48]. At the time of writing, no AR-enabled simulation environments for the training of radiographers were available. However, the feasibility of AR simulation in healthcare education has been demonstrated by a study testing an AR CPR training system [50].


3 Design and Development of ARTUR

To enable laboratory- and trainer-independent, AR-assisted training of patient positioning for plain radiography, ARTUR was designed to provide users with life-size reference holograms of radiographic positions. In addition to positioning guidance, users should also be instructed about further examination parameters, like the central x-ray beam position or the collimation size. To realize those ambitions, ARTUR was designed and developed with a user centred design approach.

Typically, user centred design follows a design and development cycle, with a focus on understanding who the product's users are and what they want in a product. User centred design can be applied in many forms, as the term does not specify exact methods for each phase of the design cycle. It is driven by user centred evaluation of the whole user experience, which involves users throughout the entire design and development process.

Figure 10: Illustration of a typical user centred design cycle. After identifying the user needs and the context of use, the user requirements are specified. Following development and design of the product, users evaluate the finished product. The cycle can be applied iteratively. Image courtesy of [54].


As demonstrated in Figure 10, user centred design involves the following phases

in a design process:

• Identify the needs of the users

What problem should be solved?

• Specify the context of use

Who are the users of the product, what will they use it for and what are the

conditions the product will be used in?

• Specify user requirements

What are the goals of the users while using the product?

• Create design solutions

The actual development phase of the product, which can be done in phases

from a concept to a complete design.

• Evaluate the design

Testing of the product, ideally by actual users.

Depending on the approach and the developer’s needs, a user centred design

process is composed of several methods and tasks. User research to gather

requirements can be accomplished through focus group interviews, individual

interviews, personas, online surveys, task analysis and many other approaches.

Like the specification of user requirements, the usability evaluation can also

employ various methods, including the System Usability Scale (SUS), expert reviews,

focus groups, remote testing and more. The choice of methods depends on

multiple factors, like the requirements of the developed product, the size of the

development team or the timeline for development [54].
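One of the evaluation methods named above, the System Usability Scale, reduces a ten-item Likert questionnaire to a single 0–100 score via a fixed formula: odd-numbered (positively worded) items contribute their rating minus 1, even-numbered (negatively worded) items contribute 5 minus their rating, and the sum is scaled by 2,5. A minimal sketch of that standard computation:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribute rating - 1),
    even-numbered items negatively worded (contribute 5 - rating);
    the summed contributions are scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses between 1 and 5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i = 0 is item 1
    return total * 2.5

# Neutral answers across the board yield the scale's midpoint:
print(sus_score([3] * 10))  # 50.0
```

Because the scoring is fixed, SUS results are comparable across different products and studies, which is part of why the method is so widely used in usability evaluation.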

For the user centred design of ARTUR, the users were specified as radiographers

who want to practice radiographic positioning. To specify the user requirements, a

focus group interview was conducted. With the insight of the focus group interview,

a task analysis helped to further specify the user requirements and set guidelines,

in the form of functional requirements for the development and evaluation process.

A working prototype of ARTUR was developed for the Microsoft HoloLens with the

Unity 3D development platform. The design of the prototype was then evaluated by three test users: two radiographers and one expert on AR and VR applications. Their impressions of the prototype's functionality and usability were gathered in individual semi-structured interviews.

The qualitative interviews were transcribed and analysed for statements relevant

to the previously set functional requirements of the application. A detailed

description of the interview setup and the analysis of the interviews is given in

Chapter 4.


3.1 User Needs & Context of Use

Practical training in clinical or laboratory settings represents a crucial part of the education of radiographers. Since hands-on training on real radiographic equipment in the presence of a trainer and clinical placements in a hospital demand a lot of resources, alternatives are sought after. Virtual training applications have been shown to provide similar effects to hands-on training while conserving monetary and human resources. Thus, the intended target audience for ARTUR are first-year

radiography students who are preparing for their first clinical practice. A second

group of users are teaching radiographers, lecturers or trainers, who need to be

able to support students in using the virtual simulation. While not being targeted users per se, seasoned radiographers who want to relearn certain patient positions could also benefit from an AR-enabled tool. As its main task, the application should guide inexperienced radiography students through the patient positioning process for plain radiographic procedures, ideally without the need for a trainer to be present.

To assist with the design process and to identify the user requirements, qualitative

data was gathered through a focus group interview. Focus group interviews are

typically held in small groups, where all participants have certain characteristics in

common. The data, gathered in a focused discussion helps with understanding the

subject of interest [55, p. 6].

The primary goal of the focus group interview was to gather user-task requirements

for the AR training environment. In addition, the focus group interview provided

proficient expert opinions on the following core topics:

• How can AR assist in the training of radiographic positioning?

• How can an AR training environment be realized?

3.1.1 Focus Group Setup

Participants

A panel of four radiographers who were lecturing in a radiography bachelor's degree programme was chosen for the focus group. As experts on radiographic

imaging and with an insight into educational requirements, the focus group’s

statements helped to establish a mutual understanding of the task requirements

for the AR simulation environment. A detailed list of all participants and their characteristics can be seen in Table 4.


Design & Procedure

Before the interview started, all participants signed a form of consent, which can

be found in Appendix A. As an introduction into the topic and the aim of this master

thesis, the principles of AR, the Microsoft HoloLens and their capabilities were

explained to the participants of the focus group. To give the group members a

better understanding of the Microsoft HoloLens, a short video about the HoloLens

was shown to the participants. Following the introduction, an open discussion was

held. To keep the discussed topics relevant, a guideline containing ten open ended

questions was used. The questioning guideline can be viewed in Appendix B.

Including the introduction, the meeting had a duration of approximately one hour.

The interview was conducted in February of 2020.

Apparatus & Materials

For further analysis and to document the interview, audio of the interview was

recorded and notes were taken by the moderator. Following the interview, an

abridged English transcript of the audio recording was made, which can be found

in Appendix C. Supported by the moderator’s notes, the transcript was analysed

to set user-task requirements for the AR simulation.

Table 4: Participants of the focus group interview.

Participant               P1                   P2                 P3           P4
Gender                    Male                 Male               Male         Male
Age Group                 30-35                50-55              30-35        30-35
Years as a Radiographer   5                    8                  5            5
Years in Education        3                    13                 3            4
Highest Degree            MA                   PhD                MSc          MSc
Field of Expertise        Computed Tomography  Rapid Prototyping  Angiography  Radiotherapy

3.1.2 Focus Group Interview

To introduce all participants and to get an insight into their professional

background, the first question addressed their clinical and educational experience.

Participant one (P1) had five years of clinical and three years of educational

experience. Participant two (P2) worked eight years as a clinical radiographer and

13 years in education. Participant three (P3) was a clinical radiographer for five

years and three years in teaching. Participant four (P4) had five years of clinical


experience and four years of educational experience. The focus group had a mean clinical experience of 5.75 ±1.5 years and a mean experience of 5.75 ±4.86 years in teaching.
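For transparency, the reported means and spreads can be reproduced with a short calculation. The sketch below assumes the ± values are sample standard deviations, with the per-participant values taken from Table 4:

```python
from statistics import mean, stdev

# Years of experience per participant (P1-P4), as listed in Table 4.
clinical = [5, 8, 5, 5]
teaching = [3, 13, 3, 4]

# stdev() computes the sample standard deviation, which matches the
# reported ±1.5 and ±4.86 values.
print(f"clinical: {mean(clinical):.2f} ±{stdev(clinical):.2f} years")
print(f"teaching: {mean(teaching):.2f} ±{stdev(teaching):.2f} years")
```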

After the introduction, the next question asked the participants to think back on how they were trained in radiographic positioning and on their impressions of that teaching method. Both P1 and P3 had theoretical input via lectures supported by literature

and PowerPoint presentations. As practical training, role-play exercises in groups of 10 to 15 students with a trainer were held in a laboratory setting. P2 also received lectures supported by literature and completed laboratory role-play

trainings and phantom studies in groups of four to five students, supported by a

supervisor. P4 also had lectures supported by literature and practical training in an

instructed laboratory setting in groups of four to five students. Only P4 had access

to a laboratory for self-study outside of set training sessions. According to all

participants, open access to laboratories for small groups of students, which enables them to practice and repeat clinical tasks on their own schedule, is an important prerequisite for optimal practical training in radiographic positioning.

Participants were then asked how, in their opinion, AR could improve the training

of radiographic positioning. P1 and P4 stated that new and innovative tools

increase students’ interest in a topic and promote their motivation to learn. P1, P2

and P4 agreed that AR learning environments enable students to study without supervision or the restrictions of a schedule, which allows them to practice under reproducible exercise conditions without perceived pressure and at their own comfortable pace. In addition, P2 and P4 emphasized the pedagogic possibilities

of tracking the student’s learning progress, providing personalized content or

testing the student’s knowledge with a virtual simulation environment.

Further, the participants were asked how they would like to receive assistance in

radiographic positioning from an AR device. Aside from visual guidance and

information conveyed via text, the focus group wished for a speech interface with voice input. The participants also

considered spoken explanations a significant addition to visual aids. P4 stated that a speech interface could enable people with

disabilities to use the application, but also specified that it should be possible to

disable the speech output if a user does not wish for audio guidance.

Since visualising radiographic positions via AR holograms is the core concept of

this thesis, the participants were asked which additional visual support, other than

the position of the body part, would be useful. P1 stated that the position of the x-ray tube and feedback on correct test patient positioning would be useful. P2


specified the added value of holograms featuring injuries or pathologies, since

those could not be trained in a laboratory environment. To keep students from focusing solely on the task of positioning, P3 mentioned that information for patient

preparation, like the removal of jewellery and clothing or necessary radiation

protection measures, should be conveyed by the application. In addition to the

patient’s position, P4 commented on the importance of additional parameters

necessary for the execution of a radiographic examination. Central beam position,

collimation size of the x-ray beam, detector size, exposure parameters and the

resulting radiographic image should be communicated to the users. P4 also

declared that, to maintain usability, the application should not be overloaded with

features.

The focus group recognised immersion as an important factor for the value of a training experience, which was also mentioned in numerous publications [36], [43], [45]. The next topic focused on the importance of visual quality or realism of the

visual positioning aid provided by the AR headset. To position the central beam

and to adapt the collimation, anatomical structures must be visible. Radiographers

use distinctive anatomic structures to orientate themselves. Therefore, P3 and P4

highlighted the importance of holograms with high visual quality. According to P3,

simplistic models could also break the immersion and lower student engagement.

In contrast, P2 noted that if holograms are too complex and have too many details, users could get distracted from the actual goals of the exercise. To further increase immersion and learning effects, however, holograms featuring injuries or patients of different ethnicities could be implemented.

To get an idea of how users imagine themselves interacting with AR holograms,

the discussion was then directed into the possibilities regarding the user input. The

participants were asked how they would like to interact with the holograms. P1, P2

and P4 agreed on hand gestures being the main form of user interaction with the

application. P1, P2 and P4 highlighted the importance of similarities to smartphone

interaction, which would lower the entry barrier for users who are inexperienced with AR. P2 mentioned that speech input for certain actions would be convenient. Regarding speech input, P3 stated that complex interaction with holograms could be difficult to realise and to use: moving or rotating holograms in small increments by voice could be more complicated than with hand gestures. P3 further added that physical movement could increase the learning effect of a training experience, to which P2 agreed.

To show users if their inputs are detected, interpreted and handled, the application

needs to give some form of feedback. The participants were asked how they

would like to receive feedback for their inputs. P4 wished for haptic feedback, for


example through a vibrating AR headset. Recognizing constant audio feedback as

irritating, P3 asked for visual feedback for user interactions. To address the issue

of too much audible feedback, P4 mentioned the possibility to add specific audio

feedback for certain interactions. P1 and P3 agreed that users should be able to

toggle audible feedback on or off.

Earlier in the interview, the focus group recognised the potential of virtual

simulation as an additional tool to laboratory exercises in radiographic education.

To explore the potential of AR in a laboratory setting, the participants were asked

how real radiographic equipment could be included into the AR training sessions.

P1 was of the opinion that access to a laboratory with radiographic equipment already enables hands-on role-play training; an AR application would not add a lot of value there and should instead be used in preparation for role-play exercises. P4 had concerns that AR holograms, a real radiography system and role-play partners would clutter the user's FOV and overwhelm them. The light beam collimator could also interfere

with the holograms and hinder visibility. Aside from the application in training

scenarios, P2 and P3 both recognized that AR could support radiographers in

positioning real patients in a clinical setting, especially in difficult or uncommon examinations.

In addition to the topics gathered according to the interview guideline, the

participants deviated from the topics and made remarks regarding the costs and

resources needed to implement an AR simulation into a bachelor degree

programme. Initial costs, expenses for maintaining the AR setup and the resources needed to train all students on the hardware were mentioned.

While not in the scope of this thesis, an interesting outlook, mentioned by P4, was

the possibility of offering a two-dimensional, browser-based version of the application, which would enable independent learning without the need for an AR headset [56].

3.1.3 Focus Group Findings

In accordance with current literature regarding the topic of simulation-based

training in healthcare education, all participants recognized the importance of

practical training in learning radiographic positioning [32], [34], [35]. They also

stated that new and innovative tools promote student engagement and motivation.

The capability of AR to provide reproducible, time and space independent training

environments was also mentioned by several participants. As further advantages

of virtual simulation, potential cost and time savings were noted.

Statements relevant to the design of ARTUR were grouped and categorized to

assist in the user task analysis. Topics identified as applicable were audio-visual


assistance, the importance of visual quality, methods to interact with holograms

and feedback on user input.

When asked about possible visual assistance by an AR application, the

participants mentioned elements which would also be present in a laboratory or clinical setting, including a hologram of the body part in question and visualisation of the central beam position, the collimation and the detector size. In addition,

visualisation of the x-ray tube position and its angle were mentioned. To utilize the

possibilities of AR over real-life simulation, the possibility to display technical

parameters and the resulting radiographic image were discussed.

In terms of usability and accessibility, a speech interface was suggested, which could accommodate different types of learners and people with different needs. As an example, displayed text could be read out by a narrator or a text-to-speech engine.

Regarding visual quality, the importance of realism was emphasized. Realistic

visuals improve immersion and learner engagement. Furthermore, the process of

radiographic positioning makes use of anatomical landmarks, which need to be

visible in the simulation. Preferably, high-fidelity 3D models true to the original body parts should be used in the application.

Hand gestures as an input method to interact with holograms were universally

requested by the focus group participants. Again, a speech interface enabling

users to interact with voice commands, was also mentioned.

Giving users feedback on their interactions is crucial for AR applications. The focus group had several suggestions of varying feasibility, including haptic feedback through a vibrating headset and audio-visual feedback.

To get an overview of the user needs, the analysed and categorised feature

requests are listed below. Depending on viability, the features were integrated into

the prototype.

Visual assistance:

• Hologram of the body part

• Central beam position

• Collimation size

• Detector size

• X-ray tube position and angle

• Exposure parameters

• The resulting radiographic image


Audio assistance:

• Text to speech output

Visual Quality:

• Realistic representation of body parts and equipment

User Input:

• Hand gestures

• Speech input

Feedback on inputs:

• Haptic feedback

• Visual feedback

• Audio feedback

3.2 Task Analysis

Task analysis is a process which helps to understand how users perform tasks and achieve their goals: what actions users take to achieve their goals, which personal, social and cultural backgrounds they have, how they are influenced by their environment, and how their knowledge and previous experience shape their workflow. This supports the identification of the tasks the application must

support and helps with the development and revision of new features. Task

analysis should be performed early in the development process, especially before

doing any design work [57].

For the design of ARTUR, a hierarchical task analysis was conducted. Hierarchical

task analysis is focused on dividing a high-level task into smaller subtasks. The

goal of this high-level task decomposition is to display the overall structure of the

main user tasks. To visualise the process of task decomposition, hierarchical

charts are helpful tools. To carry out a task analysis, several steps must be taken.

The main task must be identified and broken down into four to eight subtasks. A

layered diagram gives an overview of the subtasks, which is often easier to

interpret. The desired level of decomposition must be decided on, which

guarantees the decomposition into subtasks is consistent [58].

To simulate patient positioning with AR in accordance with user needs, ARTUR had

to meet multiple prerequisites. Users must be able to position realistic and life-size

holograms of body parts in their surroundings and orientate them accordingly. The


position of the central beam and the collimation size of the beam needed to be

displayed. Since a radiographic procedure requires more knowledge than just

patient position, beam position and size, which could all be derived from the

hologram, additional examination parameters had to be provided to the users in

some form.

For the task analysis, the main task of the users was set as the whole virtual

simulation experience.

Main Task

The main task of the users was to position an AR hologram onto a surface to act as a reference for the intended body position of the patient, to position a test patient according to the hologram, and to check the central beam position, collimation size and other examination parameters.

The main task was then decomposed into seven subtasks, each of which represents a single user interaction. The subtasks are illustrated in a hierarchical task diagram, as can be seen in Figure 11.

Decomposed Tasks

• Choose a body part.

• Position the body part on the training surface.

• Rotate the body part to comfortably position the test

patient into the hologram.

• If necessary, resize the hologram to match the size

of the test patient.

• Position the test patient as shown by the hologram.

• Display the central beam.

• Display additional examination parameters.
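As an illustration only, the decomposition above can be captured in a simple hierarchical data structure. The subtask names follow the list above; the main-task label is a paraphrase for the example:

```python
# Hierarchical task decomposition of ARTUR's main user task.
# The main-task label is a paraphrase; the seven subtasks follow the
# decomposed task list above, one entry per user interaction.
TASK_HIERARCHY = {
    "Complete an AR positioning exercise": [
        "Choose a body part",
        "Position the body part on the training surface",
        "Rotate the body part",
        "Resize the hologram to match the test patient",
        "Position the test patient as shown by the hologram",
        "Display the central beam",
        "Display additional examination parameters",
    ],
}

subtasks = TASK_HIERARCHY["Complete an AR positioning exercise"]
print(len(subtasks))  # seven subtasks, as in the hierarchical task diagram
```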

The decomposition of the main user task into smaller subtasks allowed for a better understanding of the users' goals. Based on the user and task analysis, functional requirements for the application were set, which are covered in the next chapter, 3.3.

Figure 11: Hierarchical task diagram showing the main task decomposed into the subtasks Choose Body Part, Position Body Part, Rotate Body Part, Resize Body Part, Position Test Patient, Display Central Beam and Examination Parameters.


3.3 Functional Requirements

The data collected through the focus group interview and the task analysis allowed

for the specification of seven functional key requirements for an AR simulation for

radiographic positioning. In contrast to non-functional requirements (NFR),

functional requirements (FR) describe what a system should do, while NFR describe how the system should behave. NFR have a direct architectural impact, while FR can have indirect impacts on architectural decisions [59].

With the hard- and software limitations set by the HoloLens, the choice in NFR for

ARTUR was limited. NFR were set to provide sufficient performance, usability and

accessibility to allow users with no previous AR experience to complete a training

session with ARTUR.

The main FR to complete the user tasks was the ability to display life-size holograms of body parts positioned for radiographic procedures. Naturally, users

must be able to choose between different body parts, which required a menu

system. To enable users to position their training partners according to the

hologram, the hologram needed to be interactable and allow users to choose a

hologram position relative to the real-world environment. Since patients and

training partners differ in size, the ability to resize the hologram was another

requirement for the simulation. The focus group had revealed that examination parameters, such as the central beam position, the collimation size and other technical parameters, should also be conveyed through the application. This set two more requirements: to display the central beam with the collimation size, and to display technical examination parameters in some form.

Listed below are seven key requirements, including two sub requirements for user

interaction with the holograms.

• Menu System

• Display Holograms

• Hologram interaction

o Hologram positioning

o Hologram resizing

• Display central beam and collimation size

• Display technical parameters

These functional requirements were used to guide the development process. In

addition, they also played a vital role during the evaluation of the application.


3.4 Development of ARTUR

To provide high-fidelity holograms, which promote immersion and are important for users to orientate themselves on anatomical structures, real body parts positioned for radiographic procedures were scanned with an Artec Eva structured-light 3D scanner. For the prototype of ARTUR, three body parts were scanned and processed into three 3D models to be used as AR holograms.

A structured light 3D scanner does not only capture the object of interest, but also

its surroundings. In case of the scanned body parts, the x-ray detector and the

body part were positioned on a table. Primary post-processing steps included the removal of the table surface below the detector, leaving a model consisting only of the body part and the detector. In addition, minor touch-ups such as filling holes

and gaps in the scan or smoothing edges and surfaces were performed. The

scanning procedure and all post processing steps of the raw model were done in

the Artec Studio 13 3D scanning software. The model and the texture were then

exported as a single .fbx file. The tidied-up model displayed in the editor of Artec

Studio 13 and ready for export is illustrated in Figure 12. If movement artefacts can

be avoided, multiple scans from different angles provide higher quality models and

are to be preferred.

Figure 12: Cleaned 3D model of an arm in position for an x-ray of the os scaphoideum.

The Artec Eva 3D scanner captures objects with a spatial resolution of up to 0.5

mm. In addition, the texture of the object is automatically tracked and mapped onto

the 3D model. The advantages of 3D scanning body parts in radiographic positions


include the high level of detail which enables users to orientate themselves on

anatomical landmarks and the accurate representation of the body part’s position.

A major disadvantage of 3D scanning is its susceptibility to movement artefacts, which means that for the duration of the scan, the person being scanned needs to remain as still as possible. Depending on the skill of the operator and the size of

the body part, a scanning process can take anywhere from five to twenty minutes.

In addition, the high spatial resolution of the scanner results in complex 3D models

and large file sizes. To keep a steady framerate on current AR hardware, the

models had to be simplified while keeping anatomical structures recognizable.

The post-processed and exported 3D models consisted of up to 300,000 polygons

and had a texture size of 8192x8192 pixels with a file size of 20 to 30 megabytes

(MB). Microsoft recommends that HoloLens applications run at a steady 60 frames per second (FPS), which ensures that holograms appear stable in real-world

surroundings. To keep the framerate on the limited HoloLens hardware high and

steady, all 3D models acquired for the prototype were simplified using the software

Simplygon version 8.3, with a free license for non-commercial use. Simplygon

features automatic 3D optimization and is predominantly used in the gaming

industry. Simplygon provides a feature called remeshing, which completely

rebuilds and simplifies 3D models while keeping their appearance. In contrast to a simple polygon reduction, which keeps significant vertices and the topology of a model while reducing the polygon count, remeshing can result in higher accuracy. For further performance enhancements, Simplygon offers texture and

shader optimizations. The settings used to simplify the models for the prototype of

ARTUR are displayed in Figure 13. The remeshing and material optimization resulted in models with a quarter to a fifth of the original scan's file size.

Figure 13: Remeshing and material optimization settings of the 3D scanned models in

Simplygon.

Simplygon outputs level of detail (LOD) models. The original model of a hand, labelled LOD0, featured 254,811 triangles and 128,267 vertices. After the remeshing and texture optimization, the model labelled LOD1 had 5,034 triangles


and 2,517 vertices. Including textures, which were now reduced to a resolution of

2048x2048 pixels and saved as normal maps separate from the .fbx file, the file size

of the models was reduced to about five MB. Despite this reduction in complexity

and file size and as illustrated in Figure 14, the LOD models were nearly

indistinguishable from one another.

Figure 14: Original 3D scanned model (LOD0) and simplified 3D model (LOD1) in

Simplygon.
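The scale of this optimization can be illustrated with the numbers reported above, together with the frame-time budget implied by Microsoft's 60 FPS recommendation:

```python
# LOD0 = original scan, LOD1 = remeshed model; counts as reported in the text.
lod0 = {"triangles": 254_811, "vertices": 128_267, "texture_px": 8192}
lod1 = {"triangles": 5_034, "vertices": 2_517, "texture_px": 2048}

tri_ratio = lod0["triangles"] / lod1["triangles"]             # ~50x fewer triangles
texel_ratio = (lod0["texture_px"] / lod1["texture_px"]) ** 2  # 16x fewer texels

# At the recommended 60 FPS, each frame must be rendered within ~16.7 ms.
frame_budget_ms = 1000 / 60

print(f"{tri_ratio:.1f}x fewer triangles, {texel_ratio:.0f}x fewer texels")
print(f"frame budget at 60 FPS: {frame_budget_ms:.1f} ms")
```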

With all 3D models captured and optimized, the HoloLens application could be

developed. As a development platform, Unity 3D version 2018.4 was chosen. To

develop applications for the HoloLens, Microsoft offers a Mixed Reality Toolkit

(MRTK) for Unity, of which version 2.3.0 was used. MRTK is an open-source, cross-platform development kit for mixed reality applications. The MRTK can be imported

as a custom package into any Unity 3D project.


Figure 15: Menu to add the MRTK to the Unity scene.

Software solutions for the HoloLens are built on the Universal Windows Platform.

The build platform had to be switched in the build settings of the project. After

importing the MRTK packages and as shown in Figure 15, the toolkit was added

to the Unity scene and configured for a HoloLens project. Configuration included

enabling the Unity XR framework for AR applications and importing TextMeshPro

essential resources into the project.

By default, the MRTK adds two game objects to the scene: MixedRealityToolkit

and MixedRealityPlayspace. The default game objects are illustrated in Figure 16.

Figure 16: Default game objects added by the MRTK.

• MixedRealityToolkit represents the toolkit itself; it provides the central

configuration for the entire mixed reality framework.

• MixedRealityPlayspace is responsible for managing the headset and the

input system. The main camera of the scene is created as a child of this

game object, which allows the MixedRealityPlayspace to manage the

camera in the scene.

With this basic configuration the scene was able to recognize the user’s gaze and

to register user input via hand gestures and voice commands. For further

configuration, as illustrated in Figure 17, the MRTK options can be accessed

through the inspector of the MRTK game object. Options configured for the project were the camera system, input, spatial awareness and diagnostics. The other

options were not needed for a HoloLens application.


Figure 17: MRTK configuration window.

MRTK settings are saved in profiles, which theoretically allows developers to

customize software for different headsets through the change of profiles. When

available, the application could be ported to the HoloLens 2 with an adaptation of the

profiles for all systems in use.

The first functional requirement set by the user task analysis was a menu system.

To build a menu, standard HoloLens toggle button assets included in the MRTK

package were used. These assets already included Interactable scripts to allow for

user interaction through the Air Tap gesture or voice input and give audio-visual

feedback on user interactions. Events triggered by the buttons could be set in an

event receiver of the interactable script. The menu allows users to choose a body

part, to rotate and resize the chosen hologram and to switch the light beam

collimator on or off. In case users are distracted by the menu, a toggle button was added which allows them to hide or show the menu buttons; a comparison of the expanded and reduced menu is illustrated in Figure 18. The figure also displays

the visual feedback for a toggled button, which results in changing the colour of

selected buttons to a lighter shade of blue.


Figure 18: Compact and expanded menu of ARTUR in the HoloLens emulator.

The menu was configured to follow the user’s gaze and to always face the user,

independent of their position in the room. This was realized by setting all buttons

as children of a game object and adding scripts to the object. To have the menu

face the user at any time, a Billboard script was added. Billboarding allows the

object to rotate freely in the user's environment. If necessary, the rotation can be constrained to one or two axes. In the case of ARTUR, the rotation was constrained to the X and Y axes. In this configuration the menu was able to adapt to changes in the user's position and height.

To follow the user’s gaze and position in the room, a Radial View and a Solver

Handler script were applied to the game object. The Solver Handler tracks the user's head position; the Radial View script changes the menu position according

to the user’s movement. The Radial View keeps the menu in the user’s FOV and

1 meter in front of the user. In combination with billboarding, the menu always stays

in the user’s FOV while simultaneously facing the user. To prevent the menu from obstructing other holograms or real-life objects, users can disable the Radial View

script with a toggle button labelled Pin Menu. If the button is toggled on, the menu

stays in place and does not follow the user. To keep the menu accessible in its

fixed position, billboarding stays enabled when the menu is pinned.
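The behaviour of these two solvers can be sketched geometrically. This is not MRTK code, only a minimal illustration of the underlying vector math, with invented function names; Unity's coordinate convention (Y up, X/Z as the ground plane) is assumed:

```python
import math

def radial_view_target(user_pos, gaze_dir, distance=1.0):
    """Place the menu `distance` metres in front of the user along the gaze
    direction -- a simplified sketch of the Radial View follow behaviour."""
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    return tuple(p + d / norm * distance for p, d in zip(user_pos, gaze_dir))

def billboard_yaw_deg(menu_pos, user_pos):
    """Yaw around the vertical (Y) axis that turns the menu toward the user
    -- the Y-axis component of the billboard behaviour."""
    dx = user_pos[0] - menu_pos[0]
    dz = user_pos[2] - menu_pos[2]
    return math.degrees(math.atan2(dx, dz))

# A user standing at the origin at eye height, looking along +Z:
menu = radial_view_target((0.0, 1.6, 0.0), (0.0, 0.0, 1.0))
print(menu)                                       # (0.0, 1.6, 1.0)
print(billboard_yaw_deg(menu, (0.0, 1.6, 0.0)))   # 180.0 -> menu turned back toward the user
```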


Figure 19: Hologram of a body part ready to be positioned by the user.

Three buttons in a column labelled Body Parts enable users to show or hide

holograms of three different radiographic positions. When toggled, the hologram

of the selected body part is set to active and displayed below the menu. In addition,

and as seen in Figure 19, a panel with written instructions on patient position,

central beam position and technical parameters is displayed. To allow for an audio

description of the positioning process and the additional parameters, an audio clip

with a recorded description of the examination can be triggered by performing an

Air Tap gesture on the description panel. This was realized by making the description panel a tappable button. The audio clips describing the positioning

procedure and the technical parameters were recorded with the open source

software Audacity, version 2.3.3.

To allow user manipulation of the holograms, several scripts were necessary. A Near

Interaction Grabbable script made the object grabbable by the user through gazing

at the object and performing an Air Tap gesture. To allow rotation and resizing of

the body parts, a Manipulation Handler and a Bounding Box script were used. The


Manipulation Handler manages the allowed types of manipulation by the user; in this case, users can move, rotate and scale objects with hand gestures. Since the HoloLens 1 only supports input with one hand, a Bounding Box script is used to display a grabbable box when the Rotate and Resize button is toggled. Both the Manipulation Handler and the Bounding Box script were part of the MRTK.

Figure 20: Bounding box to resize and rotate the hologram.

The Bounding Box, illustrated in Figure 20, enables users to resize a hologram by grabbing one of the eight boxes in its corners with an Air Tap gesture. Moving the hand away from the centre of the model enlarges the hologram and vice versa. By grabbing one of the four spheres in the middle of the vertical edges, the hologram can be rotated around the z axis. Since the hologram will be placed on a flat surface, rotation around the other axes was disabled.

To give the hologram physical attributes and to allow interaction with real-world

objects, a Box Collider script and a Spatial Mapping Collider script were added to

the object. The Box Collider sets the boundaries of the Bounding Box as borders

of the object and gives the object rigid attributes. The Spatial Mapping Collider,

which is linked to the size of the Box Collider, enables the hologram to collide with

real-world surfaces. Through constant updates about the spatial mapping of the

room from the HoloLens, users can place the hologram on surfaces like a tabletop.

To simulate the collimation light and the central beam crosshair, a spotlight was set as a child of each hologram. This ensured that the spotlight's position stays

fixed relative to the hologram. When users move the hologram by grabbing it with

an Air Tap gesture, the central beam stays on the correct position of the body part.


By default, the spotlight is disabled and can be toggled on or off with the Light

Beam Localizer button.

Figure 21: Light cookies for a square and a rectangular collimation.

Spotlights in Unity produce radial light, whereas the light beam collimator on a radiographic system is rectangular with a crosshair in the middle. To achieve such an effect, light cookies were used. Cookies are texture files functioning as masks for the light source. They are usually greyscale images where black translates to no light and white to visible light on any surface the spotlight hits in

the scene. To simulate the light beam collimator and the central beam crosshair,

square cookies with a crosshair in the middle were used. Since cookies in Unity

must be square, a rectangular collimation was achieved by a square cookie with

thicker black bars on two of its sides. Examples of a square and a rectangular light cookie are displayed in Figure 21.
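The masking principle can be illustrated by generating such a cookie procedurally. This is a simplified Python sketch using a plain 2D list instead of a Unity texture; the parameter names are illustrative:

```python
def make_cookie(size=64, border=4, crosshair=True, extra_bars=0):
    """Build a square greyscale light-cookie mask as a 2D list.
    255 (white) lets light through, 0 (black) masks it. `extra_bars`
    thickens the black border on the top and bottom, turning the
    square collimation field into a rectangular one."""
    cookie = [[255] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            in_border = (x < border or x >= size - border or
                         y < border + extra_bars or y >= size - border - extra_bars)
            on_cross = crosshair and (x == size // 2 or y == size // 2)
            if in_border or on_cross:
                cookie[y][x] = 0
    return cookie

square = make_cookie()                    # square collimation with crosshair
rectangular = make_cookie(extra_bars=12)  # rectangular collimation
```

Exporting such an array as a greyscale image yields a file that can be assigned to the spotlight's Cookie property in Unity.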

The central beam was positioned on the holograms according to guidelines for

radiographic imaging. The collimation size was adapted to the size of the scanned

body parts. Since the spotlights were set as children of the models, the collimation

size adapts to resized models. As an example, the central beam location and the

light collimation size for an x-ray of both hands are illustrated in Figure 22. For a

radiographic image of both hands, the central beam is positioned at the height of

the middle finger’s base joint and in between both hands. The collimation size

should be large enough to accommodate both hands, all fingers and the wrist [60,

p. 53].
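The reason the central beam stays anchored when the hologram is moved or resized is the standard parent-child transform relationship. The effect can be sketched as follows (illustrative Python, ignoring rotation; the coordinates are made up for the example):

```python
def child_world_position(parent_pos, parent_scale, local_offset):
    """World position of a child object (here: the collimation spotlight)
    attached to a hologram. The local offset scales with the parent, so
    the beam stays over the same anatomical landmark after resizing."""
    return tuple(p + parent_scale * o for p, o in zip(parent_pos, local_offset))

# Spotlight offset 0.25 m above the hand model's origin:
print(child_world_position((1.0, 0.75, 0.5), 1.0, (0.0, 0.25, 0.0)))  # -> (1.0, 1.0, 0.5)
# After doubling the hologram's size, the offset doubles as well:
print(child_world_position((1.0, 0.75, 0.5), 2.0, (0.0, 0.25, 0.0)))  # -> (1.0, 1.25, 0.5)
```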


Figure 22: Central beam and collimation for a hands x-ray.

Reviewing the development process, all functional requirements were met. A

functional menu system allows users to choose a body part, enable rotation and

resizing of the hologram and display the central beam and the collimator. In

addition to toggling switches with a hand gesture, users can also gaze at a button

of interest and use the voice command “select”. As feedback for user inputs,

buttons change their colour when gazed upon and when toggled. To supplement the visual feedback, a clicking noise is played back whenever a button is pressed.

For AR compliance, the menu follows the user’s gaze and always faces the user,

independent of their position in the room. To keep the menu from interfering with

holograms and real-life objects, the menu can be pinned and collapsed in any

position.

The holograms of three body parts in position for a radiographic procedure are

interactable. Users can grab them by gazing at the hologram and grabbing it with

an Air Tap gesture. While holding the tap, the holograms can be moved in the

user’s space through hand movements. The spatial awareness of the HoloLens

allows holograms to be placed onto real-life surfaces such as tabletops. To adapt

to different training environments, the holograms can be resized and rotated by the

users. A bounding box with interactable key points for resizing and rotation can be

displayed for each hologram through a toggle button in the menu.

The menu system further allows users to enable or disable the light beam localizer.

Like on real radiographic equipment, the position of the central beam is indicated

by a crosshair and the recommended size of the x-ray beam is displayed with

visible white light.


Technical parameters for the examination, such as recommended detector size,

voltage or current, are conveyed through text on a virtual board next to the menu.

In addition, the board gives a written overview of the patient position and central

beam location. This allows users to link the visual information of the hologram with

written instructions as seen in textbooks. To play an audio version of the

description, users can Air Tap the panel.

An overview of all software and their versions used during the development of the

prototype is given in Table 5.

Table 5: Software used in the development process.

Purpose | Software | Version | Platform
Capturing 3D Models | Artec Studio | 13 | Windows 10
Primary Post Processing | Artec Studio | 13 | Windows 10
Reduction of 3D Models | Simplygon | 8.3 | Windows 10
Application Development | Unity 3D | 2018.4 | Windows 10
HoloLens Integration | Microsoft Mixed Reality Toolkit | 2.3.0 | Windows 10
Audio Recording | Audacity | 2.3.3 | Windows 10
Scripting and Deploying to the HoloLens | Visual Studio Community | 2019 | Windows 10
HoloLens Emulation | HoloLens Emulator | 10.0.17763.134 | Windows 10


4 Evaluation of ARTUR

To evaluate ARTUR, a heuristic evaluation approach, pioneered by Nielsen and Molich [61], was chosen. A heuristic evaluation describes a review of a product by experts, typically usability experts or experts in the product's domain. For this thesis, three user tests followed by semi-structured expert interviews were conducted. While not optimal, three test users have been shown to already produce high levels of overlap in their findings. The data derived from the interviews was

analysed; common points and statements relevant to the functional requirements

of ARTUR and the research questions of this thesis were gathered and used to

evaluate the feasibility and usability of the prototype [62, pp. 61–64]. An insight into

the experts chosen for the testing of ARTUR and the interview setup is given in

Chapter 4.1.

Prior to the interview and directed by a testing guideline, which is situated in Appendix D, each test user did an independent test run of the application. Test users who were inexperienced with AR received an initial explanation of the HoloLens and its input methods, as well as some time to familiarize themselves with the gestures1 of the HoloLens. Test users who were not familiar with the domain of radiographic imaging received an explanation of the fundamentals of radiographic positioning. All test users were introduced to what the application tries to achieve and its intended use. Following this briefing, all test users completed the tasks provided by the testing guideline without further intervention [62, pp. 63–64]. The testing guideline was designed to simulate a training session

of two radiographic positions. It prompted the users to open the Main Menu of the

HoloLens and to start the Application labelled ARTUR. They should then choose

a body part in the application’s menu and position the hologram on a surface in the

room next to their test patient, who was present for the testing but did not give any

input or feedback to the test users. After placing the hologram onto a surface, the test users positioned the test patient according to the hologram and checked the intended location of the light beam localizer by enabling it in the application's menu. This procedure was carried out for two different radiographic positions. Following the test run, each participant was interviewed in a one-on-one interview session.

1 Air Tap, Bloom, Gaze


4.1 Expert Interview Setup

Participants

Prospective interview participants were identified using an expert definition from Bogner and Menz [63], who stated that experts hold technical, process-orientated and interpretative knowledge on topics relevant to their specific professional activity. This means that expert knowledge does not only consist of reproducible and accessible knowledge, but also of practical knowledge acquired by the expert [64, p. 166].

To benefit from the practical knowledge of radiographers in conducting

radiographic procedures, experts were initially defined as persons with a degree in

radiography and with clinical experience as a radiographer of at least five years.

To additionally gain insight into the knowledge of an expert on AR and VR applications, a second expert criterion was set: test users with at least three years of experience in development, research and practical use of AR applications.

Based on these expert definitions, three experts were chosen and interviewed. The first was a radiographer with six years of clinical experience and no experience with AR or the HoloLens in general. The second was a radiographer with five years of clinical experience and four years of experience in the education of radiographers, who additionally had some experience in developing for and using the HoloLens. The third expert was a research associate in digital technologies with four years of experience in developing AR and VR applications, but with no medical or radiographic background. Further details related to the participants are specified in Table 6.


Table 6: Semi-structured expert interview participants

Expert | E1 | E2 | E3
Gender | Female | Male | Male
Age Group | 30-35 | 30-35 | 30-35
Occupation | Radiographer | Radiographer | Research Associate
Years of Experience - Radiography | 6 | 9 | 0
Years of Experience - AR | 0 | 1 | 4
Highest Degree | MSc | MSc | Dipl.-Ing.
HoloLens Expertise | Novice | Intermediate | Expert
Radiography Expertise | Expert | Expert | Novice

Design & Procedure

Following the user tests, which were done by each of the experts individually, every interview was conducted one-on-one in March and May 2020. The interview with Expert 1 (E1) was held in person; the interviews with Expert 2 (E2) and Expert 3 (E3) were conducted online via Microsoft Teams. Before the interview, all participants signed a consent form, which can be found in Appendix A. After recapitulating the goals of this thesis and clarification of any questions

by the participants, the interviews were conducted according to an interview

guideline, which is located in Appendix E. The guideline was designed to address

topics relevant to the design and functional features of ARTUR. Where possible, and to allow participants to consider different aspects of their test run, open-ended questions were asked. More specific sub-questions were added whenever

participants did not consider a topic by themselves [64, p. 167]. Including the

introduction and discussion about participants’ questions, each interview lasted

approximately 45 minutes.

Apparatus & Materials

To allow for an analysis of the interviews, audio of all three interviews was recorded

and transcribed [65, p. 83]. Since all participants were native German speakers,

the interviews were conducted in German. This allowed the experts to answer

more freely and without the constraints of using a second language. The transcripts

of the German interviews can be found in Appendix F. For use in this thesis, the


transcripts were later translated into English; these translations are attached in Appendix G. Through content analysis, the English transcripts were thematically categorised for statements relevant to the feasibility and usability of the application [65, p. 86]. To allow for a systematic evaluation, the functional requirements set during the design process of ARTUR in Chapter 3.3 were used as a framework for a comparison with the interview findings. Testimonies indicating common ground or major disagreements between the experts were also highlighted, categorized and included in the findings.

4.2 Expert Interview Findings

All three test users were able to complete the test tasks without any intervention

or assistance. None of the test users experienced any errors or crashes during

their testing. E2 and E3 stated that their experience in working with HoloLens

applications helped them to quickly finish the tasks on hand. E1, who was the only

test user without any AR experience, mentioned the presence of a learning curve,

but stated that the gestures needed to operate the HoloLens were learned quickly.

All experts agreed on the need for an introductory training for users who are new to AR, and to the HoloLens especially. A tutorial for the gestures used to interact with the HoloLens is needed in some form. Suggestions on how to realize this training ranged from hands-on lessons by an experienced trainer and interactive training applications on the headset itself to tutorial videos showcasing the possible modes of input. An important input given by E3 was to make new users aware of the limited FOV provided by the HoloLens; if users do not know about this limitation, they could miss crucial information.

The experts also identified potential issues with limitations in use time and a lack of comfort for users. They did not recognize any problems for shorter use times, like the simulated training session, but E2 mentioned decreased comfort for users wearing prescription glasses and possible comfort issues for longer, continuous use times of 30 to 45 minutes. To keep interaction with the application as convenient as possible, E3 suggested keeping the required menu interactions to a minimum. They recognized that interaction with menu items, especially text input via the on-screen keyboard, can be strenuous for users. Whenever possible, text input should be avoided or replaced by a voice input system.

The first functional requirement set for the application was a menu system to

choose holograms, enable hologram manipulation and display technical

parameters. All experts were able to navigate the menu and found the options and

information they expected to be present. E1 and E2 stated that the menu was


logical and simple enough to quickly navigate through the application. E3 agreed but found that some non-interactable user interface (UI) elements were similar in appearance to clickable buttons, which led to the assumption that they were interactable. A clearer distinction between labels and buttons, be it by colour or shape, is needed. The possibility to let the menu follow the user's gaze and, if needed, to pin the menu in the user's space was recognised as an important feature for AR applications by all test users.

When asked about the visual quality of the holograms used as positioning reference, all experts agreed that the 3D models provided sufficient visual quality to reliably assist users in positioning a real body part. The fidelity was high enough to make out anatomical details, which are important to position the patient and the central beam of the x-ray unit. The visual quality of the holograms was dependent on the lighting situation. The brightness adjustment of the headset was deemed useful to adapt to different light conditions. If the room is too bright, hologram visibility is limited; a dark room, on the other hand, hinders the spatial awareness of the headset.

The FOV, which was mentioned as a possible topic for the training of new HoloLens users, was discussed by all three experts. The FOV of 30° × 17.5° was considered too small by all test users. However, E3 mentioned that an application must always be programmed around the limitations of the hardware it runs on. If holograms activated by a button are not immediately in the user's FOV, users might get the impression that the application does not work properly.
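To put this criticism in perspective, the area covered by a 30° × 17.5° FOV at a given viewing distance can be estimated with basic trigonometry. The following is a back-of-the-envelope Python sketch; the 1.5 m viewing distance is an assumption for illustration, not a value from the tests:

```python
import math

def visible_extent(fov_deg: float, distance_m: float) -> float:
    """Width or height of the region covered by a field of view
    at a given viewing distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

# HoloLens 1: roughly 30 deg horizontal x 17.5 deg vertical
width = visible_extent(30.0, 1.5)    # ~0.80 m
height = visible_extent(17.5, 1.5)   # ~0.46 m
print(f"visible region at 1.5 m: {width:.2f} m x {height:.2f} m")
```

At a typical working distance, a hologram or menu placed slightly off to the side can thus easily fall outside the visible window, which is why new users need to be made aware of this limitation.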

To allow for smooth operation and reliable hologram positioning relative to real-world objects, HoloLens applications must reach a stable and sufficiently high framerate, preferably 60 FPS. When asked about the application's performance and hologram stability, none of the test users noticed any stuttering or inconsistencies in the holograms' positioning.

The test users stated that hologram interaction worked as expected. Users could

grab holograms and position them on any surface in their vicinity. The bounding

box allowed them to rotate and resize the holograms. E1 mentioned that the input

methods were not intuitive at first and that there is a learning curve for new users.

This correlates with statements made by E2 and E3, who again emphasized the

importance of an introduction for first time users. The possibility to enable the

spotlight for the central beam position and the collimation size was thought to be a

useful feature by both participating radiographers.

The bounding box, which allows users to rotate and resize holograms, was considered less intuitive in operation. E1 mentioned that, at first, they


expected the corners of the box to allow for both rotation and resizing. This was confirmed by E3, who stated that the bounding box is only intuitive for experienced users, again putting an emphasis on the importance of training for new users. In addition, E3 found an issue with the hologram's box collider. Since the box collider has the same size as the bounding box, buttons which are visible through the transparent bounding box cannot be interacted with. They suggested capsule or mesh colliders as alternatives to box colliders, which should minimize colliding areas outside of the actual 3D model, but also immediately mentioned possible performance issues caused by mesh colliders.

Feedback on user input is an important part of UI design, so test users were asked about their impression of the feedback given by the application. All test users felt that

they received enough feedback on menu interactions. The audible feedback when

a button was pressed and the visual feedback, with the button changing colour

when toggled, let users know if their input was recognized. The test users agreed

that too much audible feedback can be distracting and did not want audible

feedback for hologram movement or resizing to be implemented into the

application. This was further supported by E3, who evaluated user feedback of a

mobile AR application. Their research suggested that interaction prompts in the

form of visual highlights were received better than audible or haptic feedback.

Since all experts mentioned that the input methods for the HoloLens 1 are not

intuitive, the test users were asked if they had suggestions for more intuitive

interaction with holograms. Regardless of their level of expertise in using the HoloLens,

all test users mentioned a “smartphone like” interaction to be the most intuitive.

This form of interaction would render the bounding box unnecessary, since users

could grab holograms with two hands or fingers and manipulate them directly.

Additional voice commands for text input and hologram manipulation were also

mentioned by two experts.

To allow an outlook on the educational benefits of ARTUR, test users were asked

about their opinion regarding AR supported learning. All experts agreed that AR

can benefit education. E1 and E2 stated that the visual aid given by ARTUR helped

them in positioning their test patient. All test users saw the tool as a supportive

learning method to allow students self-sufficient learning without the need for a

trainer being present. E2 mentioned that AR could be used to explain various

medical procedures in detail, ranging from technical information, like the structure

of machines to teaching soft skills like patient interaction. All experts agreed that, to enable self-study, the system should provide users with feedback on their actions performed in the real world, e.g. how close the test patient's position is to the hologram's position.


4.3 Evaluation Results

The analysed qualitative data allowed for an evaluation of the application's design and usability. Due to the relatively small sample size, these results provide only an indication of the application's usability. Ideally, the findings should be implemented into the application, followed by another iteration of the user centred design cycle [54].

To measure the impact of issues found by the test users, each identified usability problem was rated for its severity. The severity ratings for usability problems by Nielsen [8] indicate the resources that need to be allocated to fix usability issues. The severity of a usability problem is derived from a combination of the frequency with which the issue occurs, the impact of the issue on usability and the persistence of the problem. The severity of usability issues is rated on a scale from zero to four; the rating scale is illustrated in Table 7. Zero indicates that the identified issue is no usability problem. Cosmetic problems receive a rating of one and no priority to be fixed. Minor usability problems are a two on the scale of severity and should be fixed with low priority. Major usability problems are rated with three and should be fixed with high priority. Usability catastrophes, ranked as four on the scale of severity, should be fixed immediately [8].

Table 7: Scale of severity ratings for usability problems by Nielsen [8].

Rating | Severity | Priority
0 | No Usability Problem | -
1 | Cosmetic Problem | None
2 | Minor Usability Problem | Low
3 | Major Usability Problem | High
4 | Usability Catastrophe | Fix Immediately
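As a worked illustration of how the scale drives prioritisation, the ratings in Table 7 can be applied programmatically. This is an illustrative Python sketch; the issue names mirror this chapter's findings and the helper function is hypothetical:

```python
# Nielsen's severity scale, as listed in Table 7.
SEVERITY = {
    0: ("No Usability Problem", "-"),
    1: ("Cosmetic Problem", "None"),
    2: ("Minor Usability Problem", "Low"),
    3: ("Major Usability Problem", "High"),
    4: ("Usability Catastrophe", "Fix Immediately"),
}

def triage(issues: dict) -> list:
    """Order identified issues by descending severity to prioritise fixes."""
    return sorted(issues.items(), key=lambda kv: kv[1], reverse=True)

findings = {"Collider Occlusion": 2, "Button/Label Distinction": 1, "Limited FOV": 0}
for name, rating in triage(findings):
    label, priority = SEVERITY[rating]
    print(f"{name}: {label} (priority: {priority})")
```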


Since ARTUR is limited to run on current-generation Microsoft HoloLens hardware, hardware-related issues were given a severity rating of zero. While they can decrease usability, hardware-related issues cannot be fixed but only worked around. Any software-related usability problems, directly or indirectly caused by hardware issues, received their own severity rating.

General Usability

+ All test users were able to complete the tasks set for the usability test and

did not need any assistance.

− An introduction to the interaction model of the HoloLens for new users is
needed.

− Use time may be limited by lack of comfort provided by the headset.

− The FOV was smaller than expected.

Since the limited FOV and a lack of wearing comfort are hardware issues, they are

given a severity rating of zero. The comfort and FOV of future AR platforms should

be considered for further development of ARTUR.

Menu System

+ The menu was functional and logically structured.

+ Users were able to navigate the menus as planned.

+ Information provided by the menu was accessible.

+ Following the user’s gaze and the possibility to pin the menu in space were

expected by the users and worked as intended.

− Better distinction between interactive and non-interactive menu elements

is needed.

The lacking distinction between interactive and non-interactive menu elements is a frequent issue, encountered by each user every time the menu is in use. Since manipulating non-interactive elements does not trigger any actions, the impact can be rated as low. The same applies to the persistence of the problem: once users are aware of the differences between buttons and labels, usability is not restricted. It can therefore be rated as a cosmetic problem, which should only be fixed when resources are available. As a cosmetic issue, it is rated as a one on the severity scale.


Holograms

+ The holograms’ fidelity was high enough to recognize anatomical features.

+ Hologram interaction worked as expected, users were able to manipulate

holograms as they intended to.

+ Central beam and collimator size indication were working and usable.

− The interaction methods are not intuitive and need some learning by the

user.

− Usability issue with transparent box colliders, which were bigger than the

holograms. This did not allow for the manipulation of buttons visible behind

the hologram.

The limited interaction methods are, again, a constraint of current-generation hardware and cannot be fixed in software. They are therefore given a severity rating of zero.

Users not being able to manipulate visible objects, because they are occluded by transparent collider boxes, is an issue frequently encountered by users. By hindering users from interacting with the application's menu or holograms, the impact of the problem is more severe than just cosmetic. Users can overcome the issue by moving holograms or the menu in their space, which makes the issue non-persistent. It can be classified as a minor usability issue, with a rating of two on the scale of severity.

User Feedback

+ The application provided both audible and visual feedback.

+ Users felt that their inputs were confirmed and that they knew if their inputs

were accepted or not.

Overall, users found the design and usability of the application suitable for its

purpose. During the evaluation, no major usability issues or usability catastrophes

were identified. The usability issues and their rating on the scale of severity by

Nielsen [8] are listed in Table 8.


Table 8: Severity ratings of the identified issues.

Issue | Severity Rating
Headset Comfort | 0
Limited FOV | 0
Limited Interaction Methods | 0
Better Distinction Between Buttons and Labels | 1
Collider Occlusion | 2

To enhance the user experience, better distinction between interactable UI elements and non-interactable labels, as well as modified hologram colliders, should be implemented in the next iteration of ARTUR. General limitations like the limited FOV, lack of comfort and unintuitive gestures can be attributed to hardware limitations of the HoloLens 1. When available, porting the application to the HoloLens 2 could solve some hardware-related issues and further improve the usability of ARTUR.


5 Discussion

The main goal of this thesis was to evaluate the value of virtual learning

environments in the education of health care professionals, especially in

radiographic patient positioning for radiographers. This included literature research

about the current state of the art in the practical and virtual training of

radiographers, followed by the user centred design of an AR application to assist

in the training of radiographer students.

Practical training plays a crucial part in the education of healthcare professionals.

Various studies recognized clinical practice, where students perform supervised

examinations on real patients, as a valuable part of radiographer programmes [2],

[33], [34]. Students often find the transition from academic studies to clinical

practice challenging, which makes clinical placements not universally applicable

[35], [37], [38]. To prepare students for their clinical practice and to ensure patient

safety, simulation-based trainings are commonly in use in radiographer degree

programmes [34], [35], [42]. Simulation-based training tries to emulate real

scenarios with varying degrees of fidelity and immersion and helps students to

develop new skills in a safe and stress-free environment [14], [34].

Role-play simulation in laboratory settings, where students simulate patient

interaction on real radiographic equipment, has been shown to significantly increase

the students’ clinical knowledge of radiographic examinations and radiographic

projection [34]. These types of hands-on simulation require access to a laboratory

with radiographic equipment and the presence of a trainer, which makes them cost

and time intensive [4]. Virtual simulation has been shown to provide similar benefits to hands-on simulation and is in use in various disciplines [48]–[50].

To explore the possibilities of AR in the practical training of radiographic positioning, ARTUR was developed utilizing one iteration of a user centred design cycle [54]. Since the application is meant to support radiography students in their training, a focus group of four teaching radiographers was assembled. The focus group explored how AR could assist in the training of radiographic positioning and how such a training application could be realized.

In accordance with a study by Holmström et al. [2], the focus group recognized the

importance of practical training in the education of radiographers. They also

acknowledged that simulation-based training, especially virtual simulation, can provide reproducible training environments while being time and cost effective


[4]. In addition, the focus group agreed on the advantages of time and space

independent training possibilities provided by virtual simulation. These findings

also correlate with current literature, indicating that the focus group setup was

appropriately chosen [4], [32], [34].

Based on the findings of the focus group interview, functional and non-functional key

requirements for ARTUR were set. These provided a guideline for the development

and user testing of the application.

To evaluate the developed prototype, three semi-structured expert interviews were

conducted. Each expert did an independent test run of the application, which was

directed by a testing guideline. To allow for systematic analysis, the interviews

were recorded and transcribed.

The test users had varying degrees of experience in using the HoloLens. All of them, from first-time user to developer, were able to complete the tasks provided by the testing

guideline without any assistance, which suggests an overall sufficient usability of

ARTUR. The impact of the issues identified by the test users was rated utilizing the

scale of severity ratings for usability problems by Nielsen [8]. No major or

catastrophic usability problems were found, which further confirms the assumption

that ARTUR has satisfactory usability.

The menu system was evaluated to be functional and logically structured. However, E3, who had the most experience in working with and developing for the HoloLens, noted an issue regarding the UI design: interactable buttons and UI labels were not distinctive enough, which led to the assumption that labels were also interactable. This could be attributed to labels being only a darker shade of blue than the buttons, rather than being of a different colour. The fact that only the most experienced user noticed this issue leads to the assumption that less experienced users were preoccupied with operating the system, which left no capacity for experimentation. In addition, E3 has better insight into UI design, which could help in spotting minor cosmetic issues. In a further iteration of ARTUR, the UI should be tweaked to allow for better distinction between interactable buttons and labels.

The focus group, and a study by Hagiwara et al. [43] suggested that the realism

provided by high-fidelity simulations helps the transfer of theoretical knowledge into

practical scenarios. This is supported by a study by Shiner et al. [36], which tested

the impact of realistic simulation of patient injuries on the transfer of theoretical

knowledge into real situations. The 3D scans used in the prototype of ARTUR were

seen as realistic and detailed enough by all test users. Anatomical structures,


important for the positioning of the body part and the x-ray central beam, were

identifiable.

Feedback on user input is especially important in an AR environment, where haptic

feedback is impossible. All test users felt that the application gave them enough

feedback on their inputs. They received confirmation if their inputs were accepted

or not. The focus group and the experts agreed that visual feedback is the most

reliable form of feedback and that audio feedback must be used appropriately.

Overall, the findings of this thesis suggest that virtual simulation, including AR simulation, benefits radiographer students in their development of practical skills.

The possibility to combine virtual 3D models and real-world objects leaves

opportunities for many applications.

At the moment, a limitation of AR is the relatively high entry cost of the hardware. While still cheaper than laboratories and trainers, AR headsets will, for the near future, not be available to individual students in their homes, which negates the time and space independence other virtual simulations can offer. Other issues pointed out by the test users were the limited FOV and interaction possibilities of the HoloLens 1. Both are hardware limitations and can, at the moment, only be worked around; newer generation hardware might solve these issues. This suggests that AR is still an emerging technology which can only benefit from more capable hardware, broader accessibility and users who are accustomed to AR.

A major limitation of this thesis is the small number of expert interviews: three. While three test users already produce high levels of overlap, tests with seven radiographers, followed by semi-structured interviews, and a survey of radiographer students to obtain the SUS score were planned initially. Due to the COVID-19 situation in early 2020, it was impossible to carry out in-person user tests with healthcare professionals or students on a large scale. This led to the extension of the expert criteria to include AR/VR experts and the reduction to three expert interviews. Another limitation, attributed to the limited time frame and scope of this thesis, is that only one iteration of a user centred design cycle for ARTUR was performed.
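Although the planned SUS survey could not be carried out, the standard SUS scoring procedure is straightforward to compute should it be applied in a future evaluation. The following is a minimal sketch of that procedure; the example responses are hypothetical:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled by 2.5 to a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical responses of a single participant:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # prints 85.0
```

Scores above roughly 68 are commonly interpreted as above-average usability, which would make the metric a useful complement to the qualitative expert interviews.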

As future prospects for this project, further iterations of the user centred design cycle should be applied. Following an implementation of the requested features and suggested fixes, another evaluation by an extended group of test users could be carried out. When available, porting the application to the recently released HoloLens 2, with its increased FOV and more intuitive interaction methods, could increase general usability. Due to the use of the MRTK during the development of ARTUR,


a port to the HoloLens 2 would be possible without the need to rebuild the

application from the ground up. In addition to usability testing, the impact on

student learning of an AR simulation training could be researched. This could be

realized by conducting a cohort study with one group using ARTUR and one group

utilizing traditional role-play training in their preparation for clinical practice.

A test administered before and after the simulation training could determine differences in acquired knowledge.
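As a minimal sketch of how such a cohort study could be analysed, the mean pre-to-post knowledge gain of each group can be compared. All scores below are hypothetical, and a real study would add an appropriate significance test:

```python
from statistics import mean

def mean_gain(pre, post):
    """Mean per-student improvement between paired pre- and post-test scores."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical test scores (percent correct) for two small groups:
artur_pre, artur_post = [55, 60, 48, 62], [78, 80, 70, 85]  # group using ARTUR
role_pre, role_post = [57, 59, 50, 61], [72, 74, 66, 79]    # group using role-play

print(mean_gain(artur_pre, artur_post))  # mean gain of the ARTUR group
print(mean_gain(role_pre, role_post))    # mean gain of the role-play group
```

Comparing the two group means would indicate whether AR-supported preparation produces learning gains comparable to traditional role-play training.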

5.1 Contributions

Although virtual simulation environments for the training of plain radiographic

procedures do exist and the impact of virtual simulation has been researched, at

the time of writing no research into AR-supported training of plain radiographic

positioning has been done.

In accordance with current literature, the focus group recognized virtual training

environments and AR as a promising technology to enable time- and cost-effective

independent learning. The focus group agreed on virtual simulation representing

additional tools for education, complementing traditional methods like hands-on

training, lectures and clinical practice. They had high expectations of the capability of AR, and of new technologies in general, to boost learner engagement.

The evaluation of ARTUR showed that AR applications are usable by users of any level of experience with AR or mixed reality. The evaluation has also shown that many limitations are set by current generation hardware. While all test users were able to navigate the application, none described interaction with the HoloLens as intuitive, suggesting the need for initial training for users new to AR. Further limitations for the wide deployment of AR-supported training are the limited availability of current AR hardware and its high purchase cost, which restrict the potential user base to educational institutions or healthcare providers. With a decrease in hardware cost and a wider distribution of AR headsets, true time and space independent training of radiographic positioning could become a reality.


6 Conclusion

To quantify the impact of virtual learning environments on the education of

radiographers, this thesis utilized recent literature to analyse the current state of

the art in radiographic training. Several studies indicate that simulation-based

training helps students in developing practical clinical skills. As studies suggest,

comparable results can be achieved with the assistance of virtual training

environments. To further answer the research questions of this thesis, an AR

application for the Microsoft HoloLens 1 was developed and tested. The

development was done in a user centred design approach, making use of a focus

group to set functional requirements and expert interviews to evaluate the

developed prototype.

To answer the main research question on how virtual learning environments can

support radiographers in learning radiographic positioning, the findings of the

literature research and the focus group interview were used. The findings suggest

that virtual simulation enables students to translate theoretical knowledge into

practical skills and develop those skills in a safe and stress-free environment.

Virtual simulation can improve immersion and student engagement, which in turn

increases clinical knowledge of radiographic examinations and radiographic

projection. In addition, virtual simulation can be used to adequately prepare

students for their clinical practice, which reduces stress in students and risk for

patients. Another benefit of virtual simulation is that it enables students to practice

their skills independent of a schedule and without the need for dedicated laboratories, which allows them to study at their own pace. In addition,

universities can save costs associated with laboratory equipment and trainers for

hands-on trainings.

For the user centred design of ARTUR, and to answer the first sub-question on how a visual aid to assist users in the training of the positioning process can be realized, a focus group interview with four teaching radiographers was conducted.

The focus group helped to set functional and non-functional key requirements for

an AR radiography training application. These requirements stated that 3D models

of body parts in position for various radiographic procedures were needed. To

acquire high-fidelity models, 3D scans of three body parts were made. The 3D

models were then implemented in a functional prototype of ARTUR. This

application allows users to choose a body part and display a semi-transparent model of it positioned on a radiographic detector. Users are able to


manipulate holograms and position them on a surface in their surroundings. This

hologram can be used as a reference to position a role-play partner for the chosen

examination. To guide users on the other aspects of a plain radiographic

examination, the x-ray central beam and collimation size for each body part can be

displayed. Technical parameters, which cannot be displayed on the 3D model, are

displayed in written form on a panel floating in the user’s vicinity and read to the

user by an audio clip, which can be triggered by interacting with the panel.

Because of a lack of suitable sensors on the HoloLens, the application is only an indicator of correct positioning of the body part and the x-ray equipment; feedback on the actually achieved position is not possible. A feedback system was requested by both the focus group and all test users and could be implemented when more

capable AR hardware is available.

To answer the second sub-question on how users evaluate the design and usability of the visual positioning aid, and to evaluate the implementation of ARTUR, three

experts tested the application and individual semi-structured interviews were

conducted. Two test users were experts in radiography and one was an expert on

AR/VR applications. The users with less experience in using the HoloLens

received an introduction on how to interact with the device. During the test itself,

the users did not receive further input or assistance. All users were able to

complete all tasks on the testing guideline without any issues, which indicates no major usability problems with ARTUR. Most usability issues were attributed to

hardware limitations of the HoloLens 1. The FOV was deemed too limited by all

test users, leading to disorientation or the loss of information. The second major

complaint, given by all test users, was the limited form of interaction allowed by the

HoloLens 1. The combination of using the user’s gaze and their hands is confusing

for new users and takes effort, even from experienced users. All experts agreed that, despite the application's simple structure, new users would need some form

of training on using the HoloLens to enable effective training with the device.

This thesis leaves interesting prospects for the use of AR in the education of

radiographers, healthcare professionals or in education in general. As of now, AR

is still limited by its hardware. Headsets are expensive and meant for professional

use rather than for personal use, which practically excludes studying at home from current use cases. With hardware becoming more accessible and capable, the

possibilities of AR in education are nearly endless. Higher fidelity holograms, better

utilization of real-world objects and more intuitive input methods would increase

user immersion and acceptance of the technology. This leads to questions about the impact AR applications can have on learning in general and how gamification could

benefit virtual training applications.


References

[1] N. Katajavuori, S. Lindblom-Ylanne, and J. Hirvonen, ‘The Significance of Practical Training in Linking Theoretical Studies with Practice’, Higher Education, vol. 51, no. 3, Accessed: Oct. 15, 2019. [Online]. Available: https://link.springer.com/article/10.1007/s10734-004-6391-8.

[2] A. Holmström, ‘Radiography Students’ Learning of Plain X-Ray Examinations in Simulation Laboratory Exercises: An Ethnographic Research’, Journal of Medical Imaging and Radiation Sciences, vol. 50, no. 4, pp. 557–564, Dec. 2019, doi: 10.1016/j.jmir.2019.07.005.

[3] B. L. Niell et al., ‘Prospective Analysis of an Interprofessional Team Training Program Using High-Fidelity Simulation of Contrast Reactions’, American Journal of Roentgenology, vol. 204, no. 6, pp. W670–W676, May 2015, doi: 10.2214/AJR.14.13778.

[4] C. J. McCarthy and R. N. Uppot, ‘Advances in Virtual and Augmented Reality—Exploring the Role in Health-care Education’, Journal of Radiology Nursing, vol. 38, no. 2, pp. 104–105, Jun. 2019, doi: 10.1016/j.jradnu.2019.01.008.

[5] T. P. Chang and D. Weiner, ‘Screen-Based Simulation and Virtual Reality for Pediatric Emergency Medicine’, Clinical Pediatric Emergency Medicine, vol. 17, no. 3, pp. 224–230, Sep. 2016, doi: 10.1016/j.cpem.2016.05.002.

[6] B. M. Garrett, C. Jackson, and B. Wilson, ‘Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory’, Interactive Technology and Smart Education, vol. 12, no. 4, pp. 298–314, 2015, doi: 10.1108/ITSE-05-2015-0013.

[7] J. Lazar, J. H. Feng, and H. Hochheiser, Research Methods in Human-Computer Interaction, 2nd edition. Cambridge, MA: Morgan Kaufmann, 2017.

[8] J. Nielsen, ‘Severity Ratings for Usability Problems’, Nielsen Norman Group, Nov. 01, 1994. https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/ (accessed May 23, 2020).

[9] W. C. Röntgen, ‘Ueber eine neue Art von Strahlen’, Annalen der Physik, vol. 300, no. 1, pp. 12–17, 1898, doi: 10.1002/andp.18983000103.

[10] G. Holzknecht, Rötgenologie. Urban & Schwarzenberg, 1918.


[11] S. Sandström, Manual of Radiographic Technique. World Health Organization, 2003.

[12] H. P. Nowak, Kompendium der Röntgen Einstelltechnik und Röntgenanatomie. Rothenthurm: ixray GmbH, 2011.

[13] S. Becht, R. C. Bittner, A. Ohmstede, A. Pfeiffer, and R. Roßdeutscher, Lehrbuch der radiologischen Einstelltechnik, 7. Aufl. 2019. Berlin: Springer, 2018.

[14] D. Sapkaroski, M. Baird, M. Mundy, and M. R. Dimmock, ‘Quantification of Student Radiographic Patient Positioning Using an Immersive Virtual Reality Simulation’:, Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, vol. 14, no. 4, pp. 258–263, Aug. 2019, doi: 10.1097/SIH.0000000000000380.

[15] K. Alzyoud, P. Hogg, B. Snaith, K. Flintham, and A. England, ‘Optimum Positioning for Anteroposterior Pelvis Radiography: A Literature Review’, Journal of Medical Imaging and Radiation Sciences, vol. 49, no. 3, pp. 316-324.e3, Sep. 2018, doi: 10.1016/j.jmir.2018.04.025.

[16] R. Carlton and A. Adler, Principles of Radiographic Imaging. Cengage Learning, 2012.

[17] A. Dixon, ‘Normal wrist x-rays | Radiology Case | Radiopaedia.org’, Radiopaedia. https://radiopaedia.org/cases/normal-wrist-x-rays?lang=gb (accessed Mar. 24, 2020).

[18] L. Berlin, ‘The importance of proper radiographic positioning and technique’, American Journal of Roentgenology, vol. 166, no. 4, pp. 769–771, Apr. 1996, doi: 10.2214/ajr.166.4.8610546.

[19] K. L. Moore, A. F. Dalley, and A. M. R. Agur, Clinically Oriented Anatomy. Lippincott Williams & Wilkins, 2013.

[20] Mikael Häggström and D. Richfield, ‘Anatomical plane’, Wikipedia. Dec. 16, 2019, Accessed: May 07, 2020. [Online]. Available: https://en.wikipedia.org/w/index.php?title=Anatomical_plane&oldid=930986097.

[21] H. Aichinger, J. Dierker, S. Joite-Barfuß, and M. Säbel, Radiation Exposure and Image Quality in X-Ray Diagnostic Radiology: Physical Principles and Clinical Applications, 2nd ed. Berlin Heidelberg: Springer-Verlag, 2012.

[22] P. C. Brennan, Radiation Protection in Diagnostic X-Ray Imaging. Jones & Bartlett Publishers, 2016.

[23] S. Aukstakalnis, Practical Augmented Reality: A Guide to the Technologies, Applications, and Human Factors for AR and VR. Addison-Wesley Professional, 2016.


[24] J. Linowes and K. Babilinski, Augmented Reality for Developers: Build practical augmented reality applications with Unity, ARCore, ARKit, and Vuforia. Packt Publishing Ltd, 2017.

[25] Microsoft, ‘HoloLens (1st gen) hardware’, Sep. 16, 2019. https://docs.microsoft.com/en-us/hololens/hololens1-hardware (accessed Feb. 15, 2020).

[26] Microsoft, ‘Spatial mapping - Mixed Reality’, 2020. https://docs.microsoft.com/en-us/windows/mixed-reality/spatial-mapping (accessed Feb. 15, 2020).

[27] Microsoft, ‘Getting around HoloLens (1st gen)’, Sep. 16, 2019. https://docs.microsoft.com/en-us/hololens/hololens1-basic-usage (accessed Feb. 15, 2020).

[28] Microsoft, ‘HoloLens 2—Overview, Features, and Specs’, 2020. https://www.microsoft.com/en-us/hololens/hardware (accessed Feb. 15, 2020).

[29] K. Hinton, Creating with 3D Scanners. The Rosen Publishing Group, Inc, 2016.

[30] Artec Europe, ‘White Light 3D Scanner | Artec Eva Lite’, 2020. https://www.artec3d.com/portable-3d-scanners/artec-eva-lite#specifications (accessed Feb. 16, 2020).

[31] I. Brilakis and C. T. M. Haas, Infrastructure Computer Vision. Butterworth-Heinemann, 2019.

[32] F. Lateef, ‘Simulation-based learning: Just like the real thing’, J Emerg Trauma Shock, vol. 3, no. 4, pp. 348–352, 2010, doi: 10.4103/0974-2700.70743.

[33] M. Shanahan, ‘Student perspective on using a virtual radiography simulation’, Radiography, vol. 22, no. 3, pp. 217–222, Aug. 2016, doi: 10.1016/j.radi.2016.02.004.

[34] A. Kong, Y. Hodgson, and R. Druva, ‘The role of simulation in developing clinical knowledge and increasing clinical confidence in first-year radiography students’, Focus on Health Professional Education: A Multi-Professional Journal, vol. 16, no. 3, pp. 29–44, Jul. 2015, doi: 10.11157/fohpe.v16i3.83.

[35] N. Shiner, ‘Is there a role for simulation based education within conventional diagnostic radiography? A literature review’, Radiography, vol. 24, no. 3, pp. 262–271, Aug. 2018, doi: 10.1016/j.radi.2018.01.006.

[36] N. Shiner and M. L. Howard, ‘The use of simulation and moulage in undergraduate diagnostic radiography education: A burns scenario’, Radiography, vol. 25, no. 3, pp. 194–201, Aug. 2019, doi: 10.1016/j.radi.2018.12.015.


[37] S. M. Kengyelics, L. A. Treadgold, and A. G. Davies, ‘X-ray system simulation software tools for radiology and radiography education’, Computers in Biology and Medicine, vol. 93, pp. 175–183, Feb. 2018, doi: 10.1016/j.compbiomed.2017.12.005.

[38] B. O. Botwe, L. Arthur, M. K. K. Tenkorang, and S. Anim‐Sampong, ‘Dichotomy between theory and practice in chest radiography and its impact on students’, Journal of Medical Radiation Sciences, vol. 64, no. 2, pp. 146–151, 2017, doi: 10.1002/jmrs.179.

[39] ‘Preceptorship In Nursing Education: Is It A Viable Alternative Method For Clinical Teaching?’, ICUS and Nursing Web Journal, no. 19, Sep. 2004, Accessed: May 07, 2020. [Online]. Available: https://www.researchgate.net/publication/233959986_PRECEPTORSHIP_IN_NURSING_EDUCATION_IS_IT_A_VIABLE_ALTERNATIVE_METHOD_FOR_CLINICAL_TEACHING.

[40] D. Barrett, ‘The clinical role of nurse lecturers: Past, present, and future’, Nurse Educ Today, vol. 27, no. 5, pp. 367–374, Jul. 2007, doi: 10.1016/j.nedt.2006.05.018.

[41] B. T. Mabuda, ‘Student nurses’ experiences during clinical practice in the Limpopo Province’, Curationis, vol. 31, no. 1, pp. 19–27, Sep. 2008, doi: 10.4102/curationis.v31i1.901.

[42] S. Aura, S. Jordan, S. Saano, K. Tossavainen, and H. Turunen, ‘Transfer of learning: Radiographers’ perceptions of simulation-based educational intervention’, Radiography, vol. 22, no. 3, pp. 228–236, Aug. 2016, doi: 10.1016/j.radi.2016.01.005.

[43] M. A. Hagiwara, P. Backlund, H. M. Söderholm, L. Lundberg, M. Lebram, and H. Engström, ‘Measuring participants’ immersion in healthcare simulation: the development of an instrument’, Adv Simul, vol. 1, no. 1, p. 17, Jan. 2016, doi: 10.1186/s41077-016-0018-x.

[44] J. Lave and E. Wenger, Situated Learning: Legitimate Peripheral Participation. Cambridge University Press, 1991.

[45] D. Sapkaroski, M. Baird, J. McInerney, and M. R. Dimmock, ‘The implementation of a haptic feedback virtual reality simulation clinic with dynamic patient interaction and communication for medical imaging students’, J Med Radiat Sci, vol. 65, no. 3, pp. 218–225, Sep. 2018, doi: 10.1002/jmrs.288.

[46] E. Giles and K. Thoirs, ‘Use of Simulated Learning Environments in Radiation Science Curricula’, hwa.gov.au, Accessed: Mar. 09, 2020. [Online]. Available: https://www.academia.edu/19987127/Use_of_Simulated_Learning_Environments_in_Radiation_Science_Curricula.


[47] P. Bridge et al., ‘The development and evaluation of a medical imaging training immersive environment’, J Med Radiat Sci, vol. 61, no. 3, pp. 159–165, Sep. 2014, doi: 10.1002/jmrs.60.

[48] S.-J. Ketterer et al., ‘Simulated versus traditional therapeutic radiography placements: A randomised controlled trial’, Radiography, vol. 0, no. 0, Oct. 2019, doi: 10.1016/j.radi.2019.10.005.

[49] K. Lee, M. Baird, S. Lewis, J. McInerney, and M. Dimmock, ‘Computed tomography learning via high-fidelity simulation for undergraduate radiography students’, Radiography, vol. 26, no. 1, pp. 49–56, Feb. 2020, doi: 10.1016/j.radi.2019.07.001.

[50] S. Balian, S. K. McGovern, B. S. Abella, A. L. Blewer, and M. Leary, ‘Feasibility of an augmented reality cardiopulmonary resuscitation training system for health care providers’, Heliyon, vol. 5, no. 8, p. e02205, Aug. 2019, doi: 10.1016/j.heliyon.2019.e02205.

[51] S. Kirkbright, J. Finn, H. Tohira, A. Bremner, I. Jacobs, and A. Celenza, ‘Audiovisual feedback device use by health care professionals during CPR: A systematic review and meta-analysis of randomised and non-randomised trials’, Resuscitation, vol. 85, no. 4, pp. 460–471, Apr. 2014, doi: 10.1016/j.resuscitation.2013.12.012.

[52] J. Gruber, D. Stumpf, B. Zapletal, S. Neuhold, and H. Fischer, ‘Real-time feedback systems in CPR’, Trends in Anaesthesia and Critical Care, vol. 2, no. 6, pp. 287–294, Dec. 2012, doi: 10.1016/j.tacc.2012.09.004.

[53] Sira Medical, ‘Sira Medical’, 2020. https://siramedical.com/ (accessed Mar. 11, 2020).

[54] usability.gov, ‘User-Centered Design Basics’, Apr. 03, 2017. /what-and-why/user-centered-design.html (accessed Mar. 19, 2020).

[55] R. A. Krueger, Focus Groups: A Practical Guide for Applied Research. SAGE Publications, 2014.

[56] C. Kamp, ‘Focus Group Interview’, Feb. 25, 2020.

[57] J. T. Hackos and J. C. Redish, User and Task Analysis for Interface Design, 1st edition. New York: John Wiley & Sons, Inc., 1998.

[58] G. Mori, F. Paterno, and C. Santoro, ‘CTTE: support for developing and analyzing task models for interactive system design’, IEEE Trans. Software Eng., vol. 28, no. 8, pp. 797–813, Aug. 2002, doi: 10.1109/TSE.2002.1027801.


[59] P. R. Anish, B. Balasubramaniam, J. Cleland-Huang, R. Wieringa, M. Daneva, and S. Ghaisas, ‘Identifying Architecturally Significant Functional Requirements’, in 2015 IEEE/ACM 5th International Workshop on the Twin Peaks of Requirements and Architecture, May 2015, pp. 3–8, doi: 10.1109/TwinPeaks.2015.9.

[60] M. Zimmer-Brossy, Lehrbuch der röntgendiagnostischen Einstelltechnik: Begründet von Marianne Zimmer-Brossy. Springer-Verlag, 2013.

[61] R. Molich and J. Nielsen, ‘Improving a human-computer dialogue’, Commun. ACM, vol. 33, no. 3, pp. 338–348, Mar. 1990, doi: 10.1145/77481.77486.

[62] C. M. Barnum, Usability Testing Essentials: Ready, Set...Test! Elsevier, 2010.

[63] A. Bogner and W. Menz, ‘Das theoriegenerierende Experteninterview’, in Das Experteninterview: Theorie, Methode, Anwendung, A. Bogner, B. Littig, and W. Menz, Eds. Wiesbaden: VS Verlag für Sozialwissenschaften, 2002, pp. 33–70.

[64] U. Flick, An Introduction to Qualitative Research, Sixth edition. Thousand Oaks, CA: SAGE Publications Ltd, 2019.

[65] M. Meuser and U. Nagel, ‘Das Experteninterview — konzeptionelle Grundlagen und methodische Anlage’, in Methoden der vergleichenden Politik- und Sozialwissenschaft: Neue Entwicklungen und Anwendungen, S. Pickel, G. Pickel, H.-J. Lauth, and D. Jahn, Eds. Wiesbaden: VS Verlag für Sozialwissenschaften, 2009.


List of Figures

Figure 1: Frontal and lateral wrist x-ray in two typical projections for a musculoskeletal radiographic image. When acquiring images of the wrist, two projections, angled 90° to one another, are necessary [17].

Figure 2: Sagittal, coronal and transversal plane, illustrated on a human standing in the anatomical position [20].

Figure 3: Bloom gesture to open the main menu.

Figure 4: Air Tap gesture to select items.

Figure 5: The impact of theory-practice gaps on students [38].

Figure 6: Percentage of correct responses pre- and post-test [34].

Figure 7: Pre-simulation test results [36].

Figure 8: Post-simulation test results [36].

Figure 9: User evaluation of the AR CPR system [50].

Figure 10: Illustration of a typical user centred design cycle. After identifying the user needs and the context of use, the user requirements are specified. Following development and design of the product, users evaluate the finished product. The cycle can be applied iteratively. Image courtesy of [54].

Figure 11: Hierarchical task diagram.

Figure 12: Cleaned 3D model of an arm in position for an x-ray of the os scaphoideum.

Figure 13: Remeshing and material optimization settings of the 3D scanned models in Simplygon.

Figure 14: Original 3D scanned model (LOD0) and simplified 3D model (LOD1) in Simplygon.

Figure 15: Menu to add the MRTK to the Unity scene.

Figure 16: Default game objects added by the MRTK.

Figure 17: MRTK configuration window.

Figure 18: Compact and expanded menu of ARTUR in the HoloLens emulator.

Figure 19: Hologram of a body part ready to be positioned by the user.

Figure 20: Bounding box to resize and rotate the hologram.

Figure 21: Light cookies for a square and a rectangular collimation.

Figure 22: Central beam and collimation for a hands x-ray.


List of Tables

Table 1: Clinical and technical examination parameters for patient positioning and equipment adjustment. Example parameters for a radio-ulnar os scaphoideum radiography [11, p. 84].

Table 2: Range of raw data quality of handheld scanners, displaying the range of 3D surface sampling density, 3D point accuracy, the working distance and typical data acquisition speeds for current handheld 3D scanners [29].

Table 3: Student perception of the usability of the Shaderware software (n = 82) [31].

Table 4: Participants of the focus group interview.

Table 5: Software used in the development process.

Table 6: Semi-structured expert interview participants.

Table 7: Scale of severity ratings for usability problems by Nielsen [8].

Table 8: Severity ratings of the identified issues.


Appendix

A. Declaration of Consent

Title of the Thesis:

ARTUR – Augmented Reality Training Utility for Radiography - Design,

development and usability testing of an educational tool for radiographic patient

positioning

Author and Institution:

Christoph Kamp – University of Applied Sciences St. Pölten

The interview will be recorded and transcribed by the author. Any data which would

allow identification of the interviewed person will be changed or removed from the

transcript (pseudonymized). To give readers context, only parts of the interview will

be cited in the thesis.

Personal data will be saved independently of the interview data and deleted after finishing the thesis.

Participation in the interviews is voluntary. You can cancel the interview at any time and revoke your consent to the recording and the transcript of the interview, without any disadvantages arising.

With your signature you agree to participate in the interview and to the use of the collected data for a master thesis. All data will be recorded, saved and processed.

The data will not be used for other or commercial projects.

___________________________________

First-, Last name

___________________________________

Location, Date, Signature


B. Focus Group Interview Guideline

1. Introduction

2. Explanation of AR, the HoloLens and their capabilities

3. Use of a questioning route

4. Tell us your names and for how long you have been working in

radiographic imaging.

5. Think back on how you trained radiographic positioning: what were your impressions of the teaching methods?

6. How do you think AR could improve the training of radiographic

positioning?

7. How would you like to receive assistance in radiographic positioning

from an AR device?

8. Which additional visual aids, other than the position of the body part

would be useful?

9. How important is the visual quality (realism) of the visual aid provided

by the AR Headset?

10. How would you want to interact with the holograms?

11. How would you like to receive feedback for your inputs?

12. How could real life radiographic equipment be included in the AR

training sessions?

13. Is there anything we didn’t talk about, which you want to add?


C. Abridged Transcript - Focus Group Interview

Q: Tell us your names and for how long you have been working in 1

radiographic imaging. 2

Participant 1 (P1): Radiographer since 2012 / Lecturer since 2017 (8 and 3 years) 3

Participant 2 (P2): Radiographer since 1999 / Lecturer since 2007 (21 and 13 4

years) 5

Participant 3 (P3): Radiographer since 2012 / Lecturer since 2017 (8 and 3 years) 6

Participant 4 (P4): Radiographer since 2011 / Lecturer since 2016 (9 and 4 years) 7

Q: Think back on how you trained radiographic positioning, what were your 8

impression on the teaching methods? 9

P1 and P3: Theoretical input via lectures, supported by PowerPoint presentations. Practical role-play exercises in a laboratory setting, in groups of 10 to 15 students with a trainer on a radiography system. The laboratory was only accessible with a trainer during planned training lessons.

P4: Theoretical input via lectures, supported by textbooks. Practical role-play and phantom exercises in a laboratory setting, in groups of five people with a trainer on a radiography system. The laboratory was only accessible with a trainer during planned training lessons.

P2: Theoretical input via lectures, supported by textbooks. Practical role-play and phantom exercises in a laboratory setting, in groups of five people with a trainer on a radiography system. Additionally, open access to a radiography system for self-study with peers.

Q: Which would be the optimal setting for practical training of radiographic positioning?

P3: Role-play exercises in a laboratory setting, with additional open access to radiography systems for groups of three to five people to enable self-study.

P4: Five students are the maximum; including a trainer, the space in a laboratory is already limited.

P1: Six students could be given different exercises. Two could prepare for the exercise, two could do the exercise and two could evaluate the results of the exercise.

P4: This approach is only feasible if the laboratory is spacious enough to enable undisturbed training for all groups.

P1: For university programmes with large numbers of students, splitting students into groups of five will need additional resources, like trainers and laboratories.

P2: To learn all parameters of a radiographic procedure, practice and repetition are important. To allow for repetition, open access to a radiography system, which allows students to exercise according to their own schedule, would be optimal.


Q: How do you think AR could improve the training of radiographic positioning?

P1: New and innovative tools promote students' interest and engagement.

P4: New and innovative tools increase students' motivation to learn. Practical exercises certainly have the highest learning effect; AR is the next best method to practice, and in a more resource-saving way. With targeted preparation using AR tools, real role-play exercises can be carried out more efficiently.

P1: AR could be used in a flipped-classroom setting. Students could practice according to their own schedule and without supervision, which would have them better prepared for the supervised role-play exercises in the laboratory.

P3: The initial purchase cost for enough hardware must be considered.

P4: Possible wear and tear of mishandled rental hardware could also be an issue. Additionally, training all students on using the AR hardware needs additional resources. Maybe a two-dimensional browser-based variant of the tool could be realized, which would enable real time- and space-independent learning.

P2: AR tools guarantee reproducible exercise conditions.

P4: AR allows students to exercise without any pressure and without risks for patients, which enables them to practice at their own comfortable speed.

P2: The HoloLens is, compared to radiography systems or dedicated simulators like the VERT system, rather cheap to purchase.

P4: With ongoing technical progress, AR hardware will get cheaper and more accessible.

P2: AR applications could track the students' learning progress and test their knowledge with quizzes.

P4: The educational content could be adapted to the students' learning progress and give access to new content only after completion of certain topics.

P4: In addition to visual data, speech output could also be used to convey information, ideally only if activated by the user.

P3: Speech input could be utilized to trigger audio explanations.

P1: All speech output should additionally be visualized in text form.

P4: A speech interface with speech output could support people with disabilities in using the tool.

P3: Communicating information in audio and text format can be beneficial for different learning types.

P2: Localizing the application to different languages would enlarge the pool of possible users.

Q: How would you like to receive assistance in radiographic positioning from an AR device?

Participants felt that the question was answered in question four.


Q: Which additional visual aids, other than the position of the body part, would be useful?

P4: Central beam, collimation, detector size, exposure parameters, the resulting x-ray image.

P1: Angle and position of the x-ray tube; whether the test patient's body part is positioned correctly.

P3: Information about the detector size, if the detector is placed inside the x-ray table and therefore not visible. Necessary radiation protection measures.

P2: Holograms of patients with pathologies or traumas, which cannot be trained in a laboratory role-play exercise.

P3: Information about patient preparation: whether jewelry, items of clothing or dental prostheses must be removed. Maybe in text or audio format before the exercise is started.

P4: The application should also not be overloaded with features.

Q: How important is the visual quality (realism) of the visual aid provided by the AR headset?

P4: Very important; distinctive anatomical structures must be visible. Too simplistic models without a texture could lessen the immersion and the engagement of the students.

P3: Central beam and collimation are determined using anatomical structures, so it is important to recognize these structures on the holograms.

P2: If holograms are too complex and have too many details, they could distract from the main goals of the exercise. Models featuring pathologies or traumas, or patients of different ethnicities, could increase the learning effect.

Q: How would you want to interact with the holograms?

P4: Holograms should be positioned in the room with hand gestures, similar to smartphone interaction.

P2: Speech input would be convenient. Using hand gestures could add immersion and increase the learning effect.

P1: Interaction with holograms via smartphone-like hand gestures would ease the handling.

P3: Complex interaction with holograms using speech input could be difficult to realise and use. Changes of a few millimetres or degrees could be hard to realise utilizing speech input. Internalizing movements while learning could also increase the learning effect.

Q: How would you like to receive feedback for your inputs?

P4: Haptic feedback through a vibrating AR headset.

P3: Through optical feedback; constant audio feedback could be irritating.

P4: Specific audio feedback for certain interactions can make interaction with holograms easier.


P1: Too much audio feedback can get annoying; audio feedback should be optional.

P3: Users should be able to enable or disable audio feedback.

Q: How could real-life radiographic equipment be included in the AR training sessions?

P1: Radiographic equipment enables practical role-play training. An AR application does not add a lot of value if a radiography system is available. AR can be used in preparation for real-life exercises or as an additional offer.

P4: AR in addition to a real radiography system could be too much information at the same time. A field of view with real equipment, role-play partners and holograms visible could overwhelm the users. The light beam collimator could also affect the projection of the holograms.

P2: A possible application of AR in combination with a radiography system could be support for patient positioning in a clinical setting.

P3: AR holograms could support the positioning of patients for projections where the position of the detector is not visible to the user.

Q: Is there anything we didn’t talk about, which you want to add?

P1: To justify the cost of purchase for an AR system, the possibility of using the technology for other lectures should be explored.


D. Testing Guideline for the Test Users

Dear test user,

Thank you for participating in the usability testing of the Augmented Reality Training Utility for Radiography (ARTUR) as part of my master thesis at the University of Applied Sciences St. Pölten. ARTUR is a Microsoft HoloLens augmented reality (AR) application which supports the training of patient positioning for radiographic imaging. It places life-size holograms of body parts, positioned as required for radiographic imaging, in the user's field of view, so that the holograms can be used as a reference for the position of a test patient. Users can match the position of their training partner to the hologram. In addition, the recommended position of the central beam, indicated by a crosshair, and the size of the radiation beam can be displayed on the hologram.

Following the testing, an interview will be conducted; it will be recorded and transcribed. Participation in this test is voluntary, all personal data will be pseudonymised and the test can be cancelled by the participant at any time.

Testing instructions

After familiarizing yourself with the Microsoft HoloLens and its interface, please

carry out the following instructions:

a) Open the HoloLens Main Menu with the “Bloom” gesture.

b) Choose the app “ARTUR” from the app menu.

In the Application:

1. Choose a body part of your liking in the menu.

2. Place the menu anywhere in your space, where it does not interfere with

your training session.

3. Place the hologram of the body part on a table next to your test patient.

4. Rotate the hologram, so you can position your test patient comfortably into

the hologram.

5. If needed, resize the hologram to fit your test patient’s anatomy.

6. Position your patient according to the position of the hologram.

7. Enable the hologram’s light beam localizer to check where it should be

positioned.

8. Repeat the same procedure with a second body part.


E. Expert-Interview Guideline

1. Tell me your name and for how long you have been working as a

radiographer?

2. What impression did the HoloLens make on you?

a. wearing comfort/design

b. limits on time of use?

3. How do you rate the visual quality of the HoloLens application?

a. Resolution

b. Image Quality

c. FOV

d. Hologram stability

4. Do you feel that you successfully completed all the tasks on the task

sheet?

5. What is your opinion on the application’s menu?

a. Was it disruptive or distracting?

b. logical structure?

c. Did you find the options you were looking for?

d. How was the organization of the menu items?

e. Could you easily undo or redo any action if you needed to?

f. Were all menu items you expected present?

6. How do you feel about the methods of interaction with the holograms?

a. ease of use/intuitive (rotate resize box)

b. what did work? what didn’t?

c. wish for additional input methods?

7. Did the system provide enough feedback on your inputs?

a. Did you get confirmation your inputs were accepted?

8. Did you have any issues with the application? Did you encounter any

errors or crashes?

9. Compared to other applications on the HoloLens, how easy was this

application to use?

10. How useful was the hologram in helping to position your test patient’s

body part?

a. Was the size adjustment useful?

b. Was the transparency change of use?

c. Was it easy to place the patient in the hologram?

d. Was the hologram stable enough to reliably assist in the

positioning?

11. Did the virtual light beam localizer help with the positioning of the real-life

localizer?

a. visibility?

12. Do you think AR applications can benefit the education of radiographers?

a. If yes, how?

13. Do you have any further comments or remarks?


F. German Transcripts of the Semi-Structured Expert Interviews

a. German Interview with Participant One

Q: Wie lange hast du Erfahrung als Radiologietechnologin?

A: Ich habe sechs Jahre in dem Bereich gearbeitet.

Q: Welchen Eindruck hat die HoloLens in Bezug auf Tragekomfort auf dich gemacht?

A: Das war meine erste Erfahrung mit der HoloLens. Zunächst ist es ungewohnt; wenn man sich daran gewöhnt hat, ist es kein Problem, damit zu interagieren.

Q: Wie sieht es mit der Nutzungszeit aus? Fühlt es sich so an, als wäre es nach einiger Zeit anstrengend, mit der HoloLens zu arbeiten?

A: Anstrengend war es nicht. Man hat nach kurzer Zeit verstanden, wie es funktioniert, und dann geht es ganz gut.

Q: Wie würdest du die visuelle Qualität der Applikation beurteilen?

A: Ich finde sie optisch sehr übersichtlich. Der Menüaufbau ist einfach und man kennt sich sofort aus. Es ist sehr userfreundlich, man braucht nicht lange, damit man [in die Bedienung] reinkommt. Die Qualität der 3D-Scans ist hoch, man erkennt Details.

Q: Wie war der Eindruck des Field of View der HoloLens? War der zu sehende Bildausschnitt ausreichend? Wurden Objekte abgeschnitten?

A: Es kann sein, dass sich das mit zunehmender Zeit verbessert, aber am Anfang hatte ich das Gefühl, dass ich nicht alles in mein Sichtfeld hineinbekomme. Vielleicht ist das auch ein userspezifisches Problem, das geringer wird, wenn man es öfter verwendet. Wenn man die Größen angepasst hat, gewusst hat, wie man das Menü fixieren kann, und sich Hologramme nicht mehr selbst übereinanderlegen, funktioniert es sehr gut.

Q: Wie war die Stabilität der Hologramme? Waren diese stabil genug in der echten Welt, um jemanden hineinzulegen? Haben diese gewackelt oder gezittert?

A: Nein, gewackelt oder gezittert hat nichts. Optimal wäre, glaube ich, ein höherer Tisch, um sich nicht bücken zu müssen. Dabei verrutscht das Field of View vielleicht wieder. Aber stabil war es auf jeden Fall.

Q: Hast du das Gefühl, dass du alle Tasks auf dem User-Testsheet erfolgreich durchgeführt hast?

A: Auf jeden Fall, hat super funktioniert.


Q: Was ist deine Meinung zum Menü der Applikation? War es störend oder logisch aufgebaut?

A: Das Menü ist sehr logisch aufgebaut, ich kann es aufklappen oder zuklappen, wenn ich es gerade nicht brauche. Ich kann es festpinnen, was sehr wichtig ist, weil ich es sonst dauernd im Sichtfeld habe. Es ist übersichtlich und intuitiv gestaltet.

Q: Hast du im Menü gefunden, wonach du gesucht hast?

A: Ja.

Q: War es einfach, durchgeführte Aktionen zu wiederholen oder rückgängig zu machen?

A: Wenn ich zum Beispiel die Audioanweisung wiederholen möchte? Nein, hier gab es keine Probleme, hat super funktioniert.

Q: Waren alle erwarteten Menüpunkte vorhanden oder hast du etwas vermisst, das so eine Applikation deiner Meinung nach benötigt?

A: Als Basis finde ich es sehr gut. Die Informationen, die man braucht, sind alle vorhanden. Was man in Zukunft machen könnte, wäre, einen Gamingfaktor hineinzubringen. Dass man sich zum Beispiel messen kann oder ein Matching zwischen der echten Person und dem Hologramm, wie weit diese voneinander abweichen.

Q: Wie ist deine Meinung zur Interaktion mit dem Hologramm selbst? Ist das Platzieren von Hologrammen in gewünschter Position und am gewünschten Ort intuitiv oder schwierig?

A: Die Lernkurve geht hier relativ schnell, anfangs ist man eventuell etwas unbeholfen. Nach ein paar Versuchen beherrscht man es aber ziemlich schnell.

Q: Wäre eine kurze Schulung für neue User hilfreich?

A: Ja, eine Schulung oder kurze Einführung wäre nicht schlecht, damit man weiß, wie man damit interagiert. Zum Beispiel die Sache mit dem Punkt [Gaze Cursor]: Dass man auf den achten muss und nicht auf die Position der Hand, muss man wissen. Das muss einem jemand sagen.

Q: Die Interaktion mit der HoloLens ist nicht sehr intuitiv?

A: Man würde es mit der Zeit schon herausfinden, aber von allein wäre ich nicht gleich darauf gekommen, wie die Interaktion mit dem Punkt funktioniert.

Q: Was hat bei der Manipulation der Hologramme funktioniert wie erwartet? Was nicht?

A: Ich habe anfangs angenommen, dass die Eckpunkte zum Drehen des Hologramms waren. Ich habe erwartet, dass ich an den Eckpunkten drehen und vergrößern kann, was, glaube ich, technisch nicht möglich ist.

Q: Gibt es Vorschläge, die Interaktion intuitiver zu gestalten?

A: Es wäre super, wenn man Hologramme einfach angreifen und größer oder kleiner machen kann. Hier weiß ich nicht, ob das technisch möglich ist.


Q: Hast du Wünsche für zusätzliche Inputmethoden?

A: Eine Sprachsteuerung zum Vergrößern von Hologrammen wäre wünschenswert. Positionieren mit Sprache ist wahrscheinlich schwierig, aber Vergrößern, Verkleinern und Drehen wäre gut.

Q: Hat das System genug Feedback zu deinen Eingaben gegeben?

A: Ja, es gibt immer einen Klick, wenn man Buttons getroffen hat. Beim Drehen von Hologrammen hat es nicht geklickt, aber hier hat man sofort ein optisches Feedback, ob es funktioniert.

Q: Waren optisches und hörbares Feedback ausreichend vorhanden?

A: War beides vorhanden und zufriedenstellend.

Q: Gab es ausreichend Bestätigung, dass deine Angaben vom System akzeptiert wurden?

A: Ja.

Q: Gab es Probleme mit der Applikation beim Test? Fehlermeldungen oder Abstürze?

A: Gar nichts, keine Abstürze und keine Fehlermeldungen.

Q: Gab es Schwierigkeiten mit der Interaktion mit dem Menü oder den Hologrammen? Ist es zu Problemen mit übereinanderliegenden Hologrammen oder in der Interaktion mit der echten Welt gekommen?

A: Orientierung in der echten Welt war möglich, ich bin nirgendwo dagegen gerannt.

Q: Haben sich die Hologramme so verhalten, wie du es erwartet hast? Gab es Probleme beim Platzieren der Hologramme auf dem Aufnahmetisch?

A: Nein, beim Platzieren gab es keine Probleme. Mit der Helligkeit und der Transparenz vielleicht. Dass es [Interaktion mit Testpatient*innen] bei künstlichem Licht nicht so gut funktioniert hat wie bei Tageslicht. Aber bei der Interaktion mit den Hologrammen hat es keine Probleme gegeben.

Q: Wie war die Bedienung der Applikation im Vergleich zu anderen HoloLens-Applikationen?

A: Es tut mir leid, ich kenne keine anderen HoloLens-Applikationen.

Q: Wie nützlich war das Hologramm dabei, deinen Testpatienten zu positionieren?

A: Es kommt ein bisschen darauf an, was für ein Lerntyp man ist. Ich tue mir leicht, wenn ich etwas sehe und solche Sachen nachmachen kann. Für mich ist es eine große Bereicherung. Ich kann mir einerseits vorher anschauen, wie das aussehen soll und was ich alles dafür brauche, und kann mich darauf vorbereiten. Und habe dann aber auch bei der Positionierung vom Patienten Anhaltspunkte, in welche Richtung ich zum Beispiel die Hand aufdrehen soll oder auf welcher Höhe ich die Keile hineinlegen soll.


Q: War die Größenanpassung nützlich oder würden Hologramme in einer Größe ausreichen?

A: Nein, die Größenanpassung war schon nützlich. Ich habe das Gefühl, dass es schon einen Unterschied macht, wie weit ich vom Patienten entfernt bin, und daher muss ich es mir ein bisschen größer oder kleiner richten.

Q: War es einfach oder schwierig, die Patienten in das Hologramm zu platzieren?

A: Das ist vermutlich auch Übungssache und hängt davon ab, wie geschickt man ist, sich einerseits darauf zu konzentrieren, was man gerade sieht, und auf der anderen Seite mit seinem Patienten zu interagieren. Das braucht etwas Übung, macht aber auf jeden Fall Spaß.

Q: War das Hologramm transparent genug? War die Helligkeitseinstellung der HoloLens nützlich?

A: Das ist ein bisschen von den Lichtverhältnissen abhängig. Das wäre ein geschicktes Feature einer Sprachsteuerung: Wenn ich die Hände schon am Patienten habe und die Position nachkontrollieren möchte, kann ich es per Sprachsteuerung mehr oder weniger transparent stellen.

Q: War das Hologramm stabil genug, um verlässlich beim Positionieren zu assistieren?

A: Die Hologramme waren stabil, man kann sich schon selbst bewegen und es bleibt sozusagen an der Position liegen. Ich kann theoretisch einmal rund herum gehen und das Hologramm bewegt sich nicht.

Q: Hilft der virtuelle Zentralstrahl bei der Positionierung des echten Zentralstrahls?

A: Ich glaube schon, hier ist es wieder wichtig, die Transparenz umstellen zu können, damit ich das genauer sehe. Es hilft mir schon sehr, wenn ich am Hologramm sehe, wo ich den echten Zentralstrahl einstellen kann, und dann die Position kontrollieren kann.

Q: Können AR-Applikationen in der Ausbildung von Radiologietechnolog*innen hilfreich sein?

A: Ich glaube auf jeden Fall, dass sie hilfreich sein können. Es gibt wenige Praktikumsstellen oder Plätze, wo man genug Zeit verbringen kann, um seltene Spezialeinstellungen üben zu können, weil sie vielleicht auch nicht oft vorkommen. Dafür ist es sicher super. Und zum selbstständigen Üben ist AR sicher auch ein Vorteil; nicht immer ist jemand verfügbar, der betreut und zuschaut, ob man alles richtig macht, ein bisschen als Selbstkontrolle. Momentan ist es eine neue Technologie, alle sind gespannt darauf und wollen es ausprobieren. Diese Neugier zu nutzen, um Sachen zu lernen, ist sicher ein großer Vorteil.

Q: Gibt es noch Anmerkungen oder Kommentare zu Themen, die wir nicht besprochen haben?

A: Nein.


b. German Interview with Participant Two

Q: Wie lange hast du Erfahrung als Radiologietechnologe?

A: Ich habe 5 Jahre klinische Erfahrung und 4 Jahre Erfahrung in der Ausbildung von Radiologietechnolog*innen.

Q: Welchen Eindruck hat die HoloLens in Bezug auf Tragekomfort auf dich gemacht?

A: Als Brillenträger fand ich die HoloLens immer etwas unangenehm, darum würde mich der Tragekomfort der HoloLens 2 interessieren. Nur vom Tragekomfort und aufgrund der kurzen Tragezeit finde ich sie angenehm zu tragen. Je nach geplanter Nutzung sollte man sich ansehen, ob es bei Tragezeiten von 30 bis 45 Minuten Einbußen beim Tragekomfort gibt. Bei Leihgeräten ist auch der Verschleiß des Geräts und der damit abbauende Tragekomfort zu beachten. Das größte Limit in der Anwendung bei der HoloLens 1 ist aber sicher das Field of View (FOV).

Q: Wie würdest du die visuelle Qualität der Applikation beurteilen?

A: Die visuelle Qualität steht und fällt mit den aufgenommenen Daten: Je höher die Qualität der Scans, desto höher die Qualität der Hologramme. Die Bildqualität ist sehr gut, das FOV ist dagegen ein Skandal. Wenn man sich zum Beispiel mit Kollegen kurz unterhalten möchte, den Blick hebt und nur noch das halbe Hologramm sieht, ist das nervig. Das FOV ist auf jeden Fall zu klein. Ein größeres FOV wird dringend benötigt, was die HoloLens 2 hoffentlich bieten wird. Die Auflösung und Bildqualität sind jedoch auf jeden Fall ausreichend. Die Stabilität der Hologramme ist auch ausreichend gut; was mich jedoch stört, ist, wenn ein Hologramm in der Umgebung fixiert wird und man den Raum verlässt. Wenn ich ein Hologramm in einem Raum abgelegt habe und den Raum verlassen habe, war das Hologramm selbst nach einem Neustart der Applikation im Nebenraum. Das ist aber schwer zu vermeiden, da das Hologramm ja im Raum fixiert sein soll.

Q: Hast du das Gefühl, dass du alle Tasks auf dem User-Testsheet erfolgreich durchgeführt hast?

A: Ja, ich konnte alle Tasks durchführen. Die Tasks konnten ohne Instruktionen durchgeführt werden. Die Applikation war selbsterklärend.


Q: Was ist deine Meinung zum Menü der Applikation? War es störend oder logisch aufgebaut?

A: Das Menü ist übersichtlich, hier haben wir nur wieder das Problem mit dem kleinen FOV. Wenn ich den Blick hebe und die Hologramme nicht mehr ganz im Blickfeld habe, sehe ich eventuell nicht direkt, welche Auswirkung meine Aktion hat. Für die Audioausgabe ist das nicht so tragisch; wenn ich nach oben schaue, um den Zentralstrahl einzublenden, und dann nach unten schaue, um ihn zu sehen, kann es zu Problemen kommen. Idealerweise wäre jede Aktion gleich im Blickfeld. Sonst ist das Menü gut aufgebaut, klar und verständlich. Durch den Input der Fokusgruppe sind alle gewünschten Optionen vorhanden. Die Funktionen der HoloLens durch das neue Entwickler-Toolkit sind beeindruckend. Das Audiofeedback der Buttons ist wichtig, um zu sehen, ob die Inputs angenommen wurden.

Q: War es einfach, durchgeführte Aktionen zu wiederholen oder rückgängig zu machen?

A: Ich musste nichts rückgängig machen. Es war nicht nötig.

Q: Waren alle erwarteten Menüpunkte vorhanden oder hast du etwas vermisst, das so eine Applikation deiner Meinung nach benötigt?

A: Ja.

Q: Wie ist deine Meinung zur Interaktion mit dem Hologramm selbst? Ist das Platzieren von Hologrammen in gewünschter Position und am gewünschten Ort intuitiv oder schwierig?

A: Da ich schon Erfahrung im Umgang mit der HoloLens habe, denke ich, dass mir das etwas leichter fällt als neuen Usern. Die Anwendung ist per se nicht schwer, aber das Erlernen der Gesten ist wichtig. Das Vergrößern oder Verkleinern von Hologrammen an den Eckpunkten ist eigentlich selbsterklärend, aber für User, die noch nicht viel mit der HoloLens gearbeitet haben, ist der Umgang mit diesen Gesten schwierig. Da könnte es auch zu Problemen mit der Interaktion der Hologramme kommen. Für mich war es einfach handzuhaben und die Hologramme haben sich wie erwartet verhalten.

Q: Wäre eine kurze Schulung für neue User hilfreich?

A: Eine grundsätzliche HoloLens-Einschulung muss durchgeführt werden, eventuell anhand eines Lehrvideos, das die HoloLens vorstellt. Ich glaube nicht, dass jeder Studierende die HoloLens aufsetzen muss, um die Bedienung zu üben, das wäre umständlich.


Q: Gibt es Vorschläge, die Interaktion intuitiver zu gestalten?

A: Interessant wäre, anstatt Vergrößern oder Verkleinern an den Ecken der Box, ein Größer- und Kleinerziehen des Hologramms, analog zur Smartphone-Bedienung, durch direkte Interaktion mit dem Hologramm selbst. Aber eine Manipulation von Objekten kennt man ja aus anderen Programmen wie PowerPoint. Wenn ich dort ein Objekt einfüge, weiß ich sofort, dass die Ecken zum Vergrößern, die Mitten zum Verschieben und ein drittes Feld zum Rotieren sind. Intuitiver oder einfacher wäre es, wenn ich ein Hologramm einfach per Smartphone-ähnlichen Gesten manipulieren kann.

Q: Hat das System genug Feedback zu deinen Eingaben gegeben?

A: Ein visuelles Feedback, durch Farbänderung der Ecken der Boundingbox, wäre gut. Wenn die gewählte Ecke zum Beispiel von Blau auf Orange wechselt.

Q: Wäre Audiofeedback zur Manipulation der Hologramme wünschenswert?

A: Ich bin kein Fan von zu viel Audiofeedback. Im Menü finde ich das Audiofeedback gut, als Bestätigung, ob Optionen gewählt wurden. Durch die häufige Manipulation der Hologramme wäre Audiofeedback hier störend. Ein bisschen heller werden die gewählten Ecken, aber das ist eventuell zu wenig und zu subtil für neue User.

Q: Gab es Probleme mit der Applikation beim Test? Fehlermeldungen oder Abstürze?

A: Nein.

Q: Wie war dein Eindruck der Applikation im Vergleich zu anderen HoloLens-Applikationen?

A: Ich weiß nicht, ob das an der Applikation selbst liegt oder ob sich HoloLens-Applikationen generell weiterentwickelt haben. Meine letzte Anwendung einer HoloLens-Applikation liegt zwei Jahre zurück. Die Anwendung ist im Vergleich dazu angenehmer zu bedienen, übersichtlicher und aufgeräumter.

Q: Wie nützlich war das Hologramm dabei, deinen Testpatienten zu positionieren?

A: Die Größenanpassung ist hilfreich, um Details zu betrachten: Wo liegen anatomische Strukturen oder Ähnliches. Auch zur Positionierung des Zentralstrahls ist die visuelle Hilfe des Hologramms hilfreich.

Q: War das Hologramm transparent genug? War die Helligkeitseinstellung der HoloLens nützlich?

A: Man muss aufpassen, in einem nicht zu dunklen oder zu hellen Raum zu arbeiten. Eine zu helle Umgebung mit viel Sonnenlicht könnte störend sein. Ich denke, halbdunkle Räume, wie es in Röntgenaufnahmeräumen häufig ist, können zu guten Ergebnissen führen. Eine Regelung der Transparenz der Hologramme in der Applikation habe ich nicht vermisst.


Q: War es einfach oder schwierig, die Patienten in das Hologramm zu platzieren?

A: Anfangs war es aufgrund zu hoher Helligkeit im Raum etwas schwierig. Grundsätzlich ist das Platzieren von Testpersonen im Hologramm durch die Möglichkeit, Hologramme auf Oberflächen zu platzieren, sehr gut möglich. Das Hologramm war stabil genug, um verlässlich bei der Positionierung zu assistieren.

Q: Hilft der virtuelle Zentralstrahl bei der Positionierung des echten Zentralstrahls?

A: Ja, die Möglichkeit, die Position des Zentralstrahls vorab zu kontrollieren, ist hilfreich. Ich würde den Nutzen aber eher in Vorbereitungsübungen zur tatsächlichen Positionierung sehen. Ein Einblenden von möglichen falschen Positionierungen wäre interessant.

Q: Können AR-Applikationen in der Ausbildung von Radiologietechnolog*innen hilfreich sein?

A: Ja. Abläufe können genau erklärt werden. Der Aufbau von Maschinen kann gezeigt werden. Theoretisch können Patienten- und Patientinnengespräche per AR simuliert werden. AR ist für das Vermitteln sämtlicher Lehrinhalte geeignet. Gerade für Patienteninteraktion, wie im Röntgen, hat AR Potential. Auch die Funktion von nuklearmedizinischen Untersuchungen könnte anhand von CT-Datensätzen visualisiert werden. In der Strahlentherapie könnte auch die Patientenpositionierung am Linearbeschleuniger geübt werden. Neben der praktischen Übung kann AR auch zum Visualisieren von Vorgängen genutzt werden, zum Beispiel die Positionierung von Magnetspulen im MR.

Q: Gibt es noch Anmerkungen oder Kommentare zu Themen, die wir nicht besprochen haben?

A: Einfügen von Fehlpositionen des Zentralstrahls durch zusätzliche farblich markierte Feldlichter.


c. German Interview with Participant Three

Q: Was ist dein fachlicher Hintergrund und wie lange hast du Erfahrung mit Augmented Reality (AR)?

A: Ich bin Entwickler für AR- und VR-Applikationen. Ich habe Erfahrung mit verschiedenen Plattformen, wie der HoloLens, Magic Leap, ARCore und ARKit für mobile Geräte. Auch das DAQRI-Gerät, welches nicht mehr vertrieben wird, habe ich ausprobiert. Google Glass habe ich auch getestet; Google Glass ist eigentlich nur ein Head-up-Display mit Kamera und kann nicht mit der Umgebung interagieren. Ich habe Erfahrung mit diversen VR-Headsets und besuche Konferenzen im AR/VR-Bereich. Ich würde sagen, ich habe 3 bis 4 Jahre Erfahrung im AR-Bereich.

Q: Hast du das Gefühl, dass du alle Tasks auf dem User-Testsheet erfolgreich durchgeführt hast?

A: Ich konnte die Tasks dank meiner Erfahrung mit der HoloLens erfolgreich durchführen. Mir ist aufgefallen, dass auf der Anleitung die Gesten zur Bedienung der HoloLens (Bloom, Air Tap) angegeben sind. Jemand, der die HoloLens zum ersten Mal auf hat, weiß nicht, was die Bloom-Geste ist oder wie Air Tap funktioniert. Vor Benutzung der App wäre es gut, neuen Nutzern die Bedienung der HoloLens zu erklären. Ich nehme an, dass Erstanwender ohne diese Einführung überfordert wären. Neue User sollten auch auf das FOV hingewiesen werden: dass sich das Sichtfeld nur in der Mitte befindet und sie an den Bildrändern keine Projektion erwarten können.

Q: Wie würdest du die visuelle Qualität der Applikation beurteilen? Waren alle Optionen und Informationen, die du erwartet hast, vorhanden?

A: Ja, grundsätzlich waren die in der Testing-Guideline geforderten Aktionen durchführbar. Bei einigen UI-Elementen war nicht klar, ob es Buttons oder nur Beschriftungen sind. Das fällt einem je nach Erfahrungslevel wahrscheinlich mehr oder weniger auf.

Q: Sollten interaktive und nicht interaktive UI Elemente visuell besser 28

voneinander abgegrenzt sein? 29

A: Ja, die Überschrift der Menüs war optisch fast gleich wie die interaktiven 30

Buttons. Hier besteht die Annahme, dass dieses Element möglicherweise auch 31

auswählbar ist. 32

Weiters ist mir aufgefallen, dass die Collider der 3D Modelle größer sind als das 33

tatsächliche Modell. Dadurch kann es passieren, dass man einen sichtbaren 34

Menüpunkt hinter dem Modell auswählen möchte, dieser aber vom box collider 35

des Modells verdeckt ist und dadurch nicht ausgewählt werden kann. Das liegt 36

daran, dass die collider boxen an die bounding boxen, die das Skalieren und 37

Rotieren der Modelle erlauben, angepasst sind. 38

Q: Hättest du Vorschläge für eine bessere Gestaltung der collider boxen? 39

A: Capsule collider statt box collider wären eine Option. Ideal wären mesh collider, 40

diese benötigen allerdings höhere Rechenperformance. Es stellt sich auch die 41

Frage, ob dann nicht die definierten Eckpunkte der collider boxen fehlen. Hier wäre 42

eine Manipulation mit beiden Händen anstatt mit der bounding box nötig. Das ist 43

wieder ein Lernfaktor und muss von der Hardware unterstützt werden. Eventuell 44

könnte man den collider erst per Knopfdruck zusammen mit der bounding box aktivieren. 45

Q: Interaktion mit Hologrammen, die ein Skalieren und Drehen der Modelle 46

durch Aufziehen mit beiden Händen erlauben, wurde von der Fokusgruppe 47

und den anderen Testnutzern als sehr intuitiv befunden, wie ist dein 48

Eindruck zur Interaktion mit Hologrammen. Würde ein Ersatz der bounding 49

box durch zweihändige Interaktion intuitiver sein? 50

A: Ich denke dieses mit 2 Händen oder Fingern Auseinanderziehen kommt von der 51

Interaktion mit Smartphones. Dadurch erwarten User, die mit einem Hologramm 52

interagieren, dass sie dieses nehmen und auseinanderziehen können. Das ist 53

aufgrund der Erfahrung aus dem Smartphone Bereich intuitiv. 54

Q: Hat das Bedienen der bounding box funktioniert? War sie intuitiv und hat 55

sie ausreichend Feedback auf Userinput gegeben? 56

A: Die bounding box ist für Leute mit Erfahrung intuitiv, ansonsten ist es durch 57

Ausprobieren zu erlernen. Neuen Nutzern sollte die Steuerung der HoloLens und 58

das Bedienen der Applikation gezeigt werden. 59

Q: Eine Art Einführung oder Tutorial für neue User wäre somit wichtig? 60

A: Ja, sonst kann die Bedienung schnell frustrieren. 61

Q: Die Applikation soll in der Lehre eingesetzt werden, sind Hands on 62

Tutorials nötig oder könnte das auch per Lehrvideo nähergebracht werden? 63

A: Es muss nicht unbedingt Hands-on sein. Eventuell lassen sich interaktive Tutorials 64

gestalten, die direkt an der HoloLens zeigen wie die Gesten funktionieren, um 65

diese zu lernen. Wichtig ist, die Handhaltung zu erklären, wenn diese aus dem 66

Blickfeld der HoloLens gerät, funktionieren Gesten nicht mehr. Details wie zu tiefe 67

Handposition, oder Verdrehung der Hand können dazu führen, dass Gesten nicht 68

mehr erkannt werden. Auch die Menüführung in der App sollte neuen Usern erklärt 69

werden. Zum Beispiel, dass das Menü angepinnt werden kann, um nicht mehr mit 70

dem Blick des Users mitzuwandern. 71

Q: Hat das System genug Feedback zu deinen Eingaben gegeben? 72

A: Ja, im Menü war optisch ersichtlich welche Punkte ausgewählt waren. Eine 73

Überlegung wäre, dass ein zum Verschieben ausgewähltes Modell auch visuelles 74

Feedback gibt und heller dargestellt wird. Grundsätzlich ist es ersichtlich, wenn 75

sich der Gaze Cursor auf das Objekt legt, ein zusätzliches Highlighten des Objekts 76

könnte aber hilfreich sein. 77

Q: Wie ist deine Einschätzung, würden neue User die Kennzeichnung der 78

Interaktivität durch den Gaze Cursor erkennen? 79

A: Ich habe bei einer mobilen AR Applikation, die sich mit Schielen beschäftigt, 80

mitgewirkt. Hier haben wir Interaktionsaufforderungen in Form von visuellen Highlights, 81

haptischem und Audiofeedback getestet. Hier sind visuelle Aufforderungen besser 82

aufgenommen worden als haptisches oder hörbares Feedback. 83

Q: Wäre erweitertes Audiofeedback, zusätzlich zum Geräusch gedrückter 84

Buttons sinnvoll? 85

A: Zu viel Sound, wie zum Beispiel beim Verschieben von Objekten halte ich nicht 86

für sinnvoll. Maximal beim Auswählen eines 3D Objekts, ein kurzes minimales 87

Audiofeedback welches die Auswahl bestätigt. Ähnlich wie die Klicks der Buttons. 88

Q: Würdest du so eine Applikation als sinnvoll in der Ausbildung von 89

Radiologietechnolog*innen erachten? 90

A: Ich halte es schon für sinnvoll. Feedback zur Positionierung wäre jedoch wichtig, 91

um auf Fehler aufmerksam zu machen. Ich würde es als zusätzliches Lehrmittel 92

sehen und den vor Ort Unterricht nicht damit ersetzen. Dieser bietet den Vorteil, 93

dass erfahrene Trainer ihr Wissen weitergeben können. Als unabhängige 94

Trainingsmöglichkeit ist so eine Applikation sicher sinnvoll. 95

Q: Wie war dein Eindruck der Applikation im Vergleich zu anderen HoloLens 96

Applikationen? 97

A: Als erfahrener User waren keine komplizierten Vorgänge durchzuführen. In der 98

ersten Version habe ich den Inhalt aufgrund außerhalb des FOV liegender Objekte 99

gar nicht wahrgenommen. Nach dem Update sind Objekte direkt vor dem User 100

eingeblendet worden. 101

Q: Wie siehst du den Zusammenhang zwischen Hard- und Softwarelimitation 102

dieser Applikation auf der HoloLens? 103

A: Eine Applikation muss immer auf die Hardwarelimitationen programmiert 104

werden. Wenn ich Objekte nicht in meinem FOV habe, ist die Applikation für mich 105

im ersten Moment nicht funktionstüchtig. Wenn der Raum abgedunkelt ist, wirken 106

die Hologramme besser. 107

Q: War das Hologramm transparent genug? War die Helligkeitseinstellung 108

der HoloLens nützlich? 109

A: Wenn man weiß, dass es den Hardware Regler gibt, reicht dieser aus. Jede 110

Menüinteraktion zusätzlich ist anstrengend. Vor allem Tastatureingaben sollen 111

vermieden werden. Generell sollte stattdessen Spracheingabe verwendet werden. 112

Q: Werden die neuen Interaktionsmöglichkeiten der HoloLens 2 für 113

intuitivere Bedienung sorgen? 114

A: Ja, durch das genauere Handtracking glaube ich, nach einem zehnminütigen 115

Test der HoloLens 2 schon, dass die Bedienung interaktiver ist. Anstatt mit Gaze 116

und Air Tap, kann mit Objekten dann wie auf einem Smartphone direkt interagiert 117

werden. Spannend ist auch das remote Rendering, um Modelle mit höherer 118

Qualität auf die HoloLens zu streamen. Vor allem die HoloLens 1 ist von der 119

Rechenleistung sehr limitiert. 120

Q: Waren die Hologramme stabil genug im Raum? War die Framerate 121

ausreichend? War die visuelle Qualität der Hologramme ausreichend? 122

A: Ja optisch waren die Hologramme ansprechend. Die Performance war 123

ausreichend, für diesen Anwendungsbereich sind nicht unbedingt 60 FPS nötig. 124

Mir wären keine Ruckler aufgefallen. 125

Q: Gibt es noch Anmerkungen oder Kommentare zu Themen, die wir nicht 126

besprochen haben? 127

A: Eventuell könnte ein Einsatz des Hand Trackings auf einem Oculus Headset 128

dabei helfen die Positionierung der Körperteile zu überprüfen, um den Usern 129

Feedback zur Positionierung zu geben. Da AR Geräte selten sind, würde eine VR 130

Version der Applikation diese zugänglicher machen. 131

G. English Transcripts of the Semi-Structured Expert Interviews

a. English Interview with Participant One

Q: How long is your experience as a radiographer? 1

A: I worked for six years as a radiographer and for two years as a 3D planner for 2

medical devices. 3

Q: What impression did the wearing comfort of the HoloLens make on you? 4

A: This was my first experience with the HoloLens. At first, it’s a bit unfamiliar to 5

use, after getting used to it there are no problems with interacting with it. 6

Q: How about use time? Does it feel exhausting to use the HoloLens after 7

some time? 8

A: It was not exhausting; I was quickly accustomed to using the HoloLens. 9

Q: How do you rate the visual quality of the HoloLens application? 10

A: I found it to be clearly arranged. The menu structure is simple, and you know 11

your way around immediately. It is very user friendly; it doesn't take long to get into 12

[the operation]. The quality of the 3D scans is high enough to make out details. 13

Q: What was the impression of the HoloLens' Field of View? Was the field of 14

view large enough? Were objects cut off? 15

A: Maybe this will get better with more use time, but at the beginning I had the 16

feeling that I could not fit everything into my field of view. Maybe that’s a user 17

specific issue which resolves with more experience in using the HoloLens. When 18

the hologram sizes are adjusted and the menu is pinned, so that holograms won’t 19

occlude each other, it works well. 20

Q: How was the stability of the holograms? Were they stable enough in the 21

real world to put someone inside? Did they wobble or shake? 22

A: No, nothing wobbled or jittered. I think the best thing would be a higher table so 23

that users don't have to bend down. This might cause the Field of View to slip away 24

from the holograms. But it was stable in any case. 25

Q: Do you feel that you successfully completed all the tasks on the task 26

sheet? 27

A: Yes, all worked well. 28

Q: What is your opinion on the application's menu? 29

A: The menu is very logical; I can open and close it when I don't need it. I can pin 30

it down, which is very important because otherwise I have it in my field of view all 31

the time. It is clearly arranged and intuitive. 32

Q: Did you find the options you were looking for? 33

A: Yes. 34

Q: Could you easily undo or redo any action if you needed to? 35

A: For example, if I want to repeat the audio instructions. No, there were no 36

problems here, it worked great. 37

Q: Were all menu items you expected present? 38

A: As a basis I found it to be very good. The information needed is all there. A 39

possibility in the future would be to introduce some gamification. Users could 40

measure their progression or a matching between the real person and the 41

hologram, showing how far they differ in their position could be useful. 42

Q: How do you feel about the methods of interaction with the holograms? Is 43

the placement of holograms in the desired position and location intuitive or 44

difficult? 45

A: The learning curve is relatively steep; you may be a little clumsy at first. But after 46

a few tries you master it quite fast. 47

Q: Would a short training for new users be helpful? 48

A: Yes, a training or short introduction on how to use and interact with the HoloLens 49

would be great. For example, a thing you just have to know before using the 50

HoloLens is the Gaze Cursor. You must pay attention to the cursor and not to the 51

position of the hand. Someone must tell you that. 52

Q: Interaction with the HoloLens is not very intuitive? 53

A: Users would figure it out eventually, but I would not have found out about the 54

Gaze interaction immediately. 55

Q: What worked as expected when manipulating the holograms? What 56

didn't? 57

A: At first, I assumed that the corners of the bounding box were for rotating the 58

hologram. I expected to be able to rotate and enlarge at the corners, which I believe 59

is technically impossible. 60

Q: Any suggestions on how to improve the interaction methods? 61

A: It would be great if holograms could be easily touched with the user’s hands and 62

made bigger or smaller. But I do not know if that is technically possible. 63

Q: Do you have any wishes for additional input methods? 64

A: Voice control for enlarging holograms would be desirable. Positioning with 65

speech is probably difficult but voice-controlled resizing and rotating of holograms 66

would be great. 67

Q: Did the system provide enough feedback on your inputs? 68

A: Yes, there is always a click when you hit buttons. When rotating holograms, it 69

didn't click, but here you have an immediate visual feedback if the interaction 70

works. 71

Q: Was visual and audible feedback given sufficiently? 72

A: Yes 73

Q: Did you get confirmation if your inputs were accepted? 74

A: Yes 75

Q: Did you have any issues with the application? Did you encounter any 76

errors or crashes? 77

A: Nothing, no errors and no crashes. 78

Q: Did you have any difficulties with interacting with the menu or the 79

holograms? Have there been problems with holograms occluding each other 80

or in the interaction with the real world? 81

A: Orientation in the real world was possible, I didn't run into anything. 82

Q: Did the holograms behave as you expected? Were there any problems 83

placing the holograms on the bucky table? 84

A: No, there were no problems with the placement of holograms. With brightness 85

and transparency of holograms maybe. Placing test patients into the hologram did 86

not work as well in artificial light as it did in daylight. But there were no issues with 87

the interaction with the holograms. 88

Q: Compared to other applications on the HoloLens, how was this application 89

to use? 90

A: Sorry, I do not know any other HoloLens applications. 91

Q: How useful was the hologram to help the positioning of your test patient's 92

body part? 93

A: It depends a bit on what kind of learner you are. It's easy for me when I can see 94

something and replicate it. For me a visual aid is a great enrichment. On the one 95

hand, I can see how the position will look and can prepare myself for 96

positioning a real patient. On the other hand, when positioning the patient, I have 97

visual clues how to position the patient’s body part or a wedge pillow for example. 98

Q: Was the size adjustment of the holograms useful? 99

A: The resizing was useful. I feel that it makes a difference how far away I am from 100

the patient and therefore I must resize the hologram accordingly. 101

Q: Was it easy to place the patient in the hologram? 102

A: This is probably also a matter of practice and depends on how skilled you are 103

at concentrating on what you are seeing and interacting with your patient at the 104

same time. It takes some practice, but it's fun. 105

Q: Was the hologram transparent enough? Was the brightness adjustment 106

of the HoloLens useful? 107

A: That depends on the light conditions. A voice command to adjust hologram 108

transparency would be neat. If I already have my hands on the patient and want to 109

check the position, I can make the hologram more or less transparent by voice 110

control while still manipulating the patient with my hands. 111

Q: Was the hologram stable enough to reliably assist in the positioning? 112

A: The holograms were stable; it is possible to move around the room and the 113

holograms stay in position. I could circle around the hologram without it moving. 114

Q: Did the virtual light beam localizer help with the positioning of the real-life 115

localizer? 116

A: I think so, here it is important to be able to change the transparency so that I 117

can see the central beam more clearly. The central beam on the hologram helps a 118

lot as a guide for the positioning of the real central beam. 119

Q: Do you think AR applications can benefit the education of radiographers? 120

A: Yes, I think AR can be used in the education of radiographers. There are few 121

internships long enough to train rare special radiographic procedures. These could 122

be practiced with an AR application. Another possibility is self-learning without the 123

presence of a lecturer or trainer. Additionally, new technology promotes interest to 124

try the application. This curiosity could be used as an advantage to promote learner 125

participation. 126

Q: Do you have any further comments or remarks? 127

A: No 128

b. English Interview with Participant Two

Q: How long is your experience as a radiographer? 1

A: Five years of clinical experience as a radiographer and four years in education. 2

Q: What impression did the wearing comfort of the HoloLens make on you? 3

A: As a wearer of glasses, I always found the HoloLens a bit uncomfortable, so I 4

would be interested in the wearing comfort of the HoloLens 2. Regarding the 5

wearing comfort for short sessions, I find the HoloLens comfortable to wear. 6

Depending on how you plan to use it, you should check whether there is a loss of 7

comfort when wearing it for 30 to 45 minutes. In the case of rental equipment, the 8

wear and tear of the equipment and the resulting loss of wearing comfort should 9

also be considered. The biggest limit in the use of the HoloLens 1 is certainly the 10

Field of View (FOV). 11

Q: How do you rate the visual quality of the HoloLens application? 12

A: The visual quality stands and falls with the recorded data, the higher the quality 13

of the 3D scans, the higher the quality of the holograms. The image quality is very 14

good, but the FOV is a scandal, for example if you want to have a short 15

conversation with colleagues, lift your gaze and only see half of the hologram, 16

that's annoying. In any case, the FOV is too small. A larger FOV is urgently needed, 17

which the HoloLens 2 will hopefully provide. However, the resolution and image 18

quality are enough in any case. The stability of the holograms is also good enough. 19

Q: Do you feel that you successfully completed all the tasks on the task 20

sheet? 21

A: Yes, I was able to complete all the tasks. The tasks could be performed without 22

instructions. The application was self-explanatory. 23

Q: What is your opinion on the application's menu? 24

A: The menu is straightforward, the small FOV is an issue again. If I raise my eyes 25

and don't have the holograms in my field of view anymore, I might not see the effect 26

of my action. For the audio output this is not an issue, but if I look up to enable the 27

central beam, then look down to see it, it can cause visibility problems. Ideally 28

every action would be in the same field of view. Otherwise the menu is well 29

structured, clear and understandable. Thanks to the input of the focus group, all 30

desired options are available. The functions of the HoloLens through the new 31

developer toolkit are impressive. The audio feedback of the buttons is important to 32

see if the inputs were accepted. 33

Q: Could you easily undo or redo any action if you needed to? 34

A: I didn't have to undo anything. It wasn't necessary. 35

Q: Were all menu items you expected present? 36

A: Yes 37

Q: How do you feel about the methods of interaction with the holograms? Is 38

the placement of holograms in the desired position and location intuitive or 39

difficult? 40

A: Since I already have experience in using the HoloLens, I think that I find this a 41

little easier than new users. The application is not difficult to use per se, but learning 42

the gestures is important. Resizing holograms at the corners is self-explanatory. 43

But for users who have not worked with the HoloLens much, it is important to learn 44

how to perform these gestures. This could also lead to problems with the 45

interaction of the holograms for new users. For me it was easy to handle and the 46

holograms behaved as expected. 47

Q: Would a short training for new users be helpful? 48

A: A basic HoloLens training course must be provided. Possibly by means of an 49

instructional video about the HoloLens. I don't think that every student must 50

practice operating the HoloLens with the hardware. This would be rather time 51

consuming. 52

Q: Any suggestions on how to improve the interaction methods? 53

A: Instead of enlarging or reducing the size of the hologram at the corners of the 54

bounding box, it would be interesting to resize the hologram, similar to smartphone 55

operation, by direct interaction with the hologram itself. But a manipulation of 56

objects with bounding boxes is known from other programs like PowerPoint. If I 57

insert an object there, I immediately know that the corners are for enlarging, the 58

centers are for moving and a third field is for rotating. It would be more intuitive or 59

easier if I could manipulate a hologram simply by using gestures like those of a 60

smartphone. 61

Q: Did the system provide enough feedback on your inputs? 62

A: A visual feedback, by changing the color of the corners of the bounding box, 63

would be great. For example, if the selected corner changes from blue to orange. 64

Q: Would audio feedback on the manipulation of the holograms be desirable? 65

A: I am not a fan of too much audio feedback. Regarding the menu I find the audio 66

feedback a great confirmation if options have been selected. Due to the frequent 67

manipulation of holograms, audio feedback would be an annoying feature here. As 68

visual feedback, selected corners of the bounding box are displayed a little bit 69

brighter, but this might be too little and too subtle for new users. 70

Q: Did you have any issues with the application? Did you encounter any 71

errors or crashes? 72

A: Nothing, no errors and no crashes. 73

Q: How was your impression of the application compared to other HoloLens 74

applications? 75

A: I don't know if this is due to the application itself, or if HoloLens applications in 76

general have evolved. My last use of a HoloLens application was two years ago. 77

In comparison, the application is more pleasant to use, clearer and tidier. 78

Q: How useful was the hologram to help the positioning of your test patient's 79

body part? 80

A: Resizing is useful for viewing details. It helps to visualize where anatomical 81

structures are located for example. The visual aid of the hologram is also helpful 82

for positioning the central beam. 83

Q: Was the hologram transparent enough? Was the brightness adjustment 84

of the HoloLens useful? 85

A: One must be careful not to work in a room that is too dark or too bright. Too 86

bright surroundings with a lot of sunlight could be disturbing. I think half-dark 87

rooms, as is common in X-ray rooms, can lead to good results. I did not miss 88

a control for the transparency of holograms in the application. 89

Q: How easy or how hard was it to position a patient according to the 90

hologram? 91

A: In the beginning it was a bit difficult due to too much brightness in the room. 92

Basically, the placement of test persons in the hologram is easy due to the 93

possibility to place holograms on surfaces. The hologram was stable enough to 94

reliably assist in positioning. 95

Q: Did the virtual light beam localizer help with the positioning of the real-life 96

localizer? 97

A: Yes, the possibility to check the position of the central beam in advance is 98

helpful. But I would see the benefit rather in preparatory exercises for the actual 99

positioning. A fade-in button to display possible wrong positioning would be 100

interesting. 101

Q: Do you think AR applications can benefit the education of radiographers? 102

A: Yes. Procedures can be explained in detail. The structure of machines can be 103

shown. Theoretically patient conversations can be simulated by AR. AR is suitable 104

for teaching all types of content. Especially for patient interaction, such as in 105

X-rays, AR has potential. The function of nuclear medical examinations could also 106

be visualized by utilizing CT data sets. In radiotherapy, patient positioning at the 107

linear accelerator could also be practiced. In addition to practical training, AR can 108

also be used to visualize processes. For example, the positioning of magnetic coils 109

in MRI. 110

Q: Do you have any further comments or remarks? 111

A: Possible display of common incorrect positions of the central beam by additional 112

coloured field lights.113

c. English Interview with Participant Three

Q: What is your professional background and how long have you had 1

experience with augmented reality (AR)? 2

A: I am a developer for AR and VR applications. I have experience with different 3

platforms, like the HoloLens, MagicLeap, AR core and AR kit for mobile devices. I 4

have also tried the DAQRI device, which is no longer distributed. I also tested 5

Google Glass. Google Glass is just a head up display with a camera and cannot 6

interact with the environment. I have experience with various VR headsets and 7

attend conferences in the AR/VR field. I would say I have 3 to 4 years of experience 8

in AR. 9

Q: Do you feel that you have successfully completed all tasks on the test 10

sheet? 11

A: Thanks to my experience with the HoloLens, I was able to perform the tasks 12

successfully. I have noticed that the instructions include the gestures for operating 13

the HoloLens (Bloom, Air Tap). Someone who uses the HoloLens for the first time 14

does not know what the Bloom gesture is or how Air Tap works. Before using the 15

app, it would be great to explain the HoloLens to new users. I suppose that first 16

time users would be overwhelmed without this introduction. New users should also 17

be made aware of the limited FOV and that the field of view is only in the middle of 18

the headset's screen and they cannot expect any projection at the edges of the 19

image. 20

Q: How would you rate the visual quality of the application? Were all options 21

and information you expected available? 22

A: Yes, the actions required by the testing guideline were doable. For some UI 23

elements it was not clear whether they were buttons or just labels. You will probably 24

notice this more or less depending on your experience level. 25

Q: Should interactive and non-interactive UI elements be visually better 26

distinguishable? 27

A: Yes, the headlines of the menus were visually almost the same as the interactive 28

buttons. This could lead to the assumption that they were also selectable. 29

Furthermore, I noticed that the colliders of the 3D models are larger than the actual 30

model. This can lead to an issue if you want to select a visible menu item behind 31

the model, but it is hidden by the box collider of the model and therefore cannot be 32

selected. This is because the collider boxes are adapted to the bounding boxes 33

that allow scaling and rotating the models. 34

Q: Do you have any suggestions for a better design of the collider boxes? 35

A: A capsule collider instead of a box collider would be an option. Mesh colliders 36

would be ideal, but they require higher computing power. Missing corners to 37

manipulate the holograms could also be an issue with mesh colliders. Here a 38

manipulation with both hands instead of the bounding box would be necessary. 39

This again, is a learning factor and must be supported by the hardware. It might be 40

possible to activate the collider with the bounding box by pressing a button. 41

Q: Interaction with holograms, which allow scaling and rotating the models 42

by pulling with both hands, was found to be very intuitive by both the focus 43

group and the other test users. What is your impression about interaction 44

with holograms? Would replacing the bounding box with two-handed 45

interactions be more intuitive? 46

A: I think the interaction with holograms by manipulating them with 2 hands or 47

fingers comes from interaction with smartphones. So, users interacting with a 48

hologram expect to be able to touch it and pull it apart to resize it. This is intuitive 49

because of the experience with smartphones. 50

Q: Did the operation of the bounding box work? Was it intuitive and did 51

it give enough feedback on user input? 52

A: The bounding box is only intuitive for people with experience, otherwise it can 53

be learned by trial and error. New users should be shown how to control the 54

HoloLens and how to use the application. 55

Q: Some form of introduction or tutorial for new users would be important? 56

A: Yes, otherwise the user can quickly become frustrated. 57

Q: It is planned to use the application in an educational setting. Are hands 58

on tutorials necessary or could this also be taught by, for example, 59

an instructional video? 60

A: It does not necessarily have to be hands-on with a trainer. It might be possible 61

to create interactive tutorials directly on the HoloLens to teach new users the 62

gestures and how they work. It is important to explain the importance of the hand 63

position; if it gets out of the HoloLens' field of vision, gestures do not work anymore. 64

Details such as too low a hand position or twisting the hand can lead to gestures 65

not being recognized. The menu navigation in the app should also be explained to 66

new users. For example, that the menu can be pinned so that the menu does no 67

longer follow the users gaze. 68

Q: Did the application provide enough feedback for your inputs? 69

A: Yes, you could see which items were selected in the menu. One possibility 70

would be that a model selected for moving around the user's space could also 71

give visual feedback and be highlighted after selection. Now, it is visible to the user 72

when the gaze cursor is placed on the object, but an additional highlighting of the 73

object could be helpful. 74

Q: Do you think new users would recognize interactivity as indicated by the 75

gaze cursor? 76

A: I worked on a mobile AR application that deals with strabismus. Here we tested 77

interaction prompts in the form of visual highlights, haptic and audio feedback. 78

Visual prompts were better received than haptic or audible feedback. 79

Q: Would extended audio feedback, in addition to the clicking noise by 80

pressed buttons be useful? 81

A: Too much sound, such as when moving objects, does not make sense. At most, a short, minimal audio confirmation when selecting a 3D object could be helpful, like the clicks of the buttons.
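A minimal selection click of this kind could hook into MRTK's pointer events. This is a sketch under the same assumptions as above (Unity with MRTK v2); the class name `SelectionClick` and the `clickClip` audio asset are hypothetical placeholders.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Plays a short confirmation click when the object is selected,
// mirroring the minimal audio feedback suggested for 3D model selection.
public class SelectionClick : MonoBehaviour, IMixedRealityPointerHandler
{
    [SerializeField] private AudioClip clickClip; // short click sound (assumed asset)

    // Fired by MRTK on a completed select gesture (e.g. air tap).
    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        AudioSource.PlayClipAtPoint(clickClip, transform.position);
    }

    // Unused pointer events required by the interface.
    public void OnPointerDown(MixedRealityPointerEventData eventData) { }
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
}
```

Playing the clip at the object's position keeps the feedback spatialized and brief, in line with the "no sound while moving" advice above.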

Q: Would you consider such an application useful in the training of radiographers?

A: I think it does make sense. However, feedback on positioning would be important, to also draw attention to possible mistakes. I would see it as an additional teaching tool and not as a substitute for on-site teaching, which offers the advantage that experienced trainers can pass on their knowledge. As an independent training possibility, such an application is certainly useful.

Q: How was your impression of the application compared to other HoloLens applications?

A: As an experienced user, there were no complicated procedures to perform. In the first version I did not notice some of the content because objects were outside the FOV. After the update, objects were displayed directly in front of the user.

Q: How do you see the connection between the hardware and software limitations of this application on the HoloLens?

A: An application must always be programmed around the hardware limitations. If I don't have objects in my FOV, the application will not work for me. If the room is darkened, the holograms work better.

Q: Was the hologram transparent enough? Was the brightness adjustment of the HoloLens useful?

A: If you know that there are hardware buttons, they are enough for brightness control. Every additional menu interaction on the HoloLens is exhausting. Keyboard input especially should be avoided; in general, voice input should be used instead of keyboard input.


Q: Will the new interaction possibilities of the HoloLens 2 provide more intuitive operation?

A: Yes. Thanks to the more precise hand tracking, and after a short ten-minute test of the HoloLens 2, I believe that the operation is more intuitive. Instead of using gaze and air tap, you can interact with objects directly, just like on a smartphone. Another exciting feature is remote rendering, which streams models of higher quality to the HoloLens; the HoloLens 1 in particular is very limited in computing power.

Q: Were the holograms stable enough in real space? Was the frame rate sufficient? Was the visual quality of the holograms sufficient?

A: Yes, visually the holograms were attractive. The performance was sufficient; 60 FPS is not necessarily required for this application. I did not notice any stutter.

Q: Are there any remarks or comments on topics we have not discussed?

A: Maybe the use of hand tracking on an Oculus headset could help to check the positioning of the body parts and give users feedback on their positioning. Since AR devices are rare, a VR version of the application would also make it accessible to more users.

