Integra - UserLab: presentation at RESCON 10, Birmingham City University
Testing the usability of Integra:Live

Jerome Turner, Design Manager, User-lab, BIAD
Dr. Jamie Bullock, Senior Researcher, Integra:Lab, Birmingham Conservatoire


Rationale

Existing research has shown that Open Source software development projects suffer from poor usability [1]. The aim of this project is to establish usability testing as a core element of the Integra:Live development process, with the hypothesis that this will improve the software's usability.

Objectives

• To determine user responses to Birmingham Conservatoire's Integra:Live software for live electronics composition and performance through structured tests.
• To generate immediate feedback as part of an iterative interface design process.


Method

Five user-testing sessions were planned, following Nielsen [2]. Each session lasted 45 minutes and comprised:
• introductions and coffee
• a pre-test questionnaire gathering demographic data
• 4 structured tasks focusing on specific aspects of the software
• a post-test questionnaire gathering users' evaluations and conclusions

Tests were conducted at User-lab’s bespoke testing facilities and observed by both the main facilitator and an Integra developer who was available to assist with technical problems.


Results

Task success/fail

Four consecutive tasks were given to participants and scored independently by both researchers:
• Pass: the participant completed the task independently; scored 1.
• Fail: the participant was unable to complete the task, in some cases even with intervention from the facilitator; scored 0.
• Partial success: intervention was required, or the task was only partly completed; scored 0.5.

Scores were then aggregated across all participants to give an indication of the overall success rate for each task, as sketched below.
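As a concrete illustration of the scoring arithmetic, the sketch below averages each task's column of scores across participants. This is a minimal illustration, not the study's analysis code; the score values shown are placeholders, not the collected data.

# One row per participant, one column per task.
# 1.0 = pass, 0.5 = partial success, 0.0 = fail (placeholder values).
scores = [
    [1.0, 0.5, 1.0, 0.0],
    [1.0, 1.0, 0.5, 0.5],
    [0.5, 1.0, 1.0, 0.0],
    [1.0, 0.5, 0.5, 1.0],
    [1.0, 1.0, 0.0, 0.5],
]

n_participants = len(scores)
for task, column in enumerate(zip(*scores), start=1):
    rate = sum(column) / n_participants
    print(f"Task {task}: overall success rate {rate:.0%}")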


Salient findings

• labelling of module controls doesn't convey their function
• general lack of visual feedback when interacting with modules
• no indication when the software is recording live input
• no indication of recorded sound
• module controls shouldn't all be in module properties
• controls could be positioned on modules or elsewhere
• users expect drag 'n' drop from the desktop
• scenes should have MIDI control
• module controls poorly laid out and confusing
• needs to be clearer what each module does
• drag 'n' drop is not consistent throughout the application
• not obvious that modules need to be connected
• it isn't obvious how to connect modules
• need in-application help/documentation
• the user interface is different from existing software (positive?)
• needs to be more obvious where to start when the application launches
• not obvious whether play/pause are on or off
• tooltips are too slow to appear
• many mouse clicks to achieve certain actions - could be eliminated
• it generally isn't clear what each module control does
• users enjoyed making interesting sounds
• users enjoyed playing with the XYScratchPad control
• recSound and resizeSound particularly non-obvious
• live view initially confusing
• not obvious how to make scenes
Post-test questionnaire results

The following summarises the results of the Likert-scale questions. The red dot indicates the average score across all 5 participants; the yellow bar indicates the full range of answers. For example, participants felt the software to be less technical, and more creative, than previous ways of doing these tasks.

NB: The results included here are just some of the data sources collected. User comments were also noted, and several issues were logged multiple times.
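For readers reconstructing the plot: the red dot is the mean of the five ratings, and the yellow bar spans their minimum and maximum. A minimal sketch follows, assuming a 1-7 semantic-differential scale; the question label and ratings are placeholders, not the collected data.

ratings = {"technical (1) to creative (7)": [5, 6, 4, 7, 6]}  # placeholder answers, 5 participants

for question, answers in ratings.items():
    mean = sum(answers) / len(answers)       # position of the "red dot"
    low, high = min(answers), max(answers)   # extent of the "yellow bar"
    print(f"{question}: mean = {mean:.1f}, range = {low}-{high}")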


‘What other software / tools do you use or have you used that achieve similar results for you?’ The following word cloud shows aggregated answers.

Participants were asked to tick words from a list they felt described their experience of using the software, resulting in the following word cloud.
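A word cloud of this kind is driven by simple frequency counts: the more participants tick a word, the larger it is drawn. A minimal sketch of the counting step, using hypothetical descriptor words rather than the participants' actual answers:

from collections import Counter

# Words ticked by each participant (hypothetical examples).
ticked = [
    ["creative", "fun", "confusing"],
    ["creative", "simple"],
    ["fun", "creative", "unfamiliar"],
]

counts = Counter(word for words in ticked for word in words)
for word, n in counts.most_common():
    print(f"{word}: {n}")  # a word's size in the cloud scales with its count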


Conclusions

• Most users appreciated the relative simplicity compared to their experience of other similar software, and understood this was a creative, rather than technical, interface (see questionnaire results above).
• Some observed issues can be dealt with in ongoing design of the interface; some are inevitable in the design of a new, innovative interface, and would be explained in training / help / support.
• Further exploration of the recorded sessions and research notes will result in immediate recommendations for development of the interface.

Future Work

• Future collaboration / user research relating to Integra:Live
• Collaboration on the IDEAs project re: Integra:Live multitouch research
• Potential for a further project: Interactive Forms in Mobile Electronic Music

References

[1] Nichols, D. & Twidale, M. (2002). The Usability of Open Source Software. First Monday, 8(1).
[2] Nielsen, J. (2000, March). Why You Only Need to Test with 5 Users. Alertbox. Accessed June 17, 2010, from http://www.useit.com/alertbox/20000319.html

