+ All Categories
Home > Documents > Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or...

Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or...

Date post: 07-Jul-2018
Category:
Upload: doannguyet
View: 214 times
Download: 0 times
Share this document with a friend
55
Memo To: J. Brenton Stewart cc: Dr. Carol Barnum From: Library Usability Group: Tom Burns, Brian Curtis, Astrid Dahl, Dorothy Whisenhunt Subject : Findings and analysis report for Web site usability test Date: 12/7/06 Attached is the Library Usability Group’s final report on the SPSU Library Web site usability study. It includes the following: Project scope and purpose Identified goals Process followed (heuristic review, test design, recruitment and screening, logging and analysis) Results and findings organized by category Recommendations Also included are appendices that include the test logs and materials (participant consent forms, questionnaires, task lists, etc.) and the reports submitted earlier in the project life cycle (heuristic evaluation, personas, and test plan). The Library Usability Group would like to thank you for serving as our sponsor and for taking the time to work with us on this project. We would like to acknowledge the assistance of our professor, Dr. Barnum, throughout the project. We hope this report will be useful to the Southern Polytechnic library staff and will benefit all students, faculty, and staff. ©2011 Elsevier, Inc.
Transcript
Page 1: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

MemoTo: J. Brenton Stewartcc: Dr. Carol Barnum

From: Library Usability Group: Tom Burns, Brian Curtis, Astrid Dahl,Dorothy Whisenhunt

Subject: Findings and analysis report for Web site usability testDate: 12/7/06

Attached is the Library Usability Group’s final report on the SPSU Library Web site usability study. It includes the following:

Project scope and purpose Identified goals Process followed (heuristic review, test design, recruitment and screening,

logging and analysis) Results and findings organized by category Recommendations

Also included are appendices that include the test logs and materials (participant consent forms, questionnaires, task lists, etc.) and the reports submitted earlier in the project life cycle (heuristic evaluation, personas, and test plan).

The Library Usability Group would like to thank you for serving as our sponsor and for taking the time to work with us on this project. We would like to acknowledge the assistance of our professor, Dr. Barnum, throughout the project. We hope this report will be useful to the Southern Polytechnic library staff and will benefit all students, faculty, and staff.

©2011 Elsevier, Inc.

Page 2: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Web Site Usability Test of Lawrence V. Johnson Library atSouthern Polytechnic State University

Prepared for: J. Brenton Stewart, Project SponsorLawrence V. Johnson Library Southern Polytechnic State University

Carol Barnum, Ph.D.Professor, Information Design and CommunicationSouthern Polytechnic State University

Prepared by: Library Usability Group: Tom Burns, Brian Curtis, Astrid Dahl, Dorothy Whisenhunt

Date: December 7, 2006

©2011 Elsevier, Inc.

Findings and Analysis Report

Page 3: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Table of ContentsMemo........................................................................................................1Table of Contents.......................................................................................1Executive Summary....................................................................................3

Background................................................................................................................... 5Purpose......................................................................................................................... 5Report overview............................................................................................................6

Methodology..............................................................................................8Overview....................................................................................................................... 8User profiles.................................................................................................................. 8Usability criteria............................................................................................................9Test objectives............................................................................................................11Types of data collected...............................................................................................11Screening profiles.......................................................................................................12Testing environment and equipment..........................................................................12Test scenarios and tasks.............................................................................................13Supplemental test materials.......................................................................................16

Test Results.............................................................................................17Summary of quantitative data....................................................................................17Results by category....................................................................................................18

Key Findings............................................................................................27Positive findings..........................................................................................................27Severity ratings...........................................................................................................27Collated findings, rated by severity............................................................................28User impressions.........................................................................................................29

Recommendations....................................................................................31Short-term Changes....................................................................................................31Long-term Changes.....................................................................................................33Future development and testing.................................................................................34

Conclusion...............................................................................................36Findings...................................................................................................................... 36Comments...................................................................................................................36

Appendices..............................................................................................37Appendix A: Heuristic Evaluation................................................................................38

Findings and Analysis Report ©2011 Elsevier, Inc. Page 1

Page 4: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Appendix B: Personas Report......................................................................................39Appendix C: Test Plan.................................................................................................40Appendix D: Participant Materials...............................................................................41Appendix E: Testing Materials.....................................................................................42Appendix F: Session Logs............................................................................................43Appendix G: Resources...............................................................................................44

Findings and Analysis Report ©2011 Elsevier, Inc. Page 2

Page 5: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Executive SummaryOrigin and Goals of the Test. The SPSU Library Web site usability test originated as a project for the Information Design and Communication (IDC) course, Usability Testing and Research. The members of the Library Usability Group are master’s students in the IDC program, and consulted with the sponsor, J. Brenton Stewart, to define the needs of the SPSU library and the scope of the project. Our sponsors wanted to know how to help their student users improve their research skills, to increase their library use, and to ensure that distance learners who do not have access to the library building are still being served. Brenton also asked the team to test student opinions on an animated tutorial on GALILEO searching. This report includes The methodology our team used The test results we gathered The key findings that were revealed The team’s recommendations, based on the results and findings Copies of all preliminary research documents, including heuristic evaluation,

personas report, and screening instruments Our test plan, with scenarios and tasks Copies of all test data, findings, supplementary materials, test logs DVD of PowerPoint presentation (with video clips embedded) DVDs of test session (sponsor only)

Our Process. Following industry-standard methods of usability testing, the Library Usability Group undertook an evaluation of the site, developed user profiles, and developed a test plan that consisted of two scenarios containing six real-world tasks. Four undergraduate and five graduate users participated in tests over a three-week period in The SPSU Usability Center—a state-of-the-art testing facility where we captured detailed test notes and screen-in-screen DVD captures with participant comments.

Our Results. The team found many areas of overlap in problems, which we have summarized in the body of this report. Generally, we divided these repeated issues repeats into six categories: Mental Model—The mental image users expect to find and how well the site meets

the expectation Terminology—The degree to which the terms are understandable and helpful Password—The biggest frustration for all users, with no warning or help Tutorial—A good idea that users thought could be improved in simple ways Layout—The arrangement of page elements to help guide the user User Support—Features that can help the user complete a task

Findings and Analysis Report ©2011 Elsevier, Inc. Page 3

Page 6: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Problems Recommendations

Terminology on the site is problematic: double meanings, inconsistent use

Clear up ambiguity and double meanings: “Reserves,” “Database,” “Request,” “Universal,” “Resources,” “Availability”

Employ “mouseovers” to define key words. Put terminology help at the point of need.

Site is organized by resource type, rather than task

Rather than linking to “research databases,” say “research topics here.”

Let them know where to go for books; where to go for periodicals.

Is there a place where they can search for everything? If so, label it.

The page layouts are crowded and distracting

Use the page space available to make the links clear and easy to read.

“Services” are actually text-heavy help

Cut the dense, text-heavy help pages that don’t contain links. Users will not read them. No one does.

Break up text with bullets and eliminate unnecessary verbiage.

Demand for password without warning causes major frustration

Although this may be a feature of the system outside SPSU, it may be possible to give contextual clues that the password demand will occur, as well as helpful information on what to do.

The tutorial contains elements that are distracting to the overall purpose (speed, animation, user control, etc.)

Some relatively minor fixes will improve the usefulness.

This should be re-tested after the changes are made to re-evaluate usefulness.

Recommendations and Rationale. From analysis of the data, our team identified several specific items that, if addressed quickly, will yield significant benefits for users for relatively little effort. The preceding table highlights the major problem areas and recommendations; more detail can be found in the body of the report. We have divided our recommendations into short-term and long-term categories, reasoning that it would not be beneficial to users to wait for relatively minor terminology tweaks while a major redesign is carried out. As SPSU moves into a new phase of expansion and reaches out to the world of learners, it is critical for the library Web site to meet the needs of all its learners. In this age of ready computer access, all learners are “distance learners.” Whether across the street or across the world, SPSU library should—and can—meet the needs of the community it serves.

BackgroundWe are the Library Usability Group, a team of students in SPSU’s Master of Science in Information Design and Communication (MSIDC) degree program. Dr. Carol Barnum,

Findings and Analysis Report ©2011 Elsevier, Inc. Page 4

Page 7: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

the professor of our Usability Testing and Research course in the MSIDC program, suggested the idea for the usability test of the SPSU library’s Web site and distance learner resources.

The methods used in this project are known generally as “discount usability testing,” so named because it relies on using a few carefully screened participants instead of recruiting dozens and relying on statistical analysis for results. In discount testing, user profiles called personas are employed to define representative user groups, and that information is turned into screening criteria to identify potential participants.

The subject of the test is the Web site of the Lawrence V. Johnson Library at Southern Polytechnic. SPSU’s library is key to the educational process for undergraduate students, graduate students, and faculty. The library’s Web site is the online portal to the SPSU library, Georgia Interconnected Libraries (GIL), and Georgia Library Learning Online (GALILEO).

PurposeCatalog Librarian Brenton Stewart was interested in having the Library’s site evaluated for usability questions and issues. In the initial meeting, he discussed the scope of the site and its typical users, as well as the concerns and questions he had about the site that our research could help him resolve. Some were known problems; others were still only supposition. In addition, Brenton sought feedback on an animated tutorial that was created to assist students in learning about searching GALILEO databases.

The Library Usability Group worked with Brenton’s information and questions to more fully define The scope—How much of the site would be included in the evaluation The user base—What types of users work with the site Use cases—Typical tasks and situations that bring distance learners to the Web siteThe outcomes of this analysis included detailed user personas, a heuristic evaluation by the project team, and a detailed test plan with suggested tasks and questions for the test sessions.Our goal was to provide feedback and data to library staff on the usability of the SPSU Library (“the Library”) Web site (http://www.spsu.edu/library) and an accompanying GALILEO search tutorial (http://home.comcast.net/~brentonstewart/INDEX.HTML) for remote users through needs assessment, heuristic evaluation, and usability testing. The purpose of this usability test was to collect information about how users interact with the distance learner resources found on the Library’s Web site and to identify areas in which the site can be improved to better benefit its users.

Report overviewThis report contains the following major sections:

Findings and Analysis Report ©2011 Elsevier, Inc. Page 5

Page 8: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Methodology. Our user profiles, usability criteria, test objectives, and testing procedures

Test Results. The data we collected Key Findings. Positive findings, areas in need of improvement, and test

participants’ impressions of the library’s site Recommendations. Recommendations for short-term and long-term usability

fixes and offers suggestions for future development and testing of the site Conclusion. A summary our testing process, findings, and recommendations

The appendices of this report include all materials relevant to our usability test so that subsequent usability tests can be easily created. With these documents, future versions of the library’s Web site can be tested in significantly less time than the initial test. The appendices are labeled for easy access and include the following:

Appendix A: Heuristic Evaluation. Describes the criteria we used in our initial evaluation of the site’s usability and identifies the areas of concern which became the basis for our test plan

Appendix B: Personas Report. Describes the characteristics, goals, and needs of fictional characters that represent typical users of the library’s distance learner resources

Appendix C: Test Plan. Describes the objectives, user types, methodology, and plans for recruiting participants for our tests; identifies evaluation team member roles; and sets a timeline for project deliverables

Appendix D: Participant Recruiting Materials. Includes blank (sample) forms used for recruiting qualified participants as well copies of the completed forms.

o Sample participant screening questionnaire. The screening questionnaire was carefully crafted to ensure participants match the personas.

o Completed participant screening questionnaires (with identifying information obscured) for the nine test participants.

o Sample pre-test questionnaire. The brief final screening questionnaire to verify that the user is suitable before proceeding with the test.

o Completed participant pre-test questionnaires (with identifying information obscured) for the nine test participants.

Appendix E: Testing Materials. Copies of the materials used during the testing process, including:

o Facilitator script. A copy of the script the facilitator followed to orient the participant to the testing process. A script was used to ensure that all participants received the same information.

o Sample participant consent form. Participants were informed of their rights during the testing process as well as asked to sign their agreement to allow videotaping.

o Completed participant consent forms (with identifying information obscured) for the nine test participants.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 6

Page 9: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

o Participant scenarios and tasks packet. This testing packet includes the task instructions for the participants. Each task is followed by a post-task questionnaire, which was used to guide the facilitator’s post-task discussion and debrief with the participant. An undergraduate student test packet and a graduate student test packet were developed to ensure the tasks were tailored to each persona. The testing team also used the packets to record notes and information during the test.

Appendix F: Session Logs. The computer-generated logs of our testing sessions, exported to Excel spreadsheets.

Appendix G: Resources. A literature review of a book on library usability testing, links to sites of interest, and links to other libraries.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 7

Page 10: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Methodology

OverviewThis section of the report provides information on the following:

Typical user profiles for the library’s site Criteria by which we judged the usability of the site Goals for the test Profiles of the actual test participants Test location and equipment Tasks presented to participants during testing Other materials used throughout our testing process

User profilesTo identify prospective test participants who represented the site’s actual users, we worked with project sponsor, Brenton Stewart, to identify primary user groups and create detailed descriptions of those users (called personas). These characteristics, along with other questions about prior experience, behaviors, and other criteria, were the basis for the screening questionnaire (Appendix D) we developed to identify suitable test participants. A brief summary of those characteristics follows:

Findings and Analysis Report ©2011 Elsevier, Inc. Page 8

Page 11: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Undergraduate user, “Jeff” Graduate user, “Catherine”

Age 20, single 35, married

Major Civil Engineering Information Design and Communication

Education High-school graduate, 2006 B.S. in Computer Science, 1993

Work Experience

Part-time jobs during and after school

Full-time technical writer

Residence On-campus (or nearby off-campus apartment)

Off-campus, suburban home (Sandy Springs, GA)

Computer Experience

Advanced enthusiast; frequently online for games, communication with friends (often late-night), and entertainment

Proficient; uses a computer all day at work, but sees them as tools (shopping, e-mail, news, etc.), not “fun;” uses computer for online classes and school research

Library Experience

Used high-school libraries, in-person only; has used Google searches and similar online search tools, but seldom for schoolwork

Used on-site college library resources and early-model library-search tools in the early ‘90s. Used SPSU library site several times, with some difficulty, but is an efficient and quick researcher.

Priorities Learn the SPSU library system for school assignments; keep grades up; save money and avoid frustration

Use GALILEO to quickly locate needed resources, preferably those in online format; avoid long drives; juggle schedule concerns; improve work recognition

Detailed information about these user profiles can be found in Appendix B: Personas Report.

Usability criteriaFrom the perspective of a persona, we individually reviewed the library site’s distance learner resources and completed a heuristic evaluation in which we judged the usability of the library’s site against a specific set of usability principles (see chart below). We compiled the findings of our heuristic evaluation into a report that identified areas of the site with which users were likely to have trouble. Using these criteria, we devised the test objectives in the following section.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 9

Page 12: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Heuristic Description Example

Effectiveness How well the user can perform tasks successfully with the features provided

A “Submit” button on a Web page does not function, so a user cannot submit an online form. The inoperability of the “Submit” button is a usability issue of effectiveness.

Efficiency How quickly and easily the user can find the desired information or results

From a Web site’s homepage, a user must click 10 links before landing on the “Reports” page (which page statistics have identified as the third most frequently visited page on the site). This is a usability issue of efficiency because of the amount of time a user spends looking for information; if a higher-level Web page contained a link to the “Reports” page, then the user would have spent less time looking for the information he or she needed.

Engagement Degree of appeal to the interface, layout, and overall interaction

The user was dissatisfied with the Web page because it consisted of merely paragraph after paragraph of dense text, with not even a single graphic on the page. This is a usability issue of engagement because information on the Web page is not successful in catching the attention of the user.

Error Tolerance Ability to prevent, minimize, and recover from user or system errors

Form data is accidentally deleted from an online form when a user mistakenly clicks on the “Reset Form” button instead of the “Submit” button on a Web page. Omission of the “Reset Form” button from the form page reduces the chance that a user will accidentally lose all of his or her form data, resolving this usability issue of error tolerance.

Ease of Learning How readily a new user can learn the basic concepts and tasks of the site

Because the navigational menu of a Web site is located in a different place on each page of the site, a user is confused as to where each link leads. The site’s inconsistency in page layout (i.e., the different placements of the navigational menu) is a usability issue of ease of learning because it impedes the user’s ability to become familiar with the site’s features.

Detailed information about these heuristics can be found in Appendix A: Heuristic Evaluation.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 10

Page 13: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Test objectivesBased on information obtained from the sponsor and our heuristic evaluation, we assessed the following elements:

1. Terminology. Do users understand the terminology on the library site—such as GALILEO, GIL Express, GIL, etc.?

2. Resource types. Do users understand the differences between resource types as described on the site and how that affects the availability of items (on-campus, online, in other locations, etc.)?

3. Organization. Is the site organization (by resource type) effective and usable for students trying to locate books, research articles, and reference information?

4. Navigation. Is the navigation of the site efficient for a common resource-search task? Are users aware of their current location in the site and how to return to a prior point in the process?

5. Page layout. Are the page layouts confusing or distracting? Are they too similar or too different—too dense, or too sparse?

6. Tutorial usage. Do users take the tutorials when they encounter questions?7. Tutorial effectiveness. Do the tutorials help the user understand how to work

with the relevant parts of the site? 8. Tutorial clarity. Are the tutorials clear and effective in increasing user

knowledge?9. User control. Do users feel engaged and “in control” when using the tutorial?

Will they confuse the tutorial’s interface with the real one?10. Quality of writing. Is the site well-written overall and clear enough for users to

understand the information they see?11. Links and controls. Are hyperlinks and controls always spotted and recognized

as such on each page?Detailed information about these issues can be found in Appendix A: Heuristic Evaluation.

Types of data collectedThe testing process captured both quantitative and qualitative data using the following measures:

Quantitative

Time to complete the requested tasks Number of test participants who complete each task successfully Number of calls to help desk, or uses of tutorials, online help, or other assistance Correct answers to task-specific questions Number of errors and problems encountered, categorized

Qualitative

Findings and Analysis Report ©2011 Elsevier, Inc. Page 11

Page 14: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Pre-test and post-test interview comments Participant comments and reactions during and after performance of tasks Observations of non-verbal cues (such as posture, body movements, facial

expressions, and behaviors)

Screening profiles Including the pilot test, we collected data from nine participants: five graduate students and four undergraduate students. Graduate students: Three female, two male. Ages ranged from 27 to 36. Undergraduate students: Three male, one female. All were in their freshman year.

Ages ranged from 18 to 19.Additional demographic information about the participants (with names removed) is provided in the completed screening questionnaires in Appendix D.

Our test data showed that graduate students were significantly more successful in their task outcomes than the undergraduates, which is not surprising, given their somewhat greater experience with using the site. They did experience frustrations and failures, however, even if they successfully completed a task. Our undergraduate participants were 18 or 19 years old, and had varying levels of expertise and exposure in the online library. User 8 was an outlier among the undergraduates, in that he was a very fast and confident user who revealed at the end that his mother is a librarian. Even still, he had some hesitations and failures, and expressed frustrations.

User 7, an undergraduate, had significant language challenges. These challenges did not disqualify her from testing, and face-to-face, her language skills were reasonably good, and her computer knowledge was exemplary – it was the terminology on the site that gave her problems. As an SPSU student who (most likely) is representative of the 5.6 percent of students who are non-resident aliens, she represents a significant challenge to the SPSU library staff: how can students who are not native English-speakers be accommodated? Although not all non-resident aliens have language challenges (and some resident aliens or citizens may have language challenges), this need bears study.

Testing environment and equipmentTesting was conducted at Southern Polytechnic’s Usability Center, located in the Atrium building on the SPSU campus. This facility includes an evaluation room (where the test participants completed the tasks), a control room (where we observed and recorded each test), and the executive viewing room (where additional observers can view test participants during testing). Following is a summary of the equipment used during testing:

Test participant workstation in the evaluation room: SPSU-standard PC configuration (Windows XP, campus network connection), keyboard, mouse, and monitor

Findings and Analysis Report ©2011 Elsevier, Inc. Page 12

Page 15: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Test participant desk, chair, and telephone (for “help desk” support) Video cameras and microphones positioned throughout the evaluation room One-way mirror between the evaluation room and the control room Monitors and image-mixing board in the control room Control-room telephone for “help desk” calls from the test participant Additional PCs and equipment in the control room for audiovisual recording,

logging, screen sharing, etc.

Test scenarios and tasksTest participants were not permitted to use any resources that were not on the site (i.e., Google). This was not stated in the facilitation interview so as not to introduce the idea. If test participants were to have opened a search engine on their own, then they would have been told that it was not allowed and the attempt would have been noted.

All participants were given a stapled packet that contained the instructions for each task, the tasks, and the post-task questionnaire. Because each task instructed participants to begin from the library’s homepage, we set browser bookmarks on the evaluation room computer so that the facilitator could quickly and easily open the library’s homepage for the participant upon the start of each new task.

Based upon our profiles of the two primary user groups—graduate and undergraduate students—we worded the task narratives slightly differently, but the underlying goals for both the graduate and undergraduate tasks (and therefore, the Web site actions to complete the tasks) are identical. These tasks (see Appendix E) are “routine” (typical tasks any student would need to accomplish) and so were presented to participants in the order in which an actual user was likely to perform them. The testing sequence for each user profile (graduate and undergraduate) is as follows:

Overview/BriefingThe test facilitator welcomed the test participant and explained the equipment setup, testing process, “think-aloud” protocol, recording waiver, phone-support option (i.e., “call for help or if stuck”), etc. The facilitator then worked with test participant to complete pre-test questionnaire (Appendix D). After the completed pre-test questionnaire was reviewed, the facilitator left the evaluation room to allow the test participant to begin testing.

Scenario 1: Initial SearchesRelevant test objectives:

Findings and Analysis Report ©2011 Elsevier, Inc. Page 13

Page 16: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

X 1. Terminology X 5. Page layout 9. User control

X 2. Resource types X 6. Tutorial use X 10. Writing quality

X 3. Organization 7. Tutorial effectiveness X 11. Links/controls

X 4. Navigation 8. Tutorial clarity

Task #1: Finding a book in the SPSU library. Participant was to locate a specific book in the library and then answer questions about its availability. The underlying goals of the test participant for this task were as follows:

Find the GIL link Recognize GIL as being the library’s catalog Search for the book using GIL Determine whether a copy of the book is available for checkout from the library

Task #2: Finding an article in Course Reserves. Participant was to locate a specific resource in the course reserves and then answer questions about its publication information. The underlying goals of the test participant for this task were as follows:

Find the course reserves link Find a specific article Retrieve the course reserves password so that the article could be opened

Task #3: Researching a topic. Each test participant was to locate two current professional journal articles related to a given topic. The underlying goals of the test participant for this task were as follows:

Find a database Find an article (with publication date and article type limiters) using this

database

Scenario 2: Tutorial and Follow-up SearchesRelevant test objectives:

1. Terminology 5. Page layout X 9. User control

2. Resource types 6. Tutorial use X 10. Writing quality

X 3. Organization X 7. Tutorial effectiveness 11. Links/controls

X 4. Navigation X 8. Tutorial clarity

Task #4: Tutorial. The facilitator instructed test participants to click the browser bookmark for the GALILEO Quick Search tutorial (located at http://home.comcast.net/~brentonstewart/INDEX.HTML) to begin this scenario.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 14

Page 17: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Participants were to watch the tutorial and provide comments about the experience. The underlying goals for this task were as follows:

Test participant critiques animated tutorial (design, effectiveness, purpose, etc.) The team gauges participant reaction to pace, content, design

Task #5: Post-tutorial task—finding another specific article. After viewing the tutorial, the participant was instructed to use GALILEO to locate a specific article and then determine whether the full text of the article could be displayed online. The article’s title, year of publication, and topic were provided along with the name of the publication in which the article appeared. The underlying goals of the test participant for this task were as follows:

Find GALILEO link Recognize GALILEO as being the appropriate resource to use Search for a specific article (using publication date and journal name limiters) Determine if article’s full text can be displayed

Task #6: Finding a resource at another library. Participant was to search for a specific resource, determine where copies of it could be found, and reserve a copy of it. The underlying goals of the test participant for this task were as follows:

Find GALILEO or GIL link Recognize GALILEO/GIL as being the appropriate resource to use Search for a title Identify where the title is held Reserve a copy of the title through Interlibrary Loan

The actual task lists we gave to participants can be found in Appendix E: Participants’ Scenarios and Tasks.

Post-task interview with facilitator

The facilitator used the post-task questionnaire (see Appendix E: Post-Task Questionnaire) to guide the post-task interview. Participant comments were noted and used for qualitative data. These user comments will be seen in parts of the Test Results, Key Findings, and Recommendations sections.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 15

Page 18: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Supplemental test materialsThe table below provides information on materials used in different phases of our testing:

Questionnaire title Purpose When given

Screening Questionnaire Used to identify test volunteers who matched criteria for testing (i.e., those who fit the user profile)

During recruitment phase, before start of testing

Pre-test Questionnaire Used to obtain more detailed information on participant’s Web usage and library site usage so we could determine whether the participant was still a good candidate for the test

Prior to start of testing session

Post-task Questionnaire Used to gauge participant’s level of ease at completing task and perception of level of difficulty of the task

After the end of a task

Post-test Questionnaire Used as a guide by facilitator to solicit comments, overall impressions, suggestions, and other feedback from participant

After testing session has ended

Findings and Analysis Report ©2011 Elsevier, Inc. Page 16

Page 19: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Test Results

Summary of quantitative dataThe table below illustrates the average time participants spent on each task before completing the task successfully, completing part of the task successfully, or ending attempts at the task at the request of the facilitator (due to time constraints). If, as a result of being confused or not knowing where to go or what to do next, a participant returned to the library’s homepage to “start over,” the action was counted as one attempt at completing the task. If a participant indicated verbally, at any point in the site, that he or she needed or wanted to start the task over, this was also counted as another attempt. The repetition of taking a certain “path” through the site (with no favorable outcomes of each attempt) was also considered to be another attempt at the task.

TaskAverage duration

Success/partial

success/failure

Average number of attempts

#1 Book search in the SPSU library 5 min., 4 sec. 9 / 0 / 0 2

#2 Article search in Course Reserves (finding article & retrieving password)

11 min., 3 sec. 5 / 2 / 2 4 (article)4 (password)

#3 Topic search in periodicals 6 min., 28 sec. 4 / 2 / 3 5

#5 Post-tutorial article search 4 min., 19 sec. 8 / – / – 2

#6 Search for resources at another library (finding a book not in SPSU library & reserving it through Interlibrary Loan)

5 min., 44 sec. 2 / 3 / – 5

The following table categorizes task duration and attempts by user type—graduate and undergraduate:

Findings and Analysis Report ©2011 Elsevier, Inc. Page 17

Page 20: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Task

Graduate Undergraduate

Duration# of

attempts

Duration# of

attempts

#1 Book search in the SPSU library

6 min., 10 sec.

3 4 min., 35 sec.

2

#2 Article search in Course Reserves (finding article & retrieving password)

12 min., 41 sec.

5+3 10 min., 5 sec.

3+5

#3 Topic search in periodicals 7 min., 24 sec.

6 5 min., 55 sec.

4

#4 Tutorial N/A N/A N/A N/A

#5 Post-tutorial article search 4 min., 15 sec.

1 4 min., 22 sec.

3

#6 Search for resources at another library (finding a book not in SPSU library & reserving it through Interlibrary Loan)

6 min., 16 sec.

4 5 min., 11 sec.

6

Results by categoryWe used a “bottom-up” approach to categorize the results of our testing—we determined which identified usability issues should be matched together and then what each group of matched items should be called. The following categories represent the areas in which test participants had usability issues:

Mental Model Terminology Password Tutorial Layout User Support

Findings and Analysis Report ©2011 Elsevier, Inc. Page 18

Page 21: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Mental Model

The user’s mental model encompasses all categories, in that it relates to his ideas about What do I want to do?, Where do I start?, How do I do this?, What is this? The user’s mental model refers to the difference between actual site structure, content, and functionality and what the participants expected or assumed.

Mental Model

Users experiencing problem0 1 2 3 4 5 6 7 8

Where do I search for a book? x x x x x *Where is Course Reserves? x x x x x * xWhere do I search for articles? x x x x * xWhere do I go if I’m doing research on a topic (need books and articles)?

x x x x *

Misunderstood function/connection of GIL and GALILEO

x x x x *

Expressed confusion between what would be in “Resources” and “Services”

x x x * x

Expressed confusion between function of GIL, GALILEO, and Reference databases

x x x x x *

Didn’t have a clear picture of where they’d go with a click

x x *

Didn’t know what “Distance Learners” page was for when it came up

- - - - x x - * -

Wanted password warning/help x x x x **Note: User 7 had language difficulties and was unable to successfully complete any of the tasks, despite use of a pocket translating device.

In our tests, the mismatch between what the user expects and what the site contains can be seen in issues over Where to find a book Where to find Course Reserves How and where to search for articles, journals, and topicsIn these tasks, five or six out of the eight participants experienced problems.

Another issue participants struggled with in regard to the mental model of the site was in what they perceived the features meant or what the feature would do. There was considerable confusion over the differences between GIL and GALILEO

(four of eight users). Four of the eight did not understand what the difference between “Resources” and

“Services” was, and completed the task by trial and error.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 19

Page 22: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Two of the eight users dismissed the “Distance Learners” page, remarking that it wasn’t what they wanted, without reading the information

Four of the eight users would have liked a “forgot password?” option, like they frequently find on most other Web sites.

Terminology

Do terms used match what the user thinks/wants? Usability issues in the terminology category include unexpected or missing keywords, vague or ambiguous language, or unnecessary jargon.

TerminologyUsers experiencing problem

0 1 2 3 4 5 6 7 8Misunderstood function/connection of GIL and GALILEO

x x x x *

Confused by multiple meanings of “reserves” x x *Confused by meaning of “universal” x * xConfused by meaning of “database” x *Can’t find Course Reserves x x *“Resources” page confusing x x *Other word confusions: availability; call number

x x

GIL versus GALILEO: four of the eight participants were not confident they knew what these terms meant.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 20

Page 23: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

“Database”—this term has double meaning: computer database or reference database?

“Request”—this term has double meaning: request book or request password?

Other ambiguities in terminology: Reserves (triple meaning): reserve book, course reserve, electronic reserve Universal (Catalog): what is in the UC and why is it universal? Reference Services: is it service or help? Misunderstanding of language/terminology: SPSU has a relatively large number of

students who are non-native English-speakers. The percentage of resident aliens is 5.6%, a large number of students who may be effectively denied access to the library’s resources because of language difficulties.

Layout

Findings and Analysis Report ©2011 Elsevier, Inc. Page 21

Request what?

What is the difference between these two databases?

Page 24: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

The layout category generally concerns visual matters such as density of text and amount of white space on a page; size, color, and placement of page elements (such as links and buttons); and visual hierarchy of text and images.

Page LayoutUser experiencing problem

0 1 2 3 4 5 6 7 8Too much text in help xPoor layout – cramped, hard to read

x x x x

Good layout x x

Many links were not obvious

Four of the eight users commented that the links were hard to read because the layout was cramped

Findings and Analysis Report ©2011 Elsevier, Inc. Page 22

Link doesn’t stand out

Page 25: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Two participants commented that there was a “first link advantage” in their trial and error – “library catalog (GIL)” was the first link, and seemed like an obvious place to start to look for a book in the first task, even if the words “library catalog (GIL)” isn’t intuitively indicative of a book search.

User 0 had very harsh comments about the quantity of text in the “Services” pages, saying, “I don’t want to read a book!” Other users merely clicked out of the pages as soon as they came up, choosing not to read or comment on them.

Password

The password category refers to the password retrieval system and process used for accessing

Findings and Analysis Report ©2011 Elsevier, Inc. Page 23

Many crowded links; hard to read; link status is not intuitive, i.e., by color, etc.

From the “Services” page, the pages are very text-dense, with no links to take users to what they want to do. Is this Services or Help?

Page 26: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

certain databases remotely from the library’s Web site and for opening materials downloaded from the Course Reserves. The password demand that appears when the user attempts to download a PDF from Course Reserves caused the biggest frustration for the participants.

Password for Course Reserve article

User experiencing problem0 1 2 3 4 5 6 7 8

Knew password x x xRetrieved password easilyRetrieved password with difficulty

x x

Failed to retrieve password x x x xWanted password warning/help

x x x x

Although three of the nine participants knew the password, only two (graduate students) knew it readily, saying they had used it recently. The third remembered after much cursing.

Four of the nine volunteered the suggestion that there should be a warning that a password would be required, or some type of help for finding it. They suggested a password prompt like those found on many sites, such as “forgot password?” , suggesting that this is also a departure from the mental model that most users have of how to deal with sites that ask for passwords.

Tutorial

Comments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed into a single tutorial category. The grid below shows that the participants had multiple issues with the tutorial.

TutorialUser experiencing problem

0 1 2 3 4 5 6 7 8Too fast x x x x x xAnimation distracting x x x xUser control (start/stop) not explicit

x x x x

Wanted volume control x x xCouldn’t see everything x x xWas helpful x x -Was not helpful x x x x x -

Six of the eight expressing opinions said it went too fast

Findings and Analysis Report ©2011 Elsevier, Inc. Page 24

Page 27: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Four of the eight found the animation distracting

Three noted that parts of the screen were off the screen or obscured Four wanted more control over start/stop and wanted volume control Five of seven who had an opinion did not find it helpful Despite the negative comments, two participants did find it helpful, and it is likely

that some fixes to the distractions noted above would significantly increase the usefulness.

User Support

User support includes any on-screen instructions, headings, link or page titles, link mouseovers, error or warning messages, and other information that might help users

Findings and Analysis Report ©2011 Elsevier, Inc. Page 25

Page 28: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

succeed in their tasks. In most cases, careful design and judicious use of color, along with intuitive placement, logical proximity, and correct sizing can add tremendously to the support users gain on a site.

User SupportUser experiencing problem

0 1 2 3 4 5 6 7 8Title and TOC pages confusing x x xGIL search box helpful xWanted explicit on-screen descriptions

x x x x x x x

Wanted password warning/help

x x x x

Wanted help: site map, search, etc.

x x x x x

GALILEO: is this article really the full text?

x x x

How do you reserve a book in another library? (Interlibrary Loan)

- - x x - - - - x

SPSU book search gave incorrect match (grad task)

- x x x - - -

Three of the nine participants found the Title page and the Table of Contents pages confusing

One user found the GIL search box helpful, but noted this was only “after you find it.”

Seven of the nine expressed a desire for more explicit on-screen descriptions. This includes clearing up terminology confusions, such as “books in the SPSU library” rather than “library catalog (GIL).”

Four of the nine wanted some help or warning that a password would be required. The password prompt that gave no redirection, help, or explanations was a source of great frustration.

Of the three participants who got to the end of the tasks and tried to reserve the book by Interlibrary Loan, two gave up and one succeeded with difficulty (user 8). All expressed that help at the point of need would have been helpful.

Three of the five graduate users who searched for the book not in the SPSU library (in the Universal Catalog) received an incorrect match on their exact search terms.

Findings and Analysis Report ©2011 Elsevier, Inc. Page 26

Page 29: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Key Findings

Positive findingsDespite the problems encountered, participants had positive feedback on many aspects of the experience: Ease of use for GIL and GALILEO—Graduate students, in particular, commented

that the GIL and GALILEO sites were easy and satisfying to work with. In particular, the Search feature was pointed out as a very helpful tool that matches well with their other online experiences and expectations. Comments from graduate students ran along the lines of “Once I found GALILEO/ knew where to start my search, it was easy to find a specific resource.” Frequency: 6 participants.

Tutorial content—Although there were negative comments about the speed, size, and other technical aspects of the tutorial, feedback on the content presented was neutral to positive. Some participants noted that the tutorial explained aspects of GALILEO that they hadn’t known about, and even negative comments about the content suggested expanding its scope rather than criticizing what was already present. Frequency: 3 participants.

Placement of key links—Several users commented on the importance of putting the most common and important features (GALILEO search, GIL) first in the page structure and layout. Likewise, putting the GIL Universal link (for searching other libraries if the SPSU library doesn’t have a resource) in close proximity was described as helpful. This is an effective aspect of the site that should be preserved through any future design changes. Frequency: 3 participants.

Explanatory text for links and buttons—In those areas of the site where navigation links and buttons are accompanied by explanatory text, the feedback was positive about the usefulness of this information. We suggest keeping and expanding on this practice. Frequency: 2 participants.

Specific comments and task circumstances surrounding this feedback are included in the test session logs (attached as Appendix F).

Severity ratingsAfter our initial findings were compiled, we assigned to each issue a severity rating which determined the seriousness of the usability problem. The four levels of severity are described below:

Level 1—Prevented completion of a task Level 2—Frustrated participant and caused significant delay for a task Level 3—Had a minor effect on usability Level 4—Caused no significant impact on performance, but participant indicated

a preference or a suggestion for future changes

Findings and Analysis Report ©2011 Elsevier, Inc. Page 27

Page 30: Memo - Elsevierbooksite.elsevier.com/.../docs/02_Library_Usability_Grou…  · Web viewComments or suggestions related to the sponsor’s GALILEO Quick Search tutorial were placed

Collated findings, rated by severityTaking into consideration our test objectives, quantitative data from testing, and participant feedback, we identified 24 significant usability issues with the library site. The following table lists these issues and indicates their impact on the usability of the site, along with the number of participants who experienced each problem.

Usability problem | # Affected | Severity
Mental Model: Where is Course Reserves? | 7 | 1
User Support: Wanted explicit on-screen descriptions | 7 | 2
Mental Model: Where do I search for a book? | 6 | 2
Mental Model: Where do I search for articles? | 6 | 1
Mental Model: Expressed confusion between function of GIL, GALILEO, and Reference databases | 6 | 2
Tutorial: Too fast | 6 | 2
Mental Model: Where do I go if I’m doing research on a topic (need books and articles)? | 5 | 2
Mental Model: Misunderstood function/connection of GIL and GALILEO | 5 | 2
Mental Model: Expressed confusion between what would be in “Resources” and “Services” | 5 | 3
Mental Model: Wanted password warning/help | 5 | 2
Tutorial: Was not helpful | 5 | 3
User Support: Wanted help: site map, search, etc. | 5 | 4
Layout: Poor layout – cramped, hard to read | 4 | 3
Password: Failed to retrieve password | 4 | 1
Tutorial: Animation distracting | 4 | 3
Tutorial: User control (start/stop) not explicit | 4 | 3
Mental Model: Didn’t have a clear picture of where they’d go with a click | 3 | 2
Terminology: Confused by multiple meanings of “reserves” | 3 | 2
Terminology: Confused by meaning of “universal” | 3 | 3
Terminology: “Resources” page confusing | 3 | 2
Tutorial: Wanted volume control | 3 | 4
Tutorial: Couldn’t see everything | 3 | 3
User Support: Title and TOC pages confusing | 3 | 2
User Support: GALILEO: is this article really the full text? | 3 | 3
User Support: How do you reserve a book in another library? (Interlibrary Loan) | 3 | 2
User Support: SPSU book search gave incorrect match (grad task) | 3 | 3

We expected the testing to show a great deal of overlap in results on most tasks. Because user research revealed that all of the graduate students questioned had used the SPSU site previously and all had used the GALILEO database feature (as opposed to none of the undergraduates), there may be differences between the user groups on that scenario. For screening purposes, we asked all students about their prior use of GALILEO research databases in order to match their usage history with the user profiles.

User impressions

In addition to task outcomes and feature-specific positive and negative feedback, participants also offered some general comments about their experiences and preferences. These comments are included in the test logs, but a representative sampling follows.

Overall comments

• “I want the library to recognize me and show me options just for me.” —User 0
• “I don’t know how to look for articles.” —User 1, researching a report topic
• “I still have some problems with the site, after four years.” —User 2
• “I need to feel like there's been work done to understand the complexity of what takes place at the location. A left-nav menu, like on SPSU's home page, is helpful.” —User 3
• “I didn’t really know where the choices would take me; it was trial and error.” —User 4
• “I often do that—I blame myself if I can’t find something.” —User 5, on navigating the site
• “All the things are easy to use—it’s getting there that’s hard.” —User 5
• “You shouldn’t have to spend half your time figuring out the site.” —User 5
• “I usually try the first link on the page if I don’t know where to go.” —User 8
• “It doesn’t make sense to have passwords anyway; university libraries are supposed to be open.” —User 8
• “The only reason I know what this [EBSCO] is, is because my mother’s a librarian.” —User 8

Positive impressions

• “I love this page because I know exactly what I’m looking for.” —User 0, searching for a book on the Galileo Search tab
• “…and that’s it! That’s really cool!” —User 2, locating a book with GALILEO
• “Once I got to a Search box, I knew what to do.” —User 5
• “This task was the easiest of all so far… I wouldn’t have started here without [the tutorial].” —User 6, finding a book in GALILEO after viewing the tutorial

Negative impressions

• “This is ridiculous!” —User 0, searching for a Course Reserves link
• “I feel so dumb!” —User 0, attempting to open a password-protected article
• “Hardest part? Where it said GIL, or GALILEO, or Reference Databases… if I weren’t used to coming onto this site, I wouldn’t know what they mean.” —User 1
• “I didn’t know where to go, where to start.” —User 4, on looking for a password
• “Dumb it down for me.” —User 5, on locating books vs. periodicals

Suggestions

• “Why can’t I have a Search button?” —User 0, looking for links to research a topic
• “A universal catalog should do a universal search on all resources.” —User 3
• “The Search function is most helpful; it works every time.” —User 4
• “If it said ‘Reserves’ on the Resources page, that would make it easier for me.” —User 5, locating course reserves
• “They should have a standard ‘Forget your password?’ option if you don’t know it.” —User 5
• “The interface needs some rearranging… I’d like some bifurcation of the links, side-by-side title and explanation.” —User 6
• “There should be some kind of help for the password… I’d like to know in advance that I’ll need one.” —User 6
• “Better subcategories would help.” —User 8, on navigating the site links


Recommendations

Based on the findings of our usability test, we recommend the following short-term and long-term changes to help improve the library site’s usability. We also offer suggestions for future development and usability testing of the site.

Short-term Changes

Navigation Links (mental model, terminology, user support)

Rename links and include short descriptions to give users a better understanding of where each link will take them. Be descriptive but concise! Users need a clear indication of where to start looking on the library’s site to find information or to do research, and they want to know how each page and section will help them accomplish their tasks. Use task-oriented language in your links whenever possible; a brief implementation sketch follows the examples below.

Example 1

Search for Books at SPSU (GIL) or Search for Books at other libraries (GIL Express)

Search for Articles (GALILEO) Note: password required

Search for Course Reserves Note: password required

Example 2
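To show how the renamed links and their short descriptions might be kept consistent across pages, the following is a minimal TypeScript sketch we offer for illustration only. Only the link wording comes from Example 1 above; the interface name, URLs, and CSS class are hypothetical placeholders, not references to the current site’s code.

```typescript
// Hypothetical sketch: task-oriented navigation links with short descriptions.
// Only the link wording comes from Example 1; URLs and class names are placeholders.
interface NavLink {
  label: string;   // task-oriented wording ("Search for...", not a system name alone)
  href: string;    // placeholder path
  note?: string;   // short explanatory text shown next to the link
}

const primaryLinks: NavLink[] = [
  { label: "Search for Books at SPSU (GIL)", href: "/gil" },
  { label: "Search for Books at other libraries (GIL Express)", href: "/gil-express" },
  { label: "Search for Articles (GALILEO)", href: "/galileo", note: "password required" },
  { label: "Search for Course Reserves", href: "/course-reserves", note: "password required" },
];

// Render the links, with their notes, into a container element on the page.
function renderNav(container: HTMLElement): void {
  const list = document.createElement("ul");
  for (const link of primaryLinks) {
    const item = document.createElement("li");
    const anchor = document.createElement("a");
    anchor.href = link.href;
    anchor.textContent = link.label;
    item.appendChild(anchor);
    if (link.note) {
      const note = document.createElement("span");
      note.className = "nav-note";
      note.textContent = ` (Note: ${link.note})`;
      item.appendChild(note);
    }
    list.appendChild(item);
  }
  container.appendChild(list);
}
```

Keeping the labels and notes in one place like this helps ensure the same task-oriented wording appears on the home page, the Resources page, and anywhere else the links are repeated.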

Supplemental Links (user support, password)

Provide links to GIL and GALILEO everywhere these systems are referenced: on the Circulation Services page, the GIL Express help page, and so on. Also provide a link to password help in the context where the password prompt occurs, if not in the prompt itself (see the sketch below).
Example
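As one way to surface password help exactly where the prompt appears, the short TypeScript sketch below attaches a help link to each password prompt on a page. The help URL and the .password-prompt selector are assumptions made for this illustration, not elements of the existing site.

```typescript
// Hypothetical sketch: show a password-help link wherever a password prompt occurs.
// The help URL and the ".password-prompt" selector are placeholder assumptions.
const PASSWORD_HELP_URL = "/library/password-help";

function addPasswordHelp(promptContainer: HTMLElement): void {
  const help = document.createElement("a");
  help.href = PASSWORD_HELP_URL;
  help.className = "password-help";
  help.textContent = "Need help with your library password?";
  promptContainer.appendChild(help);
}

// Attach the help link to every password prompt found on the page.
document.querySelectorAll<HTMLElement>(".password-prompt").forEach(addPasswordHelp);
```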


Text Reduction (layout, user support)

Remove or relocate links and text that do not directly support the most common user tasks. Present only the information needed to support the current task.

Examples
― Delete “Technical Display” on GIL entries if users don’t need it.
― Remove links to departmental pages from the Subject Database pages.
― Demote Library Services to a subsection within Library Information.

Clarified Wording (mental model, terminology)

Distinguish between “library reserves” and “course reserves,” define “databases,” and so on.

Examples
― Include a reference and link to Course Reserves on the Library Reserves information page.
― Change the link on the Services page from “Reserves” to “Reserve a Book or Journal.”
― Rearrange GIL entry information to make the number of copies, author, and call number more visible.
― Instead of “databases,” call the section Online Resources by Subject Area.
― Include a short description of what “universal” means for GIL Universal (Search All Georgia Libraries).

Prioritized Layout (mental model, layout)

Give greater visual emphasis to the most common and important features; de-emphasize the rest.

Examples
― If EBSCO Host is the main database-search tool with the broadest use, put it at the top of the Subject Databases section and at the top of all subject-specific pages as well.
― Provide a larger, bolder first-position link to the GALILEO search on the home page and the Resources page.

Upgraded Tutorial (tutorial, user support)


Update as follows:
― Indicate at the start how long the tutorial will last.
― Slow down the effects and switch to manual (user-controlled) advancement from one slide to the next (see the sketch after this list).
― Revise the graphic size so the control bar doesn’t hide window features.
― Delete the letter animation at the beginning and end.
― Revise content to clarify which tool (GIL, GALILEO, etc.) is appropriate to which task.
― Consider adding a closing slide at the end saying “End of tutorial. Close this window.”
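To clarify what manual (user-controlled) advancement could look like, here is a minimal TypeScript sketch. It assumes the slides are elements with a "slide" class and that the control bar has Next and Back buttons with the IDs shown; those details are illustrative assumptions, not taken from the current tutorial.

```typescript
// Hypothetical sketch: manual slide advancement for the tutorial (no timers).
// Assumes slides carry the class "slide" and the control bar has these button IDs.
const slides = Array.from(document.querySelectorAll<HTMLElement>(".slide"));
let current = 0;

function showSlide(index: number): void {
  slides.forEach((slide, i) => {
    slide.style.display = i === index ? "block" : "none";
  });
}

document.getElementById("next-button")?.addEventListener("click", () => {
  if (current < slides.length - 1) {
    current += 1;
    showSlide(current);
  }
});

document.getElementById("back-button")?.addEventListener("click", () => {
  if (current > 0) {
    current -= 1;
    showSlide(current);
  }
});

// The tutorial advances only when the user clicks, addressing the "too fast" feedback.
showSlide(current);
```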

With comparatively little time and effort, the short-term changes noted above will address the most significant usability issues we identified on the library Web site.

For a more thorough solution as time and resources permit, we also suggest the following long-term changes.

Long-term Changes

Site Structure and Navigation (mental model, user support)

Organize site content by user task rather than by resource category.

Examples
― Make the current Resources page the home page (merge with the Distance Learning page for descriptions); add a link to Library Information for hours, directions, services, etc.
― Make Course Reserves a separate section outside of GIL; if this is not possible, “promote” the Course Reserves link to co-equal status on the home page, alongside GIL and GALILEO.
― If possible, put the GALILEO Search box at the very start (the home page) instead of just linking to it, and label it “Search the SPSU Library.”

Integrated User Support (user support, layout)

Insert labels and tips as mouseovers on graphics, buttons, and links. If any long pages remain, include mid-page jumps and “return to top” links (see the sketch below).
Example
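The sketch below illustrates two of these ideas in TypeScript: turning short descriptions into native mouseover tips and adding a “return to top” link to long pages. The data-tip attribute and the two-screen length threshold are assumptions for the example, not features of the current site.

```typescript
// Hypothetical sketch: mouseover tips and a "return to top" link for long pages.
// The data-tip attribute and the length threshold are placeholder assumptions.

// 1. Turn data-tip attributes into native browser tooltips (the title attribute).
document.querySelectorAll<HTMLElement>("[data-tip]").forEach((el) => {
  el.title = el.dataset.tip ?? "";
});

// 2. Append a "return to top" link to pages longer than roughly two screens.
if (document.body.scrollHeight > window.innerHeight * 2) {
  const topLink = document.createElement("a");
  topLink.href = "#top";  // assumes an anchor with id="top" near the page header
  topLink.textContent = "Return to top";
  topLink.className = "return-to-top";
  document.body.appendChild(topLink);
}
```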

Edited Content (user support, layout, help)

Rewrite and edit all site text for brevity and clarity for all users.


Examples
― Shorten the content of the following pages: Services, Borrowing, Circulation, and Reserves.
― Interview international students to learn what terms and keywords they look for; make sure those terms are present on the site.
― Ensure consistency of links, fonts, and colors throughout all pages.
― Take help content out of PDF format and incorporate it into the site’s HTML text.

Tutorial (mental model, tutorial, user support)

Revise as follows:
― Redesign content around the most common user-identified tasks to support the new site structure.
― Add narration.
― Consider adding instructions on how to navigate the tutorial (e.g., “Using the Control Bar”).

Inter-library Loan (layout, user support)

If this area is under SPSU control, redesign the entry pages to make the “Reserve this Book” function clearer.

Support for Non-native English-speaking Students (terminology, user support)

SPSU’s small but growing group of international students represents an underserved constituency. As SPSU expands into more distance learning, partnerships with foreign universities, and exchange-student arrangements, this is an area that needs to be addressed.

Future development and testing

When the SPSU Library site undergoes significant changes, updates, or a redesign, additional usability testing will be essential to its success. Consider conducting interviews with student and faculty users of all types, especially international students; user feedback is often the best guide. The following books and journal articles might be helpful to review before beginning any new development on the library’s site:

• Usability Testing for Library Websites by Elaina Norlin
• “Web Usability Testing in a Polytechnic Library” by Debby Wegener, et al. (from The Australian Library Journal)
• Web Site Design With the Patron in Mind: A Step-by-Step Guide for Libraries by Susanna Davidsen and Everyl Yankee (Chicago: American Library Association, 2004)

For even more ideas, examine other university library Web sites. The University of Georgia (http://www.libs.uga.edu/), Virginia Polytechnic Institute and State University (http://www.lib.vt.edu/), and Georgia Southern University (http://library.georgiasouthern.edu/) are just a few library sites that keep their users in mind by using task-oriented language.

Most importantly, consider testing any site redesigns at the SPSU Usability Center, making use of the test materials (user personas, screening questionnaires, task lists, etc.) already employed in this project and included as attachments to this report. The Usability Center here on campus is a nationally recognized, state-of-the-art facility with qualified personnel to test the usability of subsequent versions of the library’s site.

The SPSU Usability Center can also help validate and quantify the improvement gained from the short-term changes suggested in this report, which in turn can help justify the more detailed long-term work the library staff may want to initiate in the future.


Conclusion

With a desire to help our fellow students make the best use of the SPSU Library, our team undertook an ambitious usability test using SPSU’s state-of-the-art Usability Center and the instructional expertise of its director and our professor, Dr. Carol Barnum. Beginning with a thorough heuristic evaluation of the library Web site, we identified several areas that warranted further study and developed a test plan. In addition, we tested an animated tutorial for usability and effectiveness.

Pursuing a “discount usability” approach that makes use of a few carefully selected and screened site users to collect the most reliable information, we worked with five graduate and four undergraduate students whose profiles matched the defined personas. We recorded their performance and responses to tasks and questions and found multiple areas of overlapping data and findings that we believe are significant.

Findings

Analysis of the logs and recordings resulted in categorization of the findings into six areas:

• Mental Model
• Terminology
• Password
• Tutorial
• Layout
• User Support

Comments

Based on our results and findings, we believe that this short-term discount usability process was effective in evaluating and addressing many of the needs of the SPSU Library site. In addition, we gathered valuable feedback on the animated tutorial that we believe will be helpful for creating additional online learning tools.

As a result of the team’s analysis of the findings outlined in this report, we have generated recommendations for short-term and long-term site improvements to address these issues. Many of these recommendations are easy to implement, and we believe they will result in dramatic improvements in usability. Our longer-term recommendations bear consideration as the library and the university look ahead to greater investment in distance learning. We also recommend additional usability testing to assist in future development of the site, along with some resources that may be helpful in a site redesign. We strongly recommend repeating usability tests prior to rollout, as part of an iterative design cycle, to ensure the success of the changes. We feel sure that the library is on the threshold of an advancement that mirrors the great changes occurring all around SPSU, and we wish the library staff great success.


Appendices


Appendix A: Heuristic Evaluation


Appendix B: Personas Report


Appendix C: Test Plan


Appendix D: Participant Materials


Appendix E: Testing Materials


Appendix F: Session Logs


Appendix G: Resources

Barnum, Carol M. (2001). Usability testing and research. New York: Longman Publishing.

Davidsen, S., & Yankee, E. (2004). Web site design with the patron in mind: a step-by-step guide for libraries. Chicago: American Library Association.

Norlin, E., & Winters, C.M. (2001). Usability testing for library web sites: a hands-on guide. Chicago: American Library Association.
