EVIDENCE BASED LIBRARIANSHIP IN PRACTICE: USING EVIDENCE IN HEALTH SCIENCES LIBRARIES
Lorie Kloda, MLIS, PhD, AHIP
McGill University
Central New York Library Resources Council, Syracuse, March 2014
Introductions
Lorie Kloda
Assessment Librarian since 2012
Health Sciences Librarian for 12 years
Montreal, McGill University
Associate Editor, EBLIP journal
Introductions
1. Your name
2. Your title/position
3. Your city, institution
4. What is your interest in evidence based practice? Why are you here today?
WHAT WE WILL COVER TODAY
Course objectives
• Identify the steps in evidence based practice
• Formulate answerable questions relevant to your own work setting
• Define what constitutes evidence in your own work setting
• Identify strategies for locating local or external evidence to answer your questions
• Make use of tools for critically appraising published research
• Provide examples of how evidence can be applied by health librarians in the real world
ACTIVITY 1: What are your "burning" questions?
THE EBLIP PROCESS
What is EBLIP?
“an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements.”
(Booth, 2000)
Why should you care?
“Wisdom means acting with knowledge while doubting what you know.”
Jeffrey Pfeffer & Robert I. Sutton
A brief history
1997 Hypothesis article by Jon Eldredge
2000 MLA Research Section created an Evidence-Based Librarianship Implementation Committee
2000 Eldredge publishes papers that provide the framework for EBL
2001 1st Evidence Based Librarianship conference held in Sheffield, UK
2004 Booth and Brice book on EBIP
2006 EBLIP journal launches
The 5 As of EBLIP
1) Formulate a focused question (Ask)
2) Find the best evidence to help answer that question (Acquire)
3) Critically appraise what you have found to ensure the quality of the evidence (Appraise)
4) Apply what you have learned to your practice (Apply)
5) Evaluate your performance (Assess)
5 As process
Hayward, 2007, http://www.cche.net/info.asp
Is the EBLIP model used?
• The ideal vs reality
• Criticisms of EBLIP
• Barriers to practicing in an evidence based manner
Barriers to evidence use
• Organizational dynamics
• Lack of time/competing demands on time
• Personal outlook / lack of confidence
• Education and training gaps
• Information needs not being met
• Financial limits
Determinants by level of control
Other considerations
• individual vs group decision making
• influences / biases
• impact of work environment
• types of evidence
• enablers
Widening the model
A revised process:
Articulate – come to an understanding of the problem and articulate it.
Assemble – assemble evidence from multiple sources that are most appropriate to the problem at hand.
Assess – place the evidence against all components of the wider overarching problem. Assess the evidence for its quantity and quality.
Agree – determine the best way forward and, if working with a group, try to achieve consensus based on the evidence and organisational goals.
Adapt – revisit goals and needs. Reflect on the success of the implementation.
Bringing the components together
Research Evidence
Professional knowledge
Local evidence
Questions to ask yourself
What do I already know?
What local evidence is available?
What does the literature say?
What other information do I need to gather?
How does the information I have apply to my context?
Make a decision.
What worked? What didn't? What did I learn?
Case examples
Academic librarian wants to know what professors think of information literacy instruction to students
Librarian at a pediatric hospital wonders if residents’ searches are improved with librarian assistance
BREAK
FORMULATING AN ANSWERABLE QUESTION
Ask
“Questions drive the entire EBL process. […] The wording and content of the questions will determine what kinds of research designs are needed to secure answers.”
(J. Eldredge, 2000)
SPICE question structure
Setting: the context (e.g., hospital library, academic health center)
Perspective: the stakeholder(s) (e.g., graduate students, managers, reference librarians)
Intervention: the service being offered (e.g., chat reference, RefWorks workshops)
Comparison: the service to which it is being compared (optional)
Evaluation: the measure used to determine change/success/impact (e.g., usage statistics, course grade)
Librarianship domains
Reference/Enquiries: providing service and access to information that meets the needs of library users.
Education: incorporating teaching methods and strategies to educate users about library resources and how to improve research skills.
LIS Education subset: specifically pertaining to the professional education of librarians.
Collections: building a high-quality collection of print and electronic materials that is useful, cost-effective, and meets the users' needs.
Management: managing people and resources within an organization. This includes marketing and promotion as well as human resources.
Information access and retrieval: creating better systems and methods for information retrieval and access.
Professional issues: exploring issues that affect librarians as a profession.
(Koufogiannakis, Crumley, and Slater, 2004)
Librarianship domains
• Information access & retrieval
• Collections
• Management
• Education
• Reference
• Professional issues
• [Scholarly communications]
Burning question example 1
What are university faculty members’ perceptions of information literacy?
SPICE example 1
Setting: Research university
Perspective: Librarians; Professors
Intervention: Survey questionnaire to determine attitudes, perceptions, experiences
Comparison: Not applicable
Evaluation: Ratings of information literacy competencies; Inclusion of IL in courses; Disciplinary differences
Burning question example 2
Are pediatric residents’ search results improved with help from a librarian?
SPICE example 2
Setting: Pediatric teaching hospital
Perspective: Librarians
Intervention: Help from a medical librarian for a literature search
Comparison: Literature search without assistance
Evaluation: Relevance of retrieved results; Quality of search strategy
ACTIVITY 2: Formulate your burning question using SPICE
WHAT QUALIFIES AS EVIDENCE?
Definition of evidence
“the available body of facts or information indicating whether a belief or proposition is true or valid”
(Oxford English Dictionary, 2011)
ACTIVITY 3
What are some possible evidence sources we use to make decisions in libraries?
Evidence Sources
Hard evidence:
• Published literature
• Statistics
• Local research and evaluation
• Other documents
• Facts

Soft evidence:
• Input from colleagues
• Tacit knowledge
• Feedback from users
• Anecdotal evidence
LUNCH 12:15 – 1:00
SOURCES FOR LOCATING AND CREATING EVIDENCE
Acquire
Locating
Published research
• Databases
• Books, bibliographies
• Mailing lists, blogs, word of mouth
• Conferences
• Systematic reviews, Evidence summaries
Creating
Local evidence
• Usage data
• Transaction data
• Evaluation results
• Survey, interview, focus group findings
• Inputs, outputs, outcomes, impact
Locating published evidence
Databases
• Library and information studies
• Management
• Education
• Social sciences
• Health sciences, psychology
http://libvalue.cci.utk.edu/
http://www.informedlibrarian.com/
http://eprints.rclis.org/
Locating published evidence
Conferences
• EBLIP (1-7)
• Health librarianship, e.g., MLA, CHLA, EAHIL, ICML
• Subject librarianship (music, law)
• Assessment, e.g., Northumbria Conference, Library Assessment Conference
• Academic, e.g., ACRL
• Information literacy, e.g., LOEX, WILU, LILAC
• LIS research conferences, e.g., ISIC, ASIS&T, CAIS, ALISE, IIiX, AMIA
Locating published evidence
Systematic reviews
http://lis-systematic-reviews.wikispaces.com
Locating published evidence
Evidence summaries
http://ejournals.library.ualberta.ca/index.php/EBLIP
Evidence Based Library and Information Practice journal, 2006-
>250 evidence summaries
Creating evidence
Data and findings
• Usage data
• Transaction data
• Evaluation results
• Survey, interview, focus group findings
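Raw usage and transaction data like the kinds listed above usually need only light processing to become decision-ready local evidence. A minimal sketch (Python; the log format, column names, and values are invented for illustration, not from the presentation) of tallying reference desk transactions by month:

```python
# Hypothetical sketch: turning raw transaction data into local evidence.
# Assumes a CSV log of reference desk transactions with a "date" column;
# in practice this would come from open("ref_desk_log.csv").
import csv
import io
from collections import Counter

# Inlined sample data standing in for a real log file (invented values).
sample_log = io.StringIO(
    "date,question_type\n"
    "2014-01-15,reference\n"
    "2014-01-20,directional\n"
    "2014-02-03,reference\n"
)

monthly_counts = Counter()
for row in csv.DictReader(sample_log):
    month = row["date"][:7]  # keep the "YYYY-MM" prefix, e.g. "2014-01"
    monthly_counts[month] += 1

for month, count in sorted(monthly_counts.items()):
    print(month, count)
```

A summary like this (transactions per month, per question type, per service point) is exactly the sort of "stats" a library assessment department might already collect.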
Creating evidence
Sources for local evidence already available
• Library assessment department
• University planning and institutional analysis
• Annual reports
• Internal reports
• "Stats"
Creating evidence
Dudden, R. F. (2007). Using benchmarking, needs assessment, quality improvement, outcome measurement, and library standards. New York: Neal Schuman.
Evidence for example 1
Locating evidence
• Databases: LISA
• Systematic Review Wiki
• Journals: Communications in IL, J of IL, J of Academic Librarianship
• Conferences: LILAC, LOEX, WILU
• EBLIP Evidence Summary
Creating evidence
• survey questionnaire
Evidence for example 2
Locating evidence
• Databases: LibValue, LISA
• Systematic review wiki
• Journals: JMLA, HILJ, etc.
• Conferences: MLA
• EBLIP Evidence Summary
Creating evidence
• ???
ACTIVITY 4
1. Identify 2-3 sources for locating evidence to answer your question
2. Consider 1 potential source of local evidence to look into
CRITICAL APPRAISAL
Appraise
Critical appraisal
Weigh up the evidence
• Reliable
• Valid
• Applicable
Checklists help with critical appraisal process
Language is different for interpretive (qualitative) research
Reliability
1. Results clearly explained
2. Response rate
3. Useful analysis
4. Appropriate analysis
5. Results address research question(s)
6. Limitations
7. Conclusions based on actual results
Validity
1. Focused issue/question
2. Conflict of interest
3. Appropriate and replicable method
4. Population and representative sample
5. Validated instrument
Applicability
1. Implications reported in original study
2. Applicability to other populations
3. More information required
ReLIANT
For appraising research on information skills instruction
Focuses on:
• Study design
• Educational context
• Results
• Relevance
Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader's guide to the literature on interventions addressing the need for education and training. Library & Information Research, 30(94), 44-51.
CRiSTAL Checklist
For appraising research on user studies
Focuses on:
• Study design
• Results
• Relevance
Developed by Andrew Booth and Anne Brice. Available from: http://nettingtheevidence.pbworks.com/w/page/11403006/Critical%20Appraisal%20Checklists
ACTIVITY 5: Critically appraise a study using the appropriate checklist
Critical appraisal: the shortcut
APPLYING EVIDENCE IN PRACTICE
Apply
Ways to apply evidence
1) The evidence is directly applicable
2) The evidence needs to be locally validated
3) The evidence improves understanding
Reflection
DEALING WITH THE BARRIERS
Enablers of evidence use
• Positive organizational dynamics
• Ongoing education
• Positive personal outlook
• Time
ACTIVITY 6: 3 things you will take home and act upon
CONCLUSION
Assess