
Achieve Your 2017 Resolution to Improve Your Test Item Writing Skills and Create Better Exams

Ainslie T. Nibert, PhD, RN, FAAN

Consultant

Email – anibert@comcast.net 

Objectives

Create/edit test items that assess for application and analysis.

Critique exams for alignment with the NCLEX-RN® Test Plan.

Evaluate the relevance and effectiveness of current testing policies.

Resources for Developing Critical Thinking Test Items and Alternate Format Items: National Council Website

www.ncsbn.org
◦ NCLEX Test Plans
◦ 2016 RN
◦ Candidate FAQ
◦ Alternate item formats FAQ
◦ Exam Development FAQ

Source: https://www.ncsbn.org/9010.htm

Relationship between Testing & the Curriculum

Focus today: INTERNAL Curriculum Evaluation (Teacher-made Tests)

Writing Critical Thinking Test Items
Item Analysis Software & Blueprinting
Test Item Banking & Exam Delivery

Internal Evaluation: Evaluation of course objectives (faculty designed)

Five Guidelines to Developing Effective Critical Thinking Exams

1. Assemble the “basics.”
2. Write critical thinking test items.
3. Pay attention to housekeeping duties.
4. Develop a test blueprint.
5. Scientifically analyze all exams.

Critical Thinking Test Items

Contain rationale
Written at the application level or above
Require multilogical thinking to answer
Ask for a high level of discrimination

Source: Morrison, Nibert, & Flick (2006)

Housekeeping Tips

Rules

Get rid of names
Get rid of ‘multiple’ multiples
Use non-sexist writing style
Develop parsimonious writing style

Eliminate

Delete scenarios; write items independent of each other.
Delete “of the following…”

… and More Rules

Use a question format when possible
Make distracters plausible and homogeneous
Include in the stem words repeated in responses

… and More Rules

Eliminate “all of the above” and “none of the above”
Rewrite any “all except” questions
Ensure that alternatives do not overlap
Present choices in a logical order
Vary the correct answer

… and the MOST IMPORTANT Rule

Develop a written testing policy

Writing style
Format

Does the test measure what it claims to measure?

Content Validity

Use a Blueprint to Assess a Test’s Validity

Test Blueprint

Reflects course objectives
Rational/logical tool
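Where testing software is not available, the same blueprint check can be sketched in a few lines. A minimal Python illustration, comparing a planned distribution against the items actually written; the objective names and item counts are invented:

```python
# A sketch of a blueprint check: map each exam item to a course objective and
# compare the actual counts against the planned distribution. The objective
# names and counts below are invented for illustration.

from collections import Counter

planned = {"Objective 1": 10, "Objective 2": 15, "Objective 3": 10}

# One entry per exam item, naming the objective it was written to assess.
item_objectives = (["Objective 1"] * 8 + ["Objective 2"] * 17
                   + ["Objective 3"] * 10)

actual = Counter(item_objectives)
for objective, target in planned.items():
    print(f"{objective}: planned {target}, actual {actual[objective]}")
```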

Testing Software Program

Storage of item analysis data (last & cumulative)
Storage of test item categories

Reliability: Consistency of Scores

Reliability Tools

Kuder-Richardson Formula 20 (KR20), for the EXAM: range from –1 to +1
Point Biserial Correlation Coefficient (PBCC), for TEST ITEMS: range from –1 to +1
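For faculty curious about the arithmetic behind these two statistics, here is a minimal Python sketch, assuming a matrix of 0/1 scored responses (rows are students, columns are items). The sample data are invented, and the code implements the textbook formulas rather than any particular software package:

```python
# A sketch of KR20 (exam reliability) and the point biserial correlation
# (item discrimination), computed from a 0/1 scored response matrix.
# The 4-student, 3-item matrix at the bottom is invented for illustration.

def kr20(matrix):
    """Kuder-Richardson Formula 20 over rows of 0/1 item scores."""
    n, k = len(matrix), len(matrix[0])
    totals = [sum(row) for row in matrix]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n    # population variance
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in matrix) / n         # item difficulty
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var)

def point_biserial(matrix, item):
    """Correlate one item's 0/1 scores with students' total scores."""
    n = len(matrix)
    totals = [sum(row) for row in matrix]
    scores = [row[item] for row in matrix]
    mean_t = sum(totals) / n
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n) ** 0.5
    p = sum(scores) / n
    if p in (0.0, 1.0) or sd_t == 0:
        return 0.0    # every (or no) student answered correctly: undefined
    mean_correct = sum(t for t, s in zip(totals, scores) if s) / sum(scores)
    return (mean_correct - mean_t) / sd_t * (p / (1 - p)) ** 0.5

responses = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 0, 0]]  # rows = students
print(f"KR20: {kr20(responses):.2f}")
print(f"PBCC, item 1: {point_biserial(responses, 0):.2f}")
```

Some testing packages also report a “corrected” point biserial that excludes the item itself from the total score; on short teacher-made exams the two values can differ noticeably.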

Standards of Acceptance

Item difficulty: 30% – 90%
Item Discrimination Ratio: 25% and above
PBCC: 0.20 and above
KR20: 0.70 and above
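A short sketch of how these standards might be applied automatically during item review; the thresholds mirror the table above, while the function and its inputs are illustrative rather than any real product’s API:

```python
# A sketch of flagging items against the standards above. Thresholds mirror
# the table; the function and field names are illustrative, not a real API.

STANDARDS = {"difficulty": (0.30, 0.90), "idr": 0.25, "pbcc": 0.20}

def flag_item(difficulty, idr, pbcc):
    """Return the reasons an item needs faculty review (empty if none)."""
    flags = []
    low, high = STANDARDS["difficulty"]
    if not low <= difficulty <= high:
        flags.append(f"difficulty {difficulty:.0%} outside {low:.0%}-{high:.0%}")
    if idr < STANDARDS["idr"]:
        flags.append(f"IDR {idr:.2f} below {STANDARDS['idr']:.2f}")
    if pbcc < STANDARDS["pbcc"]:
        flags.append(f"PBCC {pbcc:.2f} below {STANDARDS['pbcc']:.2f}")
    return flags

print(flag_item(difficulty=0.25, idr=0.10, pbcc=0.05))
```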

One “absolute” rule about item difficulty: test items answered correctly by 30% or LESS of the examinees should always be considered too difficult, and the instructor must take action.

… but what about high difficulty levels?

Test items with high difficulty levels (>90%) often yield poor discrimination values. Is there a situation where faculty can legitimately expect that 100% of the class will answer a test item correctly, and be pleased when this happens?

RULE OF THUMB ABOUT MASTERY ITEMS: Due to their negative impact on test discrimination and reliability, mastery items should comprise no more than 10% of the test (for example, no more than 5 items on a 50-item exam).


Thinking more about item discrimination statistics on teacher-made tests…

IDR can be calculated quickly, but it doesn’t effectively consider the variance of the entire group. Use it to quickly identify items that have zero or negative discrimination values, since these need to be edited before using again.

PBCC is a more powerful measure of discrimination. It correlates the correct answer to a single test item with the total test score of the student, and it considers the variance of the entire student group, not just the lower and upper 27% groups. For a small ‘n,’ consider referencing the cumulative value.
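As an illustration of the IDR computation just described (upper 27% of total scorers minus lower 27%), here is a minimal Python sketch; the 0/1 response rows are invented, and ties at the 27% cut are handled naively:

```python
# A sketch of the Item Discrimination Ratio: proportion correct in the upper
# 27% of total scorers minus proportion correct in the lower 27%. The 0/1
# response rows are invented for illustration.

def idr(matrix, item):
    n = len(matrix)
    g = max(1, round(0.27 * n))           # size of each tail group
    ranked = sorted(matrix, key=sum)      # lowest total scores first
    lower, upper = ranked[:g], ranked[-g:]
    p_upper = sum(row[item] for row in upper) / g
    p_lower = sum(row[item] for row in lower) / g
    return p_upper - p_lower

responses = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 0, 0], [0, 1, 0]]
print(f"IDR, item 1: {idr(responses, 0):.2f}")   # 1.00: strong discriminator
```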

… what decisions need to be made about test items?

When a test item has poor difficulty and/or discrimination values, action is needed. All of these actions require that the exam be rescored:

Credit can be given for more than one choice.
The test item can be nullified.
The test item can be deleted.

REMEMBER: Each of these actions has a consequence, so faculty need to carefully consider these when choosing an action. Faculty judgment is crucial when determining actions affecting test scores.
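To make those consequences concrete, here is an illustrative sketch of the three actions applied to one flawed item; the student names, their choices, and the answer key are all invented:

```python
# An illustrative sketch of the three rescoring actions for one flawed item.
# `answers` maps invented students to their chosen option; the key is "A".

answers = {"s1": "A", "s2": "B", "s3": "A", "s4": "C"}

# 1. Credit more than one choice: option B is also accepted as correct.
credit = {s: int(a in {"A", "B"}) for s, a in answers.items()}

# 2. Nullify: every student earns the point, regardless of choice.
nullify = {s: 1 for s in answers}

# 3. Delete: the item is dropped, so the total possible score shrinks by one
#    and every remaining item carries slightly more weight.
deleted_total_points = 49    # e.g., a 50-item exam rescored out of 49

print("credit :", credit)
print("nullify:", nullify)
print("delete : rescore out of", deleted_total_points, "points")
```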

Standards of Acceptance: Nursing

Nursing PBCC: 0.15 and above
Nursing KR20: 0.60 – 0.65 and above

3-Step Method for Item Analysis

1. Review difficulty level
2. Review discrimination data: Item Discrimination Ratio (IDR) and Point Biserial Correlation Coefficient (PBCC)
3. Review effectiveness of alternatives: response frequencies and non-distracters

… and a word about using Response Frequencies

A review of the response frequency data can focus your editing.

For items where 100% of students answer correctly and no other options were chosen, make sure that this is indeed intentional (a MASTERY ITEM) and not just reflective of an item that is too easy (>90% difficulty).

Target rewriting the “zero” distracters, those options that are ignored by students. Replacing “zeros” with plausible options will immediately improve item DISCRIMINATION.
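A minimal sketch of such a response-frequency review for a single item; the choices, the key, and the option labels are invented:

```python
# A sketch of a response-frequency review for one item: count how often each
# option was chosen and flag "zero" distracters that no student selected.

from collections import Counter

key = "A"
choices = ["A", "A", "C", "A", "A", "A", "A", "C"]   # one answer per student
freq = Counter(choices)

for option in ["A", "B", "C", "D"]:
    n = freq.get(option, 0)
    note = "  <- zero distracter: rewrite" if n == 0 and option != key else ""
    print(f"{option}: {n}/{len(choices)} ({n / len(choices):.0%}){note}")
```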

Critical Thinking Questions

Which intervention is most important?
Which intervention, plan, or assessment data is/are most critical to developing a plan of care?
Which intervention should be done first?
What action should the nurse take first?
Which intervention, plan, or nursing action has the highest priority?
What response is best?


Fair/Common Universal Language

The client is running late for an appointment.
The client understands Buddhist practices are peaceful.
The client is on five different medications.
The client ate a submarine sandwich.
The alcoholic client with delirium tremens is agitated.
After the client sneezed, the nurse said “bless you.”
The nurse is giving a report on the client.
The nursing unit is working shorthanded.

Source: Bristol, T. (2016). NCLEX® Updates (webinar series). Available: http://nursetim.com/webinars/nclex

Latest NCLEX® Test Item Format Considerations

Units of Measure
• International System of Units (SI)
• Metric
• Imperial measurement

Generic vs. Trade Names for Medications
• Generic names only in most cases
• References to general classifications of medications

Item Writing Tools for Success …

Knowledge

Test Blueprint

Testing Software

References

Morrison, S., Nibert, A., & Flick, J. (2006). Critical thinking and test item writing (2nd ed.). Houston, TX: Health Education Systems, Inc.

National Council of State Boards of Nursing. (2016). 2016 NCLEX-RN test plan. Chicago, IL: National Council of State Boards of Nursing. https://www.ncsbn.org/RN_Test_Plan_2016_Final.pdf

Nibert, A. (2010). Benchmarking for student progression throughout a nursing program: Implications for students, faculty, and administrators. In L. Caputi (Ed.), Teaching nursing: The art and science (2nd ed., Vol. 3, pp. 45–64). Chicago, IL: College of DuPage Press.