
Assessing Student Learning Outcomes at UVa

Institutional Assessment and Studies
http://www.web.virginia.edu/iaas
[email protected]
434.924.3417

Workshop Overview

Part I: Assessment: What, why and how
► Assessment: what it is and how it applies to UVa
► Developing an assessment plan
► Strategies and challenges for implementing a useful and feasible assessment plan
► Defining student learning outcomes

Part II: Useful and Feasible Methods and Tools
► Using grades for assessment—why not?
► Creating rubrics and prompts
► Writing test questions
► Curriculum mapping


Workshop Goals

To provide faculty with:
► A clear understanding of what each program is expected to do.
► Tangible guidance on how to develop a program assessment plan at UVa.
► Working knowledge for writing measurable student learning outcomes.
► Tools and methods that will aid faculty in conducting useful and feasible assessments.

Part I: Assessment: What, why and how

Outcomes: At the conclusion of this presentation, participants should be able to:
► Describe the major tenets of the University’s assessment model.
► Recall the steps for completing an assessment plan (matrix).
► Define and give examples of learning outcomes.
► Summarize the University’s 2005-06 expectations for drafting a program assessment plan.

Assessment is a process of gathering and interpreting information to discover if a program is meeting established goals and then using that information to enhance the program.


Why must we Assess?

Accountability
► Federal requirements
► State oversight
► Southern Association of Colleges and Schools (SACS) reaffirmation

“The institution demonstrates that each educational program for which academic credit is awarded (a) is approved by the faculty and the administration, and (b) establishes and evaluates program and learning outcomes.”

--SACS Comprehensive Standard 3.4.1

Why should we Assess?

Program improvement
► To maintain, foster, and improve the University of Virginia’s unique teaching and research environment.

A Model for Assessing Student Learning Outcomes at UVa

Assessments should:
► Be useful and feasible.
► Minimize burden on faculty and students.

Programs define their own most important outcomes.

Methods should be kept as simple as possible and connected to what faculty are already doing.


More about the model

► Begin with what would be most helpful to your program.

► Faculty involvement is essential.

► The unit of analysis for assessment is the program.

► Results should be used to make program improvements.

► The Provost is committed to providing support.


Resources

► Assessment Guide

► Workshops

► Individual meetings

► IAS staff (liaisons)

► Web site http://www.web.virginia.edu/iaas/assessment/assessment.htm


7 Steps to developing & implementing an assessment plan

1) Begin with a brief statement of the mission and general goals for your program.

2) Identify the 3-5 most important outcomes of the program, including at least 2 student learning outcomes.

3) Identify useful and feasible assessment methods.

4) Develop the assessment plan and complete the matrix.

5) Develop assessment instruments and fine-tune the methodology.

6) Tabulate, analyze and report the results.

7) Communicate the findings and consider appropriate action.


ASSESSMENT MATRIX

PROGRAM OR SCHOOL:
Assessment Coordinator for Program or School: Name / Email / Phone
Program or School Mission Statement:
Program or School Goals (2-3):

Columns of the matrix (rows 1-10, one row per outcome):
► Program Outcomes/Objectives (3-5): What will students know and be able to do when they graduate? Should be specific & measurable.
► Check if this is a Student Learning Outcome (min. of 2).
► Assessment Methods/Measures: How will the outcome be measured? Who will be assessed, when, and how often?
► Standards of Comparison/Target Level: How well should students be able to do on the assessment?
► Interpretation of Results/Findings: What do the data show?
► Use of Results/Action Plan: Who reviewed the findings? What changes were made after reviewing the results?

Interactive version available online: http://www.web.virginia.edu/iaas/assessment/assessment.htm

Writing a mission statement & goals


University Assessment conducts, supports, and coordinates research to maintain and improve academic programs and student services. The Assessment team implements and supports useful and feasible assessment approaches in order to foster the University’s unique learning and research environment. Our goals are to:

• Support assessment activities by providing relevant training and feedback, and designing practical methods of measurement and evaluation.
• Administer the University’s general education and core competency assessments in an efficient manner to help faculty and administrators improve student learning and development.
• Coordinate the development and implementation of school and program assessment plans that define, measure, and evaluate student learning goals and outcomes.
• Conduct surveys of academic and student services programs that effectively meet client needs, and do not pose an undue burden to the respondents.

Setting Standards/Target Levels: Oral Communication Example

Performance Level        Standard of Comparison    Target
Highly Competent         4                         40%
Competent                3 and above               85%
Minimally Competent      2 and above               95%
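To show how a target table like this gets used, here is a minimal sketch (with invented scores) that compares a set of holistic rubric scores against the standards and targets above; the variable names are hypothetical, not part of the IAS model.

```python
# Minimal sketch: comparing rubric results against the target levels above.
# Scores are invented for illustration; thresholds and targets follow the
# example table (40% at 4, 85% at 3 or above, 95% at 2 or above).

scores = [4, 3, 2, 4, 3, 3, 1, 4, 2, 3]  # one holistic 1-4 score per student

targets = [
    ("Highly Competent",    4, 0.40),
    ("Competent",           3, 0.85),
    ("Minimally Competent", 2, 0.95),
]

for label, threshold, target in targets:
    share = sum(s >= threshold for s in scores) / len(scores)
    status = "target met" if share >= target else "target NOT met"
    print(f"{label:20s} {share:4.0%} scored {threshold} or above (target {target:.0%}): {status}")
```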

What are program student learning outcomes?

The accumulated knowledge, skills, and attitudes that students develop during a course of study.


The Leading Questions

What do students need to know and be able to do after they graduate from this program, or complete a course of study within the program?

What do students learn in your program that distinguishes them from another program’s graduates?


Student Learning Outcome Examples


Program Learning Outcome:
A graduate of the Civil Engineering Program can design a component of a civil engineering system, incorporating social, economic, ethical, and contractual considerations.

University Learning Outcome: Critical Thinking
Undergraduate students graduating from the University of Virginia should be able to:
► Carefully interpret, analyze, and evaluate evidence, statements, graphics, questions, etc.
► Construct well-supported and sustained arguments.
► Justify conclusions based on well-supported arguments.

Writing Learning Outcomes
► In one sentence, describe one major skill, piece of knowledge, or attitude that a student will have gained as a result of completing a course of study.
► Use action verbs.
► Write outcome statements that describe OVERT performance.
► Make sure the outcome is something that can be assessed or tested.
► Include a timeframe when the student will be expected to have learned it (end of first year in major, at graduation, etc.).


Covert versus overt performance criteria.

Covert → Overt

Know the arguments → Summarize the arguments
Reflect on the issues → Share reflections on the issues
Think critically → Interpret, analyze, and evaluate evidence; construct arguments
Envision solutions → Illustrate solutions
Understand principles → Apply principles
Understand methods → Explicate methods
Appreciate art → Choose to attend art events
Appreciate philosophy → Choose to discuss philosophical issues
Know principles of social justice → Advocate principles of social justice


Exercise
► Draft a learning outcome for your program (graduate or undergraduate) after a specified course of study. Examples of a course of study:
   ► End of the first year in the major, or graduate study
   ► Upon graduation
► What are the most important skills, knowledge or attitudes that you want your students to have at the end of this course of study?
► Examples: By the end of the first year of undergraduate study in the Department of Politics, students will be able to:
   ► Explain the key features of American liberal democracy written into the U.S. Constitution.
   ► Identify the unique contributions to 18th century American political thought of three continental political theorists.


Learning Outcome Checklist

Does the learning outcome:

► ___ Represent a fundamental result of the course of study or program—does it assess what is most important?

► ___ Focus on what students learned rather than on what material is covered?

► ___ Clearly describe what students are asked to do, using action verbs (write, compose, speak)?

► ___ Ask students to apply what they have learned by producing something (essay, lab report, work of art)?

► ___ Include a time frame for students to accomplish this goal (end of second year, end of program)?

► ___ Is the outcome specific and measurable?


Opportunities for Measuring Student Learning Outcomes
► Course-embedded assignments (papers, exams, projects)
► Comprehensive exams
► Capstone courses/fourth-year seminars
► Thesis/dissertation
► Internships/practicums/jobs (ratings of students’/graduates’ skills)


As much as possible, use what you are already doing!

Student Learning Outcome Assessment Methods


Direct Assessments
► Standardized tests
► Licensure or professional exams
► Rubrics for evaluating:
   ► Essays/Papers
   ► Exam questions
   ► Exhibits
   ► Performances/Presentations
   ► Portfolios of student work
   ► Comprehensive exams

Indirect Assessments
► Surveys
   ► Student
   ► Alumni
   ► Employer
► End-of-course memos
► Focus groups
► Exit interviews
► Course evaluations

2005-06 Expectations (by May 06)

► Mission statement with two or three of the most important program goals.

► Three to five measurable program outcomes; two student learning outcomes.

► Plan (matrix or equivalent) for assessing one student learning outcome:

► Definition, methods, timeframe and standards.
► Interpretation may not yet be possible.
► Sketch of process for using the results.


Timeline 2005-06
► December 2005 – January 2006: Workshops and individual meetings with schools/programs; development of plans
► February – March 2006: First drafts of assessment plans submitted to IAS; feedback provided within 2 weeks
► March – April 2006: Second drafts of assessment plans, if necessary
► Summer and Fall 2006: Preparation of assessment plans by IAS for SACS
► September 2006: SACS compliance report due
► November 2006: SACS off-site review


Future Expectations


How will assessment activities be reported in the future?

► Program Review
► Annual reports
► WEAVE online
► Accreditation

How much assessment? How often?

► Start with the most important program outcomes. At least two should be learning outcomes.

► Focus on what is useful for program improvement.

► Not all outcomes can, or should, be assessed.

► Stagger assessments over multiple years.

► Allow time for analysis.


Challenges and Strategies

► Getting started:
   ► What are you already doing? Work backwards.
   ► Accreditation reports.
   ► Identify issues faculty have been discussing.

► Assign responsibility for collecting assessment information AND using it.

► The information is available but is not organized to fit the matrix.


[Assessment Matrix template repeated; see the ASSESSMENT MATRIX above. Interactive version available online: http://www.web.virginia.edu/iaas/assessment/assessment.htm]

“Good assessments give reasonably accurate, truthful information.”


Why do we insist on measuring it with a micrometer when we mark it with chalk and cut it with an axe?

--Peter Ewell

In assessment, “the perfect is the enemy of the good.” Let’s keep striving for the good.

--Tom Angelo

An approximate answer to the right question is worth a good deal more than an exact answer to an approximate question.

--J. W. Tukey

Slide replicated from Linda Suskie’s Achieving Middle States’ Expectations for Assessing Student Learning, a presentation to the Middle States Commission on Higher Education, May 2005. Used with permission.

Questions?


Part II: Assessment Methods and Tools

Institutional Assessment and Studies
http://www.web.virginia.edu/iaas
[email protected]

Workshop Overview

►Course grades for assessment—why not?

► Assessment Tools
   ► Rubrics
   ► Prompts
   ► Multiple-choice instruments
   ► Curriculum mapping


Workshop Outcomes

At the end of this part of the presentation, participants should be able to:

► Explain the limitations of, and the potential for, using grades for assessment.

► Define what rubrics are and explain how they can be used for assessment.

► Use an existing multiple-choice exam as an assessment tool.

► Construct a curriculum map.


[Assessment Matrix template repeated; see the ASSESSMENT MATRIX above.]

Using grades as assessment information: Why not?


What goes into a grade?

► Participation
► Attendance
► Timeliness of work
► Test grades
► Paper grades
► General education skills


Problems with using grades

► Suppose one of the outcomes for the course is:

Students will be able to calculate an effect size using figures available in a published research paper.

► …and the student gets a C in the course. Does this mean the student has not achieved this outcome?


How to use grades
► Break composite grades into components.
► Isolate the parts of the grade related to each outcome.
► This piece of the grade is useful for assessment.
► Also helps tease out effects of different professors and their research interests within the discipline.
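A minimal, hypothetical sketch of this idea follows: the assignment names, weights, outcome mapping, and scores are all invented, but the point stands that only the outcome-related components of the composite grade feed into assessment.

```python
# Hypothetical sketch: separating the outcome-related pieces of a composite
# course grade. Assignment names, weights, and the mapping to the effect-size
# outcome are invented for illustration.

gradebook = {
    # assignment: (weight in course grade, tied to the effect-size outcome?, score 0-100)
    "participation":   (0.10, False, 95),
    "midterm exam":    (0.30, False, 72),
    "effect-size lab": (0.20, True,  88),
    "final paper":     (0.40, True,  64),
}

course_grade = sum(weight * score for weight, _, score in gradebook.values())
outcome_scores = [score for _, related, score in gradebook.values() if related]
outcome_average = sum(outcome_scores) / len(outcome_scores)

print(f"Composite course grade:  {course_grade:.1f}")    # mixes attendance, exams, papers
print(f"Outcome-related average: {outcome_average:.1f}")  # the piece useful for assessment
```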


2005-06 Expectations (by May 06)

► Mission statement with two or three of the most important program goals.

► Three to five measurable program outcomes; two student learning outcomes.

► Plan (matrix or equivalent) for assessing one student learning outcome:

► Definition, methods, timeframe and standards.
► Interpretation may not yet be possible.
► Sketch of process for using the results.

Where in the curriculum can you do assessment?

► Depending on what you are trying to assess:
   ► Change over time
   ► 1st year in major experience
   ► Comparison to other institutions
   ► Capstone course
   ► Seminars


What is a rubric?

► A kind of scorecard that breaks down a written or demonstrated assignment into manageable, observable pieces.

► A way to consistently assess student work to determine whether a program is meeting its goals.


Rubrics are:

► Useful assessment tools for measuring subjective skills, knowledge and attitudes.

► An efficient means of incorporating assignments that are already part of the curriculum into an assessment plan.

► A method of rating student work in a more objective manner.


Rubrics are used to assess:

► Projects
► Portfolios
► Term papers
► Internships
► Performances
► Examinations
► Artwork
► Theses
► Dissertations
► Exhibitions


UNIVERSITY OF VIRGINIA CORE COMPETENCY ASSESSMENT
ORAL COMMUNICATION SCORING RUBRIC

SCORING GUIDE: 1 = Not Competent, 2 = Minimally Competent, 3 = Competent, 4 = Highly Competent. Assign one whole number (or NA, Not Applicable) to each objective:

► Chooses and narrows a significant topic appropriate for the audience and occasion (1 2 3 4 NA)
► Communicates thesis/specific purpose to audience in a clear manner (1 2 3 4 NA)
► Balances purpose and occasion with audience needs and expectations (1 2 3 4 NA)
► Provides a clear, easily identified organization appropriate to topic, audience, purpose and occasion (1 2 3 4 NA)
► Demonstrates appropriate understanding of the topic, discipline, or genre (1 2 3 4 NA)
► Provides appropriate supporting evidence (1 2 3 4 NA)
► Uses language appropriate to the audience and occasion (1 2 3 4 NA)
► Uses vocal variety (pitch, pace, inflection, volume) (1 2 3 4 NA)
► Uses physical behaviors (gestures, postures, movement, eye contact) that support the verbal message (1 2 3 4 NA)
► Uses visual aids, when appropriate, to provide useful illustrations or examples (1 2 3 4 NA)

RATER ID __________   PRESENTATION ID _______

Rubric example


Types of rubrics

Four commonly used types:
► Checklists
► Rating scales
► Descriptive rubrics
► Holistic rubrics


Checklist example

Checklist for Evaluating Program Outcomes

Does the outcome/objective:
___ Represent a fundamental result of the course of study or program—does it assess what is most important?
___ Focus on what students learned rather than on what material is covered?
___ Clearly describe what students are asked to do, using action verbs (write, compose, speak)?
___ Ask students to apply what they have learned by producing something (essay, lab report, work of art)?
___ Include a time frame for students to accomplish this goal (end of second year, end of program)?
___ Is the outcome specific and measurable?


[The Oral Communication Scoring Rubric shown above is repeated here, this time as an example of a rating scale.]

Rating scale example


Descriptive rubric example

Objective: Chooses and narrows a significant topic appropriate for the audience and occasion
► Not Competent: The speaker’s choice of topic is inconsistent with the purpose, the topic cannot be adequately treated in the time limitation of the speech, and there is little or no evidence of successful audience analysis.
► Minimally Competent: The speaker’s topic is somewhat consistent with the purpose, is barely able to fit within the time limitations (either too short or too long), and reflects a minimal analysis of the audience.
► Competent: The speaker’s choice of topic is generally consistent with the purpose, is a reasonable choice for the time limitations of the speech, and reflects appropriate analysis of a majority of the audience.
► Highly Competent: The speaker’s choice of topic is clearly consistent with the purpose, is totally amenable to the time limitations of the speech, and reflects unusually insightful audience analysis.

Objective: Communicates thesis/specific purpose to audience in a clear manner
► Not Competent: The listener has difficulty understanding, within the opening few sentences of the speech, precisely what the specific purpose/thesis of the speech is.
► Minimally Competent: The listener may minimally comprehend, within the opening few sentences of the speech, precisely what the specific purpose/thesis is.
► Competent: The listener should understand, within the opening few sentences of the speech, precisely what the specific purpose/thesis of the speech is.
► Highly Competent: There is no question that the listener should understand clearly, within the opening few sentences of the speech, precisely what the specific purpose/thesis of the speech is.

Objective: Balances purpose and occasion with audience needs and expectations
► Not Competent: The speaker shows little or no ability to balance purpose and occasion and is clearly uncomfortable with audience response.
► Minimally Competent: The speaker is adequately able to balance the purpose and occasion, but may not regularly check in with the audience.
► Competent: The speaker takes into consideration the purpose and occasion and regularly checks in with the audience to make appropriate adjustments.
► Highly Competent: The speaker has a clear command of the purpose and occasion, and is able to fine-tune and alter the speech as needed to adapt to the audience.


Holistic scoring guide example

The exceptional paper (15-14 points)
► Shows a clear understanding of ethics problems.
► Demonstrates understanding of the subtleties and flaws of the Potter Box.
► Treatment of the Potter Box and the five philosophical theories includes a determination of why a step or theory works or fails to work in confronting an ethics problem.
► Points out flaws/inconsistencies in theories and should anticipate problems that might arise in other situations.
► Exhibits precision and control.

The above-average paper (13-10 points)
► Shows a clear understanding of the difference between an ethics problem and a media issue.
► Presents a less thorough or less accurate treatment of the Potter Box or of the five philosophical theories.
► May fail to demonstrate an understanding of the subtleties of the Potter Box or even misunderstand a particular step.
► May be flawed in its presentation of one or more of the five philosophical theories.
► Rarely considers problems that might arise in other situations.
► Exhibits syntactic maturity, but may contain occasional lapses in control.

The average paper (9-8 points)
► Offers, in general, an adequate discussion of the Potter Box and of the five philosophical theories.
► Does not demonstrate the insight necessary to discuss fully the complexities of all of these elements.
► Likely to misunderstand one or more steps in the Potter Box.
► Treats one or more of the philosophical theories superficially or inaccurately.
► Will usually fail to consider why a particular theory works or does not work when it is applied to a specific problem.
► Will be very unlikely to consider implications of the theories in other situations.
► Will often demonstrate a failure to attend to detail.

The below-average paper (7-6 points)
► Begins with an ethics problem.
► Presents a discussion of the Potter Box and the five philosophical theories that is both inadequate and inaccurate.
► May misunderstand two or more steps of the Potter Box and may misapply two or more of the philosophical theories.
► May even omit a step entirely and will never move beyond the superficial.
► Will often be flawed by improper word choice, awkward sentence structure, and omitted transitions which produce erratic jumps in logic.

The failing paper (5-0 points)
► Does not focus on an ethical issue or takes on a topic not approved by the TA.
► Does not understand even the simplest implications of the Potter Box or of the philosophical theories, and so fails to explain correctly any of the steps and misapplies or omits entirely the five theories.
► Exhibits an extreme lack of attention to detail which obscures or perverts meaning time and time again. In short, the failing paper demonstrates few, if any, virtues.


Developing a rubric

► Brainstorm a list of what you expect to see in the student work that demonstrates the particular learning outcome(s) you are assessing.

► Keep the list manageable (3-8 items) and focus on the most important abilities, knowledge, or attitudes expected.

► Edit the list so that each item is specific and concrete; use action verbs when possible, and descriptive, meaningful adjectives (e.g., not “adequate” or “appropriate” but “correctly” or “carefully”).

► Assign values, either numeric or descriptive, to varying levels of competence or skill. These levels could be described in detail or could simply be numeric representations of some level of an ideal.

► Test the rubric by scoring a small sample of student work. Are your expectations too high or too low? Are some items difficult to rate and in need of revision?


Using the rubric
► Evaluators should meet together for a training session.
► One or more examples of student work should be examined and scored.
► Discuss the scores and make decisions about conflicts that arise.
► More than one faculty member should score the student work.
► If two faculty members disagree significantly (more than 1 point on a 4-point scale), a third person should score the work (see the sketch after this list).
► If frequent disagreements arise about a particular item, the item may need to be refined or removed.
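The disagreement rule above can be written down concretely. The sketch below is hypothetical: the slide does not say how a third score is combined, so averaging all three is an assumption, and whole-number scores on a 4-point rubric item are assumed.

```python
from typing import Optional, Tuple

# Hypothetical sketch of the two-rater rule described above. A gap of more
# than 1 point between the first two raters triggers a third rater; averaging
# the three scores afterward is an assumption, not stated IAS policy.

def score_item(rater_a: int, rater_b: int, rater_c: Optional[int] = None) -> Tuple[Optional[float], bool]:
    """Return (final_score, needs_third_rater) for one rubric objective."""
    if abs(rater_a - rater_b) > 1:                 # significant disagreement
        if rater_c is None:
            return None, True                      # flag the work for a third rater
        return (rater_a + rater_b + rater_c) / 3, False
    return (rater_a + rater_b) / 2, False

print(score_item(3, 4))     # (3.5, False)  raters agree within 1 point
print(score_item(1, 4))     # (None, True)  send to a third rater
print(score_item(1, 4, 2))  # (2.33..., False)
```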


Communicating Expectations

When writing assignments, tell students exactly what they are to do. These “prompts” can:
► Elicit more appropriate responses to the assignment.
► Ensure a common basis for response so that the assignment can be scored consistently, using a rubric.
► Provide clear, written directions for the task.


For critical thinking competency assessment, IAS asked students to:
► Analyze and evaluate evidence from statements and graphics
► Construct and support arguments
► Develop and sustain a clearly articulated thesis
► Justify conclusions only on the basis of well-supported arguments


Example prompt

When we wrote “Analyze and evaluate evidence from statements and graphics,” we asked students to:
► Interpret the statement/graphic
► Summarize its key findings or claims
► Provide reasons for accepting or rejecting the argument made.

We also defined the terms “interpret,” “summarize,” and “provide reasons” for students as part of the assignment directions.

Critical thinking – further detail


Most importantly:

Let students know exactly what is expected of them on the assignment and provide precise directions in order to help them complete the task well.


Communicating Expectations to Students

Multiple-choice instruments

► Benefits
► Drawbacks


The Back Translation (Dawis, 1987)

► Faculty and content experts assign each test item to its corresponding learning outcome

► Discuss any differences until consensus is reached


Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.

Discriminate between association and causation, and identify the types of evidence used to establish causation.

Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.

Evaluate the credibility, use, and misuse of scientific and mathematical information in scientific developments and public-policy issues.

Example item 20: Which of the following is not an essential feature of experimental design?
a) manipulation of the independent variable
b) random assignment of subjects
c) at least one comparison or control group
d) use of modern technology
e) measurement of the dependent variable


Student Learning Outcomes and Multiple-Choice Test Items

Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.

Items 1-4, 7, 9, 23, 27-31 (12 items)

Discriminate between association and causation, and identify the types of evidence used to establish causation.

Items 6, 17-19, 32-38, 43, 45, 50 (14 items)

Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.

Items 20, 21, 39-42, 46-49 (10 items)

Evaluate the credibility, use, and misuse of scientific and mathematical information in scientific developments and public-policy issues.

Items 5, 8, 10-16, 22, 24-26, 44 (14 items)
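Once items are mapped to outcomes this way, per-outcome scores fall out of the same scored answer sheets. Below is a hypothetical sketch: the item groupings follow the table above, while the student response data and names are invented.

```python
# Hypothetical sketch: computing percent correct per learning outcome from a
# scored multiple-choice exam. Item groupings follow the table above; the
# single-student response data are invented.

outcome_items = {
    "Graphical/symbolic/numerical methods":    [1, 2, 3, 4, 7, 9, 23, *range(27, 32)],
    "Association vs. causation":               [6, 17, 18, 19, *range(32, 39), 43, 45, 50],
    "Hypotheses and experimental design":      [20, 21, *range(39, 43), *range(46, 50)],
    "Credibility and (mis)use of information": [5, 8, *range(10, 17), 22, 24, 25, 26, 44],
}

# item number -> 1 if answered correctly, else 0 (invented example)
answers = {item: (1 if item % 3 else 0) for item in range(1, 51)}

for outcome, items in outcome_items.items():
    pct = 100 * sum(answers[i] for i in items) / len(items)
    print(f"{outcome:42s} {len(items):2d} items  {pct:5.1f}% correct")
```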


Scanning, scoring, improving

► ITC can scan forms for you and provide either raw or scored data; this can be done within 24 hours.

► ITC can provide you with an item analysis.


Example Text from Item Analysis


Answer Distribution
“Top” and “bottom” refer to the top and bottom 27 percent of students.

Question 1 (correct answer A, weight factor 1.00):
   all:     100.00% correct; a=22, b=0, c=0, d=0, e=0, blank=0; difficulty index = 100.00
   top:     100.00% correct; a=0, b=0, c=0, d=0, e=0, blank=0; discrimination index = .00
   bottom:  100.00% correct; a=0, b=0, c=0, d=0, e=0, blank=0

Question 19 (correct answer A, weight factor 1.00):
   all:     40.91% correct; a=9, b=4, c=9, d=0, e=0, blank=0; difficulty index = 46.15
   top:     37.50% correct; a=3, b=0, c=5, d=0, e=0, blank=0; discrimination index = -.22
   bottom:  60.00% correct; a=3, b=2, c=0, d=0, e=0, blank=0
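For reference, the two statistics in this report can be computed roughly as follows. This is a sketch of one common textbook definition (percent correct for difficulty; top-minus-bottom 27% for discrimination); the exact formulas behind ITC's report are not documented here, and the data below are invented.

```python
# Sketch of item difficulty and discrimination using the top/bottom 27%
# convention. One common definition; ITC's exact formulas may differ.
# All data are invented.

def item_statistics(item_correct, total_scores, fraction=0.27):
    """item_correct: 0/1 per student for one item; total_scores: matching totals."""
    n = len(item_correct)
    difficulty = 100 * sum(item_correct) / n          # percent answering correctly

    ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    k = max(1, round(fraction * n))
    top = sum(item_correct[i] for i in ranked[:k]) / k
    bottom = sum(item_correct[i] for i in ranked[-k:]) / k
    return difficulty, top - bottom                   # discrimination: top minus bottom

item_19 = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]               # invented 0/1 responses
totals  = [45, 20, 25, 48, 30, 22, 44, 18, 27, 41]     # invented total test scores
print(item_statistics(item_19, totals))                # e.g. (50.0, 1.0)
```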

Curriculum Mapping

► Before asking the question “Do students know this?” we need to ask “Are we providing this experience?”

► Curriculum mapping can help determine the degree of coverage of each outcome.

► Ask faculty members what level of coverage they provide in the courses they teach.


Example Curriculum Map

Courses: Psych 411, Psych 435, Psych 442, Psych 500

Design a correlational study: X X (covered in two of the four courses)
Evaluate published research: X X (covered in two of the four courses)
Integrate ethical concerns with study design: X (covered in one course)
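A curriculum map is straightforward to assemble from faculty responses and to check for gaps. The sketch below is hypothetical: course numbers and outcomes follow the Psychology example, but the coverage entries are invented rather than taken from the map above.

```python
# Hypothetical sketch: assembling a curriculum map from faculty-reported
# coverage and flagging outcomes no course addresses. Coverage data invented.

courses = ["Psych 411", "Psych 435", "Psych 442", "Psych 500"]
outcomes = [
    "Design a correlational study",
    "Evaluate published research",
    "Integrate ethical concerns with study design",
]

coverage = {  # course -> outcomes its instructor says it addresses (invented)
    "Psych 411": {"Design a correlational study", "Evaluate published research"},
    "Psych 435": {"Evaluate published research"},
    "Psych 442": set(),
    "Psych 500": {"Design a correlational study",
                  "Integrate ethical concerns with study design"},
}

for outcome in outcomes:
    row = ["X" if outcome in coverage[c] else "." for c in courses]
    print(f"{outcome:46s} " + "  ".join(row))

gaps = [o for o in outcomes if not any(o in covered for covered in coverage.values())]
print("Outcomes with no coverage:", gaps or "none")
```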


Thank You

Questions?

For copies of this presentation please contact:
Institutional Assessment and Studies
[email protected]

It is available online at:
www.web.virginia.edu/iaas/assessment/assessment.htm

A note of thanks and attribution to Linda Suskie and her excellent book, Assessing Student Learning: A Common Sense Guide (Bolton, MA: Anker Publishing Company, 2004), from which many of the ideas presented here originated.


