Integrating Formative and Summative: Assessment Approaches in Student Support Work
Assessment in Action Conference
Spring 3-16-2018
Ciji Heiser, Western Michigan University, [email protected]
Part of the Educational Assessment, Evaluation, and Research Commons

WMU ScholarWorks Citation
Heiser, Ciji, "Integrating Formative and Summative: Assessment Approaches in Student Support Work" (2018). Assessment in Action Conference. 60.
https://scholarworks.wmich.edu/assessment_day/60
This presentation is brought to you for free and open access by the Assessment at ScholarWorks at WMU. It has been accepted for inclusion in Assessment in Action Conference by an authorized administrator of ScholarWorks at WMU. For more information, please contact [email protected].
HELLO! I’m Ciji.
Choose your own adventure.
LANGUAGE
Let’s start by clarifying what we mean by formative & summative assessment.
Formative assessment takes place before or during the implementation of a program or service. This type of assessment asks: are we on track? (stop, start, continue)
Summative assessment looks at the overall impact of a program or
service.
This type of assessment asks what the program or service
achieved.
ASSESSMENT PROCESS
Let’s identify the different aspects of the assessment process.
Double-looped assessment cycle (2002)
Identifying Outcomes
Presentation Notes
Criteria for outcomes:
Measurable: “is it measurable, meaning is it identifiable, not necessarily countable?”
Meaningful: “is it meaningful to the organization and the students it serves?”
Manageable: “do we really have the means to deliver and evaluate the intended outcome?”
(as cited in Bresciani et al., 2009, p. 38)
Operational Outcomes: These outcomes, related to our programs, processes, and services, help us demonstrate accountability.
Presentation Notes
Measuring outcomes enables us to track continuous improvement on our strategic plan in areas including process, results, and satisfaction. These outcomes reflect what is expected of a program.
Operational Outcome Examples
Dining Services will increase the number of visits to on-campus dining locations by 5% this year.
Athletics will increase overall event ticket sales by 10% this year.
WMU Signature will engage 1,500 first-year students in a Signature-designated program during the 2018-2019 academic year.
What operational outcomes can you think of that might
represent
your unit?
Learning Outcomes: These outcomes “describe expected learning and
behavior in precise terms,
providing guidance for what needs to be assessed” (Banta &
Palomba, 2015, p. 66).
Presentation Notes
In writing learning outcomes, staff can use the ABCD structure:
Audience: for whom is the outcome written? Who will exhibit learning?
Behavior: what will they be able to do after the instruction/program/experience?
Condition: what are the circumstances under which outcomes are demonstrated?
Degree: what is a minimum acceptable standard?
(Heinich et al., 2002)
Learning Outcome Examples
Following a training session on mental health, students in attendance will be able to identify five campus resources to support their mental health.
Students who attend a First Year Seminar will list and locate three campus resources for their academic success.
Presentation Notes
What is the A, B, C, and D in each of these examples?
Audience’s choice: Would you like to spend time working on learning outcomes and get feedback from your peers, or move on to the next section?
Assessment Tools
Focus Groups: Great for digging deeper into the material and figuring out the “what” factor.
Rubrics: Articulate clear standards or expectations, guide and enhance performance, make criteria for progress clear, and provide feedback for success.
Minute Paper: Great for a quick touch point; very fast. Clarifies the murkiest point or the strongest learning outcome.
Interviews: Like one-on-one focus groups; great for digging deeper into the material and figuring out the “what” factor.
Surveys: Measure satisfaction/effectiveness and can take many forms, such as multiple choice, a checklist, ranking, and open-ended questions.
Journal: Written reflection on an experience.
Assessment Tools: Retrospective Pre- and Posttest
How you write your learning outcome will drive how you
measure your learning outcome.
Practice Time! Every year, Facilities does a large-scale recycling program at the end of the year. This year, we are thinking about doing a smaller-scale recycling program and would like to know if there is any interest from students. What would you advise?
Practice Time! We have heard through anecdotal data that international students are really struggling with their break housing options. How would you recommend we learn more about international students’ perspectives and ideas around break housing?
Practice Time! You just hosted a training for a large group of student leaders and you want to know if they learned what you set out to teach them. How do you proceed?
INTERPRET EVIDENCE
Once you have data, how do you make meaning out of it?
“The goal is to turn data into information, and information into
insight.” – Carly Fiorina, former
executive, president, and chair of Hewlett-Packard Co.
Analyzing Data

Qualitative
If you can do Pinterest, you can code data.
Themes: explain or validate
Approaches to coding: consistency; double, triple coding
Easier to condense than expand

Quantitative
Example Likert item: “strongly agree that you are amazing”
Describing patterns: aggregate/disaggregate, frequencies/%, range, mean, skew
Presentation Notes
Qualitative data:
Themes (Pinterest; 7-10)
Explain or validate
Approaches to coding: consistency; double, triple coding
Easier to condense than expand

Interpreting quantitative data (training data Q3): describing patterns
Aggregate (whole group) / Disaggregate (specific group)
Frequency & percentages: how many
Range of responses: highest and lowest response
Mean: typical value or the average response
Skew: whether responses cluster around a particular point or side of the scale
Criteria: percent of students who meet or exceed an established criterion.
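The quantitative steps above can be sketched with Python's standard library. This is a minimal illustration, not part of the presentation: the responses, the 1-6 scale, and the criterion of 5 are all hypothetical.

```python
# Describe patterns in a set of Likert-type responses (hypothetical data).
from collections import Counter
from statistics import mean, median

responses = [6, 5, 5, 6, 4, 5, 6, 3, 5, 6, 2, 5]  # hypothetical 1-6 ratings

freq = Counter(responses)                                      # frequencies
pct = {r: 100 * n / len(responses) for r, n in freq.items()}   # percentages
low, high = min(responses), max(responses)                     # range
avg = mean(responses)                                          # mean (typical value)

# Rough skew check: a mean below the median suggests responses cluster
# toward the high end of the scale (negative skew), and vice versa.
skew = "toward high end" if avg < median(responses) else "toward low end"

criterion = 5  # established criterion (hypothetical)
met = 100 * sum(r >= criterion for r in responses) / len(responses)

print(f"range {low}-{high}, mean {avg:.2f}, responses cluster {skew}, "
      f"{met:.0f}% met the criterion")
```

Disaggregating would just mean running the same summary over a specific subgroup's responses instead of the whole list.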
ACTION ITEMS & CHANGE
You’ve collected data, analyzed it, and probably talked about it with someone… now what?
Data-Informed Decision Making: This is an art.
Practice Time! When asked, 394 upper-class students listed their goals for the academic year; 114 said they wanted to meet more people or make friends.

When asked to rate their ability to meet other people, students’ average response was 5.17 in 2015-2016. This is 0.02 points lower than 2014-2015. Our goal is 5.5.
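The arithmetic behind this practice prompt can be sketched in a few lines of Python. The figures come from the slide; the 2014-2015 mean (5.19) is derived from the stated 0.02-point drop rather than reported directly.

```python
# Share of goal-setters who named meeting people, and gap to the goal.
goal_setters = 394   # upper-class students who listed goals
want_friends = 114   # said they wanted to meet more people / make friends

share = 100 * want_friends / goal_setters
print(f"{share:.1f}% named meeting people as a goal")

current, previous, goal = 5.17, 5.19, 5.5  # 2015-16 mean, 2014-15 mean, goal
print(f"year-over-year change: {current - previous:+.2f}")
print(f"gap to goal: {goal - current:.2f}")
```

So roughly 29% of respondents named this goal, and the rating sits 0.33 points below the goal, a useful framing when deciding whether action is warranted.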
Key Finding
• 22% of first-year students accurately listed and located a campus resource for their support. Our target was 85%.

Recommended Action Items
• Engage students when talking about resources.
• Send a follow-up email with resources to students after orientation.
• Select 5-6 resources to go into detail on.

Person responsible
Target: 85% list and locate a resource
Method: Survey or minute paper
REPORTING
We do not have the time this topic deserves, but how you convey your findings has a huge impact.
Questions?

Great! Time for me to model the way…