Robert Stempel College of
Public Health and Social Work
Assessment Retreat
October 22, 2012
Office of Academic Planning and Accountability (APA)
Florida International University
Introduction
• Susan Himburg, Director of Accreditation, APA
• Mercedes Ponce, Director of Assessment, APA
• Katherine Perez, Associate Director of Assessment, APA
• Bridgette Cram, Assessment Coordinator
• Barbara Anderson & Claudia Grigorescu, GAs, APA
Assessment in FIU
Retreat Agenda
9:00 am – 9:15 am Introduction, Continental Breakfast, Welcome
9:15 am – 11:00 Session I: Assessment Overview
1. Overview of FIU Assessment
2. SLOs, POs, ALCs, CCOs, Global Learning
3. Tying in competencies to outcomes
4. Using direct and indirect measures to assess competencies
5. Model/Examples of Outcomes with Methods – Capstone courses
11:00 am – 12:00 pm Session II: Specialized Accreditation and FIU Assessment
1. Break-out sessions addressing disciplines
2. Aligning program assessment to accreditation requirements
3. Using data that is helpful for departments & accreditation
12:00 pm – 1:00 pm Q & A Session, Lunch
1:00 pm – 2:00 pm Session III: Online Assessment Management & Reports
1. How to enter assessments
2. Creating reports
Overview of Assessment: Cycle
Outcomes
Assessment Methods
Data Collection
Analysis of Results
Improvement Strategies
• Use program mission and goals to help identify outcomes
• Use SMARTER Criteria for creating outcomes
Step 1: Identify Specific Outcomes
• Determine how to assess the learning outcomes within the curriculum (by curriculum mapping)
Step 2: Determine Assessment Methods
• Collect evidence
Step 3: Gathering Evidence
• Organize and process data
• Discuss significance of data and report it
Step 4: Review & Interpret Results
• Collaborate with faculty to develop improvement strategies based on results
Step 5: Recommend Improvement Actions
• Follow-up on improvement strategies, implement them, and report progress
• Restart the cycle to assess the impact of actions
Step 6: Implement Actions and Reassess
Overview of Assessment: Timeline
Deadline | Task Due
Summer 2012, May 15 | Interim Report Due
Fall 2013, Oct 15 | Full Reports/Plans Due
Summer 2014, May 15 | Interim Report Due
Cycle B
Student Learning Outcomes (SLOs)
• Student Learning Outcomes (SLOs) are program-related outcomes
• SLOs focus on students’ knowledge and skills expected upon completion of an academic degree program
• “A learning outcome is a stated expectation of what someone will have learned” (Driscoll & Wood, 2007, p. 5)
• “A learning outcome statement describes what students should be able to demonstrate, represent, or produce based on their learning histories” (Maki, 2004, p. 60)
• “A learning outcome describes our intentions about what students should know, understand, and be able to do with their knowledge when they graduate” (Huba & Freed, 2000, pp. 9-10)
• What should my students know or be able to do at the time of graduation?
Program Learning Outcomes (POs)
• Program Outcomes (POs) focus on expected programmatic changes that will improve overall program quality for all stakeholders (students, faculty, staff)
• “Program outcomes illustrate what you want your program to do. These outcomes differ from learning outcomes in that you discuss what it is that you want your program to accomplish.” (Bresciani, n.d., p. 3)
• Program outcomes assist in determining whether the services, activities, and experiences of and within a program positively impact the individuals it seeks to serve.
• Program outcomes emphasize areas such as recruitment, professional development, advising, hiring processes, and/or satisfaction rates.
• How can I make this program more efficient?
Administrative Assessment (AAs)
• Administrative Areas: Dean’s Office, Centers/Institutes
• Outcomes aligned to: unit mission/vision, annual goals, university mission/vision, strategic plan
• Outcomes focus on each of the following areas (all 4 required for Dean’s Office): Administrative Support Services, Educational Support Services, Research, Community Service
• Student learning is also assessed for units providing learning services to students (e.g., workshops, seminars, etc.)
Matrices I: Effective Outcomes
Student Learning Outcomes (SLOs)
SMARTER Criteria
• Specific – Is the expected behavior and skill clearly indicated?
• Measurable – Can the knowledge/skill/attitude be measured?
• Attainable – Is it viable given the program courses and resources?
• Relevant – Does it pertain to the major goals of the program?
• Timely – Can graduates achieve the outcome prior to graduation?
• Evaluate – Is there an evaluation plan?
• Reevaluate – Can it be evaluated after improvement strategies have been implemented?
Student Learning Outcomes (SLOs)
Student Learning Outcomes (cognitive, practical, or affective)
1. Can be observed and measured
2. Relates to student learning towards the end of the program (the graduating student)
3. Reflects an important higher-order concept
Formula: Who + Action Verb + What
Dietetics and nutrition students will determine the preparation, storage, and handling of food products, and the adequacy of meals, meal service, and delivery of foods to customers/clients.
Student Learning Outcomes (SLOs)
• 90% of dietetics and nutrition students in their final year will pass the comprehensive dietetics exam, covering the didactic curriculum as set forth by the accrediting agency ACEND of The American Dietetic Association (ADA).
• Environmental & Occupational Health - EOH doctoral candidates will write and defend a dissertation proposal, and demonstrate proficiency in the content and methodological areas related to their proposed research study.
• Social work undergraduate students will identify diversity and difference in practice in their final project for course x.
Strong Examples
• Students will be able to demonstrate communication skills from dietetics and nutrition.
• Graduates will demonstrate a basic knowledge of ethics in public health.
• Social work students understand oppression and injustice.
Weak Examples
Program Outcomes (POs)
Program Outcomes (efficiency measures)
1. Can be observed and measured
2. Related to program-level goals that do not relate to student learning (e.g., student services, graduation, retention, faculty productivity, and other similar areas)
Formula: Who + Action Verb + What
Full-time students will graduate from the doctoral program in Public Health within 6 years of program admission.
Program Outcomes (POs)
• The department’s advising office will schedule student appointments within 2 weeks of initial contact.
• Students will be satisfied with services provided by the career placement office in the Dietetics and Nutrition department.
• Faculty in Social Work will be involved with a minimum of 4 public events per semester.
Strong Examples
• Graduation rates will increase.
• Surveys will be used to assess student satisfaction.
• Career services will work with student placements.
Weak Examples
Streamlining Outcomes with Program Goals
Goals
Accreditation
Course Outcomes
Program Mission and Goals
• Question: Do the mission and goals match the knowledge/skills expected for graduates?
• Task: Break down mission and goals; verify these are reflected in the outcomes.

Accreditation Principles
• Question: What are the competencies required for assessment and how do they match my program mission/goals?
• Task: Review required competencies for accreditation or other constituencies; streamline requirements and outcomes.

Course Outcomes
• Question: How are the program’s learning outcomes reflected in the courses?
• Task: Review course syllabi and outcomes to check for alignment; develop a curriculum map.
Aligning Accreditation Competencies with the FIU Assessment Process
• Why Alignment Matters
– Assessment can seem like a burdensome process, but if all of your assessment needs are aligned, the assessment process only needs to be completed once.
• How APA Can Help
– The competencies required by your specialized accrediting agencies can be easily combined with the SACS requirements. APA can assist in developing outcomes that are relevant for both purposes.
– TracDat is fully customizable, and the APA staff can include custom alignments according to your specialized accreditation needs.
• A “one stop shop” for all of your assessment needs
• Serves as a “data warehouse”, storing all of your assessment data in one place!
• Reports can be run based on the alignments you need, for example: outcomes aligned with your specialized accreditation, SACS, or both
Getting Started with Alignment
• Identify the core competencies that are required by your specialized accrediting agency
• Do any of these outcomes fulfill the SACS requirements? (Content Knowledge, Critical Thinking, Communication (Oral/Written), Technology)
• For competencies that do not fall into one of the above categories, you should develop outcomes for them and enter them into TracDat accordingly.
Curriculum Mapping
Curriculum maps help identify where within the curriculum learning outcomes are addressed and provide a means to determine whether the elements of the curriculum are aligned.
Planning: curriculum, learning outcomes
Identifying: gaps, improvement areas, measures
• A good curriculum map ensures that all program stakeholders understand how your outcomes align with certain courses throughout the curriculum.
• For specialized accreditation alignment purposes, your curriculum map should include all competencies required, not just those that are being used for FIU Assessment purposes.
Tying Outcomes to Curriculum: Curriculum Maps
1. Collect All Relevant or Required Information: e.g., course syllabi, curriculum requirements, and major learning competencies
2. Collaborate with Faculty and Staff Members: delineate where the learning outcomes are taught, reviewed, reinforced, and/or evaluated within each of the required courses
3. Identify Major Assignments within Courses: discuss how accurately they measure the learning outcomes
4. Create a Curriculum Map: courses on one axis and learning outcomes on the other
5. Make Changes as Appropriate: if there are any gaps in teaching or assessing learning outcomes
Tying Outcomes to Curriculum: Curriculum Maps
Competency/Skill | Introductory Course | Methods Course | Required Course 1 | Required Course 2 | Required Course 3 | Required Course 4 | Capstone Course
Content SLO 1 Introduced Introduced Reinforced Reinforced Mastery/Assessed
Content SLO 2 Introduced Reinforced Introduced Reinforced Mastery/Assessed
Content SLO 3 Introduced Introduced Reinforced Mastery/Assessed
Critical Thinking SLO 1 Introduced Introduced Reinforced
Critical Thinking SLO 2 Introduced Introduced Mastery/Assessed
Communication SLO 1 Introduced Reinforced Mastery/Assessed
Communication SLO 2 Introduced Mastery/Assessed
Integrity / Values SLO 1 Introduced Reinforced Reinforced Mastery/Assessed
Integrity / Values SLO 2 Introduced
• Introduced = indicates that students are introduced to a particular outcome
• Reinforced = indicates the outcome is reinforced and certain courses allow students to practice it more
• Mastered = indicates that students have mastered a particular outcome
• Assessed = indicates that evidence/data are collected, analyzed, and evaluated for program-level assessment
*Adapted from University of West Florida, Writing Behavioral, Measurable Student Learning Outcomes CUTLA Workshop May 16, 2007.
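A curriculum map like the one above can also be checked mechanically. The sketch below (in Python, with hypothetical course and outcome names rather than FIU's actual map) represents the map as a table of coverage levels and flags outcomes that are never assessed anywhere in the curriculum:

```python
# Illustrative curriculum map: each outcome maps courses to a coverage level.
# Course and outcome names here are hypothetical examples, not FIU's map.
curriculum_map = {
    "Content SLO 1": {
        "Introductory Course": "Introduced",
        "Required Course 1": "Reinforced",
        "Capstone Course": "Mastery/Assessed",
    },
    "Critical Thinking SLO 1": {
        "Introductory Course": "Introduced",
        "Methods Course": "Reinforced",
    },
}

def find_unassessed(cmap):
    """Return outcomes with no course marked as assessing them."""
    return [slo for slo, courses in cmap.items()
            if not any("Assessed" in level for level in courses.values())]

# "Critical Thinking SLO 1" is introduced and reinforced but never assessed,
# so it surfaces as a gap to address before the next assessment cycle.
print(find_unassessed(curriculum_map))
```

The same check extends naturally to other gaps, such as outcomes that are assessed without ever being introduced.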
Matrices II: Effective Methods
Choosing Assessment Measures/Instruments
1. Identify Assessment Needs
• What are you trying to measure or understand? Everything from artifacts of student learning to program efficiency to administrative objectives.
• Is this skill or proficiency a cornerstone of what every graduate in my field should know or be able to do?

2. Match Purpose with Tools
• What type of tool would best measure the outcome (e.g., assignment, exam, project, or survey)?
• Do you already have access to such a tool? If so, where and when is it collected?

3. Define Use of Assessment Tool
• When and where do you distribute the tool (e.g., in a capstone course right before graduation)?
• Who uses the tool (e.g., students, alumni)?
• Where will the participants complete the assessment?
• How often do you use or will you use the tool (e.g., every semester or annually)?
Understanding Types of Measurements
• Direct versus Indirect Measures
– Direct Measure: Learning assessed using tools that measure direct observations of learning, such as assignments, exams, and portfolios; precise and effective at determining whether students have learned competencies defined in outcomes
– Indirect Measure: Learning assessed using tools that measure perspectives and opinions about learning, such as surveys, interviews, and evaluations; provide supplemental details that may help a program/department understand how students think about learning and the strengths/weaknesses of a program

• Program Measures versus Course Measures
– Program Measure: Provides data at the program level and enables a department to understand the overall learning experience; includes data from exit exams and graduation surveys
– Course Measure: Provides data at the course level and enables professors to determine competencies achieved at the end of courses; includes data from final projects/presentations and pre-post exams

• Formative versus Summative Measures
– Formative Measures: Assessing learning over a specific timeline, generally throughout the academic semester or year
– Summative Measures: Assessing learning at the end of a semester or year, or at graduation
Examples of Measures/Instruments
Course Level
• Essays
• Presentations
• Minute papers
• Embedded questions
• Pre-post tests

Program Level
• Portfolios
• Exit exams
• Graduation surveys
• Discipline-specific national exams
Direct Measures
• Standardized exams
• Exit examinations
• Portfolios
• Pre-tests and post-tests
• Locally developed exams
• Papers
• Oral presentations
• Behavioral observations
• Thesis/dissertation
• Simulations/case studies
• Videotaped/audiotaped assignments
Indirect Measures
• Surveys or questionnaires
– Student perception
– Alumni perception
– Employer perception
• Focus groups
• Interviews
• Student records
Institution-Level Assessments
• NSSE, FSSE
• Proficiency Profile
(Kuh & Ikenberry, 2009, p. 10)
1. Graduating Master’s and Doctoral Student Survey
2. Graduating Senior Survey
3. Student Satisfaction Survey
4. Global Learning Perspectives Inventory
• Alumni Survey
• Case Response Assessment
Introduction to Rubrics
Definition
• Rubrics are tools used to score or assess student work using well-defined criteria and standards.

Common Uses
• Evaluate essays, short-answer responses, portfolios, projects, presentations, and other similar artifacts.

Benefits
• Makes learning expectations clear for current and future faculty teaching the course
• Transparency of expectations for students
• Provides meaningful contextual data as opposed to only having grades or scores
• Provides students with clearer feedback on performance (if scored rubrics are handed back to students)
• Useful for measuring creativity, critical thinking, and other competencies requiring deep multidimensional skills/knowledge
• Increases inter-rater reliability by establishing clear guidelines for assessing student learning
• Allows easy, repeated usage over time
• Inexpensive to develop and implement
Steps for Developing Rubrics
1. Identify Competencies
• Narrow down the most important learning competencies you are trying to measure. Ask yourself what you wanted students to learn and why you created the assignment.
• List the main ideas or areas that would specifically address the learning competencies you identified.
2. Develop a Scale
• Think of the types of scores that would best apply to measuring the competencies (e.g., a 5-point scale from (1) Beginning to (5) Exemplary).
• Scales depend on how they would apply to the assignment, the competencies addressed, and the expectations of the instructor.
3. Produce a Matrix
• Using the information gathered from the previous two steps, you can create a matrix to organize the information.
• Optional: describe the proficiencies, behaviors, or skills each student will demonstrate depending on the particular criterion and its associated performance scale ranking or score.
Rubric Template
List all of the competencies measured.

Performance Scale | 1 Unacceptable | 2 Acceptable | 3 Excellent | POINTS
Criterion 1 | Competency not demonstrated | Competency demonstrated | Competency demonstrated at an advanced level |
Criterion 2 | Competency not demonstrated | Competency demonstrated | Competency demonstrated at an advanced level |
Criterion 3 | Competency not demonstrated | Competency demonstrated | Competency demonstrated at an advanced level |
AVERAGE POINTS |  |  |  |
Reporting Results
Summary of Results
• Format: narrative, tables, or charts
• Analysis/Interpretation of results: explain results in narrative form by interpreting results or using qualitative analysis of the data

Every student learning outcome must have at least:
• One set of results
• One student learning improvement strategy (use of results)
Reporting Results
Non-Examples:
1. Our students passed the dissertation defense on the first attempt.
2. All the students passed the national exam.
3. Criteria met.

Examples:
1. 75% of the students (n=15) achieved a 3 or better on the 5 rubric categories for the capstone course research paper. Average score was 3.45.
2. Overall, 60% of students met the criteria (n=20) with a 2.65 total average. The rubric’s 4 criteria scores were as follows:
   o Grammar: 3.10 (80% met minimum criteria)
   o Research Questions: 2.55 (65% met minimum criteria)
   o Knowledge of Topic: 2.50 (55% met minimum criteria)
   o Application of Content Theories: 2.45 (60% met minimum criteria)
Reporting Results
Frequency of Student Results for all Four Categories of the Research Paper (N=20 Students)

Category | 1 Novice | 2 Apprentice | 3 Practitioner | 4 Expert | Total Meeting Criteria
Grammar | N=2 (10%) | N=2 (10%) | N=8 (40%) | N=8 (40%) | 3.10 average (62 points); 80% (n=16) met criteria
Essay Structure | N=4 (20%) | N=3 (15%) | N=11 (55%) | N=2 (10%) | 2.55 average (51 points); 65% (n=13) met criteria
Coherence of Argument | N=2 (10%) | N=7 (35%) | N=10 (50%) | N=1 (5%) | 2.50 average (50 points); 55% (n=11) met criteria
Research Based Evidence | N=3 (15%) | N=5 (25%) | N=12 (60%) | N=0 (0%) | 2.45 average (49 points); 60% (n=12) met criteria
AVERAGE TOTAL |  |  |  |  | 2.65 average score; 65% (n=11) met criteria
Reporting Results: Formulas
N = 20 (students)

Grammar
• 1: N=2 (10%) — 2/20 = .10; .10(100) = 10%
• 2: N=2 (10%) — 2/20 = .10; .10(100) = 10%
• 3: N=8 (40%) — 8/20 = .40; .40(100) = 40%
• 4: N=8 (40%) — 8/20 = .40; .40(100) = 40%
• 3.10 average (62 points): 2(1) + 2(2) + 8(3) + 8(4) = 62; 62/20 = 3.10
• 80% (n=16) met criteria: 40% + 40% = 80% (8 + 8 = 16)

Essay Structure
• 1: N=4 (20%) — 4/20 = .20; .20(100) = 20%
• 2: N=3 (15%) — 3/20 = .15; .15(100) = 15%
• 3: N=11 (55%) — 11/20 = .55; .55(100) = 55%
• 4: N=2 (10%) — 2/20 = .10; .10(100) = 10%
• 2.55 average (51 points): 4(1) + 3(2) + 11(3) + 2(4) = 51; 51/20 = 2.55
• 65% (n=13) met criteria: 55% + 10% = 65% (11 + 2 = 13)

Coherence of Argument
• 1: N=2 (10%) — 2/20 = .10; .10(100) = 10%
• 2: N=7 (35%) — 7/20 = .35; .35(100) = 35%
• 3: N=10 (50%) — 10/20 = .50; .50(100) = 50%
• 4: N=1 (5%) — 1/20 = .05; .05(100) = 5%
• 2.50 average (50 points): 2(1) + 7(2) + 10(3) + 1(4) = 50; 50/20 = 2.50
• 55% (n=11) met criteria: 50% + 5% = 55% (10 + 1 = 11)

Research Based Evidence
• 1: N=3 (15%) — 3/20 = .15; .15(100) = 15%
• 2: N=5 (25%) — 5/20 = .25; .25(100) = 25%
• 3: N=12 (60%) — 12/20 = .60; .60(100) = 60%
• 4: N=0 (0%) — 0/20 = 0; 0(100) = 0%
• 2.45 average (49 points): 3(1) + 5(2) + 12(3) + 0(4) = 49; 49/20 = 2.45
• 60% (n=12) met criteria: 60% + 0% = 60% (12 + 0 = 12)

AVERAGE TOTAL
• 2.65 average score: 3.10 + 2.55 + 2.50 + 2.45 = 10.6; 10.6/4 = 2.65
• 65% (n=11) met criteria: 80% + 65% + 55% + 60% = 260; 260/4 = 65%; 16 + 13 + 11 + 12 = 43; 43/4 = 10.75 ≈ 11
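The formulas above follow a single pattern: a frequency-weighted average of the rubric scores, plus the share of students scoring 3 or higher. A short Python sketch that reproduces the numbers in the table:

```python
# Score frequencies per criterion: number of students scoring 1, 2, 3, 4.
# These are the counts from the research-paper rubric table above (N=20).
counts = {
    "Grammar":                 [2, 2, 8, 8],
    "Essay Structure":         [4, 3, 11, 2],
    "Coherence of Argument":   [2, 7, 10, 1],
    "Research Based Evidence": [3, 5, 12, 0],
}

def summarize(freq):
    """Return (average score, students meeting criteria, percent meeting)."""
    n = sum(freq)                                         # total students
    points = sum(score * k for score, k in enumerate(freq, start=1))
    met = freq[2] + freq[3]                               # scored 3 or 4
    return points / n, met, 100 * met / n

for name, freq in counts.items():
    avg, met, pct = summarize(freq)
    print(f"{name}: {avg:.2f} average, {pct:.0f}% (n={met}) met criteria")

overall = sum(summarize(f)[0] for f in counts.values()) / len(counts)
print(f"Overall average: {overall:.2f}")   # matches the 2.65 in the table
```

Any program that records rubric score frequencies per criterion can reuse this pattern; only the `counts` table changes.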
Using Results for Improvements
DO vs. DON’T
• DO focus on making specific improvements based on faculty consensus.
•DON’T focus on simply planning for improvements or making improvements without faculty feedback.
•DO focus on improvements that will impact the adjoining outcome.
•DON’T focus on improvements that are unrelated to the outcome.
•DO use concrete ideas (e.g., include specific timelines, courses, activities, etc.).
•DON’T write vague ideas or plan to plan.
•DO state strategies that are sustainable and feasible.
•DON’T use strategies that are impossible to complete within two years considering your resources.
•DO use strategies that can improve the curriculum and help students learn outside of courses.
•DON’T simply focus on making changes to the assessment measures used.
Using Results for Improvements: Student Learning

Curriculum Changes
• Mandate or create new courses
• Eliminate/merge course(s)
• Change degree requirements

Course Objectives
• Change course descriptions
• Change syllabi to address specific learning outcomes

Within-Course Activities
• Add new assignments to emphasize specific competencies
• Increase time spent teaching certain content
• Change themes, topics, or units
Using Results for Improvements: Student Learning
University Resources
• Use outside resources to enhance student learning (e.g., refer students to the Center for Academic Excellence)

Faculty-Student Interaction
• Publish or present joint papers
• Provide feedback on student work, advising, office hours
• Disseminate information (e.g., distributing newsletters, sharing publications, etc.)

Resources for Students
• Create/maintain resource libraries (e.g., books, publications, etc.)
• Offer professional support or tutoring
• Provide computer labs or software
Using Results for Improvements: Program Outcomes
Financial
• Obtain financial resources: funding, grants, etc.
• Hire new faculty/staff
• Reduce spending

Enrollment
• Change recruitment efforts/tactics
• Increase enrollment

Policy Changes
• Change policies, values, missions, or conceptual frameworks of a program or unit

Services
• Add or expand services to improve quality
• Add or expand processes to improve efficiency
Using Results for Improvements: Program Outcomes
Research and Information
• Conduct research
• Gather and/or disseminate information
• Produce publications or presentations

Professional Development
• Create professional development opportunities
• Attend professional conferences or workshops

Engagement
• Establish collaborations across stakeholders or disciplines
• Provide services or establish links to the community

Resources
• Acquire new equipment, software, etc.
• Provide resources to specific groups
Q & A Session
References for Cover Pictures
1. http://www.superscholar.org/rankings/online/best-public-health-degrees/
2. http://www.collegeonline.com/news-about-online-colleges/social-work-online/
3. http://greensandberries.squarespace.com/greens-and-berries/2010/7/28/civic-dietetics.html
References
Thank you for attending.
Contact Us:
Katherine Perez: [email protected]
Bridgette Cram: [email protected]

Departmental Information: [email protected] 112