
Strategies for Implementing Program-Level Assessment through Blackboard Outcomes


Strategies for Implementing Program-Level Assessment through Outcomes

Jeremy Anderson, Manager, Academic Computing

Dr. Kaitlin Walsh, Instructional Designer/Technologist

American International College

About American International College

Who are you?

• Faculty, staff, administration?

• What is your experience using Outcomes?

– No experience

– Heard of it

– Investigated it

– Adopted it

• Three burning questions?

The Challenge

Program Assessment Needs to Be:

• Systemic

• Sustainable

• And…

Systematic

Team-Based Approach

Assessment at AIC

Organizational chart: President, Provost, Institutional Effectiveness, Deans & Departments, Senate, Assessment Committee, Faculty Secretary*, EVP Administration, IT

Working Across the Organization

Assessment Committee

Working Group

Pilot

Methods to Scale – Preparation

Getting the Framework in Place

External Feedback

• Assessment Plan Template

• Program Review Schedule

• Assessment Calendar

Internal Planning

• Other adopters

• Bb Consulting

Assessment Plan Template

Assessment Calendar

• Fall – Develop outcomes; build the assessment plan in Bb Outcomes

• Spring – Gather evidence & evaluate

• Summer/Fall – Analyze & discuss

• Next academic year – Improve instruction

Institutional Support

• Assessment Day (2)

• Assessment Hour (1)

• ½-time position to support Outcomes

Methods to Scale – Adoption

Questions for New Adopters – Readiness Check

1. Do you have an assessment lead?

2. Are your goals ready?

3. Are your assignments ready?

4. Are your rubrics ready?

5. What is your assessment calendar?

a. Frequency of each goal

b. Frequency of full cycle

Questions for New Adopters – Operational Decisions

1. What is your collection period?

2. Who will complete evaluation sessions? How many evaluators must complete them?

3. What sampling level will you require? (See the sizing sketch after this list.)

4. Will you need to keep samples of student work with your reports?
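AIC did not end up needing to sample (see the class-size caveat later in the deck), but for programs that do, the sampling-level decision largely comes down to translating a percentage into artifact counts per section. Below is a minimal sketch of that arithmetic in Python; the section names, enrollments, and 50% rate are hypothetical values for illustration, not figures from the presentation or settings inside Blackboard Outcomes itself.

```python
import math
import random

# Hypothetical enrollments per section; real numbers would come from
# course rosters. These names and the 50% rate are assumptions only.
section_enrollments = {"PSY4950-01": 14, "MBA6300-01": 9, "MBA6300-02": 11}
SAMPLING_RATE = 0.50

for section, enrolled in section_enrollments.items():
    # Round up so even very small sections yield at least one artifact.
    sample_size = max(1, math.ceil(enrolled * SAMPLING_RATE))
    # Randomly pick which submissions the evaluators will score.
    chosen = sorted(random.sample(range(1, enrolled + 1), sample_size))
    print(f"{section}: evaluate {sample_size} of {enrolled} artifacts {chosen}")
```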

Starting Small

Undergraduate Psych

• All outcomes

• 1 course

• Capstone assignment

• Independent evaluator

• Department rubric

MBA

• 2 outcomes

• 2 courses (1 o/c)

• Milestone assignments

• Independent evaluators

• AAC&U VALUE rubric

• Department rubric

Training

• Introductory session for faculty and chairs in several prospective departments

• One-on-one training with individual assessment coordinators in pilot departments

• Reconvened for follow-up training

• Next step: documentation

Roles – What Can They Do?

• System Administrator – Alignments: Add/Remove; Goals: Create/Edit/Delete; Surveys: Create/Edit/Delete; Rubrics: Create/Edit/Delete; Evidence Collection: Create/Edit/Delete

• Assessment Administrator – Alignments: Add; Goals: Create/Edit/Delete; Surveys: Create/Edit/Delete; Rubrics: Create/Edit/Delete; Evidence Collection: Create/Edit/Delete

• Assessment Manager – Alignments: None; Goals: Run Reports; Surveys: None; Rubrics: None; Evidence Collection: Create/Edit/Delete

• Rubric Manager – Alignments: None; Goals: Read-Only; Surveys: None; Rubrics: Create/Edit/Delete; Evidence Collection: None

• Survey Author – Alignments: None; Goals: Read-Only; Surveys: Create/Edit/Delete; Rubrics: None; Evidence Collection: None

• Goals Manager – Alignments: None; Goals: Create/Edit/Delete; Surveys: None; Rubrics: None; Evidence Collection: None

Roles – What Can They See?

• System Administrator – Admin Tab: Yes; Outcomes Tab: Yes; Outcomes Dashboard: All; Surveys: All

• Assessment Administrator – Admin Tab: Yes; Outcomes Tab: Yes; Outcomes Dashboard: All; Surveys: Only their own

• Assessment Manager – Admin Tab: No; Outcomes Tab: Yes; Outcomes Dashboard: Goals & Assessments, Organizations; Surveys: None

• Rubric Manager – Admin Tab: No; Outcomes Tab: No; Outcomes Dashboard: No; Surveys: No

• Survey Author – Admin Tab: No; Outcomes Tab: No; Outcomes Dashboard: No; Surveys: Only their own

• Goals Manager – Admin Tab: Yes; Outcomes Tab: No; Outcomes Dashboard: No; Surveys: No

Caveat – Class and Program Size

• AIC’s class and program sizes are generally very small, so there is no need to limit the sample size when evaluating outcomes.

– There is also no real need to make evaluations anonymous; the instructor is often involved in the evaluation.

• Small program sizes also affect outcomes planning.

– They limit the number of outside evaluators and assessment coordinators.

Source: US News & World Report

Things We Considered (But Didn’t Adopt Yet)

• Help Desk Ticket to submit Outcomes changes

• Organization to distribute reports

• Having faculty input own outcomes

General Education Outcomes

General Education at AIC

• Largely in flux

• Writing Intensive Courses

• Shared rubric for Title III

• AAC&U’s VALUE Rubrics

Pilot

• COM2200 – Information & Technology

• Writing Intensive Course

• 4 sections, 3 faculty, 2 evaluators

Next Steps

• Develop anchor set

• Train on the rubric

• Include other WICs

Next Steps & Lessons Learned

Building on What We Have

• New programs

• Assessment Hours throughout the year

• Expanding within departments

• ½-time position

Lessons Learned – Standardize Everything!

Lessons Learned – Rubric Design

More on Reports

• Depending on the type of data you need, you may need to adjust your reports (see the sketch after this list).

– Convert rows to columns

– Clean up for SPSS

– May be no problem for some faculty

– Need different data? Submit an enhancement request
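As a concrete illustration of the row-to-column cleanup above, here is a minimal sketch in Python using pandas. It assumes a long-format export with one row per student per rubric criterion; the file name and the column names (Student, Criterion, Score) are assumptions and should be adjusted to match the headers in the actual Outcomes report download.

```python
import pandas as pd

# Load the raw evaluation export. The file name and the column names
# ("Student", "Criterion", "Score") are assumptions; match them to the
# headers in your actual Outcomes report.
raw = pd.read_csv("outcomes_evaluations.csv")

# Pivot from long format (one row per student per rubric criterion) to
# wide format (one row per student, one column per criterion).
wide = raw.pivot_table(index="Student", columns="Criterion",
                       values="Score", aggfunc="first").reset_index()

# SPSS variable names cannot contain spaces, so swap them for
# underscores before exporting the cleaned file.
wide.columns = [str(col).strip().replace(" ", "_") for col in wide.columns]
wide.to_csv("outcomes_for_spss.csv", index=False)
```

Faculty who are comfortable restructuring data inside SPSS itself may not need this step at all, as noted above.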

Lessons Learned – Assignment Submission

• Needed to train some faculty on collecting assignments in Bb

• No direct submit

• No multiple attempts

• Group assignments not recommended

Unexpected benefit!

Using outcomes rubrics motivated faculty to explore the use of Blackboard’s rubric tool within their own courses.

