2008 Library Assessment Conference, Seattle, WA
Using iSkills to measure instructional efficacy: One example from the University of Central Florida
Penny Beile, University of Central Florida
Transcript
Page 1

2008 Library Assessment Conference, Seattle, WA

Using iSkills to measure instructional efficacy: One example from the University of Central Florida

Penny Beile

University of Central Florida

Page 2

Background

Institutional description

SACS Quality Enhancement Plan

Accreditation-driven initiative

Four programs initially selected

Multiple assessments

Page 3

A (very brief) Comment on Types of Direct Measures…

                  Objective           Interpretive
Costs             $$ to purchase      Labor to score
Administration    Large scale         Smaller numbers
Results           Wide and thin       Narrow and deep
Domain            Knowledge           Performance

Page 4

Methods - Nursing

Goal is to collect baseline data, design curricular interventions, and reassess to evaluate instructional efficacy

Nursing students matriculate as a cohort; ~120 students enter the BSN program each semester

Analyze at the cohort level and across cohorts

Page 5

2007-2012 Plan

            Program Entry                                Program Exit
Cohort 1    Baseline, no intervention; design            Maturation; possibly control for
            instruction to target deficiencies (2007)    non-instructional variables later (2009)
Cohort 2    Intervention effect (2008)                   Growth in program (2010)
Cohort 3    Intervention effect (2009)                   Growth in program (2011)
Cohort 4    Intervention effect (2010)                   Growth in program (2012)

Page 6

Still Early in the Project

(2007-2012 plan table repeated from the previous slide)

Page 7

Results of 2007 Administration

114 students in the class; 107 completed iSkills

Scores ranged from 485 to 625 (M = 561.36, SD = 29.94)

Established cut score of 575
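
These administration figures (mean, standard deviation, range, and share of students at or above the 575 cut score) can be recomputed directly from the raw score file. A minimal sketch in Python, using placeholder scores rather than the actual 2007 UCF data:

```python
# Minimal sketch: summarizing one iSkills administration against the 575 cut score.
# The scores list below is a placeholder, not the 2007 UCF data.
import statistics

scores = [540, 585, 560, 610, 495, 575, 530]   # hypothetical student scores
CUT_SCORE = 575

mean = statistics.mean(scores)
sd = statistics.stdev(scores)                  # sample standard deviation
at_or_above = sum(s >= CUT_SCORE for s in scores) / len(scores)

print(f"n = {len(scores)}, range = {min(scores)}-{max(scores)}")
print(f"M = {mean:.2f}, SD = {sd:.2f}")
print(f"At or above cut score ({CUT_SCORE}): {at_or_above:.0%}")
```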

Page 8

How Data are Being Used

To identify where instruction is needed

Over time, to assess efficacy of interventions and instructional models

To provide evidence that we are meeting our instructional goals

Page 9

Implications for Practice

Assessment offers a critical and objective way to see how effective we are in meeting our instructional goals
– Do libraries contribute to the academic mission of the institution?
– How effective are our current models?
– Leads us to explore new frameworks

Page 10

2008 Library Assessment Conference, Seattle, WA

Using iSkills to Measure Instructional Efficacy: The CSU Experience

Stephanie Brasley
California State University, Office of the Chancellor

[email protected]

Page 11

Background – California State University

23 campuses

Information Competence (IC) pioneers

Sponsoring partner with ETS on the iSkills assessment

IC Grant Program, iSkills focus, 2006-2008
– 9 campuses

Snapshot of use
– California Maritime Academy – small campus
– CSU Los Angeles – medium-sized campus
– San Jose State – large campus

Page 12

California Maritime Academy (CMA) - Approach

Mindy Drake – [email protected]

Advanced test used as a pre-test

Goal: baseline data and current skill set

Test groups:
– Freshmen in COM 100 and ENGR 120
  • 151 tested; 137 analyzed: 57% of incoming freshmen
– Seniors in capstone courses
  • 80 tested; 49 analyzed: 32% of the senior population

Page 13

CMA - Deliverables

– Information Fluency and Communication Literacy learning objectives

– Rubric for assessing the development of information and communication technology skills within course assignments

– Modified COM 100 & ENG 120 assignments and supplemental materials

– Syllabus and iSkills-influenced learning objectives of the newly developed LIB 100: Information Fluency in the Digital World course

Page 14

CMA – Summary Results

iSkills data used in 4 ways
– Development of learning objectives
– Baseline for ICT literacy of incoming freshmen
– Determining the ICT literacy skill set of current seniors
– Catalyst for innovation in the design of ICT literacy instructional activities for freshmen

Page 15

CMA – More Results

Page 16

CSU Los Angeles – Approach
Catherine Haras – [email protected]

Advanced test used as a pre/post test

Goal: evaluate ICT literacy-related instructional interventions

Target group: 234 students enrolled in Business Communications (juniors and seniors)
– Approx. 60% transfer students and 70% ESL students

Study run over three quarters (Fall 2006, Winter 2007, Spring 2007)

Page 17

CSU Los Angeles Study: Methodology

All three sections took the iSkills pretest at the start of the quarter and the iSkills posttest at the end.

Treatment (day), Instructor A:       Business 305 curriculum; 1.5 hr library lecture; two library workshops; information literacy project
Treatment (evening), Instructor B:   Business 305 curriculum; 1.5 hr library lecture; two library workshops
Control, Instructor A:               Business 305 curriculum; 1.5 hr library lecture
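
One straightforward way to analyze a design like this is to compare pretest-to-posttest gains across the three sections. The sketch below is only illustrative: the score arrays are placeholders rather than the study's data, and a paired t-test within each section is just one of several defensible analyses, not necessarily the one used in the CSU Los Angeles study.

```python
# Illustrative pre/post comparison for the three sections; score arrays are
# placeholders, not data from the CSU Los Angeles study.
import numpy as np
from scipy import stats

groups = {
    "Treatment (day)":     (np.array([540, 555, 530, 548]), np.array([575, 580, 560, 570])),
    "Treatment (evening)": (np.array([535, 545, 525, 538]), np.array([560, 565, 550, 558])),
    "Control":             (np.array([545, 550, 540, 542]), np.array([550, 552, 545, 548])),
}

for name, (pre, post) in groups.items():
    gain = post - pre                      # per-student pretest-to-posttest change
    t, p = stats.ttest_rel(post, pre)      # paired t-test within the section
    print(f"{name}: mean gain = {gain.mean():.1f}, t = {t:.2f}, p = {p:.3f}")
```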

Page 18

CSU Los Angeles - Summary Results

[Chart: mean iSkills scores at pretest and posttest for the treatment (day), treatment (evening), and control sections; y-axis spans 510 to 590]

Page 19

CSU Los Angeles – More Results

Submitted June 2008 to the Journal of Education for Business

[Chart: pretest and posttest scores for treatment (day), treatment (evening), and control, each broken out by language group (English vs. another); y-axis spans 500 to 600]

Page 20

San Jose State University
Toby Matoush – [email protected]

Advanced test used as a post-test

Goal: determine gaps and develop instructional interventions

Test groups:
– Freshmen: MUSE students (59); Eng. 1B (100)
– Sophomores - Juniors

Page 21

Grade        N     Mean     Std. Dev.   95% CI for Mean (Lower, Upper)   Min   Max
10th grade   2     555.00   21.213      364.41, 745.59                   540   570
12th grade   4     520.00   48.305      443.14, 596.86                   480   590
Freshman     154   554.77   31.645      549.73, 559.81                   455   620
Sophomore    93    547.90   26.696      542.41, 553.40                   465   615
Junior       192   548.15   32.345      543.55, 552.76                   470   615
Senior       149   548.56   32.290      543.33, 553.78                   475   620
Grad         4     580.00   44.347      509.43, 650.57                   525   625
Other        5     544.00   29.240      507.69, 580.31                   515   575
Total        603   549.92   31.623      547.39, 552.45                   455   625
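
The 95% confidence intervals in this table follow from the reported N, mean, and standard deviation via the t-distribution. A short check in Python (scipy assumed available), using the Freshman row from the table above:

```python
# Reconstructing a 95% confidence interval for a group mean from the summary
# statistics in the table above (Freshman row: N = 154, M = 554.77, SD = 31.645).
import math
from scipy import stats

n, mean, sd = 154, 554.77, 31.645
se = sd / math.sqrt(n)                     # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)      # two-sided 95% critical value
lower, upper = mean - t_crit * se, mean + t_crit * se

print(f"95% CI: ({lower:.2f}, {upper:.2f})")   # ~ (549.73, 559.81), matching the table
```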

