Learning Analytics Panel Session: Déjà Vu all over again?

Description

Panel presentation at the DET/CHE 2012 conference on November 28, 2012, by Kathy Fernandes (Chico State), James Frazee (San Diego State), Andrew Roderick (SFSU), and Deone Zell (CSU Northridge), moderated by John Whitmer (CSU Office of the Chancellor).

Transcript

Panelists: Kathy Fernandes, James Frazee, Andrew Roderick & Deone Zell

Moderator: John Whitmer

DET/CHE 2012, November 28, 2012

Learning Analytics Panel Session

Slides: http://goo.gl/UgBYM

Introductions

Kathy Fernandes

Director of Academic Technology, Chico State University

Director of CSU System-wide LMSS Project, CSU Office of the Chancellor

James Frazee

Director, Instructional Technology Services, San Diego State University

Andrew Roderick

Technology Development Manager, San Francisco State University

Deone Zell

Senior Director, Academic Technology, California State University, Northridge

John Whitmer

Associate Director, System-wide LMSS Project, CSU Office of the Chancellor


Outline

1. Learning Analytics Context: Definition & Examples

2. Panelist Discussion

3. Audience Q & A


Learning Analytics: Working Definition for Academic Technologists

Analysis of the relationship between the use of academic technology and student achievement:

– Uses data, frequently “big data”, disaggregated

– Builds on (and moves beyond) usage reporting to examine impact

– Focused on making change

Learning analytics isn’t a yes/no dichotomy; it is better seen as a spectrum.

EXAMPLES OF LEARNING ANALYTICS IN PRACTICE

Chico State Case Study: RELS 180

• Redesigned to hybrid delivery through Academy eLearning

• Enrollment: 373 students (a 54% increase over the largest previous section)

• Highest LMS usage on the entire campus in Fall 2010 (>250k hits)

• Bimodal outcomes:

– 10% increase in SLO mastery

– 7% & 11% increase in DWF (54 F’s)

• Why? Can’t tell with aggregated reporting data

Slides: http://goo.gl/UgBYM

Correlation: LMS Use w/Final Grade

[Scatterplot: assessment activity hits vs. course grade]

Statistically significant correlations (strong to weak):

Variable                     | r    | % Variance | Sig.
Total hits                   | 0.48 | 23%        | 0.0000
Assessment activity hits     | 0.47 | 22%        | 0.0000
Content activity hits        | 0.41 | 17%        | 0.0000
Engagement activity hits     | 0.40 | 16%        | 0.0000
Administrative activity hits | 0.35 | 12%        | 0.0000

Mean variance explained across all significant variables: 18%
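As a concrete illustration (not part of the talk), here is a minimal Python sketch of how correlations like those in the table can be computed; the per-student hit counts and grades are invented, and % variance explained is simply r².

from scipy import stats

# Hypothetical data: one entry per student (hit counts and 4-point grades).
total_hits   = [412, 198, 875, 60, 530, 310, 720, 150]
final_grades = [3.0, 2.0, 3.7, 1.0, 3.3, 2.3, 4.0, 1.7]

# Pearson correlation between LMS activity and final grade.
r, p_value = stats.pearsonr(total_hits, final_grades)

# r^2 is the share of grade variance explained by this single variable.
print(f"r = {r:.2f}, % variance = {r**2:.0%}, p = {p_value:.4f}")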


Correlation: Student Characteristics w/Final Grade

[Scatterplot: high school GPA vs. course grade]


Most interesting finding (so far): LMS use variables explain far more of the variance in final grades than student characteristic variables do.

• LMS use variables: 18% (mean % variance explained by any single LMS variable)

• Student characteristic variables: 4% (mean % variance explained by any single student characteristic variable)
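This group comparison is just the r² values averaged within each variable family. A small sketch follows, using the LMS correlations from the table above; the student-characteristic correlations are invented for illustration, since the slides report only the 4% group mean.

lms_r = {
    "total_hits": 0.48,
    "assessment_hits": 0.47,
    "content_hits": 0.41,
    "engagement_hits": 0.40,
    "administrative_hits": 0.35,
}
# Hypothetical r values; the slides give only the group mean (4%).
student_r = {"hs_gpa": 0.22, "sat_score": 0.18}

def mean_variance_explained(correlations):
    """Average r^2 across a group of significant variables."""
    return sum(r ** 2 for r in correlations.values()) / len(correlations)

print(f"LMS use variables: {mean_variance_explained(lms_r):.0%}")                    # 18%
print(f"Student characteristic variables: {mean_variance_explained(student_r):.0%}") # ~4%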

SIGNALS

Purdue Signals Project: http://www.itap.purdue.edu/studio/signals/

[Word cloud of student evaluations of Course Signals (Arnold, 2010)]


Signals Course Outcomes

Comparing the same course with Signals vs. without Signals:

• -6.41% DWF

• +10.97% A/B

(Arnold, 2010)
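To make the idea behind these numbers concrete, here is a toy sketch in the spirit of Course Signals. The real Signals algorithm (Arnold, 2010) combines course performance, effort (LMS activity), prior academic history, and student characteristics; the fields and thresholds below are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Student:
    grade_pct: float        # current course grade, 0-100
    lms_hits: int           # LMS activity so far this term
    class_median_hits: int  # median LMS activity for the class

def signal(s: Student) -> str:
    """Return a traffic-light risk flag: 'green', 'yellow', or 'red'."""
    at_risk_grade = s.grade_pct < 70                     # invented threshold
    low_effort = s.lms_hits < 0.5 * s.class_median_hits  # invented threshold
    if at_risk_grade and low_effort:
        return "red"
    if at_risk_grade or low_effort:
        return "yellow"
    return "green"

print(signal(Student(grade_pct=62, lms_hits=40, class_median_hits=200)))  # red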

PANEL QUESTIONS

Q1: What strategic question(s) does your campus hope to answer with Learning Analytics?

Q2: What are some example(s) of Learning Analytics that you have conducted on your campus, no matter how small the sample size? What’s worked? What hasn’t? Why?

Q3: What should educational technologists do to make progress deploying Learning Analytics on their campuses?


For More Information

Learning Analytics Resources Googledoc: http://goo.gl/Fwur6

Society for Learning Analytics Research: http://goo.gl/bH9ts

Educause Learning Analytics Library: http://goo.gl/UDRMx

AUDIENCE Q & A


Contact Information

Kathy Fernandes (kfernandes@csuchico.edu)

James Frazee (jfrazee@mail.sdsu.edu)

Andrew Roderick (roderick@sfsu.edu)

Deone Zell (deone.zell@csun.edu)

John Whitmer (jwhitmer@calstate.edu)