
The Learning Analytics Landscape: Surveying Where We Are & Where We Are Going

NERLA 2013 – Second North East Regional Learning Analytics Symposium

Post questions/thoughts using the hashtag #NERLA13

Josh Baron Senior Academic Technology Officer

Marist College

POP QUIZ

36%

13%: Four-Year Graduation Rate for Savannah State University (2010)

Reference: Integrated Postsecondary Education Data System (IPEDS)

Presentation Outline

High-Level Overview of Learning Analytics (LA)

Open Academic Analytics Initiative (OAAI)

Peaks on the Learning Analytics Landscape

The Big Questions about Big Data

Group Discussion/Q&A


Analytics in Higher Ed from 50k Feet

Academic Analytics
• A process for providing higher education institutions with the data necessary to support operational and financial decision making*
• Focused on the business of the institution
• Management and executives are the primary audience

Learning Analytics
• The use of analytic techniques to help target instructional, curricular, and support resources to support the achievement of specific learning goals*
• Focused on the student and their learning behaviors
• Learners and instructors are the primary audience

* Analytics in Higher Education: Establishing a Common Language

Different Routes on the LA Landscape

• Target Audience – results from analytics are presented to…
  • Instructor – decides what, if any, action to take with the learner
  • Student – interprets the results and decides what action to take
• Analytics Approach – predictions are based on…
  • Learning Sciences – theories and understandings of student behavior
  • “Big Data” – analysis of large data sets of student behavior
• Scope – Institution / Program / Course

The Open Academic Analytics Initiative (OAAI)

• EDUCAUSE Next Generation Learning Challenges (NGLC) grant
• Funded by the Bill and Melinda Gates Foundation
• $250,000 over a 15-month period
• Goal: leverage “big data” concepts to create an open-source academic early alert system

OAAI Early Alert System Overview

• SIS data
  • Student aptitude data (SATs, current GPA, etc.)
  • Student demographic data (age, gender, etc.)
• LMS data
  • Sakai event log data
  • Sakai gradebook data
• Predictive model scoring
  • Model developed using historical data
  • Identifies students “at risk” of not completing the course
• Intervention deployed
  • “Awareness” message or the Online Academic Support Environment (OASE)
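As a minimal sketch of what the scoring step amounts to, assuming a scikit-learn-style workflow: train a classifier on historical SIS and LMS features, then score current students. The OAAI's actual pipeline is built on Sakai and Pentaho; every column and file name below is hypothetical.

```python
# Hypothetical sketch of an early-alert scoring step (the OAAI itself
# uses the Pentaho BI suite; this illustrates the idea, not its code).
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Historical records: SIS features (aptitude, demographics) plus LMS
# features (event counts, partial grades); column names are invented.
history = pd.read_csv("historical_courses.csv")
features = ["sat_score", "cum_gpa", "age", "lms_logins",
            "content_views", "forum_posts", "partial_grade"]

# Label: 1 = did not complete the course successfully ("at risk").
model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["at_risk"])

# Score the current semester and flag students above a risk threshold.
current = pd.read_csv("current_semester.csv")
current["risk"] = model.predict_proba(current[features])[:, 1]
alerts = current[current["risk"] > 0.5]  # candidates for intervention
print(alerts[["student_id", "risk"]].sort_values("risk", ascending=False))
```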

“Creating an Open Ecosystem for Learning Analytics”

ONLINE ACADEMIC SUPPORT ENVIRONMENT (OASE)

• OER content (Flat World Knowledge)
• Self-assessments
• Learning skills resources
• Learning support – facilitation & mentoring

OAAI GOALS AND MILESTONES

• Build an “open ecosystem” for learning analytics
  • Sakai Collaboration and Learning Environment
  • Secure data-capture process for extracting LMS data
  • Pentaho Business Intelligence Suite – open-source data mining, integration, analysis, and reporting tools
• Release the OAAI predictive model under an open license
  • Predictive Model Markup Language (PMML) – see the sketch after this list
• Research learning analytics scaling factors
  • How “portable” are predictive models?
  • What intervention strategies are most effective?
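PMML is an XML standard for serialized predictive models, which is what makes an openly licensed model practical to share: any PMML-aware scoring engine can run it, independent of the tool that trained it. As a hypothetical sketch (using the third-party sklearn2pmml package, not part of the OAAI toolchain described here):

```python
# Hypothetical sketch: exporting a fitted model to PMML with the
# third-party sklearn2pmml package (pip install sklearn2pmml; needs Java).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

# Toy stand-in for historical student records; columns are invented.
X = pd.DataFrame({"cum_gpa": [3.2, 2.1, 3.8, 1.9, 2.7],
                  "lms_logins": [40, 12, 55, 8, 25]})
y = pd.Series([0, 1, 0, 1, 0], name="at_risk")

# PMMLPipeline wraps ordinary scikit-learn steps so the fitted model
# can be serialized to the open PMML standard.
pipeline = PMMLPipeline([("classifier", LogisticRegression())])
pipeline.fit(X, y)

# The output is plain XML that any PMML-aware scoring engine can run,
# independent of Python or scikit-learn.
sklearn2pmml(pipeline, "at_risk_model.pmml")
```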

How important is “openness” to the Learning Analytics field? Use the hashtag #NERLA13

OAAI RESEARCH METHODOLOGY

• Compared the Marist model to Purdue University’s
  • The two institutions share some common academic characteristics
• Conducted real-world pilots
  • Pilots ran in Spring and Fall 2012
  • Introductory-level face-to-face (F2F) courses
  • ~35 courses at community colleges
  • ~30 courses at HBCUs
• Conducted student surveys and focus groups

INITIAL FINDINGS – PORTABILITY

• Marist found predictive elements and correlations similar to Purdue’s
• Student aptitude is the strongest predictor
  • Cumulative GPA
• LMS elements are secondary predictors

INITIAL FINDINGS – PORTABILITY (cont.)

Table 3: Prediction Analysis from Spring Pilots

College    % of Semester Completed   # of Students   Accuracy   Recall    Specificity   Precision
Savannah            25%                   504         66.96%     70.76%      64.64%       55.00%
Savannah            50%                   504         71.52%     78.22%      67.56%       59.41%
Savannah            75%                   504         77.75%     72.53%      80.94%       69.84%
Cerritos            25%                   502         59.13%     69.23%      56.31%       30.73%
Cerritos            50%                   601         70.92%     66.14%      72.51%       44.44%
Cerritos            75%                   649         74.77%     74.42%      74.88%       47.76%
Redwoods            25%                   195         70.50%     86.27%      61.36%       56.41%
Redwoods            50%                   195         79.86%     72.55%      84.09%       72.55%
Redwoods            75%                   195         79.14%     72.55%      82.95%       71.15%

Model accuracy remained in the 60-80% range when the Marist-derived model was deployed at community colleges and HBCUs.
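For reference, the four metrics in Table 3 all derive from the confusion matrix of predicted versus actual at-risk students. A worked example with made-up counts (not OAAI data):

```python
# The four metrics in Table 3, computed from a confusion matrix.
# Convention here: "positive" = predicted/actual at risk. Counts are
# invented for illustration; they are not OAAI data.
tp, fn = 70, 30   # at-risk students caught / missed by the model
fp, tn = 45, 155  # safe students falsely flagged / correctly cleared

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # overall correctness
recall      = tp / (tp + fn)   # share of at-risk students caught
specificity = tn / (tn + fp)   # share of safe students left alone
precision   = tp / (tp + fp)   # share of alerts that were warranted

print(f"accuracy={accuracy:.2%} recall={recall:.2%} "
      f"specificity={specificity:.2%} precision={precision:.2%}")
```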

INITIAL FINDINGS – INTERVENTIONS

• Found statistically significant differences between treatment and control groups
  • ~7% increase in course grades
  • More students “mastered content” (earned a C or better)
  • Similar trend among low-income students
• Treatment-group students were also more likely to withdraw
  • Similar to Purdue’s findings, and not necessarily negative

INSTRUCTOR FEEDBACK

"Not only did this project directly assist my students by guiding students to resources to help them succeed, but as an instructor, it changed my pedagogy; I became more vigilant about reaching out to individual students and providing them with outlets to master necessary skills. P.S. I have to say that this semester, I received the highest volume of unsolicited positive feedback from students, who reported that they felt I provided them exceptional individual attention!

FUTURE RESEARCH INTERESTS

• Factors that impact intervention effectiveness
  • Intervention immunity – students who do not respond to the first intervention tend never to respond
  • Student engagement – how can we increase engagement between students and help resources?
• Can predictive models be customized for specific delivery methods and programs/subjects?
• Can learning analytics identify “at risk” students who would otherwise not be identified?

What do you think should be the major focus of future LA research funding? Use the hashtag #NERLA13

TOUR OF PEAKS ON THE LA LANDSCAPE

• Purdue University’s Course Signals – a college-wide learning analytics approach
• University of Michigan’s E2Coach – a course-specific learning analytics approach
• UMBC’s “Check My Activity” tool – a student-centered learning analytics approach
• SNAPP (Social Networks Adapting Pedagogical Practice) – leveraging “big data” visualization

PURDUE UNIVERSITY’S COURSE SIGNALS

• Built a predictive model using data from…
  • LMS – events (logins, content views, discussions) and gradebook
  • SIS – aptitude (SAT/ACT, GPA) and demographic data
• Leverages the model to create an early-alert system
  • Identifies students at risk of not completing the course
  • Deploys interventions to increase their chances of success
• The system automates the intervention process
  • Students get a “traffic light” alert in the LMS (sketched below)
  • Messages posted to students suggest corrective actions (e.g., practice tests)
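The “traffic light” is, in essence, a thresholding of the model’s predicted risk. A hypothetical sketch; the cutoffs below are invented, not Purdue’s actual values:

```python
# Hypothetical mapping of a risk score to a Course Signals-style
# traffic light; thresholds are invented, not Purdue's actual cutoffs.
def traffic_light(risk: float) -> str:
    """Map a predicted risk of non-completion (0.0-1.0) to an alert color."""
    if risk < 0.3:
        return "green"   # on track; no intervention needed
    if risk < 0.6:
        return "yellow"  # potential problem; suggest resources
    return "red"         # high risk; prompt direct outreach

for student, risk in [("A", 0.12), ("B", 0.45), ("C", 0.81)]:
    print(student, traffic_light(risk))
```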

PURDUE UNIVERSITY’S COURSE SIGNALS (cont.)

• Impact on course grades and retention – students in courses using Course Signals…
  • scored up to 26% more A and B grades
  • earned up to 12% fewer C’s, and up to 17% fewer D’s and F’s
• An Ellucian product that integrates with Blackboard

UNIVERSITY OF MICHIGAN’S E2COACH

• Focused specifically on introductory physics courses

“…to say to each what we would say if we could sit down with them for a personal chat.”

EXAMPLE MTS MESSAGE
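E2Coach composes its messages with MTS, the Michigan Tailoring System, which assembles personalized text from rules over each student’s data. The original slide showed a screenshot of such a message; as a loose, hypothetical imitation of the idea (rules and wording invented):

```python
# A loose, hypothetical imitation of rule-based message tailoring in
# the spirit of E2Coach/MTS; the rules and wording are invented.
def tailor_message(name: str, predicted_grade: float, goal_grade: float) -> str:
    parts = [f"Hi {name},"]
    if predicted_grade >= goal_grade:
        parts.append("you're currently on track for your goal grade; "
                     "keep up your exam-prep routine.")
    elif goal_grade - predicted_grade < 0.7:
        parts.append("you're close to your goal; students in past terms "
                     "closed this gap by working more practice exams.")
    else:
        parts.append("there's a sizable gap between your goal and your "
                     "current trajectory; let's talk about a study plan.")
    return " ".join(parts)

print(tailor_message("Pat", predicted_grade=2.7, goal_grade=3.5))
```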

UMBC’S CHECK MY ACTIVITY TOOL

• UMBC found that students earning D’s and F’s used Blackboard 39% less than higher-achieving students
  • Not suggesting cause and effect; the goal is to model higher achievers’ behavior
• Provides data directly to the student
  • Compares a student’s LMS use to class averages (see the sketch below)
  • Can also compare average usage data to grade outcomes
• Feedback has been positive
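The heart of a Check My Activity-style report is a comparison of one student’s LMS activity against the class distribution. A minimal sketch with invented data (not UMBC’s tool):

```python
# Minimal sketch of a Check My Activity-style comparison: one student's
# LMS activity versus the class average. Data are invented.
import statistics

logins_by_student = {"s1": 42, "s2": 7, "s3": 55, "s4": 23, "s5": 31}

def activity_report(student_id: str) -> str:
    mine = logins_by_student[student_id]
    avg = statistics.mean(logins_by_student.values())
    pct = (mine - avg) / avg * 100
    direction = "above" if pct >= 0 else "below"
    return (f"{student_id}: {mine} logins, "
            f"{abs(pct):.0f}% {direction} the class average of {avg:.1f}")

print(activity_report("s2"))  # "s2: 7 logins, 78% below the class average of 31.6"
```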

SOCIAL NETWORK ANALYSIS & VISUALIZATION

• SNAPP (Social Networks Adapting Pedagogical Practice) – UBC/Wollongong
• Visualizes the networks of interaction that result from discussion forum posts and replies
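SNAPP’s underlying structure is a directed graph with an edge from each replier to the author they replied to; isolated or one-sided participation patterns then stand out visually. A small sketch of the same idea using the networkx library (not SNAPP’s own code), with invented forum data:

```python
# Small sketch of SNAPP-style forum analysis using networkx (an
# illustration, not SNAPP's own code). Forum data are invented.
import networkx as nx

# (replier, original_poster) pairs harvested from a discussion forum.
replies = [("ann", "bob"), ("cho", "bob"), ("bob", "ann"),
           ("dia", "ann"), ("cho", "ann"), ("eve", "cho")]

G = nx.DiGraph()
G.add_edges_from(replies)

# In-degree centrality = replies received; peripheral students stand
# out immediately, which is the pedagogical point of the visualization.
for student, centrality in sorted(nx.in_degree_centrality(G).items(),
                                  key=lambda kv: -kv[1]):
    print(f"{student}: {centrality:.2f}")
```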

THE BIG QUESTIONS…

LEARNING ANALYTICS ETHICS

• “The obligation of knowing” – John Campbell
  • If we have the data and tools to improve student success, are we obligated to use them?
  • Consider this: if a student has a 13% chance of passing a course, should they be dropped? What if it is 3%?
• Who owns the data – the student? The institution?
  • Should students be allowed to “opt out”?
  • Consider this: is it fair to the other students if, by some opting out, the predictive model’s power drops?
• What do we reveal to students? To instructors?
  • Consider this: if we tell a student in week three that they have a 9% chance of passing, what will they do?
  • Will instructors begin to “profile” students?

What are some of the big ethical questions surrounding LA? Use the hashtag #NERLA13

Big Questions about Big Data

• Could learning analytics end up driving the wrong learning outcomes?
• What do you see as the biggest ethical issues surrounding learning analytics?
• How important is “openness” in LA? Open standards? Open licensing?
• If you were Bill and Melinda Gates, what would your funding priorities be around LA?
• Where is LA in the “hype cycle”? Where will it end up?

• Learning Analytics and Knowledge (LAK) Conference
• The international journal of the Society for Learning Analytics Research (SoLAR)
• SoLAR Flares – regional conferences
• SoLAR Storms – distributed research lab
• MOOCs on learning analytics
• http://www.solaresearch.org/


More Questions?

Josh Baron Senior Academic Technology Officer

Marist College Josh.Baron@Marist.edu

@JoshBaron

OASE DESIGN FRAMEWORK

• Guiding design principles that allow for localization
• Will be released under a CC license
• Follows online course design concepts
  • Learner–Content Interactions
  • Learner–Facilitator Interactions
  • Learner–Mentor Interactions
• Leverages Open Educational Resources (OER)

DESIGN FRAME #1: LEARNER–CONTENT INTERACTIONS

• Self-assessment instruments
  • Assist students in identifying areas of weakness related to subject matter and learning skills
• OER content for remediation
  • Focus on core subjects (math, writing)
  • Organized to prevent information overload (e.g., “top rated math resources”)
• OER content for improving learning skills
  • Focus on skills and strategies for learner success
  • Time management, test-taking strategies, etc.

DESIGN FRAME #2: LEARNER–FACILITATOR INTERACTIONS

• An Academic Learning Specialist would:
  • Connect learners to people and services
  • Promote services and special events
  • Moderate discussions on pertinent topics (example: “Your first semester at college”)
  • Arrange guest motivational speakers
    • Occasional webinars with upperclassmen, alumni, etc.
    • Allows learners to hear from those who “made it”

DESIGN FRAME #3: LEARNER–MENTOR INTERACTIONS

• Online interactions facilitated by a student “mentor”
  • Facilitates weekly “student perspective” discussions (example: “Your first semester of college – the real story”)
• An online “student lounge” for informal interactions
  • Lets others know about study groups, etc.
  • Helps build a sense of community
• Blogs for students to reflect on their experiences
  • Could be public, private, or private to a group