TEL009 - Student Information from VLEs

Project Report August 2014

Wilma Alexander, Learning Services - Project Manager
Myles Blaney, TELS - Project Team
Stephannie Hay, TELS - Project Team

Contents

1. Summary
   Key Interim Findings
   Futures
2. Literature review
   Institutional and cultural context
3. Existing information
   Existing Tools for Learn
   Built-in Learn Tools
   Case Studies
   Moodle: Existing tools
   Activity in Moodle community
   Other tools and developments of interest
   Google Analytics
   Internal Initiatives
4. Information gathering and consultation events
   MoodleMoot 2014
   eLearning@ed seminar
   Moodle Consultant visit
   Project events
   Student Information gathering
   User Stories discussion
   Stories about marks and grades
   Stories about activities
5. Next steps / Next projects
   Project work
   Other related activities
Appendix A: Categorised user stories
Appendix B: Project Contributors


1. Summary

This project aimed to examine options for exposing data from the central Virtual Learning Environments (VLEs), Learn and Moodle, to students, in order to make better use of existing data sources and provide students and staff with meaningful information on progress and interaction with content, in relation to their peers and to appropriate external measures. See the Projects website https://www.projects.ed.ac.uk/project/tel009 for detailed information about the project planning, timelines, milestones and meeting notes.

The project aimed to draw on existing Learning Analytics (LA) work directed at supporting student learning, on existing knowledge of the VLE databases, and on staff and student aspirations for utilising VLE activity data. As a result we expect to develop:

a) non-functional prototypes or models for building blocks or equivalent for Learn and Moodle VLEs which will automatically update and present comparative data on activities and evaluations;

b) suggestions for presentation format(s) which allow students and staff to understand and discuss the relationships between individual and cohort performance;

c) a mapping of priorities and road map for future delivery of student activity information (i.e. for a Phase 2 Project);

d) an analysis of effective presentation formats which will inform future plans for learning (and learner) analytics work;

e) recommendations as to how the data presentation may support Personal Tutor discussions.

Activity to date

Since its inception in January 2014, the project has carried out the following activities, explored in more detail in the report below.

Review of relevant literature

Review of existing / comparable initiatives

Promotion of the project and call for expressions of interest

Review of existing / potential tools for the two VLEs

Requirements gathering from staff and from students, allied with discussions of how this project complements other University activity.

Key Interim Findings

Through this work we have established:

Little comparable activity exists in other UK HEIs or elsewhere in English-speaking areas.

Existing tools and add-ons for the two VLEs do not provide adequate student-facing functionality.

Staff and student interest in such analytic tools is high.

Where staff are already presenting activity data to students, this is produced predominantly by manual analysis.

Some existing analytics activity and tools may be harnessed as starting points or models for a continuation of this project.

There are issues around the management, interpretation, control and contextualisation of analytics data which reach beyond the technical and presentation developments which are the focus of this and the follow-on project.


To address these we will consult widely, follow models being developed for HE in particular, and ensure appropriate project governance for the follow-on work.

Priority (from both staff and students) for type of data available lies with straightforward averages and aggregations of activity data, at course and programme level and by individual student, rather than “big data” overviews of departments and schools.

Key overviews appear to be individual student against cohort (for course aggregate data and individual items), and patterns of individual student activity over time.

Development options for the two VLEs will probably diverge in 2014/15, because of the different availability of tools and the different user group profiles.

Priority (from both groups) lies with presenting information in a simple way at a single point of contact (i.e. one overview page).

There is also a strong interest in being able to customise the views of data, to ensure relevant items only are shown for each course and to be able to drill down into detail.

There is a research interest in much broader questions of learner analytics which lies outside the scope of this project, but great potential for synergy exists between these strands within the University.
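The "individual student against cohort" overview identified in the findings above amounts to simple averages and aggregations of activity data. A minimal sketch with hypothetical aggregated activity counts (the data structure and values are illustrative assumptions, not the project's actual schema):

```python
from statistics import mean

# Hypothetical weekly activity counts per student for one course
# (e.g. page views or forum posts extracted from VLE logs).
activity = {
    "s001": [12, 8, 15, 4],
    "s002": [3, 0, 6, 2],
    "s003": [9, 11, 7, 10],
}

def cohort_average_per_week(data):
    """Average activity across the cohort for each week."""
    weeks = zip(*data.values())
    return [round(mean(week), 1) for week in weeks]

def student_vs_cohort(data, student_id):
    """Pair a student's weekly counts with the cohort average."""
    cohort = cohort_average_per_week(data)
    return list(zip(data[student_id], cohort))

print(student_vs_cohort(activity, "s002"))
```

Paired values like these are what a single overview page could chart for each student: their own pattern of activity over time against the course aggregate.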

Futures

This project, and especially this final report, will inform the shape of a number of work areas in 2014/15, including but not exclusively some application development work for the VLEs. The focus has been on outputs and user-facing options, leaving detailed specification of technical developments to the second phase of the project in 2014/15. This allows us to ensure that, if solutions for one VLE differ from those for the other, as seems likely, development is not delayed by the need to seek common technical ground. As far as possible we will seek parity of user experience and functionality across the VLEs over the course of the Phase 2 activities. Phase 2 activities will include:

Project exploring student analytics views from Moodle courses - piloting existing and new plugins, user (staff and student) evaluations from the pilot, further development of plugins where possible.

Project exploring student analytics for Learn courses - prototype, evaluate and build building blocks or other appropriate data views. The project methodology will be iterative, allowing us to build and test analytics tools in a short and focused development cycle.

Continuing student and staff consultation on project developments, priorities and concerns, including transparency, data handling, and privacy.

Support for and engagement with wider University community on all aspects of learning analytics research and development.

Planning for the Phase 2 projects and development of key contacts has already begun. The user stories collected as part of this project and summarised in this report will form the basis of discussions and modelling for the follow-on work. All of the stories collected are presented in general categories in Appendix A.

Thanks are due to the many staff and students across the University who have participated in events, shared their views and expertise, and helped direct our enquiries. Much of the consultation (especially with students) was anonymous, but where possible staff and student contributors and stakeholders are listed in Appendix B.


2. Literature review

Learning Analytics has very quickly become an area of interest for post-compulsory education over the last few years, and there has been rapid growth in the number and range of publications addressing aspects of the topic. Many of these draw on the more mature models of analytics from the commercial field, where the advantages of collecting and exploiting "big data" can be directly related to specific positive outcomes such as increased sales, customer retention and satisfaction. While these are of considerable general interest, they are not strictly relevant to this project. Of more interest are the publications aimed directly at, and coming from, Higher Education, and these broadly fall into two categories:

a) developing theoretical models and technical standards for managing institution-level data, including but not exclusively student activity, to inform strategic development and planning. This is of interest but implies a much broader and more significant scale of organisational and technical development than this project.

b) descriptions of research-led activities and pilots looking specifically at the relationship between activity analysis and student performance measures. These are of interest mainly for discovering what sort of information is being used and how it is presented: not that the research is not intrinsically interesting, but the evidence base for the effectiveness of analytics-led interventions is not the core concern of this project. These publications have also been a significant source of information about which institutions and tools are of interest, as discussed in Section 3 below.

Key publications which have informed the development of our thinking in this project include the CETIS series of briefing papers on many aspects of Learning Analytics1, the proceedings of the 2013 Learning Analytics Summer Institute2, and the first issue of the Society for Learning Analytics Research journal SOLAR3. A number of project websites have also provided information and case studies of interest, including the University of Roehampton fulCRM project4, the Open University RISE project5 and Learning Analytics Strategy development work6.

These publications have helped set the context for the work of the project, and have also helped ensure that the project boundaries are set appropriately. Research on the key indicators of student success is at an early stage, and so far it is not producing novel or startling insights: in general the studies to date confirm that early and consistent engagement with coursework and readings is a strong indicator of success, but the use of analytics measures as a proxy for predicting student progress and success remains contentious. Much of the work being done in the UK, USA and elsewhere addresses a deficit model of student learning, attempting to identify students "at risk" of failure or dropping out. The intention of this student information project is not to manage retention problems, but to provide students and staff with access to information about their online activity which may be used to inform and support individual and course planning.

The literature research also led us to believe that no other UK institution is currently approaching analytics development in this way: there are some very small-scale pilots, and some institution-level systems being developed to include data from sources other than the VLEs, but as far as we are aware none is attempting to provide analytics data from the VLEs to students on such a wide scale across the institution. This is also indicated by the work done on existing tools and initiatives, as outlined in Section 3 below.

Institutional and cultural context

Some of the literature and case studies focused on non-technical aspects of analytics development, and these have helped inform some general principles for this project: requirements which we consider mandatory. This project focuses on micro-level analytics: that is, information about individual courses and students, which is what is readily available within the VLEs. It is of course important to bear in mind possible future developments on a wider scale, and to ensure that this project contributes appropriately to longer-term developments. We have therefore embedded some basic principles into the work, informed by emerging discussions on data handling, transparency and ethics, and by guidelines published by other organisations such as the Open University. A brief statement of these baseline principles would be:

Students should be informed about what data is gathered and held.

Identifiable information about individual student activity and achievement should only be available to authorised staff and the individual student concerned.

Wherever possible, information should be made available in an appropriate format and with appropriate contextual information and support from staff.

The follow-on projects will include a strand which will contribute to any wider institutional policy developments relating to use of learning analytics data.

Of particular help in this regard have been a number of discussion articles on the interpretation and visualisation of data and their relationship with an ethical and pedagogic framework for application of learning analytics. All acknowledge that data is not neutral, and there are potential hazards in allowing the development of data-driven models of student engagement and achievement. An effective summary of the importance of such a framework has been developed by the Open University as part of its developing ethical framework for Learning Analytics, based on these principles:

Learning Analytics is a moral practice which should align with core organisational principles

The purpose and the boundaries regarding the use of learning analytics should be well defined and visible

Students should be engaged as active agents in the implementation of learning analytics

The OU should aim to be transparent regarding data collection and provide students with the opportunity to update their own data and consent agreements at regular intervals.

Modelling and interventions based on analysis of data should be free from bias and aligned with appropriate theoretical and pedagogical frameworks wherever possible.

Students are not wholly defined by their visible data or our interpretation of that data

Adoption of learning analytics within the OU requires broad acceptance of its values and benefits (organisational culture) and the development of appropriate skills.


The OU has responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible.

Source: 'Learning Analytics for large scale distance and online courses', webinar, 25 June 2014. Presenters: Belinda Tynan, Pro Vice-Chancellor (Learning and Teaching), and Kevin Mayles, Open University, UK.

3. Existing information

Information about external activity associated with both VLEs, Learn and Moodle, was gathered via literature and web searches, and through requests for information sent to the user community. Existing tools and functionality for the VLEs, whether built-in or from third-party suppliers, were also evaluated. Because different communities and tools are available for each VLE, the reporting below is split accordingly. Evaluation of these tools included considerations of cost, best fit for our project aims, and the estimated overhead of modifying tools to meet our needs.

Existing Tools for Learn

Current availability of third-party tools is limited, and those tools that are available do not meet the project's requirements: they are either not student-facing, not sufficiently personalisable, or require an additional licence and costs.

Eesysoft (developed by EasyAnalytics) is a performance analysis tool currently used by Cardiff and East Anglia universities. It monitors user activity within the Blackboard environment and produces tool usage reports which show technology adoption. The reports can be configured, but are only available to staff.

Analytics for Learn (developed by Blackboard) is a packaged analytics suite that uses a data warehouse to create reports which can be exposed to staff and students. No UK institution currently uses this product, licensing is expensive and would be in addition to our current costs, and there is no clarity that it would meet our project requirements.

BIRT (Business Intelligence and Reporting Tools) is an open source reporting system that integrates with web applications to produce reports. It needs to be configured to produce reports, and report data cannot be exposed to students.

Other institutions have developed custom-built tools to gather and expose information which meets their specific needs - see the case studies below.

Built-in Learn Tools

Learn does include some functionality for gathering or exposing data; however, this is limited and mostly not visible to students.

Course Reports

Course reports can be used to view information on course usage and activity. They vary from summary reports to more in-depth user analysis. The reports that can be run in Learn are:

All User Activity Inside Content Areas
Course Activity Overview
Course Coverage Report
Course Performance
Student Overview for single course
Overall summary for single course
Overall summary of user activity
User activity in forums/groups

Course reports provide a lot of useful information (e.g. logons, average time in course, activity breakdown, discussion board interaction per user); however, this information has to be generated by configuring and running a report per user and per course each time it is required, and cannot readily be made available to students. The reporting function also has consistency issues: staff have been warned that course reports are unreliable, following numerous complaints regarding misinformation. This issue has been raised with Blackboard, who advised that the release of SP16 is scheduled to address it. At the time of publication of this report SP16 had just been installed, and full report testing was not possible.

Grade Centre Mode/Median Column

All assignment columns within the Grade Centre can be set to expose the mode/median score of the assignment to users. These features can be turned on or off by the instructor. This is one of the few facilities which provides data for students, but user feedback suggests that it is not widely used, perhaps because not enough staff are aware of the option.

Retention Centre

The Retention Centre allows instructors to use preconfigured rules, or to create their own rules, to identify "at risk" students due to poor grades, lack of activity etc. Staff can check the Retention Centre and easily identify, by colour coding, students who may require intervention. Specific tasks can be tracked, and triggers for action flags configured, by the course instructor.
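The Retention Centre's rule-based flagging can be illustrated with a minimal sketch. The rule names, thresholds and record fields below are illustrative assumptions for this report, not Blackboard's actual configuration or implementation:

```python
from datetime import date

# Illustrative per-student course records (hypothetical fields).
students = [
    {"id": "s001", "avg_grade": 72, "weekly_clicks": 40, "last_access": date(2014, 8, 18)},
    {"id": "s002", "avg_grade": 38, "weekly_clicks": 2,  "last_access": date(2014, 7, 1)},
]

TODAY = date(2014, 8, 20)

# Each rule mirrors one Retention Centre rule type:
# course activity, grade, and course access.
RULES = [
    ("low activity", lambda s: s["weekly_clicks"] < 5),
    ("low grade", lambda s: s["avg_grade"] < 40),
    ("no recent access", lambda s: (TODAY - s["last_access"]).days > 14),
]

def flag_at_risk(records):
    """Return, per student, the names of any rules they trigger."""
    return {s["id"]: [name for name, rule in RULES if rule(s)] for s in records}

print(flag_at_risk(students))
```

A colour-coded staff view would then highlight any student whose list of triggered rules is non-empty.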


Course Activity: checks student activity within a course, based on click access. If a user is below a set level, an alert is triggered.

Grade: can be set to a defined grade or an average grade, and identifies any users who do not meet the specified criteria.

Course Access: alerts if a user does not access a course within a specified time frame.

Missed Deadlines: can be set on a test/survey/assignment to monitor either all or specified deadlines.

Staff can also select individual students to be monitored. All of this functionality is staff-facing and not available to the student.

Performance Dashboard

The Performance Dashboard provides instructors with a real-time overview of student activity within a course. It can be accessed via Control Panel > Evaluation > Performance Dashboard and displays information in a table. Visible columns can be selected.

The following columns can be set up:

Last Course Access (date and time)
Days since last course access
Discussion board posts (number of posts)
Review Status (allows the instructor to track Mark Reviewed activity)
Adaptive Release (allows the instructor to view what course content the student can see)
Early Warning System (displays the number of warnings / total number of warnings that result in a user warning; links to the Retention Centre)
View Grades
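Several of the dashboard columns above are simple derivations from raw access and posting events. A sketch with hypothetical log data (the event names and structure are assumptions for illustration, not Blackboard's schema):

```python
from datetime import datetime

# Hypothetical per-student event log: (student id, event type, timestamp).
events = [
    ("s001", "course_access", datetime(2014, 8, 1, 9, 30)),
    ("s001", "forum_post",    datetime(2014, 8, 1, 9, 45)),
    ("s001", "course_access", datetime(2014, 8, 15, 14, 0)),
    ("s002", "course_access", datetime(2014, 7, 20, 11, 0)),
]

NOW = datetime(2014, 8, 20, 12, 0)

def dashboard_row(log, student_id):
    """Derive last access, days since access, and post count for one student."""
    accesses = [t for sid, kind, t in log if sid == student_id and kind == "course_access"]
    posts = [t for sid, kind, t in log if sid == student_id and kind == "forum_post"]
    last = max(accesses)
    return {
        "last_access": last,
        "days_since_access": (NOW - last).days,
        "forum_posts": len(posts),
    }

print(dashboard_row(events, "s001"))
```

The same derivations could in principle feed a student-facing view, which is the gap this project identified in the built-in tools.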


Case Studies

UK-based institutions have been referenced in numerous white papers on learning analytics for HE, but we have found that while general interest is high and there are a number of interesting pilot projects, few institutions are considering or implementing VLE-based work institution-wide. For example, enquiries to the Scotland-wide Blackboard Learn user group found only one institution which had expressed an interest and was starting to investigate learning analytics, and no institution that was using or had built a tool. Three of the main UK-based case studies referenced in numerous reports are outlined below.

Institution: University of Roehampton
Project Name: fullCRM - student performance tool to support student progression and retention

JISC synopsis: "CRM project used a variety of activity data to support student progression and retention, especially addressing causes of non-completion. An 'Early Warning System' approach to collate key activity data and flag up students who were at risk of failing to progress was trialled in the Department of Psychology and is now being implemented more widely. The indicators ranged from poor lecture attendance to receiving a fail grade or a warning for plagiarism" (Kay, D. & van Harmelen, M. (2012) Analytics for the Whole Institution: Balancing Strategy and Tactics).

Roehampton wanted to tackle the issue of retention specifically within a school with a high percentage of student drop-out. The school had initially used a paper-based system that was problematic to administer, so it created a retention-focused analytical tool that collated data and presented an Early Warning System to staff only. The project initially aimed to use data from the VLE (Moodle); however, technical problems meant that VLE data was not included. The pilot was based within a single school, which experienced a 14% increase in first-year students progressing into the next year. The success of the pilot has led to broader use of the tool throughout the institution.

Institution: University of Huddersfield
Project Name: Library Impact Data Project (LIDP) - analysis of the correlation between library usage and student attainment

JISC synopsis: "The project used the student's final grade, course title and variables relating to library usage, including books borrowed, library e-resources access and entry to the library. At the end of the project the hypothesis was successfully supported for student attainment in relation to use of e-resources and books borrowed for all eight institutional partners" (Kay, D. & van Harmelen, M. (2012) Analytics for the Whole Institution: Balancing Strategy and Tactics).

The LIDP drew on various variables and resources to gather and collate data; however, its primary data was library footfall and library resource usage, used to reflect student achievement via library analytics. No VLE data was used, and students were not exposed to the information.

Institution: University of Huddersfield
Project Name: EBEAM (Evaluating the Benefits of Electronic Assessment Management)

JISC synopsis: "The EBEAM project will evaluate the impact of e-assessment and feedback on student satisfaction, retention, progression and attainment as well as on institutional efficiency. It will share recommendations for achieving high quality assessment processes, staff development, student support, sustainability, scalability and curriculum development" (EBEAM: Evaluating the Benefits of Electronic Assessment Management, http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessmentandfeedback/ebeam.aspx).

EBEAM is using rubrics and quick comments in Turnitin to collate feedback on assignment submissions and to relate patterns in assignments to overall grade (e.g. students with a weak introduction might average 40%, while those with a strong introduction average 80%). This data is presented to students in assignment workshops.

In summary, many institutions are looking at data generated within their VLEs, but analysis varies according to environment, data, expertise and tools. The TEL009 project remit is specific to data already within the VLEs and available to students as well as staff, and we have been unable to find any other institution which has developed a similar tool. The VLE environment is also a key factor, as tools available for use with Learn are limited. Moodle, however, has a broad and open developer base which has already started to produce plugins.

Moodle: Existing tools
Moodle currently offers more options providing some degree of analytics than Learn does. Due to its open source nature we are generally free to install these on our DEV environment for further investigation. We identified nine options to examine in closer detail, alongside the Learning Technology Services (LTS) analytics, which is discussed later.

Intelliboard.net
This was the only paid-for product we looked at. Intelliboard.net is a separate web tool which displays reports driven by your Moodle data in a dashboard format. We did not examine this option in any great depth, as there are cost implications and similar open source options are available. However, it does provide some good examples of graphical presentation of data, which we have noted. An example of this presentation is shown below.


(Source: Moodle. Image by Intelliboard)

Analytics and Recommendations
This plugin could not be made to work with the University’s version of Moodle and simply threw errors during configuration. As the plugin does not appear to be actively maintained, we decided not to pursue it further.

GradeTracker and Personal Learning Plan (PLP)
These two plugins were developed by Bedford College. The GradeTracker plugin allows staff to mark assignments against a predefined set of criteria related to the course and/or qualification the student is working towards. It is designed to let a member of staff easily see the data for a single student over many courses, and provides class, unit and Gradebook/Activity views. It also uses the completed units to predict a final grade for the student, and has a straightforward interface. Crucially for this project, it has views for students as well as staff members, allowing a student to track their own progress. However, these views are predefined as part of the GradeTracker plugin, and changing them would require changes to the code. GradeTracker was initially created to give oversight of BTEC and City & Guilds qualifications, although it does have the option to create ‘bespoke’ qualifications, which is how the University’s would be termed. GradeTracker does not fully meet the needs of the project: although it seems a useful tool for monitoring progress, it would be time-consuming and complex to set up.

The Personal Learning Plan plugin is designed to give a high-level overview of user activity and is available to students and teachers. It covers a wide range of information, including attendance, tutorial target setting and monitoring, and timetable display, and it integrates with GradeTracker.
Although this does not quite do what we require on this project, it does expose data to students, and with the Reporting and Alerts plugin that Bedford are planning to release to the wider community, teachers can set up automated email rules triggered by certain indicators, e.g. send X email when attendance falls below XX%. Neither PLP nor GradeTracker meets the requirements of this project, and both have many additional elements which are not necessary. So, while we will not be installing these plugins, the email exchanges with Roy Currie at Bedford College have been very positive and helpful, especially in following how the


developments at Bedford have been received. The development team at Bedford ensured extensive consultation with staff and students, and the plugins have been positively received by both groups. They have also seen changes in student behaviour, with some students comparing their GradeTracker grids and trying to outdo their peers. For staff, the plugins appear to have solved a real administrative issue by replacing the previous ad hoc spreadsheet-based tracking. Uptake has been greater than initially expected, with many teams seeing the benefits. These projects seem to have parallels with our Learning Analytics project, so hearing how the roll-out was managed, and reports of a positive reception, are useful.

Overview Statistics
This plugin delivers four predefined reports. It relates more to management information than learning analytics, although it does present some good ideas for presentation styles. The plugin shows four charts and looks at information across the Moodle site rather than drilling into course and user interaction in the manner we would like.

Adhoc Database Queries and Configurable Reports
These plugins allow reports to be written and run within Moodle. The Configurable Reports plugin allows users with the correct permissions to create a number of reports without using Structured Query Language (SQL). The report creator can then decide where the report is displayed in Moodle and which users can view it. It also allows users to create charts and generate exports. The Adhoc Database Queries plugin makes use of SQL, meaning that reports would most likely have to be written for staff. It only displays information in tables, although a couple of useful features allow reports to be scheduled and emailed to users. Both of these plugins would require work to ensure any reports were suitably anonymised and that permissions were set at the correct levels.
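As an illustration of the kind of anonymised, aggregate report such query plugins might run, here is a minimal sketch in Python with SQLite. The `log` table layout, course codes and data are invented for the example and are not the real Moodle schema:

```python
# Sketch of an aggregate, anonymised activity report of the kind the
# reporting plugins could produce. The "log" table is a simplified
# stand-in, NOT the real Moodle log schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (userid INTEGER, course TEXT, action TEXT)")
conn.executemany(
    "INSERT INTO log VALUES (?, ?, ?)",
    [(1, "DE101", "view"), (2, "DE101", "view"),
     (2, "DE101", "post"), (3, "DE102", "view")],
)

# Aggregate per course: counts only, so no user identifiers are exposed.
report = conn.execute(
    """SELECT course,
              COUNT(DISTINCT userid) AS active_users,
              COUNT(*)               AS events
         FROM log
        GROUP BY course
        ORDER BY course"""
).fetchall()
print(report)  # [('DE101', 2, 3), ('DE102', 1, 1)]
```

Aggregating with `COUNT` rather than returning rows is one simple way of keeping such reports anonymised, though permissions would still need to be handled within Moodle itself.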
These plugins lend themselves more to management information than the micro-level analytics this project is concerned with, and although our Moodle database is at present relatively small, we should be wary of running too much reporting on the live database in case of impact on users. The last two plugins we investigated are more directly associated with learning analytics as applied to this project.

Engagement Analytics Plugin
The first of these is a neat little plugin that displays a block to teachers on their Moodle course, colour coded according to how a student is doing against weightings set on three elements within Moodle. For each course, staff can configure the weightings they want for each element: assessments, forums and login activity. This gives staff a degree of flexibility, but only these three elements are available. The plugin also allows control of the roles covered by the data, allowing us to target the student role; if some students wished to opt out of data gathering, we could define more granular roles. The plugin offers a single display type: the traffic light system, which can be added to the teacher’s course page (there is also a table format report). Although elements can be weighted, the algorithm for working out the red, amber, green status is pre-defined within


the plugin.
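The weighting-and-thresholds idea can be sketched as follows. The weights, sub-scores and thresholds below are invented for illustration and are not the plugin's pre-defined algorithm:

```python
# Illustrative weighted "traffic light" calculation. The weights,
# sub-scores and thresholds are hypothetical, not the plugin's own.

def engagement_rag(scores, weights, amber=0.4, green=0.7):
    """scores and weights are dicts keyed by element name;
    each score is a fraction between 0 and 1."""
    total = sum(weights.values())
    combined = sum(scores[k] * weights[k] for k in weights) / total
    if combined >= green:
        return "green", combined
    if combined >= amber:
        return "amber", combined
    return "red", combined

# A student who submits assessments but rarely posts or logs in:
status, score = engagement_rag(
    {"assessment": 1.0, "forum": 0.2, "login": 0.5},
    {"assessment": 50, "forum": 25, "login": 25},
)
print(status, score)  # amber 0.675
```

The point of configurable weights is that a forum-heavy course can count forum activity more heavily than a lecture-based one, while the red/amber/green mapping stays uniform and easy to read.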

Example of Engagement Analytics plugin

The plugin’s simplicity is an advantage: the traffic light system is easy to recognise and interpret, and staff can choose the percentage of each element that feeds into the thresholds generating the traffic light output. The negatives are that it is not yet available to students, and that it deals only with forum, assessment and login activity. It is a candidate to be piloted on courses on the MSc in Digital Education, which will give us opportunities for further analysis and discussion.

Forum Graph plugin
This plugin looks only at forum interaction for individuals, and creates a visual picture of these interactions. An example is displayed below.

Currently, the information on the graph cannot be anonymised, and the quality and scale of the interactions are not evaluated. However, it does provide a neat visual reference and will be piloted on courses on the MSc in Digital Education for further analysis. This plugin is potentially of great interest to the project, as it is highly relevant to the requirements. The presentation style, if shown to be effective, would be of interest for any Learn-based forum analysis report.

LTS adapted plugin
The Learning Technology Services team, previously part of the College of Medicine and Veterinary Medicine and now part of central Information Services, built a Learning Analytics application that is currently in use on some of the courses within Medicine


which use their in-house VLEs. It has been adapted for use within Moodle (with thanks to Matt Hammond and colleagues). A snippet of JavaScript on each page of Moodle sends data out to the Learning Analytics application, which users can then log into via the web. Users can choose how they wish to view the information: by site, by course or by individual user. The chart types displayed are the same whether for a course or a user. They comprise a heat map of activity, a pie chart showing the types of page users accessed, a pie chart of device types, and an Activity (Inter-Quartile Range) chart. The information is broken down by week, and a summary across the top of the charts shows information including unique users, maximum users and peak times of usage. The LTS analytics scripts have been tested on the Moodle DEV site and will be put through the normal delivery channels and added to Moodle Live during the summer. The tool will also be part of the piloting with some courses on the MSc in Digital Education for further analysis and exploration. We will continue to work closely with LTS on this application.
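The week-by-week breakdown and inter-quartile-range view can be sketched like this. The event data is invented and the quartile method is the Python standard library's, so this illustrates the idea only, rather than reproducing the LTS application:

```python
# Sketch: bucket per-user page views by ISO week, then summarise the
# spread of weekly activity with quartiles (the basis of an
# inter-quartile-range chart). Data here is invented for illustration.
from collections import Counter
from datetime import date
from statistics import quantiles

events = [  # (userid, date of page view)
    (1, date(2014, 6, 2)), (1, date(2014, 6, 3)), (2, date(2014, 6, 4)),
    (1, date(2014, 6, 9)), (2, date(2014, 6, 10)),
    (3, date(2014, 6, 11)), (3, date(2014, 6, 12)),
]

# Count events per (ISO week, user)
per_week_user = Counter((d.isocalendar()[1], uid) for uid, d in events)

# Spread of per-user activity in the week starting 9 June 2014:
target = date(2014, 6, 9).isocalendar()[1]
week = sorted(n for (wk, _), n in per_week_user.items() if wk == target)
q1, median, q3 = quantiles(week, n=4)
print(week, median, q3 - q1)
```

The quartile summary (median plus inter-quartile range) shows a teacher whether weekly activity is evenly spread or driven by a few very active students, which a simple total would hide.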

Activity in Moodle community
There is currently a limited amount of activity related to learning analytics, or indeed analytics as a whole, within the Moodle community, as evidenced by the relatively small number of plugins available. However, interest within the community is growing. At this year’s Moodle Research Conference a working group on Student Engagement Analytics was planned, but unfortunately the conference was cancelled. We are following up to see if there is interest in starting a virtual working group in its place. This has generated some useful conversations around learning analytics, and a Moodle forum has been started relating specifically to Reporting and Analytics; we are keeping up to date with these discussions. As part of this project we asked the wider JISC Moodle UK email list about their experiences of learning analytics. While we received only one response about an actual project, from the University of London Computing Centre, others responded with interest to hear more about our project. We plan to keep in touch with these institutions throughout our learning analytics projects.

Other tools and developments of interest

Google Analytics
The Google Analytics service has been installed on Learn for a number of months and has recently been added to Moodle as well. This gives us a view of how our users are interacting with the VLEs at a general level, with information about logins, devices used, pages visited and duration of visit. It also gives us a high-level overview of user behaviour and technology trends. Google Analytics does not provide the individual student information required for this project, due to its anonymous nature. However, it can provide useful insight into how people are interacting with courses, and combined with learning analytics tools and other management information tools we can build up a more powerful picture of use. We intend to explore further options for Google Analytics with interested members of staff, to discuss which reports and information will be most useful. The scope of information potentially available in Moodle amounts to almost an overload of information, where previously there had been relatively little. The Google Analytics options for both VLEs will be taken forward separately to the learning


analytics project but the team will ensure the developments inform each other to help staff receive a fuller picture of interaction on the VLEs.

Internal Initiatives
When this project started there were already a number of initiatives and strategic planning objectives relevant to this area.

Personal Tutors developments
The University has placed a high priority on improving student support and engagement, and the personal tutors system is a key part of the developing plan. Improvements to the information available to personal tutors, and support for their interactions with students, are under discussion at institutional, college and school level. For on-campus students in particular, VLE activity will only ever form a small part of the picture, which would also include library data and interactions with personal tutors as well as assessment data from central student systems.

Institutional initiatives
Within the University a number of systems and projects have the potential to contribute data to this wider institutional analytics picture. Project staff engaged with these have been contacted in the course of this investigation, and there will be a continuing need to keep in touch across these projects and initiatives. These include:

the Programme and Course Information Management project (PCIM),
the Student timetabling project,
the Student Experience initiative,
EUSA work on peer support,
procurement and installation of a new library management system, and
procurement of a new electronic voting system for classroom teaching.

The University is also supporting research into learning analytics in a number of ways, including the creation of a new Chair in Learning Analytics as an initiative across the Schools of Education and Informatics. Within the College of Medicine and Veterinary Medicine, Dr Paula Smith has been working closely with colleagues in Learning Technology Services to develop tools which track aspects of student activity online. This builds on work started some years ago to provide effective interventions for “at risk” students on selected online distance learning programmes, and allows the data to be gathered and presented automatically. At the time of this project, work on providing student-facing views of some of this data is continuing. Because the technology used for this can be applied to other data sets, this project is exploring its use with Moodle, as described in the section on Moodle above. With support from IS and her College, Paula has been seconded to the Learning Services team at 0.2 FTE for a year, to carry out further research on the impact of analytics and interventions in this context. An outline of this work is available at https://www.wiki.ed.ac.uk/display/eLPP/Learning+Analytics+brown-bag+Feb+2014

Also within the College of Medicine and Veterinary Medicine, Dr David Hope in the Teaching Organisation has been looking at psychometrics in support of student learning, and has recently been awarded a Principal’s Teaching Award Scheme grant to pilot analytics modelling work with additional schools from the other colleges.

CourseMarks
Within the School of Physics and Astronomy, academic and technical staff have been


working to develop a way of providing an overview of student marks, translated into a “traffic light” system, for Personal Tutors. This draws on the Learn Grade Centre, which the School already uses to pull together assessment grades from weekly activities throughout the year. The data is drawn from the Grade Centre and summarised as a class overview. Initially this view was available only to staff, but work has started on making it visible to students and exploring their views on its helpfulness. A brief outline of this work is available at https://www.wiki.ed.ac.uk/display/eLPP/Learning+Analytics+brown-bag+Feb+2014

In addition, we have been keeping a close watch on the data gathering and analysis work being carried out as part of the University’s MOOCs initiatives. Although the issues of student engagement and retention are very different in the MOOC context, and the data available is different and restricted by the platforms (Coursera and FutureLearn), we believe there will be useful lessons from the data analysis and visualisations which prove most useful in that context.

4. Information gathering and consultation events Project team members attended external events where possible, and brief summaries of these are reported here.

MoodleMoot 2014
At MoodleMoot 2014 the key themes were mobile and usability, but some of the presentations touched on tools and ideas relevant to this project.

Moodle checklists and progress bar
Checklists can be set up so that an item must be viewed, or a test passed, before it is marked as checked. A student may also manage their own checklists. Students see only their own progress, while teachers see all students’ progress. A progress bar summary on the front page allows you to drill down to see individual items. These plugins have already been requested by one programme team, and so will be added to Moodle after the upgrade. SH has personal experience of using the checklist plugin during an SQL course in Moodle in 2013, and found it really helpful, especially when dipping in and out of the course.

Core Moodle Functions
Becky Barrington from South Devon College talked about using core Moodle functions such as the Gradebook, activity completion and course completion to enhance the tracking of student grades. Teachers are able to see all the grades for their course, set target columns within the Gradebook, and show a calculated ‘grade to date’ or ‘predicted grade’. There is also the possibility of setting weightings within the Gradebook. However, these options involve a degree of customisation and set-up which might not be possible for all users or might require additional support. Activity Completion allows either Moodle or a student to mark an activity as complete; it works particularly well with quizzes and forums, with the activity marked as completed when a particular content item is viewed, a file is submitted, or an item is discussed or replied to. Teachers can see all students and all activities, giving them an overall picture of the group.
Becky also mentioned Course Completion and Checklists: Course Completion is visible to both staff and students (a student sees only their own), and Checklists are a very visual tool to help students see their progress, as detailed elsewhere in this report. This could be an effective way of exploiting existing data, but it requires significant set-up effort and does not quite meet the project’s needs for presenting the class average etc. to students.
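The ‘grade to date’ idea — weighting only the components marked so far — can be sketched as follows. The component names, weights and marks are hypothetical examples, not Moodle's own Gradebook logic:

```python
# Illustrative "grade to date": average the marks entered so far,
# weighted by each component's share of the final grade.
# Component names, weights and marks are invented examples.

def grade_to_date(weights, marks):
    """weights: component -> share of final grade (sums to 1.0);
    marks: component -> mark out of 100, completed items only."""
    done = [c for c in marks if c in weights]
    weight_done = sum(weights[c] for c in done)
    if weight_done == 0:
        return None  # nothing marked yet
    return sum(marks[c] * weights[c] for c in done) / weight_done

weights = {"quiz1": 0.1, "quiz2": 0.1, "essay": 0.3, "exam": 0.5}
current = grade_to_date(weights, {"quiz1": 70, "quiz2": 50, "essay": 65})
print(round(current, 1))  # 63.0
```

Dividing by the weight completed so far, rather than the full weight, is what distinguishes a ‘grade to date’ from a simple running total, and the same figure can serve as a naive ‘predicted grade’ if performance continues unchanged.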


Engagement Tracker - MMU
Emily Webb, based at Manchester Metropolitan University, spoke about student engagement. MMU are working in conjunction with ULCC, who mentioned in a previous email exchange that they were undertaking a learning analytics project. Engagement Tracker is external to the VLE and currently visible only to staff, although MMU would like to roll it out to students in the future. The information is extracted from Moodle logs, and there is an Engagement Tracker block within Moodle itself. The block tells staff how many students are sitting below the engagement threshold and gives access to the reports. Staff are able to vary the threshold to suit the course design and how students will interact with it. The Engagement Tracker is designed to give staff a visual indicator of students who are not engaging. It allows staff to choose what measurements to set, and it is up to staff to add the block. It sits at programme and unit level and can be drilled down into; the programme level shows all the units. They have chosen to stick with a red/green colour scheme for their reports. MMU have also developed a dashboard giving a summarised, week-by-week view of engagement, from which they can drill down further into students’ records. The dashboard is also designed to allow continuous monitoring and improvement: it can be used as a course overview to perform almost a health check on courses, and it shows the satisfaction rating of a course, allowing course organisers to take action. MMU are considering how best to use this data to highlight best practice. We have contacted Emily Webb to request further information about this project. Link to Emily Webb’s slides: http://www.slideshare.net/MoodleMootIreland/tracking-student-engagement-in-moodle-emily-webb

Google Analytics and Moodle - City University
City University London presented an interesting poster on Google Analytics use with Moodle, which combined Google Analytics with analysis of Moodle logs.
See photos of the poster below. City now have three years’ worth of data, and although this is not strictly relevant to this project, we feel it would be worth following up with them and maintaining contact.


It was clear at an early stage of the project that the topic of learning analytics was of widespread interest across the University, among both academic and support staff. So in addition to specific project requirements gathering, the team contributed to some more wide-ranging events.

eLearning@ed seminar
In partnership with the eLearning@ed Forum, we organised an event in February 2014 to showcase some of the activity in this field and to introduce the project. In addition to the project introduction, speakers were Paula Smith (CMVM), describing her research on predictive analytics; Karen Howie (CHSS), outlining proposed research to explore student retention on online programmes; and Keith Brunton (SEE), outlining the CourseMarks work in the School of Physics. See https://www.wiki.ed.ac.uk/display/eLPP/Learning+Analytics+brown-bag+Feb+2014 for further information, notes and presentations. This event attracted 23 staff from across the University, and stimulated some helpful discussion and direction, as well as putting us in touch with a number of staff we might not otherwise have reached.

Moodle Consultant visit
In April 2014 we took the opportunity of a visit from Gavin Hendrick (Moodle consultant) to organise a meeting, inviting members of the University’s Moodle community, the TEL009 project team and others interested in learning analytics. The event was informal and allowed general conversation around analytics and the various aspects of Moodle which could be used to provide relevant data. A more detailed note of the meeting and recommendations is on the analytics discussion wiki; the key discussion points and recommendations are summarised here.

Why Analytics?
Analytics may help to predict student results and performance, and to flag disengagement early enough to allow effective intervention. Some studies have indicated that simple measures can be predictors of students “at risk” - e.g. failure to log in during the early weeks, or failure to upload a photo when asked. There is a need to ensure students and staff are kept informed of what data is held, how it is manipulated, and what is available to others. Institutional context is important, and for the University of Edinburgh this may mean using analytics to support students and thereby improve student satisfaction.


Staff engagement is key to the successful introduction of learning analytics.

Considerations
Discussion also ranged across some of the more contentious measures, where there is no general agreement on what the data might mean. For example, it may be simple to measure forum participation but less clear what participation might mean: quality is arguably more important than quantity. Staff and student trust in analytics measures, and in the appropriate interpretation of those measures, is crucial to success. Analytics can only ever be part of an overview of student activity, especially where much activity may take place offline or at least outside the University’s formal systems - an example was cited of Engineering students setting up their own Facebook groups. Any database work must be considered against core performance - the general advice is never to run queries directly against the live database. Course design can help make sense of analytics data, and staff should consider the extent to which they help students understand what is expected of them - what we mean by “engagement”.

Course Design & Minimum Standards
There was a discussion around course design, which is not routinely analysed at Edinburgh. As Moodle courses are entirely ODL at present, it may be possible to draw some inferences from analytics on those courses which would not apply to all the courses using Learn, which are predominantly on-campus. Gavin Hendrick considers information architecture for courses, and a standard course design template, extremely important ways of ensuring good course design. The group did not consider this a feasible way forward for the University of Edinburgh, although there was a suggestion that minimum standards for online course presence could usefully be pursued. This may fit in well with University strategic planning for increased online course availability, with implications for learning technologists and teaching staff working together on the design of online and blended courses.

Project events
Following on from this, and from the preparatory literature review described above, a series of consultation events with staff and students was planned. After a brief outline of the project’s aims and scope, participants were asked to create User Stories. This is a well-established requirements-gathering technique which we felt especially helpful in this context: it frees participants from any need to consider specific technologies, but requires them to think about why a particular feature might be of use. Participants were then asked to sort and rate the different stories, which resulted in some useful and occasionally lively discussion. The User Stories technique was especially helpful for the staff events, and detailed results from the user stories analysis are described below.

Student Information gathering
In spite of widespread publicity for student-facing events, including support and promotion by EUSA, we were unable to recruit any students for the planned User Stories events, even for the online Collaborate session directed mainly at ODL students. Instead, EUSA kindly hosted a lunchtime event for all class reps, EUSA staff and sabbaticals, which attracted over 30 students from across the University, including both undergraduates and postgraduates. This event was more of an open discussion, with only a short period at the end devoted to collecting user stories. These stories have been


included in the analysis below, but the student views in general are summarised here from notes and a transcript of the event. The format worked well, as it helped to tease out a wide range of ideas and questions through the students discussing learning analytics amongst themselves. The main themes from the student session were:

Most of those present liked the idea of seeing their grades and activity in relation to the rest of their class, for example a comparison between an individual grade and the class average. It should be noted that some courses already do this for their students, as the tools are already available.

There was strong support for the use of charts and images rather than tables of data.

There was also support for making the information available all the time but not “in your face”, giving individuals control over when, and when not, to engage with the information.

They wished to have a level of personalisation available so that the information presented is not overwhelming: for example, they wanted to be able to control how notifications are received and what would trigger them. The idea of emails being generated when a student has not logged into the system for a particular length of time had a mixed reception, and would not be acceptable if it were not configurable.

Personal tutor access to and use of learning analytics information would be key to effectiveness. Most attendees felt that tutoring and teaching staff on specific courses might not be able to act upon the information at an individual level, especially for courses with large cohorts.

It was felt that while emails to notify students could be helpful there would need to be a careful balance struck between contacting when needed and “spamming”. See the comments on personalisation above.

The majority felt that whatever is developed they should be able to access this information in the same way as they receive their grades.

They also felt it was very important to have this information available via mobile devices.

Interestingly, there was not a lot of discussion around the idea that this sort of data is intrusive, or concern about security. This may in part be down to the type of student attending this event, and may also be due to their expectation that the university gathers some of this data anyway. We did confirm, after questions, that individual students would see only their own data and that other information such as averages and tool usage would be anonymised. One suggestion which was generally welcomed was the notion of providing information about the use of course content and tools. The example given was to track use of lecture recordings and use this as evidence to support the argument for more such recording.

User Stories discussion
A total of 92 “User Stories” were collected: from 18 staff over two sessions and from 32 students in one session. Seven were directly from students; 16 were from staff presenting exemplar student users. The remaining stories represented the viewpoints of a rich variety of staff types: teacher,


academic development adviser, instructor, personal tutor, student support officer, learning technologist, lecturer, programme leader, librarian, course secretary. Existing information from the VLE user groups and from other related in-house surveys & focus groups was used along with staff feedback to frame additional conversations during the staff and student consultation events. Stories were classified with respect to the project scope and aims (total is more than 100%):

Mandatory 4

In scope 38

Already available / doable with current technology (at least in part) 35

Not in scope but of interest for future 21

Not in scope / already doable (at least in part) 6

Stories were also classified by the type of analysis requested (e.g. of grades, logins) and the type of access / view requested (e.g. individual against cohort, or graph versus table). These are the groupings that give us most information about where we might direct initial efforts for the follow-on projects, and where we might consider adaptations of existing tools and views. The summary presented here is an extract based on our classification and interpretation of these stories. The full set is presented as an appendix to this report, and we intend to revisit these in the course of the 2014/15 follow-on work, refining and adding to the groupings as we gather further input from users.

User stories - 4 are already mandatory for the project

Source As a… I want to… so that…

Staff Study development adviser

know if students know that data is recorded at an individual level

I am aware of the policy on this.

Staff academic development advisor

be assured that students can track the same activity/events that I can see

equality and fair play are maintained and a "big brother" feeling is not an issue

Staff student know exactly what other students and staff can see

I know who is reading (a) my work and (b) the comments on my work

Staff Teacher tell students exactly what I can / will look at

they know I'm not spying on them

Some mention of ensuring students know what data we hold in VLEs, and who has access to it, also surfaced in the student focus group conversations. Addressing these issues:

Project transparency requires that we ensure students are consulted at all stages.

We are not collecting new / additional data - the project centres around student access to data, and additional staff access where this is not currently available.

Staff support and development work should include supporting information on ensuring students are aware of information held in VLEs.

Ensure that data is anonymised except for staff and individual students on the course.

It is interesting to note a comment from colleagues at MMU on a similar project: their students assumed that the institution was already collecting and using a lot of data about them. Staff showed more concern for contextualising the data for students, and for ensuring appropriate interpretation of output measures. Perhaps surprisingly, students felt strongly that they wanted to control when they looked at this information, and did not want to have it 'pushed' at them.
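To make the anonymisation principle concrete, the following is a minimal illustrative sketch - not project code, with hypothetical names and invented data - of a student-facing view that exposes only the requesting student's own mark alongside anonymised cohort aggregates:

```python
"""Illustrative sketch only: anonymised student-facing grade summary.
All identifiers and data here are hypothetical, not project code."""
from statistics import mean, median

def student_view(marks: dict[str, float], student_id: str) -> dict:
    """Return the requesting student's own mark plus anonymised cohort
    aggregates - no other individual student is identifiable."""
    cohort = list(marks.values())
    return {
        "my_mark": marks[student_id],
        "cohort_mean": round(mean(cohort), 1),
        "cohort_median": median(cohort),
        "cohort_high": max(cohort),
        "cohort_low": min(cohort),
        "cohort_size": len(cohort),
    }

# Invented example data: marks keyed by (hypothetical) student ID.
marks = {"s1": 72.0, "s2": 58.0, "s3": 65.0, "s4": 81.0}
view = student_view(marks, "s2")
# view contains s2's own mark plus aggregates only.
```

The key design point is that the per-student view never includes another individual's mark, only aggregates, matching the confirmation given to students in the focus groups.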

Stories about marks and grades

Both staff and students want to see “how my mark compares with the class average and what the highest and lowest marks are”. For some, presenting this information in an accessible way is important - there were requests for an overview of all coursework marks in one place, and graphs of all results rather than just the mean / median.

N.B. Learn can already be set (by staff) to show students the average and median for any selected grade centre column. Students can see an overview of assignments and grades in their “My Grades” view, including upcoming assignments and tests.

Staff want to see results for groups versus the whole cohort, and to see an overview of all marks for an individual student / tutee in one place. This suggests that even where there is already some relevant functionality built in (as above), some work is needed on the management, location, and format of the data presentation.
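As an illustration of the “all marks in one place, in cohort context” overview requested above, here is a rough sketch; the data, field names, and structure are all invented for illustration, not a committed design:

```python
"""Sketch of a per-student coursework overview with cohort context.
Data layout and names are hypothetical, not a committed design."""
from statistics import mean, median

# Hypothetical grade-centre extract: assignment -> {student ID: mark}.
grades = {
    "Essay 1": {"s1": 62, "s2": 70, "s3": 55},
    "Essay 2": {"s1": 68, "s2": 74, "s3": 60},
}

def overview(student: str) -> list[dict]:
    """One row per assignment: the student's mark plus cohort stats."""
    rows = []
    for assignment, marks in grades.items():
        cohort = list(marks.values())
        rows.append({
            "assignment": assignment,
            "mark": marks.get(student),
            "mean": round(mean(cohort), 1),
            "median": median(cohort),
        })
    return rows

for row in overview("s1"):
    print(row)
```

A real implementation would draw these figures from the VLE grade centre rather than a literal dictionary; the sketch only shows the shape of the requested view.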

Stories about activities

53 of the stories were about activity in the VLE other than grades, such as logins and discussion participation; 30 of these were considered within scope, and a further 21 of some interest if not strictly within scope. Staff and students wanted some sort of overview / activity summary, and to “be able to drill into specific behaviour e.g. login time, length of login, tools used, posts made”.

Many of the stories involved the ability to have data presented interactively - to drill down, explore, and see detailed patterns. This applies both to individual students (seeing an annual overview as well as cohort context) and to the class cohort, wanting to see the pattern of interaction with content and tools to identify gaps, discrepancies or areas to pay more attention to, e.g. for revision.

Some stories from staff explained the desire to see activity information as enabling them to identify students at risk of disengagement and/or failure. This suggests a need also to allow staff to select, from the data available, the subset of relevance to a particular course. In discussion, staff agreed that they would need to contextualise some of this information before it would be of use to staff or students.

Some stories also noted that examination of patterns of access to content and activities such as tests might help staff analyse the course content and design, and consider which parts might need to be reviewed to better support students. Some wanted to use the overview of activities to build models of student use, to help analyse study / work patterns with students and for course design / review purposes.

Some of the requested metrics seem reasonable as staff tools but do not fall within the project scope - e.g. staff wanting to know how often eReserve items are accessed so that support for the service can be justified, and students requesting that views of lecture videos are tracked so that they can better argue for an enhanced service.
Having examined these stories and attempted to classify and conflate similar stories wherever possible, we have reached the following list of options to explore further:

Present data on:

Each assessed event in the VLE (assignments, graded forum posts, participation grades, etc.)

Graded totals across the course where appropriate

Single page overview for each student against cohort, with the ability to “drill down” into a more detailed view of individual assignments or activities

Management of all graded activities - when and where data is released, and who can control e.g. aggregations and weightings. NB Learn and Moodle may have different options here.

Data cumulated over time - e.g. patterns of login times and duration, views of tracked content

Views of non-graded activities, as above

Types of view - e.g. graphs, heat maps, individual patterns against cohort patterns, etc.


Value of different visualisations - e.g. graphs, heat maps, connection networks

Explore views and options already available (in part) in the VLEs, to discuss to what extent they meet the needs of users - e.g. Learn performance dashboard information for staff, Moodle forum participation summary.
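As a sketch of the heat-map style view mentioned above, login timestamps could be bucketed by weekday and hour of day; the data below is invented and the approach is illustrative, not a committed design:

```python
"""Sketch of a login heat map: bucket timestamps by (weekday, hour).
The login data is invented for illustration."""
from collections import Counter
from datetime import datetime

# Hypothetical login timestamps extracted from VLE logs.
logins = [
    datetime(2014, 9, 1, 9, 15),   # Monday, 09:xx
    datetime(2014, 9, 1, 21, 40),  # Monday, 21:xx
    datetime(2014, 9, 2, 9, 5),    # Tuesday, 09:xx
]

# Each heat-map cell is a (weekday, hour) bucket; the count per cell
# would drive the colour intensity in a rendered view.
heatmap = Counter((t.strftime("%A"), t.hour) for t in logins)

for (day, hour), n in sorted(heatmap.items()):
    print(f"{day} {hour:02d}:00  {'#' * n}")
```

The same bucketing works for individual-versus-cohort comparison by building one counter per student and one for the whole class.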

There are existing in-house systems which already present some limited overviews to students - e.g. the Physics 'traffic light' system, drawing on performance data from weekly online tests. These have only been available for student viewing since Semester 2 of 2013/14, and the number of student views is low, but it is too early to draw any conclusions from this.

4. Next steps / Next projects

Planning for work to be undertaken in AY 2014/15 has already started. This can be split into core project work being undertaken by the TEL/TELS teams within the Learning, Teaching and Web Division of IS, and more wide-ranging activity within the University to which we will contribute. Phase 2 activities will include:

Continuing student and staff consultation on project developments, priorities and concerns, including transparency, data handling, and privacy.

Support for and engagement with wider University community on all aspects of learning analytics research and development.

For a detailed description of the 2014/15 project work see the projects website at https://www.projects.ed.ac.uk/planning/project/278

Project work

Project exploring student analytics views from Moodle courses. Pilot with courses on MSc Digital Education in the School of Education planned, using existing Moodle plugins and additional tools developed by Learning Technology Services. Staff and students will participate to evaluate the plugins and information presented, for further development where possible. Research questions for this pilot are currently being formulated.

Project exploring student analytics for Learn courses. We will use an iterative methodology to prototype, evaluate and build Building Blocks to provide appropriate views on existing student activity data.

As well as the project management team, we will aim to create a group of key user representatives who can be consulted regularly as the project development cycle proceeds. This should enable us to evaluate prototypes in a rapid cycle and ensure that the project deliverables satisfy key user requirements.

We are also exploring modelling tools such as User Personas to assist in project development and communications.

Communications and Support - developing and implementing a detailed communications plan for both of the projects above, and consulting with schools, colleges and the student body on the support required to maximise benefit from these projects.


We will explore the risks and ethical dimensions of the uses of student data in this project, leading to a report for LTC to consider and take forward.

Other related activities

Close liaison with other areas of the University with an interest in aspects of learning analytics: including but not restricted to

o EUSA - to help ensure student input, and that any student concerns are addressed throughout

o SACS - to ensure policy and practice remain in step with over-arching management of student data. Links with the PCIM project to develop a new and enhanced programme and course information system will be especially important. Links with initiatives to support personal tutors with data and ensure TELS developments can be appropriately integrated.

o Specific researchers working in this field - e.g. David Hope working on a PTAS funded research project, Paula Smith continuing to research predictive analytics, and various courses and schools introducing tools and student views of activity during 2014/15.

Support and contribute to a developing learning analytics community across the University, both continuing existing connections and supporting new ones. This will help to ensure that planned work is informed by, and supportive of, research enquiries in this area. This will include support for events, promotion of the Learning Analytics Discussion wiki, and the ongoing consultation process for project development.

Dissemination and publication - the project team will seek opportunities to disseminate project work and findings through presentations and publications where possible.

References

1. CETIS Learning Analytics Series, Volume 1. Papers at http://publications.cetis.ac.uk/c/analytics (accessed July 2014)
2. Proceedings of the 2013 Learning Analytics Summer Institute: programme and presentations at http://solaresearch.org/conferences/lasi/lasi2013/ (accessed July 2014)
3. SoLAR: Journal of Learning Analytics. Available from http://epress.lib.uts.edu.au/journals/index.php/JLA/index
4. University of Roehampton fulCRM project: http://www.roehampton.ac.uk/fulcrm/
5. Open University RISE project: http://www.open.ac.uk/blogs/RISE/
6. Open University learning analytics work: http://www.open.ac.uk/iet/main/research-innovation/learning-analytics


Appendix A: Categorised user stories

All stories are grouped here under their different functional categories; for this reason some stories appear more than once.

Averages / grades (10)

Source As a… I want to… so that…

Staff Student

see an overview of my own coursework marks for all my courses, all in the same place, and in context with the rest of the class I can get an informed view of my own progress

Staff Student know where I stand compared to others in the class as well as how I am progressing

I am in control of my learning and can take ownership of my success

Staff teacher see groups vs cohorts I can see scores against the greater cohorts (group v. cohort)

Staff personal tutor

see an overview of each of my tutees' coursework marks all in the same place and in the context of the performance of the rest of the class

I can keep an eye on my tutees' progress in a contextualised way i.e. I will worry less about a poor assignment performance if the whole class did badly too (maybe it was a really hard task).

Staff course organiser have a graph of student results rather than just mean/median mark

students have a visual representation of their place within a class - easier to get a sense of place. This was possible in WebCT and staff and students have requested this be provided if/when possible.

Staff student know exactly what other students and staff can see I know who is reading (a) my work and (b) the comments on my work

staff academic dev advisor

know whether there is a correlation between completing the formative/non-assessed activities and their final mark

I can assess how (or whether) to improve the learning activities, and whether to emphasise their completion

staff student dev advisor download basic data as an excel file so I can present in the way I want I can communicate with colleagues about it

student first year biological student see grade distribution

see a more standard organisation style of folders. Biology students take courses in the schools of chemistry, biology and biomedical sciences, each of which has drastically different methods; a standard style would make it easier

student student see how my mark compares to the class average and what the highest and lowest marks are, with marks anonymised I can track my progress and see how my performance compares to the best and worst marks so I can get an idea of where I stand

Activity not grades (43)

Source As a… I want to… so that…

Staff Student receive an activity summary at the end of the year compared to average usage

I can see if I've neglected areas during the year, show areas for improvement

Staff Student be able to drill into specific behaviour e.g. login time, length of login, tools used, posts made I can pin-point specific areas of concern

Staff Student see my own activity profile holistically I can see the bigger picture of my behaviour and inform my own decisions

Staff Student know how often other people in my class access reading materials/ereserve

I can keep up with them, know that I'm keeping up with the rest of the class

Staff Student see data in an interactive format I can explore the data and engage with it

Staff Student be able to see how my behaviour compares with my peers I can learn from my activity and the activity of others

Staff Instructor learn how much time a student has spent in a forum/thread I can tell if a student is accessing content (i.e. is a lurker)

Staff teacher/examiner explore pattern of student "engagement"

I can see whether it would be appropriate/useful/valuable to add as elements of assessment

staff teacher / personal tutor have measures of activity of the students I can be alerted to students who are "falling by the wayside"

staff teacher see detailed patterns of activity in the VLE I can judge whether some activities are more typical of "deep learning" or the "strong student"

staff personal tutor see all of my students behaviour and that of their peers I can advise my tutee appropriately

staff academic development adviser

know how long students "appear" to be engaging with an activity / on a page etc.

I can assess whether the time I have allotted for a unit is correct

staff student support officer see which students are failing to engage with the course content / assignments I can intervene with support

staff instructor know the total time a student has spent online I can flag if a student has problems attending

staff lecturer have a deeper insight into discussion board activity I can view the connections between students & staff in the boards - overview of network communication. Students can see who their strong connections are, or if there are students they haven't had the opportunity to connect with yet.

Staff study development advisor exclude members of staff from the data it reflects student use accurately

staff instructor see which course modules students have viewed, when and for how long

I can know if students are engaging with the content/course and flag students who have bad attendance, which could indicate trouble logging in

Staff instructor know the last time a student has logged into the course I can flag potential attendance issues

Staff

a teacher at masters level on an education programme show student "online behaviour" to the student group we can talk about study patterns and styles

Staff learning technologist be able to see charts of students' activity in real time and compare to others

I can test the water and see how practice in some schools/courses may compare with others.

Staff instructor see how many comments a student has made in the forums I can gauge engagement levels

Staff lecturer be able to access reports like Talis Aspire - "popular" materials (click & clicks per users)

we can clearly see what materials the students access most/least and compare with course feedback and staff/students meetings. Ensure most effective materials used and gaps in knowledge/info highlighted.

staff programme leader

know if there is a correlation between programme retention and a) location b) academic background c) online activity I can support students better

staff teacher know which students are feeling isolated I can contact them at an appropriate time

staff librarian know how often e-reserve items are accessed we know we are justified in supporting the service

staff course organiser see which items have been accessed by the class (and to what extent) I can judge how much content is being used

staff course organiser\librarian

know how many times students in a course have viewed a reading list

I can correlate success\results with how often the reading list is viewed e.g. students with better marks have accessed the list more often\less often

staff student support officer see when students have signed up for a group in Learn (i.e. the exact time) I can tell which students are disorganised

staff Instructor see an overall view of the course in access levels of different modules, forums, engagement, activity

at a glance I can get an overall picture of the course's performance

staff study development advisor

get below content areas in learn to see which items students look at I can see what students are going to

staff study development advisor

see which resources (pdf, docs, slideshows etc.) are downloaded/viewed most often (what is accessed and how many users have accessed it) I know what students find more useful

staff academic dev advisor know which of the tools I am using to create the learning events are most popular

I can focus on using\dev the most popular tool even better

staff instructor see how many students click on links (internal\external web links)

I can tell if students are looking at the resource and following the course structure I have set out

staff course designer have a record of where people visit or don't visit I can judge where design elements may not be working properly

staff academic dev advisor know when my students are most likely to login to moodle

1) I can anticipate when they need online support 2) when I can expect most students to be available for synchronous live lectures

staff teaching secretary be able to give markers their own level of access

I can view Turnitin details whilst work is being marked anonymously. This would allow us to check whether or not specific students have submitted their work

staff personal tutor access to materials my tutees see - observer I can support my students

staff librarian how many clicks out to library resources come from the VLE

I can know how much traffic comes from reading links on the VLE

staff student login and see whether or not I am alone I don't feel so alone (simply by knowing others are online with me)

staff academic dev advisor login to moodle and see instantly who is live and online 1) I can assess overall activity 2) I am not in danger of disrupting activity in case I need to edit\fix etc.

staff student know who is on our site or has access to it I won't get any surprises

student first year biological student see podcast access (to argue high usage > value)

student final year student have a timetable\chart of dates when submissions are due (plus possibly weighting of the assignment)

I can keep track of when essays are due and access a neat overview. This way I can plan ahead and organise better.

View of data (46)

Source As a… I want to… so that…

Staff Student see data in a scaleable and coherent manner, no-matter which system the data originates in I don't have to go hunting through systems

Staff Student see data in an interactive format I can explore the data and engage with it

Staff Student have learning analytics to be online 24x7 and accessible via my mobile/tablet I can engage with the data where and when I choose

Staff teacher/examiner explore pattern of student "engagement"

I can see whether it would be appropriate/useful/valuable to add as elements of assessment

Staff teacher see groups vs cohorts I can see scores against the greater cohorts (group v. cohort)

staff teacher / personal tutor have measures of activity of the students I can be alerted to students who are "falling by the wayside"

staff teacher see detailed patterns of activity in the VLE I can judge whether some activities are more typical of "deep learning" or the "strong student"

staff personal tutor see all of my students behaviour and that of their peers I can advise my tutee appropriately

staff learning technologist be able to navigate a learning analytics module simply I can promote ease of use / uptake

staff programme director have an overview of student activity, including assessment progress, across courses

students, particularly vets, can print a report of their activity for CPD logs (professional requirement) I can note where students may require additional support and direct accordingly

staff learning technologist be able to see students that may be at risk through simple warning flags

I can take remedial action, or advise where this may be required

staff instructor know the total time a student has spent online I can flag if a student has problems attending

staff lecturer have a deeper insight into discussion board activity

I can view the connections between students & staff in the boards - overview of network communication. Students can see who their strong connections are, or if there are students they haven't had the opportunity to connect with yet.

Staff study development advisor exclude members of staff from the data it reflects student use accurately

Staff personal tutor

see an overview of each of my tutees' coursework marks all in the same place and in the context of the performance of the rest of the class

I can keep an eye on my tutees' progress in a contextualised way i.e. I will worry less about a poor assignment performance if the whole class did badly too (maybe it was a really hard task).

staff instructor see which course modules students have viewed, when and for how long

I can know if students are engaging with the content/course and flag students who have bad attendance, which could indicate trouble logging in

Staff instructor know the last time a student has logged into the course I can flag potential attendance issues

Staff course organiser have a graph of student results rather than just mean/median mark

students have a visual representation of their place within a class - easier to get a sense of place. This was possible in WebCT and staff and students have requested this be provided if/when possible.

Staff

a teacher at masters level on an education programme show student "online behaviour" to the student group we can talk about study patterns and styles

Staff academic development advisor

be in control of the information/analytics possible with the software/interfaces available

I can act quickly and make evidence-based decisions on curriculum design

Staff learning technologist be able to see charts of students' activity in real time and compare to others

I can test the water and see how practice in some schools/courses may compare with others.

Staff instructor see how many comments a student has made in the forums I can gauge engagement levels

Staff lecturer be able to access reports like Talis Aspire - "popular" materials (click & clicks per users)

we can clearly see what materials the students access most/least and compare with course feedback and staff/students meetings. Ensure most effective materials used and gaps in knowledge/info highlighted.

Staff student know exactly what other students and staff can see I know who is reading (a) my work and (b) the comments on my work

Staff teacher tell students exactly what I can/will look at they know I'm not spying on them

staff academic dev advisor know whether there is a correlation between completing the formative/non-assessed activities and their final mark I can assess how (or whether) to improve the learning activities, and whether to emphasise their completion

staff academic dev advisor be in a position to "follow" individual students on their learning activity path

1) I can tailor\personalise learning objects/events 2) catch the invisible student before it's too late

staff teacher know which students are feeling isolated I can contact them at an appropriate time

staff tutor set up a way to auto mark and score multiple choice questions from a random bank\library of questions

summative (and formative) assessment can be made quicker and results passed to teaching office as a spreadsheet/xml file

staff librarian know how often e-reserve items are accessed we know we are justified in supporting the service

staff postgraduate teaching assistant

be able to see the average course work marks for my group of students compared to class

I can calibrate how harsh/lenient my marking is compared to the other tutors

staff teacher add only my choice of assessment to the grade sheet students aren’t given results for something that was for fun

staff Instructor see an overall view of the course in access levels of different modules, forums, engagement, activity

at a glance I can get an overall picture of the course's performance

staff study development advisor

get below content areas in learn to see which items students look at I can see what students are going to

staff study development advisor

see which resources (pdf, docs, slideshows etc.) are downloaded/viewed most often (what is accessed and how many users have accessed it) I know what students find more useful

staff academic dev advisor know which of the tools I am using to create the learning events are most popular

I can focus on using\dev the most popular tool even better

staff instructor see how many students click on links (internal\external web links)

I can tell if students are looking at the resource and following the course structure I have set out

staff course designer have a record of where people visit or don't visit I can judge where design elements may not be working properly

staff personal tutor access to materials my tutees see - observer I can support my students

staff librarian how many clicks out to library resources come from the VLE

I can know how much traffic comes from reading links on the VLE

staff student login and see whether or not I am alone I don't feel so alone (simply by knowing others are online with me)

staff student know who is on our site or has access to it I won't get any surprises


student first year biological student see grade distribution

see a more standard organisation style of folders. Biology students take courses in the schools of chemistry, biology and biomedical sciences, each of which has drastically different methods; a standard style would make it easier

student first year biology student see podcast access (to argue high usage > value)

student final year student have a timetable/chart of dates when submissions are due (plus possibly the weighting of the assignment)

I can keep track of when essays are due and access a neat overview. This way I can plan ahead and organise better.

student student

see how my mark compares to the class average and what the highest and lowest are and marks are anonymised

I can track my progress and see how my performance compares to the best and worst marks so I can get an idea of where I stand
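The story above asks for anonymised class comparison figures. A minimal sketch of the summary such a view would compute (function and field names are illustrative, not from the report or any VLE API):

```python
# Hypothetical sketch: summarise class marks so a student sees the
# average, highest and lowest without any names attached.

def class_summary(marks, my_mark):
    """Return anonymised class statistics alongside the student's own mark."""
    return {
        "my_mark": my_mark,
        "class_average": round(sum(marks) / len(marks), 1),
        "highest": max(marks),
        "lowest": min(marks),
    }

# made-up marks for one course
summary = class_summary([55, 62, 71, 48, 80, 67], my_mark=67)
```

Only aggregate figures leave the function, so no individual mark other than the student's own is exposed.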

Other (38)

Source As a… I want to… so that…

Staff learning technologist be able to evidence the benefit of learning analytics via case studies/examples we can focus further development effort and updates

staff academic dev advisor

know whether there is a correlation between completing the formative/non-assessed activities and their final mark

I can assess how to (or whether to) improve the learning activities and also whether to emphasise their completion
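Several stories here ask about correlating online activity with final marks. A minimal, library-free sketch of the Pearson correlation such an analysis would start from; the figures are invented for illustration only:

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# made-up data: formative activities completed vs final course mark
completed = [2, 5, 1, 8, 6, 3]
final_mark = [48, 60, 45, 75, 66, 55]
r = pearson(completed, final_mark)
```

A correlation alone would not show causation, of course; it would only flag whether emphasising completion is worth investigating further.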

staff academic dev advisor be in a position to "follow" individual students on their learning activity path

1) I can tailor/personalise learning objects/events 2) catch the invisible student before it's too late

staff programme leader

know if there is a correlation between programme retention and a) location b) academic background c) online activity I can support students better

staff teacher know which students are feeling isolated I can contact them at an appropriate time

staff tutor set up a way to auto mark and score multiple choice questions from a random bank/library of questions

summative (and formative) assessment can be made quicker and results passed to teaching office as a spreadsheet/xml file

staff student dev advisor download basic data as an excel file so I can present in the way I want I can communicate with colleagues about it

staff student see how many clicker questions I'm getting right in each lecture I can get a sense of my performance


staff postgraduate teaching assistant

be able to see the average course work marks for my group of students compared to class

I can calibrate how harsh/lenient my marking is compared to the other tutors

staff student support officer see when students have signed up for a group in learn (i.e. the exact time) I can tell which students are disorganised

staff course organiser create a space students can add their own content

staff personal tutor establish an easy mechanism I can contact students individually or in groups that I decide (not the system)

staff student know when the system isn’t working properly I know that I haven’t broken it

staff course secretary sync the contents of the grade book with offline marks data

I can easily enter the marks online before uploading to learn

staff teaching secretary see fewer emails coming from learn by default when I send an email from learn it doesn't get ignored and students aren't inclined to complain

staff academic dev advisor know when my students are most likely to login to moodle

1) I can anticipate when they need online support 2) when I can expect most students to be available for synchronous live lectures

staff program director customise a default moodle template (if available) the ODL programme can be set up more quickly

staff administrator record marks in learn & transfer them to SMART or EUCLID

human error of having to enter marks multiple times is out of the equation

staff tutor set up an editable online glossary of computational chemistry terms the students may add to it and undergo peer review

staff teaching secretary be able to see the emails that I have sent previously in learn

I can send subsequent emails more efficiently and keep a record of who the recipients were at the time

staff teaching secretary be able to give markers their own level of access

I can view Turnitin details whilst work is being marked anonymously. This would allow us to check whether or not specific students have submitted their work

staff teaching secretary

be able to view a list of essays for a given tutor/marker along with the marks in Turnitin as the "by groups" link does not allow for this currently

moderators can get an idea of how people are marking or so that samples can be taken from Turnitin more easily

staff course organiser

download learn test results in a really simple format (e.g. one spreadsheet row per student, answer options as A, B, C etc. rather than as text or tabs)

so that I can do my own further processing and analysis without having to laboriously reformat the data
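The export the course organiser describes is a pivot from one-row-per-answer data to one row per student. A minimal sketch of that reshaping; the column names are assumptions, not the actual Learn export format:

```python
# Hypothetical sketch: pivot (student, question, option) records into
# one dict per student, with the chosen option letter per question column.

def one_row_per_student(rows, questions):
    """Group answer records by student, one column per question."""
    by_student = {}
    for r in rows:
        by_student.setdefault(r["student"], {})[r["question"]] = r["option"]
    return [
        {"student": s, **{q: answers.get(q, "") for q in questions}}
        for s, answers in sorted(by_student.items())
    ]

# made-up raw export, one row per answer
raw = [
    {"student": "s101", "question": "Q1", "option": "A"},
    {"student": "s101", "question": "Q2", "option": "C"},
    {"student": "s102", "question": "Q1", "option": "B"},
]
table = one_row_per_student(raw, ["Q1", "Q2"])
```

Each resulting dict maps directly onto one spreadsheet row, with blanks where a student skipped a question, so it can be written out with `csv.DictWriter` for further analysis.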


staff teaching secretary be able to flag groups as tutorial groups specifically along with time and location information

the information is available for integration into future timetables or even EUCLID

staff personal tutor access to materials my tutees see - observer I can support my students

staff teaching secretary be able to alter the rounding options for columns in the grade centre

grade centre can calculate the whole course marks with greater accuracy

Staff academic be able to see case studies of use in other schools/programmes I can modify my practice

staff teacher of education allow students to explore VLE social networks (SNAPP) we can talk about the social foundations of learning

staff librarian how many clicks out to library resources come from the VLE

I can know how much traffic comes from reading links on the VLE

staff student login and see whether or not I am alone I don't feel so alone (simply by knowing others are online with me)

staff teacher be able to see where my students may be paired/buddied with each other to best effect to target students' ability to support each other

staff academic dev advisor know whether the type of device used makes a difference to the learning experience I can improve online induction activities

staff academic dev advisor login to moodle and see instantly who is live and online 1) I can assess overall activity 2) I am not in danger of disrupting activity in case I need to edit/fix etc

staff student know who is on our site or has access to it I won't get any surprises

student first year biology student see podcast access (to argue high usage > value)

student postgraduate student find out more about the grade system so that it is more systematic and holistic

student postgraduate student log into email system on mobile learn

student postgraduate student personalise my page so that I see my priorities


Already do-able - part or whole (31)

Source As a… I want to… so that…

staff tutor set up a way to auto mark and score multiple choice questions from a random bank/library of questions

summative (and formative) assessment can be made quicker and results passed to teaching office as a spreadsheet/xml file

staff student dev. advisor download basic data as an excel file so I can present in the way I want I can communicate with colleagues about it

staff instructor get deadline alerts I can see which students have been struggling to meet deadlines for assessments

staff student see how many clicker questions I'm getting right in each lecture I can get a sense of my performance

staff librarian know how often e-reserve items are accessed we know we are justified in supporting the service

staff course organiser see which items have been accessed by the class (and to what extent) I can judge how much content is being used

staff course organiser/librarian

know how many times students in a course have viewed a reading list

I can correlate success/results with times the reading list was viewed e.g. students with better marks have accessed the list more often/less often

staff postgraduate teaching assistant

be able to see the average course work marks for my group of students compared to class

I can calibrate how harsh/lenient my marking is compared to the other tutors

staff student support officer see when students have signed up for a group in learn (i.e. the exact time) I can tell which students are disorganised

staff teacher add only my choice of assessment to the grade sheet students aren’t given results for something that was for fun

staff course organiser create a space students can add their own content

staff personal tutor establish an easy mechanism I can contact students individually or in groups that I decide (not the system)

staff student know when the system isn’t working properly I know that I haven’t broken it

staff course secretary sync the contents of the grade book with offline marks data

I can easily enter the marks online before uploading to learn

staff study development advisor

get below content areas in learn to see which items students look at I can see what students are going to


staff study development advisor

see which resources (pdf, docs, slideshows etc.) are downloaded/viewed most often (what is accessed how many users have accessed it) I know what students find more useful

staff academic dev. advisor know which of the tools I am using to create the learning events are most popular

I can focus on using/developing the most popular tool even better

staff instructor see how many students click on links (internal/external web links)

I can tell if students are looking at the resource, following the course structure I have set out

staff course designer have a record of where people visit or don't visit I can judge where design elements may not be working properly

staff teaching secretary see fewer emails coming from learn by default when I send an email from learn it doesn't get ignored and students aren't inclined to complain

staff academic dev advisor know when my students are most likely to login to moodle

1) I can anticipate when they need online support 2) when I can expect most students to be available for synchronous live lectures

staff student receive an email reminding me of a course work deadline/event on the ODL course I may get the most from the course

staff program director customise a default moodle template (if available) the ODL programme can be set up more quickly

staff administrator record marks in learn & transfer them to SMART or EUCLID

human error of having to enter marks multiple times is out of the equation

staff teaching secretary be able to see the emails that I have sent previously in learn

I can send subsequent emails more efficiently and keep a record of who the recipients were at the time

staff teaching secretary be able to give markers their own level of access

I can view Turnitin details whilst work is being marked anonymously. This would allow us to check whether or not specific students have submitted their work

staff teaching secretary

be able to view a list of essays for a given tutor/marker along with the marks in Turnitin as the "by groups" link does not allow for this currently

moderators can get an idea of how people are marking or so that samples can be taken from Turnitin more easily

staff course organiser

download learn test results in a really simple format (e.g. one spreadsheet row per student, answer options as A, B, C etc. rather than as text or tabs)

so that I can do my own further processing and analysis without having to laboriously reformat the data

staff teaching secretary be able to flag groups as tutorial groups specifically along with time and location information

the information is available for integration into future timetables or even EUCLID

staff personal tutor access to materials my tutees see - observer I can support my students


staff teaching secretary be able to alter the rounding options for columns in the grade centre

grade centre can calculate the whole course marks with greater accuracy


Appendix B: Project Contributors Staff and Students who have contributed to events, consultation and surveys.

Name Dept

Ian Pirie Asst Principal Learning & Development

Tanya Lubicz-Nawrocka EUSA

Alex Munyard EUSA 2013-14

Hugh Murdoch EUSA 2013-14

Dash Sekhar EUSA 2014-15

Guillaume Evrard HSS - Office of Lifelong Learning

Louise Connelly IAD

Kay Williams IAD

Karsten Moerman IAD

Jeff Haywood IS - CIO

Angela Laurins IS - Library Learning Services

Matt Hammond IS - LTS (MVM)

Michael Begg IS - LTS (MVM)

Geir Granum IS Apps

Richard Good IS Apps

Fiona Littleton IS Learning Services

Mark Wetton IS Learning Services

Wilma Alexander IS Learning Services

Amy Woodgate IS Special Projects (MOOCs)

Anne-Marie Scott IS TELS

Myles Blaney IS TELS

Stephannie Hay IS TELS

Paula Smith MVM – IS Secondee

Christina Mainka MVM - Wellcome Trust Clinical Research

David Hope MVM Teaching Org

Barry Nielson SACS

Nicola Kett SACS

Lisa Dawson SACS - Student systems

Chris Giles SACS - Student systems

Karen Howie School - HCA

Simon Cann School - PPLS

Toni Noble School - PPLS

David Rogers School of Chemistry

Christine Sinclair School of Education

Hamish MacLeod School of Education

Erin Jackson School of Law


George Kinnear School of Mathematics

Sharon Boyd School of Vet Sciences

Ross Galloway School of Physics and Astronomy

