Learning Analytics Examples

Learning Analytics (LA) is a broad term that spans a wide range of activities: from instructors testing the effectiveness of learning approaches, to instructors and advisors determining the efficacy of particular learning interventions, to researchers asking basic questions of learning data to gain insights into individual performance or learning strategies, to institutional approaches used for program planning or reporting.

Purposes of using LA vary greatly, and stakeholder groups are diverse in their roles and interests. The following examples illustrate some existing use cases.



OnTask: A tool that uses the data points teachers identify as important to help them track students’ progress and give more frequent, personalized feedback
OnTask integrates data from any online learning tool (Learning Management System, Classroom Response System, Discussion Board, Quizzes, Attendance, etc.) and lets you, the instructor, define progress indicators. You can view a dashboard of student progress metrics, individual and aggregated. Even more powerfully, OnTask allows you to easily send custom email or text feedback to subsets of students who aren’t yet meeting performance indicators, giving them a personalized reminder of what they need to do to catch up.
For more information, see: 
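To make the idea concrete, here is a minimal sketch, in Python, of the kind of rule-based workflow OnTask supports: select students who have not yet met an instructor-defined progress indicator and generate a personalized message for each. The student records, indicator thresholds, and message wording below are invented for illustration and are not OnTask’s actual data model or API.

```python
# Illustrative sketch (not OnTask's API): pick out students who have not yet
# met a progress indicator and compose a personalized reminder for each.

students = [
    {"name": "Alex",  "email": "alex@example.edu",  "quiz_avg": 42, "videos_watched": 3},
    {"name": "Priya", "email": "priya@example.edu", "quiz_avg": 78, "videos_watched": 9},
]

# Instructor-defined indicator: quiz average below 50% OR fewer than 5 videos watched.
def behind(s):
    return s["quiz_avg"] < 50 or s["videos_watched"] < 5

for s in filter(behind, students):
    message = (
        f"Hi {s['name']},\n"
        f"Your quiz average so far is {s['quiz_avg']}% and you have watched "
        f"{s['videos_watched']} of the assigned videos. Reviewing the Week 3 "
        f"videos before Friday's quiz should help you catch up."
    )
    print(f"To: {s['email']}\n{message}\n")
```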



Threadz: A tool that generates network visualizations so you can ‘see’ who is communicating with whom in your course
 
It’s hard to monitor class discussion boards and give fair grades for participation. The Threadz tool gives instructors manipulable visualizations of what’s happening in the forums, and reliable metrics indicating degree of engagement. Week by week, you can easily discover who is engaged and who is not, whose responses are going unanswered, and whether you are achieving your learner engagement goals for your course.
For more information, see: 
https://threadz.ewu.edu/
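As a rough illustration of what sits behind such visualizations, the sketch below builds a “who replied to whom” graph from a handful of invented forum records and computes simple engagement counts. Threadz itself works from LMS discussion exports; the data and the metric choices here are only assumptions.

```python
# Build a directed reply graph and compute simple engagement metrics.
# Hypothetical data: each pair is (author_of_reply, author_replied_to).
import networkx as nx

replies = [
    ("Sam", "Lee"), ("Lee", "Sam"), ("Ana", "Lee"),
    ("Sam", "Ana"), ("Kim", "Sam"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

for student in G.nodes:
    replies_made     = G.out_degree(student)  # posts this student responded to
    replies_received = G.in_degree(student)   # responses this student received
    print(f"{student}: responded to {replies_made}, received {replies_received}")

# Students who appear in the graph but have received no replies so far.
unanswered = [s for s in G.nodes if G.in_degree(s) == 0]
print("Going unanswered:", unanswered)
```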

Simple reports that draw on historic data from past offerings and current registration in your course to give you a better understanding of who your students are: demographics, past courses completed, likely future major choices, etc.
 
What might you change about your course content or your teaching if you knew which courses your students had already completed, and with what success? Or if you knew how many were likely to opt for a major in your discipline? Knowing your audience is key to starting off a course at the right level with the right tone. A ‘Know thy students’ project would deliver, before the start of the term, an aggregated report of students’ prior courses, grades, and engagement levels. Already partly underway in Arts, this project builds on a published use case by Motz and colleagues. Studies suggest good potential to help instructors tailor teaching, curriculum, and expectations based on a greater understanding of their class’s likely areas of specialization, past course history, likely study paths, demographics, etc.
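As a hypothetical sketch of the aggregation step behind such a report, the example below summarizes invented registration records; the column names and fields are illustrative, not the actual institutional schema.

```python
# Invented roster data; a real report would draw on institutional registration
# and grade records before the term starts.
import pandas as pd

roster = pd.DataFrame({
    "student_id":     [1, 2, 3, 4],
    "year_level":     [2, 2, 3, 1],
    "declared_major": ["HIST", "UNDECLARED", "ECON", "UNDECLARED"],
    "prereq_grade":   [72, 85, 64, 90],  # grade in the main prerequisite course
})

summary = {
    "class size": len(roster),
    "year levels": roster["year_level"].value_counts().to_dict(),
    "declared majors": roster["declared_major"].value_counts().to_dict(),
    "mean prerequisite grade": roster["prereq_grade"].mean(),
}
for label, value in summary.items():
    print(f"{label}: {value}")
```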

Discover how students really move through courses in your programs

Your department has some lore about student pathways, but how do different ‘kinds’ of student really progress through your degree programs? Do their enrolments bear any relation to the carefully sequenced curriculum that your program has designed (or not designed)? A pathways tool would allow departments to visually explore patterns of enrolment and answer questions like “What proportion of students who take prerequisite course X then take course Y the next term, the next year, or after a longer gap?” or “Is there a big difference in the proportion of students in different sections of the same introductory course who go on to major in the program?”
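The first of those questions reduces to a small calculation over enrolment records. The sketch below uses invented data and sequential term indices purely to illustrate the query; a real pathways tool would run against the student information system.

```python
# Of the students who took prerequisite course X, what proportion took course Y
# the next term, within a year, or after a longer gap? (Invented records.)
import pandas as pd

enrolments = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 4, 4],
    "course":     ["X", "Y", "X", "Y", "X", "X", "Y"],
    "term":       [1, 2, 1, 4, 1, 2, 6],  # sequential term index
})

x = enrolments[enrolments.course == "X"].set_index("student_id")["term"]
y = enrolments[enrolments.course == "Y"].set_index("student_id")["term"]
gap = (y - x).dropna()   # terms elapsed between taking X and taking Y
took_x = len(x)

print(f"took Y the next term:      {(gap == 1).sum() / took_x:.0%}")
print(f"took Y within a year:      {(gap <= 3).sum() / took_x:.0%}")
print(f"took Y after a longer gap: {(gap > 3).sum() / took_x:.0%}")
print(f"never took Y:              {(took_x - len(gap)) / took_x:.0%}")
```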



An instructor dashboard that displays learner engagement with material and activities, along with ongoing student performance. It can be used in courses with multiple sections or large enrollments through an interactive, multilevel heat map.

Instructors in large online or blended courses need tools that can help them monitor student online activity and performance, so they can provide support and intervention as needed. Ginda and colleagues from Indiana University have proposed an activity dashboard design that significantly improves upon the native Canvas dashboard.
It allows instructors to monitor student activity and performance in courses with multiple sections and large enrollments through an interactive, multilevel heat map. The top level shows an aggregated view with weekly activity (based on participation in discussions, page views, and quiz attempts) and submission grades for all course sections. Selecting a specific cell drills down to the engagement or grade data for individual students in a lower-level heat map. A preliminary user study showed that instructors found the dashboard an improvement over the existing embedded dashboard in Canvas.
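A minimal sketch of the top-level, aggregated view described above, using invented weekly activity counts per section; the actual dashboard draws on Canvas activity and grade data and supports interactive drill-down to individual students.

```python
# Render an aggregate weekly-activity heat map: rows are sections, columns are
# weeks, each cell is the total of page views, discussion posts, and quiz
# attempts that week. Data is invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

sections = ["Sec 001", "Sec 002", "Sec 003"]
weeks = [f"W{i}" for i in range(1, 7)]
activity = np.array([
    [120,  98, 110,  60,  45, 80],
    [140, 130, 125, 100,  90, 95],
    [ 70,  65,  50,  40,  30, 35],
])

plt.imshow(activity, aspect="auto", cmap="YlOrRd")
plt.xticks(range(len(weeks)), weeks)
plt.yticks(range(len(sections)), sections)
plt.colorbar(label="activity events")
plt.title("Weekly engagement by section (aggregate view)")
plt.show()
```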

Measure the impact of students’ engagement with course material on their course grades or other indicators of learning
 
Learning Analytics can be used to evaluate the impact of implementing any course element whose use can be measured.
For example, the Collaborative Learning Annotation System (CLAS), built by Arts at UBC, allows students to actively engage with the content by leaving comments and reflections at specific moments of a video. Do the course material and student engagement with it really contribute to learning and performance on course assessments? Pardo et al. analyzed log data from CLAS to investigate the relationship between midterm scores and use of CLAS in a first-year flipped classroom course. The analysis showed a significant positive relationship between annotating videos and midterm results, while additional findings offered suggestions for improving course design and teaching.
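As a simplified illustration of this kind of analysis, the sketch below fits a linear regression of midterm scores on annotation counts. The numbers are invented; Pardo et al. worked from actual CLAS log data and used more careful modelling.

```python
# Does the number of video annotations a student makes relate to their midterm
# score? Invented per-student data; a simple linear regression as illustration.
from scipy import stats

annotations   = [0, 2, 3, 5, 8, 10, 12, 15]   # annotations made per student
midterm_score = [55, 60, 58, 70, 72, 78, 80, 85]

result = stats.linregress(annotations, midterm_score)
print(f"slope = {result.slope:.2f} points per annotation")
print(f"r = {result.rvalue:.2f}, p = {result.pvalue:.4f}")
# A positive, significant slope is consistent with the reported relationship,
# but on its own it shows association, not causation.
```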
 

Visual and statistical tools that investigate past student performance data in selected courses to gain insight into curricular design and offer recommendations
Learning Analytics can offer departments insight into their programs, and guide and inform curricular (re)design. Ochoa et al. suggest a set of analyses that applies statistical and visual techniques to historic student performance data to provide insights into course difficulty, individual course impact on the overall academic performance of learners, curriculum coherence, dropout paths, and the impact of course load on student performance. The techniques were applied and validated for a case study in a Computer Science program using available historical academic performance data. The results informed adjustments to the curriculum. This kind of data-driven curriculum redesign is now being piloted in Arts.
Mendez, Gonzalo, et al. “Curricular design analysis: a data-driven perspective.” Journal of Learning Analytics 1.3 (2014): 84-119. 
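To give a flavour of these techniques, here is a small sketch of two simple proxies, course difficulty and a course’s relationship to overall performance, computed from invented grade records; the published analyses are considerably richer.

```python
# Invented grade records; the real analyses use full historical performance data.
import pandas as pd

grades = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "course":     ["CS101", "CS201", "CS101", "CS201", "CS101", "CS201", "CS101", "CS201"],
    "grade":      [80, 65, 60, 55, 90, 85, 70, 60],
})

# Course "difficulty" proxy: mean grade and spread per course.
print(grades.groupby("course")["grade"].agg(["mean", "std"]))

# Course "impact" proxy: correlation between a student's grade in one course
# and their average grade across the other courses they took.
others = grades[grades.course != "CS201"].groupby("student_id")["grade"].mean()
cs201 = grades[grades.course == "CS201"].set_index("student_id")["grade"]
print("CS201 grade vs. average in other courses:", round(cs201.corr(others), 2))
```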

An activity dashboard that gives students immediate feedback on their level of engagement with pre-class preparation activities


The flipped classroom is a new buzzword in education, although asking students to prepare materials before class is not new, especially in Arts disciplines! However, learning technologies now make material available to students in various new ways, so instructors face the challenge of incentivizing students to complete preparatory work before class. To reinforce students’ preparation work, Pardo et al. (University of Sydney) have developed an activity dashboard that provides students with immediate feedback on their preparation activities. The underlying system compiles and presents data on an individual student’s work on preparatory materials: log data from the learning management system, from the video platform, and from a course website with domain-specific learning activities.
Khan, Imran, and Abelardo Pardo. “Data2U: Scalable real time student feedback in active learning environments.” Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, 2016.
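As a rough sketch of the data-compilation step behind such a dashboard, the example below merges invented per-student counts from three log sources into a single preparation summary and applies a simple status rule. The names, thresholds, and sources are assumptions for illustration, not the actual Data2U implementation.

```python
# Merge per-student event counts from several sources (LMS logs, video platform,
# course site) into one preparation summary shown back to each student.
import pandas as pd

lms_views  = pd.DataFrame({"student_id": [1, 2, 3], "pages_viewed":   [12, 3, 0]})
video_logs = pd.DataFrame({"student_id": [1, 2, 3], "videos_watched": [4, 1, 0]})
site_work  = pd.DataFrame({"student_id": [1, 2, 3], "practice_done":  [5, 2, 1]})

prep = lms_views.merge(video_logs, on="student_id").merge(site_work, on="student_id")

# Simple (invented) rule for the feedback shown before class.
def status(row):
    if row.videos_watched >= 3 and row.practice_done >= 4:
        return "on track"
    if row.videos_watched >= 1:
        return "partially prepared"
    return "not started"

prep["preparation_status"] = prep.apply(status, axis=1)
print(prep)
```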