Learning analytics: Your roadmap to personalising the student experience

As with many other sectors, the transition to online has opened up a wealth of new, accessible information in higher education. Learning analytics applies data analysis to students' interactions with online (and offline) education, with the aim of providing a more personalised learning experience.

The potential for analytics is huge, and universities are beginning to harness the abundance of available data. But many faculties still face the burden of understanding how to develop an analytics strategy, as well as collating and analysing accurate data to make informed decisions that are aligned with student needs.

Ahead of Blended Learning 2015, we examined the various considerations and critical success factors that can enable faculties to gather information from different sources and create a single ‘source of truth’. Extracting actionable insights from student learning patterns will open up new possibilities to improve engagement and guide students to successful outcomes.

The case for learning analytics

The ability to investigate student study behaviour is vital to informing a number of blended learning attributes, including:

  • Teaching practice;
  • Curriculum design;
  • Learner engagement;
  • Course selection pathways.

These attributes reflect an important distinction between two perspectives – the course level and the departmental level – from which both students and faculties stand to benefit. Initiatives can take the form of fact-finding projects (both predictive and reactive); dashboard analysis and reporting (for both students and staff); and a wider institutional approach.

But developing the analytics strategy relies heavily on institutional maturity – the level of sophistication or readiness a university has reached for learning analytics. Once this level is determined, faculties need to identify:

  • The type of data accessible;
  • The tools available to sort and analyse the data;
  • Reporting requirements, which could be ad hoc or more formal;
  • Consultation and engagement with stakeholders.

Some faculties can be tempted to track metrics just for the sake of it, without really considering how information can be extracted from specific data sources. The key is being able to draw out actionable insights that develop the blended learning strategy, such as using patterns of learning to predict individual student performance and implementing interventions where necessary.
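
To make the idea concrete, the short Python sketch below shows one way engagement data from a past cohort could be used to estimate which current students may need an intervention. The file name, column names and threshold are illustrative assumptions, not a description of any particular university's tooling.

# Illustrative only: flag students who may need an intervention, based on
# hypothetical weekly engagement features exported from the LMS.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("engagement.csv")            # one row per student (assumed columns)
features = ["videos_watched", "quiz_attempts", "forum_posts"]

# Train on a past cohort where the final outcome (pass/fail) is already known
past = df[df["cohort"] == "2014"]
model = LogisticRegression()
model.fit(past[features], past["passed"])

# Score the current cohort and flag students with a low predicted chance of passing
current = df[df["cohort"] == "2015"].copy()
current["pass_probability"] = model.predict_proba(current[features])[:, 1]
print(current[current["pass_probability"] < 0.5][["student_id", "pass_probability"]])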

Analytics in action – University of Sydney

At the University of Sydney, a learning analytics strategy for flipped classrooms is spearheaded by Professor Abelardo Pardo, Senior Lecturer in the School of Electrical Engineering, who is responsible for the design and deployment of the unit.
The design includes many active learning resources, such as videos, multiple choice questions embedded in the notes, and sequences of problems for class preparation.

“Once a year,” he says, “there are several innovation projects funded and some of them use analytics to create an effective, blended learning environment. The university also supports us with additional tasks needed to deploy and run these data driven initiatives.”

At a more abstract level, pedagogical strategies are being adopted to increase student participation. Videos followed by multiple choice questions, for example, are a formative assessment resource that doesn't count towards students' actual grades. Instead, they are a way for students to check that they are travelling along the right trajectory towards the learning outcomes.

The resources are designed to ensure student interactions support learning. Professor Pardo notes that different patterns of video interaction have been identified and are checked – in real time – to see whether they are having an effect in the context of flipped learning.
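
As a rough illustration of what identifying such patterns can involve (the event types, file name and columns here are assumptions for the example, not the university's actual pipeline), raw video events might be summarised into simple per-student interaction counts:

# Illustrative only: summarise raw video events (play, pause, seek) from a
# hypothetical log export into per-student, per-video interaction counts.
import pandas as pd

events = pd.read_csv("video_events.csv")   # assumed columns: student_id, video_id, event, timestamp

patterns = (
    events.groupby(["student_id", "video_id"])["event"]
          .value_counts()
          .unstack(fill_value=0)            # one column per event type
)

# Students who repeatedly pause or seek may be signalling difficulty with a concept
print(patterns.head())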

“As a result, we can reach out to individual students and suggest different ways of maximising the usage of these videos. The results we have are still preliminary, but we think these techniques can provide greater support to the students. Our educators are currently supported by eLearning that provides access to several analytics reports,” he explains.

The support provided by eLearning and the reporting capability reflect the wider design strategy – for example, the type of resources; the type of interaction expected; the inclusion of formative assessments; or the number of attempts students are allowed when solving exercises. These variables have a lot of potential to improve the learning design and therefore enhance the overall learning experience.

“We’re creating an environment in which we assume that students have a fair amount of interaction with the material, the instructors and among themselves, because that will increase their overall academic performance,” Professor Pardo notes.
However, creating this environment presents three distinct challenges. The first relates to enabling academics to become proficient with the necessary tools – which could range from creating or annotating videos, to detecting and processing events within them.

The second lies in implementing additional tools that allow educators to create simulations and examine the data they generate. The third challenge is how to combine this data to produce predictive models, identify actionable insights and implement interventions.

“Linking pedagogical strategies with the collected data is not a trivial task,” Professor Pardo observes. “We need to think about what kind of activities would make more sense to include, ensure that proper data is being collected, and deploy the correct analytic procedures.”

Some universities have addressed the variety of available data and reporting by implementing integrated data warehouse applications – or enterprise warehousing. Only a few institutions have adopted this solution, but they are now able to provide multiple stakeholders with different levels of reports and intelligence.

At the University of Sydney, several seminars and discussions on the use of analytics and associated issues have been held, as well as collaborations with other institutions – either through research projects or consulting.

“The most common data source is the learning management system (LMS). However, the LMS data is not enough and it needs to be integrated with other data sources such as student information systems. It’s by combining several sources that we can obtain better insights and align them with the learning outcomes,” Professor Pardo says.

The conventional sources that are considered outside of the LMS are:

  • Initial enrolment information;
  • Student feedback surveys;
  • Number of courses taken by individual students;
  • Average scores.
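
A minimal sketch of how such sources might be brought together is shown below; the file names and column names are assumptions made purely for illustration.

# Illustrative only: join a hypothetical LMS activity export with student
# information system (SIS) records and survey responses on a shared student id.
import pandas as pd

lms = pd.read_csv("lms_activity.csv")          # e.g. logins, video views, quiz attempts
sis = pd.read_csv("student_records.csv")       # e.g. enrolment info, courses taken, average scores
surveys = pd.read_csv("feedback_surveys.csv")  # e.g. unit-of-study survey responses

combined = (
    lms.merge(sis, on="student_id", how="left")
       .merge(surveys, on="student_id", how="left")
)

# A single joined table makes it possible to relate engagement to outcomes
print(combined.groupby("degree_programme")["average_score"].mean())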

The objective is to increase the number of data sources available for analysis and identify important indicators, as well as to scale personalised feedback by using data to identify clusters of students with similar profiles.

By grouping these profiles together, Professor Pardo and his team will be able to communicate specific messages and feedback based on their preferred learning outcomes. This capability will require more detailed data access and algorithms to create the profile categories.
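
A simple clustering routine gives the flavour of how such profile categories could be created; the feature names and the number of clusters below are assumptions for illustration, not the team's actual algorithm.

# Illustrative only: group students into profiles with k-means so that each
# cluster can receive feedback tailored to its pattern of engagement.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("combined_profiles.csv")      # assumed output of the joined data sources
features = ["videos_watched", "quiz_attempts", "forum_posts", "average_score"]

X = StandardScaler().fit_transform(df[features])        # put features on a common scale
df["profile"] = KMeans(n_clusters=4, random_state=0).fit_predict(X)

# Each profile can then be matched to a targeted message or feedback template
for profile, group in df.groupby("profile"):
    print(profile, len(group), group[features].mean().round(1).to_dict())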


Analytics dashboard, courtesy of University of Sydney

Conclusion

Implementing data analytics is a group activity, not the responsibility of one individual. And it all starts with creating a data inventory, because most universities already collect a great deal of data. Dashboard reporting offers a range of opportunities for faculties to craft tailored learning experiences for students with their outcomes in mind, especially in the context of flipped classrooms.

“Think in advance of a solution that provides functionality for different stakeholders,” Professor Pardo says. “Not only to upper management, but curriculum design, course coordinators and instructors.”

Aligning the analytics strategy with the pedagogical objectives will enable students and staff to benefit from personalised feedback.
