Case Study: Dr Hazel Farrell of Waterford Institute of Technology gives an overview of how she uses low-stakes, weekly quizzes to inform her teaching practice and to identify students who may be experiencing difficulty.
Case Study: Dr Cormac Quigley and Dr Etain Kiely of Galway-Mayo Institute of Technology show how they are using everyday tools to enable at-scale, personalised feedback to large student cohorts.
This study was conceived as part of a broader effort by the Institute to understand more clearly the profile of its student body and, specifically, to identify factors that may negatively impact student retention and progression. The Institute also sought to build a more robust evidence base on which to plan proactive initiatives addressing retention and progression at institutional, department, course and module level. In particular, it recognised the need for a more evidence-based understanding of the retention and progression challenges at module and programme level in order to inform more effective and targeted deployment of resources by faculty and by central services at the earliest possible stage in the programme lifecycle.
Recognising the impact of attrition on students, UL tracked first-year attrition rates over a seven-year period (2005-2013) and identified an increase averaging 13%. In addition to the costs and missed opportunities incurred by students who leave college early, this pattern also represented a significant loss of revenue to the University. In order to fully understand the issues affecting first-year UL students, the University sought to identify best practices to support first-year engagement both inside and outside the classroom, to leverage the value of learner data as a resource, to identify students at risk of withdrawing prematurely, and to develop an effective and meaningful Student Engagement Policy.
LYIT continuously strives to enhance the experience of its students. A key goal in this process is improving student retention, in which in-class attendance has been found to play a significant role. Attendance was traditionally recorded on a paper-based system, with data manually compiled and digitised, a process that proved both cumbersome and time-consuming. Recognising that retention reporting would be best served by a user-friendly approach, one that would eliminate the need for paper-based documents, allow lecturers to create records and enable the transfer of data to administration staff for timely reporting, senior managers within LYIT sought the development of a bespoke digital attendance monitoring system.
IT Blanchardstown has long recognised the value of data in enabling an evidence-based approach to decision-making. Given an increasing interest in maximising the power of that data, but cognisant of the importance of using it ethically and sustainably, the Institute established the need for an institute-wide policy and strategy that would set boundaries and guidelines for operationalising learning analytics initiatives. This was a necessary first step in ensuring the appropriate use of student data for learning analytics and in collaboratively defining a roadmap for implementing data-enabled student success initiatives.
Coordinating module assessments is a recognised challenge of semesterised programmes. Within this context, DCU found that lecturers working independently to design and manage their module assignments could lead to a lack of coordination across programmes, potentially increasing student anxiety, particularly at key times. Responding to this challenge, and recognising the value of assessment data as a means of understanding students’ progress through their programme, DCU set out to develop a means of capturing, analysing and reporting on assessment data in order to manage students’ workloads effectively and to maximise the value of learner data as a resource for identifying students at risk of non-progression.
The purpose of this resource is to give an overview of some of the themes that feature in a set of policy exemplars from international sources. Each topic is addressed in greater detail in the ORLA resource Institutional Guide to Developing Enabling LA Policies.
The fundamental role of Learning Analytics is to answer questions. These questions should provide actionable insights that institutions, teaching staff and students can use to drive effective change. Identifying the initial question to be addressed is a key first step, as it will dictate every other aspect of the institutional strategy.
This resource is intended to help institutions to review some questions they may wish to begin with as well as some of the data sources that may be used to provide answers. It is divided into questions designed to enable institutional insights, lecturer-facing insights and student-facing insights.
This resource should not be considered an exhaustive list, but rather a high-level guide to some of the popular applications of Learning Analytics. It should also be noted that the data recipes below are not the only means of answering the questions listed; they are merely suggested methods.
Institutions are also reminded that any use of personal data must be fully compliant with the conditions of the GDPR.
This institutional approach…
…is based in University College Dublin.
…was developed in-house.
…allows for the tracking of student interactions with services, with a view to improving efficiencies and enhancing student experiences.
…helps to inform institutional decision making regarding resource deployment.