5.1 Learning analytics and SoTL
Learning analytics and SoTL are closely intertwined. The learning analytics–SoTL partnership enables evidence-based analysis of learning and teaching trends to identify areas for interventions and innovations to improve student learning (Hubball et al., 2013; Muljana and Placencia, 2018).
Learning analytics can help guide the design of a SoTL inquiry and also facilitate its evaluation. Central to SoTL is the ability to provide evidence for the outcomes of the innovations and interventions implemented as a result of SoTL research. Learning analytics and SoTL therefore supplement and complement one another (Hian and Chong, 2014).
Let’s consider an example: learning analytics show that a majority of students on a module are not completing an online formative quiz (one that does not count towards the final assessment). A SoTL inquiry may investigate the reasons. Let’s say the SoTL research reveals that the formative nature of the quiz does not motivate students to complete it. To encourage engagement, a threshold of 40% on the quiz is then set as a requirement for passing the module. Learning analytics can subsequently provide further data to evaluate the difference the threshold has made to student engagement with the quiz.
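As a minimal sketch of how that evaluation might look in practice (the data, column names and completion figures below are hypothetical, not drawn from any real module), the completion rates of the presentations before and after the threshold was introduced can be compared directly:

```python
import pandas as pd

# Hypothetical quiz-completion records for two presentations of the
# module: one before and one after the 40% pass threshold was set.
before = pd.DataFrame({"student_id": [1, 2, 3, 4, 5],
                       "completed": [True, False, False, True, False]})
after = pd.DataFrame({"student_id": [6, 7, 8, 9, 10],
                      "completed": [True, True, False, True, True]})

# Completion rate = proportion of students who completed the quiz.
rate_before = before["completed"].mean()
rate_after = after["completed"].mean()

print(f"Completion rate before threshold: {rate_before:.0%}")
print(f"Completion rate after threshold:  {rate_after:.0%}")
print(f"Change: {rate_after - rate_before:+.0%}")
```

A real evaluation would use full cohort data, test whether any difference is statistically meaningful (for example, with a two-proportion test) and consider other differences between the two cohorts.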
Many educational institutions are adopting a data-informed culture for continuous improvement and for evidence-based teaching practices, guided by ethics and data-governance structures for the use of learning analytics (Rehrey et al., 2020). Other factors are also driving these changes, such as increased accountability in education and the growing use of technology in teaching (Bronnimann et al., 2018). Institution-wide data analytics teams generally help educators to identify the types of information that are relevant for SoTL, to locate the data (e.g. dashboards, reports), and to access the tools and training needed to analyse and interpret the data.
If you are a module or course leader then, depending upon your institution’s processes and learning analytics strategy, you may receive data about student performance on a regular basis. These data may highlight aspects for further investigation and the need for a SoTL inquiry: for example, why students are not performing well on a particular part of the module, or wiki usage patterns indicating that some students are not participating in a groupwork task.
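As a simple illustration of how such performance data might be interrogated (the data frame, column names and scores below are invented stand-ins for whatever export or dashboard your institution provides), a per-question summary can quickly surface the parts of an assessment where students struggle:

```python
import pandas as pd

# Hypothetical per-question assignment scores exported from a dashboard.
scores = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "question":   ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "score_pct":  [72, 35, 68, 41, 80, 38],
})

# Mean score per question: a persistently low mean may point to a part
# of the module worth investigating through a SoTL inquiry.
summary = scores.groupby("question")["score_pct"].agg(["mean", "count"])
print(summary.sort_values("mean"))
```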
The following are examples of interventions made in the OU’s Faculty of STEM in response to learning analytics. The modules are online, which enables a fine-grained approach to data analysis when investigating how students interact with online content.
- The low submission rate for the first assignment on a first-level Science module was improved by reducing the size and scope of the assignment, based on the submission data and feedback from tutors. The submission date was also moved from week 7 to week 6. The module team is planning similar changes to the second assignment to improve its submission rate too. The module team and learning design team will monitor submission rates for both assignments and investigate any factors, beyond the changes introduced, that could have had an impact.
- A declining trend in VLE usage was detected between weeks 9 and 12 of a second-level Biology module. While the module team was investigating the reasons, a bridging video was introduced to raise interest in the content of these weeks: ‘... [to] prepare and reassure students, and to dispel some of the anxiety around anatomical terminology related to the human nervous system’. There was, however, no change in the VLE engagement pattern even after the bridging video was introduced. The team is now considering a change to the timing of assessment, and a checkpoint between weeks 8 and 12, to boost engagement with the VLE (a sketch of this kind of trend detection follows this list).
- On an introductory Physical Sciences module, students were not engaging with the Python (a programming language) content. To improve engagement, the module team took several actions, including updating text in the weeks in which Python is taught, especially to clarify how to study Python (e.g. by taking notes) and to give more specific guidance on the activities, and producing an additional video demonstrating how to build up larger Python programs. Student engagement on the module’s Python discussion forums has since increased. The module team is monitoring the effect of these changes and considering further changes based on the learning analytics and their evaluations.
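Detecting the kind of sustained decline in VLE usage described in the Biology example might, as a rough sketch, look like the following (the weekly visit counts and the 5% threshold are invented for illustration):

```python
import pandas as pd

# Hypothetical weekly counts of students visiting the module's VLE site.
vle = pd.DataFrame({
    "week": range(1, 13),
    "active_students": [410, 405, 398, 390, 385, 380,
                        372, 365, 340, 310, 285, 260],
})

# Week-on-week percentage change in active students.
vle["pct_change"] = vle["active_students"].pct_change() * 100

# Flag weeks that end a run of three consecutive drops of more than 5%,
# the sort of sustained decline seen between weeks 9 and 12.
drops = (vle["pct_change"] < -5).astype(int)
vle["sustained_decline"] = drops.rolling(3).sum() == 3
print(vle.loc[vle["sustained_decline"],
              ["week", "active_students", "pct_change"]])
```

In practice, a module team would check any flagged pattern against assessment dates, holidays and cohort size before drawing conclusions from it.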
In a new third-level Natural Sciences module, the analytics showed that 75% of students were employed and that 50% of the whole cohort were in full-time employment. There was concern about the impact of high study intensity on students studying multiple modules.
The high workload of specific students was flagged to their tutors, along with the assessment dates of the three modules most likely to be studied concurrently, so that tutors could avoid creating clashes when allowing Tutor Marked Assignment (TMA) extensions. Using the VLE data, the module team was also able to identify students who had not visited the module site within the ten days prior to the last TMA submission date, and to flag this to their tutors. While there was no previous presentation of the module to serve as a benchmark for these actions, the learning analytics enabled a proactive approach to talking with students and tutors.
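A sketch of that inactivity check, assuming a hypothetical extract of last-visit dates from the VLE access logs (the column names, dates and deadline are illustrative only):

```python
import pandas as pd

# Hypothetical last-visit dates per student, extracted from VLE logs.
visits = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "last_visit": pd.to_datetime(
        ["2024-03-01", "2024-02-10", "2024-02-28", "2024-02-05"]),
})

tma_deadline = pd.Timestamp("2024-03-04")

# Flag students with no VLE visit in the ten days before the TMA deadline.
cutoff = tma_deadline - pd.Timedelta(days=10)
inactive = visits[visits["last_visit"] < cutoff]

print("Students to flag to their tutors:")
print(inactive)
```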
These examples illustrate that learning analytics can guide the formulation of SoTL inquiries and that, when monitored during SoTL research projects, they serve as useful data alongside those collected through other research methods, such as student interviews and questionnaires.