
Technology, Knowledge and Learning (2023) 28:981–998

https://doi.org/10.1007/s10758-023-09642-0

ORIGINAL RESEARCH

Mediating Teacher Professional Learning with a Learning Analytics Dashboard and Training Intervention

Manisha Khulbe1 · Kairit Tammets1

Accepted: 11 February 2023 / Published online: 23 February 2023


© The Author(s), under exclusive licence to Springer Nature B.V. 2023

Abstract
Insights derived from classroom data can help teachers improve their practice and students' learning. However, a number of obstacles stand in the way of widespread adoption of data use. Teachers are often sceptical about the usefulness of data. Even when willing to work with data, they often do not have the relevant skills. Tools for analysis of learning data can, theoretically, aid teachers in data use, but often fall short of their potential, as they are commonly designed without reference to educational theory and rarely consider end-users' needs. Keeping these challenges in mind, we designed a professional development program that aimed at, among other things, improving teachers' beliefs regarding data and their data literacy skills. After the training, we found that teachers had more positive attitudes regarding data. However, some data literacy skills proved quite difficult to learn. We present and analyse our intervention here and put forward a proposal for improving the effectiveness of data use interventions by leveraging theory-based Learning Analytics (LA) dashboards as mediating tools that scaffold teachers' acquisition of new knowledge and skills during and beyond the intervention.

Keywords Teachers’ data use · Data use intervention · Learning analytics · Learning
analytics dashboard · Teacher-facing dashboard · Engagement

1 Introduction

The use of educational technology is increasingly common in the K-12 context and has made possible the collection of a variety of data which can be used for improving teachers' practice and students' learning. Around the world, teachers are increasingly expected to rely on data in their decision-making (Prestigiacomo et al., 2020). However, teachers need to acquire new pedagogical, technological, and data-related knowledge and skills to effectively use data (Mandinach & Gummer, 2016). For this, teacher training and preparation are necessary, but both pre-service instruction and in-service professional development have so far been slow in preparing educators to work with data (Bocala & Boudett, 2015; Henderson & Corry, 2020), and literature regarding data use interventions for in-service teachers indicates that improvements are needed in their design and implementation (Hoogland et al., 2016; Ebbeler et al., 2017). Learning Analytics (LA) tools can aid teachers in data use, but these tools typically fall short in several ways, such as a lack of grounding in theory (Gašević et al., 2015; Wise & Shaffer, 2015), and not accounting for teachers' needs during the design process (Buckingham Shum et al., 2019).

Manisha Khulbe
manisha@tlu.ee

Kairit Tammets
kairit.tammets@tlu.ee

1 Tallinn University, Tallinn, Estonia
To address the above-mentioned challenges in the context of in-service teachers, we
designed a professional development program that aimed at, among other things, enhancing teachers' pedagogical and digital literacy skills, improving their attitudes towards data,
and gathering insights about teachers’ experiences during the training to improve the data
intervention. Based on our findings, we propose that data interventions can be more effective
if teachers have access to theory-based data analysis tools designed specifically to support
teachers in improving their pedagogical content knowledge and data use knowledge. So far,
the potential of LA dashboards in mediating teachers’ learning has not been explored, but
the idea that well-designed tools could foster teacher learning, especially during data use
interventions but also in other contexts, is supported by the trialogical learning approach
formulated by Paavola et al. (2012). We present our method for ensuring LA dashboards
conform to theory and thus provide useful data analysis and scaffolds to users. In designing
and evaluating the intervention, we were guided by the following research questions:

RQ1 To what extent did the training program support the development of teachers' understanding of the importance of educational data in gauging student engagement during mathematics lessons?

RQ2 To what extent did teachers’ data literacy change over the course of the intervention?

RQ3 Which design principles should be considered when designing data literacy interventions for teachers?

2 Background and Related work

2.1 Teachers’ Data Literacy and Beliefs About Data

In the context of K-12 education, data can be broadly defined as "information that is collected and organized to represent some aspect of schools" (Lai & Schildkamp, 2013). This includes both quantitative and qualitative data and is not limited to only assessment data but encompasses attendance, behavioural reports, teacher feedback, student exit tickets,
parent-teacher interactions, etc. Currently, judicious and systematic data use is increasingly
expected from teachers: the idea of grounding pedagogical actions in evidence rather than
instinct is inherently appealing to researchers and policy makers, and a number of studies
also show that data use can result in improved student performance (Carlson et al., 2011;


McNaughton et al., 2012). To effectively integrate data into their everyday practice, teachers
must acquire data literacy skills.
Many definitions and frameworks exist for elaborating upon the concept of data literacy, but one widely accepted definition describes it as a teacher's "ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data (assessment, school climate, behavioral, snapshot, longitudinal, moment-to-moment, etc.) to help determine instructional steps. It combines
an understanding of data with standards, disciplinary knowledge and practices, curricular
knowledge, pedagogical content knowledge, and an understanding of how children learn”
(Mandinach & Gummer, 2016).
While data literacy is an important consideration, teachers’ beliefs about data are also
worth exploring. Mandinach and Gummer (2016) do not include “dispositions, habits of
mind, or factors that influence data use” in their data literacy framework but acknowledge
their importance. Pre-existing beliefs about data and learning can affect teachers’ willing-
ness to engage with data. For example, Jimerson (2014) explored teachers’ and school lead-
ers’ mental models for data use and found that while a majority of school leaders were
enthusiastic about data use, teachers viewed it more negatively as they associated it with
punishment and accountability rather than attempts at improving their practice as teachers.
This resulted in reluctance to use data. Actual change in practice requires a change in mindset and school culture, with numerous studies indicating that it is important for teachers to first understand that data use can benefit their practice (Hamilton et al., 2009; Miller, 2009). Jimerson (2014) suggests that teacher beliefs about data use are related to their mental models, which are quite rigid though not fixed. Formal training and personal and vicarious experiences can encourage reconstruction of mental models and improve beliefs regarding data.

2.2 Data Literacy Interventions for In-Service Teachers

To support teachers in integrating data use into their practice, a variety of interventions have
been proposed. For example, the model proposed by Hansen and Wasson (2016) - Teacher
Inquiry into Student Learning - aims to guide teachers as they undertake inquiry into student learning by defining research questions, creating and implementing relevant learning
scenarios, and then reflecting upon and modifying them with the help of LA data from
tools which they participated in designing. Employing data teams in teacher professional development has also been proposed, to support teachers in working collaboratively on data to enhance students' learning (Schildkamp et al., 2019). The Lesson Study approach has also been implemented widely as an intervention to support teachers' learning and professional development and has shown positive effects on teachers' practice (Vermunt et al., 2019).
The results of these data use interventions are encouraging, but also indicate that further experimentation could lead to better results. First, teachers' data literacy skills do not always
improve satisfactorily after training. For example, Kippers et al. (2018) found that though their intervention participants' data literacy improved, their average score of 11.5 out of a maximum of 25 on the data literacy test indicated that substantial room for improvement remained. This could be
an indication of the inherent complexity of the knowledge and skills teachers are required to
acquire for effective data use. Prolonged and more intensive support could lead to improved
outcomes. Second, researchers have observed that teachers prefer training programs that


focus specifically on their classroom context and needs to programs that address data use
in a more general manner (Hoogland et al., 2016). This challenge is related to intervention
design, and may be solved by providing differentiated, targeted training that involves active
participation from teachers and application and testing of new knowledge and skills in the
authentic context of their own classroom. Third, it has been reported that some teachers,
despite participating in data use interventions, do not think that evidence-based reflection
on their own work can improve their practice, and instead prefer to rely on their “years of
experience” and are likely to attribute class performance to the students’ inherent ability
(Schildkamp & Kuiper, 2010; Wayman et al., 2012). As mentioned earlier, such beliefs are
rigid but can be influenced by personal and vicarious positive experiences with data.
A final challenge is not related to intervention design, but inherent in working with data:
teachers report recognising the usefulness of data while participating in training events, but
also acknowledge that their packed schedules will not permit time for exploration of data
after the intervention (Ebbeler et al., 2017). Theoretically, this problem can be addressed by
making data analysis tools available to teachers. In practice, however, such tools currently
have their own shortcomings which reduce their usefulness for teachers. We describe these
shortcomings and possible solutions in the next section.

2.3 LA Dashboards and Teachers’ Data Use Practices

Students interact with digital tools frequently in classrooms today, and educational researchers are interested in using the data thus generated to better understand student learning. Literature describing innovative LA dashboards has been burgeoning in recent years, and basic analytics features are also built into widely used Learning Management Systems (LMSs).
LA researchers have been exploring the ways in which data can help teachers. Wise and
Jung (2019) describe four main directions of research in instructor-facing analytics. The
first strand relates to teacher inquiry and reflection upon instructional practice. The second
strand involves use of data to improve learning design. The third category is concerned with
providing teachers with real-time data for monitoring and orchestration purposes. Finally,
macro-level analytics are used for improvements at the institutional level. Unfortunately,
easy data availability and reporting in the classroom has not had a transformative effect on
teachers’ data use practices outside of researcher-managed interventions, with a number of
factors contributing towards this lacklustre scenario.
First, it has been reported that even though teachers have access to an ever-increasing
amount of student data and data analysis tools, this data is not always relevant to the questions they are asking (Morgan & Killion, 2018). This situation often arises because teachers
are rarely consulted regarding their classroom data needs during the creation of LA tools
(Avramides et al., 2014). In response to this problem, the practice of co-creation has been
emphasised, which involves users in the design process (Dollinger & Lodge, 2018).
Second, concerns have been expressed that the link between LA and educational theory
has been rather weak, and the resulting atheoretical, contextless collection, analysis and
reporting of any kind of student data cannot enhance educational practice and research
(Gašević et al., 2015; Wise & Shaffer, 2015). This gap between learning theory and aspects
of LA is well-documented. Jivet et al. (2017) explored whether and how the development of student-facing LA dashboards has been grounded in educational theory, and after studying 95 relevant papers reported that only 26 of these referred to the theoretical concepts


employed during the design of the dashboard. Similarly, Matcha et al. (2020), who focused on recent papers about LA dashboards designed to enhance self-regulation of learning, found that 20 out of 29 papers made no mention of established theory. The authors of these reviews
argue that LA research has largely focused on leveraging any readily available data, and
recommend that instead, researchers should focus on developing LA systems in which the
collection, analysis, and reporting of data are governed by sound educational theory.
Third, a review of 55 papers about LA dashboards found that most relied solely on log data (Schwendimann et al., 2017), even though relying on a single source of data in the complex environment of a K-12 classroom does not provide an accurate image of student activities and progress. Multimodal Learning Analytics (MMLA) is a response to this
problem and provides insights into learning processes that happen across multiple contexts
(physical and digital). Fourth, researchers have differentiated between exploratory visualisations and explanatory visualisations and emphasised that teachers should not be expected to undertake exploratory analysis, the cumbersome task of analysing large volumes of data
in order to understand their classrooms. Instead, it has been suggested that researchers and
designers need to support teachers in using data by developing dashboards that offer clear
actionable insights and prompts derived from the data (Sergis & Sampson, 2017).

2.4 Engagement

For the present study, we chose to focus on student engagement as the pedagogical-psychological construct that teachers should be aware of in their practice, as it has repeatedly been shown to be strongly linked to, and a reliable predictor of, achievement. While the importance of engagement is universally acknowledged, researchers differ when it comes to defining the construct and no one definition is widely agreed upon. According to the model rooted in self-determination theory that we have used in our work, engagement has four components: behavioural, emotional, cognitive, and agentic engagement (Reeve & Tseng, 2011).
Behavioural engagement involves students’ participation in learning and extra-curricular
activities. Indicators of behavioural engagement include time spent on learning activities,
homework completion, attendance, interaction with teachers and other students, etc. Cognitive engagement comprises interest in learning and self-regulation to master knowledge
and skills. Indicators include aiming for deep learning as opposed to superficial knowledge,
use of personalised and sophisticated learning strategies, planning and monitoring learning
activities, etc. Emotional engagement addresses the affective states of learners and whether
they are conducive to learning. Agentic engagement refers to how students alter learning
activities to better suit their needs and interests, and the indicator is active and constructive
rather than passive participation in learning activities.
The LA research community also considers engagement an important construct (Lu et al., 2017), and LA dashboards could provide teachers with insights about students' engagement and calls to pedagogical action (Koh & Tan, 2017). However, most of the research in this
field is focused on MOOCs and higher education, and little attention has been accorded
to studying the use of LA dashboards to support teachers’ own understanding of student
engagement.


3 Methodology

3.1 Research Design and the Context of the Study

This study is a part of a larger, ongoing project involving the design of Digital Learning
Resources (DLRs) for the K-12 classroom and development and evaluation of classroom
practices that employ them effectively. We utilise Design-based research (DBR, see Wang
& Hannafin, 2005) as a methodological framework, with each phase informing the one that
comes after. The first phase involved DLR creation and piloting, during which we found
that though the DLRs could support the use of more student-centred practices in the class-
room, teachers continued teaching in traditional, teacher-led ways. Therefore, the following
phases focus, among other things, on familiarising teachers with constructivist instructional
practices involving the DLRS and evaluating their effect on student outcomes. The second
and third phase of the project are the subjects of this study.
The setting of the second phase was a professional training program for mathematics teachers, the Teacher Innovation Laboratory, which was designed while keeping in mind the aforementioned challenges faced by data use intervention participants. The program had four objectives: first, to guide teachers in co-creating learning designs utilising available DLRs and other educational tools (mainly Desmos) to engage students in mathematics
lessons through student-centred instructional practices (this aspect of the intervention is not
addressed in detail here), second, to support the development of teachers’ willingness to use
data in their everyday practice with the help of process-oriented data collection tools, third,
to support teachers’ data literacy skills, and fourth, to provide insights regarding challenges
teachers faced when trying to acquire these data literacy skills.
The third phase involved drawing upon relevant literature and data use insights from the
second phase to enhance the effectiveness of future iterations of the intervention.

3.2 Participants

The participants were 21 high school mathematics teachers who responded to an open call for participation in a training program for learning how to make mathematics lessons more student-centred and engaging. The teachers were aware that data use to evaluate the effectiveness of their new way of teaching would be part of the training, but the call for participation did not describe the training as focused primarily on data use. Twenty of the 21 participating teachers were female. Five teachers had less than 10 years of teaching experience, three had been teaching for between 11 and 20 years, six had between 21 and 30 years of teaching experience, and the remaining seven had been in the profession for more than 30 years.

3.3 Procedure

The program ran for eight months and involved close collaboration among the 21 participating teachers and four university researchers and didactics experts. The participants met monthly to co-create student-centred classroom practices for mathematics lessons.


Fig. 1 Data use Intervention structure

Fig. 2 Teachers had access to students’ self-reported data about engagement

During the training days, participants were taught strategies for improving student
engagement in mathematics lessons and the signs of engagement and disengagement in
students. Next, lesson plans and tasks were co-created to foster classroom engagement. In
between these training days, teachers piloted the new teaching practices to engage students
in mathematics lessons, and at least twice a month, after one ordinary and one intervention
lesson, they used the web-application LAPills to collect self-reports from the students about
their classroom engagement and disengagement. LAPills was developed to help teachers
easily create questionnaires using YAML and administer them to students. A simple visualisation is automatically generated from the collected data, and teachers have immediate access to these visualisations (Fig. 2). The questionnaire administered to the students for
this purpose was adapted from multiple sources (Skinner et al., 2008; Wolters, 2004) and
required responses on the 5-point Likert scale.
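The paper does not show LAPills' internals, so as a minimal sketch of the aggregation step it describes — turning students' 5-point Likert self-reports into a simple per-item summary that a visualisation could display — the following Python illustration may help. The item wordings and scores are invented for illustration and are not LAPills' actual questionnaire or implementation.

```python
# Hypothetical sketch (not LAPills' actual code): aggregate 5-point Likert
# self-reports into per-item class means, the kind of summary a simple
# engagement visualisation could be built from.
from statistics import mean

def summarise_engagement(responses):
    """responses: list of dicts mapping item wording -> Likert score (1-5).
    Returns a dict mapping each item to its class mean, rounded to 2 places."""
    items = responses[0].keys()
    return {item: round(mean(r[item] for r in responses), 2) for item in items}

# Example: three students' self-reports on two invented items.
reports = [
    {"I tried hard in this lesson": 4, "My thoughts wandered off": 2},
    {"I tried hard in this lesson": 5, "My thoughts wandered off": 1},
    {"I tried hard in this lesson": 3, "My thoughts wandered off": 3},
]
print(summarise_engagement(reports))
# {'I tried hard in this lesson': 4.0, 'My thoughts wandered off': 2.0}
```

A teacher-facing view would then plot these per-item means, optionally split by ordinary versus intervention lessons, matching the comparison the teachers were asked to make.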
To encourage appreciation for and positive attitudes towards data, the importance of engagement was discussed, and it was stressed that unintentional observations in the classroom are not enough to gauge engagement. We also hoped that access to authentic classroom


data would foster teacher interest in data use. Guided by the researchers, teachers explored
different phases of data use and gained knowledge about what data should be collected,
when, and how. During the monthly meetings, data from randomly chosen classrooms were
analysed collectively in an attempt to aid teachers’ data interpretation skills. Pedagogical
responses to problems indicated by data were also discussed.
Every month, after a period of making sense of the collected data, the teachers answered
a questionnaire which required them to reflect upon their data use practice and experience
- how they tried to improve student engagement, the information they gleaned from the
visualisations about students’ engagement, and what they would do differently next time.
The teachers’ responses helped us identify how the intervention could be improved: we
decided to design a theory-driven, scaffolding LA dashboard to aid teachers in acquiring
data literacy skills during the intervention and after.

3.4 Data Collection and Analysis

All participants of the study, teachers and students, gave informed consent to take part in the study. They were made fully aware of their rights regarding their data and the purpose for which it could be used.

Teacher Reports Regarding Beliefs About Data Teachers' beliefs about data contribute greatly towards their willingness to use student data to reflect upon and improve their professional practice. To study teachers' beliefs about data use, we used a single-group pre-post research design (Field, 2013), and prior to and at the end of the intervention we collected teachers' responses to the teacher data use survey (Wayman et al., 2016). For this study, we focused on the scale that pertained to teachers' beliefs regarding data's effectiveness for pedagogy. We received responses to both the pre- and post-test from 12 of the 21 teachers. To analyse whether participants' beliefs changed during the intervention, a paired t-test was employed. To strengthen and support this quantitative analysis, which drew upon a rather small sample size, we also analysed teachers' written reflections about data use to gauge their beliefs about data. Seventeen teachers' reflections were evaluated using inductive coding, with both authors individually and then collaboratively categorising teachers' responses.

Teachers’ Data Literacy Skills Two sources of information were used to evaluate different
aspects of teachers’ data literacy: self-reports about data use skills and reflections on data
use.

● Data selection and collection skills were evaluated based on teachers’ self-reports.
This method was selected because though the teachers received training about sources
and types of data and data collection tools, they were not required to practice related
skills independently during the intervention. Before and after the intervention, teachers
reported, on a scale of 1 to 5, their confidence in their ability to choose relevant data and collect it in the context of the Innolab intervention. Responses were received to
both pre- and post-test from 17 participants, and these were evaluated for differences
using a paired t-test.
● To evaluate data interpretation skills and capacity to take appropriate pedagogical action in response, we analysed teachers' responses to open-ended questions about how they


employed engagement data collected via LAPills. Each month, the teachers answered questions which guided them in reflecting upon the data they collected with LAPills about students' engagement. Teachers were required to interpret the data and diagnose problems with engagement and also think of pedagogical responses suited to their students' needs. We received 53 responses to this questionnaire (administered via Google Forms). Of the 21 teachers who participated in the intervention, 17 completed the reflection form at least once over the span of eight months. Inductive coding was performed with the teachers' responses. Both authors first individually read and coded the teachers' responses several times before assigning them to different categories, and then compared their codes and categories in order to reach consensus. This analysis provided insights into teachers' data literacy and the features teachers would benefit from in a classroom monitoring tool that aids inquiry into student engagement.

4 Results and Discussion

4.1 Teacher Reports Regarding Beliefs About Data

The teachers, with researchers’ aid, co-created classroom practices to enhance students’
engagement in mathematics class, piloted the practices, and collected students’ self-reports
about their engagement. Not only were the teachers encouraged to interpret the results individually, but the results were also discussed together at the training sessions. This cycle was repeated 6–7 times over the course of the training, and it was expected that this clear link between new pedagogical practices and using data to evaluate them in their own classrooms would help teachers develop an appreciation for the usefulness of data.
On exploring teacher responses to the items from the teacher data use survey, we found that teachers' beliefs regarding data's effectiveness for pedagogy improved significantly from the pre-test (M = 3.75) to the post-test (M = 4.02), t(11) = -2.59, p = .02. The specific items teachers responded to were as follows:

● Data help teachers know what concepts students are learning.
● Data help teachers plan instruction.
● Data offer information about students that was not already known.
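The pre-post comparisons reported in this section rest on the paired (dependent-samples) t-test described in Section 3.4. As a minimal standard-library sketch of how such a statistic is computed — using invented pre/post scores that do not reproduce the study's data or its reported values:

```python
# Illustrative paired t-test, standard library only. The twelve pre/post
# belief scores below are invented for this sketch (mirroring the 12
# respondents), not the study's actual survey data.
from statistics import mean, stdev

def paired_t(pre, post):
    """Return (t, df) for a paired t-test of pre vs. post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / n ** 0.5)
    return t, n - 1

pre = [3.7, 3.3, 4.0, 3.5, 3.9, 3.6, 3.8, 3.4, 4.1, 3.7, 3.6, 3.9]
post = [4.0, 3.6, 4.1, 3.8, 4.2, 3.9, 4.0, 3.7, 4.3, 4.0, 3.8, 4.1]

t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")  # positive here because post > pre on average
```

The resulting |t| is then compared with the two-tailed critical value (about 2.20 for df = 11 at alpha = .05) to judge significance; the sign of t simply reflects whether pre or post scores were subtracted first.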

As only 12 participants' responses were available for the quantitative analysis described above, we also analysed teachers' monthly responses to open-ended questions about data use to check whether qualitative results aligned with our quantitative findings. More specifically, we focused on teacher responses to the question:

● “To what extent did you familiarise yourself with the LAPills results and what did you
learn from them?”

On exploring data from 17 teachers, we found that though one teacher clearly did not deem data a useful source of information about their students (“The results were not surprising for me at all. I already know the 11th grade is the weakest and least motivated in our


school”), most teachers wrote about how the data helped them notice several aspects of their
students’ engagement and the effects of their own practice (“[Student data showed me that]
both intervention and normal lessons have their own pros and cons.”; “I reviewed the results
of LaPills and compared them to the results of a regular class… It was noticeable that they
had a more interesting and nicer time [in the intervention lesson]… although I thought it
was not as well thought out as a regular lesson”; “It was pleasant to learn that most students
feel at ease in my classroom”). The teachers realised that the data provided some useful insights that they would otherwise have missed.
Taken together, the quantitative and qualitative results seem to indicate that our training program exposing teachers to data use has some potential to improve their attitudes towards using data in instructional practice.

4.2 Teachers’ Data Literacy Skills

To answer RQ2 (To what extent did teachers’ data literacy change over the course of the
intervention?), we attempted to gain a better understanding of whether and how teachers’
data literacy skills developed during the intervention. Different methods were used for
assessing four important data literacy skills, and our findings are presented and discussed
in this section.

Teachers’ Data Selection and Collection Skills Teachers responded to two questions related
to these skills:

● I know very well which data to collect to measure the effects of new Innolab pedagogical practices on my students’ learning.
● I know very well how to collect data to measure the effects of new Innolab pedagogical practices on my students’ learning.

Teachers reported improvements in their data selection skills, with statistically significant changes from the pre-test (M = 2.35) to the post-test (M = 3.12), t(16) = -3.05, p = .007. Statistically significant improvements were also found for data collection skills when comparing pre-test (M = 2.29) and post-test (M = 3.35), t(16) = -4.24, p < .001. These important skills form the foundation of inquiry using data and involve application of pedagogical, data use, and technological knowledge. The Innolab training helped teachers gain confidence in the use of these skills.
To evaluate teachers’ data interpretation skills and capacity to take pedagogical action
in response to data, we analysed their reflections upon data use. We placed emphasis on the
kind of data teachers focused on and the obstacles they encountered during the different
stages of data use. The teachers wrote descriptive answers to two questions related to different aspects of data literacy. More specifically, the questions related to:

● teachers’ use of the engagement questionnaire and interpretation of data (“To what
extent did you familiarise yourself with the LAPills results and what did you learn from
them?”).
● the pedagogical action teachers would take in response to the data they had collected
(“Based on the data, what would you do differently next time?”).


A few categories emerged during the qualitative analysis of the data and are described next.

Teachers’ Data Interpretation Skills Most teachers used the available data to accurately identify gaps in student engagement. Their responses showed that students’ cognitive engagement (specifically, use of appropriate learning strategies) was the focus for most teachers, with many stating that it was worrying that students did not try to express mathematical concepts in their own words (“Rephrasing new things with their own words still needs more working on” and “It caught my eye that… they don’t try to rephrase new material for themselves.”). It was also concerning for the teachers that students were unable to connect what was being learned to their prior knowledge. Problems with behavioural engagement were also observed by the teachers (“It is worrying that there is a lack of interest, and that their thoughts wander off”).

It was interesting that the teachers did not focus equally on all the available engagement subtypes. Instead, most teachers placed the greatest emphasis on students’ cognitive engagement. This outcome can be explained by the fact that cognitive engagement relates to students’ motivation and inner learning processes and strategies, which are not easily outwardly apparent, unlike behavioural engagement.

Taking Pedagogical Action in Response to Data Next, we turn to teachers’ responses about
the pedagogical action they would take to remedy the problems indicated by data. While
more teachers observed problems with cognitive engagement, only two teachers mentioned
specific strategies that could be used to foster a deeper learning experience. One of them
focused on having students rephrase concepts in their own words (“I think, in future exer-
cises, students should be required to rephrase things more frequently. Currently, when things
are discussed at the end of the class, some of the students don’t rephrase terms and don’t
perceive the importance of doing so. It should be regarded as individual work that is manda-
tory.”; “rephrasing new things with their own words still needs more working on.”), and the
other planned to encourage conceptual understanding through an end-of-lesson discussion
summarising lesson contents (“Next time I will try to have a quiz/discussion involving all
students … at the end of the lesson”). Other teachers’ responses about future pedagogical
actions were vague (e.g., “I should think lessons through even more, but that is very time-
consuming”). Two teachers stated that they were unsure of how to react to the information
about student engagement. These responses are in line with previous research (Datnow &
Hubbard, 2015; Marsh, 2012; Schildkamp et al., 2014) which indicated that taking appro-
priate instructional action is the data literacy skill that is the most difficult for teachers to
acquire.

Obstacles Faced by Teachers Finally, teachers also wrote about a number of obstacles to
engaging with the data. Two of the teachers did not familiarise themselves with the data
presented by LAPills at all on some occasions and wrote that performing other essential
tasks took up all their time. Two others tried to work with the data, but then discovered that
they were unable to interpret it in its given form. Of these, one thought the manner in which
the data was presented was complicated and suggested that the information would be more
useful for her if presented through diagrams. The other stated that she would have to create
diagrams of her own to better understand the data.


It is clear that while most of the teachers were able to interpret students’ self-reported
data to pinpoint where problems with engagement lay, data interpretation was difficult for
some even after training with university researchers, and nearly all struggled when it came
to deciding upon a course of action to take in response to the data. It is also important to
note that teachers work under significant time constraints that keep them from engaging
with data.

5 Supporting Learning During Data Use Interventions

The current data intervention included some elements that were previously identified in
literature as desirable:

● Seamless integration of training and practice: The teachers iteratively practised in their
own classrooms the new methods that had been introduced during training sessions and
tried to improve their lesson plans. Challenges faced during such implementation were
then analysed collaboratively.
● Combination of pedagogical, content, and technological knowledge: The intervention
specifically targeted mathematics teachers. This made it possible to focus on domain-
specific content knowledge and on pedagogical knowledge and skills. It also allowed
better collaboration because of the shared context of the teachers' practice.
● Reflective data-informed practice: An important aspect of the intervention was the
use of authentic data collected by the teachers themselves from their classrooms. We
hypothesised the use of such data would help bolster teachers’ appreciation of the use-
fulness of data in understanding and improving their own practice.

However, though there was increased understanding of the practical value of data among
participants at the end of the intervention, the capacity to combine new pedagogical knowl-
edge and information gleaned from data to plan future instruction was not satisfactorily
exhibited. To address this challenge, data use interventions should emphasise support for
acquiring and employing relevant pedagogical and data use knowledge. We suggest provid-
ing this support by making available to teachers theory-driven LA tools that are designed
to scaffold teachers’ learning by explicitly linking together relevant pedagogical constructs
(here, engagement), pedagogical knowledge acquired during the training, and classroom
data (Fig. 3). To elaborate, the pedagogical constructs that are the focus of the professional
development programme should be identified and relevant theory used to design scaffolding
LA tools. While trainers and colleagues support participants during training sessions, the
LA tools aid acquisition of pedagogical and data use knowledge and skills during individual
classroom practice. Further, discussions that happen during training sessions can lead to
insights that inform improvements in LA tool design and features and even relevant theory.
Teachers’ learning may be seen from a sociocultural perspective as the process by which
teachers adapt and internalise new knowledge and activities which were initially medi-
ated by experts, peers, and/or relevant artifacts (Johnson & Golombek, 2003). So far, little
research has focused on the potential of LA dashboards for aiding this process. Support
for experiments in this direction comes from the trialogical learning approach, which
suggests that when designed thoughtfully, technology can play an important role in
supporting "reflective mediation" by making knowledge explicit and setting the stage for
reflection to take place (Paavola et al., 2012). We believe that LA dashboards that are in
alignment with training objectives can enhance the effectiveness of professional learning
programs by supporting teachers' acquisition of new content and pedagogical knowledge
and facilitating and modelling inquiry into the impact of new practices on student outcomes.

Fig. 3 Professional Development mediated by an LA dashboard

6 Designing an LA Dashboard to Mediate and Scaffold Teachers' Learning

Based on the experiences of teachers who participated in the intervention and also draw-
ing upon literature regarding LA dashboards, we attempted to design an LA dashboard that
can aid teachers’ learning during professional development programmes and beyond. We
translated our design considerations into the guidelines presented below. The focus
here is engagement, but these guidelines should be equally applicable to other constructs.

● Theory-based LA tools can scaffold teachers' learning: Where possible, LA tools
should support teachers in acquiring new pedagogical and data use skills. Such support
can be implemented through strategic scaffolds which provide guidance on effective
ways to approach a task or solve a problem (Hannafin et al., 1999). We provide such
support using a “scaffolding layer”: notifications and explanations about pedagogi-
cally important information and prompts recommending useful pedagogical actions to
aid teachers’ developing pedagogical knowledge and data use skills. These scaffolds
should be grounded in relevant theory, which can ensure they are particularly useful
during training events, when teachers are introduced to and expected to work with new,
complex information. Over time, teachers might not need to rely on the scaffolds that
suggest pedagogical actions and should be able to use the LA tools with these scaffolds
disabled. Prompts emphasising important information, by contrast, are likely to remain
useful to teachers indefinitely, as they save time.
● Grounding LA tools in relevant theory: To ensure such scaffolding is possible and
also useful for teachers, the starting point in the design and development of LA tools
and dashboards should be theory. In the present study, this theory is student engagement
literature. For guidance in selecting and analysing variables, literature about indicators
of engagement should be referred to. Literature reporting interventions that have suc-
cessfully supported student engagement can be used to generate scaffolding prompts
suggesting relevant pedagogical actions in response to different classroom scenarios
indicated by the data.
● Multiple sources of data: Data should be collected from multiple sources to provide
teachers with a more complete understanding of student engagement. Engagement is
a complex construct which is challenging to interpret based on a single data source.
In our study, teachers were trained to work systematically with students’ self-reports,
but during the training sessions students’ interaction with the DLRs was also analysed
and combining self-reports and log data provided more meaningful insights about the
students’ engagement. In the future, our LA data will be sourced from DLR logs (time
spent on task, homework completion rate, accessing certain files and pages, etc.), DLR
meta-data, school attendance records, etc.
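As a concrete illustration of the multi-source principle, the sketch below cross-checks a student's self-reported engagement against a simple DLR log indicator. This is a minimal sketch, not part of the LAPills implementation; the function name and the time-on-task fields are our own illustrative assumptions, and only the 3.5 questionnaire threshold is taken from Table 1.

```python
def cross_check(self_report: float, time_on_task_min: float,
                expected_min: float) -> str:
    """Compare self-reported engagement (1-5 scale) with time on task from DLR logs.

    A single source can be misleading for a complex construct such as
    engagement, so disagreement between the sources is flagged for the teacher.
    """
    reported_engaged = self_report > 3.5                  # questionnaire threshold
    observed_engaged = time_on_task_min >= expected_min   # log-based indicator
    if reported_engaged and observed_engaged:
        return "consistent: engaged"
    if not reported_engaged and not observed_engaged:
        return "consistent: disengaged"
    return "sources disagree: inspect this student's data more closely"
```

For example, a student who reports high engagement (4.2) but spent only 5 of the expected 25 minutes on the tasks would be flagged for closer inspection rather than labelled engaged or disengaged outright.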

Table 1 helps illustrate these ideas through some concrete examples.

7 Conclusion

In this study we described our attempts at addressing data use obstacles in the K-12 context.
We claim that situated learning mediated by thoughtfully designed educational technology
and LA tools is vital for teachers to acquire the pedagogical knowledge and data use skills
necessary for efficient practice in the technology-enhanced classroom.
We designed an intervention to support teachers in creating and reflecting upon instruc-
tional methods to improve student engagement in a technology-rich mathematics classroom.
Teachers were aided by researchers in understanding and implementing the new pedagogi-
cal methods and also had access to some student engagement data to evaluate the impact of
these practices. We found that at the end of the intervention, most teachers reported more
positive attitudes towards data use. Relying on not only self-reports but also qualitative
analysis of the teachers’ reflection upon data, we found that the participants also exhibited
the ability to interpret data and draw meaningful information from it. We also realised that
not just the data intervention but also involvement in the co-creation of classroom practices
must have contributed to the development of teachers’ understanding of the LA data. Indeed,
participation in the co-creation of activities during which LA data is generated can enhance
the meaningfulness of LA innovations for users (Dollinger & Lodge, 2018).
A few teachers found data interpretation difficult, and a majority of the teachers could not
satisfactorily complete the essential task of choosing suitable pedagogical action in light of
the discoveries made with the help of student data. Our attention was also drawn to teachers'
statements about lack of time to engage with data. This indicated the need for a more
efficient design of our data intervention and encouraged us to create a more meaningful LA
dashboard for supporting teachers in understanding and enhancing the impact of technology-
enhanced classroom activities on students' engagement. We formulated guidelines for
designing a service grounded in sound educational theory and utilising data from multiple
sources, with the aim of not only informing teachers about how engaged the students are,
but also guiding them in making decisions to bolster student engagement by providing
timely recommendations for action.

Table 1 Examples of theory-based data analysis and scaffold generation

Behavioural engagement
Data sources: student self-reports, DLR logs, attendance records.
Indicators: self-reported behavioural engagement score is satisfactory (> 3.5); students
access and interact with DLRs (assigned content is accessed, tasks are submitted, average
time on tasks equals the time indicated by the creator, etc.); student attendance is at least 90%.
Examples of feedback: "Student x has not been completing assignments and reports low
behavioural engagement. You could try behavioural contracting – jointly outlining and
planning required behavioural changes and monitoring progress."; "In this class you used
a lot of individual tasks and it seems that some of the students were less engaged
behaviourally – you may want to design some collaborative tasks for the next time."

Cognitive engagement
Data sources: student self-reports, DLR logs and meta-data, meta-data from teachers'
lesson plans.
Indicators: self-reported cognitive engagement score is satisfactory (> 3.5); students
achieve a high score when interacting with digital tasks labelled as "higher-order thinking"
(> 80% of the results); students access optional study material (at least once); students use
hints and prompts (at least once).
Examples of feedback: "More than half the students report that they do not try to connect
new content to prior knowledge. It may be helpful to begin the topic with tasks that connect
new and prior knowledge."; "Requiring students to answer more open-ended questions so
that they formulate answers in their own words can aid student understanding of the topic
and teach them an effective learning strategy."; "Only 24% of the students achieved a high
score on tasks that require higher-order thinking – it may be helpful to design tasks at
different levels that foster students to argue, explain and compare."
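The indicator-to-feedback mapping in Table 1 can be read as a small rule set. The sketch below shows how such a scaffolding layer might generate prompts; the record type, field names, and prompt texts are our own illustrative assumptions, with only the thresholds (score > 3.5, attendance at least 90%, > 80% of higher-order tasks) taken from the table.

```python
from dataclasses import dataclass


@dataclass
class EngagementRecord:
    """One student's data, combining the sources listed in Table 1."""
    behavioural_score: float  # self-reported, 1-5 questionnaire scale
    cognitive_score: float    # self-reported, 1-5 questionnaire scale
    attendance_rate: float    # from attendance records, 0.0-1.0
    higher_order_rate: float  # share of "higher-order thinking" tasks solved, 0.0-1.0


def scaffold_prompts(r: EngagementRecord) -> list:
    """Return a feedback prompt for every indicator below its Table 1 threshold."""
    prompts = []
    if r.behavioural_score <= 3.5 or r.attendance_rate < 0.9:
        prompts.append("Low behavioural engagement: consider behavioural "
                       "contracting or more collaborative tasks.")
    if r.cognitive_score <= 3.5:
        prompts.append("Low cognitive engagement: begin the topic with tasks "
                       "that connect new content to prior knowledge.")
    if r.higher_order_rate <= 0.8:
        prompts.append("Few higher-order tasks solved: design tasks at several "
                       "levels that foster students to argue, explain, compare.")
    return prompts
```

Because each prompt is generated by an explicit rule, the action-suggesting prompts could later simply be filtered out for teachers who no longer need them, realising the idea of scaffolds that can be disabled over time.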
Though learning technologies are frequently used during teacher training and data use
interventions, the current study explores the novel idea of supporting teachers' learning
with the very tools they are encouraged to use in their everyday practice. Further, this
paper contributes to the field of LA research by providing guidelines for coupling pedagogi-
cal concepts/theory and students’ data and specific examples of the same in the context of
classroom engagement.
Our findings and recommendations have important implications for technology-enhanced
learning research and educational practice. Often, educational theory and novel educational
technology and its affordances do not make a significant impact on teachers’ everyday prac-
tice. Coupling professional learning of new pedagogical knowledge and skills with technol-
ogy as a mediator and learning scaffold is a holistic approach that can help bridge this gap
and promote the adoption of new instructional practices and technology in the classroom.
At the same time, this approach encourages theory-based co-creation of classroom practices


and provides insights about classroom realities that should be taken into account to further
improve educational technology.
In the upcoming phase of our research, we will further develop our application, adding
more scaffolds to support teachers in data interpretation, decision-making and choosing
appropriate instructional strategies based on theory and data. In a longitudinal study, we
plan to pilot this new version of our LA dashboard in classrooms to, first, understand how
it supports teachers in understanding and enhancing student engagement with the use of
educational technology, and second, conduct a follow-up exploration of learning manifested
as long-term changes in teachers' practice.

Acknowledgements This study was supported by Personal Research Grant (PRG) project PRG1634, Euro-
pean Union’s Horizon 2020 research and innovation programme under grant agreement No 856954 and
European Union’s Horizon 2020 research and innovation programme under grant agreement No 101004676.

Data Availability Not applicable.

References
Avramides, K., Hunter, J., Oliver, M., & Luckin, R. (2014). A method for teacher inquiry in cross-curricular
projects: Lessons from a case study. British Journal of Educational Technology, 46(2), 249–264. https://
doi.org/10.1111/bjet.12233.
Bocala, C., & Boudett, K. P. (2015). Teaching educators habits of mind for using data wisely. Teachers Col-
lege Record, 117(4), 1–20. https://doi.org/10.1177/016146811511700409.
Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics.
Journal of Learning Analytics, 6(2), 1–9. https://doi.org/10.18608/jla.2019.62.1.
Carlson, D., Borman, G. D., & Robinson, M. (2011). A Multistate District-Level Cluster Randomized Trial of
the Impact of Data-Driven Reform on Reading and Mathematics Achievement. Educational
Evaluation and Policy Analysis, 33(3), 378–398. https://doi.org/10.3102/0162373711412765
Datnow, A., & Hubbard, L. (2015). Teachers’ use of assessment data to inform instruction: Lessons
from the past and prospects for the future. Teachers College Record, 117(4), 1–26.
https://doi.org/10.1177/016146811511700408
Dollinger, M., & Lodge, J. M. (2018). Co-Creation Strategies for Learning Analytics. Proceedings of the
8th International Conference on Learning Analytics and Knowledge (LAK ’18), 97–101. https://doi.
org/10.1145/3170358.3170372
Ebbeler, J., Poortman, C. L., Schildkamp, K., & Pieters, J. M. (2017). The effects of a data use intervention
on educators' satisfaction and data literacy. Educational Assessment, Evaluation and
Accountability, 29, 83–105. https://doi.org/10.1007/s11092-016-9251-z
Field, A. (2013). Discovering statistics using IBM SPSS statistics. Sage Publications.
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: learning analytics are about learning. Tech-
Trends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x.
Hamilton, L., Halverson, R., Jackson, S. S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using
student achievement data to support instructional decision making. National Center for
Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S.
Department of Education.
Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments: foundations, methods, and mod-
els. In C. M. Reigeluth (Ed.), Instructional-design theories and models: a new paradigm of instructional
theory (II vol., pp. 115–140). Mahwah, NJ: Lawrence Erlbaum Associates.
Hansen, C. J., & Wasson, B. (2016). Teacher inquiry into student learning: The TISL Heart model and
method for use in teachers' professional development. Nordic Journal of Digital Literacy, 10, 24–49.
https://doi.org/10.18261/issn.1891-943x-2016-01-02.
Henderson, J., & Corry, M. (2020). Data literacy training and use for educational professionals. Jour-
nal of Research in Innovative Teaching & Learning, 14(2), 232–244. https://doi.org/10.1108/
JRIT-11-2019-0074.
Hoogland, I., Schildkamp, K., van der Kleij, F., Heitink, M., Kippers, W., Veldkamp, B., & Dijkstra, A. M.
(2016). Prerequisites for data-based decision making in the classroom: Research evidence and practical
illustrations. Teaching and Teacher Education, 60, 377–386. https://doi.org/10.1016/j.tate.2016.07.012.


Jimerson, J. B. (2014). Thinking about data: exploring the development of mental models for “data
use” among teachers and school leaders. Studies in Educational Evaluation, 42, 5–14. https://doi.
org/10.1016/j.stueduc.2013.10.010.
Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness is not enough: Pitfalls of learning
analytics dashboards in the educational practice. In Proceedings of the 12th European Conference on
Technology Enhanced Learning (pp. 82–96). Springer.
Johnson, K. E., & Golombek, P. R. (2003). Seeing teacher learning. TESOL Quarterly, 37(4), 729–737.
https://doi.org/10.2307/3588221
Kippers, W. B., Poortman, C. L., Schildkamp, K., & Visscher, A. J. (2018). Data literacy: What do educators
learn and struggle with during a data use intervention? Studies in Educational Evaluation, 56, 21–31.
https://doi.org/10.1016/j.stueduc.2017.11.001
Koh, E., & Tan, J. P. (2017). Teacher-actionable insights in student engagement: A learning analytics tax-
onomy. Proceedings of the 25th International Conference on Computers in Education, 319–325.
Lai, M. K., & Schildkamp, K. (2013). Data-based decision making: an overview. In K. Schildkamp, M. K.
Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp.
9–21). Dordrecht: Springer.
Lu, O. H. T., Huang, J. C. H., Huang, A. Y. Q., & Yang, S. J. H. (2017). Applying learning analytics for
improving students engagement and learning outcomes in an MOOCs enabled collaborative program-
ming course. Interactive Learning Environments, 25(2), 220–234. https://doi.org/10.1080/10494820.2
016.1278391.
Mandinach, E. B., & Gummer, E. S. (2016). What does it mean for teachers to be data literate: laying out
the skills, knowledge, and dispositions. Teaching and Teacher Education, 60, 366–376. https://doi.
org/10.1016/j.tate.2016.07.011.
Marsh, J. (2012). Interventions promoting educators’ use of data: research insights and gaps. Teachers Col-
lege Record, 114(11), 1–47. https://doi.org/10.1177/016146811211401106.
Matcha, W., Uzir, N. A., Gašević, D., & Pardo, A. (2020). A systematic review of empirical studies on learn-
ing analytics dashboards: a self-regulated learning perspective. IEEE Transactions on Learning Tech-
nologies, 13(2), 226–245. https://doi.org/10.1109/TLT.2019.2916802.
McNaughton, S., Lai, M. K., & Hsiao, S. (2012). Testing the effectiveness of an intervention model based on
data use: a replication series across clusters of schools. School Effectiveness and School Improvement,
23(2), 203–228. https://doi.org/10.1080/09243453.2011.652126.
Miller, M. (2009). Achieving a wealth of riches: delivering on the promise of data to transform teaching and
learning (policy brief). Alliance for Excellent Education.
Morgan, N., & Killion, J. (2018). Beyond barriers: encouraging teacher use of feedback resources. A report
from the teacher Feedback Resources Project. Learning Forward.
Paavola, S., Engeström, R., & Hakkarainen, K. (2012). The trialogical approach as a new form of mediation.
In A. Moen, A. I. Mørch, & S. Paavola (Eds.), Collaborative Knowledge Creation (pp. 1–14). SensePub-
lishers. https://doi.org/10.1007/978-94-6209-004-0_1
Prestigiacomo, R., Hunter, J., Knight, S., Martinez-Maldonado, R., & Lockyer, L. (2020). Data in practice:
a participatory approach to understanding pre-service teachers’ perspectives. Australasian Journal of
Educational Technology, 36(6), 107–119. https://doi.org/10.14742/ajet.6388.
Reeve, J., & Tseng, C. M. (2011). Agency as a fourth aspect of students’ engagement during learn-
ing activities. Contemporary Educational Psychology, 36(4), 257–267. https://doi.org/10.1016/j.
cedpsych.2011.05.002.
Schildkamp, K., Karbautzki, L., & Vanhoof, J. (2014). Exploring data use practices around Europe: Iden-
tifying enablers and barriers. Studies in Educational Evaluation, 42, 15–24. https://doi.org/10.1016/j.
stueduc.2013.10.007
Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: which data, what purposes,
and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. https://doi.
org/10.1016/j.tate.2009.06.007.
Schildkamp, K., Smit, M., & Blossing, U. (2019). Professional Development in the Use of Data: From Data
to Knowledge in Data Teams. Scandinavian Journal of Educational Research, 63(3), 393–411.
https://doi.org/10.1080/00313831.2017.1376350
Schwendimann, B. A., Rodríguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A.,
Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: a systematic literature review of
learning Dashboard Research. IEEE Transactions on Learning Technologies, 10(1), 30–41. https://doi.
org/10.1109/TLT.2016.2599522.
Sergis, S., & Sampson, D. (2017). Teaching and learning analytics to support Teacher Inquiry: a systematic
literature review. In A. Peña-Ayala (Ed.), Learning analytics: fundaments, applications, and trends (pp.
25–63). Springer.


Skinner, E., Furrer, C., Marchand, G., & Kindermann, T. (2008). Engagement and disaffection in the class-
room: part of a larger motivational dynamic? Journal of Educational Psychology, 100(4), 765–781.
https://doi.org/10.1037/a0012840.
Vermunt, J. D., Vrikki, M., van Halem, N., Warwick, P., & Mercer, N. (2019). The impact of lesson study pro-
fessional development on the quality of teacher learning. Teaching and Teacher Education, 81, 61–73.
https://doi.org/10.1016/j.tate.2019.02.009.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments.
Educational Technology Research and Development, 53(4), 5–23. https://doi.org/10.1007/BF02504682.
Wayman, J. C., Cho, V., Jimerson, J. B., & Spikes, D. D. (2012). District-wide Effects on Data Use in
the Classroom. Education Policy Analysis Archives, 20(25), 1–28.
https://doi.org/10.14507/epaa.v20n25.2012
Wayman, J. C., Cho, V., Mandinach, E., Supovitz, J., & Wilkerson, S. (2016). Guide to using the Teacher
Data Use Survey. U.S. Department of Education, Institute of Education Sciences, National Center for
Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia.
Wise, A. F., & Jung, Y. (2019). Teaching with analytics: towards a situated model of instructional decision-
making. Journal of Learning Analytics, 6(2), 53–69. https://doi.org/10.18608/jla.2019.62.4.
Wise, A. F., & Shaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of
Learning Analytics, 2(2), 5–13. https://doi.org/10.18608/jla.2015.22.2.
Wolters, C. A. (2004). Advancing achievement goal theory: using goal structures and goal orientations to
predict students’ motivation, cognition, and achievement. Journal of Educational Psychology, 96(2),
236–250. https://doi.org/10.1037/0022-0663.96.2.236.

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a
publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manu-
script version of this article is solely governed by the terms of such publishing agreement and applicable law.
