Learning Analytics in the Classroom presents a coherent framework for translating learning
analytics research into practical application across different education domains. Highlighting
the real potential of learning analytics as a way to better understand and enhance student learning,
and with each chapter including specific discussion of what the research means in the classroom, this
book provides educators and researchers alike with the tools and frameworks to make sense of
and use data and analytics effectively in their everyday practice.
This volume is split into five sections, each relating to a key theme in understanding learning
analytics through the lens of the classroom: theoretical perspectives; understanding learning
through analytics; learning design and analytics; analytics in the classroom; and implementing
analytics.
Bridging the gap between research, theory and practice, Learning Analytics in the Classroom is both a
practical tool and an instructive guide for educators, and a valuable addition to researchers' bookshelves. A
team of world-leading researchers and expert editors has compiled a state-of-the-art compendium on
this fascinating subject, which will be a critical resource for the evolution of the field into the future.
Jason M. Lodge is Associate Professor of Educational Psychology in the School of Education and
the Institute for Teaching and Learning Innovation at the University of Queensland, Australia. He is
affiliated with the Science of Learning Research Centre, funded by the Australian Research Council.
Jason’s research focusses on the cognitive and emotional aspects of learning, particularly in digital
learning environments.
Jared Cooney Horvath is a research fellow at St Vincent’s Hospital in Melbourne and the co-founder of the
Science of Learning Group – a team dedicated to bringing the latest in educationally relevant brain and
behavioural research to students and educators at all levels. He currently teaches at the University of
Melbourne; prior to this, he spent a number of years working as a teacher and curriculum developer for
several institutions around Los Angeles, Seattle and Boston.
Linda Corrin is Senior Lecturer in Higher Education in the Williams Centre for Learning Advancement in the
Faculty of Business and Economics, University of Melbourne. Linda’s research interests include students’
engagement with technology, learning analytics, feedback and learning design. Currently, she is working on
several large research projects exploring how learning analytics can be used to provide meaningful and timely
feedback to academics and students. Linda is co-founder of the Victorian and Tasmanian Learning Analytics
Network.
LEARNING ANALYTICS IN
THE CLASSROOM
Translating Learning Analytics Research
for Teachers
Typeset in Bembo
by Taylor & Francis Books
This book is dedicated to all the scholars, teachers, administrators,
technicians, designers and others who devote their lives to
improving student learning. As education becomes increasingly
politicised, your commitment to students and enthusiasm for their
learning are perhaps more important now than they have ever been.
CONTENTS
List of figures x
List of tables xii
Acknowledgements xiii
List of contributors xiv
PART I
Theoretical perspectives 9
2 Learning analytics and teaching: A conceptual framework for translation
and application 11
Gregory M. Donoghue, Jared Cooney Horvath and Jason M. Lodge
3 The perspective realism brings to learning analytics in the classroom 22
Kathryn Bartimote, Abelardo Pardo and Peter Reimann
PART II
Understanding learning through analytics 43
4 Supporting self-regulated learning with learning analytics 45
Jason M. Lodge, Ernesto Panadero, Jaclyn Broadbent and Paula G. de Barba
5 Identifying epistemic emotions from activity analytics in interactive
digital learning environments 56
Amaël Arguel, Mariya Pachman and Lori Lockyer
PART III
Learning design and analytics 69
6 Gathering, visualising and interpreting learning design analytics to
inform classroom practice and curriculum design: A student-centred
approach from the Open University 71
Tom Olney, Bart Rienties and Lisette Toetenel
7 Co-designing learning analytics tools with learners 93
Carlos G. Prieto-Alvarez, Roberto Martinez-Maldonado and
Theresa Dirndorfer Anderson
8 Connecting expert knowledge in the design of classroom learning
experiences 111
Kate Thompson, Sakinah S. J. Alhadad, Simon Buckingham Shum,
Sarah Howard, Simon Knight, Roberto Martinez-Maldonado and
Abelardo Pardo
PART IV
Analytics in the classroom 129
9 An analytics-based framework to support teaching and learning in a
flipped classroom 131
Jelena Jovanović, Dragan Gašević, Abelardo Pardo, Negin Mirriahi and
Shane Dawson
10 Opening the black box: Developing methods to see learning in
contemporary learning environments 152
Sarah Howard, Kate Thompson and Abelardo Pardo
11 Text analytic tools to illuminate student learning 165
Jenny McDonald, Adon Christian Michael Moskal, Cathy Gunn and
Claire Donald
12 Data-informed nudges for student engagement and success 185
Marion Blumenstein, Danny Y. T. Liu, Deborah Richards, Steve Leichtweis
and Jason M. Stephens
13 Supporting the use of student-facing learning analytics in the classroom 208
Linda Corrin
14 Using measures of pedagogical quality to provide feedback and improve
practice 221
Dan Cloney and Hilary Hollingsworth
PART V
Implementing analytics 245
15 Promoting learning analytics for tertiary teachers: A New Zealand case
study 247
Cathy Gunn and Jenny McDonald
16 Blurring the boundaries: Developing leadership in learning analytics 267
Deborah West, Henk Huijser and David Heath
The Australian Research Council provided funding for the editors to collate this volume as
part of a Special Research Initiative for the ARC-SRI Science of Learning Research Centre
(project number SRI20300015).
LIST OF CONTRIBUTORS
Jaclyn Broadbent is Associate Head of School (Teaching & Learning) in the School of
Psychology at Deakin University. Since completing her PhD in Psychology in December
2011, Jaclyn has run the School of Psychology’s largest unit, HBS110: Health Behaviour,
annually teaching more than 2,100 Faculty of Health students the skills and principles of
behaviour modification. Alongside this high teaching load, her research output shows a
clear upward trajectory, with 29 published or accepted papers in the four years from
2013 to 2017.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Tech-
nology Sydney, and Director of the Connected Intelligence Centre, which is charged with
building the capacity of UTS to benefit from analytics in teaching and learning, research and
business operations. He has been active in shaping the field of learning analytics, and is a co-
founder and former vice-president of the Society for Learning Analytics Research.
Dan Cloney’s expertise is in early education, cognitive development, academic achieve-
ment, and inequality. His current work focuses on the potential for high-quality early
childhood care and education (ECCE) programs to narrow SES-related development gaps.
Dan is Project Director of the Overcoming Disadvantage in Early Childhood Project, a study of
the longitudinal impact of ECCE programs on children’s language and literacy outcomes.
Dan is also a Technical Consultant for UNICEF on the Modelling of Universal Pre-Primary
Education Project in Indonesia.
Shane Dawson is the Director of the Teaching Innovation Unit and Professor of Learning
Analytics at the University of South Australia. Shane's research focuses on the use of social
network analysis and learner ICT interaction data to inform and benchmark teaching and
learning quality. Shane is a founding executive member of the Society for Learning Analytics
Research and past program and conference chair of the International Learning Analytics and
Knowledge conference. He is a co-developer of numerous open source software including the
Online Video Annotations for Learning (OVAL) and SNAPP, a social network visualisation
tool designed for teaching staff to better understand, identify and evaluate student learning,
engagement, academic performance and creative capacity.
Paula G. de Barba is a research fellow at the Melbourne Centre for Studies in Higher
Education. Her role focuses on research in the areas of educational technology, educational
psychology and learning analytics. More specifically, Paula is interested in the cognitive and
emotional influences on student learning, such as achievement motivation and self-regulated
learning, in digital learning environments.
Claire Donald is a lecturer and learning designer in the Centre for Learning and Research
in Higher Education (CLeaR) at the University of Auckland. Her research interests are in
science and engineering education, learning design, teacher beliefs, and learning analytics.
She teaches and provides professional development on course design, academic practice and
blended learning, and collaborates with university teachers in a range of disciplines on course
development and evaluation projects.
Gregory M. Donoghue is a learning science researcher and PhD candidate at the Science
of Learning Research Centre, Melbourne Graduate School of Education. As a former
police detective and child protection investigator, Gregory has decades of experience dealing
with victimised and traumatised children, and now centres his academic and research work on
how the learning sciences, including educational neuroscience, can enhance student learning
and wellbeing.
Dragan Gašević is Professor of Learning Analytics in the Faculty of Education and Adjunct
Professor in the Faculty of Information Technology at Monash University. He is a co-founder
and the immediate past president (2015–2017) of the Society for Learning
Analytics Research (SoLAR). His research is focused on conceptual, methodological, and
adoption aspects of learning analytics.
Cathy Gunn is an Associate Professor of Learning Technology at CLeaR (Centre for
Learning and Research in Higher Education) at the University of Auckland. CLeaR provides
leadership and promotes innovation in teaching and learning across eight faculties in New
Zealand’s largest research university. Cathy’s leadership roles at the Centre include Head of
eLearning, Deputy Director, Acting Director and Principal Researcher. She is an active
researcher and has produced more than 130 scholarly publications over 25 years in the higher
education sector. Cathy contributes to international professional societies and networks,
and is a former President and life member of Ascilite, Australasia’s learning technology
professional network.
David Heath has a background in social work and digital design. He is a senior digital media
designer in the Office of the Deputy Vice-Chancellor Education at RMIT University. Whilst
at RMIT, David has worked on a number of major projects, with a current focus on the
implementation of a new LMS. Prior to joining RMIT, David worked at Charles Darwin
University, where he taught in social work. He has also worked as a project officer and then
project manager on multiple Category 1 funded Learning and Teaching projects.
Hilary Hollingsworth has 30 years’ experience working in a wide range of national and
international educational contexts including schools, universities, research organisations, gov-
ernment education departments, and private education service organisations. Her expertise is in
teaching and learning, teacher education and professional development, classroom observation
frameworks and the use of video, teacher feedback, teaching quality, school improvement,
assessing student learning, and communicating student progress. Hilary has experience teaching
in early primary years and facilitates an early childhood researchers’ network.
Sarah Howard is a senior lecturer in Information and Communication Technologies in
Education, at the University of Wollongong. She is a member of the SMART Infrastructure
Facility and Early Start Research Institute. Her research focuses on cultural and individual fac-
tors of teachers’ digital technology use, educational change and learning design.
Henk Huijser has been a curriculum designer in the Learning and Teaching Unit at
Queensland University of Technology since September 2017. He has wide-ranging experience
in academic development in different higher education contexts, including in Australia, the
Middle East and China. Henk is the co-author of Problem-based learning into the future (published
by Springer in 2017) and he publishes extensively in a range of higher education related fields.
Jelena Jovanović is an Associate Professor at the Department of Software Engineering,
University of Belgrade, Serbia. She teaches undergraduate and postgraduate courses in pro-
gramming, applied artificial intelligence, and data analytics. Her research focuses on com-
bining data science methods with knowledge databases to better understand learning
processes. She is also interested in how insights obtained from analytics can be effectively
communicated to teachers and learners, as meaningful feedback.
Simon Knight is a lecturer at the Faculty of Transdisciplinary Innovation, University of
Technology Sydney, and a co-editor of the Journal of Learning Analytics. His research and
teaching focuses on the ways that people find, use and evaluate evidence.
Steve Leichtweis is the Head of the eLearning Group in the Centre for Learning and
Research in Higher Education (CLeaR) at the University of Auckland. His background
includes 15 years working in IT and a decade as an academic researcher in cardiovascular
physiology. His interests involve the use of digital technology to enhance learning and
teaching, learning analytics and learning environments.
Danny Y. T. Liu is a senior lecturer in the Educational Innovation team at the University of
Sydney. Danny is a molecular biologist by training, programmer by night, researcher and aca-
demic developer by day, and educator at heart. A multiple national teaching award winner, he
works at the confluence of learning analytics, student engagement, educational technology, and
professional development and leadership to enhance the student experience.
Lori Lockyer is Dean of the Graduate Research School at the University of Technology
Sydney and Honorary Professor at Macquarie University. Lori is an expert in educational
technology. Her research focuses on teaching and learning with technology in school, uni-
versity, and professional learning environments. Lori is interested in understanding the learning
processes and outcomes for learners engaged in technology-supported tasks – particularly col-
laborative and problem-based tasks.
Roberto Martinez-Maldonado is a full-time researcher in the areas of educational data
science and human-computer interaction at the Connected Intelligence Centre (CIC), and
data visualisation lecturer at the University of Technology Sydney (UTS), Australia. His
ongoing research takes a holistic perspective that spans across multiple disciplines. In parti-
cular, he is interested in the role of data and technology to empower people in areas such as
learning, health literacy and collaboration.
Jenny McDonald is a research associate at the Centre for Learning and Research in Higher
Education at the University of Auckland and co-developer of Quantext, text analytics
software for teachers. She is an experienced educational technologist and academic devel-
oper and has been involved in numerous cross-sector educational research and develop-
ment projects. Broadly interested in all aspects of teaching and learning with technology,
her research focus is on natural language processing for formative feedback and learning
analytics.
Negin Mirriahi is an academic developer (senior lecturer) at the University of South Aus-
tralia. She has extensive experience in managing, designing, implementing, and evaluating
innovative educational technology in higher education and in designing fully online, blen-
ded, and open courses. Her research projects focus on learning analytics to inform pedago-
gical change, video analytics for learning, videos in the curriculum, technology adoption,
blended and online learning, and academic staff development. One of her focus areas is
developing confidence and capability amongst academic staff to use learning analytics to
inform their teaching and course and curriculum design.
intelligence on the reuse of knowledge at the University of New South Wales she joined
academia in 1999. While she continues to work on solutions to assist decision-making and
knowledge acquisition, for the past decade, her focus has been on intelligent systems, agent
technologies and virtual worlds to support human learning and well-being.
Bart Rienties is Professor of Learning Analytics in the Institute of Educational Technology
at the Open University, UK. He is the programme director of Learning Analytics within IET
and the head of Data Wranglers, whereby he leads a group of learning analytics academics who
conduct evidence-based research and sense making of Big Data.
Jason M. Stephens is an Associate Professor in the School of Learning, Development and
Professional Practice at The University of Auckland. His primary research interests include
human motivation, moral development, cheating behaviour, and the promotion of academic
integrity during adolescence. He is a co-author of two books (Educating Citizens and
Creating a Culture of Academic Integrity) as well as numerous journal articles and other
publications related to these and other interests.
Kate Thompson is a senior lecturer in Educational Technologies in the School of Educa-
tion and Professional Studies at Griffith University. She is the Head of the Creative Practice
Lab and Co-Leader of the Teacher Education Research Program. Her research interests
include systems approaches to understanding learning and teaching, computer supported
collaborative learning, and, in particular, learning to be interdisciplinary.
Lisette Toetenel is currently lead instructional designer at a private bank in Zurich with
responsibility for e-learning globally. She used to lead the Learning Design team, within the
Institute of Educational Technology, at The Open University. Lisette's published work is
focused on learning design and learning analytics, along with social networking and interaction.
Lisette has delivered presentations, keynotes and master classes in commercial and
academic settings in the UK and abroad.
Deborah West is Pro Vice Chancellor (Learning and Teaching Innovation) at Flinders
University. With a disciplinary background in social work, her research has encompassed
both her discipline and learning and teaching. In recent years, her focus has been on the use
of technology to support learning and teaching and in particular learning analytics.
1
INTRODUCTION
Learning analytics in the classroom
Learning analytics burst into prominence as a field in the early 2010s. It has since grown
rapidly, with researchers around the world from a variety of disciplines engaging in the
learning analytics community through meetings, conferences and publications. Learning
analytics was initially defined as: ‘the measurement, collection, analysis and reporting of data
about learners and their contexts, for purposes of understanding and optimizing learning and
the environments in which it occurs’ (Long, Siemens, Conole & Gašević, 2011). The field
provides the basis for the power of data science methodologies to be brought to bear on
education. In this volume, we have brought together some of the leading scholars in learning
analytics to take one further step towards that goal. We collectively seek to understand how
learning analytics will have an impact in the classroom.
The field of learning analytics can no longer be thought of as new or emerging. With the
Learning Analytics and Knowledge Conference now well established, and journals such as the
Journal of Learning Analytics now several volumes into publication, the field is maturing. Despite
the rapid growth of the interdisciplinary endeavour that is learning analytics, there remains
much work to be done. In this edited volume, we seek to highlight the work of some of the
key researchers in the field with a particular emphasis on what the work means in practice.
As with the previous edited collection, From the laboratory to the classroom: Translating science of
learning for teachers (Horvath, Lodge & Hattie, 2017), this volume is an attempt to make sense
of a complex field of research for teachers and educational researchers.
As the field of learning analytics has grown, several distinct branches, or sub-fields, appear
to have emerged. The first, understandably, focuses predominantly on the data and
analytics side. This includes the aspects of the field traditionally associated with core aspects of
data science. The kinds of activities occurring in this branch include the development of means
of collecting and synthesising data, the construction of predictive models and the creation and
testing of algorithmic work. This is, admittedly, a high-level generalisation of the kind of
data science work occurring in the community; the aim here is to provide an overall
categorisation of the strands of activity rather than a comprehensive map.
The second broad area of activity involves the use of methodologies developed through
the first to better understand student learning (Lodge & Corrin, 2017). This sub-field is
engaged in using new forms of data developed by and through the learning analytics com-
munity to uncover aspects of learning that were previously difficult to capture, particularly in
real time. There is a strong overlap between this branch and research occurring in the cog-
nate disciplines such as the learning sciences and educational psychology (Gašević, Dawson &
Siemens, 2015). The increased use of data such as log files and the power of techniques for
analysing and modelling data across various instruments is providing insight into student
learning that has been largely limited in the past to crude behavioural data, self-report
instruments and complex psychophysiological measures. The examples presented here suggest
that using learning analytics to enrich the learning sciences and educational psychology
holds great potential and is a worthy endeavour.
While much of the work presented in this volume focuses on classroom application, the
contribution learning analytics is making to foundational research in multiple domains,
particularly computer science, human-computer interaction and the learning sciences,
should not be overlooked. This fundamental research is critical for moving the
field forward but is often forgotten in the discussions around the utility of learning analytics
in practice.
The third major sub-branch is the use of learning analytics in practice, the focus of this
volume. This is the sub-branch of the field that dominated its early focus. Significant effort in
early learning analytics work concentrated on innovations that would support student reten-
tion both in face-to-face learning environments (e.g. Arnold & Pistilli, 2012; Aguilar, Lonn
& Teasley, 2014) and online (Kizilcec, Piech & Schneider, 2013). In more recent times, both
the use of learning analytics to make predictions about student progress in the wild and the
development of analytics-based interventions have been key areas of progress. As opposed to
the high-level modelling used in earlier attempts at predicting when students might need
assistance, later efforts are now focussed on more sophisticated approaches. These efforts are
increasingly incorporating what is being learned from foundational research, as described in
the first two sub-branches, and an acknowledgement of the key role that learning design
plays in helping teachers to interpret the analytics in ways that impact practice (Lockyer,
Heathcote & Dawson, 2013; Bakharia et al., 2016).
Finally, there are obvious policy and ethical implications that are raised by the
increased collection and use of data generated about and by students. While we did not
seek contributions directly assessing and/or discussing the ethical implications of learning
analytics in the classroom, these issues undoubtedly permeate all the ideas and practices
outlined in this volume. We consider it critical that this is recognised here as a core
consideration for all work occurring in the field of learning analytics, despite it not being
a central focus of the volume.
There is undoubtedly emerging potential for learning analytics to fundamentally
alter practice and policy in education in the near future, particularly as machine learning
and artificial intelligence increase in power and availability. As the theory and methods in
learning analytics research become increasingly sophisticated, it is not hard to see educational
researchers and teachers being left behind as the technical aspects of the field advance. Our
aim in putting this volume together was to try to bridge that gap. Learning analytics has
been, and must remain, a transdisciplinary endeavour if the data generated are to continue to
be used to meaningfully enhance student learning (Lodge, Alhadad, Lewis & Gašević, 2017).
We have brought together a collection of chapters from some of the leading researchers in
learning analytics to help close this gap.
The second major part of the book focuses on understanding learning through analytics.
Much of the emphasis on learning analytics has been to understand student progression and
determine possible avenues for intervention. As is evident in Chapter 4 by Lodge, Panadero,
Broadbent and de Barba, there is great value in drawing on learning analytics data, such as
behavioural trace data, to better understand learning processes such as those involved in self-
regulated learning. Higher-level cognitive and metacognitive processes like those involved in
self-regulated learning need to be better understood in order to best use data to help facilitate
and enhance learning. Enhancing student self-regulated learning is a difficult undertaking
without the nuance that can be provided by a teacher, but some possibilities for using data to
do so are discussed in this chapter. Along similar lines, Arguel, Pachman and Lockyer discuss
the possibilities for understanding and responding to student emotion using data in Chapter 5.
Progress in the area of affective computing and an increased emphasis on emotions in education
mean that there is great potential for detecting and intervening when students exhibit various
emotions while they learn. Perhaps the most obvious example of this is when behavioural
traces suggest a student has reached an impasse in the learning process and is confused. Arguel
and colleagues cover how learning analytics can help better understand how student emotions
impact on learning in digital environments and how these environments can be created to
respond to these emotions.
We then turn to the relationship between learning design and learning analytics. The
relationship between these areas has been a strong area of focus in recent years (e.g. Mor,
Ferguson & Wasson, 2015). In Chapter 6, Olney, Rienties and Toetenel describe how
learning analytics and learning design have been used in combination in a large-scale imple-
mentation at the Open University in the UK. A key feature of the approach used in the
examples outlined in this chapter is that visualisations are provided to teachers who are then
able to interpret the data being generated by students through the lens of the learning design.
Olney and colleagues therefore provide a cutting-edge example of a large-scale imple-
mentation of learning analytics with learning design as the centrepiece. Prieto-Alvarez,
Martinez-Maldonado and Anderson extend the conversation about the use of design to
inform learning analytics interventions by discussing possibilities for co-design in Chapter 7.
In particular, they point out the power of co-designing learning analytics implementations
with students. This chapter therefore bridges the learning analytics community with other
trends such as the ‘students as partners’ initiative (e.g. Mercer-Mapstone et al., 2017). In
Chapter 8, Thompson and colleagues focus on the design of assessment in a graduate course
that strategically incorporates data. In particular, they emphasise the importance of an inter-
disciplinary approach, through which the implementation of a data-rich approach achieved
greater impact for students and for generating new insights into effective learning analytics
implementation.
The largest part of the book is devoted to chapters discussing specific examples of the
possible impact that learning analytics can and will have on education. The collection of
chapters in this part covers a wide range of contexts and implementation strategies. In the first
of these practice-focused chapters, Jovanovic and colleagues discuss the role learning analytics
can play in understanding and enhancing student learning strategies in flipped classes in
Chapter 9. Using a learning analytics framework and design research methodology, these
researchers have examined ways of enhancing self-regulated learning in flipped classes. This
chapter not only adds to the limited research to date on ways in which learning analytics
might be used to enhance learning in flipped classes, but also provides approaches that will be useful
beyond that setting. Along similar lines, Howard, Thompson and Pardo present research on
the use of learning analytics to support learning in a blended context in Chapter 10. These
researchers have used a minimally intrusive approach drawing on a diverse range of data from
physical and virtual environments to facilitate interactions in both settings. The consideration
of learning that occurs across digital and real-life classroom settings is vital, particularly given
most educational activities and tasks can now be described as blended due to the ubiquitous
nature of digital technologies in many aspects of learning.
Text and writing analytics have become an area of particular emphasis in the field of
learning analytics in the last few years. In Chapter 11, McDonald, Moskal, Gunn and Donald
examine the role language plays in learning. In particular, they outline how language can
illuminate the efficacy of student-teacher interactions. Their work demonstrates the power of
text-based analytics for helping to better understand these interactions and use the insights
derived from text-based data to enhance student learning.
Student engagement is often seen as a central concern in ensuring students successfully
transition into, through and out of university. In Chapter 12, Blumenstein and colleagues
present a set of case studies where analytics are used to inform interventions to ‘nudge’ stu-
dents towards academic success. Directly instructing students to engage in their studies can
often backfire, with students interpreting the intervention as patronising or punitive. The
framework provided by these researchers will prove useful in informing similar approaches
that draw on data to encourage, rather than demand, students to more meaningfully engage
with their studies.
In Chapter 13, Corrin explores the opportunities and challenges that arise from the
development of student-facing analytics tools. An increasing number of systems are emerging
with the potential to place the outcomes of learning analytics directly in the hands of
students. While this can offer many benefits in supporting students’ ability to monitor and adapt
their study strategies, there are also concerns about the impact on students who are less
capable of interpreting these data and determining appropriate actions. This chapter profiles
different classifications of student-facing learning analytics tools and offers insights into how
these can be designed, implemented and supported in the classroom.
Taking a slightly different perspective, Cloney and Hollingsworth discuss large-scale testing
of students and what the data generated from these tests might contribute to improving
practice in Chapter 14. Coming from a psychometric perspective, these researchers offer ways
of visualising large-scale datasets and how the lessons from the analysis of these data might
inform practice and data gathering at a more local level.
In the final part of the book, there are two chapters devoted to the process of
implementing analytics, taking a higher-level perspective on the challenges associated with doing
so. Learning analytics implementations do not happen in a vacuum and it is critical to con-
sider how best to equip teachers with the necessary skills and knowledge to work with and
make sense of data. In Chapter 15, Gunn and McDonald report on a nationally funded
project in New Zealand examining the conditions required for tertiary teachers to make use
of data generated by and about students. While the results were not encouraging in terms of
the preparedness of teachers to engage in this data-driven process, they present a set of case
studies to demonstrate what is possible when teachers work with others who have the
expertise to assist with learning analytics interventions.
Finally, West, Huijser and Heath discuss one of the most critical considerations in any
learning analytics intervention: leadership. In Chapter 16, these authors outline what is
6 Lodge, Horvath and Corrin
What the chapters of this book suggest about where learning analytics is going
Taken as a whole, the chapters in this volume reveal several key trends that suggest critical
areas of focus for the learning analytics community, and for those interested in learning
analytics beyond the research community. These key areas are outlined below.
References
Aguilar, S., Lonn, S. & Teasley, S. D. (2014). Perceptions and use of an early warning system during a
higher education transition program. Proceedings of the Fourth International Conference on Learning Analytics
and Knowledge (pp. 113–117). New York: ACM.
Arnold, K. E. & Pistilli, M. D. (2012). Course signals at Purdue: using learning analytics to increase
student success. Proceedings of the 2nd International Conference on Learning Analytics & Knowledge (pp.
267–270). New York: ACM.
Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gasevic, D., Mulder, R., Williams, D., Dawson, S.
& Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. In
T. Reiners, B. R. von Konsky, D. Gibson, V. Chang, L. Irving and K. Clarke (Eds.), Proceedings of
the 6th International Conference on Learning Analytics and Knowledge (pp. 409–413). New York: ACM.
Bennett, S., Agostinho, S. & Lockyer, L. (2017). The process of designing for learning: understanding
university teachers’ design work. Educational Technology Research and Development, 65(1), 125–145.
Corrin, L., Law, N., Ringtved, U. & Milligan, S. (2018). DesignLAK18: evaluating systems and tools
that link learning analytics and learning design (Workshop). In Companion Proceedings of the 8th
International Conference on Learning Analytics and Knowledge (pp. 423–427). New York: ACM.
Dollinger, M. & Lodge, J. M. (2018). Co-creation strategies for learning analytics. In Proceedings of the
International Conference on Learning Analytics and Knowledge, Sydney, Australia, March 2018 (LAK’18).
doi:10.1145/3170358.3170372
Gašević, D., Dawson, S. & Siemens, G. (2015). Let’s not forget: learning analytics are about learning.
TechTrends, 59(1), 64–71.
Goodyear, P. (2015). Teaching as design. HERDSA Review of Higher Education, 2, 27–50.
Horvath, J. C., Lodge, J. M. & Hattie, J. A. C. (2017). From the laboratory to the classroom: translating
science of learning for teachers. Abingdon, UK: Routledge.
Kizilcec, R. F., Piech, C. & Schneider, E. (2013). Deconstructing disengagement: analyzing learner
subpopulations in massive open online courses. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge (pp. 170–179). New York: ACM.
Laurillard, D. (2013). Teaching as a design science: building pedagogical patterns for learning and technology.
Abingdon, UK: Routledge.
Lockyer, L., Heathcote, E. & Dawson, S. (2013). Informing pedagogical action: Aligning learning
analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Lodge, J. M. & Corrin, L. (2017). What data and analytics can and do say about effective learning.
Nature: npj Science of Learning, 2(1), 4–5. doi:10.1038/s41539-017-0006-5
Lodge, J. M., Alhadad, S. S. J., Lewis, M. J. & Gašević, D. (2017). Inferring learning from big data: the
importance of a transdisciplinary and multidimensional approach. Technology, Knowledge & Learning,
22(3), 385–400. doi:10.1007/s10758-017-9330-3
Long, P. D., Siemens, G., Conole, G. & Gašević, D. (Eds.) (2011). Proceedings of the 1st International
Conference on Learning Analytics and Knowledge (LAK’11). New York: ACM.
Mercer-Mapstone, L., Dvorakova, S. L., Matthews, K., Abbot, S., Cheng, B., Felten, P. … & Swaim,
K. (2017). A systematic literature review of students as partners in higher education. International
Journal for Students as Partners, 1(1).
Mor, Y., Ferguson, R. & Wasson, B. (2015). Learning design, teacher inquiry into student learning and
learning analytics: a call for action. British Journal of Educational Technology, 46(2), 221–229.
Popenici, S. A. & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and
learning in higher education. Research and Practice in Technology Enhanced Learning, 12(1), 22.
PART I
Theoretical perspectives
2
LEARNING ANALYTICS AND TEACHING
A conceptual framework for translation and application
Introduction
At its essence, human learning is a complex neurological phenomenon. Ultimately, learning is
instantiated by changes to the organisation of the human nervous system (Kandel, 2000;
Gazzaniga, 2004), corollaries of which include more or less permanent changes in behaviour
(Mayer, 2003; Shuell, 1986). Education, on the other hand, is an even more complex socio-
cultural phenomenon that subsumes the biological, psychological, sociological and even
physical sciences. In other words, learning involves neurological processes, psychological
phenomena and sociocultural factors such as social interactions, cultural agents, relationships
and communication.
Learning analytics (LA) is an emerging field that purports to support, enhance, facilitate,
predict and measure human learning in educational settings. It achieves this largely by
measuring learning correlates (an array of phenomena that exist or emerge when learning occurs),
including, for example, reaction times, keyboard entries, mouse clicks, etc. More recently,
this has been expanded through the use of multimodal learning analytics that also
incorporate, for example, pupil dilation, pulse rate and sweat responses (Ochoa, 2017). It is important
to note, however, that the correlates of learning are not the same thing as learning itself (see
Gašević, Dawson & Siemens, 2015). For this reason, it is important to consider how the
former can (and cannot) meaningfully be used to assess the latter. In addition, while
collection of data from LA implementations often occurs in educational environments, making
meaning of these data still requires translation in order to be valid. As with scientific or
neuroscientific data, LA data also need to be understood within the context in which they were
produced in order to derive meaning both within and beyond that context. It is not sufficient
to rely solely on algorithms to abstract this meaning.
Inferring truth from one discipline to another – in this case, from LA to education – is
beset with problems of translation (see Lodge, Alhadad, Lewis & Gašević, 2017). These issues
constrain and often invalidate applications of knowledge from one discipline to another, a
topic that has previously been explored generally within science (Anderson, 2002; Horvath &
Donoghue, 2016) and specifically within educational neuroscience (Donoghue & Horvath,