
LEARNING ANALYTICS IN THE CLASSROOM

Learning Analytics in the Classroom presents a coherent framework for translating learning analytics research into practical application across different education domains. Highlighting the real potential of learning analytics as a way to better understand and enhance student learning, and with each chapter including specific discussion about what the research means in the classroom, this book provides educators and researchers alike with the tools and frameworks to effectively make sense of and use data and analytics in their everyday practice.
This volume is split into five sections, all of which relate to the key themes in understanding learning
analytics through the lens of the classroom:

•	broad theoretical perspectives
•	understanding learning through analytics
•	the relationship between learning design and learning analytics
•	analytics in the classroom and the impact it can and will have on education
•	implementing analytics and the challenges involved.

Bridging the gap between research, theory and practice, Learning Analytics in the Classroom is both a practical tool and an instructive guide for educators, and a valuable addition to researchers' bookshelves. A team of world-leading researchers and expert editors have compiled a state-of-the-art compendium on this fascinating subject, and it will be a critical resource for the evolution of the field into the future.

Jason M. Lodge is Associate Professor of Educational Psychology in the School of Education and Institute for Teaching and Learning Innovation at the University of Queensland, Australia. He is affiliated with the Australian Research Council funded Science of Learning Research Centre. Jason's research focusses on the cognitive and emotional aspects of learning, particularly in digital learning environments.

Jared Cooney Horvath is a research fellow at St Vincent's Hospital in Melbourne and the co-founder of the Science of Learning Group – a team dedicated to bringing the latest in educationally relevant brain and behavioural research to students and educators at all levels. He currently teaches at the University of Melbourne; prior to this he spent a number of years working as a teacher and curriculum developer for several institutions around Los Angeles, Seattle and Boston.

Linda Corrin is Senior Lecturer in Higher Education in the Williams Centre for Learning Advancement in the
Faculty of Business and Economics, University of Melbourne. Linda’s research interests include students’
engagement with technology, learning analytics, feedback and learning design. Currently, she is working on
several large research projects exploring how learning analytics can be used to provide meaningful and timely
feedback to academics and students. Linda is co-founder of the Victorian and Tasmanian Learning Analytics
Network.
LEARNING ANALYTICS IN
THE CLASSROOM
Translating Learning Analytics Research
for Teachers

Edited by Jason M. Lodge, Jared Cooney Horvath and Linda Corrin
First published 2019
by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
711 Third Avenue, New York, NY 10017
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2019 selection and editorial matter, Jason M. Lodge, Jared Cooney Horvath and Linda
Corrin; individual chapters, the contributors
The right of the editors to be identified as the author of the editorial material, and of the
authors for their individual chapters, has been asserted in accordance with sections 77 and 78
of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised in any
form or by any electronic, mechanical, or other means, now known or hereafter invented,
including photocopying and recording, or in any information storage or retrieval system,
without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered trademarks, and
are used only for identification and explanation without intent to infringe.
British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library
Library of Congress Cataloging in Publication Data
Names: Lodge, Jason M., editor. | Horvath, Jared Cooney, editor. | Corrin, Linda, editor.
Title: Learning analytics in the classroom : translating learning analytics research for teachers /
edited by Jason M. Lodge, Jared Cooney Horvath, Linda Corrin.
Description: Abingdon, Oxon ; New York,
NY : Routledge, 2019. | Includes bibliographical references.
Identifiers: LCCN 2018021778 (print) | LCCN 2018037812 (ebook) | ISBN 9781351113038 (ebook) | ISBN 9780815362111 (hbk) | ISBN 9780815362128 (pbk) | ISBN 9781351113038 (ebk)
Subjects: LCSH: Education–Research–Statistical methods. | Education–Research–Data processing. | Educational statistics.
Classification: LCC LB1028.43 (ebook) | LCC LB1028.43 .L42 2019 (print) | DDC 370.72–dc23
LC record available at https://lccn.loc.gov/2018021778

ISBN: 978-0-8153-6211-1 (hbk)
ISBN: 978-0-8153-6212-8 (pbk)
ISBN: 978-1-351-11303-8 (ebk)

Typeset in Bembo
by Taylor & Francis Books
This book is dedicated to all the scholars, teachers, administrators,
technicians, designers and others who devote their lives to
improving student learning. As education is becoming increasingly politicised, your commitment to students and enthusiasm for their learning are perhaps more important now than they have ever been.
CONTENTS

List of figures x
List of tables xii
Acknowledgements xiii
List of contributors xiv

1 Introduction: Learning analytics in the classroom 1
Jason M. Lodge, Jared Cooney Horvath and Linda Corrin

PART I
Theoretical perspectives 9
2 Learning analytics and teaching: A conceptual framework for translation
and application 11
Gregory M. Donoghue, Jared Cooney Horvath and Jason M. Lodge
3 The perspective realism brings to learning analytics in the classroom 22
Kathryn Bartimote, Abelardo Pardo and Peter Reimann

PART II
Understanding learning through analytics 43
4 Supporting self-regulated learning with learning analytics 45
Jason M. Lodge, Ernesto Panadero, Jaclyn Broadbent and Paula G. de Barba
5 Identifying epistemic emotions from activity analytics in interactive
digital learning environments 56
Amaël Arguel, Mariya Pachman and Lori Lockyer

PART III
Learning design and analytics 69
6 Gathering, visualising and interpreting learning design analytics to
inform classroom practice and curriculum design: A student-centred
approach from the Open University 71
Tom Olney, Bart Rienties and Lisette Toetenel
7 Co-designing learning analytics tools with learners 93
Carlos G. Prieto-Alvarez, Roberto Martinez-Maldonado and
Theresa Dirndorfer Anderson
8 Connecting expert knowledge in the design of classroom learning
experiences 111
Kate Thompson, Sakinah S. J. Alhadad, Simon Buckingham Shum,
Sarah Howard, Simon Knight, Roberto Martinez-Maldonado and
Abelardo Pardo

PART IV
Analytics in the classroom 129
9 An analytics-based framework to support teaching and learning in a
flipped classroom 131
Jelena Jovanović, Dragan Gašević, Abelardo Pardo, Negin Mirriahi and
Shane Dawson
10 Opening the black box: Developing methods to see learning in
contemporary learning environments 152
Sarah Howard, Kate Thompson and Abelardo Pardo
11 Text analytic tools to illuminate student learning 165
Jenny McDonald, Adon Christian Michael Moskal, Cathy Gunn and
Claire Donald
12 Data-informed nudges for student engagement and success 185
Marion Blumenstein, Danny Y. T. Liu, Deborah Richards, Steve Leichtweis
and Jason M. Stephens
13 Supporting the use of student-facing learning analytics in the classroom 208
Linda Corrin
14 Using measures of pedagogical quality to provide feedback and improve
practice 221
Dan Cloney and Hilary Hollingsworth

PART V
Implementing analytics 245
15 Promoting learning analytics for tertiary teachers: A New Zealand case
study 247
Cathy Gunn and Jenny McDonald
16 Blurring the boundaries: Developing leadership in learning analytics 267
Deborah West, Henk Huijser and David Heath

Appendix A: Discussion questions 284
Index 288
FIGURES

3.1 Learning analytics process model for teachers 26
5.1 Emotion transitions as observed in D’Mello and Graesser (2012) 58
6.1 Sample activity type profile histogram 81
6.2 Visualisation of first four weeks of module before revision session 84
6.3 Visualisation of first four weeks of module after revision 86
6.4 Longitudinal visualisation of learning design and average students’
engagement in the VLE each week for CS2, from Nguyen, Rienties,
Toetenel, Ferguson, and Whitelock (2017) 87
6.5 Longitudinal visualisation of learning design and average students’
engagement in the VLE each week for CS3, from Nguyen, Rienties,
Toetenel, Ferguson, and Whitelock (2017) 88
7.1 Support and inclusion added to the Design Thinking process for co-
design (Ellingsen, 2016) 95
7.2 Interaction design process and roles 96
7.3 Co-creating Personas 99
7.4 Student journey for learning analytics made by a group of learners for
our sessions 101
7.5 Six of our students engaged in a focus group session 103
7.6 Output of the knowledge mapping activity generated by learners in
one of our sessions 104
7.7 A group of our learners co-creating a low fidelity prototype for our
mobile application related to personal feedback 106
7.8 Tools and techniques for a co-design process used to involve MDSI
learners at UTS to design learning analytics means to support the
development of their graduate attributes 108
8.1 Design sketch 118
8.2 Design cycle 122
9.1 The proposed analytics-based framework for examining learning
strategies 135
9.2 An instance of the framework for detecting strategies and strategy
changes at the level of a course week 138
9.3 An instance of the framework for detecting strategies through learning
session analysis 142
9.4 Distribution of learning strategies per course week for each identified
student strategy profile 145
11.1 Three valid analyses of the sentence ‘The old English language teacher
grinned’ 168
11.2 Sorting responses by length – non-responses, ‘don’t know’ and similar
tend to float to the top 177
11.3 Example word cloud generated at https://worditout.com/ 178
11.4 Side-by-side comparison of word cloud from https://worditout.com
and keyword frequency from http://textalyser.net 179
11.5 Keyword in context (KWIC) analysis of the word ‘blood’ from the
sample text, showing the words appearing on either side 180
11.6 Word tree generated at https://www.jasondavies.com/wordtree
showing the word ‘blood’ in context 181
12.1 Overview of basic SRES functions 190
14.1 Wright map of multidimensional partial credit model of CLASS
Instructional Support 230
15.1 Learning analytics stakeholders and roles 251
15.2 Teacher awareness, ease of access and interpretation of online system
data 252
15.3 Awareness and use of learning analytics for specific purposes 253
16.1 The Let’s Talk Learning Analytics Framework 277
TABLES

2.1 Layers of complexity in human learning 13
2.2 Abstracted Layered Framework 15
2.3 Forms of valid translation 17
3.1 A taxonomy of educational analytics 31
6.1 OU Learner Experience Wheel words. Source: Open University UK
(2016) 77
6.2 Sample activity mapped using the Activity Type Classification
Taxonomy 80
8.1 Design of the core course 116
8.2 Information about design ideas 121
10.1 Data source, researcher, focus 157
12.1 Proxies for engagement and timely intervention 194
12.2 Synthesised framework for nudges to enhance engagement based on
data 201
14.1 Description of the indicators (dimensions) and factors (domains) of the
Classroom Assessment Scoring System 226
14.2 Orthodox description of the theoretical continuum of Instructional
Support 227
14.3 Illustration of the continuum of behaviours making up Instructional
Support using evidence from IRT analysis 232
14.4 Professional Learning Steps and Teacher Actions in the Structured
Stimulation of Teacher Reflection (SSTR) Process and Instructional
Support Continuum Example 236
15.1 Case studies summary 255
15.2 Example scenarios derived from case studies for students at risk (a) and
for revealing misconceptions (b) 264
ACKNOWLEDGEMENTS

The Australian Research Council provided funding for the editors to collate this volume as
part of a Special Research Initiative for the ARC-SRI Science of Learning Research Centre
(project number SRI20300015).
LIST OF CONTRIBUTORS

Sakinah S. J. Alhadad is a lecturer in Learning Innovation at Learning Futures at Griffith University. She is a psychological scientist with specialist expertise in learning analytics, research methods and statistics, and academic development. Sakinah's research interests sit at the research-practice nexus, with the broad goal of enhancing the practice of teaching and learning. Her ultimate aim is to enable positive change for learners and educators through transformative higher education practices.
Theresa Dirndorfer Anderson is Course Director of the Master of Data Science and
Innovation in the Connected Intelligence Centre of the University of Technology Sydney.
As an information ethicist, she is particularly interested in the interaction between creative
and analytic thinking and doing and in examining ways information systems and institutional
policies might better support both creative and analytic activities.
Amaël Arguel is a psychological scientist specialised in learning from digital technologies
and the Internet. Based in Sydney, at Macquarie University, he is a research fellow of the
Science of Learning Research Centre (a Special Research Initiative of the Australian
Research Council), where he is involved in a research stream on understanding learners’
confusion in digital environments.
Kathryn Bartimote is Head, Quality and Analytics within the Deputy Vice-Chancellor
(Education) Portfolio, and an honorary associate in the School of Education and Social Work
at the University of Sydney. In her leadership role, she oversees surveys of the student
experience and graduate outcomes, curriculum quality evaluation, and enabling learning
analytics across the institution. Her current research interests include motivation, self-regulation
and metacognition, epistemic beliefs, and research methodology.
Marion Blumenstein is a senior lecturer in the Centre for Learning and Research in Higher
Education (CLeaR) at the University of Auckland. Marion is a biologist by training with a PhD in
cancer research (Dr rer nat). In 2009, after working over a decade in biomedical research, she
switched to higher education research. Her special interest is learning analytics for the optimisation
of student learning, bringing together her passion for teaching and research.
Jaclyn Broadbent is an Associate Head of School (Teaching & Learning) in the School of Psychology at Deakin University. Since completing her PhD in Psychology in December 2011, Jaclyn has run the School of Psychology's largest unit, HBS110: Health Behaviour, annually teaching more than 2100 Faculty of Health students the skills and principles of behaviour modification. Despite Jaclyn's high teaching load, her research output relative to opportunity is excellent and shows a clear upward trajectory, with 29 published or accepted papers in the last four years (2013–2017).
Simon Buckingham Shum is Professor of Learning Informatics at the University of Tech-
nology Sydney, and Director of the Connected Intelligence Centre, which is charged with
building the capacity of UTS to benefit from analytics in teaching and learning, research and
business operations. He has been active in shaping the field of learning analytics, and is a co-
founder and former vice-president of the Society for Learning Analytics Research.
Dan Cloney's expertise is in early education, cognitive development, academic achievement, and inequality. His current work focuses on the potential for high-quality early childhood care and education (ECCE) programs to narrow SES-related development gaps. Dan is Project Director of the Overcoming Disadvantage in Early Childhood Project, a study of the longitudinal impact of ECCE programs on children's language and literacy outcomes. Dan is also a Technical Consultant for UNICEF on the Modelling of Universal Pre-Primary Education Project in Indonesia.
Shane Dawson is the Director of the Teaching Innovation Unit and Professor of Learning
Analytics at the University of South Australia. Shane's research focuses on the use of social
network analysis and learner ICT interaction data to inform and benchmark teaching and
learning quality. Shane is a founding executive member of the Society for Learning Analytics
Research and past program and conference chair of the International Learning Analytics and
Knowledge conference. He is a co-developer of numerous open source software tools, including the
Online Video Annotations for Learning (OVAL) and SNAPP, a social network visualisation
tool designed for teaching staff to better understand, identify and evaluate student learning,
engagement, academic performance and creative capacity.
Paula G. de Barba is a research fellow at the Melbourne Centre for Studies in Higher
Education. Her role focuses on research in the areas of educational technology, educational
psychology and learning analytics. More specifically, Paula is interested in the cognitive
emotional influences on student learning, such as achievement motivation and self-regulated
learning, in digital learning environments.
Claire Donald is a lecturer and learning designer in the Centre for Learning and Research
in Higher Education (CLeaR) at the University of Auckland. Her research interests are in
science and engineering education, learning design, teacher beliefs, and learning analytics.
She teaches and provides professional development on course design, academic practice and
blended learning, and collaborates with university teachers in a range of disciplines on course
development and evaluation projects.
Gregory M. Donoghue is a learning science researcher and PhD candidate at the Science of Learning Research Centre, Melbourne Graduate School of Education. As a former police detective and child protection investigator, Gregory has decades of experience dealing with victimised and traumatised children, and now centres his academic work and research on how the learning sciences, including educational neuroscience, can enhance student learning and wellbeing.
Dragan Gašević is Professor of Learning Analytics in the Faculty of Education and Adjunct
Professor in the Faculty of Information Technology at Monash University. He is a co-
founder and the immediate past president (2015–2017) of the Society for Learning
Analytics Research (SoLAR). His research is focused on conceptual, methodological, and
adoption aspects of learning analytics.
Cathy Gunn is an Associate Professor of Learning Technology at CLeaR (Centre for
Learning and Research in Higher Education) at the University of Auckland. CLeaR provides
leadership and promotes innovation in teaching and learning across eight faculties in New
Zealand’s largest research university. Cathy’s leadership roles at the Centre include Head of
eLearning, Deputy Director, Acting Director and Principal Researcher. She is an active
researcher and has produced more than 130 scholarly publications over 25 years in the higher
education sector. Cathy is an active contributor to international professional societies and
networks, a former President and life member of Ascilite, Australasia’s learning technology
professional network.
David Heath has a background in social work and digital design. He is a senior digital media
designer in the Office of the Deputy Vice-Chancellor Education at RMIT University. Whilst
at RMIT, David has worked on a number of major projects, with a current focus on the
implementation of a new LMS. Prior to joining RMIT, David worked at Charles Darwin
University, where he taught in social work. He has also worked as a project officer and then
project manager on multiple Category 1 funded Learning and Teaching projects.
Hilary Hollingsworth has 30 years’ experience working in a wide range of national and
international educational contexts including schools, universities, research organisations, gov-
ernment education departments, and private education service organisations. Her expertise is in
teaching and learning, teacher education and professional development, classroom observation
frameworks and the use of video, teacher feedback, teaching quality, school improvement,
assessing student learning, and communicating student progress. Hilary has experience teaching
in early primary years and facilitates an early childhood researchers’ network.
Sarah Howard is a senior lecturer in Information and Communication Technologies in
Education, at the University of Wollongong. She is a member of the SMART Infrastructure
Facility and Early Start Research Institute. Her research focuses on cultural and individual fac-
tors of teachers’ digital technology use, educational change and learning design.
Henk Huijser has been a curriculum designer in the Learning and Teaching Unit at
Queensland University of Technology since September 2017. He has wide-ranging experience
in academic development in different higher education contexts, including in Australia, the
Middle East and China. Henk is the co-author of Problem-based learning into the future (published
by Springer in 2017) and he publishes extensively in a range of higher education related fields.
Jelena Jovanović is an Associate Professor at the Department of Software Engineering,
University of Belgrade, Serbia. She teaches undergraduate and postgraduate courses in pro-
gramming, applied artificial intelligence, and data analytics. Her research focuses on com-
bining data science methods with knowledge databases to better understand learning processes. She is also interested in how insights obtained from analytics can be effectively communicated to teachers and learners as meaningful feedback.
Simon Knight is a lecturer at the Faculty of Transdisciplinary Innovation, University of
Technology Sydney, and a co-editor of the Journal of Learning Analytics. His research and
teaching focuses on the ways that people find, use and evaluate evidence.
Steve Leichtweis is the Head of the eLearning Group in the Centre for Learning and
Research in Higher Education (CLeaR) at the University of Auckland. His background
includes 15 years working in IT and a decade as an academic researcher in cardiovascular
physiology. His interests involve the use of digital technology to enhance learning and
teaching, learning analytics and learning environments.
Danny Y. T. Liu is a senior lecturer in the Educational Innovation team at the University of
Sydney. Danny is a molecular biologist by training, programmer by night, researcher and aca-
demic developer by day, and educator at heart. A multiple national teaching award winner, he
works at the confluence of learning analytics, student engagement, educational technology, and
professional development and leadership to enhance the student experience.
Lori Lockyer is Dean of the Graduate Research School at the University of Technology
Sydney and Honorary Professor at Macquarie University. Lori is an expert in educational
technology. Her research focuses on teaching and learning with technology in school, uni-
versity, and professional learning environments. Lori is interested in understanding the learning
processes and outcomes for learners engaged in technology-supported tasks – particularly col-
laborative and problem-based tasks.
Roberto Martinez-Maldonado is a full-time researcher in the areas of educational data
science and human-computer interaction at the Connected Intelligence Centre (CIC), and
data visualisation lecturer at the University of Technology Sydney (UTS), Australia. His
ongoing research takes a holistic perspective that spans across multiple disciplines. In parti-
cular, he is interested in the role of data and technology to empower people in areas such as
learning, health literacy and collaboration.
Jenny McDonald is a research associate at the Centre for Learning and Research in Higher
Education at the University of Auckland and co-developer of Quantext, text analytics
software for teachers. She is an experienced educational technologist and academic devel-
oper and has been involved in numerous cross-sector educational research and develop-
ment projects. Broadly interested in all aspects of teaching and learning with technology,
her research focus is on natural language processing for formative feedback and learning
analytics.
Negin Mirriahi is an academic developer (senior lecturer) at the University of South Aus-
tralia. She has extensive experience in managing, designing, implementing, and evaluating
innovative educational technology in higher education and in designing fully online, blen-
ded, and open courses. Her research projects focus on learning analytics to inform pedago-
gical change, video analytics for learning, videos in the curriculum, technology adoption,
blended and online learning, and academic staff development. One of her focus areas is
developing confidence and capability amongst academic staff to use learning analytics to
inform their teaching and course and curriculum design.
Adon Christian Michael Moskal is a lecturer and project coordinator in Information Technology at Otago Polytechnic, and co-developer of Quantext, text analytics software for
teachers. From 2011–2016, Adon worked as an educational technologist at the University of
Otago, developing software and researching educational technology. His research interests
include student evaluation, academic development and learning analytics.
Tom Olney is currently Senior Manager: Learning & Teaching in the Faculty of Science,
Technology, Engineering and Mathematics (STEM) at the Open University UK, with
responsibility for learning design and learning analytics. Prior to joining the Open University
in 2014 he was a classroom teacher with experience in several large secondary schools in
Victoria, Australia. His research interests include the translation of online and distance LD
principles into face-to-face, commercial and Chinese student environments.
Mariya Pachman is an educational researcher specialised in learning and new technologies. Her
background is in instructional design, multimedia learning, and Cognitive Load Theory. Mariya is
currently based at FSU where she is exploring the affordances of a Mixed-Reality-Integrated
Learning Environment (MILE) in enhancing teaching competencies in STEM disciplines.
Ernesto Panadero is a researcher at the Developmental & Educational Psychology Department at Universidad Autónoma de Madrid (funded by the Ramón y Cajal research programme, 2014 call). He is also one of the coordinators of the EARLI SIG-1 group Assessment and Evaluation, and a member of the Continental Europe team at the Assessment for Learning International Network (AfLIN). Ernesto's research interests are twofold: first, the understanding and promotion of self-regulated learning (at the individual level) and socially shared regulated learning (at the collaborative learning level); second, the relationship between self-regulated learning and the use of formative assessment (mainly self-assessment but also peer assessment).
Abelardo Pardo is Professor and Dean Academic at the Division of Information Technol-
ogy, Engineering and the Environment at the University of South Australia. His research
interests include the design and deployment of technology to increase the understanding of and
improve digital learning experiences. More specifically, his work examines the areas of
learning analytics, personalized active learning, and technology for student support.
Carlos G. Prieto-Alvarez is a PhD student in the Connected Intelligence Centre (CIC) of the
University of Technology Sydney. His research is focused on co-design, human-computer
interaction and design for learning analytics. He is always interested in research collaboration for
participatory methods, data-driven design and technology-enhanced learning.
Peter Reimann is a Professor in the Sydney School of Education and Social Work, and a
co-director of the Centre for Research on Learning & Innovation (CRLI). His primary
research areas have been cognitive learning research with a focus on educational computing,
multimedia-based and knowledge-based learning environments, e-learning, and the devel-
opment of evaluation and assessment methods for the effectiveness of computer-based tech-
nologies. Current research activities comprise among other issues the analysis of individual
and group problem solving/learning processes and possible support by means of ICT, and
analysis of the use of mobile IT in informal learning settings (outdoors, in museums, etc.).
Deborah Richards is a Professor in the Department of Computing at Macquarie Uni-
versity. Following 20 years in the IT industry during which she completed a PhD in artificial
intelligence on the reuse of knowledge at the University of New South Wales, she joined
academia in 1999. While she continues to work on solutions to assist decision-making and
knowledge acquisition, for the past decade, her focus has been on intelligent systems, agent
technologies and virtual worlds to support human learning and well-being.
Bart Rienties is Professor of Learning Analytics in the Institute of Educational Technology
at the Open University, UK. He is the programme director of Learning Analytics within IET
and the head of Data Wranglers, where he leads a group of learning analytics academics who
conduct evidence-based research and sense making of Big Data.
Jason M. Stephens is an Associate Professor in the School of Learning, Development and
Professional Practice at The University of Auckland. His primary research interests include
human motivation, moral development, cheating behaviour, and the promotion of academic
integrity during adolescence. He is a co-author of two books (Educating Citizens and
Creating a Culture of Academic Integrity) as well as numerous journal articles and other
publications related to these and other interests.
Kate Thompson is a senior lecturer in Educational Technologies in the School of Educa-
tion and Professional Studies at Griffith University. She is the Head of the Creative Practice
Lab and Co-Leader of the Teacher Education Research Program. Her research interests
include systems approaches to understanding learning and teaching, computer supported
collaborative learning, and, in particular, learning to be interdisciplinary.
Lisette Toetenel is currently lead instructional designer at a private bank in Zurich with
responsibility for e-learning globally. She used to lead the Learning Design team, within the
Institute of Educational Technology, at The Open University. Lisette's published work is
focused on learning design and learning analytics, along with social networking and interac-
tion. Lisette has delivered presentations, key notes and master classes in commercial and
academic settings in the UK and abroad.
Deborah West is Pro Vice Chancellor (Learning and Teaching Innovation) at Flinders
University. With a disciplinary background in social work, her research has encompassed
both her discipline and learning and teaching. In recent years, her focus has been on the use
of technology to support learning and teaching and in particular learning analytics.
1
INTRODUCTION
Learning analytics in the classroom

Jason M. Lodge, Jared Cooney Horvath and Linda Corrin

Learning analytics burst into prominence as a field in the early 2010s. It has since grown
rapidly, with researchers around the world from a variety of disciplines engaging in the
learning analytics community through meetings, conferences and publications. Learning
analytics was initially defined as: ‘the measurement, collection, analysis and reporting of data
about learners and their contexts, for purposes of understanding and optimizing learning and
the environments in which it occurs' (Long, Siemens, Conole & Gašević, 2011). The field brings the power of data science methodologies to bear on education. In this volume, we have brought together some of the leading scholars in learning
analytics to take one further step towards that goal. We collectively seek to understand how
learning analytics will have an impact in the classroom.
The field of learning analytics can no longer be thought of as new or emerging. With the Learning Analytics and Knowledge Conference now well established, and journals such as the Journal of Learning Analytics now several volumes in print, the field is maturing. Despite
the rapid growth of the interdisciplinary endeavour that is learning analytics, there remains
much work to be done. In this edited volume, we seek to highlight the work of some of the
key researchers in the field with a particular emphasis on what the work means in practice.
As with the previous edited collection, From the laboratory to the classroom: Translating science of
learning for teachers (Horvath, Lodge & Hattie, 2017), this volume is an attempt to make sense
of a complex field of research for teachers and educational researchers.
As the field of learning analytics has grown, several distinct branches, or sub-fields, appear to have emerged. The first, understandably, focuses predominantly on the data and analytics side. This includes the work traditionally associated with core aspects of data science. The kinds of activities occurring in this branch include the development of means of collecting and synthesising data, the construction of predictive models and the creation and testing of algorithmic work. We recognise that this is a high-level generalisation of the kind of data science work occurring in the community; here we are trying to provide an overall categorisation of the strands of activity rather than a comprehensive map.
The second broad area of activity involves the use of methodologies developed through
the first to better understand student learning (Lodge & Corrin, 2017). This sub-field is
engaged in using new forms of data developed by and through the learning analytics com-
munity to uncover aspects of learning that were previously difficult to capture, particularly in
real time. There is a strong overlap between this branch and research occurring in the cog-
nate disciplines such as the learning sciences and educational psychology (Gašević, Dawson &
Siemens, 2015). The increased use of data such as log files, together with the power of techniques for analysing and modelling data across various instruments, is providing insight into student learning that was largely limited in the past to crude behavioural data, self-report instruments and complex psychophysiological measures. The examples presented here suggest that using learning analytics to enrich the learning sciences and educational psychology holds great potential and is a worthy endeavour.
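To make the kind of behavioural trace data described above concrete, here is a minimal sketch (not from the book; the event names and fields are invented for illustration) of how raw clickstream logs from a learning management system might be collapsed into simple per-student engagement features:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical LMS log entries; real systems emit far richer records.
logs = [
    {"student_id": "s1", "event": "video_play", "timestamp": "2019-03-04T10:15:00"},
    {"student_id": "s1", "event": "quiz_submit", "timestamp": "2019-03-04T10:40:00"},
    {"student_id": "s2", "event": "page_view", "timestamp": "2019-03-05T09:05:00"},
]

def engagement_features(events):
    """Collapse raw click events into per-student counts and active-day
    tallies -- crude behavioural proxies, not direct measures of learning."""
    acc = defaultdict(lambda: {"n_events": 0, "active_days": set()})
    for e in events:
        day = datetime.fromisoformat(e["timestamp"]).date()
        acc[e["student_id"]]["n_events"] += 1
        acc[e["student_id"]]["active_days"].add(day)
    return {sid: {"n_events": f["n_events"], "n_active_days": len(f["active_days"])}
            for sid, f in acc.items()}

print(engagement_features(logs))
# {'s1': {'n_events': 2, 'n_active_days': 1}, 's2': {'n_events': 1, 'n_active_days': 1}}
```

Features like these are the raw material on which the modelling work described below operates; the inferential leap from such proxies to claims about learning is exactly what the chapters in this volume interrogate.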
While the focus of much of the work presented in this volume is on classroom application, the contribution learning analytics is making to foundational research in multiple domains, particularly in computer science, human-computer interaction and the learning sciences, should not be overlooked. This fundamental research is critical for moving the field forward but is often forgotten in discussions around the utility of learning analytics in practice.
The third major sub-branch is the use of learning analytics in practice, the focus of this
volume. This is the sub-branch of the field that dominated its early focus. Significant effort in
early learning analytics work concentrated on innovations that would support student reten-
tion both in face-to-face learning environments (e.g. Arnold & Pistilli, 2012; Aguilar, Lonn
& Teasley, 2014) and online (Kizilcec, Piech & Schneider, 2013). In more recent times, both
the use of learning analytics to make predictions about student progress in the wild and the
development of analytics-based interventions have been key areas of progress. As opposed to
the high-level modelling used in earlier attempts at predicting when students might need
assistance, later efforts are now focussed on more sophisticated approaches. These efforts are
increasingly incorporating what is being learned from foundational research, as described in
the first two sub-branches, and an acknowledgement of the key role that learning design
plays in helping teachers to interpret the analytics in ways that impact practice (Lockyer,
Heathcote & Dawson, 2013; Bakharia et al., 2016).
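As a hedged illustration of the early-warning modelling sketched above (this is not the approach of any particular chapter; the features, data and threshold are invented), a minimal predictive model might look like the following:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data. Columns: [logins_per_week, assessments_submitted];
# label 1 means the student later needed assistance.
X = np.array([[12, 3], [2, 0], [8, 2], [1, 1], [15, 4], [3, 0]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Score current students and flag those above an arbitrary risk threshold.
current = np.array([[2, 1], [10, 3]])
risk = model.predict_proba(current)[:, 1]
for features, r in zip(current, risk):
    print(features, "risk:", round(float(r), 2), "flag:", r > 0.5)
```

In practice, as noted above, such models are increasingly informed by learning design and foundational research rather than by engagement counts alone.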
Finally, there are obvious policy and ethical implications that are raised by the
increased collection and use of data generated about and by students. While we did not
seek contributions directly assessing and/or discussing the ethical implications of learning
analytics in the classroom, these issues undoubtedly permeate all the ideas and practices
outlined in this volume. We consider it critical that this is recognised here as a core
consideration for all work occurring in the field of learning analytics, despite it not being
a core focus of the volume.
There is undoubtedly emerging potential for learning analytics to fundamen-
tally alter practice and policy in education in the near future, particularly as machine learning
and artificial intelligence increase in power and availability. As the theory and methods in
learning analytics research become increasingly sophisticated, it is not hard to see educational
researchers and teachers being left behind as the technical aspects of the field advance. Our
aim in putting this volume together was to try to bridge that gap. Learning analytics has
been, and must remain, a transdisciplinary endeavour if the data generated are to continue to
be used to meaningfully enhance student learning (Lodge, Alhadad, Lewis & Gašević, 2017).
We have brought together a collection of chapters from some of the leading researchers in
learning analytics to help close this gap.

Aim of this book


Enhancing student learning is a ‘wicked’ problem. There are many factors that contribute to
the efficacy of learning environments and activities. Knowing which combination of these
factors will most effectively improve the learning of students will continue to be a pressing
challenge as we progress further into the 21st century.
The opportunities for leveraging the growing body of data available in practice are becoming increasingly apparent. The collection of chapters in this book highlights the real and concrete potential of learning analytics to help both better understand and enhance student learning. What will be critical in this evolution of the field is that educational researchers and teachers are included in the journey. It is patently clear that articulating the pedagogical purpose of activities and curriculum is critical for understanding how data might be used to enhance learning in practice. Without the meaning given to the data by those on the ground where the data are being collected, it is very difficult to determine what these data mean.
This collection of chapters is therefore an attempt to help bridge the gap between research,
theory and practice. As such, each chapter includes specific discussion about what the
research means in the classroom. In these parts, authors of all chapters have provided guidance
about how the research applies in the wild. The book will therefore be of benefit to both
the learning analytics community and to those new to thinking about what data and analytics
mean in a classroom context. Our hope is that the great chapters included in this volume
from key researchers will help to foster a deeper and sustainable conversation between
learning analytics researchers and teaching practitioners. This conversation will be critical for
the evolution of the field into the future.

Outline of the book


This volume is split into parts related to key themes in understanding learning analytics
through the lens of the classroom. In the first part, the broad theoretical perspectives are
considered. In Chapter 2, Donoghue, Horvath and Lodge consider the applicability of
translation frameworks used in the science of learning for the field of learning analytics. One
of the key issues in the use of data of any kind to infer student learning is that there is always
an inferential gap (Lodge et al., 2017). There is no way to directly measure the processing
that occurs in the mind. The kinds of metaphors and methodologies used throughout the
diverse range of disciplines interested in learning all rely on some form of inference, which in
turn is based on a set of underlying philosophical assumptions. Donoghue et al. unpack
what the different levels of analysis mean for interpreting data about learning collected across
the full spectrum of methodologies from neurons to neighbourhoods. The discussion in this
chapter provides a foundation for more holistic thinking about meaning making and data,
which will become increasingly critical as more data from more diverse sources, such as
psychophysiological data, are added into the mix. In Chapter 3, Bartimote, Pardo and
Reimann delve further into the theoretical aspects of learning analytics. In their chapter, they
discuss learning analytics from a realist perspective. Building on a case study in the context of
a statistics course, the distinction between learning analytics and learner analytics is drawn.
The chapter makes an important contribution to understanding how it is possible to link
theory and practice in learning analytics.
The second major part of the book focuses on understanding learning through analytics.
Much of the emphasis in learning analytics has been on understanding student progression and determining possible avenues for intervention. As is evident in Chapter 4 by Lodge, Panadero,
Broadbent and de Barba, there is great value in drawing on learning analytics data, such as
behavioural trace data, to better understand learning processes such as those involved in self-
regulated learning. Higher-level cognitive and metacognitive processes like those involved in
self-regulated learning need to be better understood in order to best use data to help facilitate
and enhance learning. Enhancing student self-regulated learning is a difficult undertaking
without the nuance that can be provided by a teacher, but some possibilities for using data to
do so are discussed in this chapter. Along similar lines, Arguel, Pachman and Lockyer discuss
the possibilities for understanding and responding to student emotion using data in Chapter 5.
Progress in the area of affective computing and an increased emphasis on emotions in education
mean that there is great potential for detecting and intervening when students exhibit various
emotions while they learn. Perhaps the most obvious example of this is when behavioural
traces suggest a student has reached an impasse in the learning process and is confused. Arguel
and colleagues cover how learning analytics can help better understand how student emotions
impact on learning in digital environments and how these environments can be created to
respond to these emotions.
We then turn to the relationship between learning design and learning analytics. The
relationship between these areas has been a strong area of focus in recent years (e.g. Mor,
Ferguson & Wasson, 2015). In Chapter 6, Olney, Rienties and Toetenel describe how
learning analytics and learning design have been used in combination in a large-scale imple-
mentation at the Open University in the UK. A key feature of the approach used in the
examples outlined in this chapter is that visualisations are provided to teachers who are then
able to interpret the data being generated by students through the lens of the learning design.
Olney and colleagues therefore provide a cutting-edge example of a large-scale imple-
mentation of learning analytics with learning design as the centrepiece. Prieto-Alvarez,
Martinez-Maldonado and Anderson extend the conversation about the use of design to
inform learning analytics interventions by discussing possibilities for co-design in Chapter 7.
In particular, they point out the power of co-designing learning analytics implementations
with students. This chapter therefore bridges the learning analytics community with other
trends such as the ‘students as partners’ initiative (e.g. Mercer-Mapstone et al., 2017). In
Chapter 8, Thompson and colleagues focus on the design of assessment in a graduate course
that strategically incorporates data. In particular, they emphasise the importance of an inter-
disciplinary approach, through which the implementation of a data-rich approach achieved
greater impact for students and for generating new insights into effective learning analytics
implementation.
The largest part of the book is devoted to chapters discussing specific examples of the
possible impact that learning analytics can and will have on education. The collection of
chapters in this part covers a wide range of contexts and implementation strategies. In the first
of these practice-focused chapters, Jovanovic and colleagues discuss the role learning analytics
can play in understanding and enhancing student learning strategies in flipped classes in
Chapter 9. Using a learning analytics framework and design research methodology, these
researchers have examined ways of enhancing self-regulated learning in flipped classes. This
chapter not only adds to the limited research to date on ways in which learning analytics might be used to enhance learning in flipped classes but also provides approaches that will be useful
beyond that setting. Along similar lines, Howard, Thompson and Pardo present research on
the use of learning analytics to support learning in a blended context in Chapter 10. These
researchers have used a minimally intrusive approach drawing on a diverse range of data from
physical and virtual environments to facilitate interactions in both settings. The consideration
of learning that occurs across digital and real-life classroom settings is vital, particularly given
most educational activities and tasks can now be described as blended due to the ubiquitous
nature of digital technologies in many aspects of learning.
Text and writing analytics have become an area of particular emphasis in the field of
learning analytics in the last few years. In Chapter 11, McDonald, Moskal, Gunn and Donald
examine the role language plays in learning. In particular, they outline how language can
illuminate the efficacy of student-teacher interactions. Their work demonstrates the power of
text-based analytics for helping to better understand these interactions and use the insights
derived from text-based data to enhance student learning.
Student engagement is often seen as a central concern in ensuring students successfully
transition into, through and out of university. In Chapter 12, Blumenstein and colleagues
present a set of case studies where analytics are used to inform interventions to ‘nudge’ stu-
dents towards academic success. Directly instructing students to engage in their studies can
often backfire, with students interpreting the intervention as patronising or punitive. The
framework provided by these researchers will prove useful in informing similar approaches
that draw on data to encourage, rather than demand, students to more meaningfully engage
with their studies.
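To make the idea of a data-informed nudge concrete, here is an illustrative sketch only (the thresholds, fields and message are assumptions, not the framework from Chapter 12) of a simple rule that selects students for an encouraging message:

```python
def select_for_nudge(students, min_logins=3, days_ahead=2):
    """Pick students who look disengaged just before an assessment and
    pair them with an encouraging, non-punitive message."""
    message = ("Quiz 2 opens soon -- revisiting this week's videos is a "
               "good way to prepare. We're here if you need a hand.")
    return [(s["email"], message)
            for s in students
            if s["logins_last_week"] < min_logins
            and s["days_to_next_quiz"] <= days_ahead]

students = [
    {"email": "a@uni.edu", "logins_last_week": 1, "days_to_next_quiz": 1},
    {"email": "b@uni.edu", "logins_last_week": 6, "days_to_next_quiz": 1},
]
print(select_for_nudge(students))  # only the first student is nudged
```

The design choice that matters here is the framing: the rule triggers an invitation rather than a directive, in line with the observation that demands to engage can backfire.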
In Chapter 13, Corrin explores the opportunities and challenges that arise from the
development of student-facing analytics tools. An increasing number of systems are emerging
with the potential to place the outcomes of learning analytics directly in the hands of stu-
dents. While this can offer many benefits in supporting students’ ability to monitor and adapt
their study strategies, there are also concerns about the impact on students who are less cap-
able of interpreting these data and determining appropriate actions. This chapter profiles
different classifications of student-facing learning analytics tools and offers insights into how
these can be designed, implemented and supported in the classroom.
Taking a slightly different perspective, Cloney and Hollingsworth discuss large-scale testing
of students and what the data generated from these tests might contribute to improving
practice in Chapter 14. Coming from a psychometric perspective, these researchers offer ways
of visualising large-scale datasets and how the lessons from the analysis of these data might
inform practice and data gathering at a more local level.
In the final part of the book, there are two chapters devoted to the process of imple-
menting analytics, taking a higher-level perspective on the challenges associated with doing
so. Learning analytics implementations do not happen in a vacuum and it is critical to con-
sider how best to equip teachers with the necessary skills and knowledge to work with and
make sense of data. In Chapter 15, Gunn and McDonald report on a nationally funded
project in New Zealand examining the conditions required for tertiary teachers to make use
of data generated by and about students. While the results were not encouraging in terms of
the preparedness of teachers to engage in this data-driven process, they present a set of case
studies to demonstrate what is possible when teachers work with others who have the
expertise to assist with learning analytics interventions.
Finally, West, Huijser and Heath discuss one of the most critical considerations in any
learning analytics intervention: leadership. In Chapter 16, these authors outline what is
required in a university context for broad learning analytics implementations to be suc-


cessful in institutions. Drawing on research conducted in a nationally funded research
project in Australia, West and colleagues argue that a consistent, whole of institution
approach is critical for large-scale implementation, but that this needs to be coupled with
a distributed approach to leadership and an explicit valuing of the diverse expertise
required for effective implementation on the ground. This chapter is a great way
to round out the previous chapters in the volume as it complements the case studies
discussed throughout the book with a higher-level leadership perspective on learning
analytics in practice.

What the chapters of this book suggest about where learning analytics is
going
When taken as a whole, there are some key trends across the chapters in this volume which
might suggest some critical areas for the learning analytics community and those interested in
learning analytics beyond the research community to focus their efforts. These key areas are
outlined below.

The role of the teacher


Some discussions about the impact of artificial intelligence might give the impression that
many professions, including teaching, are likely to be replaced by machines (e.g. Popenici &
Kerr, 2017). The increased use of both adaptive and intelligent tutoring systems might also
suggest that the role of the teacher will become redundant in the near future. The collection
of chapters in this volume would suggest instead that the teacher is not only critical but
central to the implementation of data-driven and machine-mediated education, at least into
the foreseeable future. Teachers need to contribute to the design and development of learn-
ing analytics systems so that they align with learning and teaching practices in the classroom,
especially in relation to the design of personalised learning paths or learning interventions.
The power of data to enhance education seems best deployed in concert with teachers rather
than in an attempt to supplant them.

The role of learning design


Aligned with the important association between teachers and learning analytics is the critical
role of design in linking the two together. Design for learning has increasingly been emphasised
in educational theory and practice in recent years (e.g. Goodyear, 2015; Laurillard, 2013).
Learning design provides a systematic approach for converting pedagogical intentions into
actionable approaches and artefacts for facilitating learning (Bennett, Agostinho & Lockyer,
2017). When considering how this might be helpful for the implementation of learning ana-
lytics, it is evident that learning design allows for a systematic means of deciding what data is
useful and provides a framework for making meaning from the data. This insight is key in
informing the design of learning analytics tools for teachers to use to support their practice in
the classroom, and a greater emphasis on ways to represent and link learning design to analytics
in meaningful ways can be seen in emerging analytics tools for teachers (Corrin, Law, Ringtved
& Milligan, 2018).

The role of students


Initially, learning analytics was seen as a means of collecting data about students. Over time, however, it is becoming clear that learning analytics needs to start collecting data for and with students (see Dollinger & Lodge, 2018). Students have a critical role to play in helping to make sense of the data that are collected in educational settings. Involving students in the learning analytics endeavour means that students can help to provide depth to data in ways that would not be possible without understanding their experiences. From the student perspective, their involvement also provides insight into how data are collected and what they are used for.

The criticality of transdisciplinarity


Another issue that is apparent across all the chapters in the volume is that learning analytics is not the domain of a single discipline or methodology. In order for data and analytics to have any real impact on education, the field must involve multiple disciplines and methodologies. The field needs not just to be interdisciplinary, but transdisciplinary. In this regard, effective learning analytics interventions require people with different kinds of expertise to not only contribute in pieces to the development and implementation of learning analytics but to approach research and practice holistically. The liminal spaces between disciplinary ways of knowing are therefore as important as the epistemic contributions that come from individual disciplines. Because learning analytics are implemented in complex school and university environments, a more systemic way of understanding how data can be leveraged to enhance student learning and success necessitates diverse ideas about how to collect, analyse, make meaning of and act upon data.

Delivering on the promise of learning analytics


Perhaps more than anything else in this volume, what is most evident is that learning analy-
tics has immense promise that is yet to be fully realised. As technologies become more
advanced, it is clear from the chapters in this volume that new opportunities will arise for
supporting various aspects of student learning and of teachers’ work in the classroom. By this
we mean that there are almost limitless opportunities for helping to improve the way that
instruction is delivered. These will include sophisticated intelligent tutoring and adaptive
learning systems that will afford deep personalisation and adaptability to cater for individual
student needs. When deployed in concert with teachers, who can provide nuanced, human
interventions to individual students, there is untold potential in the approaches and ideas
outlined in the chapters in this book. We hope that readers feel the same excitement about
these possibilities as we did as we were working with all the great scholars who contributed
to this volume.

References
Aguilar, S., Lonn, S. & Teasley, S. D. (2014). Perceptions and use of an early warning system during a
higher education transition program. Proceedings of the Fourth International Conference on Learning Analytics
and Knowledge (pp. 113–117). New York: ACM.
Arnold, K. E. & Pistilli, M. D. (2012). Course signals at Purdue: using learning analytics to increase
student success. Proceedings of the 2nd International Conference on Learning Analytics & Knowledge (pp.
267–270). New York: ACM.
Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gasevic, D., Mulder, R., Williams, D., Dawson, S.
& Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. In
T. Reiners, B. R. von Konsky, D. Gibson, V. Chang, L. Irving and K. Clarke (Eds.), Proceedings of
the 6th International Conference on Learning Analytics and Knowledge (pp. 409–413). New York: ACM.
Bennett, S., Agostinho, S. & Lockyer, L. (2017). The process of designing for learning: understanding
university teachers’ design work. Educational Technology Research and Development, 65(1), 125–145.
Corrin, L., Law, N., Ringtved, U. & Milligan, S. (2018) DesignLAK18: evaluating systems and tools
that link learning analytics and learning design (Workshop). In Companion Proceedings of the 8th Inter-
national Conference on Learning Analytics and Knowledge (pp. 423–427). New York: ACM.
Dollinger, M. & Lodge, J. M. (2018). Co-creation strategies for learning analytics. In Proceedings of the
International Conference on Learning Analytics and Knowledge, Sydney, Australia, March 2018 (LAK’18).
doi:10.1145/3170358.3170372
Gašević, D., Dawson, S. & Siemens, G. (2015). Let’s not forget: learning analytics are about learning.
TechTrends, 59(1), 64–71.
Goodyear, P. (2015). Teaching as design. HERDSA Review of Higher Education, 2, 27–50.
Horvath, J. C., Lodge, J. M. & Hattie, J. A. C. (2017). From the laboratory to the classroom: translating
science of learning for teachers. Abingdon, UK: Routledge.
Kizilcec, R. F., Piech, C. & Schneider, E. (2013). Deconstructing disengagement: analyzing learner
subpopulations in massive open online courses. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge (pp. 170–179). New York: ACM.
Laurillard, D. (2013). Teaching as a design science: building pedagogical patterns for learning and technology.
Abingdon, UK: Routledge.
Lockyer, L., Heathcote, E. & Dawson, S. (2013). Informing pedagogical action: Aligning learning
analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Lodge, J. M. & Corrin, L. (2017). What data and analytics can and do say about effective learning.
Nature: npj Science of Learning, 2(1), 4–5. doi:10.1038/s41539-017-0006-5
Lodge, J. M., Alhadad, S. S. J., Lewis, M. J. & Gašević, D. (2017). Inferring learning from big data: the
importance of a transdisciplinary and multidimensional approach. Technology, Knowledge & Learning,
22(3), 385–400. doi:10.1007/s10758-017-9330-3
Long, P. D., Siemens, G., Conole, G. & Gašević, D. (Eds) (2011). Proceedings of the 1st International
Conference on Learning Analytics and Knowledge (LAK’11). New York: ACM.
Mercer-Mapstone, L., Dvorakova, S. L., Matthews, K., Abbot, S., Cheng, B., Felten, P. … & Swaim,
K. (2017). A systematic literature review of students as partners in higher education. International
Journal for Students as Partners, 1(1).
Mor, Y., Ferguson, R. & Wasson, B. (2015). Learning design, teacher inquiry into student learning and
learning analytics: a call for action. British Journal of Educational Technology, 46(2), 221–229.
Popenici, S. A. & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and
learning in higher education. Research and Practice in Technology Enhanced Learning, 12(1), 22.
PART I

Theoretical perspectives
2
LEARNING ANALYTICS AND TEACHING
A conceptual framework for translation and application

Gregory M. Donoghue, Jared Cooney Horvath and Jason M. Lodge

Introduction
At its essence, human learning is a complex neurological phenomenon. Ultimately, learning is
instantiated by changes to the organisation of the human nervous system (Kandel, 2000;
Gazzaniga, 2004), corollaries of which include more or less permanent changes in behaviour
(Mayer, 2003; Shuell, 1986). Education, on the other hand, is an even more complex socio-
cultural phenomenon that subsumes the biological, psychological, sociological and even
physical sciences. In other words, learning involves neurological processes, psychological
phenomena and sociocultural factors such as social interactions, cultural agents, relationships
and communication.
Learning analytics (LA) is an emerging field that purports to support, enhance, facilitate,
predict and measure human learning in educational settings. It achieves this largely by mea-
suring learning correlates (an array of phenomena that exist or emerge when learning occurs),
including, for example, reaction times, keyboard entries, mouse clicks, etc. More recently,
this has been expanded through the use of multimodal learning analytics that also incorpo-
rate, for example, pupil dilation, pulse rate and sweat responses (Ochoa, 2017). It is important
to note, however, that the correlates of learning are not the same thing as learning itself (see
Gašević, Dawson & Siemens, 2015). For this reason, it is important to consider how the
former can (and cannot) meaningfully be used to assess the latter. In addition, while collec-
tion of data from LA implementations often occurs in educational environments, making
meaning of these data still requires translation in order to be valid. As with scientific or
neuroscientific data, LA data also need to be understood within the context in which they were produced in order to derive meaning both within and beyond that context. It is not sufficient
to rely solely on algorithms to abstract this meaning.
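As a minimal sketch of what such correlates might look like as data (all fields here are illustrative assumptions, not a scheme proposed in this chapter), one record might combine behavioural and multimodal channels like so:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CorrelateEvent:
    """One observation of learning correlates -- signals that accompany
    learning but, as argued above, are not learning itself."""
    student_id: str
    timestamp: float                           # seconds since session start
    reaction_time_ms: Optional[float] = None   # behavioural channels
    mouse_clicks: Optional[int] = None
    pupil_dilation_mm: Optional[float] = None  # multimodal channels
    pulse_bpm: Optional[float] = None

event = CorrelateEvent("s1", 42.0, reaction_time_ms=850.0, mouse_clicks=3)
print(event)
```

Whatever the record structure, the translational question remains the same: under what assumptions can rows like these be read as evidence about learning in a particular context?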
Inferring truth from one discipline to another – in this case, from LA to education – is
beset with problems of translation (see Lodge, Alhadad, Lewis & Gašević, 2017). These issues
constrain and often invalidate applications of knowledge from one discipline to another; a
topic that has previously been explored generally within science (Anderson, 2002; Horvath &
Donoghue, 2016) and specifically within educational neuroscience (Donoghue & Horvath,
