

Tech., Inst., Cognition and Learning, Vol. 10, pp. 81–84 © 2015 Old City Publishing, Inc.
Reprints available directly from the publisher Published by license under the OCP Science imprint,
Photocopying permitted by license only a member of the Old City Publishing Group

Data-driven Instruction in the Classroom

Arnon Hershkovitz

School of Education, Tel Aviv University, Tel Aviv, Israel

In the previous issue of this double special issue, we focused on various approaches to improving instructional processes. Mainly, we focused on instruction and learning mediated through computer systems, that is, computer-based learning environments. It seems that most of the research in the fields of Educational Data Mining (EDM) and Learning Analytics (LA) has indeed focused on analyzing what happens within virtual learning environments. This is no surprise, considering the history of these communities.
Baker and Siemens (2014) describe the origins: “Much of the early work in EDM
was conducted within intelligent tutoring systems […] and much of the work in
LA began in Web-based e-Learning and social learning environments.” Although
Baker and Siemens mention these beginnings in order to demonstrate the ever-
widening range of data sources being discussed in the EDM/LA communities, all
but one of their examples are computer-based learning environments (learning
resources, science simulations, teacher newsgroups); their last example is school
district grade data systems.
Still missing from mainstream data-driven approaches to studying educational settings is the most basic and most common educational setting of all: the classroom. Using data-driven methods to study the traditional classroom was suggested around the time the EDM community first emerged (cf. Romero & Ventura, 2007), and some work has since been done in this direction (e.g., Blanchard, D'Mello, Olney, & Nystrand, 2015; Hershkovitz, Merceron, & Shamaly, 2015; Martinez-Maldonado, Yacef, & Kay, 2013, to name just a few). Data concerning
learning and teaching processes in traditional classrooms can be diverse and
include, among others, teacher-student and student-student discourse and

*Corresponding author: arnon.hershkovitz@gmail.com


interactions, students' emotional states, and, much more traditionally, student assignments and grades. These can be collected via observations, audio/video recording, or even log files (as in Martinez-Maldonado, Yacef, & Kay, 2013).
Capturing data that describes learning in the classroom is the focus of the current
issue. The articles in this issue present a large variety of data sources, data collec-
tion tools and data analysis techniques.
Chen, Clarke and Resnick (2015) address the difficulties many teachers
encounter in trying to gain a better understanding of their classroom discourse.
Classroom discussion is key to learning, and there is a large body of evidence that
fruitful discussions may help both students and teachers. Until very recently,
however, studying classroom discourse was a tedious task that relied on data that
was not fully authentic (e.g., notes taken during observations, or data based on the
teacher’s memory). In this paper, the authors report on the development of a
classroom discourse analyzer (CDA) that assists in data collection, coding, visualization, tracking, and comparison. The authors demonstrate how CDA is being implemented with a data set of almost 4,000 teacher and student interactions, based on 30 discussion sessions (video transcriptions) in a 4th-grade science class. The authors use machine coding software and then analyze the data, demonstrating the power of their approach, both analytically and visually. This tool can definitely be of help in teacher development programs.
Daily, James, Roy and Darnell (2015) report on a reflective tool that may
assist in tracking students’ engagement. “Engagement” – another key aspect of
learning – is really a sloppy term; today, educational scholars distinguish between
different types of engagement, namely behavioral, emotional and cognitive (cf.
Fredricks, Blumenfeld, & Paris, 2004). Studies in recent years have conceptual-
ized and operationalized engagement in numerous ways—from self-reports,
through observations, to log file-based measurements—each with their own
advantages and weaknesses (cf. Azevedo, 2015). In this paper, the authors use physiological arousal (as measured by a wrist-worn sensor) as a proxy for engagement; as physiological arousal relates to emotion, attention, and memory, it might indeed serve as an approximation of a somewhat combined measure of "engagement" (until a better measure is established). Using these data, the authors
present EngageMe, software that uses bright visualizations that instantly help
teachers to reflect on, and to act upon, their students’ engagement profiles. As
such, and as the authors emphasize, this tool is not intended to replace human interaction or judgement; however, it can definitely supplement them, supporting teachers with a quick decision-making aid.
Shapiro and Wardrip (2015) address another critical aspect of teachers' work: understanding their students' work. Specifically, the authors present a tool they have developed, Markup, to analyze students' annotated texts. Although in the current case study the students used this tool at home (students who were not able to use it on their home computers were given time to use computers at school with the software), it can definitely be used in classrooms, as reading sessions are an integral part of many lessons. The practice of annotating text
has been found to be very important in developing students’ critical thinking (e.g.,
Liu, 1996). Hence, it is not surprising that text annotation is presented as an inte-
gral part of reading instruction programs (e.g., Jones, Chang, Heritage, Tobiason
& Herman, 2015). Through a set of visualizations and reports, teachers can use
these data to promote their students’ interaction with text, and even to change
their teaching. However, regarding the latter, the authors suggest that such changes also depend on the teacher's own educational agenda.
Finally, Levi-Gamlieli, Cohen and Nachmias (2015) take an interesting approach to student-teacher interactions; they examine students, at the university level, who approach their instructors in an overly intensive manner. That is, this paper focuses on students who frequently send emails and text messages to their instructor and often visit the instructor's office hours. Comparing this behavior with the students' interaction with the course website, the authors found similar behavior across both channels. Although intense communication with the instructor is not necessarily harmful, the authors find that extremely intense communication may predict lower achievement.
In all, this issue presents a wide range of educational settings, applications and
methodologies that relate to the basic, traditional practices of teaching and learn-
ing. The articles in this issue demonstrate how we can enrich our understanding
of these "simple" settings using data-driven research. We believe that, together with the previous issue, this double special issue sheds light on the critical role of data in instructional processes. We hope you will enjoy this issue.

References

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist, 50(1), 84–94.
Baker, R., & Siemens, G. (2014). Educational data mining and learning analytics. In Sawyer, K. (Ed.), Cambridge Handbook of the Learning Sciences: 2nd Edition (pp. 253–274). New York, NY: Cambridge University Press.
Blanchard, N., D’Mello, S., Olney, A.M., & Nystrand, M. (2015). Automatic classification of question
& answer discourse segments from teacher’s speech in classrooms. In Proceedings of the 8th
International Conference on Educational Data Mining, 169–175.
Chen, G., Clarke, S.N., & Resnick, L.B. (2015). Classroom discourse analyzer (CDA): A discourse analytic tool for teachers. Technology, Instruction, Cognition and Learning, 10(2).

Daily, S.B., James, M.T., Roy, T., & Darnell, S.S. (2015). EngageMe: Designing a visualization tool utilizing physiological feedback to support instruction. Technology, Instruction, Cognition and Learning, 10(2).
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the con-
cept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Hershkovitz, A., Merceron, A., & Shamaly, A. (2015). In Proceedings of the 8th International Confer-
ence on Educational Data Mining, 254–255.
Jones, B., Chang, S., Heritage, M., Tobiason, G., & Herman, J. (2015). Supporting students in close
reading. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student
Testing.
Levi-Gamlieli, H., Cohen, A., & Nachmias, R. (2015). Detection of overly intensive student interactions using weblog of course website. Technology, Instruction, Cognition and Learning, 10(2).
Liu, K. (1996). Annotation as an index to critical writing. Urban Education, 41, 192–207.
Martinez-Maldonado, R., Yacef, K., & Kay, J. (2013). Data mining in the classroom: Discovering
groups’ strategies at a multi-tabletop environment. In Proceedings of the 6th International Con-
ference on Educational Data Mining, 33–40.
Romero, C., & Ventura, S. (2007). Educational data mining: A survey from 1995 to 2005. Expert Systems with Applications, 33(1), 135–146.
Shapiro, B., & Wardrip, P.S. (2015). Keepin' it real: Understanding analytics in classroom practice. Technology, Instruction, Cognition and Learning, 10(2).
