
Invited paper presented at “Electronic Kazan 2011” on 19 April 2011, Kazan, Tatarstan, Russian Federation.

Measuring Engagement: Learning Analytics in Online Learning

Griff Richards
Thompson Rivers University
Kamloops, BC, Canada
griff@sfu.ca

Abstract: The growing number of on-line interactions between learners and on-line learning systems leaves a trail of data that can be analyzed at a number of levels of granularity and for several purposes. Learner engagement, whether with the program of studies, the course, or a specific learning activity, is one variable thought to correlate highly with learner success. Engagement may be with the learning content, or with other learners and faculty in the socially constructed environment. This paper looks at three specific areas where learner engagement can be measured, and discusses the possibilities for using such information in advising learners and faculty on ways to strengthen course outcomes.
Keywords: Learning analytics, learner engagement, student success, on-line learning

There is an idea in education that the more a learner is engaged with their learning and with their peers, the better their learning will be. As early as 1980
Richard Snow, having exhausted aptitude treatment interaction (ATI) studies,
remarked that the only thing that seemed to matter was the degree of interaction
the learner had with the content and with other learners. Interaction seemed to
increase internalization of the content and resulted in deeper learning and better
test scores. Of course, engagement is not the only factor for promoting and
predicting learning outcomes. Ability (aptitude), prior knowledge and prior success
in learning (achievement) are well regarded as major factors. However, engagement
seems to be the current focus, partly because of the extraordinary work of George
Kuh (2004) in developing and deploying instruments to measure student
engagement in colleges and high schools, and demonstrating positive correlations of
student engagement with student retention and academic success.

If engagement has predictive powers in traditional face-to-face instruction, it should also transfer to on-line learning. Unfortunately, there are no clear prescriptions for
creating student engagement in either online or face-to-face instruction. One can
design engaging learning activities such as collaborative learning tasks or problem-based learning, or an instructor can facilitate an online course in an engaging manner, but engagement is an active process, and it is the motivated learner who
decides to engage.

Measuring engagement is a key step towards improving it. But to be truly effective with engagement data, learning analytics systems also need to trigger appropriate responses for both learners and instructional systems. The goal of this paper is
to explore the concept of engagement in on-line learning, and to look at the growing
approach of learning analytics in monitoring, measuring and responding to learner
engagement.

What is Engagement?
Engagement is most often associated with personal involvement and commitment.
To be completely engaged in a book, a computer game or a conversation involves intense attention and often the exclusion of other stimuli – a state often referred to as “flow”. There are a number of definitions of engagement in the context of
learning. Chapman (2003) talks about active participation and cognitive investment.
Engagement can involve individual attention or it can reflect participation in a
group. Kuh, Kinzie, Buckley, et al. (2006) related learning engagement to student
effort in both learning activities and other non-academic campus activities. They
identified five factors of engagement: active and collaborative learning, student-
faculty interaction, supportive campus environment, enriching educational
experiences, and the level of academic challenge. Ross (2009) noted that certain conditions and activities are needed for students to be engaged in learning. These
include structures such as classes, cultures and expectations of behaviour,
relationships between learners or between learners and their instructors,
motivation of the individuals, and factors outside the learning system such as family
responsibilities.

In the rapidly growing era of social media, Churchill (2010) noted, "Engagement is
more than the actions of a single actor; it is about social groups and reciprocal
action and responsibility..." Interaction with faculty and classmates can help
establish an identity. Participation in a campus club or sports group can create a
supportive social network. Possibly the most direct action to improve academic
engagement is to make learning activities more engaging by borrowing strategies
from the Cooperative Learning movement of the 1990s. Positive interdependence in group work and small group sizes improve opportunities for peer interaction and active participation in learning activities (Johnson, Johnson & Smith, 1998).
However, to be successful, cooperative learning structures often require that
learners learn how to participate in a cooperative setting and instructors learn how
to teach in a facilitative style. Perhaps we now must also learn to be engaging.

How can we measure engagement?


There are currently two approaches to measuring learning engagement. The macro
perspective looks at learning engagement for the entire campus. Kuh (2004) created
the National Survey of Student Engagement (NSSE) – a survey instrument that has been administered each year to hundreds of thousands of students across North American campuses. The survey asks questions about the frequency of academic and social activities over the course of a term and enables institutions to compare
their levels of student engagement from year to year, and with schools of similar
size. Ideally, the results are best used to motivate leaders and faculty to adopt
proactive engagement practices in class and in campus life that reduce student
isolation, and increase a sense of belonging and participation. The belief is that the
increased engagement increases personal motivation and helps students establish a
supportive network of peers. The survey can also identify individual learners with
low engagement who may be at risk of dropping out or failing. Kuh reports that online learners can be just as engaged as on-campus learners, although since most online learners are older, they are more goal-driven and less interested in non-academic campus activities. The yearly survey method means that the results are not timely enough to flag and intervene with individual learners who are at risk of failure.

The analytics approach to measuring engagement builds on techniques similar to the web site tracking tools and data-mining that are used to monitor and improve the
usability of web sites. In contrast to a survey approach, analytics looks at all the
data of all the students. Analytics have been used in computer-based training since
the 1970s for adaptive testing and more recently for research on personalization of
lesson presentations (Graf & Kinshuk, 2008). The shift to web-based e-learning has
made it possible to collect a wider range of data from large numbers of learners. For
example, Lyryx Systems administers an on-line assignment marking system that
supports major textbooks including a calculus text used by over 100,000 students
each year (Claude Laflamme, personal communication, 29 January 2009). With
such large student numbers there is a statistical basis for confidently flagging
problem questions and identifying students lagging in progress or achievement.
There is also a sustainable business model to support the backend computer
systems and analytics development. Such marking systems can issue daily reports to
instructors and administrators to make them aware of individuals in need of extra
tutoring or counseling.
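As a concrete illustration, the sketch below flags students who have completed noticeably fewer assignments than their classmates. It is a minimal, hypothetical example in Python: the student data and the one-standard-deviation cutoff are assumptions for illustration, not Lyryx's actual method.

```python
# A minimal sketch of how a marking system might flag students whose
# progress lags the class. The IDs, counts, and the one-standard-
# deviation threshold are illustrative assumptions only.
from statistics import mean, stdev

# Hypothetical per-student counts of completed assignments at mid-term
completed = {
    "s001": 12, "s002": 11, "s003": 4, "s004": 10,
    "s005": 12, "s006": 3, "s007": 9, "s008": 11,
}

avg = mean(completed.values())
sd = stdev(completed.values())
cutoff = avg - sd  # flag anyone more than one SD behind the class mean

at_risk = sorted(sid for sid, n in completed.items() if n < cutoff)
print(f"class mean {avg:.1f}, cutoff {cutoff:.1f}, flagged: {at_risk}")
```

A daily report built from such a flagging rule is what would reach the instructor or counselor.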

Learning management systems (LMS) capture a lot of data that can be of value in
mapping student engagement and predicting those at risk. Most LMSs contain an asynchronous forum – a text discussion where participants engage in online seminars about assigned topics. Such extended chat sessions leave traces that can be used to analyze the transactions. Each posting is part of a response pattern of who talks to whom, who provides high-value messages responded to by many, and who provides low-value messages. By mining this data, the tacit social structure of the class can be quickly visualized using open source tools such as SNAPP (Bakharia, Heathcote & Dawson, 2009). SNAPP portrays learners as nodes, linked by lines representing the number
of interactions between them. Figure 1 shows a sample of data from a brief on-line
discussion. Highly active participants appear in the centre of the diagram festooned
with multiple connections, while late-comers and less active learners appear as
sparsely connected dots on the outside of the cloud. The relative size of the dot
increases with the number of connections. The same technique could be applied to
analyzing an instructor’s message traffic to see which learners are engaging with the
teacher. The plot makes it easy to see who is busy, but there is no indication of the
intellectual value of the postings. Once again, we might say that those with more
postings are more engaged, but we do not know the depth of that engagement.
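The underlying computation can be sketched with a general-purpose graph library. The snippet below is a hypothetical example using the networkx package rather than SNAPP itself: forum reply pairs become directed edges, and each participant's connection count stands in for node size in a plot like Figure 1. The reply data is invented.

```python
# A sketch of the SNAPP idea (not SNAPP's actual code): replies in a
# forum thread become directed edges in a graph, and degree counts
# both messages sent and replies received.
import networkx as nx

# (author, replied_to) pairs extracted from a hypothetical forum thread
replies = [
    ("ana", "ben"), ("ben", "ana"), ("carl", "ana"),
    ("dina", "ana"), ("ana", "carl"), ("ben", "carl"),
    ("edna", "ben"),
]

g = nx.DiGraph()
g.add_edges_from(replies)

# Highly connected participants would sit at the centre of the plot;
# sparsely connected ones would appear at the edge of the cloud.
for node, degree in sorted(g.degree, key=lambda nd: -nd[1]):
    print(f"{node}: {degree} connections")
```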


Figure 1. SNAPP map of an asynchronous discussion in a Moodle course.

It is important to recall that engagement is a theoretical concept and it cannot be measured directly. When online learners interact in an electronic environment they leave a data trail of when and where they have been, what documents they have accessed, whom they talked to, and how well they are doing on web quizzes. Learning
analytics is a growing field that analyzes this transactional data either looking for
specific information for a single learner, or for more general patterns of interaction
from which one might measure progress, infer engagement and possibly predict
outcomes. The same sorts of data-mining techniques are used in web analytics to
measure and improve the effectiveness of web pages in attracting and persuading
users to move from attention to valuing, to acting, and becoming members of a site’s
community, buying its product or becoming contributors to its web content. This is
the same hierarchy of the affective domain that is familiar to educators (Krathwohl, Bloom & Masia, 1964).

Some measures of engagement can be misleading. For example, Beer, Clark and
Jones (2010) plotted the number of learning management system (LMS) page visits
(“hits”) by academic achievement for several thousand students (see Figure 2). They
noted a solid correlation between the number of LMS hits and the student grade. Did
this mean that students who clicked more web pages were more engaged? Perhaps,
but the explanation might simply be that they had to click on the web pages to take
their tests and submit their assignments, and students who completed their work
were more likely to succeed. Beer et al. also noted that students using the
Blackboard LMS had much higher hit counts than students using the Moodle LMS. Does that mean Blackboard is a more engaging LMS? No, it reflects that Moodle has a more efficient access architecture. Learning analytics is not simply about counting hits or mapping discussions; it is about intelligent and thoughtful interpretation of data in the context of human activity.

Figure 2. Plot of LMS hits by final grade, from Beer, Clark and Jones (2010).

What response should engagement analytics trigger?


As human beings we constantly process data as we walk down the street and we
alter course when we encounter a hazard. Our reaction is mostly learned – we have
learned to identify a speeding car, and we have learned that the appropriate
response is to jump out of the way. The field of learning analytics is still learning to recognize information in the data stream. We have not yet learned to interpret all the data we see, and thus we are just beginning to develop a set of appropriate interventions. Early analytics focuses on information such as “learner X is at risk” that can be sent to faculty or staff so that they can make a better and more timely intervention through remedial teaching or personal counseling. However, the same
information could be sent directly to the learner. Figure 3 portrays a hypothetical
display or “dashboard” that shows the learner how fast they are moving through the
course compared to others, and how well they are achieving the course outcomes.
The dashboard shows the personal display for a learner who is progressing ahead of
the class, but who is just keeping up in terms of achievement.

We see by this example how a student logging into the learning management system
could be presented with a display that shows their progress and achievement in
comparison to their classmates, or possibly to all students who have ever taken the
course. In the case of a self-paced or independent study course, the comparison
could be made to the historical progress trends of all previously enrolled students.
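The arithmetic behind such a display can be quite simple. The sketch below, with entirely hypothetical class data, places one learner on percentile scales for progress and for achievement, reproducing the situation shown in Figure 3: ahead of the class in pace, middling in grades.

```python
# A sketch of the arithmetic behind a learner dashboard: where one
# learner sits relative to the class on progress and achievement.
# All figures are hypothetical.
def percentile(value, population):
    """Share of the population at or below this value, as a percentage."""
    return 100 * sum(1 for v in population if v <= value) / len(population)

class_progress = [0.30, 0.45, 0.50, 0.55, 0.60, 0.70, 0.80]  # fraction of course done
class_grades   = [58, 61, 65, 68, 72, 75, 81]                # current averages

me_progress, me_grade = 0.75, 68.0
print(f"progress: {percentile(me_progress, class_progress):.0f}th percentile")
print(f"grade:    {percentile(me_grade, class_grades):.0f}th percentile")
```

For a self-paced course, the comparison lists would simply be drawn from the historical records of previously enrolled students rather than the current class.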
Of course, not all learners might want to know this information; they might prefer to remain blissfully in the dark rather than face the undue stress and competition that plague high need-achievement personalities and learners at risk of failure.
However, early detection of risk should be the first line of defense in providing
remedial action. The ethical issue should not be “Should we inform learners of their
standing?” but “What form of remedial action should we counsel them to take?” Is it
possible to have a similar dashboard for engagement – perhaps one that shows the
lack of interaction with other students in group work and the consequences of the
lack of social capital on future income?

Figure 3. Hypothetical learner dashboard showing progress and grades in comparison to historical performance. (Normal curve added to aid explanation.)

Course administrators could also make use of learning analytics to monitor the effectiveness of course learning activities, the perceived usefulness of examinations and assignments, or whether library references are being used. Richards and Devries (2011) described the use of learning analytics combined with student comments to provide feedback on the usefulness of prescribed learning activities. Such a system could correlate student comments with achievement scores for each module of instruction. In contrast to the usual course evaluation form that is administered at the end of a course, analytics provide tracking and feedback during the course from the first learning activity onward. The goal is to develop sufficiently engaging activities that produce measurable gains in learning. Analytics from tracking and performance data could also be mixed with questionnaire responses from learners to identify learning activities or course readings that learners find difficult.
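A minimal sketch of how such per-module aggregation might look is given below. The data model – (module, rating, quiz score) triples – is an assumption for illustration, not the actual system described by Richards and Devries (2011); a module scoring low on both signals would be a candidate for redesign.

```python
# A sketch of per-module feedback aggregation: student ratings of a
# learning activity alongside quiz scores for the same module. The
# data model and the figures are hypothetical.
from collections import defaultdict
from statistics import mean

# (module, rating_1_to_5, quiz_score) triples from a hypothetical course
feedback = [
    ("m1", 4, 82), ("m1", 5, 90), ("m1", 4, 78),
    ("m2", 2, 55), ("m2", 3, 61), ("m2", 2, 49),
    ("m3", 4, 74), ("m3", 4, 80),
]

by_module = defaultdict(list)
for module, rating, score in feedback:
    by_module[module].append((rating, score))

# A module with low ratings *and* low scores is a redesign candidate
for module, rows in sorted(by_module.items()):
    ratings, scores = zip(*rows)
    print(f"{module}: rating {mean(ratings):.1f}, score {mean(scores):.1f}")
```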

What can’t analytics do?


There are a few weaknesses to the analytic approach. The first is the old garbage-in, garbage-out phenomenon. Data from tracking systems is not inherently intelligent.
Hit counts and access patterns do not really explain anything. The intelligence is in
the interpretation of the data by a skilled analyst. Ideally, data mining enables the
visualization of interesting data that in turn sparks the investigation of apparent patterns, the generation and testing of a hypothesis, and then the subsequent
application of a useful metric to raise warning flags, to help pace progress, to suggest remediation, or to refer the learner to a human agent. Such interventions also
need to be evaluated to ensure that they bring about the desired effect. Sending
learners at risk off to an inappropriate tutoring service could exacerbate their
problems instead of getting them on track.

Analytics cannot of themselves cope with institutional politics or labour relations issues. For example, a learning analytics system might just as easily identify disengaged faculty members, and unions might fight to block such unwarranted intrusions on the sanctity of the classroom. Indeed, discriminating good from bad
teachers might end up flooding the classrooms of the good teachers (and increasing
their marking load) while reducing the registrations (and marking load) in the
classes of the poor teachers. Thus analytics must be used judiciously and carefully to
avoid undesired effects.

Another thing analytics cannot do by themselves is improve instruction. While they can point to areas in need of improvement and identify engaging practices, the numbers cannot make suggestions for improvement. This requires a human
intervention – usually in the form of a focus group or by soliciting suggestions from
the learners themselves. In many cases learners are well placed to suggest effective
instructional practices or to identify useful learning resources. Such learner-centric
models are at the core of self-directed learning communities.

Ethical and Privacy Issues


Every data collection scheme is now subject to freedom of information and protection of privacy laws. Those using learning analytics need to protect the
information of their learners, and they also need to consider which information
might become subject to a request for disclosure under freedom of information.
Dashboard schemes should be modified when small numbers would make it easy to
identify individuals. Those planning to collect and use interaction data need to
ensure that appropriate measures are taken to safeguard the identity of users –
particularly for data that might be kept for historical purposes or exported from
the LMS for larger-scale analyses. The best policy is to ensure that those interacting with such systems are appropriately forewarned that their data may be used for analytic purposes, and informed of the policies in place governing the use of the data collected.
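One common technical safeguard is to pseudonymize identifiers before data leaves the LMS. The sketch below is illustrative only: it replaces student IDs with salted hashes so that exported records can still be linked to one another without naming the student. A real deployment would keep the salt in a secrets store, not in source code.

```python
# A minimal sketch of pseudonymizing student IDs before export.
# The salt handling here is an illustrative assumption; in practice
# the salt must be managed as a secret, separate from the export.
import hashlib

SALT = b"replace-with-a-secret-salt"  # assumption: never shipped with the data

def pseudonymize(student_id):
    """Deterministic pseudonym: the same ID always maps to the same token."""
    return hashlib.sha256(SALT + student_id.encode("utf-8")).hexdigest()[:12]

events = [("s001", "quiz_submit"), ("s002", "forum_post")]
export = [(pseudonymize(sid), action) for sid, action in events]
print(export)
```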

Conclusions
This paper has skimmed the surface of the emerging field of learning analytics. As
we start measuring “engagement” or “achievement” we need to be aware of the
assumptions behind the measures. We need to know what we are measuring, and the inferential link between the “thing being measured” and the “concept” we want to observe. Learning analytics is becoming a very popular way of tracking interactions, but the field is new and we have much to learn from those who have been conducting data-mining and other analytics in sectors such as health and finance. A new world of applications is about to emerge. In the meantime, measuring something is probably better than measuring nothing.

References

Bakharia, A., Heathcote, E. & Dawson, S. (2009). Social Networks Adopting Pedagogical Practice:
SNAPP. Proceedings ASCILITE. Auckland.
http://www.ascilite.org.au/conferences/auckland09/procs/bakharia-poster.pdf

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Proceedings ASCILITE 2010. Sydney.
http://www.ascilite.org.au/conferences/sydney10/Ascilite%20conference%20proceedings%202010/Beer-full.pdf

Churchill, E. (2010). Enticing Engagement. Interactions, 17(3), May-June, 82-87.
http://portal.acm.org/citation.cfm?doid=1744161.1744180

Graf, S. & Kinshuk. (2008). Adaptivity and personalization in ubiquitous learning systems. In A. Holzinger (Ed.), Proceedings of USAB 2008, LNCS 5298, 331-338.

Johnson, D.W., Johnson, R. T., & Smith, K. A. (1998). Active learning: Cooperation in the college
classroom. Edina, MN: Interaction Books.

Kuh, G. (2004). The national survey of student engagement: Conceptual framework and overview of
psychometric properties.
http://nsse.iub.edu/2004_annual_report/pdf/2004_conceptual_framework.pdf

Kuh, G., Kinzie, J., Buckley, J., Bridges, B., & Hayek, J. (2006). What matters to student success: A review
of the literature. Executive Summary. Commissioned Report.
http://nces.ed.gov/npec/pdf/Kuh_Team_ExecSumm.pdf

Krathwohl, D., Bloom, B., & Masia, B. (1964). Taxonomy of educational objectives. Handbook II: Affective domain. New York: David McKay.

Richards, G. & Devries, I. (2011). Revisiting formative evaluation: Dynamic monitoring for the
improvement of learning activity design and delivery. Proceedings Learning Analytics and
Knowledge (LAK11). Banff.

Richardson, J. C., & Newby, T. (2006). The role of students’ cognitive engagement in online learning. American Journal of Distance Education, 20(1), 23-37.

Ross, C. (2009). Engagement for learning: What matters to students, what motivates them and how
can institutional support services foster student engagement? ANZSSA Biennial Conference,
Brisbane. http://www.adcet.edu.au/Anzssa/View.aspx?id=7034
