Available online at www.sciencedirect.com

ScienceDirect
Computers and Composition 55 (2020) 102544

The Current State of Analytics: Implications for Learning Management System (LMS) Use in Writing Pedagogy

Ann Hill Duin a,∗, Jason Tham b

a University of Minnesota
b Texas Tech University
Available online 29 January 2020

Abstract
Amid the burgeoning interest in and use of academic and learning analytics through learning management systems (LMS), the
implications of big data and their uses should be central to computers and writing scholarship. In this case study we describe the
UMN Canvas LMS experience in such a way that writing instructors might become more familiar with levels of access to
academic and learning analytics, more acquainted with the analytical capabilities in LMSs, and more mindful of implications of
learning analytics stemming from LMS use in writing pedagogy. We provide a historical account of the development and infusion
of LMS in writing pedagogy and demonstrate how these systems are affecting the way computers and composition scholars consider
writing instruction and assessment. We then respond critically to the collection of data drawn from the authors’ use of these systems
in on-campus and online teaching. We conclude with implications for writing pedagogy along with a matrix for addressing ethical
concerns.
© 2020 Elsevier Inc. All rights reserved.

Keywords: Learning management systems; Academic and learning analytics; Writing pedagogy; Student privacy; Access

Introduction

Learning management technology makes it possible for colleges and universities to collect, store, and mine data
for business intelligence, descriptive, and predictive analytics. With technology, academic institutions have massive
storehouses of data as part of their research, teaching, and engagement efforts; the analytical capabilities made possible
with technology are part of the seemingly objective, structural properties of our institutions. However, across these
massive endeavors, how are administrators treating these analytics? Whose data is it? What knowledge and access do
instructors and students have of these data? Are scholars asking the right questions about privacy and surveillance? And
most important for this article, given the current state of analytics, what are the implications for learning management
system (LMS) use in writing pedagogy? These questions should be central to the work of computers and writing
scholars.

∗ Corresponding author.
E-mail addresses: ahduin@umn.edu (A.H. Duin), jason.tham@ttu.edu (J. Tham).

https://doi.org/10.1016/j.compcom.2020.102544
8755-4615/© 2020 Elsevier Inc. All rights reserved.

Each term writing instructors use learning (or course) management systems for uploading readings and assign-
ments, reviewing student participation and discussion, and providing comment and assessment of student work (or any
combination of these purposes). Since the 1990s, composition instructors have used a number of these systems.
In this article we respond critically to the collection of data drawn from such use of these systems in on-campus and
online teaching and from the potential surveillance of such use by university administrators. We analyze the current
state of learning analytics as a means to position writing instructors as agents over LMS use and student success.
We begin with definition and illustration of big data, academic and learning analytics; we then focus on LMS use in
writing instruction, providing a case study of the University of Minnesota’s transition to and use of the Canvas LMS.
We conclude with emphasis on critical uses and implications of LMS in writing pedagogy along with a matrix for
addressing ethical concerns in the design, application, and documentation of learning analytics in writing pedagogy.

Big data, academic and learning analytics

Discussion of big data, business intelligence, and academic and learning analytics is often bundled together. Big data
consists of “extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations,
especially relating to human behavior and interactions” (Google Dictionary, 2019). Big data in academia most often
includes business intelligence work to identify ways for an institution’s human resources, finance, and student services
to become more efficient, effective, and accountable.
Academic analytics is the application of these business intelligence tools and strategies as a means to guide decision-
making processes related to teaching and learning. Over a decade ago, John Campbell, Peter DeBlois, and Diana
Oblinger (2007) introduced the application of academic analytics to teaching and learning and instructor use of
learning management systems, emphasizing that “with the increased concern for accountability, academic analytics
has the potential to create actionable intelligence to improve teaching, learning, and student success. Traditionally
academic systems—such as course management systems, student response systems, and similar tools—have generated
a wide array of data that may relate to student effort and success” (p. 44). From academic analytics, predictive models
are developed through statistical analysis as a means to locate at-risk students and then provide interventions to increase
student success.
An EDUCAUSE Center for Applied Research (ECAR) study (Goldstein and Katz, 2005) reinforced the fact that most
colleges and universities rely on academic analytics in that “a robust academic analytics environment is often associated
with leaders who are committed to evidence-based decision making” (p.7). This expanding need for information and
decision-making capacity has resulted in increasingly sophisticated technologies and techniques that analyze data and
develop predictive models and assessment frameworks. Among these are the following:

• the Predictive Analytics Reporting (PAR) Network (PAR Framework, Starfish Solutions, 2019) that “applies descrip-
tive, inferential, and predictive analytical data-mining techniques to a single, federated dataset to better gauge risks
and implement interventions that remove barriers to student success;”
• Purdue’s Course Signals project “developed to allow instructors the opportunity to employ the power of learner
analytics to provide real-time feedback to a student. Course Signals relies not only on grades to predict students’
performance, but also demographic characteristics, past academic history, and students’ effort as measured by
interaction with [the LMS]. . . The outcome is delivered to the students via a personalized email from the faculty
member to each student, as well as a specific color on a stoplight––traffic signal––to indicate how each student is
doing;” (Arnold & Pistilli, 2012); and
• the Civitas Learning platform that “makes the most of the world’s learning data [to] deliver solutions and services
to help educators measurably improve student success outcomes” (2019).

With increased focus on learning (versus teaching), the language of academic analytics has evolved to reflect student-
centered data collection. As distinct from academic analytics, numerous scholars have defined learning analytics as “the
measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding
and optimizing learning and the environments in which it occurs” (as reported by Kenneth David Strang, 2016, p. 276).
As an example of this evolution, the New Media Consortium convenes panels of experts each year for the purpose
of determining emerging technology trends across higher education. The 2016 Horizon Report focused on learner
profiling, defining learning analytics as “an educational application of web analytics aimed at learner profiling, a
process of gathering and analyzing details of individual student interactions in online learning activities” (Johnson
et al., 2016, p. 38); and the most recent 2019 Horizon Report (Alexander et al., 2019) emphasized that “analytics
technologies and capabilities will be an essential component of institutional thriving in the years ahead. Beyond static,
descriptive analyses of student learning, grades, and behaviors, analytics capabilities comprise dynamic, connected,
predictive, and personalized systems and data” (p. 23).
Academic and learning analytics continue to advance through open analytics initiatives such as the Society for Learn-
ing Analytics Research (SOLAR) focus on open learning analytics (Siemens et al., 2014), global learning consortia,
and private funding support such as that from the Gates Foundation (2019). As a result, there is a strong sense across
higher education that academic and learning analytics will pay significant dividends if administrators and instructors
but mine the data available to create “actionable intelligence” with the goal of increasing efficiency, effectiveness, and
student success. For example,

• Lori Lockyer, Elizabeth Heathcote, and Shane Dawson (2013) provided a contextual framework to help instructors
interpret the many “dashboards” stemming from learning analytics;
• Alicia Friend Wise (2014) provided a preliminary model of pedagogical learning analytics intervention design that
includes integration, agency, reference frame, and dialogue;
• Mohammed M. Olama, Gautam Thakur, Allen W. McNair and Sreenivas R. Sukumar (2014) in their collection
of academic analytics from use of the Moodle LMS from 2009-2013, identified data features useful for predicting
student learning outcomes; these include scores on assignments and exams and activities in discussion forums;
• Florence Martin and Abdou Ndoye (2016) examined multiple learning analytics techniques including quantitative
and social network analysis (e.g., descriptive statistics and interaction analysis), qualitative analysis (e.g., discourse
and conversation analysis, content and document analysis, and concept mapping); and
• Ji Won You (2016) identified significant indicators for predicting course achievement: students’ regular study, late submissions of assignments, number of sessions (frequency of course logins), and proof of reading course information (a brief illustrative sketch of such a model follows this list).
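To make concrete what this strand of predictive modeling typically involves, below is a minimal, purely illustrative sketch in Python. It assumes a hypothetical per-student export (lms_activity_export.csv) whose columns loosely follow You's (2016) indicators; the feature names, success threshold, and model choice are our assumptions, not the method of any study cited above.

```python
# Illustrative sketch only: predicting course "success" from LMS activity
# features loosely modeled on You's (2016) indicators. The CSV, columns,
# and 70% threshold are hypothetical assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("lms_activity_export.csv")  # hypothetical per-student export

features = [
    "regular_study_sessions",  # regularity of study (e.g., distinct study days)
    "late_submissions",        # count of assignments submitted late
    "session_count",           # number of sessions (frequency of course logins)
    "course_info_views",       # proof of reading course information pages
]
X = df[features]
y = (df["final_grade"] >= 70).astype(int)  # illustrative definition of "achievement"

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```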

However, Strang (2016), in his extensive review of LMS studies, found little statistically significant relation-
ship between student online LMS activity and academic performance. His subsequent study of student activity logs,
engagement analytics, event (message) monitoring, graphs, security statistics, usage statistics, and third-party tools
(e.g., Google analytics) also indicated no significant relationship between learning performance and LMS activity. As
a result, instructors find themselves swirling in the “no significance” space while simultaneously being required to
adopt LMS platforms with increasingly robust built-in analytics.
Anna Wilson, Cate Watson, Terrie Lynn Thompson, Valerie Drew, & Sarah Doyle (2017), in their exceptional
analysis of the challenges and limitations of learning analytics, emphasized concerns surrounding “unproblematized
Big Data discourse” (p. 991). They highlighted four issues: “the inconclusiveness of empirical studies; somewhat
simplistic conceptions of learning analytics data and methods as part of some generic species, Big Data; choices about
data, algorithms and interpretation; and issues around disciplinary and finer-grained differences in pedagogical and
learning approaches” (p. 993). They further reiterated that learning analytics implementation at an institution-wide
level stems from “its technical nature and from a need to justify sufficient investment in their development” (p. 991).
Citing a broad Australian study, they noted that instructors are largely unaware of the initiatives underway at their own
institutions and rarely discuss learning analytics. Wilson and colleagues intended “to raise questions about whether
using digital traces as proxies for learning can be expected to provide accurate, reliable metrics for student performance”
(p. 992).
In short, the technical ease of tracking and storing data along with the popularity of big data intelligence and
academic and learning analytics is growing faster than writing programs can respond in terms of identifying and
understanding their exact roles in pedagogy. As computers and writing scholars, we must expand our understand-
ing surrounding the uses and implications of academic and learning analytics collected and mined as part of LMS
use in writing pedagogy. So we ask: What does the use of LMS technology tell us about faculty roles and student
participation?

LMS use in writing instruction

Scholarship within our field consistently tackles questions surrounding computers and writing. Early computer-and-
writing perspectives on digital literacy focused on the complications of teaching and learning with technology (Selfe &
Hilligoss, 1994; Sullivan & Dautermann, 1996; Taylor & Ward, 1998). Scholars like Craig Hansen (1996) and Cynthia
Haynes (1998) took a critical approach to understanding how new computer technologies require new pedagogical
theories and praxis in writing and literacy instruction. Gail Hawisher and Cindy Selfe (1991) called for particular
attention to the kinds of literacies that emerge as a result of the increasing proliferation of networked technologies in
the classroom. While these studies of browser interfaces, keyboard usage, and even email writing may sound outdated
at the time of this writing, they remain exemplary cases of innovative research that motivates us to assess the current
state of learning analytics as a means to once again position writing instructors as agents over technology use and
student success.
Of guidance in this work is Kevin DePew and Heather Lettner-Rust’s (2009) analysis of distance learning interfaces,
in which they emphasized that learning management systems “have traditionally been designed from the top down,
supporting banking models of learning or, in writing instruction, current-traditional rhetoric pedagogies” through
which “students mostly interact with the instructor” as the interface designs “empower the instructor to gaze upon
the students and assess them––often not as a corporeal body but as a corpus of texts” (p. 174). Referencing Michel
Foucault’s (1979) detail on how an institution’s architectural infrastructure is designed to support certain ideological
positions and “mediate very specific communication and contact between various bodies” (p. 177), DePew and Lettner-
Rust emphasized “gaze” and how interface designs give instructors sole power to survey and assess, with sole authority
over the features deployed. However, while they called for administrators, programs, and instructors “to interrogate
the affordances that online interfaces offer for developing better pedagogical methods” (p. 186), we find no mention
made of the broader “gaze” regarding data being collected and mined from these systems for its predictive analytical
qualities.
In his reflection on the use of big data analytics in assessing writing programs, Marc Scott (2017) highlighted that poor
faculty buy-in, privacy concerns, and technological and human limitations are the key critiques about the design and
implementation of analytics in writing programs (p. 63). Scott is convinced that big data analytics will continue to
be “quite persuasive to higher-level administrators” but “faculty and those responsible for curricula and pedagogy
(like WPAs, writing center administrators, and writing across the curriculum directors) should also be included in
conversations about study design and methodology to ensure relevant and useful questions are asked” (p. 64). While
Scott has found the use of analytics to be impactful to writing programs in general, we do not find specific discussions
on the collection and analysis of learning analytics in the context of a course and how they might be used to assist
students.
Also of note is Lauren Salisbury’s (2018) recent study of first-year writing instructor use of LMSs in which she
identified a “great disparity between instructors’ practices in face-to-face and [LMS] spaces with many instructors
failing to see their use of [LMSs] as part of their pedagogical practice” (p. 1), noting that “information technology and
rhetoric and writing scholars alike argue not enough is yet known about [LMSs] to make a clear determination about
the balance between possible benefit and harm” (pp. 3-4). Similar to the institutional focus on efficiency, Salisbury
found that instructors most often used LMS tools to improve efficiency and “make teaching and learning easier for
themselves and their students” (p. 11). She also found that instructors felt limited by LMS capabilities and that LMSs
do not contribute significantly to successful teaching and learning, recommending additional instructor preparation for
use of LMSs and for teaching online: “A vigilant focus on research, practice, and preparation is necessary to ensure
writing instruction remains effective for all learners” (p. 15). However, no mention is made of the data being collected
and mined from these systems for its predictive qualities.
Last, focusing on instructor response to student writing online, Angela Laflen and Michelle Smith (2017) investigated
how students interact with an LMS interface to access instructor feedback on their writing. In a blind study of 334
students in 16 courses, they compared two response methodologies: making grades visible apart from feedback, and
requiring students to open feedback files to access their grades. Somewhat unsurprisingly, they found that making grades
visible apart from feedback significantly reduced the rate at which students opened instructor feedback files. Absent
again is mention of data being collected and mined for later use.

Fig. 1. LMS Market Share for US and Canadian Higher Ed Institutions (January 2019 Edition). Source: https://mfeldstein.com/state-higher-ed-lms-market-us-canada-end-2018/.

A case study of the Canvas learning management system

From the above studies, we see a pressing need to focus this study of LMSs on the understanding and usage of
learning analytics: Where do the data reside? Who has access to these data? And how are they analyzed and used in
decision making at the course level and beyond? As a first phase of this work and to explicate the difficulty in addressing
these questions, we conducted a case study of the University of Minnesota’s (UMN) transition to and use of the Canvas
LMS with emphasis on critical uses and implications of learning analytics stemming from its use in writing pedagogy.
We draw from our experiences with LMSs and our current use of learning analytics to provide insights on their
affordances and constraints. We reflect on our experiences collectively as well as individually. As part of writing this
article, we compared notes and shared observations in weekly meetings.
Throughout this study, we also examined how the analytics collected and mined via the Canvas LMS reflect our
field’s hallmark pedagogical principles in writing instruction––active learning, socio-cognitive development, process
based composing, digital and multiliteracies, and student empowerment. In alignment with Wilson et al. (2017), we
emphasize that the digital traces left by students and ourselves are “far from trivial, and that any analytics that relies
on these as proxies for learning tends towards a behaviorist evaluation of learning processes” (p. 991). Furthermore,
we commend Jenni Swenson’s (2015) and Estee Beck’s (2016) frameworks for understanding ethical concerns surrounding big data
and learning analytics in higher education, sharing a matrix for addressing ethical concerns in the design, application,
and documentation of learning analytics in writing pedagogy.
We are well positioned for such study: Over the past 25 years, Duin has used many of the LMSs included in Fig. 1
in her teaching and online course development; her experience includes 15+ years in central administration, including
service as associate chief information officer in which she was tasked with oversight of LMS platforms along with initial
work with the Unizin consortium (described later), and she has collaborated with Linda Baer (2019), a national expert
on analytics, in ongoing study of academic analytics (Baer & Duin, 2014, 2018). Tham has taught online via three
different LMSs (D2L Brightspace, Moodle, Canvas) in the last seven years. His ongoing study of distance education
methods and course design has been presented in various publications and national conferences (Duin & Tham, 2018;
Tham, 2016).
Drawing on our collective experience, we examine the University of Minnesota’s determination to transition to the
Canvas LMS, exploring its analytical capability with focus on instructor and student access to this capability. This
case study is illustrative as our purpose is to describe the UMN Canvas LMS experience in such a way that writing
instructors might become more familiar with levels of access to academic and learning analytics, more acquainted with
the analytical capabilities in LMSs, and more mindful of implications of learning analytics stemming from
LMS use in writing pedagogy.

The UMN transition to the Canvas LMS with focus on analytics

As writing instructors know well, over the past two decades, colleges and universities have used a variety of LMS
platforms in support of teaching and learning. Currently, as shown in Fig. 1, Canvas is the fastest growing LMS for
the U.S. and Canada. In July 2016, e-Literate (Hill, 2016) reported that “Canvas is now the primary LMS in more US
colleges and universities than Blackboard Learn.”
Over the past 10+ years, the University of Minnesota has used Moodle, an open-source LMS, to create and manage
online multimedia learning materials and activities for courses, training programs, and events that students or partic-
ipants can access online. As previous users of the Moodle LMS, we know that its use by instructors and students is
tracked through participation logs. However, over these years, there has never been any clear indication as to where
this data resides, who has access to it, or how it is analyzed and used by instructors or by others in decision-making
capacities. We largely assume that our university is working to protect the privacy of our intellectual property included
in the LMS as well as the data generated from its use. As Estee Beck (2016) stated in her discussion of the history
and practice of privacy and surveillance, “Blackboard and Canvas, two commonly used learning management systems
within many universities, use data analytics to track student engagement, including the amount of time logged into
their systems and clicks across modules. This data is available for faculty to use, but students do not get to view the
data collected” (n.p.).
In September 2014, the University of Minnesota joined the university-owned and directed consortium, Unizin
(https://unizin.org/). At the current 2019 Unizin site, the “About Us” page states that its member institutions “collaborate
and share resources to adapt to the ever-changing digital teaching and learning landscape. Unizin is a means for higher
education to shape its own future in ways that best serve learners.” Unizin is governed by its current 25 member
institutions whose 900,000+ learners make it one of the largest educational organizations in the U.S. As stated on
the site: “Improved access to data and learning materials strengthens the learning environment. Our data management
solutions strengthen Members’ data and analytics efforts” (Unizin, 2019).
A key component of our university’s obligation as a Unizin member / technical investor was to pilot Canvas, Unizin’s
LMS, to determine if the university would transition from Moodle to Canvas. The assumptions for being part of Unizin
and using the Canvas LMS included increased capacity to explore and leverage analytics and the “significant value”
that Unizin participation would bring in the area of analytics. The principles and key values stated in a Charter to
University Learning Technology Advisors (ULTA, 2017-2018) for guiding this LMS pilot and decision-making process
were transparency, inclusiveness, nimbleness, multiple drivers of success, and sustainability. No mention is made of student success or analytics.
In a report on the first year of pilot testing (2015-2016), the Canvas LMS was described to the university community
as follows:
Canvas, hosted through Instructure, is a relatively new online learning platform, or learning management system
(LMS) that has been favorably received and adopted by many of UMN’s peer institutions. Some beneficial features
include: cloud-based functionality, a clean, intuitive user interface with drag and drop usability, close integration with
social media, and a comprehensive grading tool (Donalee Attardo, Paul Baepler, Erik Epp, & Chris Scruton, 2016,
p.4).
The “Latest LMS News” (Latest LMS news, 2017) signaled greater attention to teaching and learning:
Participation in the consortial effort of Unizin, coupled with an annual roadmap review of the LMS platform/system,
became a catalyst for developing a vision for the future of teaching and learning at the University of Minnesota. This
prompted a broader study of a set of integrated tools that can grow with the advancement of higher education. [This
effort] highlights a shift in focus from an LMS as the sole system for learning technology, to being only one part
of an integrated and interconnected environment of tools. This environment could provide a learner-centered, open,
and interoperable platform able to support new models of learning, such as competency-based education (CBE) and
personalized learning.

Fig. 2. UMN Learning Analytics slide from a presentation to the Board of Regents (9.16.16).

Focus on the integration of tools in support of teaching and learning was further signaled in an administrative
presentation to the university’s Board of Regents (Attardo & Kallsen, 9.16.16). From this presentation, the slide shown
here in Fig. 2 illustrates the position of Canvas as an analytical tool along with that of a local advising system (known
as APLUS), student academic planning system (Grad Planner), support (Campus Labs), and overall monitoring of
student degree progress. The mark-up boxes in Fig. 2 indicate areas of greatest interest to the university at this time:
course management systems, dashboards, and alert systems. The full slideset signals the pending Moodle to Canvas
transition as it includes multiple snapshots of Canvas analytics dashboards along with lists of the teams involved in
administrative work to support this transition.
The report concludes with the following “bigger questions” and emphasis on faculty participation:

• Ethics – What is the appropriate and ethical use of the data?
• Quality – What is needed to enhance data quality, including training and support resources?
• Research – What value can faculty researchers get from use of Unizin’s consortium de-identified data warehouse?
• Students – How do we use predictive analytics in interacting with students?
• Leadership – What is the University’s leadership structure for driving the future strategic directions in this area?

Interestingly, throughout our examination of these many reports and presentations, we found little focus on “action-
able intelligence,” no discussion of where the data would reside, and no visible efforts “to interrogate the affordances
that online interfaces offer for developing better pedagogical methods” (DePew & Lettner-Rust, 2009, p. 186). Instead,
discussion focused on justification of the LMS given the increased expense associated with continued use of Moodle,
again substantiating Wilson et al.’s (2017) finding that learning analytics implementation at an institution-wide level
stems from “its technical nature and from a need to justify sufficient investment in their development” (p. 991).

Access to and use of Canvas analytics

As part of our focus on access during review of all sites and reports associated with this LMS transition, we located a Canvas Technical Report (Scruton & Epp, 2015-2016) that includes the following details regarding student-, instructor-, and administrator/researcher-facing analytics and reporting (a brief code sketch follows the list):

• Student-Facing Analytics and Reporting
• Students have access to a grades page that reports on all of their scores for the course. This page is also editable to
allow “what if” analysis of the effect of various assignments on the final course grade.
• Instructor-Facing Analytics and Reporting
• The instructor has access to a course-level summary analytics page that provides a histogram of participation and
page views, an overview of assignment submissions (missing, on-time, late), and a set of bar and whiskers plots
for grades. This summary analytics page also provides a table with per-student data for Page Views, Participations,
Submissions, On Time, Late, Missing, and Current Score in the course. . .
• Instructors can also access a student access report that shows Content, Times Viewed, Times Participated, and Last
Viewed on a per student basis. There is also a student interactions report that shows the last interaction with the
student, scores, and ungraded assignments.
• Administrator/Researcher-Facing Analytics and Reporting
• In addition to the above, Administrators can view all page views for a given user, which can be used for forensic
purposes (e.g. whether a student actually accessed a given assignment). There are also a set of administrative
reports (Course Storage, Grade Export, LTI Report, Last Enrollment Activity, Last User Access, Outcome Results,
Provisioning, Public Courses, Recently Deleted Courses, SIS Export, Student Competency, Student Submissions,
Students with no submissions, Unpublished Courses, Unused Courses, User Access Tokens, Zero Activity) that
permit certain configuration data to be exported. . .
• Separately the Canvas Data Portal provides access to detailed logs of activity in the system every 24 hours, with the
last five days’ worth of logs available for access. If this data is to be used for whole-term analytics, these logs would
need to be saved periodically. If this is to be used for actionable interventions by researchers/advisors/teachers based
on analysis, the time interval between batches would need to be shortened.
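As a concrete illustration of how the instructor-facing per-student summary described above might be retrieved outside the dashboard, here is a minimal sketch against the Canvas Analytics API's student_summaries endpoint as we understand it. The base URL, course ID, and token are placeholders, and the same course-level permission that governs viewing analytics in the interface governs whether this call succeeds.

```python
# Minimal sketch: pull per-student page views, participations, and
# late/missing counts for one course. Placeholders throughout; whether
# this works depends on institutional permissions for viewing analytics.
import requests

BASE_URL = "https://canvas.example.edu"  # placeholder institutional Canvas URL
COURSE_ID = "12345"                      # placeholder course ID
TOKEN = "YOUR_API_TOKEN"                 # placeholder personal access token

resp = requests.get(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/analytics/student_summaries",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for student in resp.json():
    tardiness = student.get("tardiness_breakdown", {})
    print(
        student.get("id"),
        "page views:", student.get("page_views"),
        "participations:", student.get("participations"),
        "late:", tardiness.get("late"),
        "missing:", tardiness.get("missing"),
    )
```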

First, note the discrepancies across the above categories in terms of access to analytics. Students have access to
a Grades page and potential “what if” analyses in terms of the impact of future assignments on overall grades. In
addition to grade information, instructors have an overview of student participation that includes summaries of page
views, participation, and status of assignment submissions. In stark contrast is administrator access that, in addition to
the above, includes all student page views, enrollment activity, student competency based on submissions and overall
activity in courses, and complete “detailed logs of activity.”
As part of decades of interest in the study of emerging technologies, one of us (Duin) volunteered to be one of the
90 instructors involved in the pilot testing effort, using early versions of the Canvas LMS in multiple courses over two
years. The Canvas Community site (2019) repeatedly states the following: “Viewing analytics is a course permission.
If you cannot view analytics, your institution has restricted this feature,” so Duin worked to locate the unit that was in
charge of restricting or providing such access. After queries to collegiate, university, and program levels, she eventually
emailed all levels at once, finally gaining access to view her course/student analytics. It was never clear who
made the decision to allow such access to analytics.
At the point of documenting this case study, we located the most recent Canvas site, titled “What are analytics?” (2019), and its “three-pronged approach to creating substantive data for Canvas users”:

• Justification focuses on system reports and how the system is being used.
• Intervention looks to predict at-risk students and how to meet their needs.
• Learning focuses on learning outcomes, the effectiveness of the teaching style, and the division of time between
students achieving competence and those falling behind.

This Canvas site specifically differentiates between Account Analytics and Course Analytics. Account administrators
are tasked with making sure that students, teachers, observers, and/or designers are participating in the courses; seeing
an overview of the term and how users are interacting with overall courses; watching how grade distributions fluctuate
or remain steady; and viewing the “total number of courses, teachers, students, assignments, submissions, discussion
topics and replies, files, and media recording in the account.” We have created Fig. 3 to provide a view of the Account
Analytics hierarchy that illustrates how administrators at the top level account (number 1) can view all analytics in the
top and sub-level accounts; and administrators at the sub-account level (number 2 and below) can view analytics for
their sub-accounts as well as all sub-accounts below them.
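To make the visibility rule behind Fig. 3 concrete, the following is a small, hypothetical model of the Account Analytics hierarchy: an administrator attached to an account can view analytics for that account and for every sub-account beneath it. The names and structure are illustrative only, not Canvas internals.

```python
# Hypothetical model of hierarchical analytics visibility (cf. Fig. 3):
# an account administrator sees every course in their account and below.
from dataclasses import dataclass, field

@dataclass
class Account:
    name: str
    sub_accounts: list = field(default_factory=list)
    courses: list = field(default_factory=list)

def visible_courses(account: Account) -> list:
    """All courses whose analytics an administrator of this account can view."""
    courses = list(account.courses)
    for sub in account.sub_accounts:
        courses.extend(visible_courses(sub))
    return courses

university = Account("University (level 1)", sub_accounts=[
    Account("College (level 2)", sub_accounts=[
        Account("Department (level 3)", courses=["WRIT 4662W"]),
    ]),
])

print(visible_courses(university))  # the top-level admin sees every course below
```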

Fig. 3. View of Account and Course Analytics Hierarchy in the Canvas LMS.

Fig. 4. Canvas Snapshot Dashboard for Instructor Use.

At the bottom of Fig. 3 are courses. In contrast to Account Analytics, Course Analytics such as those included in
Figs. 4–6, 8 and 9 in this article “track and analyze what students, observers, and/or designers are doing within the
course” as a means to “predict how students react to course activities; see which students are at-risk and need help;
view how effective your teaching strategies are in allowing students to learn; and see a quick view of what your students
are achieving in your course.” Account administrators have access to all Course Analytics, and instructors can request
permission for access to Course Analytics. Again, the “What are Analytics?” document includes no mention of Student
access to analytics.
In comparison to previous use of the Moodle LMS, the Canvas system indeed offered Duin an overview of patterns of
participation and student interaction with assignments that prompted changes in her pedagogical practice. For example,
Fig. 5. Two individual student snapshots of Canvas course activity.

Fig. 6. Analytics for a student’s Activity by Date and Communication during the use of Canvas for a Spring 2019 course.
Fig. 8. Instructor vs. Student Course Page View Screenshots in Canvas.

Fig. 4 presents an overall “dashboard” view of student participation from one of her initial pilot uses of Canvas, an
online course on Writing with Digital Technologies (WRIT 4662 W). At the time, the “Activity by Date” indicated
a good deal of participation that, at the time of the Fig. 4 screenshot, was just beginning to lessen. As a result, she
viewed individual student activity, sending specific messages to prompt those students with less participation. The
“Submissions” graph indicated to Duin that a possible pattern of late submissions was occurring, prompting her to
revisit the instructions for the weekly online course forum assignments, and as a result, she revised these to more
clearly indicate expectations for both posting to the forum and responding to other student postings. The “Grades”
graph at this point in the course indicated that the majority of students were meeting expectations for the assignments
to date.
Use of Canvas analytics also allowed Duin to see snapshots of each individual student’s Activity by Date, Commu-
nication, Submissions, and Grades. For example, the two snapshots in Fig. 5 illustrate how Duin was able to identify a
student who needed more guidance and nudging (left column) in contrast to one who was doing well (right column).
For the student in the left column, seeing the variance in “Activity by Date” led Duin to reach out to the student
about this variance in activity and participation in the course. More important, the Communication graph indicated that
the student was not responding to these messages, and this lack of communication along with the lag in submitting
or missing assignments prompted Duin to reach out to the student’s advisor to determine how they might partner in
providing support and direction for the student. In this case, the student’s advisor was surprised that Duin had access
to such analytics, inquiring as to how they, too, might request access to analytics in the Canvas LMS.
The Communication graph in these various dashboards continues to be of value in Duin’s later uses of Canvas.
For example, Fig. 6 includes graphs of a student’s Activity by Date and Communication in an online course offered
recently in Spring 2019. When Duin saw “Page Views Only” and a lack of participation by this student during Week
3 of the course, she reached out to inquire about lack of participation in that week’s online course forum. This
communicative exchange is documented in the lower Communication graph, which also shows a large amount of
continued communication between student and instructor. [The lack of participation during the third week of March
was due to Spring break.]
For students to have access to snapshots of their analytics at the University of Minnesota, instructors still must
request specific permission for such access. The current 2019 Canvas Guide, “How do I view my course analytics as
a student?” provides the following instructions for student access to analytics: 1) Open People (on the course site);
2) Locate and click your name; 3) Click the Analytics button; and 4) View your total grade percentage in the course.
However, these instructions also state: “Viewing Analytics is a course permission. If you cannot view Analytics, your
institution has restricted this feature.”

One of the most persuasive arguments for learning analytics is this notion of personalized learning and the use of dash-
boards such as those in Figs. 4–6 to provide an overview of competence development. An additional argument focuses
on how to differentiate instruction for students. While Duin offers students multiple submission options depending on
the assignment (e.g., websites, video, audio, document files), she currently does not include specialized assignments for individual students, nor does she exempt students from some of the assignments, even though the means for such personalization of learning exist in Canvas. In blog posts on the external Canvas community site, K-12 instructors discuss their ability to
individualize assignments; exempt students from subsequent assignments once a competency is met; extend due dates;
and offer students the option for multiple attempts at quizzes or their choice between assignments (Hainline, 2016).
Under the higher education section on the Canvas community site (https://www.canvaslms.com/higher-education/),
adaptability, customization, and pedagogical flexibility are noted as hallmarks of this LMS. Therefore, we expect that
many of the local tools in place at our institution (e.g., those listed in Fig. 2) eventually will be integrated with Canvas.
The Canvas LMS now in full use at UMN includes “An introduction to Canvas course analytics” (2019), where the analytics tutorial continues to focus most on grade comparisons, prompting instructors to consider these questions for exploration (a brief sketch of such comparisons follows the list):

• Which assignment did students struggle with the most (lowest average grade)?
• Which exam had the lowest/highest average grade?
• Compare the grades of the two sections in this course. Which section is performing better/worse?
• How can I identify the student with the best/worst performance in the course?
• How can I compare the grades of two students (and what would that tell me about them)?
• What is the grade distribution for a particular assignment or exam?
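As an illustration of how such grade comparisons might be computed outside the dashboard, here is a minimal sketch that assumes a hypothetical gradebook export (gradebook_export.csv) with one row per student and one numeric column per assignment; the file name and column layout are our assumptions, not the Canvas export format.

```python
# Minimal sketch: answer two of the tutorial's grade-comparison questions
# from a hypothetical gradebook CSV (one row per student, one column per assignment).
import pandas as pd

grades = pd.read_csv("gradebook_export.csv")
assignment_cols = [c for c in grades.columns if c.startswith("Assignment")]

# Which assignment did students struggle with the most (lowest average grade)?
means = grades[assignment_cols].mean().sort_values()
print("Lowest average:", means.index[0], round(means.iloc[0], 1))

# What is the grade distribution for a particular assignment?
print(grades["Assignment 3"].describe())
```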

As Wilson et al. (2017) noted, this focus on grades highlights “a potential tension between personalized, individual-
ized learning and the collaborative, socially emergent view of knowledge underpinning both well-established learning
theories such as social constructivism and its offshoots in communities of practice. . . and in more recent ‘learning
2.0’ theories such as networked learning. . .and connectivism” (p. 993). So, as LMS analytical capabilities continue to
expand, the collective voices, views, and scholarship of those in computers and composition are needed as a means to focus on our pedagogical goals of active learning, socio-cognitive development, process-based composing, student empowerment, and digital literacy.
Given this need for expanded analytical capability, during both Summers 2017 and 2018, Duin conducted sets of
interviews with the University’s Director of Undergraduate Analytics, the Academic Technology Systems Analyst in
charge of analytics, and the administrator in charge of the University’s Learning Analytics Community of Practice:

• The Director of Undergraduate Analytics is responsible for gathering, analyzing, and disseminating information to
support evidence-based decision-making, strategic enrollment management, and student success. He shared that the
University’s focus is on a system-wide Student Success Analytics project, with the overall goal of increasing student
retention by using analytics to locate and pinpoint issues in student course taking patterns and movement between
programs. Thus, these interviews indicated continued focus at the Account level of analytics.
• The Academic Technology Systems Analyst in charge of analytics shared that the focus related to analytics is also
largely at the program or Account level and above and that any focus on analytics in classes will be on building sets
of tools that give faculty access to course information. He shared that “Unizin concluded that until we get reliable
data in the aggregate, they won’t be successful. So Unizin moved to build an underlying platform to manage and
aggregate data in real time from tools that use common standards. Therefore, the focus this past year has been on
building out a comprehensive data platform as a way to normalize data across systems to then draw insights from
the full student use. So, any focus on learning analysis (course level) is in a holding pattern.”
• And the administrator in charge of the University’s Learning Analytics Community of Practice stated most succinctly
that “Learning analytics is limited to the affordances in Canvas.”

With additional searching on the external Canvas community site, Duin discovered the page titled “Priority: New
Canvas Analytics” (Priority: New Canvas analytics, 2019) along with indication that insight from student interaction
with course material may be in progress or possibly slated for future Unizin development (see Fig. 7).

Fig. 7. New Canvas Analytics (Accessed 5 February 2019).

However, given that “learning analytics [at UMN] is limited to the affordances in Canvas,” we next share co-author
Tham’s specific work.

Canvas use and student LMS activity

While Duin’s experience highlights access to LMS analytics, co-author Tham is interested in how Canvas allows
instructors to customize the presentation of the course, which in turn might affect students’ activity on the LMS. As an
early adopter of Canvas during the UMN transition, as well as an online adjunct instructor for another institution that
had just started using Canvas at the time of this writing, Tham has the advantage of comparing how Canvas is set up
at two institutions, what’s common and what’s different, and what this means to the instructor and students. Below he
presents some observations.
Canvas affords instructors control over the student view of course elements. Below, Fig. 8 shows a comparison
of two Canvas courses, one at UMN (maroon colored panels, labeled A & B) and one at Tham’s adjunct institution
(gray colored panels, C & D). Panels A and C show the instructor view of the central menu on Canvas, while panels
B and D show the student view. Notice panels A and B take a more minimalist approach to presenting the LMS menu
compared to C and D. We can see the menu items that are available to the instructor but hidden from the student view:
grades, pages, quizzes, announcements, people, etc. In comparison, the Canvas for the course represented by panels C
and D provides more menu items to the student viewer. The deliberate (un)showing of available features on the Canvas
LMS gives the course instructor control over how they would like students to interact with the LMS, including how much autonomy students are afforded.
One could argue that the presentation of the LMS shown through the student view in panel B limits how much
students can access their ongoing learning performance, including their grades and information about other classmates
(as both the grades and people tabs are hidden from student view). Panel D, in contrast, shows that students are given
access to many course-related components on the LMS, but could easily be overwhelmed by unessential items like
conferences, chat, class notebook, Barnes & Noble, etc.
The visibility of LMS features reflects the power relations between the instructor and students—indicating where
authority and subjectivity reside. It is important to highlight what is hidden from students on an LMS to reveal
such dynamic. Another Canvas analytics feature that students do not have access to is the individual performance
“quick view” for instructors. The items included in these Canvas Quick View panels (see Fig. 9) are standard across
institutional Canvas setup: (from top to bottom of the Snapshot panel) student profile, course information, a button
to the student’s current grades and full academic analytics, the student’s grade in weighted percentage, numbers of
missing and late assignment submissions, the student’s most recent grades (up to last 10 graded items), and rated
performance (“Activity Compared to Class”) in terms of participation (attendance, forum discussions) and LMS page
views (1 = low, 2 = moderate, 3 = high).
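To gather the Quick View fields just listed into one place, here is a hypothetical data structure that mirrors them; the field names and the 1-3 activity rating scale follow this section's description rather than any published Canvas schema, and the sample values echo the Student A figures discussed below.

```python
# Hypothetical mirror of the Quick View snapshot fields described above;
# not a Canvas schema, just a summary of what the panel displays.
from dataclasses import dataclass

@dataclass
class QuickViewSnapshot:
    student_name: str
    course: str
    current_grade_pct: float   # grade as a weighted percentage
    missing_submissions: int
    late_submissions: int
    recent_grades: list        # up to the last 10 graded items
    participation_rating: int  # 1 = low, 2 = moderate, 3 = high
    page_view_rating: int      # 1 = low, 2 = moderate, 3 = high

student_a = QuickViewSnapshot(
    "Student A", "Example Writing Course", 94.19,
    missing_submissions=0, late_submissions=1,
    recent_grades=[95, 92, 98],
    participation_rating=3, page_view_rating=1,
)
print(student_a)
```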
While these snapshots of student performance are quantitatively organized and provide a summary of a student’s involvement in the course (both online and onsite), these data convey only the “what” of student learning behaviors, but not the “why” of these behaviors. For instance, in Fig. 9, the instructor could see that two students from the same
Fig. 9. Individual Student Learning Analytics Quick View Panels.

course (A & B) have moderate to high “participation” in the course, albeit not specific to online or offline participation.
The Quick View for Student A shows that this student has low “page views” but still achieved a higher grade percentage
(94.19%) than Student B (87.74%) even though Student B’s “page views” was rated high by Canvas. Similarly, in
another course, Student C has a higher grade percentage (97.97%) than Student D (89.29%) even though Student D
has high ratings for both “participation” and “page views.” Student C only has low to moderate activity.
With Tham’s Canvas experience, we arrive at more questions: What does the availability of these data mean for
writing instructors? How should instructors be interpreting and using them? In Tham’s courses, he has consulted the
Quick View feature when asked to report students’ midterm performances. For example, the student athlete advising
centers both at UMN and Tham’s adjunct institution require instructors to provide midterm alerts and sometimes
additional comments that reflect the athlete’s academic performance as part of their requirement to stay on a sports
team. These Quick View analytics give Tham an overview of a student’s assignment submission behavior (whether
they are done on time or late) and the student’s current grade based on completed assignments. Certainly, this is
information an instructor could also retrieve from the LMS gradebook; the Quick View panels let the instructor access
the aggregated results more quickly. This feature, as justified by the “three-pronged approach” (“What are Analytics”)
we reviewed earlier in this article, aims to provide early alerts to both the instructor and student so they can work
to avoid failures in meeting the course learning outcomes. The goal is to intervene before it is too late. However, as
mentioned previously, these learning analytics do not truly represent a student’s authentic engagement with the course.
They paint a certain portrait based on the student’s work that is made visible through assignments, but do not reveal any
cognitive, social, or other literacy developments that are often invisible in learning analytics. Are the particular students
self-motivated? Why are certain students reluctant to engage with online discussions and collaborative exercises? What
about their cultural personality? What is the socio-economic status of these students? What is their technological literacy
level? The learning analytics here do not provide answers to these concerns.

Implications of learning analytics used in LMSs for writing studies

Wilson et al. (2017) emphasized that “Learning analytics are only likely to effectively and reliably enhance learning
outcomes if they are designed to measure and track signals that are genuine indicators of or proxies for learning. To do
this, they need to be grounded in robust and clearly articulated theories of learning” (p. 996). Engagement in learning
is linked to student success, so minimally, the student snapshots shared in Fig. 5 provide clear indication of the need
Fig. 10. An example of the use of the Canvas grading rubric.

to visit with the student whose work is summarized on the left side of this figure. Although we cannot yet view the
specific routes that students travel through our course resources, we can get an overview of the number of pages visited,
the time spent on them, and participation in discussions and assignments.
We also gain insight through access to student comments as part of their peer reviews. Fig. 10 illustrates the use
of a Canvas grading rubric for development of a static website assignment as part of a course we both have taught,
Writing with Digital Technologies. This includes one student’s comments and assignment of points as part of a peer
review process. The analytics available to instructors again include an overview of the number of students who have
completed such a peer review component of an assignment.
When set up as a group activity, students can create a shared project and/or view the works of their assigned
group members. An LMS can be a medium for the exchange of ideas and collaborative learning. Instructors may gain
insight into student participation through students’ individual contributions and through group dynamics (e.g., group
communication and material sharing). However, it is not always made known to students that their group interactions
are immediately accessible by the instructor (in the case of group projects), included in reports accessible by those
with “Account” status, and possibly aggregated at the level of consortial work as part of Unizin. In short, we ask, what
should computers and composition scholars consider regarding surveillance, privacy, and security?

Surveillance, privacy, and security

As instructors using these learning management systems, we know that we have analytical access “that exceeds
our expectations or those of students” (Hawisher & Selfe, 1991, p. 63). Estee Beck (2016) chronicled the history and
practice of surveillance and privacy in our writing classrooms, noting how Joseph Janangelo (1991) warned
instructors as early as 1991 about abuses of power and control in the CMS (course management system) writing space.
Beck noted the work of later scholars who have warned us about the surveillance of plagiarism-detection systems
(Zwagerman, 2008) and digital archives of student papers (Purdy, 2009), tracking and data-mining practices (Beck,
2016), surveillance of big data connected to writing portfolios (Beck, 2016), and the lengthy privacy policies seldom
read (Vie, 2014). Beck (2015) also provided suggestions for helping instructors and students to develop awareness
surrounding privacy and surveillance as well as practices and techniques to support the development of digital literacy,
and we highlight five of her guiding questions here:

1 How will my teaching of tracking technologies and online behavioral advertising support university, departmental,
and my learning outcomes of the course?
2 What tools will I use to show students about tracking technologies?
3 How am I providing safe spaces for students when they go online to various websites that use tracking technologies?
4 What impact will occur when students learn about their invisible digital identities? Will students feel comfortable
sharing the details of their “hidden” life with others? How might I provide alternatives for people who may not feel
comfortable sharing what has been branded about them?
5 Will my department support my discussions about invisible digital identity in my classroom?

Both students and instructors benefit from Beck’s suggestions to summarize and analyze privacy policy statements, to
examine data usage statements, and in this case, to examine together with students their Canvas Snapshot dashboards.
As Brian Blake (2015) noted, instructors should pay attention to data accuracy, ethical use, and safe collaboration
within these systems. When compiled and analyzed, these LMS data can reveal personal information about students
and instructors, e.g., when and how we work and what we value as important. At this juncture, we do not fully understand
the types and volume of information gathered about us or implications of its use in decision making throughout the
University.
There is no opting out of data collection. Perhaps most paramount, as part of Unizin, our University’s agreement
includes the compilation and sharing of aggregate data across all courses and institutions as a means to better understand
learning and improve student success. In short, while the analytics dashboard that Canvas renders may well offer a
better means for students to attend to personal progress, and instructors may use this analytics tool to better monitor
student attention to resources and assignments, we do not fully understand the implications of its use by the Unizin
consortium.
Numerous studies emphasize issues of segregation, privacy, and security (e.g., Pardo & Siemens, 2014; Tene &
Polonetsky, 2013) as well as codes of practice for both academic and learning analytics (Sclater, 2014), and matrices
of strategies and choices for understanding ethical concerns in the design, application, and documentation of learning
analytics are being developed. As one example, Jenni Swenson (2015) surveyed the ethical concerns of learning
analytics, finding that “the inability of students to provide input into the learning analytics process was the concern
most often revealed, followed by a lack of context for interpreting the data by both institutional users and students,
and the potential inaccuracies in the predictive model caused by inaccurate or incomplete data.” Secondary concerns
included “an undefined institutional responsibility to act on data, which could put the institution at risk for legal action,
as well as the possibility for discrimination to occur during the learning analytics process.” And concerns identified less
frequently included “the potential for students to become objectified (student viewed as data), the lack of an opt-out
option for students, the potential for de-anonymizing the student as at-risk, and the failure to develop and communicate
principles and policies college-wide.” The final concerns identified included “inadequate user training (for both students
and institutional users), the potential for differential access, and a lack of a vision or mission statement, or code of
ethics, created and communicated by the institution” (p. iii).
Kyle Jones, John Thomson, and Kimberly Arnold (2014), leaders of academic analytics initiatives in the U.S., further stressed that "technological progress and social enthusiasm for data analytics continues to outpace these concerns." In their specific focus on student data ownership, they contended that "Higher education must mind the gap: these problems are political, legal, and social landmines for institutions that continue to push forward without addressing them."

Table 1
The seven surveillance strips mapped to the LMS.

1. Tool selection: University selection of the LMS
2. Subject selection: Student selection of the course using the LMS
3. Data selection: Instructor development of assignments and modules; student completion of the assignments and modules
4. Data processing/analysis: Instructor analysis and evaluation of the assignments
5. Data interpretation: Student and instructor interpretation of the data
6. Data uses/action: Instructor use of the data in overall evaluation of student work
7. Data fate: Largely unknown; assumed to "reside" in the LMS

Gary Marx (2016), in Windows into the Soul: Surveillance and Society in an Age of High Technology, stressed
the importance of understanding "the softening of surveillance," meaning that it "becomes less visible and directly
coercive, often being engineered into an activity" (p. 114). He noted that "the new soft surveillance has several aspects:
minimal visibility and invasiveness as well as passive, often automated data collection. . . The trend is toward techniques
that do not require consent or even awareness" (p. 117). In the case of an LMS, students are not given the option to
consent to its use; the LMS is treated much like an assigned textbook. However, automated data collection is pervasive
throughout its design and use. Indeed, failure to "comply" with the use of an LMS would likely result in failure of a
class.
Most of Marx’s examples relate to data gathered from mobile devices, web sites, credit card use, vehicle registrations,
employment history, travel records, etc. In these cases, “Surveillance agents often use tools that don’t require active
subject cooperation. They justify their data collection by claiming that subjects ‘volunteer’ their data by walking
or driving on public streets; entering a shopping mall; failing to hide their faces, wear gloves, and encrypt their
communication; or choosing to use a phone, computer, or credit card" (2016, p. 127). Instructors regularly gather data
from the LMS site, using it to inform ongoing teaching and grading practices. Instructors justify this data collection as
an integral part of pedagogy.
Marx further emphasizes that surveillance includes a series of actions that encompass seven surveillance strips.
In Table 1, we include each surveillance strip along with its possible companion element in the LMS space. In this
case, the University selects the LMS; the student selects the subject or course; the instructor develops the assignments
and collections, resulting in selected data; and the instructor analyzes these assignments (data), using them in overall
evaluation. What remains largely unknown is the fate of these data.
Seen together, these seven strips provide a full surveillance story of the LMS. They illustrate “the emergent character
of ‘surveillance’ as a multifaceted concept not only one with various dimensions but also one that is more fully seen
when followed over time" (p. 138). Marx's seven surveillance strips provide us with one lens for considering the current
state of analytics in writing pedagogy.

Pedagogical implications

Students compose, conduct peer reviews, and submit final assignments within LMSs. While these “texts” are not
public, they certainly are not private, nor are they protected. To date, few, if any, guiding principles and policies exist
for implementing academic and learning analytics.
Francisco de Arriba-Perez, Manuel Caeiro-Rodríguez & Juan M. Santos-Gago (2016) wrote about the importance
of self-quantification in which the values and indicators are “shown to the student (and teacher) on a specific dashboard.
Daily, weekly and monthly summaries together with comparisons with other students will be provided. This can be
used as a tool to promote self-reflection and self-regulation. . . It will enable the user to be more aware of his own
features and patterns” (p.14). Likewise, the Canvas Snapshot component provides a dashboard that an instructor can
now choose to be “open” for students to view both their individual summaries as well as how they compare to other
students in the course. While we have not yet located a similar dashboard system for instructor comparisons, one is
surely planned or already exists somewhere.
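A rough sketch of this self-quantification idea follows. The event data and identifiers are invented for illustration; a dashboard such as Canvas Snapshot computes richer measures than these, but the underlying logic of weekly summaries and course comparisons is similar.

```python
# Sketch of weekly self-quantification: one student's activity counts per week
# compared with the course average. Event tuples are hypothetical, not LMS output.
from statistics import mean

# (student_id, ISO week number) pairs, e.g., derived from submission or page-view logs
events = [("s1", 36), ("s1", 36), ("s2", 36), ("s1", 37), ("s2", 37), ("s2", 37), ("s3", 37)]

def weekly_summary(events, student_id):
    """Return (this student's weekly counts, course-average weekly counts)."""
    per_student = {}
    for sid, week in events:
        per_student.setdefault(sid, {}).setdefault(week, 0)
        per_student[sid][week] += 1
    weeks = sorted({week for _, week in events})
    course_avg = {w: mean(per_student[sid].get(w, 0) for sid in per_student) for w in weeks}
    return per_student.get(student_id, {}), course_avg

mine, avg = weekly_summary(events, "s1")
for week in sorted(avg):
    print(f"Week {week}: you {mine.get(week, 0)}, course average {avg[week]:.1f}")
```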

Table 2
Summary of hallmark pedagogical goals in writing studies, LMS tools/features, and their implications for writing pedagogy.

Active learning. LMS tools/features: analytics (student interface), chat, discussion forums, collaboration tools, project groups, learning outcomes, instructor conferences. Implications for pedagogy: enhance instructor ethos by appealing to transparency in course design; let students assume autonomy in practical problem-solving tasks.

Socio-cognitive development. LMS tools/features: discussion forums, collaboration tools, project groups, learning outcomes, assignments. Implications for pedagogy: increase learning engagement through social interactions; focus on collective achievement vs. individual accomplishments.

Process-based composing. LMS tools/features: project groups, discussion forums, assignments, modules, collaboration tools. Implications for pedagogy: allow scaffolding in learning and composing; treat collaboration as a natural part of the composing process.

Student empowerment. LMS tools/features: analytics (student interface), project groups, discussion forums, collaboration tools, conferences, chat, assignments. Implications for pedagogy: enact agency through students' access to their own progress/accomplishments; favor student-organized efforts and solutions.

Digital literacy and multiliteracies. LMS tools/features: (multi)media curation, collaboration tools, analytics (student interface), assignments. Implications for pedagogy: emphasize different modes and means for composing and presentation; increase data literacy.

Nevertheless, learning and academic analytics indeed affect instructor pedagogy. Current LMS tools and
features––such as those made available through Canvas––impact the ways instructors teach and deliver writing instruction
(see Table 2). As writing instructors continue to design active learning environments for students to learn and
practice writing, LMSs allow students to view their real-time progress, interact with other learners, and assess their own
work against the learning outcomes designated by their instructors or the department. LMSs afford greater teacherly
ethos by showing students instructor involvement and transparency in evaluating student work. Instructors may also be
able to let their students assume more autonomy in solving practical problems/tasks due to the accessible collaboration
tools and interactive features available on some LMSs.
Features like discussion forums, project groups, and cloud-based collaborative authoring tools on LMSs promote
socio-cognitive development in students and foster process-based composing. These features create social learning
spaces that allow students to locate shared topics of investigation, co-define and ideate solutions, and produce artifacts
representing their collaborative work. With appropriate facilitation, these features increase engagement in learning,
providing greater focus on collective achievement instead of individual accomplishments. In both of our pedagogical
experiences, we have assigned interdependent learning exercises where students collaborate to achieve the goals of an
assignment. For instance, we have recently completed a collaborative project with the University of Ontario Institute
of Technology’s cultural analytics database, Fabric of Digital Life (https://fabricofdigitallife.com/), where students
curated specific collections of technological artifacts and built knowledge archives for public consumption. In this
instance, students utilized LMS forums to discuss their desired directions, managed workflow through project groups
on Canvas, and used collaborative authoring tools like Google Docs to compose collective introductions to these
archives (see an example here: https://z.umn.edu/fabric29). Working together to learn as a cohort helps students see
collaboration as a natural part of the composing process. Instructors may leverage these tools to provide more appropriate
scaffolding in learning and composing.
With the available, albeit student-facing, learning analytics and access to the instructor and other students in the
learning community, students may access their own progress through quantified means, including their course engage-
ment level, quality of writing, and degree of contribution. This would empower students to enact agency for their own
learning as well as their peers’. For instructors, the LMS features may promote student-organized efforts such as unique
discussion topics and new collaborations. Certainly, different institutions have different privileges and challenges. We
do not suggest that students across institutions would enjoy the same access to their learning analytics. This is where
Writing Program Administrators (WPAs) and instructors should advocate on behalf of their students to higher admin-
istration so that students are allowed to see their own analytics. And when students do access these analytics, they
should be trained to interpret and use these data as a launchpad to further their academic endeavors.

Last, LMS tools grant students the opportunity to develop their digital literacy and what Selber (2004) called "multiliteracies."
With the available (multi)media curation/submission and viewing features, students may experience different
modalities in writing. Instructors can take this opportunity to highlight the affordances and constraints of these
modalities, encouraging students to consider the rhetorical purposes and goals of their chosen modes. Because students
have access to their learning analytics, this would also be an opportunity to develop data literacy in students. Writing
instructors may teach students to consider the usefulness of data and how to measure and present their work through
analytical means.
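As one small data literacy exercise of this kind, students might quantify their own drafting process before deciding how, or whether, to present the numbers; the file names below are placeholders, not an assignment requirement.

```python
# Sketch of a student-facing data literacy exercise: measure your own drafts and
# decide rhetorically how to present the growth. File names are placeholders.
from pathlib import Path

drafts = ["draft1.txt", "draft2.txt", "final.txt"]

def draft_word_counts(paths):
    """Return (file name, word count) for each draft so students can narrate revision."""
    return [(p, len(Path(p).read_text(encoding="utf-8").split())) for p in paths]

for name, words in draft_word_counts(drafts):
    print(f"{name}: {words} words")
```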
As shown in Table 2, LMS features and tools––when used effectively and ethically––can support and promote
the hallmark goals in writing pedagogy. Student-facing analytics can help facilitate active learning while increasing
instructor ethos through transparency. Visualized data about student engagement in social learning spaces like discussion
forums and collaborative projects can promote socio-cognitive development in students, particularly when instructors
use these to encourage collective learning achievements. Contextually quantified student behaviors can help instructors
understand students’ writing processes and use scaffolding methods to enhance writerly growth. Accessible analytics
and social learning tools in LMS can also enact student agency and empower students to self-manage and monitor their
academic progress. By continuous use and examination of multimedia features and data tools on the LMS, students
may also increase their digital and data literacies.

Looking ahead: Heuristics for working with academic and learning analytics

An institution’s use of an LMS is likely to remain for the foreseeable future. Administrators will continue to
capitalize on business intelligence, learning and academic analytics to devise institutional plans and strategic directions.
To channel Cindy Selfe (1999), writing instructors and scholars should therefore pay attention to the increasing use
and development of big data in the academy. As analytics move from being descriptive to predictive in academia,
we predict an “assistive” future where student performance results in personalized recommendations for particular
goals or academic achievements. To arrive at such a future, writing instructors must collaborate with instructional
designers and analytics specialists to learn and use academic data to their optimal potential. Instructors should not
treat academic technologists as merely tech support personnel but rather as partners, or even design experts, who
will help put together effective courses. Working directly with analytics specialists, writing instructors can locate and
analyze data made available through the LMS.
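Where institutional policy and permissions allow, such a collaboration might start from the Canvas REST API. The sketch below is an assumption-laden illustration: the URL, token, and course id are placeholders, it presumes an instructor-generated access token, and the availability of course analytics endpoints varies by institution and Canvas configuration.

```python
# Hedged sketch of pulling per-student course analytics from a Canvas instance.
# BASE_URL, TOKEN, and COURSE_ID are placeholders; the analytics endpoint is assumed
# to be enabled, which depends on the institution's Canvas configuration.
import requests

BASE_URL = "https://canvas.example.edu"
TOKEN = "REPLACE_WITH_ACCESS_TOKEN"   # generated under Account > Settings in Canvas
COURSE_ID = 12345

def student_summaries(course_id):
    """Fetch participation and page-view summaries for each student, if enabled."""
    url = f"{BASE_URL}/api/v1/courses/{course_id}/analytics/student_summaries"
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for s in student_summaries(COURSE_ID):
        print(s.get("id"), s.get("page_views"), s.get("participations"))
```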
It would be even more desirable if writing instructors attained basic data and analytics literacy to study and act
upon their students' performances. This would allow instructors to use identifiable data to create personalized solutions
for individual students. Instructors should cultivate a sense of curiosity––not be afraid of the volume of data presented
by the LMS but instead seek to leverage the data as evidence of student progress.
As writing researchers have advocated, data are to be interpreted and used sensibly. WPAs and instructors must
learn to use available data to better understand student engagement and not penalize those who are disadvantaged.
Account administrators should use data to triangulate evaluation of student work, in addition to instructors' direct
observation and involvement in student development. Instructors should use course analytics to inform feedback to
students and to help students envision how they might participate in a course. It goes without saying that instructors must
never treat analytics simply as reasons for manipulation––or to create change for change's sake––but rather as examples
of evidence-based pedagogical practices.
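One modest, supportive use of such data is flagging unusually low recent activity as a prompt for a check-in rather than a penalty. The sketch below uses invented counts and an arbitrary threshold; it is a pattern to adapt, not a policy.

```python
# Sketch: treat low engagement counts as a reason to reach out, not to deduct points.
# Counts and the threshold are invented; real judgments need context the numbers lack.
recent_activity = {"alex": 14, "sam": 2, "ren": 0, "kai": 9}  # hypothetical weekly page views

LOW_ACTIVITY = 3  # arbitrary cut-off for "worth a check-in," not a grading criterion

def outreach_list(activity, threshold=LOW_ACTIVITY):
    """Students to contact with a supportive note rather than a grade consequence."""
    return sorted(name for name, count in activity.items() if count < threshold)

for name in outreach_list(recent_activity):
    print(f"Check in with {name}: low LMS activity may mean illness, working offline, "
          f"or access barriers; ask before assuming disengagement.")
```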
Last, the field as a whole needs to create a sustainable, ethical framework for programmatic development
using analytics. As we have demonstrated through mapping of the “surveillance strips” on LMS (in Table 1), more
work needs to be done to ensure ethical administration, use, and study of analytics. Writing studies scholars should
form collectives or workgroups at national and institutional levels to provide oversight into legal and ethical analytics
practices so that instructors have a reliable set of tools and guidelines for implementing analytics in their courses and
programs. Wolfgang Greller and Hendrik Drachsler (2012) provided an exemplary approach to starting this exercise.
They proposed a framework (see Fig. 11) that focuses on six critical dimensions where "each of the six fields of
attention is required to have at least one instantiation present in a fully formulated learning analytics design" (p. 44).
The framework considers the goals of different stakeholders, the instruments used to achieve different objectives and
produce data, as well as the internal and external limitations in the process.

Fig. 11. Greller & Drachsler's (2012) critical dimensions of learning analytics.
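To show how the rule that each dimension needs at least one instantiation might be operationalized, here is a small sketch; the dimension names follow our paraphrase of Greller and Drachsler rather than their exact labels, and the example course design is hypothetical.

```python
# Sketch: a learning analytics design expressed across six dimensions, with a check
# that every dimension has at least one instantiation. Dimension names paraphrase
# Greller and Drachsler (2012); the example course design is hypothetical.
from dataclasses import dataclass, fields

@dataclass
class AnalyticsDesign:
    stakeholders: str
    objectives: str
    data: str
    instruments: str
    external_constraints: str
    internal_limitations: str

def is_fully_formulated(design: AnalyticsDesign) -> bool:
    """True only if every dimension has a non-empty instantiation."""
    return all(getattr(design, f.name).strip() for f in fields(design))

example = AnalyticsDesign(
    stakeholders="students, instructors, WPA",
    objectives="reflection on drafting and peer-review participation",
    data="discussion posts and submission timestamps (no grades)",
    instruments="LMS course analytics dashboard",
    external_constraints="FERPA and institutional data policy",
    internal_limitations="instructor data literacy; no opt-out mechanism yet",
)
print(is_fully_formulated(example))  # True
```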
Computers and composition needs scholars to lead similar projects in order to create rhetorically grounded and
relevant frameworks to deploy in writing programs. To begin this movement, we offer an initial matrix (see Table 3)
for addressing ethical concerns in the design, application, and documentation of learning analytics in writing pedagogy, using Swenson's (2015) framework for understanding ethical concerns surrounding big data and learning analytics in higher education. We consider this matrix an invitation for writing scholars and instructors to look into their LMS and learning analytics practices. We hope that this framework will serve as a heuristic for further examination and development.

Table 3
An initial framework for addressing ethical concerns in the design, application, and documentation of learning analytics in writing pedagogy. Adapted from Swenson (2015) and Swenson and Duin (2018a, 2018b).

Design: To ensure that users understand the rhetorical aspects of visualizations in terms of unequal social power, lack of context to interpret data, and discriminatory aspects of learning analytics (requires goodwill and sensitivity).
- LMS course statistics. Ethical concern: institutional users (faculty and staff) are unaware of data used in the predictive model and therefore do not have "context" to interpret data. Response: conduct professional development sessions to inform faculty and staff about the purpose, methods, and rhetorical aspects of analytics design.
- LMS user profiles. Ethical concerns: students risk becoming objectified; potential profiling of students based on their images, labels, categories, etc. Response: inform students of the applications and goals of learning analytics, and how their profiles are used/viewed by instructors and administration; train instructors to read student profiles rhetorically.

Application: Ensure that processes are in place to acknowledge student voice, to provide adequate services, and to conduct adequate training in order to implement learning analytics accurately with the motivation of increasing student success (requires practical skills and motivation).
- Student analytics snapshot panels, gradebook, progress bar. Ethical concern: potential for revealing student status (by student or institution) beyond "need to know" personnel. Response: hire analytics advisors to provide recommendations and demonstrations of best practices while handling student data.
- Page views, forum engagement data. Ethical concern: inaccurate or imprecise data used in the predictive model. Response: train instructors to use learning analytics optimally.
- Student view (LMS interface for students). Ethical concern: the institution does not give students an opportunity to question or correct data used in the predictive model. Response: conduct open discussion in class (and elsewhere) to allow student input and suggestions for use of analytics in courses.

Documentation: Agreement involves developing sound policies and procedures for learning analytics processes, and establishing a mission, vision, and code of ethics to serve as an infrastructure for conducting learning analytics campus-wide, thereby allowing students to engage with transparency while protecting student privacy (requires practical wisdom and character).
- Missing documentation training for instructors. Ethical concern: lack of institutional best practices for documenting and sharing analytics. Response: conduct professional development sessions to train instructors to document and share learning analytics.
- Missing student forum for input and communication. Ethical concern: the student is unaware of data collected and ways they are used to inform curriculum design. Response: provide written and accessible documentation of data collection and use in curriculum design; create forums to collect student input on the use of learning analytics.
- Automatic enrollment. Ethical concern: students are not given an opportunity to opt in or opt out. Response: provide options for students to opt in or opt out of course analytics.
In designing the study and implementation of learning analytics in writing programs, administrators and instructors must
pay attention to any unequal power dynamics and discriminatory aspects of data collection and analysis. For example,
in designing LMS course statistics, a lack of contextual understanding of learning data (e.g., forum
engagement, student attendance, etc.) by faculty and staff may lead to incorrect predictions of student performance. In
response, WPAs and university leadership should provide professional development training to institutional users about
the purpose, methods, and rhetorical aspects of analytics design. When applying learning analytics in programmatic
decisions, administrators and departmental leadership must ensure that student voices are included in the implemen-
tation process. For instance, in using the “Student View” feature to monitor students’ learning progress and make
predictions, instructors should give students an adequate platform to question or correct how their data are used in the
predictive model, and to offer suggestions for more ethical use. When creating documentation of learning analytics,
administrators and faculty are responsible for developing sound policies and protocols that ensure a shared understanding
of mission, establish a code of ethics to guide data management, and maintain transparency while protecting student privacy
throughout the process. A lack of ethical operational procedures can lead to legal complications as well as diminished
trust from students and faculty alike. Administrators are responsible for training instructors in documenting and
sharing learning analytics, creating written and accessible documentation for data use, and encouraging students to
participate in discussions on learning analytics use and implementation.
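A lightweight starting point for such documentation is a per-course data-use record published alongside the syllabus. The field names and values below are illustrative, in the spirit of Table 3's Documentation row, not an institutional standard.

```python
# Sketch of a per-course data-use record: what is collected, why, with whom it is
# shared, and who has opted out. Field names and values are illustrative only.
import json

data_use_record = {
    "course": "Introductory writing course (hypothetical section)",
    "data_collected": ["page views", "discussion post counts", "submission timestamps"],
    "purpose": "formative feedback and course design; not used for grading",
    "shared_with": ["instructor", "students (each student sees their own data)"],
    "retention": "deleted at end of term, per the stated course policy",
    "opt_outs": [],  # student identifiers would be recorded here, handled confidentially
}

# Publishing this record with the syllabus makes collection visible and contestable.
print(json.dumps(data_use_record, indent=2))
```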

Conclusion

The learning management systems that writing instructors use are often considered simply tools of the trade for teaching
and learning. Yet LMSs can provide significant service to individuals and institutions alike given their connectivity to
enterprise systems for managing wellness, registration, and ongoing tracking of students. In using these devices and
systems, writing instructors and students must increase their understanding of the types and volume of information
gathered as well as how it is used in decision making throughout their institutions. Through this illustrative case
study, we demonstrate how these systems are affecting writing instruction and assessment. We provide insights on
how learning analytics contribute to and inform our pedagogical approaches. Using Swenson's (2015) model, we
recommend critical uses of the LMS in writing pedagogy via a matrix for addressing ethical concerns in
the design, application, and documentation of learning analytics. It is time to understand the types
and volume of information being gathered and to become knowledgeable and active in determining the implications of its
use in decision making at the course, program, institutional, national, and international levels.
Ann Hill Duin is a Professor of Writing Studies at the University of Minnesota––Twin Cities where her research and teaching focus on digital literacy,
analytics, and collaboration. Besides Computers and Composition, her most recent scholarship appears in Communication Design Quarterly, IEEE
Transactions on Professional Communication, and the International Journal of Sociotechnology Knowledge Development. She is co-founder of the
Emerging Technologies Research Collaboratory, an environment where participants investigate the impact of emerging technologies on personal
lives, professional futures, and pedagogical innovation.

Jason Chew Kit Tham is an Assistant Professor of Technical Communication and Rhetoric at Texas Tech University. He studies emerging approaches
such as design thinking and multimodal literacies in technical communication pedagogy. Besides Computers and Composition, Jason’s scholarship
has appeared in Technical Communication, Technical Communication Quarterly, and Journal of Technical Writing and Communication.
22 A.H. Duin and J. Tham / Computers and Composition 55 (2020) 102544

References

Alexander, B., Ashford-Rowe, K., Barajas-Murphy, N., Dobbin, G., Knott, J., McCormack, M., et al. (2019). EDUCAUSE
Horizon Report: 2019 Higher Education Edition Retrieved from https://library.educause.edu/-/media/files/library/2019/
4/2019horizonreport.pdf?la=en&hash=C8E8D444AF372E705FA1BF9D4FF0DD4CC6F0FDD1 Accessed 25 April 2019.
An introduction to Canvas course analytics. (2019). University of Minnesota. Retrieved from https://canvas.umn.edu/courses/108765
Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. ACM International Conference
Proceeding Series, http://dx.doi.org/10.1145/2330601.2330666
Attardo, D., Baepler, P., Epp, E., & Scruton, C. (2016). UMN Canvas pilot – 2015-2016 report. Retrieved from https://it.umn.edu/sites/it.umn.edu/files/canvas pilot report spring 2016 1.pdf
Baer, L. L. (2019). The rise of analytics in higher education. In L. L. Baer, & C. Carmean (Eds.), An analytics handbook: Moving from evidence to
impact (pp. 3–9). Society for College and University Planning.
Baer, L. L., & Duin, A. H. (2018). Connecting the dots: Assessment, accountability, analytics, and accreditation. Nashville, TN: Society for College
and University Planning. July 2018.
Baer, L., & Duin, A. H. (2014). Retain your students! The analytics, policies and politics of reinvention strategies. Planning for Higher Education
Journal, 42(3), 30–41.
Beck, E. (2015). The invisible digital identity: Assemblages in digital networks. Computers and Composition, 35, 125–140.
Beck, E. (2016). Writing educator responsibilities for discussing the history and practice of surveillance and privacy in writing classrooms. Kairos:
A Journal of Rhetoric, Technology, and Pedagogy, 20(2). Retrieved from. http://kairos.technorhetoric.net/20.2/topoi/beck-et-al/beck.html
Beck, E., et al. (2016). Writing in an age of surveillance, privacy, and net neutrality. Kairos: A Journal of Rhetoric, Technology, and Pedagogy, 20(2)
http://kairos.technorhetoric.net/20.2/topoi/beck-et-al/mckee.html
Blake, B. (2015). Worry about wearables. IEEE Internet Computing, 19(5), 4–5.
Campbell, J., DeBlois, P., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. Educause Review, 42(4), 40–57. Retrieved from.
https://net.educause.edu/ir/library/pdf/ERM0742.pdf
Canvas community site. How do I view course analytics? Retrieved from https://community.canvaslms.com/docs/DOC-10299-415266790 Accessed
5 February 2019.
Charter to University Learning Technology Advisors. (2017-2018). University of Minnesota Learning Technology Advisors. Retrieved from
http://ulta.umn.edu/.
Civitas Learning. (2019). Retrieved from https://www.civitaslearning.com/ Accessed 5 February 2019
de Arriba-Perez, F., Caeiro-Rodríguez, M., & Santos-Gago, J. M. (2016). Collection and processing of data from wrist wearable devices in
heterogeneous and multiple-user scenarios. Sensors, 16(9), 1538–1569.
DePew, K. E., & Lettner-Rust, H. (2009). Mediating power: Distance learning interfaces, classroom epistemology, and the gaze. Computers and
Composition, 26, 174–189.
Duin, A. H., & Tham, J. (2018). Cultivating code literacy: A case study of course redesign through advisory board engagement. Communication
Design Quarterly, 6(3), 44–58. Retrieved from. https://z.umn.edu/duin-tham-cdq18
Foucault, M. (1979). Discipline and punish: The birth of the prison. [A. Sheridan, Trans.]. New York, NY: Vintage Books.
(2019). Foundation invests in research and data systems to improve student achievement. Press release. Gates Foundation. Retrieved
from. https://www.gatesfoundation.org/Media-Center/Press-Releases/2009/01/Foundation-Invests-in-Research-and-Data-Systems-to-Improve-
Student-Achievement
Goldstein, P. J., & Katz, R. N. (2005). Academic analytics: The uses of management information and technology in higher education (Vol. 8). Boulder, CO: Educause. Retrieved from https://library.educause.edu/resources/2005/12/academic-analytics-the-uses-of-management-information-and-technology-in-higher-education
Google Dictionary. (2019). Big data. Google Dictionary, https://www.google.com/search?q=google+dictionary&rlz=1C5CHFA enUS862US862
&oq=google+dictionary+&aqs=chrome.69i57j0l7.4218j0j7&sourceid=chrome&ie=UTF-8#dobs=big%20data
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology &
Society, 15(3), 42–57.
Hainline, A. (2016). Differentiating assignments (K-12) in Canvas: Helping all learners be successful. Canvas Community blog. Retrieved from https://community.canvaslms.com/groups/k12/blog/2016/02/11/differentiating-assignments-k-12-in-canvas-helping-all-learners-be-successful Accessed 25 April 2019.
Hansen, C. (1996). Networking technology in the classroom: Whose interests are we serving? In P. Sullivan, & J. Dautermann (Eds.), Electronic
literacies in the workplace: Technologies of writing (pp. 201–215). Urbana, IL: NCTE.
Hawisher, G., & Selfe, C. (1991). The rhetoric of technology and the electronic writing class. College Composition and Communication, 42(1),
55–65.
Haynes, C. (1998). prosthetic rhetorics@writing.loss.technology. In T. Taylor, & I. Ward (Eds.), Literacy theory in the age of the internet (pp.
79–92). New York, NY: Columbia University Press.
Hill, P. (2016, Spring). State of higher ed LMS market for US and Canada: 2016 edition. e-Literate. Retrieved from http://mfeldstein.com/state-higher-ed-lms-market-spring-2016/
Janangelo, J. (1991). Technopower and technoppression: Some abuses of power and control in computer-assisted writing environments. Computers
and Composition, 9, 47–64.
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report: 2016 Higher Education Edition.
Austin, Texas: The New Media Consortium. Retrieved from https://library.educause.edu/∼/media/files/library/2016/2/hr2016.pdf.
A.H. Duin and J. Tham / Computers and Composition 55 (2020) 102544 23

Jones, K. M. L., Thomson, J., & Arnold, K. (2014). Questions of data ownership on campus. Educause Review. Retrieved from https://er.educause.edu/articles/2014/8/questions-of-data-ownership-on-campus
Zwagerman, S. (2008). The scarlet P: Plagiarism, panopticism, and the rhetoric of academic integrity. College Composition and Communication,
59(4), 676–710.
Latest LMS news: Five things the university community needs to know. (2017). Retrieved from http://ulta.umn.edu/latest-lms-news-five-things-university-community-needs-know
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American
Behavioral Scientists, 57(10), 1439–1459.
Martin, F., & Ndoye, A. (2016). Using learning analytics to assess student learning in online courses. Journal of University Teaching and Learning
Practices, 13(3), 1–20.
Marx, G. T. (2016). Windows into the soul: Surveillance and society in an age of high technology. Chicago, IL: University of Chicago Press.
Olama, M. M., Thakur, G., McNair, A. W., & Sukumar, S. R. (2014). Predicting student success using analytics in course learning management
systems. In Proceedings of SPIE. pp. 1–9. Retrieved from. www.SPIEDigitalLibrary.org/conference-proceedings-of-spie
Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438–450.
PAR framework. (2019). Starfish Solutions. Retrieved from http://www.parframework.org/about-par/our-approach/ Accessed 5 February 2019
Priority: New Canvas analytics. Canvas Community Site. Retrieved from https://community.canvaslms.com/community/ideas/analytics Accessed 5
February 2019.
Purdy, J. P. (2009). Anxiety and the archive: Understanding plagiarism detection services as digital archives. Computers and Composition, 26(2),
65–77.
Salisbury, L. E. (2018). Just a tool: Instructors’ attitudes and use of course management systems for online writing instruction. Computers and
Composition, 48, 1–17.
Sclater, N. (2014). Code of practice for learning analytics: A literature review of the ethical and legal issues. JISC. Retrieved from http://analytics.jiscinvolve.org/wp/category/code-of-practice/
Scott, M. (2017). Big data and writing program retention assessment. In T. Ruecker, D. Shepherd, H. Estrem, & B. Brunk-Chavez (Eds.), Retention,
persistence, and writing programs (pp. 56–73). Louisville, CO: University Press of Colorado/Utah State University Press.
Scruton, C., & Epp, E. (2016). Canvas technical report. University of Minnesota. Retrieved from http://z.umn.edu/canvastr Accessed 5 February
2019
Selber, S. (2004). Multiliteracies for a digital age. Carbondale: Southern Illinois University Press.
Selfe, C. (1999). Technology and literacy: A story about the perils of not paying attention. College Composition and Communication, 50(3), 411–436.
Selfe, C., & Hilligoss, S. (Eds.). (1994). Literacy and computers: The complications of teaching and learning with technology. New York, NY:
Modern Language Association.
Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., Ferguson, R., et al. (2014). Open learning analytics: an integrated &
modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques.
Society for Learning Analytics Research,. Retrieved from. http://classroom-aid.com/wp-content/uploads/2014/04/OpenLearningAnalytics.pdf
Strang, K. D. (2016). Do the critical success factors from learning analytics predict student outcomes? Journal of Educational Technology Systems,
44(3), 273–299.
Sullivan, P., & Dautermann, J. (1996). Electronic literacies in the workplace: Technologies of writing. Urbana, IL: NCTE.
Swenson, J. (2015). Understanding ethical concerns in the design, application, and documentation of learning analytics in post-secondary education.
Doctoral Dissertation. University of Minnesota. https://conservancy.umn.edu/handle/11299/175347.
Swenson, J., & Duin, A. H. (2018a). Understanding ethical concerns surrounding big data and learning analytics in higher education. In R. Papa, &
S. W. J. Armfield (Eds.), The Wiley handbook of educational policy (pp. 479–511). Hoboken, NJ: Wiley and Sons, Inc.
Swenson, J., & Duin, A. H. (2018b). A matrix to address ethical concerns in the design, application, and documentation of learning analytics in
postsecondary education. In R. Papa, & S. W. J. Armfield (Eds.), The Wiley handbook of educational policy (pp. 551–573). Hoboken, NJ: Wiley
and Sons, Inc.
Taylor, T., & Ward, I. (1998). Literacy theory in the age of the internet. New York, NY: Columbia University Press.
Tene, O., & Polonetsky, J. (2013). Big data for all: Privacy and user control in the age of analytics. Northwestern Journal of Technology and
Intellectual Property, 11(5), 239–273.
Tham, J. (2016). Pedagogical and technological ethos in online instruction: A rhetorical review of on-site and online learning statements. Journal
of Global Literacies, Technologies, and Emerging Pedagogies, 3(3), 499–515.
Unizin. (2019). Improving learner success: Affordability, access, retention, graduation. Retrieved from http://unizin.org/ Accessed 5 February 2019
What are analytics? (2019). Guides: Canvas. Retrieved from https://community.canvaslms.com/docs/DOC-10742-67952724559
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the 4th International Conference
on Learning Analytics and Knowledge (LAK), 203–211.
Vie, S. (2014). “You are how you play”: Privacy policies and data mining in social networking games. In J. deWinter, & R. Moeller (Eds.), Computer
Games and Technical Communication: Critical Methods and Applications at the Intersection (pp. 171–187). Burlington, VT: Ashgate.
Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching in Higher
Education, 22(8), 991–1007.
You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29. http://dx.doi.org/10.1016/j.iheduc.2015.11.003
