How Students Write and Instructors Read
Guus de Krom
To cite this article: Guus de Krom (2023) Approaches to written assignments - how students write, and instructors read, Assessment & Evaluation in Higher Education, 48:1, 67-76, DOI: 10.1080/02602938.2022.2040946
ABSTRACT
The importance of language in the learning process of students has been studied extensively. However, little attention has been paid to the role language plays in the assessment of student writings. Students and instructors at a liberal arts and sciences college were interviewed to reveal their approaches to written assignments. The aim was to learn what writing approaches students use, particularly if they know their knowledge falls short for the task, and conversely, to explore what instructors do to try and make sense of student writings that are sometimes ambiguous with regards to correctness of content. A few detailed examples are provided to illustrate how instructors may sometimes effectively separate language form and disciplinary content when reading ambiguous text fragments. The need to stress the importance of language in teaching is discussed, leading to some practical suggestions to improve written assignments.
KEYWORDS
Written assignments; assessment; separation of form and content
Introduction
Although students may occasionally learn and demonstrate their acquired knowledge and skills
without much language being used - practicing with a pipette in a wet laboratory, for instance
- academic education in general heavily depends on language.
Academic language is characterized by formal conventions and a level of precision that is
much different from language used in daily life. It is, in a sense, a language somewhat foreign
to all students: native and non-native speakers of the language of instruction alike. Read (2015)
states that ‘Although the problem of academic language may be particularly visible or acute
for second-language speakers, in fact, we argue that academic language is intrinsically more
difficult than other language registers' (114). Also, since many instructors consider written language the preferred mode for students to consolidate and demonstrate knowledge (Hyland
2013), the ability to express oneself in writing is even more important in academic studies than
it is in ordinary life.
Given the importance of language, especially written language, in academic teaching and assessment, instructors often highlight it, particularly as students sometimes underestimate its role. For example, students in a methods and statistics course often think of it as mathematics in disguise, and believe their performance depends on numerical and logical skills rather than the ability to formulate ideas and conclusions precisely. In such a course, instructors may choose their words carefully to help students see that statistically significant outcomes are not necessarily relevant, yet students often do not realize that such subtle language differences are crucial, and may consequently fail to demonstrate this in their writing. This is a bit of a chicken-and-egg problem. In order to understand why and how language subtleties matter in a given
disciplinary discourse, a student must have a certain level of understanding of content. Conversely,
students may not completely understand concepts if they do not understand the language at
a deeper conceptual level (Richardson et al. 2016). Sutton (2012) pointed to a similar reciprocal relation between language form and content when students are provided with feedback on their performance, arguing that basic content knowledge is required for students to decode what the feedback means.
A recurrent issue specific to academic language is the use of familiar-sounding words that,
in the context of a particular discipline, have a technical meaning that can be subtly or sometimes completely different from their everyday usage. These words often relate to what Meyer
and Land (2003) coined threshold concepts: concepts that are considered core to a discipline,
and that - once understood - lead students to see things in a different way. Examples are
opportunity cost in economics, complex numbers in mathematics, and significant in statistics.
Kaplan, Fisher, and Rogness (2009) point out that students may make incorrect associations if words with a specific technical meaning are also used in common language. Richardson, Dunn, and Hutchins (2013) emphasize the importance of students' understanding of such lexically ambiguous words. Students often erroneously transpose the everyday meaning to academic settings, and fail to see, for example, that a significant outcome in quantitative studies is not necessarily an important one. They should be aware that a statistically significant result could actually be irrelevant for the purpose of a study - which is the exact opposite of its meaning in general English. Note that native speakers of English could be at a disadvantage in these cases compared to non-native speakers, since the ordinary meaning of such words may be more firmly rooted in their conceptual frameworks, and harder to avoid using.
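The distinction between 'significant' and 'important' can be made concrete with a small numerical sketch. The figures below are hypothetical, chosen purely for illustration and not taken from the course described in this study: with very large samples, even a trivially small group difference yields a p-value below the conventional 0.05 threshold.

```python
import math

# Hypothetical illustration (not data from the study): two groups of
# n = 100,000 observations each, standard deviation 1, and a mean
# difference of only 0.01 units.
n = 100_000
mean_diff = 0.01
sd = 1.0

# Two-sample z statistic (for large n, the t statistic is close to z)
se = sd * math.sqrt(2.0 / n)
z = mean_diff / se

# Two-sided p-value under the normal distribution
p = math.erfc(abs(z) / math.sqrt(2.0))

# Cohen's d, a standardized effect size
d = mean_diff / sd

print(f"z = {z:.2f}, p = {p:.3f}, d = {d:.2f}")
# p comes out around 0.025, below the usual 0.05 threshold, so the result
# is 'statistically significant' -- yet d = 0.01 is a negligible effect
# that may well be irrelevant for the purpose of a study.
```

In everyday English such a difference would hardly be called 'significant', which is exactly the lexical trap described above.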
The importance of language and language skills for the student’s learning process has long
been recognized and has received considerable attention, resulting in an extensive body of
research literature on the role of language in the acquisition of knowledge of all kinds and at
all levels, including more specific issues having to do with learning in non-native language
settings. This study does not deal with the role of language in a student’s learning process and
the challenges that first or second language issues pose to the acquisition of knowledge. Instead,
the focus is on writings that students produce to demonstrate their acquired knowledge, particularly when that requires the use of discipline-specific jargon. We will look at these writings
from the perspective of the students as writers, as well as instructors as readers.
When writing an assignment students must translate their knowledge (content) into a text
(form) that can be interpreted and assessed by the instructor. I will not dive into the philosophical discussion of whether knowledge can exist without language, but just assume that ideas must be articulated in language.
Students may use different approaches when writing. They may try to be concise and to the point, or decide to write at length to demonstrate their knowledge. If they are aware their content knowledge falls short, they may try to hide this, or choose to be open about it. One aim of this study is to collect experiences from students to find out whether they indeed use such diverse approaches, and if so, why. Conversely, when reading text
written for an assignment, instructors must decide whether the student has correctly answered
a question, or properly phrased an argument. If students employ diverse approaches in their
writing, the question arises whether instructors in turn employ diverse approaches in reading
to deal with these, and if so, what their approaches are. Unlike in an oral examination, or during
a presentation, an instructor has no opportunity to ask for a reformulation on the spot - when reading, one has to make do with the text as provided. The academic literature on 'reading strategies'
focuses largely on teaching students to read, but the approaches employed by instructors when
they read for assessment have attracted little attention, even though instructors often face
challenges doing so.
Admittedly, if a student answers a question correctly, or successfully illustrates a line of
reasoning in a paragraph, an instructor can easily connect the text to the student’s ideas.
Information that is evidently wrong or simply missing also poses no interpretation problems, nor does it require a special approach to reading. However, if a student adds irrelevant information to otherwise
meaningful prose, or if the writing contains plainly contradictory elements, a reader must have
some way of dealing with that. An interesting scenario occurs when a student’s language is
ambiguous, allowing for a reading that can mean either X or Y, with X being correct, and Y
incorrect. Disentangling form and content becomes problematic then.
In this study, instructors were asked to reflect on their reading approaches and to explore what they do, and how. The emphasis in these conversations was on the challenging cases in which student writings are unclear, and a proper interpretation of content, and consequently assessment, is hindered by language issues. To illustrate how textual ambiguities may or may not be resolved when reading, a few examples taken from students' answers to questions on a methods and statistics test will be provided.
Academic language, academic writing in particular, is not a natural language to any student.
Although students with a language background other than the language of instruction may
arguably be in a less-favoured position for academic studies than native speakers, the approaches
to writing and reading that will be explored in this study may be relevant for native and
non-native speakers alike. Since such language issues are not limited to any particular area of
study, it would be interesting to obtain information from students with diverse academic interests, and correspondingly, instructors who teach in different disciplines.
The aim of the study is foremost descriptive and qualitative, rather than quantitative. It is
not about testing a certain theory, nor about gathering information that can be interpreted in
terms of certain percentages of students or instructors who act in a particular way. Instead, the
purpose is to explore some of the writing approaches and experiences of university students,
and, correspondingly, the reading and assessment strategies and experiences of instructors, and
derive some insights that could be helpful when designing written tasks and assessing textual
information.
Methods
Participants: Students and instructors
Students and instructors at University College Utrecht (hereafter UCU), a liberal arts and sciences
college at Utrecht University, the Netherlands, were interviewed by the author in the spring of
2021. Students were approached via the team of tutors. At UCU, all students have a designated
tutor as academic advisor. The author asked the tutors to suggest names of students who, in
their view, might be interested in participating in a study dealing with what was at that stage vaguely described as 'the role of language in writing'. The tutors all knew the author professionally, and were informed that the study was intended to lead to a publication. The tutors
named a total of fifteen students who had agreed that their names could be passed on, and
who were subsequently contacted by the author in person and informed in more detail about
the purpose of the study. The students were asked whether they were still willing to participate
and were informed that no remuneration was offered. All students volunteered to participate
in the study and agreed to be interviewed.
The resulting convenience sample consisted of a mix of male and female students, both
native and non-native speakers of English (the language of instruction at UCU), with different
major orientations (sciences, humanities, social sciences), and at different stages of their studies
(first, second or third year). The author considered this a sufficiently diverse group of students
for the purpose of the study. The author also personally contacted a number of instructors to
ask for their participation. The only inclusion criteria were that they were employed at the
college, or had taught there for at least a few years, and that they had extensive experience
with the assessment of different kinds of student texts. This yielded another convenience sample
of fifteen instructors, who also agreed to be interviewed. The sample included male and female
instructors, teaching across a wide range of academic disciplines rooted in the humanities,
sciences and social sciences. All informants (students and instructors) provided full consent in
writing for the use of their remarks in this study, on condition that their identities could not be traced. According to the rules at Utrecht University, further ethical clearance or debriefing was
not needed.
Experiences
Writing for assignments: student perspectives and practices
As can be expected, students often wrote for a task knowing they did not possess all required
knowledge. Some of these consciously incompetent (Broadwell 1969) students just left part of
a question unanswered or decided not to try to work out the line of argumentation in detail.
Others explicitly signaled their problems, and said they would add something like ‘I have no
idea what you mean here’, or just place a question mark for an answer. Those who did so said
they hoped demonstrating awareness of their limitations might be appreciated by the instructor.
A few others would rather not alert the instructor, but instead somehow tried to hide their
lack of knowledge. One strategy is to add information which is not incorrect, but not completely
to the point either. As one said: ‘I think it is better to add a bit too much that may not be relevant,
than to miss out on information’. Some thought instructors might appreciate that they make
much effort, even if it is not exactly what was asked for. ‘I sometimes just put down a lot, so that
my teacher can see that I at least tried to answer the question’. One compared this to firing a shot
of hail, where much may miss the target, but some will be spot on. Others said they believed
a demonstration of effort is actually required to get a high grade; being concise or to the point
would, they fear, be seen as laziness. Rather than adding less relevant information, students
sometimes deliberately used vague language, hoping the instructor was willing to interpret
somewhat evasive writing as sufficiently precise; an interesting strategy in that it actually requires
a rather high linguistic competence.
Such shot-of-hail and mystification strategies may indeed pay off if the instructor is willing to sieve out the relevant bits and ignore irrelevant material. They backfire if an instructor penalizes students for not being concise or to the point. Students admitted they did not always know
how their instructor would respond, so when using such strategies, it felt ‘a bit tricky’, as one
said. When writing for an assignment, some students admitted having experienced difficulties
expressing themselves, pondering over the exact terminology to use, the best way to structure
a sentence, and so forth. They said this adds to the stress of the task, particularly under time
pressure. Interestingly, several students said time pressure encouraged them to write more, not
less. Fearing to miss out on some important content, they would rather hastily write text that
is maybe too elaborate and potentially repetitious than invest precious time deciding what is
and what is not to the point, to structure the answer, (p)rewrite sentences, and check for spelling
mistakes. At the college, many instructors tell students to try and ‘split the thinking from the
writing', and students confirmed being aware of this, but for many, time pressure seemed to get in the way of doing so.
Students said their most frustrating experiences were when they mastered the content,
thought they expressed themselves properly in writing, and then, seeing the instructor's evaluation, discovered they apparently did not manage to convey their message as intended. This
is actually not only a problem for the student, but for instructors as well. If language competency is a limiting factor, an assignment may miss its main function since students will then
not be able to demonstrate their knowledge to the fullest. Interestingly, when students were
asked whether they thought they would perform better if they were allowed to write in their
native language, some said they would prefer writing in English, since this is the language of
instruction in which their conceptual framework had been built.
Instructors also found it near impossible to take a student's personal background into consideration. All of this makes it difficult for instructors to set tailored expectations for language use, and to act on these, should they wish to.
In the interviews, several instructors mentioned that they recognized the two main approaches used by students when their knowledge fell short for the task. Where students explicitly indicated what they thought was lacking, instructors tended to appreciate that. More often, though, instructors read students' writings that were overly wordy, in what they saw as an attempt to hide a lack of knowledge. If students added redundant, but otherwise correct, information, some instructors said they tended to ignore this and focus on what was relevant. As the students had hoped, these instructors appreciated - or at least did not penalize - students giving it a try. The potential disadvantage, as indeed recognized by the instructors, is that such lenient reading may reward and encourage students to write as much as possible. For this reason, some instructors disliked this strategy: 'It upsets me if I have to wade through irrelevant blabbering. Being concise and to the point is a quality'. Another explicitly warned students that points would be subtracted for irrelevant information, hoping this would force students to remain focused and concise.
When a student was partially right in an answer or claim, but plainly contradicted themselves somewhere else in the text, the instructors said they would consider it wrong altogether. They believed that a strategy of focusing on what is right and simply ignoring what is wrong could encourage students to write anything that comes to mind.
All but one instructor said they had come across writings that left them in doubt as to what
a student actually meant. If the writing was vague or ambiguous, these instructors had to
decide whether this was primarily a content or a language issue, a distinction that is obviously
particularly relevant for instructors who wish to focus on content, rather than form. Instructors
sometimes considered this a frustrating experience, especially when they believed that the
students actually possessed the knowledge, but just failed to put it in writing correctly. As one
said: ‘I sometimes cannot understand why students fail to write even the most obvious things cor-
rectly. We dealt with this several times in class, I am sure they know better than this’.
This second part of the answer effectively demonstrated the student meant that a lack of
random assignment was indeed the problem, allowing the instructor to give full credit.
Another student answered: ‘The problem with this study is that the different groups do not
consist of the same participants’. The use of the word same suggests the student believed the
subjects should have been measured more than once, requiring a so-called repeated measures
design. However, the real issue was a potential group difference at the onset of the study on
characteristics that are relevant for the outcome variable - a selection problem that was extensively discussed in class. Sameness of subjects was not required. Had the student written something like 'the groups are not similar', the answer would have been correct.
On the same test question, another student answered ‘The two groups are different’, and just
left the explanation part blank. In the story line of the question, the groups were indeed composed of different people, so stating that the groups are different is trivial. Whether this student
had the intended selection issue in mind could not be determined on the basis of the answer,
which was not untrue, but uninformative.
For the examples given above, had this question been asked in an oral examination, an instructor, sensing ambiguity in the answers, could have asked the student to reformulate their answer, allowing them to decide whether this indeed evidenced a lack of knowledge, or an unfortunate choice of words. However, with written assignments, instructors have to make do with the text as provided. As the examples demonstrate, one student indeed had the knowledge, though the writing did not directly demonstrate that. For the others, it might have been the case that they knew the answer, but there was no way to conclude that on the basis of the response, or some other part of the test. Since instructors cannot always use additional information to aid their interpretation (or may not want to do so), some students will perform worse on written tasks as compared to some type of oral assessment.
In a study involving secondary school children who were tested on chemistry and physics, pupils performed much better on oral than on written tests. They often understood core concepts, but failed to interpret the written questions correctly. In the oral test, instructors could rephrase questions, allowing pupils to demonstrate their knowledge (Schoultz, Säljö, and Wyndhamn 2001). In another study, performance and attitudes of students were compared for oral and written examination. The authors concluded that oral testing opens opportunities for genuine discourse, allowing instructors to better gauge true and deep understanding of concepts (Huxham, Campbell, and Westwood 2012).
However, oral testing is often just not feasible, or maybe not even allowed. Some instructors
organize individual feedback or inspection sessions and go over the text in detail on a one-on-
one basis to help students see where their thinking and/or writing fell short, occasionally also
allowing them to clarify retrospectively what they tried to argue. In case students could plausibly
argue they had simply not put their knowledge in writing correctly, instructors sometimes
proved willing to adjust their assessment a bit. As one said: ‘Listening to the student’s explanation,
I sometimes see what was apparently meant. In some cases, I had to admit my question was a bit
vague, and that my reading did not take other plausible interpretations into consideration. In some
cases, I have added a few points in retrospect’. More often, instructors provided written feedback,
not only because one-on-one sessions with students are highly time-demanding, but also
because written feedback is archived and can be re-examined, with less room for bargaining.
This in turn requires the instructor to be precise with language. Providing proper written feedback is an art in itself, and must be carefully phrased for students to appreciate (Mahfoodh
2017). Elton (2010, 156) says ‘It is surely more surprising that many students can cope with this
large variety of writing tasks than that a proportion cannot, particularly as many of the teachers’
comments which they find on their writings are almost Delphic in their obscurity’.
Feedback on student writing, by the way, need not come from instructors. It may actually
be more convincing if fellow students can point out what is and is not clear to them (Baker
2016). Providing feedback to peers may help students appreciate the judgement criteria better
and become less dependent on external feedback (Nicol, Thomson, and Breslin 2014).
Asking a colleague to check the wording is common practice when drafting test questions, but less so for essay prompts or project instructions, although these may also be multi-interpretable. The time invested in asking colleagues to scrutinize instructions and questions, and in taking their advice to heart, may well pay off if this leads to texts that better convey what is meant.
At UCU, we have a Writing and Skills Centre where students who struggle with their writing, the design of studies, or the analysis of data can ask for help. We intend to expand the role of the centre beyond providing a service to students, and also invite colleagues to seek collegial advice on the wording of instructions, questions and whatever written materials they use. This kind of collaboration among teachers may help improve the quality of teaching and writing across the curriculum, in line with recommendations made by other researchers (cf. Wilkinson 2018).
Wherever possible, instructors should discuss examples of correct and incorrect language
use, provide exemplary articles and templates, have students practice with different types of
writing, and provide feedback on that. Goldschmidt (2014) interviewed students on their development as writers and scholars, and found that students wish to enter into dialogue with faculty members, as this helps to foster relationships that strengthen their understanding of what the disciplines are about, and to behave in accordance with this disciplinary identity.
Finally, one may consider replacing a written assignment with something really different to reduce interpretation issues with student writings. For example, at UCU, an introductory methods and statistics course is taught in which students have to complete two small research projects. In each, they collect and analyze data, and report on their findings. The end product of the first project is a formal paper, following strict conventions. In addition to content, the students' ability to write according to academic standards is explicitly weighed in the assessment. In contrast, the end product for the second research project may be anything, except a formal paper. Over the years, students have created animations, filmed mock interviews with fellow students as if the study was a news item, drawn cartoons, and even presented their findings in a song. Note that this second project is not a sloppy variant of the first: the instructors expect a similar rigour with regards to content. In the experience of the instructors, the second project is just as well suited as the first to assess the students' understanding of content. Students in turn appreciate the freedom they have to present the information in a way they like, while not having to rely as much on formal language conventions. Sometimes, then, language problems can be dealt with by trying to avoid them.
Acknowledgements
I would like to thank Annemieke Meijer for her comments on a draft version of this article.
Disclosure statement
No potential conflict of interest was reported by the author.
ORCID
Guus de Krom http://orcid.org/0000-0001-8922-8631
References
Baker, K. M. 2016. “Peer Review as a Strategy for Improving Students’ Writing Process.” Active Learning
in Higher Education 17 (3): 179–192. doi:10.1177/1469787416654794.
Beaufort, A. 2012. “College Writing and beyond: Five Years Later.” Composition Forum 26: 1–13.
Broadwell, M. M. 1969. “Teaching for Learning (XVI).” The Gospel Guardian 20 (41): 1–3.
Elton, L. 2010. “Academic Writing and Tacit Knowledge.” Teaching in Higher Education 15 (2): 151–160.
doi:10.1080/13562511003619979.
Goldschmidt, M. 2014. “Teaching Writing in the Disciplines: Student Perspectives on Learning Genre.”
Teaching & Learning Inquiry the ISSOTL Journal 2 (2): 25–40. doi:10.20343/teachlearninqu.2.2.25.
Huxham, M., F. Campbell, and J. Westwood. 2012. “Oral versus Written Assessments: A Test of Student Performance and Attitudes.” Assessment & Evaluation in Higher Education 37 (1): 125–136. doi:10.1080/02602938.2010.515012.
Hyland, K. L. 2013. “Writing in the University: Education, Knowledge and Reputation.” Language
Teaching 46 (1): 53–70. doi:10.1017/S0261444811000036.
Kaplan, J. J., D. G. Fisher, and N. T. Rogness. 2009. “Lexical Ambiguity in Statistics: What Do Students
Know about the Words Association, Average, Confidence, Random and Spread?” Journal of Statistics
Education 17 (3). doi:10.1080/10691898.2009.11889535.
Mahfoodh, O. H. A. 2017. “‘I Feel Disappointed’: EFL University Students’ Emotional Responses towards Teacher Written Feedback.” Assessing Writing 31: 53–72. doi:10.1016/j.asw.2016.07.001.
Melchers, G., and P. Shaw. 2013. World Englishes. New York: Routledge. ISBN 978-1-444-13537-4.
Meyer, J., and R. Land. 2003. “Threshold Concepts and Troublesome Knowledge: Linkages to Ways of Thinking and Practising within the Disciplines.” In Improving Student Learning - Ten Years On, 412–424. Oxford: OCSLD.
Nicol, D., A. Thomson, and C. Breslin. 2014. “Rethinking Feedback Practices in Higher Education: A Peer Review Perspective.” Assessment & Evaluation in Higher Education 39 (1): 102–122. doi:10.1080/02602938.2013.795518.
Read, J. 2015. “Defining and Assessing Academic Language Proficiency.” In Assessing English Proficiency
for University Study, 110–136. London: Palgrave Macmillan. doi:10.1057/9781137315694_6.
Richardson, A. M., P. K. Dunn, M. D. Carey, and C. McDonald. 2016. “Ten Simple Rules for Learning
the Language of Statistics.” Proceedings of the 9th Australian Conference on Teaching Statistics,
32–37. Statistical Society of Australia Inc.
Richardson, A. M., P. K. Dunn, and R. Hutchins. 2013. “Identification and Definition of Lexically
Ambiguous Words in Statistics by Tutors and Students.” International Journal of Mathematical
Education in Science and Technology 44 (7): 1007–1019. doi:10.1080/0020739X.2013.830781.
Schoultz, J., R. Säljö, and J. Wyndhamn. 2001. “Conceptual Knowledge in Talk and Text: What Does It
Take to Understand a Science Question?” Instructional Science 29 (3): 213–236. https://www.jstor.
org/stable/41953550. doi:10.1023/A:1017586614763.
Sutton, P. 2012. “Conceptualizing Feedback Literacy: Knowing, Being, and Acting.” Innovations in
Education and Teaching International 49 (1): 31–40. doi:10.1080/14703297.2012.647781.
Wilkinson, R. 2018. “Content and Language Integration at Universities? Collaborative Reflections.” International Journal of Bilingual Education and Bilingualism 21 (5): 607–615. doi:10.1080/13670050.2018.1491948.