Testing Grammar and Vocabulary
Testing grammar
Downloaded from https://www.cambridge.org/core. Western Sydney University Library, on 10 Jan 2019 at 22:58:09, subject to the Cambridge
Core terms of use, available at https://www.cambridge.org/core/terms. https://doi.org/10.1017/CBO9780511732980.014
Writing specifications
For achievement tests where teaching objectives or the syllabus list the
grammatical structures to be taught, specification of content should be
quite straightforward. When there is no such listing, it becomes necessary
to infer from textbooks and other teaching materials what structures
are being taught. Specifications for a placement test will normally
include all of the structures identified in this way, as well as, perhaps,
those structures whose command is taken for granted in even
the lowest classes. For proficiency and diagnostic tests, the van Ek and
Trim publications referred to in the Further reading section are a
particularly useful source of structures.
Testing for language teachers
Sampling
This will reflect an attempt to give the test content validity by selecting
widely from the structures specified. It should also take account of what
are regarded for one reason or another as the most important structures.
It should not deliberately concentrate on the structures that happen to
be easiest to test.
Writing items
Whatever techniques are chosen for testing grammar, it is important for
the text of the item to be written in grammatically correct and natural
language. It is surprising how often this is not the case. Two examples I
have to hand from items written by teachers are:
We can’t work with this class because there isn’t enough silence.
and
I want to see the film. The actors play well.
To avoid unnatural language of this kind, I would recommend using
corpus-based examples. One readily available source for English is the
British National Corpus sampler on CD.
Four techniques are presented for testing grammar: gap filling, paraphrase,
completion, and multiple choice. Used with imagination, they
should meet just about all our needs. The first three require production
on the part of the candidates, while multiple choice, of course, calls only
for recognition. This difference may be a factor in choosing one technique
rather than another.
Gap filling
Ideally, gap filling items should have just one correct response.
For example: What was most disturbing _______________ that
for the first time in his life Henry was on his own. [was]
Or: The council must do something to improve transport in the
city. ____________, they will lose the next election. [Otherwise]
(Sentence linking can be tested extensively using gap filling.)
Or: He arrived late, _____________ was a surprise. [which]
An example of this technique extended to word formation is the following
examination rubric (the text to which it refers is not reproduced here):
For questions 56–65, read the text below. Use the word given in capitals at the end of
each line to form a word that fits in the space in the same line. There is an example at
the beginning (0). Write your answers on the separate answer sheet.
Example: 0 ability
Paraphrase
Paraphrase items require the student to write a sentence equivalent in
meaning to one that is given. It is helpful to give part of the paraphrase
in order to restrict the students to the grammatical structure being
tested.
Thus:
1. Testing passive, past continuous form.
When we arrived, a policeman was questioning the bank clerk.
When we arrived, the bank clerk ..............................................
2. Testing present perfect with for.
It is six years since I last saw him.
I ......................................……….. six years.
Completion
This technique can be used to test a variety of structures. Note how the
context in a passage like the following, from the Cambridge First
Certificate in English (FCE) Testpack 1, allows the tester to elicit specific
structures, in this case interrogative forms.
In the following conversation, the sentences numbered (1) to (6)
have been left incomplete. Complete them suitably. Read the
whole conversation before you begin to answer the question.
(Mr Cole wants a job in Mr Gilbert’s export business. He has
come for an interview.)
Mr Gilbert: Good morning, Mr Cole. Please come in and sit down. Now let me
see. (1) Which school ........................................?
Mr Cole: Whitestone College.
Mr Gilbert: (2) And when ..............................................................?
Mr Cole: In 1972, at the end of the summer term.
Mr Gilbert: (3) And since then what ..............................................?
Mr Cole: I worked in a bank for a year. Then I took my present job, selling
cars. But I would like a change now.
Mr Gilbert: (4) Well, what sort of a job ..............................................?
Mr Cole: I’d really like to work in your Export Department.
Mr Gilbert: That might be a little difficult. What are your qualifications?
(5) I mean what languages ..............................................
besides English?
Mr Cole: Well, only a little French.
Mr Gilbert: That would be a big disadvantage, Mr Cole. (6) Could you tell me
why ..............................................?
Mr Cole: Because I’d like to travel and to meet people from other countries.
Mr Gilbert: I don’t think I can help you at present, Mr Cole. Perhaps you ought
to try a travel agency.
Multiple choice
Reasons for being careful about using multiple choice were given in
Chapter 8. There are times, however, when gap filling will not test what
we want it to test (at least, in my experience). Here is an example where
we want to test epistemic could.
If we have the simple sentence:
They left at seven. They __________ be home by now.
There are obviously too many possibilities for the gap (must, should,
may, could, might, will).
We can add context, having someone reply: Yes, but we can’t count
on it, can we? This removes the possibility of must and will but leaves
the other possibilities.
At this point I would think that I could only test the epistemic use of
could satisfactorily by resorting to multiple choice.
A: They left at seven. They __________ be home by now.
B: Yes, but we can’t count on it, can we?
a. can b. could c. will d. must
I would also use multiple choice when testing discontinuous elements.
A: Poor man, he ………………………… at that for days now.
B: Why doesn’t he give up?
a. was working
b. has been working
c. is working
d. had worked
(Why doesn’t he give up? is added to eliminate the possibility of d being
correct, which might just be possible despite the presence of now.)
Also, all the above non-multiple-choice techniques can be given a
multiple choice structure, but the reader who attempts to write such
items can often expect to have problems in finding suitable distractors.
Moderation of items is of course essential. The checklist included in
Chapter 7 should be helpful in this.
Testing vocabulary
Writing specifications
How do we specify the vocabulary for an achievement test? If vocabulary
is being consciously taught, then presumably all the items thereby
presented to the students should be included in the specifications. To
these we can add all the new items that the students have met in other
activities (reading, listening, etc.). Words should be grouped according
to whether their recognition or their production is required. A subsequent
step is to group the items in terms of their relative importance.
We have suggested that a vocabulary placement test will be in essence
a proficiency test. The usual way to specify the lexical items that may be
tested in a proficiency test is to make reference to one of the published
word lists that indicate the frequency with which the words have been
found to be used (see Further reading).
Sampling
Words can be grouped according to their frequency and usefulness.
From each of these groups, items can be taken at random, with more
being selected from the groups containing the more frequent and useful
words.
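The sampling procedure just described can be sketched in code (a minimal
illustration only, not part of the original text: the band names, the word
lists, and the quotas are invented for the example, though the rarer words
are taken from items discussed later in this chapter):

```python
import random

# Hypothetical frequency bands: words grouped by how common and useful they are.
bands = {
    "high": ["make", "take", "find", "ask", "keep", "begin", "seem", "help"],
    "mid":  ["gleam", "loathe", "sneer", "growl", "snarl", "screech"],
    "low":  ["groyne", "malm", "loam"],
}

# More items are drawn from the more frequent (and more useful) bands.
quota = {"high": 4, "mid": 2, "low": 1}

def sample_items(bands, quota, seed=0):
    rng = random.Random(seed)  # fixed seed so the test form is reproducible
    selection = []
    for band, words in bands.items():
        # Random selection within each band, with the band's quota of items.
        selection.extend(rng.sample(words, quota[band]))
    return selection

print(sample_items(bands, quota))
```

The point of the stratification is that a purely random draw from the whole
word list would over-represent rare words; fixing a quota per band keeps the
weighting towards frequent, useful vocabulary.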
Writing items
Testing recognition ability
This is one testing problem for which multiple choice can be recommended
without too many reservations. For one thing, distractors are
usually readily available. For another, there seems unlikely to be any
serious harmful backwash effect, since guessing the meaning of vocabulary
items is something that we would probably wish to encourage.
However, the writing of successful items is not without its difficulties.
Items may involve a number of different operations on the part of the
candidates:
Recognise synonyms
Choose the alternative (a, b, c or d) which is closest in meaning
to the word on the left of the page.
gleam a. gather b. shine c. welcome d. clean
The writer of this item has probably chosen the first alternative because
of the word glean. The fourth may have been chosen because of the
similarity of its sound to that of gleam. Whether these distractors would
work as intended would only be discovered through trialling.
Note that all of the options are words that the candidates are
expected to know. If, for example, welcome were replaced by groyne,
most candidates, recognising that it is the meaning of the stem (gleam)
on which they are being tested, would dismiss groyne immediately.
On the other hand, the item could have a common word as the stem
with four less frequent words as options:
shine a. malm b. gleam c. loam d. snarl
The drawback to doing this is the problem of what distractors to use.
Clearly they should not be too common, otherwise they will not distract.
But even if they are not common, if the test taker knows them, they will
not distract. This suggests that the first method is preferable.
Note that in both items it is the word gleam that is being tested.
Recognise definitions
loathe means a. dislike intensely
b. become seriously ill
c. search carefully
d. look very angry
Note that all of the options are of about the same length. It is said that
test-takers who are uncertain of which option is correct will tend to
choose the one which is noticeably different from the others. If dislike
intensely is to be used as the definition, then the distractors should be
made to resemble it. In this case the writer has included some notion of
intensity in all of the options.
Again the difficult word could be one of the options, although the
concern expressed above about this technique applies here too.
One word that means to dislike intensely is a. growl
b. screech
c. sneer
d. loathe
Thrasher (Internet) believes that vocabulary is best tested in context
and, referring to the first edition of this book, suggests that a better way
to test knowledge of loathe would be:
Bill is someone I loathe.
a. like very much
b. dislike intensely
c. respect
d. fear
For the moment, I leave it to the reader to consider whether the provision
of context makes an improvement.
Note that the context should not itself contain words that the candidates
are unlikely to know.
Having now presented an item testing vocabulary in context myself,
I return to Thrasher’s suggested improvement. It could be argued that,
since learners and language users in general normally meet vocabulary
in context, providing context in an item makes the task more authentic
and perhaps results in a more valid measure of the candidate’s ability.
The context may help activate a memory of the word, in the same way
as meeting it when reading in a non-test situation. It may also be said
that there could be some negative backwash when words are presented
in isolation. However, when we test vocabulary by means of multiple
choice, the range of possible distractors will be wider if words are
presented in isolation. In Thrasher’s item, I suspect that the difference in
length between the first two and the second two options would encourage
candidates who don’t know the word to choose a or b, thereby
increasing the possibility of a correct response by guessing. I have to
admit that I know of no systematic research that has compared test
performance on vocabulary items with and without context.
[A picture-labelling vocabulary item was reproduced here: a drawing
containing six objects, labelled A to F, with a line beside each letter
for the candidate to write the word.]
Postscript
This chapter should end with a reminder that while grammar and vocabulary
contribute to communicative skills, they are rarely to be regarded
as ends in themselves. It is essential that tests should not accord them too
much importance, and so create a backwash effect that undermines the
achievement of the objectives of teaching and learning where these are
communicative in nature.
Reader activities
Construct items to test the following:
● Conditional: If .... had ...., .... would have .... .
● Comparison of equality.
● Relative pronoun whose.
● Past continuous: ... was -ing, when ... .
Which of the techniques suggested in the chapter suits each structure
best? Can you say why?
Can you see anything wrong with the following multiple choice items
taken from tests written by teachers (use the checklist given as Table
1 in Chapter 7)? If so, what? Try to improve them.
a) I said to my friend ‘ ________ be stupid.’
Isn’t Aren’t Didn’t Don’t be
b) What ________ you do, if your car broke down?
must did shall
Further reading
For a highly detailed taxonomy of notions and functions and their grammatical
and lexical realisations, see van Ek and Trim (2001a, b and c). I
have also found Collins Cobuild (1992) useful in writing specifications. A
thorough study of vocabulary assessment (going beyond testing) is Read
(2000). It includes methods of assessing both size (breadth) and quality
(depth) of knowledge. Read and Chapelle (2001) proposes a framework
for vocabulary assessment. A new book of word frequencies is Leech et al.
(2001). It gives information for spoken and written varieties of English.
West (1953) is a standard word list of high frequency words learners
should know. Collins COBUILD English Language Dictionary and the
Longman Dictionary of Contemporary English mark words according to
their frequency in the language.