Assessment Guidelines March09


Good practice in

assessing students
University of Plymouth

The most up-to-date version of this publication is available electronically at:
http://www.plymouth.ac.uk/pages/view.asp?page=8093

EDaLT
1/1/2009

1
Good practice in assessing students
'Students can escape bad teaching but they cannot escape bad assessment' (Boud, 1995)

How to use this booklet

'Good practice in assessing students' has recently been updated to align with the
new Teaching and Learning Strategy and the Assessment Review action plan. This
booklet should be consulted in conjunction with 'Designing your programme and
modules' and the University's Assessment Policy (links provided at the end of this
document).

The booklet is designed for (at least) two groups:

 It is for use by Learning and Teaching in Higher Education (LTHE) course
participants – it underpins workshops on assessment, marking and grading. To
support this process there are a number of activities and directed study tasks
incorporated in the text.

 It may also be used by experienced tutors who are revising their programmes and
modules, or who are facing changes in the structure of courses that demand revised
approaches to assessment. The examples in this booklet may suggest useful
coping strategies that are readily adapted to different contexts.

The changing context for assessment


Assessment today presents many challenges:

 Students' learning is increasingly driven by their assignments ... in fact they may
learn little beyond the context of assignments.
 Traditional assessment methods – the ubiquitous examination essay for example –
have given way to a plethora of methods which are designed to develop and test
knowledge, understanding and a wide range of skills. These are often unfamiliar to
students and require tutors to adapt their working practices.
 Widening participation means that students have a variety of pre-university
academic experiences and diverse learning styles. Tutors need to be aware of the
implications for assessment.
 Rising numbers of students make it difficult to provide the attention and feedback
that most tutors would like to give, and yet there are pressures to ensure that the
standards and reliability of assessment are maintained and raised.

Assessment is driven by external agendas – in particular the QAA (through its audit and
review processes, the development of a National Qualifications Framework, subject
benchmarks and a Code of Practice for assessment) has been a powerful influence.
There are also internal drivers. The University's assessment policy, review and action
plan, and the Teaching and Learning Strategy, with its focus on learner-centred
approaches and formative assessment, are obvious examples of drivers that influence
assessment practice.

2
Contents Page

Section One: Assessment in modules and programmes: an overview 4

1.1 Assessment as one element of programme and module design 4

1.2 Constructive alignment - an underpinning philosophy 7

1.3 Assessment principles and purposes 8

1.4 External assessment regulations 9

1.5 Internal assessment regulations 9

Section Two: Assessment in practice 10

2.1 Devising assessment tasks 10

2.1.1 Selecting formative or summative assessment 10

2.1.2 Matching learning outcomes and assessment tasks 11

2.1.3 Devising appropriate assessment criteria 12

2.1.4 Introducing a variety of assessment tasks 12

2.1.5 Innovative assessment 16

2.2 Preparing students for assessment 21

2.3 Equality, diversity, disability and assessment 23

2.4 Marking and feedback 24

2.4.1 The marking load 24

2.4.2 Marking reliably 24

2.4.3 Feedback 24

2.5 Avoiding plagiarism and encouraging accurate referencing 29

2.6 Accrediting work-based and informal learning 29

References and Bibliography 30

Appendices 32

3
Section One - Assessment in modules and programmes: an overview
1.1 Assessment as one element of programme and module design
Choosing how to assess students and undertaking assessment fairly are two of the most
important roles of a university tutor. However, they do not stand on their own. They are
part of the wider process of programme and module design.

Stage One
Planning assessment begins when a programme team devises the programme aims and
learning outcomes and these are incorporated into a programme specification. The
specification should take account of external drivers, including the subject
benchmarks and any professional accreditation requirements. The range of assessment
opportunities and the rationale for including them should be clearly stated in the
programme specification as an assessment strategy. (This should include the
purposes, principles and procedures for assessment.) Amongst other issues, the
assessment strategy should take into account the following:

 Ensuring that a range of assessors are involved throughout the programme to
enhance the robustness of decisions;
 Engaging students in assessment tasks early in the programme, in order to
enable timely feedback to support future learning;
 Spreading the assessment load across the terms to avoid conflicting deadlines.

In addition, your assessment strategy should reflect the requirements of DDA 2005 to
reduce discrimination barriers and provide equality of opportunity.
In planning an assignment you should refer back to the programme specification
to confirm that you are reflecting the assessment strategy in your plans. You
should not need to refer to the subject benchmark and other external documents –
this should already have been covered in the programme specification.

Stage Two
The programme specification informs the process of module design. Module descriptors
(Definitive Module Record - DMR) include information on the assessment elements
(coursework and examinations), the mode of assessment (essays, reports, displays etc.)
and the assessment criteria (indicators of the way the assignments will be assessed).

In planning an assignment you should refer carefully to the DMR to confirm that
you are reflecting the modes of assessment and assessment criteria. You will find
that this information is repeated in the student handbook and therefore students
will already have expectations of the assessment for the module.

Stage Three
The individual tutor uses the information from stages one and two to design the
summative and formative tasks that will contribute to the assignment. At the same
time the assessment criteria should be refined so that they refer directly to the
assignment. Feedback opportunities should also be identified.

In planning the assignment build your ideas into a scheme of work and present
the students with a clear assignment brief explaining what is expected.

The diagrams on pages 5 and 6 develop this process in more detail.

4
Stages and influences in the design of programmes, learning and assessment
[Flow diagram, summarised as a sequence of steps:]

1. Consult the external reference points: professional accreditation requirements, the
national qualifications descriptor, the national subject benchmarks and the SEEC
levels descriptors.
2. Devise the Programme Specification and identify the Programme Aims.
3. Write (threshold level) Learning Outcomes covering: knowledge and understanding
(subject specific); cognitive/intellectual skills (generic); key/transferable skills
(generic); practical skills (subject specific); other, e.g. professional/employment
related (matching learning outcomes to programme aims).
4. Use the outcomes template to map programme outcomes across modules.
5. Use the information above to complete the headings in the Programme Specification.
6. Complete the Definitive Module Record (DMR) - see the DMR diagram and the
Review and Approval Handbook, Section 5.
7. Write assessment questions.
8. Design the teaching, learning and assessment strategy and activities (e.g. in-class,
directed study or coursework).
9. Organise sessions: aim, intended learning outcomes, teaching methods, learning
activities/opportunities.

5
From Definitive Module Record (DMR) to assessment
[Flow diagram, summarised as a sequence of steps:]

1. Complete the DMR (see the Review and Approval Handbook, Section 5), drawing on
the Programme Specification.
2. Design the module aims and intended learning outcomes, ensuring alignment to
external requirements (professional accreditation requirements, SEEC levels
descriptors).
3. Design summative assignments, assessment questions, assessment criteria (from
the learning outcomes and levels) and grade criteria.
4. Devise formative assessment tasks (sub-components which mirror the assignment).
5. Incorporate tasks into sessions, directed study or coursework, communicating the
links to outcomes and criteria.
6. Integrate tasks and feedback in the scheme of work, creating opportunities for
feedback to students (self/peer/tutor/computer).
7. Organise sessions with opportunities for learning, formative assessment and
feedback; the student uses feedback in revision and preparation.
8. Set the summative assessment (test/assignment/exam) and mark or grade it.
6
1.2 Constructive alignment - an underpinning philosophy

'From the students' point of view assessment always defines the actual
curriculum.' (Ramsden, 1992)

Biggs (1999) makes it clear that assessment should be constructively aligned with the
learning outcomes, content and learning activities in a module. He describes a process
called backwash – arguing that assessment determines students' learning more than
the actual curriculum does. If students focus their learning effort on their assignments,
then we should design these to genuinely capture the types of learning we want to see
happening.

Given this information, what might you do if you are designing a module? Where and
when do you 'design in' the assessment?

Linear approach
Decide on the aims and learning outcomes for the module
Select the content and the learning activities to support the learning outcomes
Select an assessment method and questions to test the content taught

Interactive approach
Decide on aims and outcomes
Select content  Select activities
Select appropriate assessment tasks

Assessment principles and purposes: in the University assessment policy
(available on the Extranet).
Assessment regulations - external: SEEC level descriptors (see section 1.4);
internal: Academic regulations (see section 1.5).

As the lower diagram shows, constructive alignment involves more than simply
co-ordinating the learning outcomes with the assignment. There are additional drivers
that are discussed in sections 1.3 to 1.5.

7
1.3 Assessment principles and purposes

The University has adopted nine principles to underpin assessment across all
programmes (see policy at: http://www.plymouth.ac.uk/pages/view.asp?page=8075).
How might they influence your assessment? Would you add any additional principles?

The principles, and what you might ask about each principle:

Assessment will be reliable: Do the results reflect the student's performance? Would a
repeat of the assessment produce a similar result/performance?

Assessment will be valid: Do the assessment tasks actually test what you want the
students to know and understand? Are all the module learning outcomes assessed in
one way or another?

Information about assessment will be explicit and accessible: Do all involved (students,
examiners, employers) understand the assessment purposes and processes? Do
students receive clear, detailed briefs?

Assessment will be inclusive and equitable: Are assessment methods, materials and
examination processes fair regardless of gender, race, disability, age, class, wealth
and sexuality?

Assessment will be relevant to the programme aims and outcomes: Have assignments
been designed to reflect the broader aims of the programme (these in turn will reflect
the subject benchmark and the SEEC generic skills)? Are assessment criteria
appropriate to the level of study?

The amount of assessed work required will be manageable: Can the work be done in
the time available and within existing constraints (facilities, numbers etc.)? Are
students over-assessed? Is the workload for staff achievable while maintaining
standards?

Formative and summative assessment will be included in each programme: Are both
types of assessment included in the module? Is formative assessment designed and
timed so that it helps students to improve their summative assessments?

Feedback will be an integral part of the assessment process: Is feedback precise and
detailed enough to guide future learning? Is the language used positive and
constructive? Is feedback given as soon after the submission as possible?

Each programme will include a variety of assessment types: Is a variety of methods
used to assess? Does the variety selected give all students the opportunity to
demonstrate their capabilities?

8
1.4 External assessment regulations
There is a range of external regulations that impact on the type of assessment we can
adopt in higher education. In particular we need to take note of the following:

 All credit-bearing courses need to comply with the National Qualifications
Framework (QAA 2004). This defines the levels at which awards are credited and
is intended to produce comparability across the country. The University regulations
for awards and credits are guided by the NQF requirements (University academic
regulations, updated annually) and they influence flexibility in programme design.

 The QAA publishes subject benchmarks which define the knowledge,
understanding and skills expected of a student who completes an Honours degree.
External examiners and auditors expect to see these reflected in assessment.

 The University has chosen to adopt the SEEC level descriptors (see page 36 of
the Assessment Policy) as the basis for defining the expectation of students
working at different levels/stages of their awards. These influence the choice of
learning outcomes and knowledge, understanding and skills assessed in a module.

 Many university programmes are influenced by the requirements of the
professional bodies that accredit them. In particular, the assessment of work
placements may be defined by external criteria. There may be extra requirements
(such as skills development) that influence the assessment methods used.

1.5 Internal assessment regulations

These are annually updated in the University's academic regulations. They include:

 Definitions of types of assessment (e.g. examination, OSCE, coursework portfolio)
 APEL/APCL procedures
 Exam and assessment offences (e.g. plagiarism)
 Late coursework and extenuating circumstances

The academic regulations are available on the Academic Regulations Sub-committee
community at: http://staff.plymouth.ac.uk//acregsc/AR20089/intranet.htm

The University's Assessment Policy is accompanied by an implementation strategy
that defines the basic requirements for all programmes. These are available in full online at:
http://www.plymouth.ac.uk/files/extranet/docs/TLD/assessment%20policy%20updated%20January%202008v2.pdf

Implementing the assessment policy involves using:

 Level descriptors and grade descriptors. These may be prepared jointly by a
department or faculty but you will need to use them in setting and marking work.
More information about level and grade descriptors can be found on pages 11 and
12 of the University's Assessment Policy.

 Assessment criteria linked to the learning outcomes of the module. These may be
shared between modules in a programme but you will often need to 'customise'
them for your particular assignment. More information on assessment criteria can
be found on page 15 of the University's Assessment Policy.
9
Section Two - Assessment in practice
This section will take you through the process of devising assessment tasks, introducing
them to students, marking work and providing feedback.

2.1 Devising assessment tasks

Assessment has become a hurdle when, in an ideal world, it should focus and
motivate. Assignments should be part of a learning process, not about catching students
out on what they don't know and can't do. Let's find out what they can do. This involves
careful thought about the nature of the assignments we devise.

2.1.1 Selecting formative or summative assessment


It makes sense to build activities into a module which give learners opportunities to
develop and practise dealing with the kinds of questions and assessment challenges that
may arise later. This is called formative assessment, as it contributes to and helps form
and inform the learning. It is part of the formation of the student's knowledge and
ability. The formative activities may be very brief and informal (part of a lecture or
tutorial), or they might be short in-class tests or directed study tasks; they always involve
providing feedback.

Summative assessment is what 'counts' in terms of marks and grades. It 'sums up' the
learning. Normally first-year marks do not count towards classification but allow student
and tutor to monitor strengths and weaknesses. However, the work does count in that it
determines progression. Summative assessment (e.g. exam, essay, coursework) often
gives students feedback, and if the learning process is designed to incorporate tasks from
a number of smaller, formative activities it is said to 'mirror' previous formative
assessment activities. This enhances opportunities for students to understand
assessment requirements and practise skills.
Students will be much more motivated to do formative work - even if it doesn't count - if
they can see the connection between what they are doing and their ability to perform well
and gain marks subsequently. The module should therefore be designed to show where
these formative experiences will occur and how they relate to the learning. These
opportunities should be clear in the scheme of work.

For example, if the summative examination asks students to 'Critically analyse a set of
data; extrapolate from these to speculate on future implications and make
recommendations with justification', then formative activities and tests throughout the
module could reflect these skills, which would be brought together for that final task,
e.g.:
Task 1 (in class): 'Classify given sets of data.' Formative assessment through a model
answer on OHT, worked through in class by tutor or students, with the tutor commenting
on strengths and weaknesses.

Task 2 (directed study): 'Compare and contrast these data to establish cause and effect,
using examples to substantiate the argument.' In the next session, students swap
their work; the tutor gathers responses and comments on how marks would be won or
lost.
Task 3 (in class): 'In groups, select one of the causal factors and explore trends in these
areas, then make a poster or short presentation to explain it.' Alternatively, coursework
could be prepared by the groups between sessions and be peer assessed (see the
peer-assessment grid) in a subsequent session. It may be used summatively to award
marks and/or formatively to give feedback on the presentations.

Task 4 (in class): 'Discuss and evaluate suggested strategies.' The posters could be
used as a basis for discussion in a tutorial. Ideas, implications and suggestions for
dealing with issues could be shared, and then the tutor and/or the group could critique
the strategies. Alternatively, assessment (by the tutor sitting outside the discussion
circle) could be used to track and give feedback to students on their contribution to the
discussion and on the kinds of recommendations that would be viable.
Obviously the examination would use different data, but the process will be transferable
and achievable.

2.1.2 Matching learning outcomes and assessment tasks

Writing learning outcomes is covered in the companion to this booklet, 'Designing your
programme and modules'. However, it is important that assessment tasks are devised to
link to the level and wording of the outcomes. One important aspect of this is to find the
right words to define exactly what students should do (it is important to share the
meaning of the words with students as well). The words included below are derived from
Bloom's Taxonomy (1956).

Knowledge
describe, recall, define, state, recognise, name, list, underline, reproduce, measure,
write, label, identify, acquire, record
Understanding/Comprehension
comprehend, understand, draw, interpolate, extrapolate, predict, have insight into,
translate, give examples of…
Application
apply, show, demonstrate, perform, use, relate, develop, transfer, infer, construct,
translate, illustrate, experiment, refine
Analysis
analyse, identify, detect, distinguish, separate, compare, categorise, investigate,
seek out, explore
Synthesis
combine, restate, summarise, précis, generalise, conclude, derive, organise, design,
deduce, classify, formulate, propose, visualise, solve, realise
Evaluation
evaluate, judge, decide, argue, choose, recommend, critically analyse, select,
defend, assess, self assess, hypothesise, review
Skills
communicate, present, work in team, debate, collaborate, negotiate, reflect on,
assess, resolve, plan, perform

11
2.1.3 Devising appropriate assessment criteria

Using assessment criteria implies that criterion referencing, rather than norm referencing,
is being adopted for assessment. It is important to distinguish between the two. Moon
(2002) suggests that 'in a criterion referencing system, the judgement of the learner's
work is made on the basis of its quality in relation to pre-defined criteria while a norm
referencing system is based on a pre-arranged distribution of grades'. The other
important difference is that norm referencing involves judging work by comparison
between individuals, while criterion referencing should (but does not always) involve
judgements being made independently about each individual (Biggs, 1999; Race, 2001).

There are two types of assessment criteria used widely in higher education: grade
criteria (see Insight 2, page 12 of the Assessment Policy) and threshold criteria (see
Insight 3, page 13 of the Assessment Policy). There are many ways of writing
assessment criteria and these are often linked to traditions in the subject or discipline. A
thorough discussion of the issues and plenty of good examples of both grade and
threshold criteria can be found in Jenny Moon's Module and Programme Development
Handbook (2002).
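To make the distinction concrete, the following is a minimal sketch in Python. The grade bands and quota percentages are invented for illustration; they do not come from the Assessment Policy.

```python
def criterion_referenced(mark):
    """Grade depends only on the individual's mark against pre-defined bands."""
    bands = [(70, "A"), (60, "B"), (50, "C"), (40, "D")]
    for threshold, grade in bands:
        if mark >= threshold:
            return grade
    return "Fail"

def norm_referenced(marks):
    """Grade depends on rank within the cohort: here the top 10% get an A,
    the next 30% a B, the next 40% a C and the remainder a D, regardless
    of the absolute marks achieved."""
    cumulative = [(0.10, "A"), (0.40, "B"), (0.80, "C"), (1.00, "D")]
    # indices of the cohort sorted from highest to lowest mark
    order = sorted(range(len(marks)), key=lambda i: marks[i], reverse=True)
    grades = [None] * len(marks)
    for rank, i in enumerate(order):
        fraction = (rank + 1) / len(marks)
        grades[i] = next(g for cum, g in cumulative if fraction <= cum + 1e-9)
    return grades
```

With the cohort [72, 65, 48, 35], a mark of 72 earns an A under criterion referencing but only a B under this norm-referenced scheme, because the grade depends on rank within the cohort rather than on the pre-defined criteria.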

It is important to remember that different levels of achievement are expected at each
stage of the degree award. This has led to some universities requiring that grade
descriptors are constructed for each stage of the degree, with the level descriptors used
as a guide to expectations. Whilst this is not required at Plymouth, it is worth considering
how you differentiate between criteria at each stage.

2.1.4 Introducing a variety of assessment tasks

The most effective way of stimulating interest in assessment is to offer students a wide
variety of types of assessment. This ensures that students with a diverse range of
learning styles are catered for and that accessibility is assured for students with
disabilities (see Insight 5, pages 21-23 of the Assessment Policy). Programme teams
should review, from time to time, their assessment processes and consider the value of
formal examinations, as opposed to alternative forms of assessment. The variety of
assessment methods available is enormous. Of course, not all types will suit all subjects
and it is important to evaluate the advantages and disadvantages of adopting particular
methods. The list provided overleaf separates assessment methods into a range of
different categories, and the subsequent table cites some advantages and
disadvantages of different approaches. An excellent chapter in Race (2001) provides a
very readable analysis of the top fifteen assessment methods and tips are provided on
how to avoid the most common problems with each. Use Race to help you to complete
the table on page 15.

12
Varying Assessment

Contents page from Gibbs, G., Habeshaw, S., Habeshaw, T. (1988), 53 Interesting Ways
to Assess Your Students, TES Ltd.

Essays
1. Standard Essay
2. Role Play Essay
3. Structured Essay
4. Interpretation of Evidence
5. Design
6. Note-Form Essays
7. Hypothesis Formation

Objective Tests
8. Right/Wrong
9. Short Answer
10. Completion
11. True/False
12. Matching
13. Multiple Choice
14. Multiple Completion
15. Assertion/Reason
16. Best Answer

Alternative Exams
17. Seen Exam
18. 168-Hour Exam
19. Revealed Exam Questions
20. Open Book Exam
21. "Doing It" Exam

Computer Based Assessment
22. Computer Marking
23. Computer Generated Test Papers
24. Computer Generated Problems
25. Computer Feedback to Students
26. Computer Based Keller Plan
27. Assessed Computer Simulations
28. Computer Marked Practicals

Assessing Practical and Project Work
29. Viva
30. Crits
31. Observation

Assessing Group Project Work
32. Shared Group Grade
33. Peer Assessment of Contribution to Group
34. Second Marker's Sheet
35. Exhibition
36. Diaries and Log Books
37. Project Exam
38. The Instant Lab Report
39. Laboratory Notes

Criteria
40. Criteria for Students
41. Project Criteria
42. Negotiating Criteria
43. Marking Schemes
44. Staff Marking Exercise
45. Profiles
46. Hidden Criteria
47. Criterion Referenced Assessment
48. Pass/Fail

Feedback to Students
49. Teach-Test
50. SAQs
51. Feedback Classroom
52. Student Requests for Feedback
53. Feedback Checklists

13
Comparison of advantages and disadvantages of different types of test

Prepared essay exam
Some possible advantages: Easy to set. Demonstrates ability to organise ideas and
express them effectively.
Some possible disadvantages: Unreliable and time-consuming to mark. Emphasis on
writing speed. Poor coverage of syllabus.

Open book essay exams
Some possible advantages: Less study time spent memorising. Inter-disciplinary
answers obtained. A typical performance.
Some possible disadvantages: Poor feedback. No valid method of marking.

Problem-centred or case study exam
Some possible advantages: Memorising less necessary. More realistic test of ability.
Some possible disadvantages: Anxiety of the situation can disrupt problem-solving skill.

Short answer questions
Some possible advantages: Broader coverage of syllabus. More reliable marking.
Some possible disadvantages: Little opportunity to display argument or originality.

Projects, dissertations and theses
Some possible disadvantages: Can take up too much of the student's time. Objectivity
in marking difficult. Grading almost meaningless.

Objective tests including MCQs
Some possible advantages: Can test a wide range of objectives. Broad coverage of
syllabus. Objective and reliable marking. Precise feedback. Computer marking possible.
Some possible disadvantages: Difficult to set so as to avoid ambiguous questions.
Require careful presentation to avoid too many questions testing only recall. Provide
cues that do not exist in practice.

Oral situations, e.g. viva or group discussion
Some possible advantages: Provide direct personal contact with candidates. Provide
opportunity to take mitigating circumstances into account. Provide flexibility in moving
from the candidate's strong points to weak areas. Require the candidate to formulate
his/her own replies without cues. Provide opportunity to question the candidate about
how he/she arrived at the answer. Provide opportunity for simultaneous assessment by
two examiners. (Unfortunately all these advantages are rarely used in practice.)
Some possible disadvantages: Lack standardisation. Lack objectivity and
reproducibility of results. Permit favouritism and possible abuse of the personal contact.
Suffer from undue influence of irrelevant factors. Suffer from a shortage of trained
examiners to administer the examination. Are excessively costly in terms of professional
time in relation to the limited value of the information they yield.

Simulated tasks
Some possible advantages: Closely approximate professional work.
Some possible disadvantages: Careful preparation of the marker's checklist is
necessary.

Practicals
Some possible advantages: Provide opportunity to test, in a realistic setting, skills
involving all the senses while the examiner observes and checks performance. Provide
opportunity to confront the candidate with problems he/she has not met before, both in
the laboratory and at the bedside, to test his/her investigative ability as opposed to the
ability to apply ready-made "recipes". Provide opportunity to observe and test attitudes
and responsiveness to a complex situation (video recording). Provide opportunity to test
the ability to communicate under pressure, to discriminate between important and trivial
issues, and to arrange the data in a final form.
Some possible disadvantages: Lack standardised conditions in laboratory experiments
using animals, in surveys in the community, or in bedside examinations with patients of
varying degrees of cooperativeness. Lack objectivity and suffer from intrusion of
irrelevant factors. Are of limited feasibility for large groups. Entail difficulties in arranging
for examiners to observe candidates demonstrating the skills to be tested.

14
Innovative Techniques table

Rate each technique against the following qualities: saves tutor time to mark; more fun
for tutor; more fun for student; increased skills; more feedback; deeper learning.

Techniques: group work; computer-based assessment; case studies; proportional
marking; peer assessment; self assessment; games/simulations; posters; diary or
journal/log; role play essays; word-constrained assignments; design; open book exams;
observations; portfolios; project; project exam; lab notes; pass/fail.

Traditional techniques: standard essay; short essay exams; lab reports.

15
2.1.5 Innovative assessment

Self and peer assessment


Peer assessment and self assessment may well offer ways of reducing the marking load
on tutors. Use them initially for assessment that 'doesn't count'. They are particularly
appropriate for formative assessment and can help prepare students better for summative
assessment. When students are sufficiently reassured about the validity of these
processes they can be applied to summative assessment. The tutor's role becomes more
that of an arbitrator if difficulties arise, and of a moderator, sampling for reliability.

The tutor may hand out answer sheets for the learners to mark themselves (self
assessment) or swap with another person for marking (peer assessment). Attachment
sheets are particularly useful to guide learners in marking themselves. They have the
added benefit that students must engage with the task and criteria to mark someone
else's work, and if they also have to give feedback to another it ensures they have
internalised the material/task/criteria, thus consolidating the learning.

Self assessment
Being able to assess themselves is an essential skill for students, both to develop
awareness of their own learning and for their future continuing professional development
and lifelong learning. It is the first step to deep learning. It needs a structure, e.g. a cycle
of reflection (SWOTs, learning plans etc.), and should be linked to Personal Development
Planning (PDP). You can help learners by producing self-assessment sheets along with
the assignment briefing, which help them focus on the criteria and their own needs. It
could form a small input to marking once students feel convinced by the process.

Peer assessment
This process develops judgement, especially of critical thinking, and makes students
engage with the criteria in order to be able to allocate the marks. It is particularly
effective for presentations, performances and posters. If the criteria are negotiated with
the students it creates a feeling of ownership by the individual and the group; they
understand why the criteria are there and what they mean. They will also turn up, pay
attention to and learn from others' presentations. (NB SCL aims to encourage students
to learn collaboratively.) You can reserve the right to insist on certain criteria if the
students don't come up with them, and you can agree that you will mark some of the
aspects (e.g. content) while they mark others (e.g. quality of presentation/ability to
relate to audience).

Combined tutor, peer and self assessment (triadic assessment)

This demands front-loading, good briefing and a rationale. Tutor guidance and support
are important and you may need to act as a referee.

Level implications
 Use with first years to develop judgement (first-year marks don't count towards the
award); use at Level 2 with care, but only use at Level 3 (when marks have a large
impact on degree classification) if you have students' total backing.
 Use with postgraduate work.
 Use formatively initially, until students feel convinced of its validity.

A peer-assessment grid

Criteria              Weight    A    B    C    D    E    F    G    H
1
2
3
4
5
6
7
8
Total

An example of a grid which can be used by each learner in a group, where they all peer-assess eight
examples of work (e.g. presentations, posters) against a set of agreed criteria.
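Totalling such a grid is mechanical once the weights are agreed; a minimal sketch of the arithmetic (the criterion weights and scores shown are purely illustrative, not a prescribed scheme):

```python
def weighted_total(scores, weights):
    """Combine one assessor's raw scores (one per criterion) with the
    agreed criterion weights into a single total for a piece of work."""
    if len(scores) != len(weights):
        raise ValueError("expected one score per criterion")
    return sum(score * weight for score, weight in zip(scores, weights))

# Illustrative: three criteria weighted 3, 2 and 1, each scored out of 5
print(weighted_total([4, 3, 5], [3, 2, 1]))  # 4*3 + 3*2 + 5*1 = 23
```

Computing the totals this way (or in a spreadsheet) also makes it easy to compare how far different peer assessors diverge on the same piece of work.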

A self-assessment grid

Criteria                  Weight    Score    Comments
Agreed criteria
Idiosyncratic criteria
Total

An example of a grid which learners can use to self-assess their own work against some agreed
criteria, with space for individual learners to identify additional criteria with particular relevance to
their individual approaches to the task.
From Race, P. & Brown, S. (1991) 500 Tips for Tutors.

Group assessment
Group work is essential to develop an understanding of, and the ability to work in, teams.
It is a skill crucial to employment and is therefore an important element of the curriculum.
The fact that it can also reduce marking time is an added bonus. For example, if you
have 120 students producing individual projects it might take you (say) 60 hours to mark
them. If they do the projects in groups of four it will not take just 15 hours to mark, but it
will take closer to that than to 60: the reduction is not proportional because group projects
are larger and therefore take longer to mark and give feedback on. Marking is easy if
students decide to share the marks equally between group members, but usually they
only agree to this once, as the better and harder-working students see their marks
reduced by lazier members. Although this does reflect professional life, it is generally felt
to be unfair at undergraduate level. Allocating marks differentially between group
members, on the basis of their input, can only be done by the group members
themselves. Ask them who contributed most to the group using a peer appraisal report;
the form below works reasonably effectively. However, it must be completed privately and
submitted in confidence, or completed in class under exam conditions. It also involves you in
translating the responses into higher or lower marks according to each member's contribution.
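Translating peer-appraisal responses into differential marks can follow many schemes; one simple possibility, sketched below under stated assumptions (the member names, ratings and the proportional-scaling rule are all illustrative, not a prescribed University method), is to scale the group mark by each member's rated contribution relative to the group mean:

```python
def differential_marks(group_mark, contributions, cap=100):
    """Scale one group mark up or down for each member, in proportion
    to that member's peer-rated contribution relative to the group mean.
    'contributions' maps member name -> summed peer-appraisal rating."""
    mean = sum(contributions.values()) / len(contributions)
    return {name: min(cap, round(group_mark * rating / mean))
            for name, rating in contributions.items()}

# Illustrative figures: a group mark of 60 and summed peer ratings.
# Ann (above the mean rating of 15) gains marks; Cal (below it) loses marks.
marks = differential_marks(60, {"Ann": 20, "Ben": 15, "Cal": 10})
```

The cap keeps a strong contributor's mark within the scale; whichever scheme you adopt, publish it in the assignment briefing so students know how their appraisals will be used.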

Peer appraisal form for group project

Module/Unit . . . . . . . . . . . . Date . . . . . . . . . . . . . Your name . . . . . . . . . . . . . . . . . . . . . . . . .


Please complete one matrix for each of your colleagues in the group that you have worked with
by placing a tick in the most appropriate box for each of the criteria. Do not complete a box for
yourself. This information is confidential, so put your completed Report Form in the box provided
(or return it later to the member of staff involved in the assessment).

Name . . . . . . . . . . . . . . . . . . . . . . .

Group work skills              LOW  HIGH     Project skills             LOW  HIGH
Time & work contributed                      Research & analysis
Communication skills                         Initial ideas
Co-operativeness                             Development of ideas
Creative input/originality                   Synthesis
Overall contribution                         Final presentation

Computer Assisted Assessment (CAA) is efficient for either formative or summative
assessment, but requires considerable time to come up with an adequate range of
questions. The University has a site licence for the software "Question Mark
Perception". The student logs on at a computer terminal and completes the test on
screen.

It is excellent for formative learning as it offers instant feedback to students who can go
through the tests as often as they want.
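The instant-feedback loop can be illustrated with a small stand-alone sketch (this is not the Question Mark Perception software or its API, just an assumed minimal structure for a multiple-choice item with per-option feedback; the item reuses the standard-score question shown later in this section):

```python
# One multiple-choice item with per-option feedback, so a wrong answer
# explains the likely misunderstanding immediately (illustrative data only).
QUESTION = {
    "stem": "A raw score of 124 (mean 100, SD 12) is a standard score of:",
    "options": {"A": "+24", "B": "-2", "C": "+2", "D": "84%"},
    "answer": "C",
    "feedback": {
        "A": "That is the difference from the mean, not the z-score.",
        "B": "Check the sign: 124 lies above the mean.",
        "C": "Correct: z = (124 - 100) / 12 = +2.",
        "D": "A percentage is not a standard score.",
    },
}

def mark(choice):
    """Return (is_correct, feedback) the moment the student answers."""
    return choice == QUESTION["answer"], QUESTION["feedback"][choice]
```

Writing the per-option feedback is where the design time goes, which is exactly the "time consuming to design" point in the table below.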

Advantages
 relatively wide variety of question types, including multi-media materials
 consistent, and easy to update at any time
 instant feedback
 very good for formative assessment, as feedback is incorporated
 good for diagnostic tests and focussed feedback
 flexibility of access/time/location/repetitions
 can randomise question order
 can be incorporated with distance learning
 saves time and paper

Disadvantages
 time consuming to design good questions and feedback
 needs a computer for each student to sit at
 needs organisation
 questions and the feedback responses are harder to write

Types of question

 multiple choice (true/false, two-tier, assertion-reason)
 multiple response
 hot spot
 fill in blanks/completion
 best answer
 text matching
 numeric
 selection
 matrix
 explanation "questions"

See examples from Gibbs, G., Teaching More Students 4: Assessing More Students.
PCFC

Examples of types of questions for computer based assessment
Short answer

Q. Name three of the "first generation" New Towns that were designated in Great Britain
between 1946 and 1950:
1. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
2. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
3. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Completion

Q. Central Place Theory was originally formulated by . . . . . . . . . . . . . . . in Germany in
the 1930s.

True/False
1. Modern crofting was founded by the 1886 Crofters' Holdings Act.
TRUE/FALSE
2. The run-rig system was introduced by Jethro Tull TRUE/FALSE
3. Kelping was widely practised in the interior parts of Highland Scotland. TRUE/FALSE

Matching

Q. Match each of the writers in List Y with one of the books in List X by filling in the
boxes below the lists. Do not use any of the boxes more than once.

List X                                    List Y
1. Towards a New Architecture             A. Herman Muthesius
2. To-morrow: A Peaceful Path to          B. Lewis Mumford
   Real Reform                            C. Patrick Geddes
3. When Democracy Builds                  D. Frank Lloyd Wright
4. The Culture of Cities                  E. Ebenezer Howard
5. The English House                      F. Walter Gropius
                                          G. Le Corbusier

List X   1   2   3   4   5
List Y

Multiple Choice

Q. In a test with a mean of 100 and a standard deviation of 12, a raw score of 124 is
equal to a standard score of:
A +24
B -2
C +2
D 84%
E 124%

Best Answer
Q. What does the term "teleological" mean when applied to the early attempts of
geographers to study the relationship between people and the environment?
1. the view that the earth has been designed for human purposes by a supreme being
2. the argument that human beings are an integral part of their environment
3. that people are at the mercy of their physical environment
4. that people should always seek to change their environment to suit their own ends
5. that the environment never changes

The above are examples of six types of question illustrated in Teaching Geography in
Higher Education (Gold et al, 1991).

2.2 Preparing students for assessment

One of the more frequent complaints made by students about their academic experiences
(for example in module evaluation forms) is that they are insufficiently prepared for
assessment tasks. Sometimes this means that they were given insufficient notice.
Coursework assignments should always be notified (e.g. submission dates; nature of
assignment) at the beginning of the module and the details should be distributed at least
four weeks before the submission date. For more complicated assessment students will
need much longer than four weeks. However, in most cases students are concerned that
they are being given insufficient information about the assignment to allow them to work with
confidence. This wastes time as individual students approach tutors for clarification; it also
leads to disappointing results and sometimes to student appeals.

The simplest and most effective way to provide assessment information is to use an
assessment briefing:
 Produce a briefing sheet (see Insight 4, page 18 of the Assessment Policy) and hand it
out in a session. Make the assignment as clear and unambiguous as possible. Include
guidance notes about any structure or styles that should be used. Include a maximum
word count (we all work to word limits and keeping it low makes students learn to be
succinct and saves you reading time).
 The better the briefing, the better focused the students will be, and the more able to
produce what you want them to show they have learnt. This will reduce the need for
lengthy feedback and save some marking time. This is sometimes called 'front-ending'
or 'front-loading'. It is worth explaining to students that advice or feedback delivered
before or during assessment is equally valuable for learning, if followed.
 Talk it through with them in a session. Anticipate the kinds of things they might get
wrong and give advice to help them avoid these pitfalls. Get them to note anything they
feel unclear about, then to discuss it with one or two people around them and see if they
can sort out each other's queries. Allow them to ask you about any points that are still
unclear. Answer them, but make it clear that that is all the advice they will get. Do not
respond to further queries in the corridor, in your office or by email. There will always be
some who ask for more and more detail; this gives them an unfair advantage and eats
up your time.

An example of an assignment briefing template is provided below. However, before using
this template check whether there is a standard version in use in your department. This may
save you time if the departmental version includes standard information e.g. arrangements
for submission of coursework, arrangements for claiming extenuating circumstances.

Assignment Briefing

Module code:
Module title:
Module leader:
Tutor responsible for this assignment:

Learning outcomes addressed by this assignment
(including generic key skills)

The requirements for this assignment
(title, practical details, topic for presentation or performance)

Assessment criteria
(refer to general programme criteria, where they can be found, and/or give specific
criteria for this assignment)

Type and weighting of assignment
(coursework/exam; size/length/time limit; weighting)

Special instructions
(resources available, timetable for presentations etc.)

Submission and return of work
(submission deadline; arrangements for submission; standard arrangements for
extenuating circumstances; nature of feedback arrangements and approximate
timetable for return of work)

Accessibility statement
(special arrangements for accessibility)

There are some general assignment issues that should be raised with new students to
induct them into academic practices. The most important relates to plagiarism and other
types of academic dishonesty (see the separate document, 'Plagiarism: A briefing'). It is
worth spending part of a seminar or lecture on these issues.

2.3 Equality, diversity, disability and assessment

Inclusive assessment needs to address a number of issues including gender, race and
disability. It is now widely recognised that there may be a range of different ways of
assessing the same learning outcomes, and that there may be some benefits in
increasing the flexibility of assessment modes offered to students. When approving new
programmes, faculties are required to 'consider the range of diverse learners and the
removal of barriers to equality of opportunity, as defined by legislation with particular
regard to disability, race and gender' (see 'Approval of a New Programme' in the Quality
Assurance Handbook). Specific guidance with respect to disability from the Academic
Regulations Sub-Committee states that:

“In line with the positive duties required by the Disability Discrimination Act 2005 it is the
intention of the University that students should not be disadvantaged when being
assessed, whether due to a disability, impairment or other temporary injury or condition.”

It suggests that, where necessary, students should be offered one of the following
options:

a) Alternative assessment – which enables them to meet the learning outcomes via
a different approach (for example by Viva Voce rather than examination or written
assignment)

b) Modified assessment provision (MAP) – for example, extra time, separate room or
use of a computer in examinations

c) Inclusive assessment – whereby a flexible range of assessment modes is offered
to all students

You are encouraged to design assessments which are inclusive wherever possible,
rather than taking a compensatory approach for individual students. Particular issues
relating to disabled students are outlined in detail in the University Assessment Policy,
pages 19-23. Further information is also available from the Academic Regulations Sub-
committee community and detailed guidance on designing inclusive assessments is
available on the SPACE project pages (Staff-Student Partnership for Assessment
Change and Evaluation): http://www.plymouth.ac.uk/pages/view.asp?page=10494

2.4 Marking and feedback

2.4.1 The marking load

Marking is one of the most time-consuming roles undertaken by a tutor, and with the
large number of students on many courses the implications for marking time are
considerable. Try this calculation to work out the marking load you will carry this year
for a course you deal with.

Coursework
  No. of students x time taken to mark one = hours

Examination or other formal assessment
  No. of students x time taken to mark one = hours

Compare this to teaching:
  Time taken for face-to-face teaching
  + Time taken supporting teaching = total teaching hours

If this does not cause you to consider your assessment strategies carefully, you are
probably very lucky.
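The calculation above is trivial to automate if you want to compare several modules; a minimal sketch (the student numbers and minutes per script in the example are invented purely for illustration):

```python
def marking_load(n_students, mins_coursework, mins_exam):
    """Total marking hours for one module: students multiplied by
    minutes per script (coursework plus exam), converted to hours."""
    return n_students * (mins_coursework + mins_exam) / 60

# Invented example: 120 students, 30 min per coursework script and
# 20 min per exam script comes to 100 hours of marking.
print(marking_load(120, 30, 20))
```

Setting this against your total teaching hours makes the case for streamlining assessment very quickly.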

2.4.2 Marking reliably

A headline in the Times Higher (28th June 2002) highlighted the results of some research
undertaken with ESRC funding. Entitled 'Marking under fire as essay grades exposed as
a lottery', it revealed that four essays were sent to 100 lecturers at 24 universities: 50%
agreed on the marks for the essays while the rest awarded widely divergent marks. This
experiment is often repeated within institutions, and the marks awarded and the ranking of
assignments diverge remarkably (search the Times Higher Education archives at
http://www.timeshighereducation.co.uk/ for more recent examples).

Mark the Concorde essays (Appendix 1) with a group of colleagues to test this
idea yourselves.
2.4.3 Feedback
All learners need feedback to help them understand and improve. Thoughtfully designed
feedback can greatly enhance motivation; however, feedback is almost invariably one of
the lowest-rated aspects of the National Student Survey (NSS). Ill-considered feedback
demotivates and may result in an inappropriate focus in further work. The tone and
balance of feedback are important, and it needs to be accessible to a range of learners
(unambiguous, accessible to sensory-impaired students, perhaps with bullet points, and
always in plain English). Use the 'feedback sandwich': start with the positive (there's
always something); then address the problems; conclude with something encouraging.
Choosing the best way to give good and bad news is important.
Consider the Concorde essay feedback (Appendix 1) with a group of colleagues to
focus on appropriate feedback.

Good news

What and why?
Students need to know what they've done right, or well, so that they'll keep on doing it
right or well. Also because it will make them feel appropriately good about themselves
and their work, which in itself aids learning as well as feeling good. Students need to
know why their work was right or good. Learners sometimes do well by accident - so tell
them in what respects it was right or good.

Giving good news
Good news needs to be:
 clear - Don't beat about the bush. If you think it was "great" or "excellent" or
"admirable" or "very stimulating", then say so. Have the courage of your convictions.
(Don't worry about clichés!)
 specific - Words like "great" or "excellent" carry a strong emotional message, but
when the emotional buzz fades, the intellectual hunger remains. Say what, exactly
what, was good and say why it was good.
 personal - Make the person feel acknowledged as an individual. This will become
easier as you get to know your students. Names help, e.g. 'Sarah, I thought the way
you handled this was both valid and original. I particularly liked the way you …'
 honest - Honest good news clearly distinguishes between fact and judgement. Be
clear what the nature of your good news is: a numerical answer is "right" = fact; a
design was undertaken "rigorously" = opinion, though based on clear criteria for
rigour; an argument was "original" = fact, at any rate relative to your own current
knowledge; an argument was "elegant" = opinion, or a judgement.

Encouragement
Round off your feedback on a high note, with encouragement: "You really seem to be
getting to grips with this." "Your analytic skills are improving steadily." "You're making
good use of evidence."

Bad news

What and why?
Students need to know what they've done wrong, or poorly, or whether they have
performed in some other way which is inappropriate within the subject. Immediately
and always they need to know in what respects their work was wrong or poor or
inappropriate. They also need suggestions on ways in which it could have been correct
or better.

In numerical or scientific disciplines, where at least some of the answers to some of the
questions can be right or wrong, reasons for giving prompt and reasoned feedback on
wrong answers include:
 so that learners won't repeat the specific error
 so that they can identify the misunderstanding which led to the error
 so that they can develop a new and correct understanding.

In disciplines where answers are more likely to be considered good or bad rather than
right or wrong, reasons for giving this kind of feedback on poor answers include:
 to help the learner appreciate why their approach or answer was inappropriate
 to help them see the preferred approach
 to help them see why the preferred approach is preferred.

Giving bad news
Bad news needs to be:
 specific - Make it clear what you are reacting to: which word, which idea, which
equation, which stylistic feature. Explain in what respects the work is wrong/weak/
inappropriate.
 constructive - Suggest how the work could have been made accurate, good, or to
conform to the paradigm of the subject. Suggest sources of information and
guidance. Give the student a handle, encouragement, whatever seems right.
 kind - Specific is kind. Constructive is kind. "Poor" scribbled at the bottom is cruel.
 honest - See "Good news".

Based on Baume & Baume (1996) Assessing Students' Work, in the Learning to Teach
series, OCSD

Making feedback meaningful and effective

 Be clear about what feedback students can expect and when. Then stick to it! Explain
that one-to-one discussion when working in the lab, in the field, or on a project is also
feedback.
 Be specific in your feedback – merely underlining words, or putting a question mark in
the margin is unlikely to help students learn from their mistakes. Clearly link
comments to stated criteria.
 Ask students to tell you which things they want feedback on then you can concentrate
on these aspects.
 Ask students to hand in their work along with a self assessment attachment sheet.
You can congratulate them on areas of agreement with your own judgements and
focus the feedback on those areas where you differ and explain why you do.
 Get students to hand in drafts or essay plans and give brief feedback on these thus
reducing extended feedback later if they go off at a tangent. It also makes them plan!
 Give feedback as soon as you can, better less but earlier, than more but later.
 Explicitly provide 'feed-forward', i.e. feedback which will help students to improve
future work.
 E-mail feedback to students where possible to increase the chances of them reading
it; group feedback can be useful if many students made similar mistakes, but only as
an addition to individual feedback not as an alternative.
 Bring the group of student together to discuss and explain feedback – this will give
you a better understanding of the ways in which students interpret your comments (it
may not be what you expect!)
 Consider giving feedback through video or audio-recordings (sometimes thought to
be a more personalised approach).
 Vary the weighting of particular criteria across assessments to direct students'
energies, e.g.:
 in Level 1 essays, allow a good proportion of marks for structure and referencing;
in later work, by which time these skills should be internalised, these criteria
should carry fewer marks.
 encourage what you value: if you want an original viewpoint, allocate marks for it
and communicate that convincingly in the briefing.

If you have read what Gibbs & Simpson (2002) have found out about feedback you
may find it a little depressing. Is it worth the effort? They suggest five 'feedback
conditions' that will lead to better learning. To what extent can you translate these
conditions into immediate personal practice?

Attachment sheets
An attachment sheet, or feedback sheet, is a page that shows the criteria and perhaps
their weighting. You attach one to each piece of work before it is returned to the student,
having indicated how the student has done on each criterion. There is usually also a
space for comment. It enables individual feedback to be brief and to the point. You might
also write a summary of the areas most of the cohort did well on, address those that
most struggled with, and suggest how those areas could be improved. This saves you
writing the same things many times and helps students see their own performance in
perspective.
Example 1 Essay feedback sheet

Knowledge
  Text:                           deep, thorough, detailed knowledge  superficial knowledge
  Author:                         wide knowledge used in analysis  knowledge lacking or not used
  Genre:                          wide knowledge used in analysis  knowledge lacking or not used
  Historical and social context:  wide knowledge used in analysis  knowledge lacking or not used

Essay
  Structure:                      clear, logical structure  confused list
  Quotations:                     correct, purposeful use, properly referenced  references lacking or incorrect
  Other sources:                  wide range, relevant, properly referenced  none or irrelevant
  Grammar, spelling:              correct  many errors

Personal
  Response to text:               vivid, personal  no response
  Viewpoint:                      clearly expressed  viewpoint lacking or unoriginal
  Other sources:                  imaginative, surprising  predictable

Critical theory
  Understanding:                  clear grasp  no grasp
  Viewpoint:                      wide range, appropriately used  range limited or inappropriately used
Example 2 Seminar Assessment Grid (all marks are moderated)

Name:

Individual performance (60% weighting)
  Clarity/quality of spoken presentation                /10
  Clarity/quality of visual aids                        /10
  Relevance of information to overall topic             /10
  Individual timekeeping                                /10
  Individual total                                      /40

Group performance (28% weighting)
  Question answered/covered in sufficient depth?        /10
  Group timekeeping                                     /10
  Range of information sources cited                    /10
  Overall quality/clarity of speakers and visual aids   /10
  Group total                                           /40

Peer assessment (12% weighting)                         /10

Overall score (%)
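The grid's weightings combine into the overall percentage mechanically; a small sketch of that arithmetic (the function name and sample scores are illustrative only):

```python
def seminar_overall(individual_40, group_40, peer_10):
    """Combine the grid's components (individual /40, group /40,
    peer /10) using its 60% / 28% / 12% weightings."""
    return round(60 * individual_40 / 40
                 + 28 * group_40 / 40
                 + 12 * peer_10 / 10)

# e.g. 32/40 individual, 30/40 group and 7/10 peer gives 77%
print(seminar_overall(32, 30, 7))
```

Publishing the weighting arithmetic alongside the grid helps students see exactly how their peer-assessed component affects the final mark.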

Example 3 A feedback sheet for students to self-evaluate before tutor comments

University of Plymouth
Exeter School of Arts and Design
BA (Hons) Design

Name:                      Semester:
Module code:               Title:
Assessment profile:        Credits:

Please evaluate your performance throughout the module on a scale of 0 (fail) to 5
(excellent), ringing one figure only for each criterion. You are invited to comment on
your performance below each criterion; refer to your Second Year Assessment
Procedure notes for guidance when doing this. Please leave the right-hand column
blank for comments and marking by the module tutor.

  Comprehension               0 1 2 3 4 5
  Research and Preparation    0 1 2 3 4 5
  Visualisation               0 1 2 3 4 5
  Realisation and Solution    0 1 2 3 4 5
  Presentation                0 1 2 3 4 5
  Review and Context          0 1 2 3 4 5

You are invited to make additional comments overleaf.

2.5 Avoiding plagiarism and encouraging accurate referencing

Early in the first semester students will need some support to help them learn what we
mean by 'academic practice'. They may come from backgrounds where they have had
almost no preparation for using the work of others to inform their own work. In some
cases they will come from cultures that positively encourage reproducing another
author's work.

It is not enough to tell students that they should adopt sound academic practices or to
write about this in a handbook. Practice and the use of examples is needed to ensure
that students become competent academic writers and can avoid plagiarism.

Some ideas for helping students to get it right:

 Use the activities in the 'Plagiarism: a briefing' document to introduce your students
to the meaning of plagiarism and to help them understand the penalties that are
applied when plagiarism is detected. Do this around the time when their first written
assignment is due to be handed in. Use this briefing to help you to design
assessment that is unlikely to lead to plagiarism.

 Refer students to the guidance in programme handbooks for accurate referencing.
Make accurate referencing an explicit assessment criterion and ensure that it
carries marks in assignments.

 Suggest that students check each other's work for accurate referencing.

 Give out a reference list that is incorrectly referenced (include the most common
errors); get the students to correct the list and then compare their results with
someone else's.

2.6 Accrediting work-based learning and informal learning

There has been increasing interest in the HE sector in accrediting work-based and
placement learning, as these activities become integrated into the modular structure
of degree programmes. Previously, accreditation of work-based learning was restricted
to programmes regulated by professional bodies. An increasing recognition of the value
of work-based learning to academic performance and employability, combined with
innovative ways of tracking and assessing learning in the workplace, has opened the
way to accrediting non-professional programmes. The next few years will see a growth
in flexible ways of assessing work-based and placement learning for academic credit
across the university.

References and bibliography
Biggs, J. (1999) Teaching for Quality Learning at University. Buckingham: Open
University Press/SRHE.

Brown, S. & Knight, P. (1994) Assessing Learners in Higher Education. London: Kogan
Page.

Brown, S., Rust, C. & Gibbs, G. (1994) Strategies for Diversifying Assessment in HE.
Oxford Centre for Staff Development. Oxford: Rowley Press.

Brown, S., Race, P. & Smith, B. (1996) 500 Tips on Assessment. London: Kogan Page.

Brown, S., Race, P. & Bull, J. (eds) (1999) Computer Assisted Assessment in Higher
Education. London: Kogan Page.

Bull, J. & McKenna, C. (2003) A Blueprint for Computer-Assisted Assessment.
Oxfordshire: Routledge.

Chanock, K. (2000) Comments on essays: do students understand what tutors write?
Teaching in Higher Education, 5 (1), 95-105.

Gibbs, G. (1995) Assessing Student Centred Courses. Oxford: Rowley Press.

Habeshaw, S., Gibbs, G. & Habeshaw, T. (1995) 53 Interesting Ways to Assess Your
Students. Technical and Educational Services Limited. Melksham: Cromwell Press.

Handley, K. & Cox, B. (2007) Beyond model answers: learners' perceptions of self-
assessment materials in e-learning applications. ALT-J, 15 (1), 21-36.

Hols-Elders, W., Boemendaal, P., Bos, N., Quaak, M., Sijstermans, R. & De Jong, P.
(2008) Twelve tips for computer-based assessment in medical education. Medical
Teacher, 30, 673-678.

McDonald, B. & Boud, D. (2003) The impact of self-assessment on achievement: the
effects of self-assessment training on performance in external examinations.
Assessment in Education, 10 (2), 209-220.

Moon, J. (2002) The Module and Programme Development Handbook. London: Kogan
Page.

Mutch & Brown (2001) Guide for Heads of Departments. York: LTSN.

Orsmond, P., Merry, S. & Reiling, K. (2002) The use of formative feedback when using
student derived marking criteria in peer and self-assessment. Assessment & Evaluation
in Higher Education, 27 (4), 309-323.

Price, M., O'Donovan, B. & Rust, C. (2007) Putting a social-constructivist assessment
model into practice: building the feedback loop into the assessment process through
peer-feedback. Innovations in Education and Teaching International, 44 (2), 143-152.

Quality Assurance Agency for Higher Education (2000) National Qualifications
Framework. Gloucester: QAA.

Quality Assurance Agency for Higher Education (May 2000) Code of Practice for the
Assurance of Academic Quality and Standards in Higher Education. Section 6:
Assessment of Students. Gloucester: QAA.

Race, P. (1995) The art of assessing, Parts 1 & 2. New Academic, 4 & 5. Birmingham:
SEDA.

Ramsden, P. (1992) Learning to Teach in Higher Education. London: Routledge.

Ricketts, C. & Zakrzewski, S. (2005) A risk-analysis approach to implementing web-
based assessment. Assessment and Evaluation in Higher Education, 30 (6), 603-620.

Rust, C., Price, M. & O'Donovan, B. (2003) Improving students' learning by developing
their understanding of assessment criteria and processes. Assessment and Evaluation
in Higher Education, 28 (2), 147-164.

Rust, C. (2007) Towards a scholarship of assessment. Assessment and Evaluation in
Higher Education, 32 (2), 229-237.

Yorke, M. (2003) Formative assessment in higher education: moves towards theory and
the enhancement of pedagogic practice. Higher Education, 45 (4), 477-501.

For more information see:

University of Plymouth Assessment Policy and Implementation Plan:
http://www.plymouth.ac.uk/files/extranet/docs/TLD/assessment%20policy%20updated%
20January%202008v2.pdf

Academic Regulations Sub-Committee Community:
http://staff.plymouth.ac.uk//acregsc/AR20089/intranet.htm

Higher Education Academy documents, case studies and videos, available online at:
http://www.heacademy.ac.uk/ourwork/learning/assessment

CAA Centre: http://www.caacentre.ac.uk/resources/bibliography/index.shtml

The Assessment Standards Knowledge Exchange (ASKe) CETL:
http://www.brookes.ac.uk/aske/

The Assessment for Learning (AfL) CETL: http://www.northumbria.ac.uk/cetl_afl/whatis/

The SPACE Project: http://www.plymouth.ac.uk/pages/view.asp?page=10494

Disability Rights Commission: Draft Code of Practice (post-16). Available online at:
http://www.trafforddisability.org/documents/Draft%20Code%20of%20Practice%20(Post%
2016).pdf

Appendix 1 Concorde Exercise

Complementary Study Task

1. Read the following two short essay answers and award a mark out of ten for each.

You will find it difficult because of the lack of criteria – but do it anyway!

2. Then consider what criteria you would have used if you were trying to do this
seriously and fairly. Write them in on the simple marking grid.

3. Complete the grid (often called an attachment sheet) and write some comments for
the students.

Question: Assess the noise pollution problems caused by Concorde around airports.

Answer 1
The sound limit at Kennedy airport, New York, is 112 PNdB* and at Heathrow, London, 110
PNdB. The manufacturers of Concorde (Sud-Aviation and the British Aircraft Corporation) have
promised that Concorde will range between 104 and 108 PNdB, depending on its weight at take-
off.

At the start of Concorde operations at Heathrow, 21 of the first 35 departures exceeded 110
PNdB, and in the first eight months of operations 72% of the 97 departures exceeded 110 PNdB.
Overall in 1976 there were 109 infringements of Heathrow's limit by Concorde. These
measurements of Concorde were about 7 PNdB lower than during its early endurance trials. At
the same time there were 1,941 infringements by subsonic jets. Concorde rarely features in the
list of the ten noisiest take-offs each month at Heathrow, and subsonic aircraft at Kennedy have
been recorded at 121 PNdB - twice the limit.

At Dulles Airport, Washington, Concorde has averaged 119.9 PNdB at take-off and 117.8 PNdB
on landing. This is 12 - 13 PNdB higher than the average for subsonic aircraft. The noise levels
have been going down and with them, the number of complaints. In September 1976 the
average level was 121.3 PNdB and there were 186 complaints (29 of these to one take-off). In
October the average was 117.4 PNdB and there were 101 complaints. During this time polls of
opinion concerning Concorde's trial period at Dulles showed an initial opposition of 36.9% dropping
to 26.2%. In New York, opposition to Concorde landing at Kennedy has dropped from 63% in
January 1976 to 53% in April 1977.

While 500,000 people are affected by aircraft noise in Washington, 2,000,000 are affected at
Kennedy. It has been estimated that 40,000 extra people will be affected by noise if 80
Concorde's serve 12 US cities. This represents a 1% increase. Bumps in the runway at
Kennedy force Concorde to take off closer to heavily populated areas, but due to advanced flight
control characteristics Concorde can begin to bank at an altitude of 100 feet compared with an
average of 480 feet for subsonic aircraft, and so can turn away from heavily populated areas
sooner after take-off.

*PNdB means Perceived Noise Decibels - a logarithmic scale of noise.

Mark out of ten


Question: Assess the noise pollution problems caused by Concorde around airports

Answer 2

Opposition to Concorde based on arguments concerning noise pollution takes two main themes.
The first is concerned with the 'sonic boom' - a phenomenon of supersonic flight unique to
Concorde amongst commercial aircraft. The second is concerned with noise levels around
airports caused during take-off and landing. This second theme is common to all aircraft, and
the issue at stake is whether Concorde is significantly noisier than subsonic aircraft.

Comparisons with other aircraft are complicated by the changing nature of jet fleets. Early jet
aircraft (e.g. the DC8 and 707) used turbojet engines, and whilst these have been quietened,
they are much noisier than second-generation fan-jet engined aircraft (e.g. DC10 and jumbo
747). Eventually these older aircraft will be phased out, but at the moment Concorde is being
compared with them.

There are also problems of measurement. Objective measures (meters giving a reading in
decibels) cannot give any impression of 'shrillness' or subjectively experienced nuisance. An
aircraft giving higher decibel readings may not be experienced as 'noisier' by someone hearing it
take off. Subjective measures also involve problems, as 'noise' is such a multi-faceted
phenomenon, and different people use different criteria in assessing it. There are dangers, also
in questionnaire surveys of reactions of people living around airports. Average ratings of
'nuisance' change over time without any changes in objectively measured decibel levels or
frequency of aircraft movements and so other factors must be involved. These factors can be
political. Boeing took care to sub-contract for parts for its SST at factories surrounding Kennedy
airport, so that votes concerning whether SSTs should be allowed to use the airport would be
influenced by residents concerns for their jobs! Workers at Filton and Toulouse would hardly try
to ban Concorde landing near their homes, however noisy it is!

Finally, there is a variation in recorded noise level dependent on the skill of the pilot, and load
factors of the aircraft. Subsonic aircraft have been measured at twice the legal noise level,
struggling to take off with heavy loads in adverse conditions. Concorde has been flying
under-loaded, with skilled pilots, who have even been reported banking away from noise monitors.

Given this variety of problems it would seem likely that Concorde causes even more noise
pollution than data suggests and, in comparison with subsonic jets, will become comparatively
worse as time goes on.

Mark out of ten

Answer 1

Criteria                                              Good    OK    Poor    Missing

E.g.: establishes a set of pollution problems

Feedback

Mark out of 10

Answer 2

Criteria                                              Good    OK    Poor    Missing

E.g.: establishes a set of pollution problems

Feedback

Mark out of 10

Concorde essay: Providing Feedback

Below is an example of feedback from a tutor on the first 'Concorde' essay that you have just
marked. Reflect on the comments and on the order in which they are written. Is this feedback
helpful? How might it be rewritten?

Jane
This was an interesting attempt to answer the question but it has several shortcomings:
1. Your introduction lacks clarity and structure and you have failed to set out the intended
argument properly.
2. The main body of the essay does not address the question and some of the information
seems to be irrelevant.
3. You obviously found it hard to arrange the points you wanted to make in a methodical
manner and your argument lacks coherence.
4. The style of writing is rather rambling and there are a few careless mistakes.
5. You have failed to refer to any of your sources of information.
6. The conclusion was not the place to introduce new material but rather to review your
argument and present a final viewpoint.
Having said this, I am sure that with practice and reflection your essay writing will improve. You
might like to visit a learning support tutor for some extra help before you start on your next essay
or come and talk to me about the essay if you like.

I would like you to rewrite each of these comments, considering how you could best enhance
Jane's understanding of her problems without being quite so blunt.

Jane
Introductory comment

1.

2.

3.

4.

5.

6.

Closing comment
