
LTSN Generic Centre

Assessment
Series No 14

Computer-assisted
Assessment (CAA)
Joanna Bull and
Myles Danson

Learning and Teaching Support Network
Joanna Bull is Director of Eduology. Eduology provides research, consultancy, advice
and creative solutions for businesses, training organisations, educational institutions and the
public sector. It advises clients on ways to improve the effectiveness of learning and training
within their organisation. Previously, Joanna was Head of Research in Teaching and Learning
at the University of Luton and Project Manager for the Computer-assisted Assessment (CAA)
Centre, a Teaching and Learning Technology Programme (TLTP) project. The CAA Centre
(www.caacentre.ac.uk) was a national centre for expertise, advice and guidance on CAA in
higher education. Joanna has extensive experience of project management and evaluation, has published widely, and has designed and delivered numerous workshops, seminars and conference presentations nationally and internationally. Her most recent publications include articles and books on the assessment of student learning, learning technology and strategies for implementing computer-assisted assessment.

Myles Danson is the CAA Manager at Loughborough University and has overall
responsibility for the centrally supported CAA systems. He is also the organiser of the
International CAA Conference (www.caaconference.com/), now in its seventh year. Over the
last five years Myles has developed and managed various CAA systems including optical data capture, networked PC delivery and, most recently, the development and roll-out of a front-line web-based CAA system. The latter was funded by the Joint Information Systems
Committee (JISC) and contributed to their Managed Learning Environment projects. The
CAA systems at Loughborough have delivered and marked over 200,000 assessments to
date. Myles engages in project management and income generation including the CAA
Centre project, FDTL and LTSN work and, most recently, the JISC-funded Technologies for
Interoperable Assessment project (www.toia.ac.uk). He also has substantial experience of
national and international publications, workshops, guest speaking, policy formation and
strategic planning.
Contents

Introduction

Overview of CAA
What is computer-assisted assessment?
What is possible with CAA?

Advantages and challenges

Getting started
Planning
Piloting
Institutional support
Further development

Effective practice
Case studies

Using ‘virtual learning environments’ for assessment

Standards

Question banks
What are ‘question banks’?
Why are they useful?
Question statistics
Examples in practice

Evaluating practice
Why is evaluation important?
What aspects of CAA should be evaluated?
What approaches to evaluation should be taken?

Conclusion

References

Introduction

The purpose of this briefing is to explore the role which computers can play in the assessment process. The briefing introduces the range of activities which comprise computer-assisted assessment (CAA) and discusses the advantages and challenges of introducing computers into the assessment process. Ideas and suggestions on how to get started are provided and good practice is highlighted through mini case studies of effective practice. Key issues are also raised, such as standards, the role of question banks and the importance of evaluation. The briefing does not discuss the general pedagogical issues associated with assessment – these are covered in other briefings and guides. Neither does it discuss the technical solutions to computer-assisted assessment nor provide a detailed methodology for implementation. However, colleagues wishing to follow up on the pedagogical and implementation issues of CAA may find the Blueprint for Computer-assisted Assessment of use (Bull and McKenna, 2001).

This briefing is aimed at staff and educational developers, and other staff who teach and support student learning. It may also be useful to anyone who wishes to gain an overview of the potential of computers in the assessment process and the associated key issues. The briefing provides a useful starting point for those considering CAA and could be used as a focus for discussion with colleagues or as the basis of an introductory staff development session. The briefing complements the existing assessment series, in particular the Guide for Lecturers, which provides an overall context within which CAA is one method of assessment that may be used.

Overview of computer-assisted assessment

What is computer-assisted assessment?

Computer-assisted assessment (CAA) is a broad term which describes the application of computer technologies to the assessment process. This may include a variety of activities which assess knowledge, understanding and skills using one or more technologies such as the Internet, intranets, CD-ROM and optical data capture systems. CAA may be used for both formative and summative assessment in the following ways:

• to deliver, mark and analyse assignments or examinations (computer or web-based),
• to collate and analyse data gathered from optical data capture systems (for example optical mark readers),
• to record, analyse and report on achievement, for example through the construction of online portfolios,
• to collate, analyse and transfer assessment information through networks.

There is – in some ways unfortunately – a plethora of terminology which surrounds the use of information and communications technology (ICT) in higher education. The terminology changes and adapts at the rate of technological development and can cause confusion and distrust among academics and students. Other terms used in the literature and practice to describe types of CAA include: web-based assessment, computer-based assessment, online assessment, and computer-aided assessment. While the terminology – and individual interpretations of it – may vary, the underlying strategies and practices for making effective use of computers for student assessment remain constant.

What is possible with CAA?

CAA is most commonly associated with multiple-choice questions. There is a certain amount of scepticism about the ability of multiple-choice questions to test anything more than basic knowledge. Biggs (1999) states that ‘MCs assess declarative knowledge, usually in terms of the least demanding process, recognition’. The common misconception goes like this: ‘CAA is only suitable for delivering objective assessments, that is those which require the recognition of a single correct answer. Objective assessments are comprised of multiple choice questions and these are inadequate for testing more than basic knowledge.’

It is true that there are examples of CAA consisting solely of multiple-choice questions and that some of these are not well designed and do not test more than basic knowledge. This is often a result of a lack of understanding about the pedagogy of designing questions and tests – a complex and skilled process. However, there are also examples of CAA which draw on an extensive range of sophisticated question types by utilising computers to create questions which would not be possible using the medium of paper. They incorporate sound, multimedia and images, and require students to know, comprehend and apply their knowledge. The question types go way beyond the objective format of ‘select one from four’ and require intelligent consideration rather than guessing. Such examples of CAA allow skills and abilities to be tested which could not be assessed using more traditional methods.

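To make this concrete, the sketch below shows how two such question types might be marked automatically: a numeric-entry question scored within a tolerance (so students must calculate rather than recognise), and an ordering task with partial credit. It is written in Python with invented names and structures – an illustration of the principle, not the behaviour of any particular CAA package.

```python
def mark_numeric(answer, target, tolerance):
    """Numeric entry: full marks if the response falls within a tolerance
    of the model answer, so students compute rather than recognise."""
    return 1.0 if abs(answer - target) <= tolerance else 0.0

def mark_ordering(sequence, correct_order):
    """Ordering task: partial credit for each element placed correctly."""
    matches = sum(1 for got, want in zip(sequence, correct_order) if got == want)
    return matches / len(correct_order)

# A calculation question with a 2% tolerance on the model answer 9.81...
print(mark_numeric(9.79, 9.81, tolerance=0.02 * 9.81))   # 1.0
# ...and ordering the stages of an experiment (2 of 4 in place).
print(mark_ordering(["hypothesise", "measure", "model", "conclude"],
                    ["hypothesise", "model", "measure", "conclude"]))  # 0.5
```
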
In addition to the ‘objective formats’, there is a range of related activities which seek to utilise electronic environments to support innovative assessment methods. Examples include gathering assessment information from online discussion forums and using the Internet and virtual learning environments to facilitate peer assessment. Others are developing blended approaches – part CAA, part paper-based assessments – to make the most of both of these approaches.

CAA offers the opportunity to creatively extend the range and type of assessment methods used. It is not a panacea for rising student numbers and marking overload, though if used appropriately, it can clearly relieve some of these pressures. Used formatively to provide detailed and timely feedback, it can support and enhance student learning in ways which are not possible with paper-based assessments. Diagnostically, it can provide an instant indication of skill and knowledge gaps. Summatively, it can dramatically reduce marking loads for large classes and offer speedy processing of results.

Advantages and challenges

Assessment is a high-stakes activity for both the learner and the educating institution. Computerising this process to any extent highlights not only those issues associated with traditional assessment, but also poses new issues. Promotion of a new form of assessment usually invokes criticisms rarely considered in the traditional assessment process. When considering CAA, it is useful to think about how the traditional process deals with issues such as candidate authentication, security and plagiarism, and to attempt to draw parallel conclusions where appropriate. There is a major issue of acceptance of new technologies within the curriculum, and it is constructive to recognise that standard assessment techniques, frequently developed with little strategic planning, are often far from perfect. Mapping the assessment process can be a relatively complex task. However, a full evaluation of this often reveals inadequacies and opportunities for improvement which CAA can address. (See Danson et al., 2001, for an example of an assessment mapping process.)

At first glance, the key advantage of CAA is usually perceived as a saving in resources and time. Automated marking is highly desirable from the point of view of the educator and – if students are properly informed about the capabilities and benefits of the CAA – the learner. Quick and often instant marking and feedback are clear benefits, leaving educators free from often laborious, repetitive manual scoring of assessments, while the learner no longer has to wait for this process to be undertaken. If designed effectively, CAA can provide highly effective instant feedback. Students may take tests at a time and frequency to suit themselves and, in the case of open access web-based assessment, the technology often allows tests to be taken at a location to suit the student.

In the case of CAA, potential benefits are available to all parties. The lecturer is free from manual marking: the student gets an instant and objective score with specific and timely feedback. In addition, the detailed scoring data are already digitised, and the possibility of automated score upload to central repositories such as a student records system offers administrative benefits. This ‘win-win’ scenario provides great potential for formative, summative and diagnostic assessment. However, it must be recognised that CAA ‘front loads’ the assessment process – the majority of time and effort is invested in the design of questions and tests prior to the assessment taking place. With traditional assessments, greater time and effort is invested in marking once the assessment is completed. Creating pedagogically sound CAA is a highly skilled activity and should not be dismissed lightly. The testing medium can be far richer than paper-based assessment.

Examples of screen-delivered CAA include colour, animations, sound and even video content, and it is likely that the learner will interact in different ways (McGuire et al., 2002). Adaptive testing is also possible, whereby the CAA system serves question content based on the responses and performance of an individual student. An assessment may be repeated several times, each time presenting different variations of the same question content. This is an excellent way of encouraging learning through the application of skills and is particularly effective in areas where this is necessary, such as mathematics. Recent CAA systems intelligently comprehend question and answer content, which has been shown to be useful in algebraic testing (Ashton and Beevers, 2002) and in the assessment of free text (Kukich, 2000; Leacock and Chodorow, 2000).

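The logic behind the adaptive, repeatable testing just described can be sketched in a few lines. The following Python fragment is entirely illustrative – hypothetical names rather than any real system's API – selecting the next question template according to recent performance and instantiating it with fresh parameters on every attempt:

```python
import random

# A bank of question templates, each tagged with a difficulty from 1 (easy)
# to 5 (hard). Parameters are randomised so a repeated attempt sees a
# different variant of the same underlying question.
TEMPLATES = [
    {"difficulty": 1, "text": "What is {a} + {b}?", "answer": lambda a, b: a + b},
    {"difficulty": 3, "text": "What is {a} * {b}?", "answer": lambda a, b: a * b},
    {"difficulty": 5, "text": "What is {a} squared minus {b}?", "answer": lambda a, b: a * a - b},
]

def make_variant(template):
    """Instantiate a template with fresh random parameters."""
    a, b = random.randint(2, 12), random.randint(2, 12)
    return template["text"].format(a=a, b=b), template["answer"](a, b)

def next_template(recent_results):
    """Crude adaptive rule: step difficulty up after a correct answer,
    down after an incorrect one, and pick a template near that level."""
    level = 3  # start in the middle
    for correct in recent_results:
        level = min(level + 1, 5) if correct else max(level - 1, 1)
    candidates = [t for t in TEMPLATES if abs(t["difficulty"] - level) <= 1]
    return random.choice(candidates)

# A student who got the last two questions right is served a harder
# variant; the same call tomorrow produces different numbers.
question, answer = make_variant(next_template([True, True]))
print(question, "->", answer)
```
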
Reporting features frequently include analysis of question and test performance and offer the potential for quality control often impossible using traditional methods. Detailed analysis of questions and student performance – as individuals and as groups – can help to identify gaps in skills and knowledge early in a course or module. Students at risk can quickly be identified and offered additional support. Over a period of time this type of data can help to inform changes in the curriculum and – possibly – methods of learning and teaching.

Perhaps one of the most underestimated challenges for those using CAA is the gaining of cultural acceptance. We mentioned in the introduction some of the perceptions held about CAA as a limited form of assessment. These, plus some reluctance to accept the role of technology in education, can be potent issues to be faced by those implementing CAA. Despite the potential time savings, the time and cost investment may initially seem high. Time must be spent developing pedagogically sound questions, and hardware and software must usually be purchased and installed. If summative assessment is to be undertaken, a very secure and robust system with back-up contingencies is needed to ensure success.

Student technical skills are less of a problem in the new millennium. However, effective support must be put in place to ensure that no student is disadvantaged by the testing medium. This may mean including reference to CAA in appropriate documentation, handbooks and module or course descriptors. Students involved in summative CAA will need the opportunity to practise with the assessment type and the software. The logistical and operational issues of delivering CAA must be planned for carefully, particularly where summative assessment is concerned. Traditional models of examination delivery may have to be adjusted – for example, running two sessions of the same examination back-to-back to allow for student numbers which exceed the number of computers available.

In addition, policies and procedures may have to be developed to ensure smooth running of CAA within the context of existing assessment regulations.

Accessibility also needs to be considered in the design of questions and construction of online assessments. Appropriate measures should be put in place to ensure that assessments are accessible to all students. Existing policies and support mechanisms should be reviewed in light of the introduction of CAA for summative assessment.

A key challenge is providing appropriate pedagogical and technical support to academic and other staff involved in developing CAA. Both academic and support staff are likely to need staff development and training on both the pedagogical and operational aspects of CAA. There is also an ongoing need for support to ensure that academic staff are free to concentrate on developing pedagogically sound assessments rather than attempting to install and configure software on their institutional networks. To meet this challenge, clear roles and responsibilities should be discussed and decided early in the process of considering CAA. It is important to acknowledge that CAA requires skills and experience from across an institution, often involving a range of staff from academic, IT, library and administrative departments.

Getting started

Planning

A good starting point is to conduct a needs analysis. Consider how assessment is undertaken currently and where a CAA system might be most beneficial. It is important to clearly identify the purpose of the assessment. Is it to rapidly evaluate student performance during a course, to provide end of course/module assessment, or to give timely formative feedback to students on their learning? It may be that there are several purposes for introducing CAA. It is wise, however, to start small, usually with small-scale formative or self-assessment tests. This provides the opportunity to learn from experience prior to moving to high-stakes summative assessment. It is important to remember the strength of CAA in providing formative assessment and feedback – instantly, consistently and specifically – in order to support student learning.

Once the purpose is defined, the next step is to identify stakeholders and set up appropriate consultative processes. This may be a relatively small group for formative or self-assessment which is supplementing existing assessment methods, or a larger faculty or institution-wide group for summative assessment. It is particularly important to involve appropriate IT staff in the consultative process, given that the use of CAA may well impact on other systems and, depending on the model of support, can represent a shift in workload from academic to support staff. It is useful to recruit other enthusiasts at this stage to help drive the process and provide support and motivation.

Review and evaluate the various solutions available and consider the relative advantages of commercial software or an in-house solution. The choice and range of CAA software is increasing, so it is important to determine the desirable features to meet your needs, but be aware that pleasing all the stakeholders all of the time is a holy grail and unlikely to be achieved! For example, the TRIADS (TRIpartite Assessment Delivery System) CAA system offers complex pedagogical features but requires a high level of skill in writing questions and software use. The CASTLE (Computer ASsisted Teaching & LEarning) system, by comparison, is far simpler to use but has a restricted number of question types. It may be beneficial to adopt more than one system to meet several needs. This, however, raises support and implementation issues which must be considered in terms of practicality and achievability.

In particular, the following are important to consider when evaluating CAA software:

• Interoperability (IMS standards – see the section on Standards below)
• Existing systems and integration
• Scalability
• Performance levels
• Upgrading procedures and limitations
• Support and maintenance
• Security – particularly where summative assessment is intended
• Accessibility.

It may also be useful to visit other institutions with the aim of discovering both successes and failures. Be aware that there is a general reluctance to admit failures! The type and level of support required by staff and students should also be considered early in the process. Staff development should be taken seriously and can help to avoid pitfalls later in implementation. In particular, pedagogical staff development should be undertaken to ensure that assessments delivered in CAA do not fall into the category of ‘only good for testing low-level recall’.

It is useful to review existing questions – from textbooks, the Internet and other sources – as this can help with the formulation of ideas about how CAA can be used effectively. It may also be possible to find existing questions and tests which can be purchased, used freely, or shared and adapted for re-use. Care should, however, be taken to ensure questions are of a high standard and are pedagogically and culturally appropriate.

Piloting

It is important to test CAA systems and assessments, particularly where summative assessment is to be implemented. Formative open access assessments can prove a useful way of gathering information about software and question performance and student experience. The level of piloting will depend on how many students are likely to use the CAA and on what basis. A series of pilots may be valuable for high-stakes summative assessments. Piloting also provides an opportunity to test operational procedures and to gain feedback from students. The original purpose of the CAA should be tested by piloting to ensure that the system meets the needs of both students and staff.

Institutional support

It is useful (though not always essential) to gain institutional or faculty support for CAA. Because CAA often involves a range of staff, both academic and support, the aim should be to foster a culture of ownership and responsibility. Since traditional roles and job descriptions may not easily fit within the context of CAA development and support, appropriate training and support should be established. It may be beneficial to recruit a senior manager as a ‘champion’ to help drive the process forward and to secure resources where necessary. There may also be existing support departments (such as staff development, teaching and learning, and learning technology groups) that can help to support and guide the implementation in conjunction with academic staff. In some institutions, dedicated CAA support staff are appointed with this specific remit.

Further development

As CAA develops and more staff and students become involved, there may be a move from formative assessment towards summative assessment. The involvement of larger numbers of students in CAA and the high-stakes nature of summative assessment lead to further considerations, such as:

• Existing examination regulations and assessment policies and their suitability for CAA
• Special educational needs
• Back-up procedures and contingency planning
• Usability and ergonomics
• Bandwidth and stability of connections for off-campus access.

The way in which CAA evolves will vary, and can be subject-specific. It is important to acknowledge the time and effort involved in creating well-designed, valid and reliable CAA, and to be clear about the benefits and challenges of implementing CAA.

Effective practice

Assessment is a high-profile activity and critical to student learning. It drives and motivates students to learn, and changing assessment will result in a shift in student learning. For that reason it is important to embed CAA within existing methods and strategies of assessment. It should be clear to students that the CAA they are undertaking has a distinct and valuable purpose. CAA which is ‘bolted-on’ to a course or module, giving the impression that the lecturer is just trying out some new and half-formed idea, will not be effective in motivating student learning. CAA should offer benefits and complement existing assessments, rather than duplicating other assessments or only dealing with trivial skills and knowledge.

Three mini case studies are presented below to provide examples of how CAA might be embedded into a course or module.

Case Study 1

‘Fundamentals of Human Physiology’ is a first year core module for undergraduate Biology. The module prepares students for a range of modules in their second year, providing them with essential skills and knowledge which they will need throughout their degree. Currently assessments include three coursework assignments and an end-of-module examination. The course attracts large numbers of students, and lecturers find that some students, although passing the coursework, fail the examination. The coursework assignments represent a significant marking load, and sometimes students do not receive any feedback on their learning until halfway through the module. Following a review of the module assessments, it was decided to replace one of the early coursework assignments with a series of five short CAA objective tests. Each test related to teaching material covering a two-week period, and students received feedback after the test on each of their questions. Students had to book themselves into a computer lab at specific times to take tests, which were invigilated by support staff. Lecturers received the results of the tests the following day and sometimes noticed that significant numbers of students had failed questions on certain topics, which they then addressed in the following week’s seminar. Through the feedback, students gained a clear idea of how they were progressing with the course and were motivated to follow up some of the feedback suggestions regarding further reading and research.

Case Study 2

‘Understanding Poetry’ is a twelve-week first year undergraduate elective module which teaches students about critical terms and theories, the process of developing a poem, the historical development of poetry, and some aspects of critical comparison. The module attracts quite large numbers. For some, it is a pre-requisite for further study, while for others it is an elective and they may be unlikely to study literature any further. The size and mixed literary experience of the cohort mean that the number of free-text assessments (essays, critiques) is limited by what can reasonably be marked. The lecturer also cannot assume that all the students have a good knowledge of those critical concepts which are important as a basis for developing understanding and critical analysis skills. In order to effectively assess the module, a range of objective and subjective methods of assessment are used, including the following:

• CAA objective test – to test knowledge of poetic terms, application of terms and concepts, and understanding of the historical development of the genre.

• Critical comparison of two poems – to assess ability to critically analyse, application of terms and concepts, and knowledge of terms.

• Essay – to assess all aspects of the module, but primarily understanding of the historical development of the genre.

• Computer Aided Learning (CAL) manuscript assignment – to assess students’ understanding of ‘poetic process’ from manuscript to publication.

In this case, the CAA component of the assessment strategy is both formative, with self-assessment questions available to students on demand, and summative, with an end-of-module exam. It ensures that students are tested on a wide range of poetic terms and their application. Students who are weaker can practise with self-assessment. This motivates them and has a positive effect on their other assessments. The lecturer also finds that she receives fewer basic and repetitive questions about poetic terms and concepts during teaching sessions. The summative assessment identifies that students have knowledge and understanding of a core set of critical terms and concepts.

(This case study is adapted from Bull and McKenna, 2001.)

Case Study 3

‘Primary Teacher Education’ is a single honours degree course which attracts students from a variety of backgrounds – in particular students who have worked for a number of years and those who have few formal qualifications. In the first year the course includes modules on

• ICT knowledge and understanding
• Introduction to Mathematics in the Primary Curriculum
• Introduction to Science in the Primary Curriculum.

These are key modules and, in the past, it has been found that student skills and abilities in these subject areas vary dramatically. Although additional support sessions are available in Mathematics and Science, students often do not take these up or do so late in the course. The course is intensive and early on includes work in local primary schools. The lecturers feel that students need to reach a certain level of competency in these core areas prior to entering a teaching environment. CAA has been introduced to provide a diagnostic test which all students take during their first week. The test assesses their current knowledge and understanding of ICT, Mathematics and Science. The questions are directly related to the learning outcomes of these modules, and the results give lecturers a clear indication of where there are gaps in students’ skills and knowledge. Students who do not achieve a specified grade in any one of the three sections of the test are referred to an academic counsellor who advises them about supplementary support sessions in the relevant subjects. Supplementary support sessions also use self-assessment CAA which is rich in feedback and directs student learning through hints, tips and encouragement. Students who join the supplementary sessions can retake the diagnostic test whenever they wish to check their progress.

Using ‘virtual learning environments’ for assessment

Britain and Liber (2000) state that virtual learning environments (VLEs) are ‘systems which synthesise the functionality of computer-mediated communications software (email, bulletin boards etc.) and online methods of delivering course materials’. Examples of VLEs in use in the higher education sector include ‘Blackboard’, ‘WebCT’, ‘Learnwise’, ‘Lotus Learning Space’ and ‘Top Class’, among others. Surveys show that in the last two years many institutions have adopted a VLE to help support and enhance student learning.

VLEs can help support and deliver assessment. Details of assignment and other assessment requirements, examination timetables, regulations, and sources of help and advice can all be made available for students to access through a VLE via the Internet or an intranet. Email and bulletin boards can be used to notify or remind students of deadlines.

VLEs also offer the opportunity to distribute global and individual feedback to a group of students about a particular assessment. This could be achieved by uploading a file containing global feedback, which might usefully be linked to a discussion session in which students query and discuss the feedback and assignment. Feedback on assessed work can be emailed to individual students, inviting them to raise questions (by email, phone or in person) as appropriate. The online environment may well encourage some otherwise reticent students to voice their thoughts and concerns.

Some VLEs include the facility to develop basic objective test questions. These usually comprise three or four question types (multiple choice, multiple response, text match). The functionality of the question features is somewhat limited. However, they do offer a valuable opportunity to rapidly provide self-assessment questions which can be embedded within learning materials, allowing students to check their progress as they are working in the VLE. They are a good way of introducing CAA into the curriculum and usually provide a simple interface for question construction.

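Marking for these basic question types reduces to simple comparisons, which is why VLEs can score and return them instantly. A minimal sketch in Python follows; the function names and marking rules (for example, the penalty scheme for multiple-response items) are illustrative assumptions, not any particular VLE's behaviour:

```python
def mark_multiple_choice(selected, correct):
    """One option may be selected; full marks or none."""
    return 1.0 if selected == correct else 0.0

def mark_multiple_response(selected, correct):
    """Several options may be selected (as sets); here, credit is
    proportional to correct selections, with a penalty for wrong ones
    (floored at zero)."""
    hits = len(selected & correct)
    false_alarms = len(selected - correct)
    return max(0.0, (hits - false_alarms) / len(correct))

def mark_text_match(answer, accepted):
    """Free-text entry compared against accepted forms, ignoring case
    and surrounding whitespace."""
    return 1.0 if answer.strip().lower() in accepted else 0.0

print(mark_multiple_choice("B", "B"))                      # 1.0
print(mark_multiple_response({"A", "C"}, {"A", "C", "D"})) # ~0.67
print(mark_text_match(" Enjambment ", {"enjambment"}))     # 1.0
```
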
Beyond the use of VLEs to provide information, feedback and basic objective test questions, there are interesting developments that indicate a movement towards the wider use of computers in the assessment process.

Peer assessment is one area where VLEs can be particularly powerful in allowing students to engage in critical review, evaluation and analysis. For example, a cohort of Economics students is asked to research and write a report exploring the effects of expanding the European Union on the Common Agricultural Policy. Once completed, students upload their reports to a secure folder on the VLE. The tutor anonymises each report and assigns it a number. Each student is then emailed with the number of the report they are to assess according to defined marking criteria and a series of questions designed to give feedback to the author. Once students have completed the peer assessment, they post their marking schedules back to the tutor, who may wish to moderate them prior to giving them to the students. This activity could be followed by an online discussion session, where students comment on the marks and feedback that they received, their experience of the process of marking peers’ work, and how they would approach the report differently in future. Students could also be asked to complete a short objective test on how they approached both writing the report and marking it. Marks could be awarded to students for the report, their peer assessment, and engagement in the discussion or objective questions. Robinson (1999) describes a system for enabling anonymous online peer review and the benefits this offers.

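The allocation step in an exercise like the one above – every student marks exactly one anonymised report, and never their own – is straightforward to automate. A sketch under those assumptions (Python; hypothetical names, not a built-in VLE feature):

```python
import random

def allocate_peer_reviews(students):
    """Assign each anonymised report a number and give every student a
    report to mark that is not their own: a cyclic rotation of a shuffled
    class list guarantees no self-marking."""
    order = students[:]
    random.shuffle(order)
    # The tutor's anonymisation step: author -> anonymous report number.
    report_number = {author: f"report-{i:03d}" for i, author in enumerate(order)}
    # Student order[k] marks the report by order[k + 1] (wrapping around).
    assignments = {order[k]: report_number[order[(k + 1) % len(order)]]
                   for k in range(len(order))}
    return report_number, assignments

numbers, marking = allocate_peer_reviews(["ann", "bob", "cat", "dan"])
print(marking)  # e.g. {'cat': 'report-002', 'dan': 'report-000', ...}
```
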
There is also interest in assessing the contributions which students make to online discussions. Approaches vary from awarding a small amount of credit for ‘engaging’ at any level, to attempts to measure the worth of individuals’ contributions. Indeed, discussions are often richer learning experiences if they are assessed, as students feel obliged to contribute and to interact. Beware, however, of over-assessing.

To date, no clear models for how this is best approached have emerged, although the design of tools which graphically display levels of participation against defined criteria should help to resolve some of the issues associated with computer-mediated communication and assessment (Kuminek and Pilkington, 2001).

VLEs offer many opportunities to engage students in a range of formal and informal assessment activities, and it is perhaps their ability to integrate learning and assessment which is particularly powerful. Although bespoke assessment software offers greater flexibility for objective-style assessments, VLEs have much potential for enhancing the assessment process, particularly for self- and peer assessment.

Standards

Standards encompass content, procedural and, in the case of CAA, technical areas. All are equally important in any assessment process. Question and test content should be academically effective, and processes to manage this could include peer review of assessment questions and tests. This is most often done ‘in-house’, whereby colleagues are asked to proof questions and tests. Increasingly, as technology advances and uptake is more widespread, content is being made available through third parties in the form of peer-reviewed question banks. (Learning and Teaching Support Network subject centres are beginning to produce and offer these, as are some commercial content providers.) As an institutional question bank expands there is potential to share material across traditionally disparate departments where teaching content is shared. If successful, this can be a huge benefit both to new teaching staff such as academic probationers and to established academics who are keen to adapt and ‘re-purpose’ questions.

New questions and tests should be introduced gradually and their performance monitored. The majority of CAA systems offer statistical item analyses, and the use of these can reveal weaknesses in questions and errors which may have been neglected in past paper-based versions. Consideration of the item analysis will help to improve the reliability and validity of tests, ensuring they are an effective measure of student performance. Further information on item analysis is provided in the section below on ‘Question banks’.

Standard procedures become increasingly important as larger numbers of students and staff become involved in CAA. It is possible to bend the rules to accommodate individual needs when dealing with a handful of academics wishing to use CAA, but this can lead to problems where significant numbers of staff and students are involved. Adopting standard approaches and procedures allows resources to be monitored and adjusted to cope with demand. In a traditional educational setting, it should be recognised that peak times occur throughout the academic year and these should be borne in mind. Diagnostic testing at the start of the academic year is often the first peak. The following weeks see an ebb in demand as lecture courses are delivered and content is disseminated to learners. Once enough content has been delivered, the demand for testing becomes greater and peaks again at examination times.

The technical standards are unique to CAA as a form of assessment. The language and wealth of acronyms surrounding standards initiatives can be confusing. However, the key issue is that standards will help different systems to exchange data. This means that, if deployed successfully, it should be possible to link student record systems, virtual learning environments and CAA systems, resulting in a single point of entry to such systems: one user name and password is all that is ever needed. For CAA, this could mean that module registrations can be used to schedule assessments, that the system could automatically inform students of which tests they have to take, and that, once completed, the score is uploaded and ratified.

The IMS Global Learning Consortium is the most advanced of the standards bodies, and was originally formed in 1997 as an Educom project. Its membership is extensive and a wide range of standards are currently being worked on. One of these standards is the Question and Test Interoperability (QTI) specification. UK higher education is represented by the Centre for Educational Technology Interoperability Standards on international learning technology initiatives such as IMS.

The QTI specification defines question types but separates content from presentation. This allows questions which meet the standard to be imported and exported between IMS-compliant systems. This helps to avoid situations where the design of the software means that it is difficult to extract data held in one system, for example a bank of questions, and input it into another, for example a better assessment software package. Another advantage is that such a standard will allow questions to be shared across departments and institutions using compliant software.

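The fragment below gives a flavour of what separating content from presentation means in practice. It builds a simplified, QTI-like XML record for a single question in Python; the element names are illustrative rather than the exact QTI schema, but the point stands: the stem, options and scoring travel as data, with nothing said about fonts, screens or delivery platform.

```python
import xml.etree.ElementTree as ET

def export_item(qid, stem, options, correct):
    """Serialise one multiple-choice item to a simplified, QTI-flavoured
    XML structure: content (stem, options, scoring) is stored with no
    assumptions about layout or delivery platform."""
    item = ET.Element("item", ident=qid)
    ET.SubElement(item, "stem").text = stem
    choices = ET.SubElement(item, "choices")
    for ident, text in options.items():
        ET.SubElement(choices, "choice", ident=ident).text = text
    # Scoring is declared as data, so any compliant system can mark it.
    ET.SubElement(item, "scoring", correct=correct, score="1")
    return ET.tostring(item, encoding="unicode")

xml = export_item(
    "poetry-q17",
    "Which of these lines is an example of iambic pentameter?",
    {"A": "Option text A", "B": "Option text B", "C": "Option text C"},
    correct="B",
)
print(xml)  # another package would parse this back with ET.fromstring(xml)
```
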
Web-based systems rely on web browser applications, which means that security issues, such as invoking a secure connection, should be addressed. Web browsers do pose some additional difficulties. We must guarantee that the assessment looks and behaves as expected across the different browsers and platforms in use, for although the web is ‘cross-platform’, uniformity is not always simple. It may be necessary to define a specific configuration to ensure standard presentation. However, this can be problematic as settings at the client machine are hard to control. There are also differences in Macintosh and PC versions of the same product, such as Internet Explorer. Reliance on Java and ‘plug-ins’ is common in web-delivered CAA, and these must also be enabled on the student machine.

Question banks

What are ‘question banks’?

Question banks are collections of questions, each uniquely identified and stored to allow the automated creation of tests to meet predefined criteria. Each question has associated descriptors which may define a number of features of the question, such as academic level, topic, difficulty, and the skill or knowledge addressed by the question. Questions can be contributed to (and withdrawn from) the bank by authors and users. Question banks are most commonly, but not exclusively, used with objective questions. They can be used to store questions delivered by computer and paper methods.
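At its simplest, a question bank is a set of records carrying descriptors, plus a query that assembles tests to meet predefined criteria. The sketch below (Python; the field names and thresholds are invented for illustration) shows the idea:

```python
import random
from dataclasses import dataclass

@dataclass
class Question:
    qid: str           # unique identifier
    topic: str         # descriptor: subject area
    level: int         # descriptor: academic level
    difficulty: float  # descriptor: proportion answering correctly
    text: str

BANK = [
    Question("q001", "physiology", 1, 0.85, "Which chamber of the heart ..."),
    Question("q002", "physiology", 1, 0.55, "Which hormone regulates ..."),
    Question("q003", "biochemistry", 2, 0.40, "Which enzyme catalyses ..."),
]

def build_test(bank, topic, level, n, difficulty_range=(0.3, 0.9)):
    """Draw n questions matching the descriptors: the 'automated creation
    of tests to meet predefined criteria' described above."""
    lo, hi = difficulty_range
    pool = [q for q in bank
            if q.topic == topic and q.level == level and lo <= q.difficulty <= hi]
    if len(pool) < n:
        raise ValueError("bank does not hold enough matching questions")
    return random.sample(pool, n)

test = build_test(BANK, topic="physiology", level=1, n=2)
print([q.qid for q in test])
```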

Why are they useful?

Question banks contribute to the validity and reliability of the assessment process by establishing a common language for discussing curriculum goals and learning outcomes. The questions relate directly to individual tasks (skills and knowledge-specific) which students are capable or incapable of demonstrating. Questions are graded according to difficulty on a scale within the bank, and it is possible to identify the relative difficulty of particular tasks. This provides a way to discuss possible learning hierarchies and ways to better structure the curriculum. Statistics collected each time a question is used help to assure the quality of the questions and assessments.

Question banks can also offer substantial savings of time and energy over conventional paper or standard computerised objective test development. Traditionally, assessments are developed independently from previous assessment. With question banks, questions – identified by specific descriptors – are drawn from a bank according to desired criteria and may be re-used where appropriate.

Question statistics

Statistical measures are used to determine the characteristics of questions and tests and indicate the worth of each question for inclusion in a bank. The question is the unit of analysis rather than the whole assessment, and each question is evaluated independently to generate item statistics. For a detailed description and discussion of the statistics of question banking see Hambleton, Swaminathan and Rogers (1991) and McAlpine (2002).

The statistics generated offer detailed information about:

• the questions themselves,
• individual student performance,
• group performance,
• specific strengths and weaknesses, and
• curriculum design.

For example, in terms of the quality of a question, statistics can identify incorrect options within an individual question which are rarely chosen by students. These options are poor at discriminating between weaker and stronger students and should be replaced to strengthen the question. If a question is answered incorrectly by 90% of a student group, it would probably be determined to be too difficult. However, it may also be found that all the students select the same incorrect response, providing an indication of a common misconception among the group.

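Two of the most common item statistics can be computed with elementary arithmetic, as sketched below: the facility value (the proportion answering correctly, so the 90%-incorrect question above has a facility of 0.10) and a classical upper-lower discrimination index. Real systems use refinements of these – see Hambleton, Swaminathan and Rogers (1991) – so treat this Python fragment as illustrative only:

```python
def facility(responses):
    """Proportion of students answering the item correctly.
    responses: list of 0/1 flags, one per student."""
    return sum(responses) / len(responses)

def discrimination(responses, totals, fraction=0.27):
    """Classical upper-lower discrimination index: facility among the top
    scorers on the whole test minus facility among the bottom scorers.
    totals: each student's overall test score, aligned with responses."""
    paired = sorted(zip(totals, responses))
    k = max(1, int(len(paired) * fraction))
    bottom = [r for _, r in paired[:k]]
    top = [r for _, r in paired[-k:]]
    return facility(top) - facility(bottom)

# Ten students: item marks (0/1) and their overall test scores.
item = [1, 0, 1, 1, 0, 0, 1, 1, 1, 0]
tot  = [55, 40, 70, 65, 35, 30, 80, 75, 60, 45]
print(f"facility = {facility(item):.2f}")                 # 0.60
print(f"discrimination = {discrimination(item, tot):.2f}")  # 1.00
```
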
Both students and lecturers can be informed about the detailed performance of an individual student. Specific strengths and weaknesses can be identified at a micro level and progress can be evaluated. At the level of a course or programme, question statistics can be usefully employed to help determine future curriculum design and development.

Questions are usually piloted before being included in a bank, and question statistics evaluated to weed out or improve any weak questions prior to inclusion. Initially, this can be time-consuming, but by piloting a few new questions each time tests are delivered, the bank grows and the quality of questions is assured. Formative and summative tests may result in different statistics for the same questions because the conditions for taking the test are likely to be different. For example, multiple attempts may be allowed in formative tests, resulting in a higher correct response rate.

Although such statistics could be generated from the grades of traditional paper-based assessments, the time taken to analyse marks and overall grades deters most academics. CAA software automates this process, making a wealth of data instantly available.

Examples in practice

Computerised question banks are used widely in the United States (Wainer, 2000), and the literature discussing the construction and application of computerised question banks dates from the early 1980s. The LTSN Economics Subject Centre is developing a question bank for academics in the UK involved in the teaching of Economics at higher education level. (Further information can be found at http://www.economics.ltsn.ac.uk/qnbank.htm – last accessed: 16/12/03.) Assessment materials are being collected, including paper-based assignments, objective tests, examination questions, problems and data response. Questions are described by their type, subject and academic level. The aim is to save time and effort for academic staff across the sector. However, issues of institutional intellectual property rights have hindered growth of the bank (Poulter, 2002).

The Electronic and Electrical Engineers Assessment Network (E3AN), a ‘Fund for the Development of Teaching and Learning’ project, is creating a bank of questions in a range of undergraduate engineering topics. Four UK institutions are collaborating to develop a peer-reviewed question bank for electronic and electrical engineering. The project ‘sees the peer review of questions as being essential in terms of achieving ownership of any test banks across the community. It also believes that the process is essential in terms of assisting the establishment of clear academic standards associated with the test banks’ (White and Davies, 2000).

Question banks can help to overcome some of the upfront time investment required by offering shared resources which can be re-used and adapted. This may be particularly useful where common subject matter is to be assessed for large numbers. The development of, and adherence to, standards can help with the technical construction of question banks. That said, shared question banks also raise a cluster of thorny issues around the problems of ownership, copyright and intellectual property (Bull and Dalziel, 2003).

Evaluating practice

Why is evaluation important?

It is important to evaluate educational innovations, new and adapted methods of assessment – and indeed established methods of assessment – in order to judge their impact on learning, efficiency and effectiveness. The combination of computers and assessment often encourages questions about the educational and operational effectiveness of CAA, questions which are rarely asked of existing assessment methods. This is largely positive and can prompt a review and amendment of assessment strategies and methods overall.

What aspects of CAA should be evaluated?

Evaluation of CAA can take many forms. For example, it may be at an individual module level or at a departmental or institutional level. In other cases, it may focus on the student or staff experience or on the effect of CAA on learning and grades.

Evaluation studies need to be realistic and to have clearly defined aims and objectives: often these will relate to the motivations for introducing CAA. It is highly unlikely that every aspect of the implementation of CAA can be evaluated, as this would present a time-consuming and exhaustive task. It is therefore essential to clearly define the aim(s) of introducing CAA in order to identify whether these aims have been met. It is also important to identify who the evaluation is for. For example, is it to inform lecturers about whether students have benefited from additional feedback on their learning? Or is it to provide evidence for the head of department that the use of CAA released time for lecturers to undertake more small-group teaching?

Those interested in the outcomes of CAA evaluation might include:

• academics
• members of teaching and learning committees
• educational technologists
• computer services staff
• examination officers
• senior managers
• quality assurance staff.

In order to be achievable, it may be necessary to focus the evaluation very specifically, for example by gathering qualitative data from a small group of students about their experiences of CAA, or by monitoring the volume and pattern of student access to CAA. Where more time and resources are available, broader approaches may be adopted, perhaps concerned with gathering experiences, perceptions and data concerning the use of several types of CAA from staff and students in order to provide a generic measure of the success of its introduction in a particular course. At an institutional level, evaluations may be concerned with time efficiencies, cost effectiveness and operational practices.

Examples of evaluations which may be useful, depending on the type and purpose of CAA, include:

• comparison of scores between CAA and paper-based tests
• correlation between CAA tests and other assessment methods within a module
• student attitudes towards CAA (ease of use, anxiety, relevance of content, accessibility, perceived equity of system)
• quality and speed of feedback to students
• quality of questions
• effects of CAA on student study behaviour
• staff attitudes towards CAA (educational efficacy, ease of use, anxiety, use in different educational levels, perceived).

(From McKenna and Bull, 2000.)

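Once marks are held digitally, several of these checks become one-line statistics. For example, the correlation between CAA scores and another assessment method within a module might be computed as follows (plain Python, Pearson's r, with made-up marks – a sketch of the calculation, not of a full evaluation design):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two aligned lists of marks."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

caa_scores   = [62, 48, 75, 55, 81, 40, 68]  # CAA test, per student
essay_scores = [58, 52, 70, 60, 78, 45, 64]  # essay mark, same students
print(f"r = {pearson_r(caa_scores, essay_scores):.2f}")
```
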
The literature provides examples of evaluations of several types. Early comparisons between paper and computer-based assessments were followed by more sophisticated triangulations of CAA results with essay and examination scores (Kniveton, 1996; Farthing and McPhee, 1999; Perkin, 1999). Students’ experiences of CAA are also reported in the literature (Dalziel and Gazzard, 1999; O’Hare, 2001; Ricketts and Wilks, 2002). Mulligan (1999) reports on the impact of regular CAA on students’ study behaviour. McGuire et al. (2002) evaluate the role of partial credit within CAA and paper-based examinations, and the mode of delivery is also explored by Fiddes et al. (2002). In the States, the Educational Testing Service, responsible for large-scale national testing, has conducted in-depth and large-scale evaluative studies of a range of issues relating to CAA.

What approaches to evaluation should be taken?

Approaches to evaluating learning technology generally have been developed over the last twenty years, and Oliver (1998) summarises the difficulties associated with using learning technology:

• ‘the effect of IT is not consistent across subject or age groups (Hammond, 1994);
• high and low ability learners benefit from different types of software (Atkins, 1993);
• lack of expertise amongst students or teachers can create difficulties (Hammond, 1994); and
• it can be extremely difficult even to specify or measure “educational value” (Mason, 1992)’.

These challenges apply equally to the evaluation of CAA, with the additional problem that assessment can be both culturally and politically sensitive – with many issues of power and control from both an academic and student perspective (McKenna, 2001).

Evaluation can serve a number of purposes, and several roles are described in the literature (Oliver, 1998; Draper et al., 1994). Although the terminology may vary, the principles appear to be commonly represented.

Formative evaluation

Such evaluations seek to identify and evaluate the use and impact of CAA with the purpose of feeding evaluative data back into the design of the system. Formative evaluations commonly identify problems with resources and propose solutions (Cronbach, 1982).

Summative evaluation

Summative evaluations provide information at a particular point in the implementation process, when a specified stage has been reached. They are often focused on a particular, well-defined question, seeking to measure achievement of learning outcomes.

Illuminative evaluation

These evaluations attempt to discover factors and issues which are important to certain groups of stakeholders, rather than measuring success using standard measures. It is common practice to seek to identify and explain problems in adoption using illuminative evaluation.

Integrative evaluation

Integrative evaluation takes a broad approach to the adoption of learning technology. The aim is to find ways of improving teaching, learning and assessment generally through the introduction of technology. Illuminative and integrative evaluations are often conducted hand-in-hand, as the former frequently provides the basis for the latter.

The importance of defining what you wish to evaluate cannot be overestimated! Small variations in wording or process can have considerable impact. Asking ‘What factors influence how students perform at CAA?’ suggests an exploratory study, which seeks to identify influences on performance. However, asking ‘Which of the following factors influences how students perform at CAA?’ suggests a comparative study, possibly involving a controlled experiment.

The extent to which the evaluation is exploratory is likely to determine the methods used. Asking ‘what’ questions indicates that few pre-conceptions of the answers are held, and that the study is largely exploratory in nature – in the example above, the factors that will influence learning need to be discovered in the course of the study. When asking ‘which’ questions, the factors which influence learning have already been identified and the evaluation is conducted to measure their influence on students. In this situation the evaluation is less exploratory. Qualitative methodologies, which ask open questions – such as interviews, observations, concept maps and focus groups – are suited to explorative studies. Checklists, experiments and quantitative data collection techniques require a framework for questions to be fixed in advance.

There is a range of methodologies which can be used to conduct evaluations, and the selection of different methods will depend on the purpose of the evaluation and the time and resources available. Cook (2002) provides a good overview of the range of methodologies which can be used to evaluate CAA, while the Evaluating Learning Technology toolkit (Oliver and Conole, 1998) provides a practical guided methodology.

Conclusion

The opportunities afforded by computer-assisted assessment, both formatively and summatively, are many. Students benefit from timely and specific feedback on their learning and a chance to practise skills and monitor their own progress. Lecturers can use CAA to complement and enhance existing assessment methods, to broaden the range of knowledge assessed, and even to extend the limitations of paper-based methods. Rapid and detailed feedback to staff can also help to guide the development of the curriculum, while automated marking releases time which can be spent on richer forms of face-to-face interaction with students. Assessment quality can be easily assured, and there is a level of consistency and objectivity in marking which is not possible for traditional forms of assessment. However, time and effort is required in designing pedagogically effective questions and tests. A rethinking of assessment – and often teaching and learning – is sometimes prompted by the introduction of CAA, and this can prove challenging as new ways of working need to be found. The operational and technical issues need careful planning and resourcing and should not be underestimated, especially where summative assessment is undertaken.

Technology develops at a furious pace and keeping up can seem daunting. However, with appropriate support, both pedagogical and technical, it is possible to use CAA to make assessment more effective and more efficient. A variety of opportunities are emerging which offer exciting possibilities for CAA. Sharing questions through banks, and standards compliance, will ease the necessity for reinventing questions and will open up opportunities for re-use and creative adaptation. Virtual learning environments provide opportunities to support and enhance assessment and, together with software developments and integration, are beginning to encourage the assessment of skills and abilities which would not be attempted on paper.

CAA is still in the relatively early stages of development. It requires creativity and often challenges well-established traditions of assessment, but it has powerful potential for both educators and students.

References

Ashton, H. S. and Beevers, C. (2002) Extending the Flexibility in an Existing On-line Assessment System. In Proceedings of the 6th Annual CAA Conference, Loughborough University, Loughborough, UK, 9-10 July, 2002, 3-16.

Atkins, M. (1993) Evaluating interactive technologies for learning. Journal of Curriculum Studies, 25 (4), 333-342.

Biggs, J. (1999) Teaching for Quality Learning at University. Society for Research into Higher Education and Open University Press, Buckingham, 175.

Britain, S. and Liber, O. (2000) A Framework for Pedagogical Evaluation of Virtual Learning Environments. JISC Technology Applications Programme. Bristol: Joint Information Systems Committee report. Available from: http://www.jtap.ac.uk/reports/htm/jtap-041.html (Accessed: January 2004)

Bull, J. and Dalziel, J. (2003) Assessing Question Banks. In Littlejohn, A. (ed.) Reusing Online Resources, London: Kogan Page.

Bull, J. and McKenna, C. (2001) Blueprint for Computer-assisted Assessment. CAA Centre, ISBN: 1-904020-00-3, http://www.caacentre.ac.uk.

Cook, J. (2002) Evaluating Learning Technology Resources. LTSN Generic Centre and the Association for Learning Technology, http://www.ltsn.ac.uk/genericcentre/index.asp?id=17149.

Cronbach, L. (1982) Issues in planning evaluations. In Cronbach, L. (ed.) Designing evaluations of educational and social programs. Jossey-Bass, San Francisco.

Dalziel, J. and Gazzard, S. (1999) Next generation computer-assisted assessment software: the design and implementation of WebMCQ. In Proceedings of the 3rd Annual CAA Conference, Loughborough University, Loughborough, UK, 16-17 June, 1999, 61-74.

Danson, M. et al. (2001) Large Scale Implementation of Question Mark Perception V2.5. In Proceedings of the 5th International CAA Conference, Loughborough University, 2001.

Draper, S. et al. (1994) Observing and Measuring the Performance of Educational Technology. TILT report no. 1, University of Glasgow.

Farthing, D. and McPhee, D. (1999) Multiple choice for honours-level students? A statistical evaluation. In Danson, M. and Sherratt, R. (eds), Proceedings of the 3rd Annual CAA Conference, Loughborough.

Fiddes, D. J. et al. (2002) Are Mathematics Exam Results Affected by the Mode of Delivery? ALT-J, 10, 61-69.

Hambleton, R. K., Swaminathan, H. and Rogers, H. J. (1991) Fundamentals of Item Response Theory. Sage Publications, California.

Hammond, M. (1994) Measuring the impact of IT on learning. Journal of Computer Assisted Learning, 10, 251-260.

Kniveton, G. H. (1996) A correlation analysis of multiple-choice and essay assessment measures. Research in Education, 56.

Kukich, K. (2000) Beyond Automated Essay Scoring. IEEE Intelligent Systems, 22-27.

Kuminnek, P. A. and Pilkington, R. M. (2001) Helping the Tutor Facilitate Debate to Improve Literacy using CMC. In Proceedings of the IEEE International Conference on Advanced Learning Technologies: Issues, Achievements and Challenges, Madison, Wisconsin, 6-8 August, 261-266.

Leacock, C. and Chodorow, M. (2000) Automated Scoring of Short Answer Responses. Research Report, ETS Technologies.

Mason, R. (1992) Methodologies for Evaluating Applications of Computer Conferencing. PLUM report no. 31, Open University, Milton Keynes.

McAlpine, M. (2002) Design Requirements of a Data Bank. Bluepaper Number 3, CAA Centre, University of Luton, http://www.caacentre.ac.uk.

McGuire, G. R., Youngson, M. A. and Korabinski, A. A. (2002) Partial Credit in Mathematics Exams - a Comparison of Traditional and CAA Exams. In Proceedings of the 6th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 9-10 July, 2002, 223-230.


McKenna, C. and Bull, J. (2000) Quality assurance of computer-assisted assessment: practical and strategic issues. Quality Assurance in Education, 8 (1).

McKenna, C. (2001) Who’s in control? Considering issues of power and control associated with the use of CAA: a discussion session. In Proceedings of the 5th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 2-3 July, 2001, 305-308.

Mulligan, B. (1999) Pilot study on the impact of frequent computerized assessment on student work rates. In Danson, M. and Sherratt, R. (eds), Proceedings of the 3rd Annual CAA Conference, Loughborough.

O’Hare, D. (2001) Student Views of Formative and Summative CAA. In Proceedings of the 5th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 2-3 July, 2001, 371-386.

Oliver, M. (1998) Evaluating Learning Technologies: a toolkit for practitioners. Active Learning, 8, 3-8. See also: http://www.unl.ac.uk/tltc/elt/toolkit.pdf

Oliver, M. (1999) A framework for evaluating the use of educational technology. Evaluation of Learning Technologies project, University of North London.

Perkin, M. (1999) Validating formative and summative assessment. In Brown, S., Bull, J. and Race, P. (eds), Computer-Assisted Assessment in Higher Education, Kogan Page, London.

Poulter, M. (2002) Personal communication with Dr Joanna Bull, April 2002.

Ricketts, C. and Wilks, S. (2002) What Factors Affect Student Opinions of Computer-assisted Assessment? In Proceedings of the 6th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 9-10 July, 2002, 307-316.

Robinson, J. (1999) Computer-assisted Peer Review. In Brown, S., Bull, J. and Race, P. (eds), Computer-Assisted Assessment in Higher Education, London: Kogan Page.

Wainer, H. (2000) Computerised Adaptive Testing: A Primer. Lawrence Erlbaum Associates, New Jersey.

White, S. and Davies, H. (2000) Creating large-scale test banks: a briefing for participative discussion of issues and agendas. In Proceedings of the 4th International Computer-assisted Assessment Conference, Loughborough University, UK, June 2000. Available from: http://www.caaconference.com (Accessed: January 2004)

Further Reading and Resources

Computer-assisted Assessment Centre website - http://www.caacentre.ac.uk (Accessed: January 2004)

CASTLE Assessment System - http://www.le.ac.uk/castle/ (Accessed: January 2004)

TRIADS Assessment System - http://www.derby.ac.uk/assess/talk/quicdemo.html (Accessed: January 2004)

Centre for Educational Technology Interoperability Standards - http://www.cetis.ac.uk (Accessed: January 2004)

Educational Testing Service - http://www.ets.org/research/newpubs.html (Accessed: January 2004)

Economics Learning and Teaching Support Network Question Banks - http://www.economics.ltsn.ac.uk/qnbank.htm (Accessed: January 2004)

Engineering Assessment Network (2002) - http://www.ecs.soton.ac.uk/E3AN/ (Accessed: January 2004)

Mostert, E. and Knoetze, J. G. (2001) Implementing an Electronic Portfolio Assessment Strategy: Multiple Pathways for Diverse Learners. In Proceedings of the 5th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 2-3 July, 2001, 349-357.
The Learning and Teaching Support Network Generic Centre

The Learning and Teaching Support Network (LTSN) is a network of 24 Subject Centres, based in higher education institutions throughout the UK, and a Generic Centre, based in York, offering generic information and expertise on learning and teaching issues that cross subject boundaries. It aims to promote high quality learning and teaching through the development and transfer of good practice in all subject disciplines, and to provide a ‘one-stop shop’ of learning and teaching resources for the HE community.

The Generic Centre, in partnership with other organisations, will broker information and knowledge to facilitate a more co-ordinated approach to enhancing learning and teaching. It will:
• work with the Subject Centres to maximise the potential of the network;
• work in partnership to identify and respond to key priorities within the HE community;
• facilitate access to the development of information, expertise and resources to develop new understandings about learning and teaching.

The LTSN Generic Centre Assessment Series

Guides for:
Senior Managers
Heads of Department
Lecturers
Students

Briefings:
Assessment issues arising from key skills
Assessment of portfolios
Key concepts: formative and summative, criterion and norm-referenced assessment
Assessing disabled students
Self, peer and group assessment
Plagiarism
Work-based learning
Assessment of large groups
Problem-based Learning
Computer-assisted Assessment
Implementing an Institutional Assessment Strategy

Published by the Learning and Teaching Support Network (LTSN), The Network Centre, Innovation Close, York Science Park, Heslington, York YO10 5ZF.

For more information, contact the Generic Centre at the above address or:
Tel: 01904 754555 Fax: 01904 754599
Email: gcenquiries@ltsn.ac.uk
www.ltsn.ac.uk/genericcentre

LTSN Generic Centre Assessment Series No 14
A Briefing on Computer-assisted Assessment
ISBN 1-904190-53-7
£75.00 (for full set)

© LTSN, January 2004


All rights reserved. Apart from any fair dealing for the purposes of research or private study, criticism or review, no part of this
publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, graphic, electronic,
mechanical, photocopying, recording, taping or otherwise, without the prior permission in writing of the publishers.
