Assessment Series No 14

Computer-assisted Assessment (CAA)

Joanna Bull and Myles Danson

LTSN Generic Centre
Learning and Teaching Support Network
Joanna Bull is Director of Eduology. Eduology provides research, consultancy, advice and creative solutions for businesses, training organisations, educational institutions and the public sector. It advises clients on ways to improve the effectiveness of learning and training within their organisation. Previously, Joanna was Head of Research in Teaching and Learning at the University of Luton and Project Manager for the Computer-assisted Assessment (CAA) Centre, a Teaching and Learning Technology Programme (TLTP) project. The CAA Centre (www.caacentre.ac.uk) was a national centre for expertise, advice and guidance on CAA in higher education. Joanna has extensive experience of project management and evaluation, has published widely, and has designed and delivered numerous workshops, seminars and conference presentations nationally and internationally. Her most recent publications include articles and books on the assessment of student learning, learning technology and strategies for implementing computer-assisted assessment.
Myles Danson is the CAA Manager at Loughborough University and has overall
responsibility for the centrally supported CAA systems. He is also the organiser of the
International CAA Conference (www.caaconference.com/), now in its seventh year. Over the
last five years Myles has developed and managed various CAA systems including optical
data capture, networked PC delivery, and most recently, the development and roll out of a
front line web-based CAA system. The latter was funded by the Joint Information Systems
Committee (JISC) and contributed to their Managed Learning Environment projects. The
CAA systems at Loughborough have delivered and marked over 200,000 assessments to
date. Myles engages in project management and income generation including the CAA
Centre project, FDTL and LTSN work and most recently, the JISC funded Technologies for
Interoperable Assessment project (www.toia.ac.uk). He also has substantial experience of
national and international publications, workshops, guest speaking, policy formation and
strategic planning.
Contents

Introduction
Overview of CAA
  What is computer-assisted assessment?
  What is possible with CAA?
Getting started
  Planning
  Piloting
  Institutional support
  Further development
Effective practice
  Case studies
Standards
Question banks
  What are 'question banks'?
  Why are they useful?
  Question statistics
  Examples in practice
Evaluating practice
  Why is evaluation important?
  What aspects of CAA should be evaluated?
  What approaches to evaluation should be taken?
Conclusion
References
Introduction
The purpose of this briefing is to explore the role which computers can play in the assessment process. The briefing introduces the range of activities which comprise computer-assisted assessment (CAA) and discusses the advantages and challenges of introducing computers into the assessment process. Ideas and suggestions on how to get started are provided and good practice highlighted through mini case studies of effective practice. Key issues are also raised such as standards, the role of question banks and the importance of evaluation. The briefing does not discuss the general pedagogical issues associated with assessment - these are covered in other briefings and guides. Neither does it discuss the technical solutions to computer-assisted assessment nor provide a detailed methodology for implementation. However, colleagues wishing to follow up on the pedagogical and implementation issues of CAA may find the Blueprint for Computer-assisted Assessment of use (Bull and McKenna, 2001).

This briefing is aimed at staff and educational developers and other staff who teach and support student learning. It may also be useful to anyone who wishes to gain an overview of the potential of computers in the assessment process and the associated key issues. The briefing provides a useful starting point for those considering CAA and could be used as a focus for discussion with colleagues or as the basis of an introductory staff development session. The briefing complements the existing assessment series, in particular the Guide for Lecturers, which provides an overall context within which CAA is one method of assessment that may be used.
Overview of computer-assisted assessment
What is computer-assisted assessment?

There is – in some ways unfortunately – a plethora of terminology which surrounds the use of information and communications technology (ICT) in higher education. The terminology changes and adapts at the rate of technological development and can cause confusion and distrust among academics and students. Other terms used in the literature and practice to describe types of CAA include: web-based assessment, computer-based assessment, online […]

It is true that there are examples of CAA consisting solely of multiple-choice questions and that some of these are not well designed and do not test more than basic knowledge. This is often a result of a lack of understanding about the pedagogy of designing questions and tests – a complex and skilled process. However, there are also examples of CAA which draw […]
Advantages and challenges
Computerising this process to any extent highlights not only those issues associated with traditional assessment, but also poses new issues. Promotion of a new form of assessment usually invokes criticisms rarely considered in the traditional assessment process. When considering CAA, it is useful to think about how the traditional process deals with issues such as candidate authentication, security and plagiarism, and to attempt to draw parallel conclusions where appropriate. There is a major issue of acceptance of new technologies within the curriculum and it is constructive to recognise that standard assessment techniques, frequently developed with little strategic planning, are often far from perfect. Mapping the assessment process can be a relatively complex task. However, a full evaluation of this often reveals inadequacies and opportunities for improvement which CAA can address. (See Danson et al., 2001, for an example of an assessment mapping process.)

At first glance, the key advantage of CAA is usually perceived as a saving in resources and time. Automated marking is highly desirable from both the point of view of the educator, and if students are properly informed about the capabilities and benefits of the CAA – the learner. Quick and often instant marking and feedback are clear benefits, leaving educators free from often […] has to wait for this process to be undertaken. If designed effectively, CAA can provide highly effective instant feedback. Students may take tests at a time and frequency to suit themselves and, in the case of open access web-based assessment, the technology often allows tests to be taken at a location to suit the student.

In the case of CAA potential benefits are available to all parties. The lecturer is free from manual marking: the student gets an instant and objective score with specific and timely feedback. In addition, the detailed scoring data are already digitised and the possibility of automated score upload to central repositories such as a student records system offers administrative benefits. This 'win-win' scenario provides great potential for formative, summative and diagnostic assessment. However, it must be recognised that CAA 'front loads' the assessment process – the majority of time and effort is invested in the design of questions and tests prior to the assessment taking place. With traditional assessments, greater time and effort is invested in marking once the assessment is completed. Creating pedagogically sound CAA is a highly skilled activity and should not be dismissed lightly. The testing medium can be far richer than paper-based assessment.
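To make the ideas of automated marking and instant feedback concrete, the sketch below shows, in illustrative Python, how an objective test might be scored against an answer key with per-question feedback returned immediately. The questions, answer key and feedback text are invented for the purposes of the example and are not drawn from any particular CAA package.

    # A minimal sketch of automated marking of an objective test.
    # All question identifiers, options and feedback are hypothetical.

    ANSWER_KEY = {
        "Q1": "B",
        "Q2": "D",
        "Q3": "A",
    }

    FEEDBACK = {
        "Q1": "See the week 2 notes on key terminology.",
        "Q2": "Revise the worked example in tutorial 3.",
        "Q3": "Covered in the first lecture handout.",
    }

    def mark_attempt(responses):
        """Return an instant score and per-question feedback for one attempt."""
        score = 0
        feedback = {}
        for question, correct_option in ANSWER_KEY.items():
            chosen = responses.get(question)
            if chosen == correct_option:
                score += 1
                feedback[question] = "Correct."
            else:
                # Wrong or unanswered: return the prepared remedial feedback.
                feedback[question] = "Incorrect. " + FEEDBACK[question]
        return score, feedback

    score, feedback = mark_attempt({"Q1": "B", "Q2": "A", "Q3": "A"})
    print(f"Score: {score}/{len(ANSWER_KEY)}")   # digitised score, ready for upload
    for question, comment in feedback.items():
        print(question, comment)

Because the score is produced as data rather than on paper, the same routine is also the point at which results could be passed on to a records system, which is where the administrative benefit described above arises.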
[…] back to allow for student numbers which exceed the numbers of computers available. In addition, policies and procedures may have to be developed to ensure smooth running of CAA within the context of existing assessment regulations.

In addition, accessibility needs to be considered in the design of questions and construction of online assessments. Appropriate measures should be put in place to ensure that assessments are accessible to all students. Existing policies and support mechanisms should be reviewed in light of the introduction of CAA for summative assessment.

A key challenge is providing appropriate pedagogical and technical support to academic and other staff involved in developing CAA. Both academic and support staff are likely to need staff development and training on both the pedagogical and operational aspects of CAA. There is also an ongoing need for support to ensure that academic staff are free to concentrate on developing pedagogically sound assessments rather than attempting to install and configure software on their institutional networks. To meet this challenge, clear roles and responsibilities should be discussed and decided early in the process of considering CAA. It is important to acknowledge that CAA requires skills and experience from across an institution, often involving a range of staff from academic, IT, library and administrative departments.
Getting Started
Once the purpose is defined, the next step is to identify stakeholders and set up appropriate consultative processes. This may be a relatively small group for formative or self-assessment which is supplementing existing assessment methods, or a larger faculty or institution-wide group for summative assessment. It is particularly important to involve appropriate IT staff in the consultative process given that the use of CAA may well impact on other systems and, depending on the model of support, can represent a shift in workload from academic to support staff. It is useful to recruit other enthusiasts at this stage to help drive the process and provide support and motivation. Review and evaluate the various solutions available and consider the relative […]

• Interoperability (IMS standards – see the section on Standards below)
• Existing systems and integration
• Scalability
• Performance levels
• Upgrading procedures and limitations
• Support and maintenance
• Security – particularly where summative assessment is intended
• Accessibility.

It may also be useful to visit other institutions with the aim of discovering both successes and failures. Be aware that there is a general reluctance to admit failures! The type and level of support required by staff and students should also be considered early in the process.
Staff development should be taken seriously and can help to avoid pitfalls later in implementation. In particular, pedagogical staff development should be undertaken to ensure that assessments delivered in CAA do not fall into the category of 'only good for testing low-level recall'.

It is useful to review existing questions – from textbooks, the Internet and other sources – as this can help with the formulation of ideas about how CAA can be used effectively. It may also be possible to find existing questions and tests which can be purchased, used freely, or shared and adapted for re-use. Care should, however, be taken to ensure questions are of a high standard and are pedagogically and culturally appropriate.

Piloting

It is important to test CAA systems and assessments, particularly where summative assessment is to be implemented. Formative open access assessments can prove a useful way of gathering information about software and question performance and student experience. The level of piloting will depend on how many students are likely to use the CAA and on what basis. A series of pilots may be valuable for high-stakes summative assessments. Piloting also provides an opportunity to test operational procedures and to gain feedback from students. The original purpose of the CAA should be tested by piloting to ensure that the system meets the needs of both students and staff.

Institutional support

It is useful (though not always essential) to gain institutional or faculty support for CAA. Because CAA often involves a range of staff, both academic and support, the aim should be to foster a culture of ownership and responsibility. Since traditional roles and job descriptions may not easily fit within the context of CAA development and support, appropriate training and support should be established. It may be beneficial to recruit a senior manager as a 'champion' to help drive the process forward and to secure resources where necessary. There may also be existing support departments (such as staff development, teaching and learning, and learning technology groups) that can help to support and guide the implementation in conjunction with academic staff. In some institutions, dedicated CAA support staff are appointed with this specific remit.

Further development

As CAA develops and more staff and students become involved, there may be a move from formative assessment towards summative assessment. The involvement of larger numbers of students in CAA and the high-stakes nature of summative assessment leads to further considerations, such as:

• Existing examination regulations and assessment policies and their suitability for CAA
• Special educational needs
• Back-up procedures and contingency planning
• Usability and ergonomics
• Bandwidth and stability of connections for off campus access.

The way in which CAA evolves will vary, and can be subject-specific. It is important to acknowledge the time and effort involved in creating well-designed, valid and reliable CAA and to be clear about the benefits and challenges of implementing CAA.
Effective practice
'Understanding Poetry' is a twelve week first year undergraduate elective module which teaches students about critical terms and theories, the process of developing a poem, the historical development of poetry, and some aspects of critical comparison. The module attracts quite large numbers. For some, it is a pre-requisite for further study while for others it is an elective and they may be unlikely to study literature any further. The size and mixed literary experience of the cohort mean that the number of free-text assessments (essays, critiques) is limited by what can reasonably be marked. The lecturer also cannot assume that all the students have a good knowledge of those critical concepts which are important as a basis for developing understanding and critical analysis skills. In order to effectively assess the module, a range of objective and subjective methods of assessment are used including the following:

• CAA objective test – to test knowledge of poetic terms, application of terms and concepts and understanding of historical development of the genre.

• Computer Aided Learning (CAL) manuscript assignment – to assess students' understanding of 'poetic process' from manuscript to publication.

In this case, the CAA component of the assessment strategy is both formative, with self-assessment questions available to students on demand, and summative, with an end-of-module exam. It ensures that students are tested on a wide range of poetic terms and their application. Students who are weaker can practise with self-assessment. This motivates them and has a positive effect on their other assessments. The lecturer also finds that she receives fewer basic and repetitive questions about poetic terms and concepts during teaching sessions. The summative assessment identifies that students have knowledge and understanding of a core set of critical terms and concepts.

(This case study is adapted from Bull and McKenna, 2001).
Using 'virtual learning environments' for assessment
Britain and Liber (1999) state that virtual learning environments (VLEs) are, 'Systems which synthesise the functionality of computer-mediated communications software (email, bulletin boards etc) and online methods of delivering course materials'. Examples of VLEs in use in the higher education sector include 'Blackboard', 'WebCT', 'Learnwise', 'Lotus Learning Space' and 'Top Class', among others. Surveys show that in the last two years many institutions have adopted a VLE to help support and enhance student learning.

VLEs can help support and deliver assessment. Details of assignment and other assessment requirements, examination timetables, regulations, and sources of help and advice can all be made available for students to access through a VLE via the Internet or intranet. Email and bulletin boards can be used to notify or remind students of deadlines.

VLEs also offer the opportunity to distribute global and individual feedback to a group of students about a particular assessment. This could be achieved by uploading a file containing global feedback which might usefully be linked to a discussion session in which students query and discuss the feedback and assignment. Feedback on assessed work can be emailed to individual students, inviting them to raise questions (by email, phone, in person) as appropriate. The online environment may well encourage some, otherwise reticent, students to voice their thoughts and concerns.

Some VLEs include the facility to develop basic objective test questions. These usually comprise three or four question types (multiple choice, multiple response, text match). The functionality of the question features is somewhat limited. However, they do offer a valuable opportunity to rapidly provide self-assessment questions which can be embedded within learning materials, allowing students to check their progress as they are working in the VLE. They are a good way of introducing CAA into the curriculum and usually provide a simple interface for question construction.

Beyond the use of VLEs to provide information, feedback and basic objective test questions there are interesting developments that indicate a movement towards the wider use of computers in the assessment process.

Peer assessment is one area where VLEs can be particularly powerful in allowing students to engage in critical review, evaluation and analysis. For example, a cohort of Economics students is asked to research and write a report exploring the effects of expanding the European Union on the Common Agricultural Policy. Once completed, students upload their reports to a secure folder on the VLE. The tutor anonymises each report and assigns it a number. Each student is then emailed with the number of the report they are to assess according to defined marking criteria and a series of questions designed to give feedback to the author.
Once students have completed the peer assessment, they post their marking schedules back to the tutor who may wish to moderate them prior to giving them to the students. This activity could be followed by an online discussion session, where students comment on the marks and feedback that they received, their experience of the process of marking peers' work, and how they would approach the report differently in future. Students could also be asked to complete a short objective test on how they approached both writing the report and marking it. Marks could be awarded to students for the report, their peer assessment, and engagement in the discussion or objective questions. Robinson (1999) describes a system enabling online anonymous peer review and the benefits this offers.

There is also interest in assessing the contributions which students make to online discussions. Approaches vary from awarding a small amount of credit for 'engaging' at any level, to attempts to measure the worth of individuals' contributions. Indeed, discussions are often richer learning experiences if they are assessed, as students feel obliged to contribute and to interact. Beware, however, of over-assessing.

To date, no clear models for how this is best approached have emerged, although the design of tools which graphically display levels of participation against defined criteria should help to resolve some of the issues associated with computer-mediated communication and assessment (Kuminek and Pilkington, 2001).

VLEs offer many opportunities to engage students in a range of formal and informal assessment activities, and it is perhaps their ability to integrate learning and assessment which is particularly powerful. Although bespoke assessment software offers greater flexibility for objective style assessments, VLEs have much potential for enhancing the assessment process, particularly for self- and peer assessment.
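The anonymisation and allocation step in the Economics example above can be illustrated with a short sketch. The Python below is purely illustrative: the student identifiers and file names are invented, and in practice a VLE would handle this through its own tools rather than a script. Each report is given an anonymous number and every student is allocated exactly one report to review, never their own.

    # A minimal sketch of anonymised peer-review allocation (hypothetical data).
    import random

    submissions = {
        "student_a": "report_a.pdf",
        "student_b": "report_b.pdf",
        "student_c": "report_c.pdf",
        "student_d": "report_d.pdf",
    }

    def allocate_reviews(submissions, seed=None):
        """Give each report an anonymous number and assign every student
        exactly one report to review, never their own."""
        rng = random.Random(seed)
        authors = list(submissions)
        rng.shuffle(authors)

        # Anonymise: each report is known only by its position in the shuffle.
        report_numbers = {author: number for number, author in enumerate(authors, start=1)}

        # Rotate the shuffled list by one place: student i reviews the report of
        # student i+1, so nobody is allocated their own report.
        allocations = {}
        for i, reviewer in enumerate(authors):
            author = authors[(i + 1) % len(authors)]
            allocations[reviewer] = report_numbers[author]
        return report_numbers, allocations

    report_numbers, allocations = allocate_reviews(submissions, seed=42)
    for reviewer, report_number in allocations.items():
        # In practice this would be emailed to the reviewer along with the
        # marking criteria and the feedback questions.
        print(f"{reviewer} -> review report {report_number}")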
Standards
[…] are equally important in any assessment process. Question and test content should be academically effective and processes to manage this could include peer review of assessment questions and tests. This is most often done 'in-house' whereby colleagues are asked to proof questions and tests. Increasingly, as technology advances and uptake is more widespread, content is being made available through third parties in the form of peer-reviewed question banks. (Learning and Teaching Support Network subject centres are beginning to produce and offer these, as are some commercial content providers.) As an institutional question bank expands there is potential to share material across traditionally disparate departments where teaching content is shared. If successful, this can be a huge benefit to both new teaching staff such as academic probationers and also established academics who are keen to adapt and 're-purpose' questions.

New questions and tests should be introduced gradually and performance monitored. The majority of CAA systems offer statistical item analyses and the use of these can reveal weaknesses in questions and errors which may have been neglected in past paper-based versions. Consideration of the item analysis will help to improve the reliability and validity of tests, ensuring they are an effective measure of student […] 'Question banks'.

Standard procedures become increasingly important as larger numbers of students and staff become involved in CAA. It is possible to bend the rules to accommodate individual needs when dealing with a handful of academics wishing to use CAA but this can lead to problems where significant numbers of staff and students are involved. Adopting standard approaches and procedures allows resources to be monitored and adjusted to cope with demand. In a traditional educational setting, it should be recognised that peak times occur throughout the academic year and these should be borne in mind. Diagnostic testing at the start of the academic year is often the first peak. The following weeks see an ebb in demand as lecture courses are delivered and content is disseminated to learners. Once enough content has been delivered, the demand for testing becomes greater and peaks again at examination times.

The technical standards are unique to CAA as a form of assessment. The language and wealth of acronyms surrounding standards initiatives can be confusing. However, the key issue is that standards will help different systems to exchange data.
This means that, if deployed successfully, it should be possible to link student record systems, virtual learning environments and CAA systems, resulting in a single point of entry to such systems: one user name and password is all that is ever needed. For CAA, this could mean that module registrations can be used to schedule assessments, that the system could automatically inform students of which tests they have to take, and that once completed, the score is uploaded and ratified.

The IMS Global Learning Consortium is the most advanced of the standards bodies, and was originally formed in 1997 as an Educom project. Its membership is extensive and a wide range of standards are currently being worked on. One of these standards is the Question and Test Interoperability (QTI) specification. UK higher education is represented by the Centre for Educational Technology Interoperability Standards on international learning technology initiatives such as IMS.

The QTI defines question types but separates content from presentation. This allows questions which meet this standard to be imported and exported between IMS compliant systems. This helps to avoid situations where the design of the software means that it is difficult to extract data held in one system, for example a bank of questions, and input it into another, for example a better assessment software package. Another advantage is that such a standard will allow questions to be shared across departments and institutions using compliant software.

Web-based systems rely on web browser applications which means that security issues, such as invoking a secure connection, should be addressed. Web browsers do pose some additional difficulties. We must guarantee that the assessment looks and behaves as expected across the different browsers and platforms in use, for although the web is 'cross platform', uniformity is not always simple. It may be necessary to define a specific configuration to ensure standard presentation. However, this can be problematic as settings at the client machine are hard to control. There are also differences in Macintosh and PC versions of the same product, such as Internet Explorer™. Reliance on Java™ and 'plug-ins' is common in web-delivered CAA and this must also be enabled on the student machine.
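To illustrate the separation of question content from presentation described above, the sketch below stores a question's content as a simple data structure, renders the same content in two different presentations, and exports it to a neutral text form that another tool could import. The dictionary layout and JSON export are purely illustrative and are not the IMS QTI format itself (QTI is an XML specification); the question, identifier and wording are invented for the example.

    # A minimal sketch: the same question content, two presentations, one export.
    import json

    question = {
        "identifier": "poetry-terms-001",      # hypothetical identifier
        "type": "multiple_choice",
        "stem": "Which term describes the repetition of initial consonant sounds?",
        "options": {"A": "Assonance", "B": "Alliteration", "C": "Enjambment"},
        "correct": "B",
        "feedback": "Alliteration repeats initial consonant sounds.",
    }

    def render_as_plain_text(q):
        """One possible presentation of the stored content."""
        lines = [q["stem"]]
        lines += [f"  {key}) {text}" for key, text in sorted(q["options"].items())]
        return "\n".join(lines)

    def render_as_html(q):
        """A second presentation, generated from identical content."""
        items = "".join(f"<li>{key}) {text}</li>" for key, text in sorted(q["options"].items()))
        return f"<p>{q['stem']}</p><ol>{items}</ol>"

    # Export to a neutral text form that a different, compliant tool could import.
    exported = json.dumps(question, indent=2)
    imported = json.loads(exported)            # round-trips without loss
    assert imported == question

    print(render_as_plain_text(question))
    print(render_as_html(question))

The point of the sketch is simply that, because the content is held independently of any one system's display logic, a whole bank of such questions can be moved between tools without retyping, which is the practical benefit the QTI specification aims to deliver.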
Question banks
What are 'question banks'?

Question banks are collections of questions, each uniquely identified and stored to allow the automated creation of tests to meet predefined criteria. Each question has associated descriptors which may define a number of features of the question such as academic level, topic, difficulty, and skill or knowledge addressed by the question. Questions can be contributed to (and withdrawn from) the bank by authors and users. Question banks are most commonly, but not exclusively, used with objective questions. They can be used to store questions delivered by computer and paper methods.

Why are they useful?

[…] Questions can be selected from a bank according to desired criteria and may be re-used where appropriate.

Question statistics

Statistical measures are used to determine the characteristics of questions and tests and indicate the worth of each question for inclusion in a bank. The question is the unit for analysis rather than the whole assessment, and each question is evaluated independently to generate item statistics. For a detailed description and discussion of the statistics of question banking see Hambleton, Swaminathan and Rogers (1991) and McAlpine (2002).
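As an illustration of the descriptors and item statistics described above, the sketch below (in illustrative Python, with an invented three-question bank and invented response data) assembles a test by descriptor and computes two commonly used measures for each question: the facility value, the proportion of candidates answering correctly, and a simple discrimination index, the difference in facility between the highest- and lowest-scoring thirds of the cohort. Operational systems, and the item response theory covered by Hambleton, Swaminathan and Rogers (1991), use more sophisticated models; this is only a sketch.

    # A minimal sketch of a question bank with descriptors and basic item statistics.
    # Questions, descriptors and response data are hypothetical.

    bank = [
        {"id": "Q1", "topic": "poetic terms", "level": 1},
        {"id": "Q2", "topic": "poetic terms", "level": 2},
        {"id": "Q3", "topic": "critical theory", "level": 2},
    ]

    def select(bank, **criteria):
        """Assemble a test automatically from questions matching the criteria."""
        return [q for q in bank if all(q.get(k) == v for k, v in criteria.items())]

    # Each row is one student's item scores (1 = correct, 0 = incorrect).
    responses = {
        "s1": {"Q1": 1, "Q2": 1, "Q3": 1},
        "s2": {"Q1": 1, "Q2": 0, "Q3": 1},
        "s3": {"Q1": 1, "Q2": 0, "Q3": 0},
        "s4": {"Q1": 0, "Q2": 0, "Q3": 0},
    }

    def facility(item, responses):
        """Proportion of candidates answering the item correctly."""
        scores = [r[item] for r in responses.values()]
        return sum(scores) / len(scores)

    def discrimination(item, responses):
        """Facility in the top third minus facility in the bottom third,
        ranked by total test score."""
        ranked = sorted(responses.values(), key=lambda r: sum(r.values()), reverse=True)
        third = max(1, len(ranked) // 3)
        top = sum(r[item] for r in ranked[:third]) / third
        bottom = sum(r[item] for r in ranked[-third:]) / third
        return top - bottom

    print(select(bank, topic="poetic terms", level=2))
    for q in bank:
        print(q["id"], facility(q["id"], responses), discrimination(q["id"], responses))

A question with a very high or very low facility value, or with poor discrimination, is a candidate for review before it is retained in the bank, which is exactly the kind of weakness the item analyses mentioned in the Standards section are intended to reveal.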
Evaluating practice
Why is evaluation important?

It is important to evaluate educational innovations, new and adapted methods of assessment – and indeed established methods of assessment – in order to judge their impact on learning, efficiency and effectiveness. The combination of computers and assessment often encourages questions about the educational and operational effectiveness of CAA, questions which are rarely asked of existing assessment methods. This is largely positive and can prompt a review and amendment of assessment strategies and methods overall.

What aspects of CAA should be evaluated?

Evaluation of CAA can take many forms. For example, it may be at an individual module level or at a departmental or institutional level. In other cases, it may focus on the student or staff experience or on the effect of CAA on learning and grades.

Evaluation studies need to be realistic and to have clearly defined aims and objectives: often these will relate to the motivations for introducing CAA. It is highly unlikely that every aspect of the implementation of CAA can be evaluated as this would present a time-consuming and exhaustive task. It is therefore essential to clearly define the aim(s) of introducing CAA in order to identify whether these aims have been met. It is also important to identify who the evaluation is for. Is it, for example, to show students that CAA provides additional feedback on their learning? Or is it to provide evidence for the head of department that the use of CAA released time for lecturers to undertake more small-group teaching?

Those interested in the outcomes of CAA evaluation might include:

• academics
• members of teaching and learning committees
• educational technologists
• computer services staff
• examination officers
• senior managers
• quality assurance staff.

In order to be achievable, it may be necessary to focus the evaluation very specifically, for example, by gathering qualitative data from a small group of students about their experiences of CAA, or by monitoring the volume and pattern of student access to CAA. Where more time and resources are available, broader approaches may be adopted, perhaps concerned with gathering experiences, perceptions and data concerning the use of several types of CAA from staff and students in order to provide a generic measure of the success of its introduction in a particular course. At an institutional level, evaluations may be concerned with time efficiencies, cost effectiveness and operational practices.
Examples of evaluations which may be useful, depending on the type and purpose of CAA, include:

• comparison of scores between CAA and paper-based tests
• correlation between CAA tests and other assessment methods within a module
• student attitudes towards CAA (ease of use, anxiety, relevance of content, accessibility, perceived equity of system)
• quality and speed of feedback to students
• quality of questions
• effects of CAA on student study behaviour
• staff attitudes towards CAA (educational efficacy, ease of use, anxiety, use in different educational levels, perceived […]).

From McKenna and Bull (2000)

The literature provides examples of evaluations of several types. Early comparisons between paper and computer-based assessments were followed by more sophisticated triangulations of CAA results with essay and examination scores (Kniveton, 1996; Farthing and McPhee, 1999; Perkin, 1999). Students' experiences of CAA are also reported in the literature (Dalziel and Gazzard, 1999; O'Hare, 2001; Ricketts and Wilks, 2002). Mulligan (1999) reports on the impact of regular CAA on students' study behaviour. McGuire et al. (2002) evaluate the role of partial credit within CAA and paper-based examinations, and the mode of delivery is also explored by Fiddes et al. (2002). In the States, the Educational Testing Service, responsible for large-scale national testing, has conducted in-depth and large-scale evaluative studies of a range of issues relating to CAA.

What approaches to evaluation should be taken?

Approaches to evaluating learning technology generally have been developed over the last twenty years and Oliver (1998) summarises the difficulties associated with using learning technology:

• 'the effect of IT is not consistent across subject or age groups (Hammond, 1994);
• high and low ability learners benefit from different types of software (Atkins, 1993);
• lack of expertise amongst students or teachers can create difficulties (Hammond, 1994); and
• it can be extremely difficult even to specify or measure "educational value" (Mason, 1992)'.
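Two of the evaluations listed earlier in this section, comparing CAA scores with paper-based test scores and correlating CAA results with other assessment methods within a module, lend themselves to a simple calculation. The sketch below uses invented score data and plain Python (no statistics package is assumed); it is not drawn from any of the studies cited above.

    # A minimal sketch of a score comparison and correlation for an evaluation.
    from math import sqrt

    # Hypothetical paired percentage scores per student: (CAA test, essay mark).
    paired_scores = [(72, 65), (58, 60), (85, 78), (40, 49), (66, 70), (91, 84)]

    caa = [c for c, _ in paired_scores]
    other = [o for _, o in paired_scores]

    def mean(xs):
        return sum(xs) / len(xs)

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two aligned score lists."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    print(f"Mean CAA score:   {mean(caa):.1f}")
    print(f"Mean other score: {mean(other):.1f}")
    print(f"Mean difference:  {mean(caa) - mean(other):+.1f}")
    print(f"Correlation r:    {pearson_r(caa, other):.2f}")

Figures like these only become meaningful alongside the qualitative evidence discussed above, such as student and staff attitudes, but they illustrate how readily the digitised scores produced by CAA lend themselves to this kind of analysis.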
Conclusion
The benefits of CAA, used both formatively and summatively, are many. Students benefit from timely and specific feedback on their learning and a chance to practise skills and monitor their own progress. Lecturers can use CAA to complement and enhance existing assessment methods, to broaden the range of knowledge assessed, and even extend the limitations of paper-based methods. Rapid and detailed feedback to staff can also help to guide the development of the curriculum, while automated marking releases time which can be spent on richer forms of face-to-face interaction with students. Assessment quality can be easily assured and there is a level of consistency and objectivity in marking which is not possible for traditional forms of assessment. However, time and effort is required in designing pedagogically effective questions and tests. A rethinking of assessment – and often teaching and learning – is sometimes prompted by the introduction of CAA, and this can prove challenging as new ways of working need to be found. The operational and technical issues need careful planning and resourcing and should not be underestimated, especially where summative assessment is undertaken.

With appropriate support, both pedagogically and technically, it is possible to use CAA to make assessment more effective and more efficient. A variety of opportunities are emerging which offer exciting possibilities for CAA. Sharing questions through banks, and standards compliance, will ease the necessity for reinventing questions and will open up opportunities for re-use and creative adaptation. Virtual learning environments provide opportunities to support and enhance assessment and, together with software developments and integration, are beginning to encourage the assessment of skills and abilities which would not be attempted on paper.

CAA is still in the relatively early stages of development. It requires creativity and often challenges well-established traditions of assessment but has powerful potential for both educators and students.
References
Ashton, H. S. and Beevers, C. (2002) Extending the Flexibility in an Existing On-line Assessment System. Proceedings of the 6th Annual CAA Conference, Loughborough University, Loughborough, UK, 9-10 July, 2002, 3-16.

Atkins, M. (1993) Evaluating interactive technologies for learning. Journal of Curriculum Studies, 25 (4), 333-342.

Biggs, J. (1999) Teaching for Quality Learning at University. Society for Research into Higher Education and Open University Press, Buckingham, 175.

Britain, S. and Liber, O. (2000) A Framework for Pedagogical Evaluation of Virtual Learning Environments. JISC Technology Applications Programme report. Bristol: Joint Information Systems Committee. Available from: http://www.jtap.ac.uk/reports/htm/jtap-041.html (Accessed: January, 2004).

Bull, J. and Dalziel, J. (2003) Assessing Question Banks. In Littlejohn, A. (ed.) Reusing Online Resources. London: Kogan Page.

Bull, J. and McKenna, C. (2001) Blueprint for Computer-assisted Assessment. CAA Centre, ISBN 1-904020-00-3, http://www.caacentre.ac.uk.

Cook, J. (2002) Evaluating Learning Technology Resources. LTSN Generic Centre and the Association for Learning Technology, http://www.ltsn.ac.uk/genericcentre/index.asp?id=17149.

Cronbach, L. (1982) Issues in planning evaluations. In Cronbach, L. (ed.) Designing evaluations of educational and social programs. Jossey-Bass, San Francisco.

Dalziel, J. and Gazzard, S. (1999) Next generation computer-assisted assessment software: the design and implementation of WebMCQ. In Proceedings of the 3rd Annual CAA Conference, Loughborough University, Loughborough, UK, 16-17 June, 1999, 61-74.

Danson, M. et al. (2001) Large Scale Implementation of Question Mark Perception V2.5. In 5th International CAA Conference Proceedings, Loughborough University, 2001.

Draper, S. et al. (1994) Observing and Measuring the Performance of Educational Technology. TILT report no. 1, University of Glasgow.

Farthing, D. and McPhee, D. (1999) Multiple choice for honours-level students? A statistical evaluation. In Danson, M. and Sherratt, R. (eds), Proceedings of the 3rd Annual CAA Conference, Loughborough.

Fiddes, D. J. et al. (2002) Are Mathematics Exam Results affected by the Mode of Delivery? ALT-J, 10, 61-69.

Hambleton, R. K., Swaminathan, H. and Rogers, H. J. (1991) Fundamentals of Item Response Theory. Sage Publications, California.

Hammond, M. (1994) Measuring the impact of IT on learning. Journal of Computer Assisted Learning, 10, 251-260.

Kniveton, G. H. (1996) A correlation analysis of multiple-choice and essay assessment measures. Research in Education, 56.

Kukich, K. (2000) Beyond Automated Essay Scoring. IEEE Intelligent Systems, 22-27.

Kuminek, P. A. and Pilkington, R. M. (2001) Helping the Tutor Facilitate Debate to Improve Literacy using CMC. In Proceedings of IEEE International Conference on Advanced Learning Technologies: Issues, Achievements and Challenges, Madison, Wisconsin, 6-8 August, 261-266.

Leacock, C. and Chodorow, M. (2000) Automated Scoring of Short Answer Responses. Research Report, ETS Technologies.

Mason, R. (1992) Methodologies for Evaluating Applications of Computer Conferencing. PLUM report no. 31. Open University, Milton Keynes.

McAlpine, M. (2002) Design Requirements of a Data Bank. Bluepaper Number 3, CAA Centre, University of Luton, http://www.caacentre.ac.uk.

McGuire, G. R., Youngson, M. A. and Korabinski, A. A. (2002) Partial Credit in Mathematics Exams - a Comparison of Traditional and CAA Exams. In Proceedings of the 6th International Conference on Computer-assisted Assessment, Loughborough University, Loughborough, UK, 9-10 July, 2002, 223-230.
Published by
Learning and Teaching Support Network (LTSN)
The Network Centre, Innovation Close, York Science Park, Heslington, York YO10 5ZF

For more information, contact the Generic Centre at the above address or
Tel: 01904 754555  Fax: 01904 754599
Email: gcenquiries@ltsn.ac.uk
www.ltsn.ac.uk/genericcentre

LTSN Generic Centre Assessment Series No 14
A Briefing on Computer-assisted Assessment
ISBN 1-904190-53-7
£75.00 (for full set)