Dimas Krisna Wijaya Kusuma_012_Assessment 3 Alternative Assessment Essay
NIM : 20220810012
Class : LAE A
Practical Exams
At my vocational school, EFL practical exams are an important part
of our final exam series. This assessment aims to evaluate our practical use of English, such
as speaking, listening, reading and writing skills. The assessment is divided into stages and
has its own objectives, each designed to test a different language skill. The speaking test
required us to engage in a conversation with our English teacher on a specific topic. We were
evaluated on our fluency, accuracy, pronunciation, and our ability to make the conversation
come alive. For the listening test, we were asked to listen to an audio clip and answer
comprehension questions based on the content. For reading, we were given a passage to read
aloud and answer questions to test our comprehension. The writing test required us to write
an essay on a specific topic, assessing our grammar, coherence, and argumentative skills. We
were anxious about the results, yet at the same time eager to know them. Many of my friends
felt that the speaking and listening tests were the most demanding, as these components
require spontaneous responses and intense focus. However, some students felt
that the practical exam would be more interactive and interesting if it used modern methods
and involved technology such as mobile phones and laptops. Under this modern method, the
scoring system was more transparent: rubrics were provided in advance, detailing the criteria
for each skill. The rubrics helped us understand what was expected, how we would be
assessed, and what we needed to improve afterwards.
The practical exam was generally well organized but there is still room for
improvement. The practicality of this assessment is high as it provides a realistic measure of
our language ability. However, scheduling is a challenge, as conducting individual speaking
tests for a large number of students requires substantial time and resources. In terms of
validity, the assessment accurately measures language skills as intended. However, reliability
is sometimes compromised due to subjective scoring, especially in the speaking and writing
components. Different teachers may have different interpretations of the rubric, leading to
inconsistent scores. The authenticity of the tasks is commendable, as they reflect real-world
language use. However, the listening tasks could include more varied accents to better
prepare students for global communication. The washback effect of these practical exams is
positive, encouraging students to focus on developing the full range of language skills rather
than memorization. Despite these advantages, the assessment could be improved by ensuring
more consistent and objective scoring. In addition, the inclusion of peer and self-assessment
components can increase reliability and provide a more comprehensive evaluation of
students' abilities.
In conclusion, while the existing practical exams at my vocational school are effective in
many ways, they could be improved by addressing reliability issues and incorporating
alternative assessment methods such as portfolios. Such improvements would provide a more
comprehensive and accurate evaluation of students' English language abilities and better
prepare them for future challenges.