
HANOI UNIVERSITY

Km 9 Nguyen Trai Road, Thanh Xuan, Hanoi, Vietnam


Telephone: (84-4)3854 4338; Fax: (84-4)3854 4550

E-mail: hanu@hanu.edu.vn; Website: www.hanu.edu.vn

GRADUATE DIPLOMA ASSIGNMENT COVER SHEET


Family Name: First Name:

Tran Khanh Van

Hoang Ngoc Tram

Unit Title: Language Testing and Assessment

Assignment Title: A literature review of 10 articles and a test evaluation

Name of Lecturer: Mr. Trinh Hai An, M.A. Class: 2PGN51

Date Submitted: May 12th, 2024

Student Contact Telephone No./Student Email Address:
Tran Khanh Van: (+84)973303597/ khanhvan.3397@gmail.com

Hoang Ngoc Tram: (+84)988786180/ ngoctramm180@gmail.com

STUDENT DECLARATION

I DECLARE THAT THIS ASSIGNMENT IS ORIGINAL AND HAS NOT BEEN SUBMITTED FOR ASSESSMENT ELSEWHERE.

I DECLARE THAT THIS ASSIGNMENT IS MY OWN WORK AND DOES NOT INVOLVE PLAGIARISM OR COLLUSION.

I GIVE MY CONSENT FOR THE ELECTRONIC VERSION TO BE EXAMINED BY RELEVANT PLAGIARISM SOFTWARE PROGRAMS.

I HAVE MADE A PHOTOCOPY OR ELECTRONIC COPY OF MY ASSIGNMENT, WHICH I CAN PRODUCE IF THE ORIGINAL IS LOST FOR ANY REASON.

SIGNED: DATED: 12.05.2024


MARKS

COMMENTS:

Lecturer’s Signature: ..……………………………………… Date: ………………....


Table of Contents
A. A REVIEW OF TEN RESEARCH ARTICLES........................................................................................ 2
I. Introduction .................................................................................................................................................. 2
II. A review of ten related research articles .......................................................................................... 2
III. Conclusion .................................................................................................................................................. 9
B. TEST EVALUATION ................................................................................................................................ 9
I. Description of the context ........................................................................................................................ 9
1. Teaching context ......................................................................................................................................................... 9
2. Course objectives ........................................................................................................................................................ 9
3. Course materials....................................................................................................................................................... 10
4. Course content ........................................................................................................................................................................... 10
II. Test specification ..................................................................................................................................... 14
1. Test context and test purpose .......................................................................................................................................... 14
2. Test context and construct ................................................................................................................................................. 14
III. Analysis and evaluation of the validity of the test ...................................................................... 15
1. Content validity ......................................................................................................................................................................... 15
2. Construct validity ..................................................................................................................................................................... 16
3. Face validity ................................................................................................................................................................................ 16
4. Validity in scoring .................................................................................................................................................................... 17
REFERENCES ............................................................................................................................................. 17
APPENDIX .................................................................................................................................................. 19

A. A REVIEW OF TEN RESEARCH ARTICLES

I. Introduction

The realm of language testing and assessment is both intriguing and complex, garnering
significant attention from researchers and educators globally. While the process appears simple —
constructing tests to measure student proficiency — it entails a myriad of challenges and nuances.
Extensive research has delved into these complexities, aiming to address various issues
encountered by test developers and educators. In light of this, a review is conducted to analyze ten
influential articles authored by well-known researchers. The objective is to assess the strengths
and limitations of these articles and provide valuable insights for future research in the field.

II. A review of ten related research articles

1. Overview: The alternatives in language assessment (Brown & Hudson, 1998)

In the article The alternatives in language assessment, Brown and Hudson delve into alternative language assessment methods, shedding light on their potential to address the shortcomings of traditional testing. They underscore the significance of reliability and validity in ensuring credible assessment outcomes, highlighting credibility and auditability as crucial aspects.

Alternative assessments are categorized into constructed-response, personal-response, and selected-response formats, each offering distinct advantages and disadvantages. Constructed-response assessments, including short-answer questions, are relatively quick to produce and administer and provide deeper insight into learners' productive ability, but they may assess only a few phrases or sentences and require carefully crafted prompts because multiple answers are possible. Secondly, personal-response assessments, such as portfolios or peer assessments, prioritize individualized reflection and student involvement but may raise issues of reliability and validity. Last but not least, selected-response assessments, such as True/False questions, focus on students' ability to select the correct answer and can be useful for large-scale assessment, but may overemphasize details and unimportant facts.

Teachers are urged to consider the broader implications of assessment practices, including the effect of testing and assessment on the curriculum (the washback effect), the relevance of feedback derived from assessments, and the value of drawing on multiple sources of information. Brown and Hudson's discourse underscores the importance of informed decision-making in assessment selection to optimize language learning outcomes.

Given the three overarching categories outlined above, together with the discussion of how teachers can make rational choices among the various assessment options, our subsequent readings led us to group the remaining articles according to their shared objectives:
• Constructed response assessments:
• Teacher perceptions of in-class speaking assessment (Nguyen, H. H. T. & Tran, T. T. N., 2018)
• The role of learners’ test perception in changing English learning practices: a case of a high-stakes English test at Vietnam National University, Hanoi (Lan, N.T. & Nga, N.T., 2019)
• Personal response assessments:
• Self-regulation through portfolio assessment in writing classrooms (Mak, 2018)
• Portfolios in the EFL classroom: disclosing an informed practice (Nunes, 2004)
• Self-assessment of language learning in formal settings (Harris, 1997)
• Teachers' perspectives on using peer assessment among young EFL learners (Yim,
2016)
• Investigating teacher-supported peer assessment for EFL writing (Zhao, 2014)
• Teachers’ rational choices among the various assessment options:
• Illustrating formative assessment in task-based language teaching (Gan & Leung,
2020)
• Challenges for college English teachers as assessors (Nguyen, 2022)

2. Research articles related to alternative assessments

2.1. Constructed response assessments

2.1.1. An investigation into EFL teachers’ perceptions of in-class speaking assessment (Nguyen, H. H. T. & Tran, T. T. N., 2018)

The study aimed to explore various aspects of speaking assessment within an EFL
classroom environment, encompassing teachers' comprehension of speaking assessment, the types
of tasks utilized in in-class speaking assessment, and teachers' participation in the assessment
implementation process. Forty-two EFL teachers (38 males and 4 females), aged between 23 and 50 and drawn from 15 different high schools in Quang Tri province, took part in the study; their teaching experience ranged from 1 to 22 years. Data collection involved questionnaires and interviews,
with the questionnaires comprising 44 items categorized into three sections and assessed via a five-
point Likert scale. The interviews consisted of seven questions linked to the questionnaire items
and underwent thematic analysis post-transcription.

The findings unveiled positive perceptions among participants regarding their
understanding of in-class speaking assessment and the array of task types, with mean scores of
4.17 and 3.92, respectively. Moreover, teachers expressed favorable views regarding their
involvement in the assessment application process, notably at the pre-stage. Fluency, interaction,
pronunciation, and vocabulary were deemed equally significant, while content and grammar
received lower rankings. Feedback emerged as a pivotal factor in aiding students in recognizing
their strengths and weaknesses. Identified areas for improvement and insights encompassed
knowledge gaps in portfolios and self-assessment, offering valuable guidance for forthcoming
training and development initiatives. Nonetheless, the study identified research gaps, including the
potential for overgeneralization stemming from its focus on a single province in Vietnam and the
perceived impracticality of the assessment method in the current educational context.

2.1.2. The role of learners’ test perception in changing English learning practices: a case of
high-stakes English Test at Vietnam National University, Hanoi (Lan, N.T. & Nga N.T.,
2019)

The objective of the study is to examine how students' perceptions of the VSTEP 3 test
influence their learning activities at Vietnam National University (VNU), Hanoi. Conducted at
VNU, Hanoi, the study involved 751 VNU students who were not specializing in English,
comprising 149, 360, and 242 students enrolled in GE1, GE2, and GE3, respectively. Utilizing a
two-part questionnaire with a 6-point Likert scale, the first section gauged students' views on the
test's difficulty, familiarity, and significance, while the second section delved into their English
learning practices, encompassing goal setting, study planning, content, materials, methods, and
test preparation strategies. Data analysis methods included exploratory factor analysis,
confirmatory factor analysis, and structural equation modeling.
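To make these procedures more concrete, the following is a minimal sketch of how an exploratory factor analysis of Likert-scale questionnaire data might be run in Python. It is an illustration only, under stated assumptions: the factor_analyzer package, the file name, and the choice of a three-factor varimax-rotated solution are ours; the study does not report which software or settings it used.

    # Minimal sketch of exploratory factor analysis (EFA) on Likert-scale survey data.
    # Assumptions: responses are stored in a CSV with one column per questionnaire item;
    # the file name and the three-factor solution are hypothetical, not taken from the study.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    responses = pd.read_csv("questionnaire_responses.csv")  # rows = students, columns = items

    fa = FactorAnalyzer(n_factors=3, rotation="varimax")  # rotated three-factor solution
    fa.fit(responses)

    # Loadings show how strongly each item relates to each extracted factor.
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
    print(loadings.round(2))

    # Variance explained per factor (sums of squared loadings, proportional, cumulative).
    print(fa.get_factor_variance())

Confirmatory factor analysis and structural equation modeling would then test whether the factor structure suggested by this exploratory step holds within the full measurement model.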

The findings indicated that a majority of respondents determined the test's difficulty level
based on assessments from senior students. Students experienced anxiety regarding the importance
of the VSTEP, with teachers acting as liaisons between students and the test, although they did not
underscore its difficulty or significance. While students felt assured about the skills being
evaluated, they harbored uncertainty regarding the VSTEP's format and purpose. The study
concludes that high-stakes tests like the VSTEP notably influence students' learning priorities,
fostering test-oriented learning due to the pressures and unfamiliarity associated with such
assessments.

The study's implications span various domains, including teachers' perceptions and
approaches, student learning practices, the teaching and learning process, and student perceptions
and behavior. Recommendations entail the development of interactive, meaningful, and supportive
activities to mitigate test pressure and the reassessment and enhancement of VNU's current English
program to address test unfamiliarity.

2.2. Personal response assessments

2.2.1. Self-regulation through portfolio assessment in writing classrooms (Mak, 2018)

Portfolio assessment (PA) is valued for its potential to enhance learning in writing
classrooms, yet its impact on students' self-regulation remains underexplored. Challenges in
implementation include sociocultural factors, limited teacher training, and a lack of assessment
literacy. Self-regulation, crucial for academic success, requires effective instructional practices.
Pintrich's framework offers a lens to explore this phenomenon, encompassing forethought,
planning, monitoring, control, and reflection. This study addresses how elementary teachers foster
self-regulation through PA and its effects on students. By investigating these questions, it aims to
fill the gap in understanding how PA can support self-regulated learning in elementary writing
instruction. Through empirical research, insights into effective pedagogical strategies for
cultivating self-regulated learning can be gained, contributing to the enhancement of writing
instruction in elementary classrooms.

2.2.2. Portfolios in the EFL classroom: disclosing an informed practice (Nunes, 2004)

In her article "Portfolios in the EFL Classroom: Disclosing an Informed Practice" (2004),
Alexandra Nunes explores the multifaceted role of portfolios in English as a Foreign Language
(EFL) education. Portfolios serve as a versatile tool for teachers to identify learners' skills,
competencies, learning preferences, and strategies. They also function as an alternative assessment
program, documenting students' learning processes and promoting reflective practices.

Nunes categorizes portfolios into several types: found samples, processed samples,
revisions, reflections, and portfolio projects. Found samples include class assignments selected by
teachers to showcase students' language abilities. Processed samples involve students' self-analysis
and assessment of their work, demonstrating comprehension and application of language concepts.
Revisions showcase students' efforts to enhance their language skills through graded, revised,
edited, and rewritten work. Reflections encourage metacognitive awareness and self-regulated
learning as students analyze and assess their portfolios. Portfolio projects feature works
specifically designed for inclusion, allowing students to showcase proficiency in diverse contexts.
Overall, Nunes emphasizes the importance of portfolios in EFL education for their ability
to provide a comprehensive view of students' linguistic development. By understanding the various
types of portfolios and their functions, educators can implement effective portfolio-based
assessment strategies to foster comprehensive language development and reflective learning
among EFL learners.

2.2.3. Self-assessment of language learning in formal settings (Harris, 1997)

In his article, Michael Harris investigates the application of self-assessment in compulsory
English as a Foreign Language (EFL) classes. Despite the acknowledged importance of self-
assessment for fostering learner autonomy, its implementation encounters numerous challenges
within formal educational settings. These challenges include institutional constraints, fixed
learning objectives, and cultural norms that may not align with the concept of self-directed
learning.

Moreover, critiques regarding the reliability of self-assessment further complicate its adoption in formal contexts. Studies questioning the accuracy of learners' self-assessment add to
the skepticism surrounding its efficacy within traditional educational frameworks. However,
Harris suggests that formal education, with its predetermined objectives and standardized
assessments, might benefit from embracing alternative perspectives on progress evaluation.

By empowering students to reflect on their learning progress and set personal criteria for
evaluation, self-assessment has the potential to enhance learner engagement and autonomy.
Despite these potential benefits, integrating self-assessment practices into formal language
education requires further exploration and research to identify effective implementation strategies
that accommodate the unique challenges of formal educational environments.

2.2.4. Teachers' perspectives on using peer assessment among young EFL learners
(Yim, 2016)

Assessment for learning is emphasized as an integral part of the learning process rather
than solely occurring at its conclusion. This approach allows students to actively participate in
assessment activities, fostering self-directed learning, particularly evident in the acquisition of
writing skills through feedback exchanges (Cho & Cho, 2011). However, a discrepancy arises
between the national curriculum's promotion of peer assessment and the reality of primary students
facing pressure from standardized tests. Consequently, an inquiry into primary school teachers'
intentions regarding the utilization of peer assessment in English lessons has emerged. The study
investigates the dynamics influencing primary school teachers' inclination towards incorporating
peer assessment in English writing. With 82 participants (79 females and 3 males) in Korea, data collection involved quantitative analysis via a Theory of Planned Behavior (TPB) survey and qualitative analysis through open-ended responses.

Quantitative analysis reveals strong reliability, indicating that Korean elementary school teachers generally hold positive attitudes towards peer assessment for teaching writing in English
classes, despite less favorable perceptions of subjective norms. Qualitative findings highlight
teachers' recognition of the benefits of peer assessment in enhancing students' writing skills, error
awareness, and self-reflection. However, concerns persist regarding its application, especially with
low-achieving students.

Consequently, further exploration into integrating peer evaluation for young EFL learners
is recommended, employing diverse methodologies such as surveys, observations, and interviews.
Moreover, the active involvement of educators, students, and potentially parents and
policymakers is encouraged to enhance the effectiveness of this endeavor.

2.2.5. Investigating teacher-supported peer assessment for EFL writing (Zhao, 2014)

In response to the interest generated by a previous study on implementing peer assessment, Huahui Zhao conducted a research project in an EFL writing class at a university in China over a duration of four months. The class comprised 18 sophomore English majors (10 males and 8 females). The study aimed to examine the effects of teacher
intervention strategies on peer assessment in EFL writing. Data collection involved analyzing peer
feedback from 79 assignments, teacher evaluations of peer feedback, revisions based on peer input,
and student perceptions of teacher-supported peer assessment through post-task questionnaires and
follow-up interviews. The research methodology encompassed introducing 9 writing tasks across
five different genres, pairing students to provide feedback and discuss each other's writing,
collecting initial drafts for evaluation, and mandating revisions before final submission.

The findings revealed three primary impacts of teacher intervention strategies on peer
assessment: training enhanced efficiency and quality, influenced the nature of peer feedback, and
shaped students' responses to and utilization of peer feedback based on teachers' input. Students
showed a preference for utilizing peer feedback endorsed by the teacher, as evidenced by post-task
questionnaires where all students acknowledged the usefulness of teacher-supported peer
assessment. Follow-up interviews highlighted three significant advantages of peer review:
improving comprehension of peers' writing, leveraging shared educational and cultural
backgrounds for better understanding, and fostering more interactive peer discussions compared
to interactions between teachers and students. This underscores the importance of teacher guidance
in refining students' ability to critically analyze peers' writing and provide more effective
feedback.

Recommendations for future research include extending the intervention period to assess
long-term effects on students' attitudes and practices, as well as enlarging the sample size to
enhance the applicability of findings to broader EFL contexts and instructional settings.

2.3. Research articles related to rational choices among the various assessment options

2.3.1. Illustrating formative assessment in task-based language teaching (Gan & Leung, 2020)

Formative assessment, defined as classroom activities providing diagnostic insights for teaching and learning, is acknowledged for its motivational impact on student learning. However, its implementation faces challenges in ESL contexts due to examination-driven systems and an emphasis on standardized testing. Task-Based Learning (TBL) offers a learner-centered approach
where tasks are central to language learning. Integrating formative assessment into TBL practices
holds promise for enhancing language pedagogy by empowering students in their learning journey.

Research illustrates how formative assessment can be seamlessly integrated into TBLT
cycles, exemplified by Hong Kong's ESL curriculum. This integration not only fosters language
proficiency but also promotes collaborative skills and self-assessment among students. Key
strategies for effective formative assessment, such as clarifying learning intentions and providing
actionable feedback, align closely with TBLT principles. However, successful implementation
hinges on educators' clear understanding of TBLT, necessitating comprehensive teacher training.

In conclusion, the research article argues that the synthesis of formative assessment and TBLT offers a transformative approach to ESL pedagogy. By embracing formative assessment,
educators can shift from traditional, teacher-centered methods to dynamic, student-centered
practices, fostering deeper learning and skill development among ESL learners.

2.3.2 Challenges for college English teachers as assessors (Nguyen, 2022)

The case study seeks to examine the obstacles encountered by teachers in a Vietnamese
university when assuming the role of language assessors in their outcome-based English courses.
Employing a qualitative case study methodology, the study focused on three distinct cases drawn
from two English faculties. Students underwent various periodic assessments organized by their
divisional instructors, alongside an English proficiency exam administered by an institution within
the university, mandated as part of their program's requirements. These assessments encompassed
online progress evaluations, mid-term appraisals, end-of-term tests, and a proficiency examination
spanning four years, with assessments conducted by three different teachers to ensure diversity
and comprehensive case study opportunities.

Data collection procedures involved gathering written narratives over a two-month period
and oral narratives within a two-week timeframe. Analysis and interpretation of the data followed
a series of steps, including transcribing oral narratives, coding both written and oral narratives, and
identifying themes from predetermined and emergent patterns.

The research highlighted that teachers initially undertook tasks such as grading, offering
feedback, crafting test items, and evaluating them at the outset of their careers. Challenges
encountered included inadequate assessment literacy, characterized by a lack of formal
background knowledge or insufficient expertise despite a basic understanding. Moreover, a lack
of shared knowledge and an unclear assessment identity presented hurdles, as evaluators often
struggled with differing interpretations of rating scales, exacerbated by the dual role of teachers as
both educators and assessors. Furthermore, insufficient professional training and limited
opportunities for professional discussions exacerbated these challenges.

Implications drawn from the findings emphasized the conflict stemming from teachers' diverse
roles as a central cause of the challenges encountered. Additionally, it was noted that a lack of
interaction among stakeholders exacerbates the situation, necessitating collaborative efforts to
effectively address the identified issues.

III. Conclusion
After reviewing ten related research articles on language testing and assessment, several
recurring problems have emerged. While language testing and assessment play a crucial role in
evaluating language proficiency and informing educational decisions, they are not without their
challenges. Addressing these issues requires a concerted effort from researchers, educators,
policymakers, and assessment developers to ensure that language assessments are valid, reliable,
fair, and authentic measures of language proficiency. Only through ongoing research,
collaboration, and innovation can we strive towards more effective and equitable language
assessment practices.

B. TEST EVALUATION

I. Description of the context

1. Teaching context
The educational setup is crafted as a comprehensive language program that extends over
35 weeks in an academic year, tailored for a small group of 20 to 25 students. These learners are
at the A1 Movers level (Young Learners English), indicating a basic grasp of the language being
taught.
Assessments are strategically planned throughout the course, happening every 7 weeks,
with 3 lessons per week. The curriculum comprises 87 sessions, including 6 review sessions, 5
sessions for progress assessments, 4 sessions for test corrections and feedback, and 3 sessions
dedicated to Cambridge YLE Practice Tests. These periodic progress assessments act as milestones
for students to measure their progress, reinforce their comprehension, and pinpoint areas needing
further attention and enhancement.

2. Course objectives
The ESL curriculum enables even the youngest students to communicate with confidence
and efficacy, fostering the acquisition of skills necessary to engage with various forms of
information, media, and texts. It advocates for active learning, nurtures critical thinking abilities,
and fosters intellectual involvement.
Aligned with the Council of Europe’s Common European Framework of Reference for
Languages (CEFR), this curriculum instills in learners a curiosity for other languages and cultures,
illuminating how these elements shape our worldview. Through this approach, students perceive
themselves as adept language learners, capable of effective communication and enjoying the

exploration of diverse texts as their proficiency grows. This framework facilitates an integrated
teaching strategy aimed at cultivating proficient English communication skills. The five strands,
along with their respective learning objectives, synergistically contribute to the development of
knowledge, skills, sub-skills, and comprehension in Reading, Writing, Use of English, Listening,
and Speaking.

3. Course materials
Covering a range of international cross-curricular topics, from space exploration to
community studies, this collection of resources aids learners in acquiring the necessary skills to
engage with diverse topics across the curriculum in English. Filled with literature and interactive
activities, the program fosters learners' confidence in communication. Endorsed by Cambridge
Press, particularly the Cambridge Global English (2nd edition), these materials align with the
fundamental Cambridge subject framework. They facilitate comprehensive language acquisition,
empowering students to methodically and efficiently enhance their abilities, starting from the defined A1 Movers level.

4. Course content

The content is predominantly structured in accordance with Cambridge guidelines, ensuring the fulfillment of objectives concerning language skills, grammar structures, vocabulary,
and communication skills corresponding to the proficiency level. Additionally, the course
encompasses a diverse array of topics aligned with the Cambridge Primary curricula for both
Cambridge Primary Maths and Cambridge Primary Science.

Period 52
Content: Unit 5: Let’s measure, Lesson 4.2: Using the past simple
Objectives: Students read the story and identify the verbs in past tense.
Vocabulary: drew, rub out

Period 53
Content: Unit 5: Let’s measure, Lesson 5.1: Words that sound the same
Objectives: Students are able to recognize the homophones.
Vocabulary: one-won, two-too, four-for, eight-ate, racehorse

Period 54
Content: Unit 5: Let’s measure, Lesson 6.1: Many ways to count to ten
Objectives: Students read the story, identify the key words, and recognise the verbs in past tense.
Vocabulary: contest, quickly, spear, water ox, quietly

Period 55
Content: Unit 5: Let’s measure, Lesson 6.2: Many ways to count to ten
Objectives: Students read the story again, discuss the plot, and assess the characters; use adjectives to talk about people's qualities.
Vocabulary: kind, funny, honest, patient

Period 56
Content: Unit 5: Let’s measure, Lesson 7: Project challenge
Objectives: Students do the project to reinforce their knowledge of Unit 5 and revise the numbers and verbs in the past tense.

Period 57
Content: Unit 6: All about bugs, Lesson 1: Bugs and other garden animals
Objectives: Students are able to identify some insects, compare and contrast insects, and use the determiners all and some and prepositions to describe.
Vocabulary: insect, butterfly, bee, cricket, ant, worm, spider, above, between, in front of, on

Period 58
Content: Unit 6: All about bugs, Lesson 2: Crickets and other insects
Objectives: Students are able to compare how animals are similar and different.
Vocabulary: antennae, wings, worm, diagram, pet, pocket, plenty of

Period 59
Content: Unit 6: All about bugs, Lesson 3.1: Ants and spiders
Objectives: Students read and talk about ants.
Vocabulary: communicate, trail, feel, smell, taste, build, seed

Period 60
Content: Unit 6: All about bugs, Lesson 3.2: Ants and spiders
Objectives: Students are able to read and talk about spiders, and understand the similarities and differences between ants and spiders.
Vocabulary: web, silk, mice

Period 61
Content: Unit 6: All about bugs, Lesson 4: Writing questions
Objectives: Students are able to use WH-questions to ask and answer.
Vocabulary: helpful, silkworm, honey, spot

Period 62
Content: Unit 6: All about bugs, Lesson 5: Rhyming words, words with long e
Objectives: Students are able to identify rhyming words and the long e sound.
Vocabulary: chest, flea, leaf, bump

Period 63
Content: Unit 6: All about bugs, Lesson 6.1: Little Ant
Objectives: Students listen to the story, identify keywords, and understand the story’s content.
Vocabulary: folktale, beetle, blow-blew, shiver, lift up, chase, busy, bite, yelp

Period 64
Content: Unit 6: All about bugs, Lesson 6.2: Little Ant
Objectives: Students read the story again to identify the verbs, assess the characters, and answer the comprehension questions.

Period 65
Content: Unit 6: All about bugs, Lesson 6.3: Little Ant
Objectives: Students are able to use the story map to retell the story and then act the story out.

Period 66
Content: Unit 6: All about bugs, Lesson 7: Project challenge
Objectives: Students do the project to reinforce their knowledge of Unit 6 and revise the vocabulary related to insects.

Period 67
Content: Unit 7: The world around us, Lesson 1: Caring for planet Earth
Objectives: Students have a sense of caring for the Earth and know how to do it.
Vocabulary: planting, watering, picking up, bin, recycle

Period 68
Content: Unit 7: The world around us, Lesson 2: Plants and flowers
Objectives: Students are able to do an experiment, then describe the processes and talk about plants.
Vocabulary: roots, stem, leaf, flower

Period 69
Content: Unit 7: The world around us, Lesson 3: The importance of trees
Objectives: Students know the importance of trees, then talk about trees and recycling.
Vocabulary: leaves, fresh, breathe, fruit, wood, fires, furniture, soil, cut down, recycle, factory, nappies

Period 70
Content: Unit 7: The world around us, Lesson 4: Using this/these and that/those
Objectives: Students are able to use this (one)/these and that/those properly, and use the sequencers to talk about a process.
Vocabulary: first, next, then

Period 71
Content: Unit 7: The world around us, Lesson 5: Long o
Objectives: Students are able to identify long o spellings and the variant sounds of ow/ou.
Vocabulary: snow, slow, owl, bowl, soap, boat, stone, throw, toes, crow, shower, down

Period 72
Content: Unit 7: The world around us, Lesson 6.1: Wangari Maathai
Objectives: Students are able to tell what a biography is, read and identify key words, and understand the content.
Vocabulary: biography, village, fig, dry (v), blow away, tea, firewood, autobiography

Period 73
Content: Unit 7: The world around us, Lesson 6.2: Wangari Maathai
Objectives: Students are able to retell the text using the sequencers and use the past simple tense to write an autobiography.

Period 74
Content: Unit 7: The world around us, Lesson 7: Project challenge
Objectives: Students do the project to reinforce their knowledge of Unit 7 and revise the vocabulary relating to plants and the past simple tense.

Period 75: Revision for Progression Test 3 (Units 6-7)

Period 76: Progression Test 3

II. Test specification

1. Test context and test purpose

The test, administered after the completion of three units, corresponding to a total of 22
lessons, serves as a formative assessment within the curriculum. Designed as a progress test, its
primary aim is multifaceted. Firstly, it acts as a comprehensive measure of students' grasp of the material covered thus far, providing instructors with valuable insights into individual and collective progress. Additionally, the test results enable teachers to tailor their teaching
methods to address specific needs effectively. Through this iterative process of assessment and
adjustment, the overall learning experience within the course is enriched, ensuring that each
student receives personalized attention and support on their educational journey.

2. Test context and construct

Text length

The precise extent of the reading passages remains unspecified within the test's structural
framework and accompanying directives. Nonetheless, the passages are intentionally crafted to
span approximately 50-70 words, a deliberate consideration of both students' proficiency levels
and the temporal constraints inherent in the reading assessment.

Timing

The overall duration of the examination spans 35 minutes, with no explicit indication
regarding the allotted time for the reading assessment. Consequently, students are required to
autonomously manage their time allocation to ensure completion of the reading part within the designated timeframe. The anticipated finishing time for the three reading parts is not provided,
resulting in uncertainty regarding the duration students have to finalize these assessments.

Test structure and number of questions

The test comprises three reading sections, encompassing a total of 11 points and 9
questions. A consistent directive across all three reading exercises entails students viewing an
image, reading a narrative, and fulfilling assigned tasks, with each section offering a model
example. In the first part, students are required to read for specific details and fill in the blanks with 1 to 3 words to complete the sentences about the story. In the subsequent part, students need to write the correct past simple form of the verbs in brackets. The last part involves reading the text and deciding whether statements are true or false by ticking Yes or No.

Sub-skills examined

The test evaluates the skills acquired by students from Unit 5 to Unit 7. It encompasses
reading comprehension, necessitating the reading for specific information from texts (Scanning).
Additionally, integrated skills such as reading and writing are incorporated.

Topics tested

The test content reflects the thematic units outlined in the course curriculum, specifically
centered around Unit 5: Let’s measure, Unit 6: All about bugs, and Unit 7: The world around us.

Language element

In Part 2 of the test (Write the correct past simple form of the verbs in brackets. There is one example.), there is an examination of students' comprehension of a grammar point (the past simple tense). This not only evaluates their grasp of grammar rules but also their ability to apply
them accurately in context. Moreover, this clarity can facilitate consistent preparation and enhance
students' ability to refine their understanding of crucial language components.

III. Analysis and evaluation of the validity of the test

1. Content validity

Topic

The chosen subjects in the three reading texts correspond to topics taught in the students' curriculum. In particular, the reading texts focus on the story of Daisy and her family’s visit to
the farm, which was introduced in Unit 6: All about bugs. Because students have already been
exposed to this topic, understanding the readings and responding to related questions may be easier
for them.

Text length

In contrast, the text length in the formative test is notably shorter than the content students
typically engage with during classroom practice. This aids in fostering students' confidence,
averting any sense of intimidation from overly long reading materials. However, in the second
exercise, the text consists of roughly 70 words, necessitating a slightly longer period for students
to read and answer the four accompanying questions. Overall, from our perspective, the three
reading passages appear to be of similar length, primarily suitable for those at the Pre A1 level.

Vocabulary level

Upon employing the Oxford text checker tool to assess the vocabulary level within the
three reading texts, a conspicuous predominance of A1-level vocabulary emerges, while the
presence of A2-level vocabulary remains minimal. Such findings align with expectations,
considering the primary aim of the reading test: to evaluate students' reading abilities at the A1
level. This distribution of vocabulary levels underscores the suitability of the test content for its
intended purpose, ensuring that the text complexity aligns closely with the targeted proficiency
level of the students.

[Oxford Text Checker vocabulary profiles for Reading 1, Reading 2, and Reading 3]
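For readers who want to replicate this kind of check, the sketch below approximates a CEFR vocabulary profile in Python. It is a rough, illustrative approximation only: the wordlist file names are hypothetical placeholders, and the Oxford Text Checker's own matching rules (its underlying Oxford wordlists, inflected forms, multi-word items) are not reproduced here.

    # Rough sketch of CEFR vocabulary profiling, loosely analogous to an online text checker.
    # Assumptions: "a1_words.txt" and "a2_words.txt" are hypothetical plain-text wordlists,
    # one word per line; real tools also handle inflections and multi-word expressions.
    import re
    from collections import Counter

    def load_wordlist(path: str) -> set[str]:
        with open(path, encoding="utf-8") as f:
            return {line.strip().lower() for line in f if line.strip()}

    def profile(text: str, levels: dict[str, set[str]]) -> Counter:
        counts = Counter()
        for word in re.findall(r"[a-zA-Z']+", text.lower()):
            level = next((name for name, words in levels.items() if word in words), "other")
            counts[level] += 1
        return counts

    levels = {"A1": load_wordlist("a1_words.txt"), "A2": load_wordlist("a2_words.txt")}
    passage = "Daisy and her family visited the farm and saw many insects."
    print(profile(passage, levels))  # e.g. Counter({'A1': 10, 'other': 2})

A profile dominated by A1 entries, as reported above, would support the claim that the passages match the targeted proficiency level.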

2. Construct validity

3. Face validity

Face validity concerns the initial impression of how well an evaluation, examination, or assessment aligns with its intended purpose. In this context, the present test demonstrates face validity as it resembles a typical passage-based reading test, with almost all questions designed to assess students' comprehension abilities. However, while Parts 1 and 3 primarily assess students' reading comprehension of passages, Part 2 seems to concentrate more on evaluating their understanding of a grammar point within the context of a related story.

4. Validity in scoring

In this test, students are tasked with various activities, ranging from filling in missing words to providing correct verb forms and ticking Yes or No. Given the diverse nature of these tasks, scoring may be influenced by factors such as spelling, grammar, or even handwriting. The scoring system corresponds to the test's format, awarding one point per question (except in Part 3, where two points are allocated per question). Provided markers apply these weightings consistently, the scoring validity of this test remains largely intact.
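As a small arithmetic check on this weighting (assuming, as the description above suggests, that every question outside Part 3 is worth one point and every Part 3 question two points), the stated totals of 9 questions and 11 points imply that Part 3 contains two questions:

    # Arithmetic check of the scoring scheme described above.
    # Assumption: non-Part-3 questions are worth 1 point each; Part 3 questions 2 points each.
    total_questions = 9
    total_points = 11
    # (total_questions - p3) * 1 + p3 * 2 = total_points  =>  p3 = total_points - total_questions
    p3_questions = total_points - total_questions
    print(p3_questions)  # -> 2, i.e. two 2-point questions in Part 3 under this reading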

REFERENCES

Brown, J. D., & Hudson, T. (1998). The Alternatives in Language Assessment. TESOL
Quarterly, 32(4), 653. https://doi.org/10.2307/3587999
Gan, Z., & Leung, C. (2019). Illustrating formative assessment in task-based language
teaching. ELT Journal, 74(1), 10–19. https://doi.org/10.1093/elt/ccz048
Harris, M. (1997). Self-assessment of language learning in formal settings. ELT Journal, 51(1),
12–20. https://doi.org/10.1093/elt/51.1.12
Mak, P., & Wong, K. M. (2017). Self-regulation through portfolio assessment in writing
classrooms. ELT Journal, 72(1), 49–61. https://doi.org/10.1093/elt/ccx012
Nguyen, T. C. (2022). Challenges for college English teachers as assessors. VNU Journal of Foreign Studies, 38(5). https://doi.org/10.25073/2525-2445/vnufs.4817
Nguyen, T. L., & Nguyen, T. N. (2019). The role of learners’ test perception in changing English learning practices: A case of a high-stakes English test at Vietnam National University, Hanoi. VNU Journal of Foreign Studies, 35(6). https://doi.org/10.25073/2525-2445/vnufs.4481
Nguyen, H. H. T., & Tran, T. T. N. (2018). An investigation into EFL teachers’ perceptions of in-class English speaking assessment. VNU Journal of Foreign Studies.
Nunes, A. (2004). Portfolios in the EFL classroom: disclosing an informed practice. ELT
Journal, 58(4), 327–335. https://doi.org/10.1093/elt/58.4.327

Yim, S. Y. (2016). Teachers’ perspectives on using peer assessment among young EFL
learners. JLTA Journal, 19(0), 15.
Zhao, H. (2014). Investigating teacher-supported peer assessment for EFL writing. ELT
Journal, 68(2), 155–168. https://doi.org/10.1093/elt/cct068

APPENDIX

