
Annex 1 Supplier Response

For the supply of Organisations with expertise on assessment tools


for English Language Learning to the British Council

Company name:

Pearson Educación de Colombia SAS


Company address:

Carrera 7 # 156 – 68, floor 26

Company Reg:

S.A.S

Contact name:

Diego Tafur

Contact email address:

diego.tafur@pearson.com

Contact Telephone number:

3158520718

Supplier Response Template (annex to RFP/ITT) – 26 February 2019


For individuals and legally established entities domiciled in Colombia, please provide as applicable:

1) Photocopy of the citizenship card (cédula de ciudadanía) of the Legal Representative
2) Photocopy of RUT (Registro Único Tributario)
3) Certificado de Existencia y Representación Legal (issued within the last 30 days of the submission date)
4) Financial statements: Balance Sheet and Income Statement as of 31st December 2020

To provide proof of the technical capacity, the applicant must submit supporting documentation, appending the following:

5) List of work team members who would be assigned to the service for this contract and their role
6) Brief CV with the information directly associated to the service to be provided
7) Three (3) business references, preferably related to services like the ones specified in this ITT

For overseas bidders, please provide:

- Document which proves the legal status of the bidding entity or individual
- Financial statements: Balance Sheet and Income Statement as of 31st December 2020

To provide proof of the technical capacity, the applicant must submit supporting documentation, appending the following:

- List of work team members who would be assigned to the service for this contract and their role
- Brief CV with the information directly associated to the service to be provided
- Three (3) business references, preferably related to services like the ones specified in this ITT

___________________________________

Instructions
1) Provide Company Name and Contact details above.

2) Complete Part 1 (Supplier Response) ensuring all answers are inserted in the space below
each section of the British Council requirement / question. Note: Any alteration to a
question will invalidate your response to that question and a mark of zero will be applied.

3) Complete Part 2 (Submission Checklist) to acknowledge and ensure your submission


includes all the mandatory requirements and documentation. The checklist must also be
signed by an authorised representative.

4) Submit all mandatory documentation to [englisharea@britishcouncil.org.co / British Council’s
e-Tendering portal hosted at https://in-tendhost.co.uk/britishcouncil] by the Response
Deadline, as set out in the Timescales section of the RFP/ITT document. If procurement is
conducted via the British Council’s e-Tendering portal hosted at
https://in-tendhost.co.uk/britishcouncil, all communication must be conducted via the
correspondence tab within the project.

Part 1 – Supplier Response

1.1 Responses will be scored according to the methodology as set out in Evaluation Criteria section of the
tender document.

Social Value – 10%


ID % Requirement

SV01 10% Supplier Note: Please refer to Procurement Policy Note (PPN) 06/20 before completing this
criterion. PPN 06/20 Social Value (Maximum word count 750 Words)

• Has your organisation developed/adapted products considering contexts of vulnerable population? If so, please give an example and what changes/adaptations did you apply.

Pearson has increased access to learning for underserved groups through new and existing
products and partnerships, identifying strategies to overcome barriers. These groups include, but
are not limited to, women, racial minorities, low-income groups and people with disabilities.

We have strengthened existing processes and created new processes, an Editorial Policy, and
partnerships to eliminate bias and represent the consumers we serve, including on the basis of
race, ethnicity and gender, in our products and through our content providers. In Pearson LATAM
we have materials such as Our Stories, a series developed around diversity and inclusion, whose
characters represent all ethnic groups and physical abilities.

• Has your organisation adapted to deliver products/services to people with disabilities? If so, please provide an example and explain what your organisation did.

We believe in the power of learning and want to help everyone get access to quality life-long
learning. That is why we are contributing to UN Sustainable Development Goal (SDG) 4, which
calls for inclusive education and lifelong learning opportunities for all by 2030. Our Stories was
developed with dyslexic students in mind, in both content and design, so that dyslexic students
can use the product. We also provide Wizard books in braille and have platforms that are
prepared to deliver content for people with disabilities.

• How can this contract be an opportunity for your organisation to contribute on the
improvement of educational or any other conditions of underage students in
vulnerable communities?

We believe it’s our responsibility to help people overcome barriers to learning. Whether they’re
held back by health challenges or facing socio-economic hurdles – we focus on under-
represented groups including women, minorities, low-income families, and people with disabilities.
These are important social issues for us, and we know that many people within these groups
would benefit from additional learning and employability skills.

The best way for us to leave a lasting positive impact is to leverage the strengths of our
organization to help underserved people access opportunities to learn. We have many
products and services that help people access quality education and develop essential skills for
employment.



We partner with governments (local, state, federal, national) to ensure learners have access to
high-quality instruction, materials and assessments linked to beneficial outcomes, including
building workforce skills. Engagement with statutory bodies such as listing authorities and financial
regulators is key to doing business as a listed global company. We are committed to building
strong relationships with political and educational leaders at every level across the world. We do
not make policy, rather we share best practices and inform the policymaking process through our
knowledge gained from our expertise as the world’s learning company.

Technical requirements, methodology and approach of the proposal: – [35%]


ID % Requirement

TR01 [35%] Please detail one or more of the following:


1. Test features
2. Technical requirements for student presentation of the test on mobile
devices, tablets and/or computers
3. Action plan to ensure 100% testing
4. Work schedule
5. Methodology for the delivery of licenses / pins to access the tests
6. Methodology for data analysis
7. Team

Supplier Response:

1. Test Features

Introduction
Level Test (Entry Test): An overview
The Pearson English Level Test is an online adaptive test of English language proficiency
developed to provide teachers with information that will help them stream students and
place them in the right class.

The test measures a student’s English language competency and reports a CEFR score
from pre-A1 to C2, together with a GSE score range between 10 and 90. It also presents a
high-level view of a student’s performance across multiple skills (Reading, Writing,
Listening, Speaking). There are two versions of the test – one that tests writing, reading,
and listening (together with the enabling skills Grammar and Vocabulary) and another that
tests the same skills plus Speaking.

The score report provides the overall test result, with high-level skill ratings and
performance summaries. Reporting is designed for teachers to place prospective students
into classes, and results can be reported at both the group and individual level.

The test is delivered through the proficiency assessment portal, Pearson English Test Hub,
which also stores and displays the results of the test.

The purpose of the test


The test provides information to a teacher about the level of general English proficiency of
the prospective student so that they can be placed in the best level to start their course. It
also provides a skills profile, highlighting the student’s strengths and weaknesses.

Who is it for?
The test is designed for students who are 14 or older. It is not Junior or Primary focussed or
designed to assess English for specific purposes (e.g., business). It can be used before any
adult or upper secondary course.



Why take an integrated skills test?
Several of the questions on the test are integrated skills questions. These questions test more
than one skill at the same time.

Using integrated skills questions means that Level Test is a better test of a learner’s English. In
real life and in the classroom, learners use more than one skill to complete communicative
tasks. To order something in a restaurant we need to listen and speak, to take notes in a
classroom we need to listen and write. Integrated skills questions test how well learners can use
the skills they have learnt and practised in the classroom and used in real life.

Test design
Level Test is a computer-based test normally taken on school premises; however, it can be
taken at home if the test taker has access to a computer with headset and microphone (if
taking the 4 skills version). Each administration takes 20-30 minutes, and the results are
available within minutes.
Level Test employs an adaptive method to select questions to present to the student. The
test uses an adaptive algorithm which takes a student’s answers to a previous question to
select the most suitable question to present next. It selects these items from a large item
bank making each student’s experience different. The adaptive nature of the test allows
Level Test to estimate a student’s English proficiency quickly and accurately.
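To make the adaptive mechanism concrete, the following is a minimal, hypothetical sketch of such a selection loop, assuming a Rasch-style difficulty scale on which the item closest to the current ability estimate is the most informative. The function names, starting estimate, step size, and update rule are illustrative assumptions, not Pearson's actual algorithm.

```python
def pick_next_item(theta, bank):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate (the most informative item under a Rasch model)."""
    return min(bank, key=lambda difficulty: abs(difficulty - theta))

def run_adaptive_test(answer, bank, n_items=10, step=0.5):
    """Administer n_items adaptively: raise the ability estimate after a
    correct answer, lower it after an incorrect one."""
    theta = 0.0                      # start from an average-ability prior
    for _ in range(n_items):
        item = pick_next_item(theta, bank)
        bank.remove(item)            # each student sees an item at most once
        theta += step if answer(item) else -step
    return theta

# Simulated learner whose true ability is 1.0 on this scale: they answer
# correctly whenever the item is at or below their level.
bank = [x / 4 for x in range(-12, 13)]   # difficulties from -3.0 to 3.0
estimate = run_adaptive_test(lambda d: d <= 1.0, bank)
```

In this toy run the estimate homes in on the simulated learner's level after only ten items, which illustrates why an adaptive test can be both short and precise.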

The content of the test is very closely linked to the Pearson GSE scale, which guides
content production at every level. To aid with this linking, each question is tagged with a
GSE score and a GSE can-do statement.

Test development
The questions in Level Test are based on those developed for the Benchmark Test and
have been developed by international teams of experienced writers. Teams were based in
the UK, Australia, the USA, and Hong Kong.

Once written, all questions are reviewed by the teams in the different countries. Comments
and suggestions for improvement are stored with the test questions on a secure database.
The questions then go through a further review by an expert panel and decisions are made
on the quality of the questions, which to keep and which to reject. All questions are then
thoroughly checked by Pearson staff and images and high-quality recordings are added to
complete the questions before they go forward to be calibrated in a large-scale field test.

After the field testing, further checks are made on item quality based on the measurement
characteristics of the questions. Questions are eliminated from the item pool if they are too
easy or too difficult, if weaker learners get them right but stronger learners get them wrong,
or if they show any bias. These checks then result in a bank of the best quality questions.
Questions are selected from this bank to go into the final tests.
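The screening rules above can be sketched as a simple filter. The statistic names (classical p-value for difficulty, point-biserial correlation for discrimination, a DIF flag for bias) and all thresholds are assumptions for illustration, not Pearson's published criteria.

```python
def keep_item(p_value, point_biserial, dif_flagged,
              hardest=0.2, easiest=0.9, min_discrimination=0.15):
    """Return True if a field-tested question survives quality screening."""
    if not (hardest <= p_value <= easiest):
        return False    # too easy or too difficult
    if point_biserial < min_discrimination:
        return False    # weaker learners outperform stronger ones
    if dif_flagged:
        return False    # shows bias (differential item functioning)
    return True
```

Items passing all three checks would form the operational bank from which live tests are assembled.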

Automated scoring validation process


From the field test data, 300 candidates were randomly selected as the validation data set.
A validation data set is a group of candidates whose data are segregated out prior to
statistical analysis to independently test how well automated scoring models work, once
they are complete. Additionally, these candidates’ data were not included in the
psychometric item calibration, or in the scaling onto the GSE. If the test scores for these
candidates as calculated by both automated and human scoring models are highly
correlated, this provides evidence that the automated scoring models will work as expected
for other new candidates in the operational setting.

Once the automated scoring system was developed, the responses from the validation set
were run through the same psychometric model to produce overall and sub-skill scores for
each candidate. Those human and machine scores were then correlated to compare how
similar those two kinds of scores are for each person. When candidates were identified as
having extreme scores (i.e., well outside the reported score range of the GSE and not well
estimated), or when they had fewer than five responses which were able to be scored in a
particular skill area, their scores were excluded from the analyses. This reduced the number



of candidates for the Overall score correlation to 288. The results of the analysis show there
is a very strong relationship between machine and human Overall scores, with a correlation
of .97.
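The validation analysis described above can be sketched as follows: apply the exclusion rules, then correlate human and machine Overall scores. The record field names and the use of the 10-90 GSE range as the extreme-score check are assumptions for illustration.

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

def validate_scoring(candidates, lo=10, hi=90, min_scored=5):
    """Drop candidates with out-of-range scores or fewer than min_scored
    scorable responses, then correlate human vs. machine scores."""
    kept = [c for c in candidates
            if lo <= c["human"] <= hi and lo <= c["machine"] <= hi
            and c["n_scored"] >= min_scored]
    r = pearson_r([c["human"] for c in kept],
                  [c["machine"] for c in kept])
    return len(kept), r
```

A correlation near 1.0 on the held-out validation set is the evidence cited above that automated scoring will behave like human scoring for new candidates.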

Alignment to the Common European Framework of Reference (CEFR)


Level Test is aligned to the Common European Framework of Reference (CEFR) levels in
several ways:
• Question writers are experienced teachers and authors who were also trained on
the CEFR levels.
• Questions were written to specific CEFR levels using ‘can do’ statements taken
from the CEFR documents.
• Many learners who took the Field Test were studying at levels which had been
previously linked to CEF levels in a comprehensive academic study.
• Learners’ responses in the Field Test were independently rated by examiners
trained on the CEFR levels to check they were at the right level of proficiency.

In the following table, we define how the Global Scale of English is related to the CEFR
levels. To give an impression of what the levels mean, i.e., what learners at particular levels
can do, we use the summary descriptors published in the CEFR (Council of Europe, 2001,
p. 24) where they exist.
Global Scale of English, CEFR band, and description:

GSE 10–21 (<A1): This level of proficiency is likened to a tourist who may know some individual words but does not have enough control of language to produce full sentences and mostly communicates with words or very basic phrases. The words they do know may carry a lot of communicative meaning or be effective when used with hand gestures or when the context is very clear (e.g., pointing to an object in a shop).

GSE 22–29 (A1): Can understand and use familiar everyday expressions and very basic phrases aimed at the satisfaction of needs of a concrete type. Can introduce him/herself and others and can ask and answer questions about personal details such as where he/she lives, people he/she knows and things he/she has. Can interact in a simple way provided the other person talks slowly and clearly and is prepared to help.

GSE 30–35 and 36–42 (A2 and A2+): Can understand sentences and frequently used expressions related to areas of most immediate relevance (e.g., very basic personal and family information, shopping, local geography, employment). Can communicate in simple and routine tasks requiring a simple and direct exchange of information on familiar and routine matters. Can describe in simple terms aspects of his/her background, immediate environment and matters in areas of immediate need.

GSE 43–50 and 51–58 (B1 and B1+): Can understand the main points of clear standard input on familiar matters regularly encountered in work, school, leisure, etc. Can deal with most situations likely to arise whilst travelling in an area where the language is spoken. Can produce simple connected text on topics which are familiar or of personal interest. Can describe experiences and events, dreams, hopes, and ambitions and briefly give reasons and explanations for opinions and plans.

GSE 59–66 and 67–75 (B2 and B2+): Can understand the main ideas of complex text on both concrete and abstract topics, including technical discussions in his/her field of specialisation. Can interact with a degree of fluency and spontaneity that makes regular interaction with native speakers quite possible without strain for either party. Can produce clear, detailed text on a wide range of subjects and explain a viewpoint on a topical issue giving the advantages and disadvantages of various options.

GSE 76–84 (C1): Can understand a wide range of demanding, longer texts, and recognise implicit meaning. Can express him/herself fluently and spontaneously without much obvious searching for expressions. Can use language flexibly and effectively for social, academic, and professional purposes.
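The GSE-to-CEFR alignment can be sketched as a lookup. The boundaries follow the alignment tables in this document (using the 43–50 / 51–58 split for B1 and B1+ given later in the Benchmark section); the C2 band (85–90) is an assumption added to complete the 10–90 scale, since the table stops at C1.

```python
# Band boundaries per the document's GSE/CEFR alignment tables; the C2
# entry (85-90) is an assumed completion of the 10-90 scale.
GSE_BANDS = [
    (10, 21, "<A1"), (22, 29, "A1"),
    (30, 35, "A2"),  (36, 42, "A2+"),
    (43, 50, "B1"),  (51, 58, "B1+"),
    (59, 66, "B2"),  (67, 75, "B2+"),
    (76, 84, "C1"),  (85, 90, "C2"),
]

def cefr_band(gse_score):
    """Map a GSE score (10-90) to its CEFR band label."""
    for low, high, band in GSE_BANDS:
        if low <= gse_score <= high:
            return band
    raise ValueError(f"GSE score {gse_score} is outside the 10-90 scale")
```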
Test Coverage
The 3-skill test covers three skills, Reading, Listening and Writing, as well as knowledge of
grammar and vocabulary. The 4-skill test additionally tests speaking skills after the adaptive
section of the test.

Skill or knowledge tested, and test focus:

Reading: To demonstrate reading skills, learners will be asked to:
• read and understand the main points from signs, newspapers, and magazines
• understand the detail of short texts
• understand the detail in longer texts

Listening: To demonstrate listening skills, learners will be asked to:
• listen for specific information in listening texts
• follow and understand short texts and show understanding by writing down what was said

Writing: To demonstrate writing skills, learners will be asked to:
• write accurately what they hear using correct sentence structure, word order and connectors

Speaking (4-skill test only): To demonstrate speaking skills, learners will be asked to:
• speak clearly using appropriate stress and intonation
• pronounce words so that they can be understood
• describe pictures or other visual material, connecting ideas together accurately and with a range of language

Grammar: To demonstrate knowledge of grammar, learners will be asked to:
• choose the right word or phrase to make an accurate sentence
• understand the difference between different grammatical tenses and other structures

Vocabulary: To demonstrate knowledge of vocabulary, learners will be asked to:
• produce words which relate to common themes and topics such as family, work, and social situations
• use appropriate words in different contexts
• show an understanding of the different meanings of words and how they relate to other words

Test Questions
What kinds of questions are in the test and what do they measure?
The test has several different question types. This gives learners a chance to demonstrate
their English skills in different ways. There are questions where learners choose the correct
option or where they write the answer to an open question. There are questions where the
learner repeats or copies what has been said as well as questions where learners describe
something.
Because Level Test is adaptive, different students will see different questions and may not
be presented with all the question types described below.

The 3-skill version of the test takes 20 minutes to complete, and the 4-skill version takes
30 minutes.

The item types, what learners have to do, and what each question measures are summarised below. Speaking item types appear in the 4-skill test only.

Fill in the table (Vocabulary)
Task: The learner completes a set of vocabulary items with appropriate words. The words are presented as a table of related words.
Tests: The vocabulary knowledge of the learner: the words the learner knows and the accuracy of the form of the word. It tests the learner's knowledge of word families and related sets of words that they may have met in the classroom or when learning English.

Choose the right word or phrase (Vocabulary)
Task: The learner chooses the correct word to complete several sentences. The sentences are related by a similar theme.
Tests: The vocabulary knowledge of the learner in a written context: the vocabulary the learner knows and whether they can understand the use of the vocabulary in the context of a sentence. It tests the range of vocabulary the learner knows.

Complete the Dialogue (Vocabulary)
Task: The learner selects words from a word bank to complete a dialogue.
Tests: The vocabulary of the learner in a spoken context: the vocabulary the learner knows and whether they can understand the use of the vocabulary in the context of a conversation. It tests the range of vocabulary the learner knows.

Choose the right word or phrase (Grammar)
Task: The learner chooses the correct word or phrase to complete several sentences. The sentences are related by a similar theme.
Tests: The knowledge of grammar of the learner: the range of grammatical knowledge as well as the accuracy of grammar in a written context.

Multiple Answer Multiple Choice (Grammar)
Task: The learner chooses the correct word to complete several sentences. The sentences are related by a similar theme.
Tests: The knowledge of grammar of the learner in a written context, and whether they can choose the right grammatical form in a sentence.

Drag and drop (Grammar)
Task: The learner re-orders a sentence correctly.
Tests: The grammatical knowledge of the learner at sentence level: word order, connectors, and discourse markers, in a written context.

Error correction (Grammar)
Task: The learner selects one of the available options to correct the mistake in the sentence.
Tests: Knowledge of grammatical rules in use.

Choose the right word or phrase, gap fill (Reading)
Task: The learner reads a short text and selects the best word or phrase to complete the text.
Tests: The global understanding of short messages, notes and short pieces of writing.

Graphical multiple choice (Reading)
Task: The learner reads a short text and selects the best picture to match with the text.
Tests: The global understanding of short messages, notes and short pieces of writing.

Short answer questions (Reading)
Task: The learner reads a longer text and answers questions on the text.
Tests: The reading comprehension of the learner, including specific information included in the text.

Listen and write, Dictation (Listening, Writing)
Task: The learner listens to a sentence or short text and writes what they have heard.
Tests: Listening comprehension at the word and sentence level, and the ability to write accurately and understand sentence structure, word order and connectors.

Listen and read, Hotspots (Listening, Reading)
Task: The learner reads a text and at the same time listens to the text, then finds the differences between the written text and the spoken text.
Tests: Reading and listening comprehension, and the ability to recognise individual words in a text.

Read aloud (Speaking, Reading)
Task: The learner reads aloud a sentence or short text.
Tests: Accurate pronunciation and how fluent the learner is at speaking, and whether the words in the text are understood and repeated accurately.

Listen and repeat (Speaking, Listening)
Task: The learner listens to a sentence or short text and then repeats it.
Tests: Listening comprehension at the word and sentence level, pronunciation and fluency, and whether the words heard are understood and repeated accurately.

Describe a picture (Speaking)
Task: The learner looks at a photograph or picture and describes what they see.
Tests: The learner's ability to speak in an extended way, linking concepts and ideas. It tests the accuracy of speech, including accurate grammar, pronunciation, and stress, as well as the fluency of the speech, and the use of appropriate words to describe the photograph or picture.

Listen to a conversation (Listening, Speaking)
Task: The learner listens to a short conversation and then answers a question about the conversation.
Tests: Listening comprehension and its accuracy.

Story retell (Speaking)
Task: The learner listens to a short narrative and then retells the narrative using their own words.
Tests: Listening and speaking; it assesses understanding of a short narrative.

Reporting
The test reports half CEFR bands from <A1 to C2 with the corresponding GSE range. Sub-skills
are reported using a short performance summary and a 3-step rating system (above, at, or
below level) to provide a high-level view of performance and indicate stronger or weaker skill
areas. This skills profile can be used by teachers to help tailor the course content and focus
on the needs of their students.

Skill scores (listening, reading, speaking, writing) are based on test items that assess those
skills, either as single-skill or integrated-skill tasks.
The Level Test result can also be used to assign the appropriate Benchmark Test once students
have been assigned to a class.

Level Test results can be provided at the individual or group level, depending on the needs
of the school and teacher.
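The 3-step sub-skill rating described above can be sketched as a comparison between a skill score and the GSE range of the learner's overall band. The exact comparison rule Pearson uses is not specified here, so this is an illustrative assumption.

```python
def skill_rating(skill_gse, band_low, band_high):
    """Rate a sub-skill as above, at, or below the learner's overall level,
    given the GSE range of the learner's overall CEFR band."""
    if skill_gse < band_low:
        return "below level"
    if skill_gse > band_high:
        return "above level"
    return "at level"
```

For example, a learner whose overall result falls in the B1 range (GSE 43–50) with a listening score of 55 would see listening flagged as a relative strength.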

The following is an Individual Sample Score Report:

Benchmark Test (Exit Test): An overview

The Pearson English Benchmark Test is a course-agnostic online assessment of English
language proficiency for individuals and groups of students. It can be used as a one-off
test to assess Reading, Writing, Speaking and Listening skills, or to measure progress over
time. The test measures a student’s English language competency and reports a CEFR
score from pre-A1 to C2, together with a GSE score between 10 and 90.

The score report provides the overall test result, with skill scores, performance summaries,
and recommendations for future study. Results can be reported at both the group and
individual level. The assessment is delivered digitally through the Pearson English Test
Hub, which also stores and displays the results of the test.



The purpose of the test
The test is normally used in a learning context in conjunction with relevant materials,
courseware and other formative assessment tasks. It provides detailed information to a
teacher about individual students and groups of learners who are studying English. The
information provided includes:
• An overall Global Scale of English (GSE) score and CEFR band for each student
• A profile of sub-skill scores for a group showing areas of strengths or weaknesses
for the group
• A profile of sub-skill scores for each student which show the strengths and
weaknesses for each student
• Comparison of scores from one test administration to another so progress, or lack
of it, can be viewed. This is available at the group and individual level.
• Performance summaries, recommendations for further study, and links to Pearson
courseware and GSE Learning Objectives

This information allows the teacher to measure student progression and make decisions
about adapting learning material to suit the level of learners, as well as providing extension
activities where needed. It also allows the teacher to tailor the learning program to particular
learners, giving extra support and input where required.

Who is it for?
The test is designed for secondary and adult learners who are aged 14 or older. Benchmark
Test can be used alongside any adult or upper secondary course and is intended to be
used with comprehensive integrated skills courses, not short or partial courses.

Why take an integrated skills test?


Some of the questions Benchmark Test uses test a single skill such as speaking or writing.
When assessing these skills, we test traits such as pronunciation and fluency, the ability to
argue as well as written conventions along with grammar and vocabulary.

Several of the questions on the test are integrated skills questions. These questions test more
than one skill at the same time. Using integrated skills questions means that Benchmark
Test is a better test of a learner’s English. In real life and in the classroom, learners use
more than one skill to complete communicative tasks. To order something in a restaurant
we need to listen and speak; to take notes in a classroom we need to listen and write.
Integrated skills questions test how well learners can use the skills they have learnt and
practised in the classroom and used in real life.

Test Design
Benchmark Test is designed specifically to measure progress in language proficiency. The test
construct is based on actionable learner outcomes as embodied in the can-do statements of the
Common European Framework of Reference for Languages and the Global Scale of English
general learning objectives. It is built on the body of applied linguistics research of the last 50
years, which prioritises the ability to use language in context rather than just knowledge of the
language. To use language effectively, it is assumed that learners require certain knowledge of
the systems of language, such as grammar, vocabulary, and phonemic systems.

The test explicitly measures language as a unitary trait. It also provides a breakdown of
sub-scores for the convenience of users to provide some insights into the relative strengths
and weaknesses of students.

The test suite contains 4 tests: Test A, Test B1, Test B2 and Test C. Test A assesses at
CEFR A1 and A2, and Test C at C1 and C2. The tests use fixed, linear forms. The total time
of the test is approximately 45 minutes, although this may vary slightly according to level.
Each test has three parts:
• Part 1 - Part 1 tests Grammar, Vocabulary and Reading. There are 8 item types in
this section.
• Part 2 - Part 2 assesses Speaking and Listening. There are 7 item types in this
section.
• Part 3 - Part 3 assesses Writing. There are two item types in this section.



Test items are primarily integrated skills and scored on the GSE scale. Speaking and
Writing items utilise automated scoring technologies, described below.

Global Scale of English and the Common European Framework levels


In the following tables we define how the Global Scale of English is related to the CEFR
levels. To give an impression of what the levels mean, i.e., what learners at particular levels
can do, we use the summary descriptors published in the CEFR (Council of Europe, 2001,
p. 24) where provided.

GSE 10–21
Global assessment: The range on the Global Scale of English from 10 to 21 corresponds to the pre-A1 level of the CEFR. There are no Global descriptors for pre-A1, but abilities at this level can be summarised as follows:
This level of proficiency is likened to a tourist who may know some individual words but does not have enough control of language to produce full sentences and mostly communicates with words or very basic phrases. The words they do know may carry a lot of communicative meaning or be effective when used with hand gestures or when the context is very clear (e.g., pointing to an object in a shop).

GSE 22–29
Global assessment: The range on the Global Scale of English from 22 to 29 corresponds to the A1 level of the CEFR. The capabilities of learners at Level A1 have been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand and use familiar everyday expressions and very basic phrases aimed at the satisfaction of needs of a concrete type. Can introduce him/herself and others and can ask and answer questions about personal details such as where he/she lives, people he/she knows and things he/she has. Can interact in a simple way provided the other person talks slowly and clearly and is prepared to help.

GSE 30–35 and 36–42
Global assessment: The interval on the Global Scale of English from 30 to 35 corresponds to the lower part of the A2 level of the CEFR, while the interval from 36 to 42 corresponds to the upper part of the A2 level, which is also sometimes referred to as the A2+ level. The capabilities of learners at Level A2 have been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand sentences and frequently used expressions related to areas of most immediate relevance (e.g., very basic personal and family information, shopping, local geography, employment). Can communicate in simple and routine tasks requiring a simple and direct exchange of information on familiar and routine matters. Can describe in simple terms aspects of his/her background, immediate environment and matters in areas of immediate need.

GSE 43–50 and 51–58
Global assessment: The interval on the Global Scale of English from 43 to 50 corresponds to the lower part of the B1 level of the CEFR, while the interval from 51 to 58 corresponds to the upper part of the B1 level, which is also sometimes referred to as the B1+ level. The capabilities of learners at Level B1 have been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand the main points of clear standard input on familiar matters regularly encountered in work, school, leisure, etc. Can deal with most situations likely to arise whilst travelling in an area where the language is spoken. Can produce simple connected text on topics which are familiar or of personal interest. Can describe experiences and events, dreams, hopes, and ambitions and briefly give reasons and explanations for opinions and plans.

GSE 59–66 and 67–75
Global assessment: The interval on the Global Scale of English from 59 to 66 corresponds to the lower part of the B2 level of the CEFR, while the interval from 67 to 75 corresponds to the upper part of the B2 level, which is also sometimes referred to as the B2+ level. The capabilities of learners at Level B2 have been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand the main ideas of complex text on both concrete and abstract topics, including technical discussions in his/her field of specialisation. Can interact with a degree of fluency and spontaneity that makes regular interaction with native speakers quite possible without strain for either party. Can produce clear, detailed text on a wide range of subjects and explain a viewpoint on a topical issue giving the advantages and disadvantages of various options.

GSE 76–84
Global assessment: The interval on the Global Scale of English from 76 to 84 corresponds to the C1 level of the CEFR. The capabilities of learners at Level C1 have been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand a wide range of demanding, longer texts, and recognise implicit meaning. Can express him/herself fluently and spontaneously without much obvious searching for expressions. Can use language flexibly and effectively for social, academic, and professional purposes. Can produce clear, well-structured, detailed text on complex subjects, showing controlled use of organisational patterns, connectors, and cohesive devices.

GSE 85–90
Global assessment: The interval on the Global Scale of English from 85 to 90 corresponds to the C2 level of the CEFR. This is a very high level of attainment which has been summarised in the CEFR (Council of Europe, 2001, Table 1, p. 24) as follows:
Can understand with ease virtually everything heard or read. Can summarise information from different spoken and written sources, reconstructing arguments and accounts in a coherent presentation. Can express him/herself spontaneously, very fluently and precisely, differentiating finer shades of meaning even in more complex situations.
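For orientation, the band boundaries above can be expressed as a simple lookup. The sketch below is illustrative only; the function name is our own, and the "+" labels are shorthand for the upper sub-ranges described in this section.

```python
def gse_to_cefr(score: int) -> str:
    """Map a GSE score (10-90) to a CEFR band, using the ranges above."""
    bands = [
        (10, 21, "pre-A1"),  # no Global descriptors at this level
        (22, 29, "A1"),
        (30, 35, "A2"),      # lower part of A2
        (36, 42, "A2+"),     # upper part of A2
        (43, 50, "B1"),
        (51, 58, "B1+"),
        (59, 66, "B2"),
        (67, 75, "B2+"),
        (76, 84, "C1"),
        (85, 90, "C2"),
    ]
    for low, high, label in bands:
        if low <= score <= high:
            return label
    raise ValueError(f"GSE score {score} is outside the reported range 10-90")
```

For example, a reported score of 45 falls in the lower part of B1 under this mapping.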

Test Development
The questions in the Benchmark Test have been developed by international teams of writers
who are very experienced in writing assessment questions. Teams are based in the UK,
Australia, the USA, and Hong Kong. All questions have been tagged with a Global Scale of
English (GSE) level and linked to a ‘can do’ statement.

Once written, all questions are reviewed by the teams in the different countries. Comments
and suggestions for improvement are stored with the test questions on a secure database.
The questions then go through a further review by an expert panel and decisions are made
on the quality of the questions, which to keep and which to reject. All questions are then
thoroughly checked by Pearson staff and images and high-quality recordings are added to
complete the questions before they go forward to be calibrated in a large-scale field test.
After the field testing, further checks are made on item quality based on the measurement
characteristics of the questions. Questions are eliminated from the item pool if they are too
easy or too difficult, if weaker learners get them right but stronger learners get them wrong,
or if they show any bias. These checks then result in a bank of the best quality questions.
Questions are selected from this bank to go into the final tests.
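The screening criteria just described (items that are too easy or too hard, or that weaker learners answer correctly more often than stronger ones) can be illustrated with classical item statistics. This is a hedged sketch with hypothetical thresholds, not Pearson's actual calibration procedure; bias (differential item functioning) analysis is out of scope here.

```python
from statistics import mean

def _pointbiserial(x, y):
    """Correlation between item scores (0/1) and rest-of-test totals."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def screen_items(responses, easy=0.95, hard=0.05, min_discrimination=0.2):
    """Return indices of items to keep after basic quality screening.

    responses: list of per-candidate lists of 0/1 item scores, with all
    candidates answering the same items (as on a linked field-test form).
    """
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    keep = []
    for i in range(n_items):
        item = [r[i] for r in responses]
        p = mean(item)  # difficulty: proportion of candidates answering correctly
        # discrimination: do stronger candidates get the item right more often?
        rest = [t - item[j] for j, t in enumerate(totals)]
        disc = _pointbiserial(item, rest)
        if hard <= p <= easy and disc >= min_discrimination:
            keep.append(i)
    return keep
```

An item everyone answers correctly (or incorrectly) carries no information and is dropped, as is an item whose scores do not rise with overall ability.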

Field Testing
As part of the test development process, a large field test, conducted in two phases, was
carried out to ascertain the appropriateness of the pool of items and to serve as a source
for constructing individual test forms which would allow reliable predictions of students’
ability in English. A portion of the collected data was transcribed and rated; this material
was used to train the automated scoring systems.
Field test forms were created using a linking approach. That is, the forms were linked
together with sets of items that appeared on all forms. Also, during the second phase of
data collection, since most candidates took two tests, the field test forms were also linked
through candidates. Learners and L1 English speakers were recruited to participate in the
field test. A total of 13,073 tests were submitted during the two field test phases. The
demographic for Benchmark is upper secondary and young adult. Most participants were
aged 16 to 35. Participants were from 96 countries. The countries with the largest number
of participants included Saudi Arabia, Poland, Panama, Ecuador, The Netherlands,
Argentina, Brazil, Spain, Guatemala, Japan, and Thailand. As an incentive to participate,
students received a year’s free access to the Longman Dictionary of Contemporary English
Online (LDOCE). L1 English speakers were offered an Amazon voucher.

Validity Evidence

Test Reliability
Reliability is one aspect of validity: if a candidate took a test on multiple occasions, would that
person get a similar score each time? During field testing, many candidates took two tests in a
short period of time. The two tests were made up of different items. Presumably, little or no
learning occurred between these test administrations, so the correlation of the scores from these
two tests should provide a good estimate of test reliability, known as test-retest reliability. The
higher the observed correlation between the two test administrations, the more reliable the test
scores are. In the observed field test data, after removing test data from candidates who either
did not answer a sufficient number of items, or who got extreme scores outside of the
normal GSE range, the test-retest correlation was .861 (n=2,141). This observed correlation
demonstrates a high level of consistency of measurement across Benchmark test
administrations.
The psychometric analysis tool Winsteps also yielded a second estimate of test reliability as
part of item calibration: 0.90 (n=11,908). Taken together, these two estimates indicate that
test reliability is high.

Automated scoring validation process


From the field test data, 300 candidates were randomly selected as the validation data set.
A validation data set is a group of candidates whose data are segregated out prior to
psychometric analysis to independently test how well automated scoring models work, once
they are complete. Additionally, these candidates’ data were not included in the
psychometric item calibration, or in the scaling onto the GSE. If the test scores for these
candidates as calculated by both automated and human scoring models are highly
correlated, this provides evidence that the automated scoring models will work as expected
for other new candidates in the operational setting.

Once the automated scoring system was developed, the responses from the validation set were
run through the same psychometric model to produce an Overall and six skill scores for each
candidate. Those human and machine scores were then correlated to compare how similar
those two kinds of scores are for each person. When candidates were identified as having
extreme scores (i.e., well outside the reported score range of the GSE and not well estimated),
or when they had fewer than five responses which were able to be scored in a skill area, their
scores were excluded from the analyses. This reduced the n-count for the Overall score
correlation to 288 candidates. The relationship between machine and human Overall scores was
found to be a very strong one with a correlation of .97 (see Table 1).

Table 1. Correlations between scores using machine and human scoring methods for
Overall and skill reporting areas.

Score Type Correlation

Overall .97
Listening .93
Speaking .83
Reading .90
Writing .99
Grammar .97
Vocabulary .93

For many item types, machine scoring produces scores that are nearly identical to those
produced by a careful human rating process (see Figure 1).

Figure 1. Scatter plot of GSE scaled scores for validation set candidates using human and
machine scoring methods.
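The validation workflow described above reduces to: hold out candidates, score them with both methods, exclude unreliable cases, and correlate. A hedged sketch with hypothetical field names:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

def validate_scoring(candidates, min_responses=5, lo=10, hi=90):
    """Correlate machine and human scores for a held-out validation set.

    candidates: dicts like {"human": 55.0, "machine": 56.2, "n_scorable": 12}
    (field names are illustrative). Candidates with scores outside the
    reported GSE range, or with fewer than five scorable responses, are
    excluded, mirroring the screening described in the text.
    """
    kept = [c for c in candidates
            if c["n_scorable"] >= min_responses
            and lo <= c["human"] <= hi and lo <= c["machine"] <= hi]
    humans = [c["human"] for c in kept]
    machines = [c["machine"] for c in kept]
    return len(kept), pearson_r(humans, machines)
```

A high correlation on held-out candidates is the evidence that the scoring models generalise to new test-takers in operational use.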

Test reporting
A key feature of the Benchmark Test is the score report, specifically the performance
summaries, recommendations, GSE Learning objectives, and relevant courseware
activities. These provide more detailed feedback for teachers and students and help advise
future learning.

Using a Benchmark Test assessment map based on Pearson assessment frameworks, item
types and content, the score report provides test-specific performance summaries for each
skill at different CEFR levels. The performance summaries link to student and teacher
recommendations as well as GSE Learning Objectives, which state what students need to
do to improve, i.e., reach the next CEFR band. Recommendations are course agnostic, but
a range of Pearson courseware titles are available to provide links to specific activities.

The performance summaries, recommendations and recommended activities form part of the
test reporting, together with overall and skill scores. Enabling skills (Vocabulary, Grammar) are
reported on in the performance summaries and recommendations where relevant but are not
given a specific GSE score. Reporting is provided at the individual and group level.
The test and reporting are course agnostic, but specific courseware activities related to the
recommendations are provided for selected ELT courseware.

Test Questions
What kinds of questions are in the test and what do they measure?
The test has several different question types. This gives learners a chance to demonstrate
their English skills in different ways. There are questions where learners choose the correct
option or where they write the answer into an open question. There are questions where the
learner repeats or copies what has been said as well as questions where learners describe
something or write a short essay. The questions are similar to the questions and tasks
learners will have done in the classroom as part of their learning and so should be familiar.
Not all question types appear at all levels.

Vocabulary questions
There are three vocabulary question types. Vocabulary is also tested as part of Describe
Image, Short Essay and Recall a Passage, which are Integrated Skills questions.

Fill in the Table
What do the learners have to do? This question asks the learner to complete a set of vocabulary items with appropriate words. The words are presented as a table of related words.
What is being tested? This question tests the vocabulary knowledge of the learner. It tests the words the learner knows and the accuracy of the form of the word. It tests the learner’s knowledge of word families and related sets of words that they may have met in the classroom or when learning English.

Choose the Right Word or Phrase
What do the learners have to do? This question asks the learner to choose the correct word to complete several sentences. The sentences are related by a similar theme.
What is being tested? This question tests the vocabulary knowledge of the learner in a written context. It tests the vocabulary the learner knows and whether they can understand the use of the vocabulary in the context of a sentence. It tests the range of vocabulary the learner knows.

Complete the Dialogue
What do the learners have to do? This question asks the learner to select words from a word bank to complete a dialogue.
What is being tested? This question tests the vocabulary of the learner in a spoken context. It tests the vocabulary the learner knows and whether they can understand the use of the vocabulary in the context of a conversation. It tests the range of vocabulary the learner knows.

Grammar questions
There are two grammar question types. Grammar is also tested as part of Short Essay and
Recall a Passage.

Choose the Right Word or Phrase
What do the learners have to do? This question asks the learner to choose the correct word or phrase to complete several sentences. The sentences are related by a similar theme.
What is being tested? This question tests the learner’s knowledge of grammar. It tests the range of grammatical knowledge as well as the accuracy of grammar in a written context.

Correct the Mistake
What do the learners have to do? This question asks the learner to choose the correct word or phrase to replace a mistake in a sentence.
What is being tested? This question tests the learner’s knowledge of grammar in a written context and whether they can choose the right grammatical form in a sentence.

Reading questions
There are three reading question types. Reading is also tested as part of Read Aloud,
Recall a Passage, and Listen and Read (Hotspots), which are all Integrated Skills questions.

Choose the Right Picture
What do the learners have to do? This question asks learners to read a short text and select the best picture to match with the text.
What is being tested? This question tests the global understanding of short messages, notes and short pieces of writing.

Short Answer Questions
What do the learners have to do? This question asks the learner to read a longer text and answer questions on the text.
What is being tested? This question tests the reading comprehension of the learner. It tests specific information included in the text.

Choose the Right Word or Phrase (gap fill)
What do the learners have to do? This question asks learners to read a short text and select the best word or phrase to complete the text.
What is being tested? This question tests the global understanding of short messages, notes and short pieces of writing.

Listening questions
Listening is tested as part of Listen and Repeat, Story Retell, listen to the conversation,
Listen and Write (Dictation), and Listen and Read (Hotspots) which are all Integrated Skills
questions.

Speaking questions
There is one speaking question type. Speaking is also tested as part of Read Aloud, Listen
and Repeat, Story Retell, Listen to the Conversation and Passage Comprehension, which
are Integrated Skills questions.
Describe Image
What do the learners have to do? This question asks the learner to look at a photograph or picture and describe what they see.
What is being tested? This question tests the learner’s ability to speak in an extended way, linking concepts and ideas. It tests the accuracy of speech, including accurate grammar, pronunciation, and stress, as well as the fluency of the speech. It tests the use of appropriate words to describe the photograph or picture.

Writing questions
There is one question type which tests only writing. Writing is also tested as part of Listen
and Write (Dictation) and Recall a Passage, which are Integrated Skills questions.

Short Essay
What do the learners have to do? This question asks the learner to write a short essay in response to a prompt. For lower levels, test takers need to write a short description of an image.
What is being tested? This question tests global writing skills. It tests paragraph and sentence structure, the range and accuracy of the language used, and the ability to structure an argument or discussion in a written context. It tests grammar and vocabulary as an essential part of writing.

Integrated skills questions


There are seven question types which measure more than one skill at the same time.
These are called Integrated Skills Questions.

Read Aloud
What do the learners have to do? This question asks the learner to read aloud a sentence or short text.
What is being tested? This question tests accurate pronunciation and how fluent the learner is at speaking. It tests if the words in the text are understood and read accurately.

Listen and then Write (Dictation)
What do the learners have to do? This question asks the learner to listen to a sentence or short text and write what they have heard.
What is being tested? This question tests listening comprehension at the word and sentence level. It tests the ability to write accurately and understand sentence structure, word order and connectors.

Listen and Repeat
What do the learners have to do? This question asks the learner to listen to a sentence or short text and then repeat it.
What is being tested? This question tests listening comprehension at the word and sentence level. It tests pronunciation and fluency. It tests if the words heard are understood and repeated accurately.

Read and then Write
What do the learners have to do? This question asks the learner to read a short story or short piece of factual text. The text then disappears, and the learner must reconstruct the text.
What is being tested? This question tests reading comprehension. It tests the ability to write accurately and understand sentence structure, word order and connectors.

Listen and Read (Hotspots)
What do the learners have to do? This question asks the learner to read a text and at the same time listen to the text. The learner must find the differences between the written text and the spoken text.
What is being tested? This question tests reading and listening comprehension. It tests the ability to recognise individual words in a text.

Story Retell
What do the learners have to do? This question asks the learner to listen to a short narrative and then retell the narrative using their own words.
What is being tested? This question tests listening and speaking. It assesses understanding of a short narrative.

Listen to the Conversation
What do the learners have to do? This question asks the learner to listen to a short conversation and then answer a question about the conversation.
What is being tested? This question tests listening comprehension. It tests the accuracy of the listening comprehension of the learner.

Question type and level


Most questions are used across all four levels of the test. However, some questions are more
appropriate for students at lower or higher levels of proficiency. The table below lists the
question types, the skills they cover, and the test part in which they appear.

Part 1
• Fill in the Table – Vocabulary
• Choose the Right Word or Phrase – Vocabulary
• Choose the Right Word or Phrase – Grammar
• Complete the Dialogue – Vocabulary
• Correct the Mistake – Grammar
• Choose the Right Picture – Reading
• Short Answer Questions – Reading
• Choose the Right Word or Phrase – Reading

Part 2
• Read Aloud – Speaking & Reading
• Listen and Repeat – Listening & Speaking
• Describe a Picture – Speaking
• Story Retell – Listening & Speaking
• Listen to the Conversation – Listening & Speaking
• Listen and Read (Hotspots) – Listening & Reading
• Listen and Write (Dictation) – Listening & Writing

Part 3
• Recall a Passage – Reading & Writing
• Short Writing Task (Essay or describe an image) – Writing

Sample test
Learners can take the unscored sample test to familiarise themselves with the question
types in the test. There is a sample test for every Benchmark level, as question types and
the level of difficulty vary. The sample test is approximately half the length of the full test
and contains examples of all the item types students will attempt in the scored version of
the test. Teachers should run through the sample test the first time students take a test at a
particular level.

Reporting

Group Report
The group report provides information about the set of students you have selected, including
overall performance and each skill. It is generated in the Overview tab on Test Hub. The
default view shows results for all test levels, unless selected otherwise.

The following points generally describe the group report features:

1. The report details (name of the institution, number of students)
2. A summary of the group’s overall performance
3. A date range of the group report
4. The group averages for each skill presented in a diagnostic chart
5. A list of test-takers’ names
6. The level of the test they took
7. Each student's overall score followed by the skill scores
8. A link to the individual report
9. An asterisked overall score means that an individual skill score was flagged as NS
or BL. NA as an overall score means that there were 2 or more BL/NS scores or a
combination of NS and BL scores.
10. Skill name and GSE score
11. Performance Summary
12. Recommended Activities
13. Suggested GSE Learning Objectives for the group

Individual Report
The individual report includes:

1. Test details (student’s name, test date, and institution's name)
2. A summary of the student’s overall performance
3. Their overall GSE score and CEFR band
4. A diagnostic chart showing the score per skill
5. Course mapping – instructors can select a Pearson product from the course dropdown to
see specific activities from the course in the recommended activities section
6. The GSE score for each skill, followed by a description of the student’s ability and
recommendations
7. Suggested GSE Learning Objectives (these describe what a student at a particular level
should be able to do across each skill and can be used as learning goals)
8. Advisory notices explaining why scores could not be provided, where applicable (see
below for more information)

Note:
An asterisked overall score means that an individual skill score was flagged as NS or BL.
NA as an overall score means that there were 2 or more BL/NS scores or a combination of
NS and BL scores.

Test Scores
The Benchmark Test assesses the test-taker's performance across each skill and gives
them a level on the GSE General scale. The score means they have demonstrated that
they can perform certain tasks at this level. The number is accompanied by some
information describing what they were able to do (Performance Summary) and areas for
them to work on (Recommended Activities).

In most cases students will receive numerical scores, but in certain circumstances you may see
different results as shown below. In each case advisory notices are provided in the report.

BL - Below level: the student scored below the GSE range of the test, so they have not
received a score for that skill.

NS - Not scorable: the responses could not be scored by the technology. This can occur
because students did not answer enough questions. It is also given if recorded spoken
answers could not be heard clearly, for example because of background noise, or because
the student mumbled, spoke too softly or loudly, or was speaking in a language other than
English. Using the sample test and following best practice will help avoid this. Advisory
notices are provided to aid understanding.

Overall score: This is the overall GSE score for the test-taker, based on their performance
in the test across all the skills.

Asterisked overall score: If test-takers receive a BL or NS for one of the skills, their overall
score is asterisked to highlight that one skill score was not reported and has not been
included in the overall score calculation.

NA - Not applicable: This only applies to an overall score. If test-takers receive two or more
BL/NS scores (or a combination of BL and NS), they will see NA instead of an overall score.
This is because it is inaccurate to calculate a score from a reduced number of skill scores.
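The asterisk/NA rules above amount to a small piece of logic: count how many skills were flagged BL or NS, then decide how to present the overall score. A hedged sketch; the aggregation here is a plain average purely for illustration, not the actual scoring model:

```python
def overall_report(skill_scores):
    """Apply the BL/NS reporting rules to a dict of skill scores.

    skill_scores maps skill name -> numeric GSE score, "BL", or "NS".
    Rules, as described in the text:
      * no flags          -> overall score reported as-is
      * one BL/NS flag    -> asterisked overall from the remaining skills
      * two or more flags -> "NA" (an overall score would be misleading)
    """
    flagged = [s for s in skill_scores.values() if s in ("BL", "NS")]
    numeric = [s for s in skill_scores.values() if isinstance(s, (int, float))]
    if len(flagged) >= 2:
        return "NA"
    overall = round(sum(numeric) / len(numeric))  # illustrative aggregation only
    return f"{overall}*" if flagged else str(overall)
```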

What are the Advisory Notices?


Skills may be flagged as Not Scorable or Below Level, and advisory notices are provided.
These are described below. Schools and students should follow best practice to avoid these flags.

Flag: NS (Not Scorable) – Speaking
Scenario: The voice was not clear due to environmental, technical, or behavioural issues.
Advisory notice: Environmental or behavioural factors meant we could not score responses. Factors include background noise, problems with headsets or computers, and answers that were too quiet or not intelligible.

Flag: NS (Not Scorable) – Reading, Writing, Listening, Speaking
Scenario: Too many items were skipped.
Advisory notice: Many questions were not answered, so a score cannot be given. Please encourage students to answer all questions.

Flag: BL (Below Level)
Scenario: Scores were below the level of the test range.
Advisory notice: The score was below the level of the test, so a score cannot be given.
If two or more scores are not given, the overall GSE score is not meaningful and is not
shown.

What do the Scores mean?
The GSE score the student receives (e.g., 20 in Listening) is based on their performance on
the selection of listening skills tested at this level and is therefore indicative of their
proficiency level in listening, but it is not an absolute score. Similarly, it does not mean they
can do every GSE Learning Objective for listening at 20 (or below). They stand a high
chance of being able to do tasks below this level and some chance of being able to do
things slightly above this level, but this is the average level they have demonstrated.

The Performance Summary report summarises the student’s performance relative to the
performance descriptors for each skill/proficiency area articulated in the GSE Assessment
Framework rather than on their performance on specific curriculum learning objectives.

Recommended Activities suggest ways to improve based on the skills tested and
strengths/weaknesses demonstrated. The specific activities referenced in Pearson
coursebooks, included in the report, relate to curriculum learning objectives which, with
additional study, should support further improvement in the areas of performance identified.

The Suggested GSE Learning Objectives relate to aspects that may not specifically have
been tested but relate to the skill/proficiency area at the same level so may also be
beneficial to work on.

The Benchmark Test assesses a representative cross-section of skills and proficiency traits
to identify average performance and general areas of strength/weakness. It is sufficiently
robust to reliably inform teaching and planning, but if instructors want absolute GSE levels
for their test-takers then they need to enter them for longer tests which have more items at
each level and test every sub-skill.

2. Technical requirements for student presentation of the test on mobile
devices, tablets and/or computers

System Requirements
To successfully use Test Hub, please ensure your devices meet the following
system requirements.
Note: Tests can only be taken on computer devices.

For PC-compatible computers:


• Operating System: Windows 7+
• Web Browser (in the latest version): Google Chrome, Mozilla Firefox, Microsoft
Edge (higher than V79)
For Macintosh computers
• Operating System: OS 10.13+
• Web Browser (in the latest version): Google Chrome, Mozilla Firefox, Microsoft
Edge (higher than V79)
Note: Due to issues with recordings, the Safari browser is not supported.

For all computers


• Processor: Intel Core™ Duo 2.0GHz or faster processor
• Screen resolution: 1280 px (we have responsive design that will adapt to
your screen resolution)
• Memory: 4 GB or higher RAM
• Cookies: Must be enabled
• A stable internet connection: Recommended speed of 10 Mbps and minimum 4G
mobile data access. Test your connection speed.

Bandwidth Requirements
The table below shows the recommended bandwidth when users located in the same room
are using the platform at the same time on a shared dedicated internet connection.
To avoid performance issues, check with your school's local Information Technology
(IT) support team if you are unsure of your available bandwidth.

Concurrent Users Upload Speed Download Speed


1 128 kbps (0.125 Mbps) 128 kbps (0.125 Mbps)
2 256 kbps (0.25 Mbps) 256 kbps (0.25 Mbps)
4 512 kbps (0.5 Mbps) 512 kbps (0.5 Mbps)
10 1024 kbps (1 Mbps) 1024 kbps (1 Mbps)
20 2048 kbps (2 Mbps) 2048 kbps (2 Mbps)
50 5130 kbps (5 Mbps) 5130 kbps (5 Mbps)
100 10260 kbps (10 Mbps) 10260 kbps (10 Mbps)
200 20520 kbps (20 Mbps) 20520 kbps (20 Mbps)
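The recommendations in the table scale roughly linearly with the number of concurrent users (about 128 kbps per user for small groups, about 103 kbps per user from 50 users upwards). A small lookup helper, assuming the table values above; rounding up to the next table row is our own conservative choice:

```python
# Concurrent users -> recommended upload/download speed in kbps,
# taken directly from the bandwidth table above.
BANDWIDTH_KBPS = {
    1: 128, 2: 256, 4: 512, 10: 1024,
    20: 2048, 50: 5130, 100: 10260, 200: 20520,
}

def recommended_kbps(users: int) -> int:
    """Return the recommended bandwidth for a user count by rounding up
    to the next row of the table (a conservative estimate)."""
    for n in sorted(BANDWIDTH_KBPS):
        if users <= n:
            return BANDWIDTH_KBPS[n]
    raise ValueError("More than 200 concurrent users: consult your IT support team")
```

For example, a room with 15 concurrent test-takers would fall under the 20-user row, i.e. 2048 kbps (2 Mbps) in each direction.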

Guidelines for Headsets


The headsets needed to take the speaking part of each test should follow certain
specifications to ensure the audio is clear.

If taking a test that assesses Speaking it is especially important to ensure the headset has a
microphone boom to reduce the impact of background noise such as other test-takers'
voices and air conditioning.

We recommend using headsets with the following specifications:


• Lightweight and durable design
• Clear audio through the headphones
• Clear playback (recorded audio)
• Ambient noise cancellation functionality of microphone
• Over-the-ear style (non-earbud)
• Adjustable boom microphone (not an inline mic)

• Leatherette ear pads
• Ambidextrous headset design
• No in-line volume control or mute button
Ensure the headsets are compatible with your devices!

Prepare a Suitable Test Room


Level Test/ Benchmark is a computer-based test normally taken on school premises;
however, it can be taken at home if the test taker has access to a computer with headset
and microphone (if taking the 4 skills version).

On-Site Testing
If delivered on-site, tests must be administered in a quiet, distraction-free room to help test-
takers focus. Excess noise, or the sound of instructors’ and other test-takers’ voices, will
spoil recorded answers so they cannot be accurately scored.

To prevent this, test-takers must sit 10ft/3m apart in each direction (minimum distance
of 6ft/2m). Desktop privacy partitions will also reduce distractions.

Avoid gyms, halls, rooms with loud air conditioning, and hard floors or high ceilings as they
can cause echoes which can reduce the clarity of recorded spoken answers. Keep
windows closed if there is loud traffic or building work next to the classroom.

Ideally, test-takers should be tested in small groups to avoid noise interference and
distractions, and to make it easier for instructors to provide them with support.

Remote Testing
If delivered remotely, tests must be taken in a quiet, distraction-free room to help test-takers
focus.

Excess noise (loud music, appliances, ambient sounds, ringtones, pets, construction tools,
etc.), or the sound of other people's voices, will spoil recorded answers so they cannot be
accurately scored.

To prevent this, test-takers must choose the most appropriate time and place (e.g., night
hours tend to be quieter than daytime). If at home, test takers must ask their relatives
and/or roommates to avoid making noises that may disrupt the testing environment.

Avoid gyms, halls, rooms with loud air conditioning, and hard floors or high ceilings as they
can cause echoes which can reduce the clarity of recorded spoken answers. Keep windows
closed if there is loud traffic or building work next to the place where the test is taking place.

Test takers must be properly briefed by test admins so that they follow all good delivery
practices for a successful test, including an appropriate internet connection, software, and
hardware.

3. Action plan to ensure 100% testing

AWARENESS

During the implementation of the test, users will attend a one-hour awareness
session in which they will receive general information
about:



• How to start the test
• Considerations about the test
o Internet connection
o Microphone
o Headset
o Type of questions
• What the test assesses
• How should I take the test?
• When and where should I take the test?
• Special tips, among others.

In addition, we have already set a plan for frequent situations:

a) Students have not received any invitation; what should they do?


- Students should check whether the invitation is in their
spam folder.
- If there is no email in either their inbox or spam folder,
they should contact their test admin, who will validate
whether their email address is correct.
- If the email address is correct, students should ask their
test admin to resend the invitation.
- Otherwise, students should provide another email address and
ask their test admin to send the invitation again.

NOTE: Once they receive an invitation, if it is not clear to
them what to do, they can follow this link for more
information: Receiving an Invitation.

b) Students forgot their username and password.

Students should go to the following link: Recovering Username
and Password.

c) Could students take a Sample Test before the real test?

Yes, they can go to the following link: Taking a Sample Test


NOTE: It is suggested that they take the Sample Test before
the real test so that they become familiar with the
structure of the exam.

d) Students cannot start the test; their screen is frozen.
- Students should use a different browser or private mode,
bearing in mind that Safari and Opera do not meet
the requirements of the test.
- Students should check their internet connection; if it does
not work, they should switch to a different network.
- Students should restart their computer and then go directly
to the test instead of opening another program.
- If the problem persists, students should use a different
computer.

e) The test does not detect the microphone.



1. Students should search for an online microphone test in their
browser and run it.
2. Students should use a different browser or private mode,
bearing in mind that Safari and Opera do not meet
the requirements of the test. Then, they should run
the online microphone test again.
3. Students should check their internet connection; if it does
not work, they should switch to a different network. Then, try step 1 again.
4. Students should restart their computer and then go directly to the
test instead of opening another program. Then, try step 1 again.
5. If the problem persists, students should use a different microphone.
6. If the problem persists, students should use a different computer.

f) Students cannot listen to the test with their headset.

- Students should look for a video or song in their browser and
play it to test the headset.
- Students should use a different browser or private mode,
bearing in mind that Safari and Opera do not meet
the requirements of the test. Then, they should play a
video or song again to test the headset.
- Students should check their internet connection; if it does
not work, they should switch to a different network. Then, try step 1 again.
- Students should restart their computer and then go directly to the
test instead of opening another program. Then, try step 1 again.
- If the problem persists, students should use a different headset.
- If the problem persists, students should use a different computer.

g) How long do students have to retake their test if they had
connection problems?

If students cannot complete their test within the allotted
time, it will be considered "Test abandoned"; therefore,
they need to take the test again. Students should contact
their test admin to provide details about the problem and
ask for a reassignment.

h) What happens if students did not take the test before
the deadline?

Their test will be considered "Expired", and they will need
to contact their test admin for a reassignment.

i) Where can students see their results?

They have to contact their test admin.



4. Work Schedule

Entry Test Implementation

Exit Test Implementation

5. Methodology for the delivery of licenses / pins to access the tests

The following diagram explains the methodology step by step to participants.



6. Methodology for data analysis

As there will be two testing moments, the data analysis will occur as follows:

Entry Test
1. Candidates take the entry test.
2. Individual PDF score reports are downloaded.
3. All candidates' results are downloaded via an Excel worksheet.
4. With all data downloaded, the existing candidates' info (names, last names, age,
gender, grade, school, district, city, etc.) can be unified with their results (Overall
level and levels per skill).
5. The aforementioned data will make it possible to identify, compare and/or contrast
the English level (Overall level and levels per skill) of the test takers along with their
personal, group and/or location data, among others. The following table shows
initial Overall Level sample data from the entry test, which can later be analysed and
itemised into several categories:

Level Test Results – District 1


Overall Level # TTs %
PreA1 87 1%
A1 104 2%
A2 157 2%
A2+ 306 5%
B1 577 9%
B1+ 1045 16%
B2 1354 21%
B2+ 1632 26%
C1 757 12%
C2 353 6%
Total taken 6372 100%
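
The percentage column in the table above can be reproduced from the unified results data. A minimal sketch (the level counts below are copied from the table; in practice they would be aggregated from the Excel export):

```python
# Overall-level counts per test taker, as in the District 1 table above.
counts = {
    "PreA1": 87, "A1": 104, "A2": 157, "A2+": 306, "B1": 577,
    "B1+": 1045, "B2": 1354, "B2+": 1632, "C1": 757, "C2": 353,
}

total = sum(counts.values())  # total tests taken

# Share of test takers at each CEFR level, rounded to a whole percent.
distribution = {level: round(100 * n / total) for level, n in counts.items()}
```

For example, 1,632 of the 6,372 test takers (26%) reached B2+, matching the table.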

Exit Test
1. Candidates take the exit test.
2. Individual PDF score reports are downloaded.
3. All candidates' results are downloaded via an Excel worksheet.
4. With all data downloaded, the existing candidates' info (names, last names, age,
gender, grade, school, district, city, etc.) can be unified with their results (Overall
level/score and levels per skill).
5. The aforementioned data will make it possible to identify, compare and/or contrast
the English level (Overall level and levels per skill) of the test takers along with their
personal, group and/or location data, among others. Also, as this test is more
profound and specific to a CEFR band, the exit test result can reveal progress in
comparison with the entry test result; of course, some very important criteria from
GSE theory and research must be met (see How long does it take to learn a
language? Insights from research on language learning, page 9). The following
graphic/table shows a sample data analysis where an entry test result is compared
to an exit test result, which can later be analysed and itemised into several categories:



7. Team

Pearson will assign a dedicated team as follows; brief CVs with the associated
information are appended to this document:

Pearson Head of Service (Iván Tavera)

Project Coordinator (Cesar Hortua)

Project Assistant 1 (Grisel Hernandez)
Project Assistant 2 (Maghaly Guzman)

Experience - [30%]
ID % Requirement

EX01 [30%] Recent and relevant experience from both the team and other customers in supporting
and managing platforms for virtual course development, management, online material
usage, application, and testing development.
Supplier Response:

Pearson's team has extensive experience in managing different projects with different
products and tests based on customers' needs. Moreover, our professional staff are
trained in current education trends across different platforms, books, online material,
and tests. As we implement tests in different places around the world, our staff's
experience includes the following items:

- Services and Assessment Delivery Management in Latin America.
- Training on different service modalities (F2F, blended, and online).
- English Assessment Delivery with low, medium, and high stakes tests.
- English Assessment Delivery with placement, diagnostics, monitoring,
proficiency, and certification tests.
- Academic and administrative management.
- Use of different LMS platforms across different English programs.
- Academic awareness training.
- Associate Digital Product Management.
- Technological support.
- Assessment tools Management.
- Certification test preparation and lesson planning.
- Among others



The Head of Service, Project Coordinator, and Project Assistants comply with all items
mentioned previously, as they are ELL professionals with further complementary education
such as TKT, master's degrees, Cambridge certifications, certification programs, etc.

Regarding experience with customers, we are attaching three certification letters from
satisfied customers, demonstrating our ability to provide an excellent service for this bid:

- Teleperformance: English assessment services for more than 50,000 tests per
year.
- Universidad EAN: More than 3,000 tests applied per year to students.
- Educar Futuro: Diagnostic tests applied to 3,810 students in public schools.
- Universidad Tecnologia de Monterrey: Not only level tests but also
international certifications such as the Pearson International Certificate, which we are
offering as added value in our proposal.

Price – [15%]
ID % Requirement

PR01 [15%] Value for money will be assessed. Please include a brief description of the
proposed tool/tools along with their price.

Supplier Response:
The value includes all logistics, implementation, and analysis of the results for the two
moments (entry and exit). The project will have one Project Coordinator and two
assistants who will oversee:

- Setting an implementation plan based on the dates and needs of the users.
- Making test takers aware of the tests.
- Providing academic support for the tests to be implemented.
- Answering different queries that users may have.
- Arranging and organizing the test schedule.
- Assigning tests at the allotted times.
- Supporting users in different situations.
- Reporting problems to users when tests do not provide results due to technical
issues with the assignment.
- Reassigning tests under specific conditions.
- Validating results.
- Providing feedback to users.
- Creating statistical analysis of the results (pre-test & post-test).
- Analysing differences and creating reports.
- Backup tests as a safety measure (100 units).
- 10 International Certificate tests for the best performers.

The total price for this proposal is $214.200.000, IVA (VAT) included.



Added value – [15%]
ID % Requirement

AV01 [10%] Competitive features of the proposed tool/test and any associated services
which the bidder considers add to the value of their tender proposal.

Supplier Response:
As part of our added value, we offer 10 international certification tests, the Pearson
English International Certificate. These tests will be given to the students who demonstrate
the best improvement between the first and the second test.
Characteristics of the test:

Computer-based Testing
PTE General/Pearson English International Certificate
Overview

Background

The Pearson Test of English General (PTE General), soon to be known as Pearson
English International Certificate, is an assessment solution for six levels of proficiency
that assesses and accredits general English ability.

What does the computer-based test include?


• Computer-based test delivered in secure test centre locations
• 4 skills mandatory: Reading, Writing, Listening, Speaking: all delivered in one test
session
• CEFR-aligned
• GSE (Global Scale of English)-alignment coming in 2022
• On-demand delivery based on test centre availability; home delivery possible in some
countries with remote proctoring
• Quick turnaround time on scores
• Ofqual regulated under the title Pearson Edexcel Certificate in ESOL International
• Awarded by Edexcel, the largest UK awarding body for academic and vocational
qualifications, delivering more than 130 million exams each year
• Extended certification fall-back grades
• 6 computer-based tests (A1, A2, B1, B2, C1, C2)
• 4 Readiness tests: A2-C1 at launch
• Eco-system of test prep: App, Question Bank, Teacher training

Competitive advantages



More convenient

• Simple online booking, up to 24 hours in advance
• Testing on demand based on local test centre availability, any time during the year
• Single test, under 2 hours

Enhanced security

• Delivered at Pearson-run test centres
• Completely digital delivery and evaluation provides greater accuracy and security

Faster results

• Fast results: weeks quicker than paper-based testing
• One week (2022)

Fair and accurate results

• Responses are scored objectively and consistently, with no regional variances in
standards, thanks to machine scoring
• English skills are assessed through integrated skill items (reading & writing, listening
& speaking, etc.) so students are judged on their communicative skills, not rote
learning.

Part 2 – Submission Checklist


Insert Yes (Y) or No (N) in each box in the table below to indicate that your submission includes all of the
mandatory requirements for this tender.

Important Note: Failure to provide all mandatory documentation may result in your submission being
rejected.

Submission Checklist
Document Y/N
1. Confirm acceptance of the Terms and Conditions of this RFP/ITT and of Annex 1,
including any changes made via clarifications during the tender process. Y

2. Completed tender response in Annex [1] (Supplier Response) and in accordance
with the requirements of the RFP/ITT. Y

3. This checklist signed by an authorised representative. Y

I confirm on behalf of the supplier submitting the documents set out in the above checklist that to the best
of our knowledge and belief, having applied all reasonable diligence and care in the preparation of our
responses, that the information contained within our responses is accurate and truthful.

Supplier: Pearson Educación de Colombia S.A.S


Date: 22 February 2022



Name (print): Luis Edgardo Mendoza Guevara
Position: Country Sales Manager
Signature:
Title: MR

