
• In DOK taxonomy, name, define, list are key words for learning level recall.

• In SOLO taxonomy, hypothesize, reflect, generalize are key words for the learning level extended abstract.
• The highest learning level of SOLO taxonomy is extended abstract.
• Students learn about the link between curriculum and assessment, with reference to the National Curriculum of Pakistan 2006.
• In the national curriculum of Pakistan, student learning is classified into four levels.
• The benchmarks further elaborate the standards.
• There are two forms of assessment recommended.
• A wide variety of assessment tools and techniques is recommended to measure students' ability to use language effectively.
• Suitable forms of assessment tools include MCQs and constructed-response items.
• These forms of assessment tools are recommended for English in the National Curriculum 2006.
• Achievement test: students learn on their own or are taught, and at the end we want to see how much they have learned.
• An achievement test is designed to indicate the degree of success in some past learning activity.
• Fixed choice assessment is used to measure the skills of people efficiently (multiple choice questions, MCQs).
• Graph, separate, relate, contrast, narrate, compare are key words used at the skill/concept level of Depth of Knowledge.
• Create, synthesize, design, and reflect are key words used at the Extended Thinking level of Depth of Knowledge.
• There are seven levels in the psychomotor domain; the lowest level of the psychomotor domain is Perception.
• The highest level of the psychomotor domain is Origination.
• There are five levels in the Affective Domain.
• The highest level of the Affective Domain is Characterization.
• The lowest level of the Affective Domain is Receiving.
• Typical performance assessments include attitude, interest and personality inventories, observational techniques, and peer appraisal.
• Fixed Choice Assessment is used for efficient measurement of knowledge and skills
• A placement test is used to measure a student's entry-level performance, to determine whether the student has the knowledge required for a particular course.
• Readiness test: a test used to determine students' knowledge or concepts about a particular course of instruction, i.e., the level of the students.
• Aptitude test: it is used for admission to a particular program.
• Pretest: it is made according to the course objectives and determines the students' present knowledge about them.
• Self-report inventories: determine the student's level by interviewing or discussion.
• Comprehension is defined as the ability to grasp the meaning of the material.
• Which of the following method is a measure of equivalence: Equivalent Forms
• Which of the following method is a measure of stability and equivalence: Test Retest
with Equivalent Forms
• Which of the following method is a measure of consistency of rating: Inter-rater
Reliability
• Assessment: It means appraisal of something to improve quality of teaching and learning
process for deciding what more can be done to improve the teaching, learning and
outcomes.
• Evaluation is the process of making a value judgment against intended learning outcomes and behavior, to decide the quality and extent of learning.
• Measurement is the process by which the attributes or dimensions of some object (both
physical and abstract) are quantified.
• One of the tools used by teachers to develop a blueprint for the test is called a table of specification (TOS).
• Criterion validity refers to how well a test measures some performance with reference to a defined standard.
• Predictive validity refers to how well a test predicts future performance with reference to a defined standard.
• Competency: It is a key learning area. For example algebra, arithmetic, geometry etc. in
mathematics and vocabulary, grammar, composition etc. in English.
• Standards: These define the competency by specifying broadly, the knowledge, skills
and attitudes that students will acquire.
• Benchmarks elaborate the standards further at each developmental level.
• Most tests in education are verbal tests.
• Teacher-made tests are designed according to the needs and issues related to a specific class.
• Non-verbal: does not require reading, writing or speaking ability; tests composed of numerals or drawings are examples.
• A test designed to measure depression must only measure that particular construct.
• Formative assessment focuses on the process toward completing the product.
• The focus of measurement in formative assessment is a predefined segment of instruction.
• In formative assessment, a limited sample of learning tasks is addressed.
• Formative assessment is conducted periodically during the instructional process.
• Results of formative assessment are used to improve and direct learning through ongoing
feedback.
• An arithmetic test may have a high degree of validity for computational skill and a low degree for arithmetical reasoning.
• Homework exercises as review for exams and class discussions: Formative assessment
• Formative assessment provides feedback and information during the instructional
process, while learning is taking place, and while learning is occurring.
• Formative assessment measures student progress but it can also assess your own
progress as an instructor.
• Reflections journals that are reviewed periodically during the semester: Formative
assessment
• Question and answer sessions, both formal (planned) and informal (spontaneous):
Formative assessment
• Conferences between the instructor and student at various points in the semester:
Formative assessment
• In-class activities where students informally present their results Formative assessment
• Student feedback collected by periodically answering specific question about the
instruction and their self-evaluation of performance and progress: Formative assessment
• Validity is a matter of degree.
• Validity is commonly referred to as the validity of a test, but it is in fact the validity of the interpretation and use to be made of the results.
• summative assessment focuses on course or unit objectives
• Summative assessment is done at the end of the unit or the course.
• Most important functions of summative assessment are to assign grade, certification of
accomplishment and evaluation of teaching.
• Summative assessment takes place after the learning has been completed.
• Summative assessment is product-oriented and assesses the final product
• Reliability of measurement is needed to obtain the valid results, but we can have
reliability without validity
• Reliability refers to the consistency of assessment results.
• Validity is an evaluation of adequacy and appropriateness of the interpretation and uses
of results.
• Characteristics of good test are Validity, Reliability, Usability.
• Standards for selecting an appropriate test: first define the purpose for testing and the population to be tested, and select the test accordingly.
• Published tests commonly used by provincial or national testing programs are: Aptitude
tests, Readiness tests, Placement tests(all)
• Published tests are designed and conducted in such a manner that each and every characteristic is pre-planned and known.
• Percentage of instruction time on an objective = time in class spent on the objective (min) / total time for the instruction being examined (min) × 100; for example, 15 minutes on an objective out of 60 minutes of instruction is 25%.
• Published tests supplement and complement informal classroom tests, and aid in many instructional decisions.
• Carey (1988) listed six major elements that should be attended to in developing a Table
of Specifications
• The blueprint is meant to ensure content validity.
• Content validity is the most important factor in constructing an achievement test
• A unit test or comprehensive exam is based on several lessons and/or chapters in a book, supposedly reflecting a balance between content areas and learning levels.
• The two most commonly used published tests in classrooms are achievement tests and reading tests.
• In SOLO taxonomy, students are simply able to acquire bits of unconnected information and respond to a question in a meaningless way, such as "What is your name?": Pre-structural.
• In SOLO taxonomy, enumerate, classify, describe, list, combine, do algorithms are key words for learning level: Multi-structural.
• The student can understand several components but the understanding of each remains discrete: Multi-structural.
• The student can indicate connections between facts and theory, action and purpose: Relational.
• In SOLO taxonomy, compare/contrast, explain causes, integrate, analyze, relate, and apply are key words for learning level: Relational.
• A student at this level is able to think hypothetically and can synthesize material logically: Extended Abstract.
• Students make connections not only within the given subject area; their understanding is transferable and generalizable to different areas: Extended Abstract.
• A collection of the constituents or parts of a concept so as to make a whole: Synthesis
• Design a classification Scheme for writing educational objectives that combines the
cognitive, affective, and Psychomotor domains: Synthesis
• The breakdown of a concept into its constituent parts such that the relative hierarchy of the concept is made easy to understand: Analysis.
• Compare and contrast the cognitive and affective domains: analysis
• How far are the different BISEs and universities developing papers using Bloom's taxonomy? Support your answer with arguments: Evaluation.
• Identify the value of item difficulty for the easiest test item: 0.8 (the higher the difficulty index, the easier the item).
• An MCQ is composed of a statement referred to as the stem.
• Holistic rubrics are good for evaluating overall performance.
• The most difficult test item has the lowest difficulty index, e.g., 0.1.
• Schools' professional practice: test materials are kept in locked storage.
• Extended response items are helpful for determining the length and complexity of a response.
• The Equivalent Forms method is a measure of equivalence.
• Equivalent Forms: give two forms of the test to the same group in close succession.
• Test-Retest with Equivalent Forms: give two forms of the test to the same group with an increased interval between forms.
• The generalization of KR-20 for assessments that have more than dichotomous, right-wrong scores is called Coefficient Alpha (a formula sketch follows this list).
• Inter-Rater Method is measure of consistency of ratings.
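For reference, the two internal-consistency coefficients mentioned in the notes above can be written as follows. This is a standard-form sketch (k is the number of items, p_i and q_i are the proportions answering item i right and wrong, sigma_i^2 is the variance of item i, and sigma_X^2 is the variance of total test scores):

\[
\mathrm{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^2}\right),
\qquad
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right)
\]

Coefficient alpha reduces to KR-20 when every item is scored dichotomously (right/wrong).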

1. Types of assessment tools:


• 2
• 3
• 4 (Anecdotal record, Peer appraisal, Self-appraisal, Portfolio)
• 5

2. Outcomes in the cognitive domain can be measured by paper pencil tests:


• pertaining to knowledge
• understanding
• thinking skills
• All

3. There are still many learning outcomes that require observation of natural interactions:
• Formal
• Informal
• Both
• None

4. Questioning the students directly and assessing expressed interests:


• Anecdotal record
• Peer appraisal
• Self-appraisal
• Portfolio

5. Observing students as they perform and describing or judging that behaviors:


• Anecdotal record
• Peer appraisal
• Self-appraisal
• Portfolio

6. Measuring progress by recorded work:


• Anecdotal record
• Peer appraisal
• Self-appraisal
• Portfolio

7. Asking their peers about students and assessing social relationships:


• Anecdotal record
• Peer appraisal
• Self-appraisal
• Portfolio

8. Anecdotal records are---descriptions of meaningful incidents and events that the teacher
observes:
• factual
• complete
• unbiased
• All

9. One should keep in mind points to use anecdotal records effectively:


• 7
• 8
• 9
• 6

10. Determine in advance what to observe but be alert of-----behavior:


• usual
• unusual
• supportive
• unsupportive

11. Keep the factual description of the incident and your interpretation of it:
• Combine
• merge
• separate
• in order

12. Record----behavioral incidents is a point of effective use of anecdotal record.


• Positive
• Negative
• Both
• typical
13. There are-----advantages and limitations of Anecdotal Records:
• 3,3
• 3,4
• 4,3
• 4,5

14. Advantages of anecdotal records :


• It depicts actual behaviors in natural situations.
• Facilitate gathering evidence on the events that are exceptional but significant.
• Beneficial for students with less communication skills.
• All

15. Limitations of anecdotal records:


• It takes long time to maintain.
• Subjective in nature.
• Anxiety may lead to wrong observation.
• All

16. There is/ are-----widely used technique/s of peer appraisal:


• one
• Two (Guess who technique , Sociometric technique)
• Three
• Four

17. The guess who technique is based on----of obtaining peer ratings and is scored by simply
counting the number of mentions each student receives on each description:
• Nomination method
• Description method
• Communication method
• Selection method

18. -----is based on students choice of companion for some group situation:
• Guess who technique
• Sociometric technique
• Anecdotal record
• Portfolio
19. There are----important principles of sociometric choosing:
• 3
• 4
• 5
• 6

20. -----collection of students work into portfolios can serve a variety of instructional and
assessment purposes:
• Systematic
• Appraisal
• Confidential
• Overall

21. The value of portfolios depends heavily on the clarity of purpose, the guidelines for the
inclusion of materials, and the----to be used in evaluating the portfolio.
• Material
• Facts
• Details
• Criteria

22. It is a purposeful collection of pieces of a student's work.


• Anecdotal record
• Portfolio
• Classroom activities
• Confidential appraisal

23. Key steps in defining and using portfolios:


• Specify purpose.
• Define the student's role in selection and self-evaluation.
• Specify evaluation criteria.
• All

24. Fundamentally, there are-----global purposes for creating portfolios of students' work:

• 2
• 5
• 7
• 9

25. When the focus is on----portfolios usually are limited to finished work and may cover
only a relatively small period of time.
• accomplishments
• demonstrating growth
• development
• assessment

26. When the focus is on----- the time frame is longer. It will include multiple versions of the same
work over time to measure progress.
• Accomplishments
• Demonstrating growth
• Development
• B&C

27. ----- is used when we want to assess student’s prior knowledge


• Placement assessment
• Formative Assessment
• Summative assessment
• Diagnostic Assessment

28. The number of levels in the cognitive domain of Bloom's taxonomy is:


• 3
• 4
• 6
• 8

29. Frequently used tools of summative evaluation are ?


• Teacher observation
• Daily assignment
• Test
• None
30. What individual will do under natural conditions is the
• Maximum Performance Assessment
• Norm-referenced Assessment
• Criterion-referenced Assessment
• Typical Performance Assessment

31. To assess achievement at the end of the instruction is……………..


• Summative assessment
• Placement assessment
• Formative Assessment
• Diagnostic Assessment

32. A student's performance is compared with other students' in:


• Standardized Test
• Essay type Test
• Objective type test
• Norm referenced test

33. What is your name? is the example of which level of SOLO taxonomy
• Pre-structural
• Uni Structural
• Multi- Structural
• Relational

34. The purpose of the evaluation is to make ------


• Decision
• Prediction
• Judgment
• Opinion

35. Explain the process of water cycle is the question of-----level in Blooms taxonomy of
cognitive domain.
• Knowledge
• Comprehension
• Analysis
• Application
36. Permanent difficulties in learning are investigated in-----------
• Summative assessment
• Placement assessment
• Formative Assessment
• Diagnostic Assessment

37. The purpose of formative Evaluation is------------------


• Decision of what to measure
• Development of the test
• Administering the test
• Monitoring progress of the students

38. The Curriculum Wing is also known as:


• National Bureau of Curriculum
• Curriculum authority development
• Development of curriculum development
• None of the above

39. Assessment that monitors learning progress --------------


• Placement assessment
• Formative Assessment
• Diagnostic Assessment
• Summative assessment

40. The purpose of measurement in education is to --------------


• Make decision about students
• Determine if objectives are being met
• Assess teacher skills
• Collect information for decision making

41. The summative evaluation is……………


• Diagnostic
• Certifying judgment
• Continuous
• None
42. Multiple choice questions, matching exercises, fill in the blanks and true/false are
examples of:
• Typical performance assessment
• Fixed choice assessment
• Maximum performance assessment
• Formative assessment

43. List, enlist, name, define keywords are used in ----- level of Depth of knowledge
• Recall
• skill/concept
• Strategic thinking
• Extended thinking

44. In norm referenced test the comparison is between ---------


• Groups
• Individuals
• Areas
• Interest

45. "Who was the founder of Pakistan?" is a question of the----- level in Bloom's taxonomy of the cognitive domain.
• Knowledge
• Comprehension
• Application
• Synthesis

46. Measurement and evaluation can be used to …………..


• Motivate students
• Help develop the scope and sequence of teaching
• Assess the effectiveness of learning activities
• all of the above

47. How many domains are there in Bloom's taxonomy?


• 1
• 2
• 3
• 4

48. Procedures used to determine a person's best abilities are:


• Maximum performance test
• Typical performance test
• Norm reference test
• Criterion reference test

49. Which of the following is a type of assessment by use in classroom instruction?
• Complex performance assessment
• Placement assessment
• Fixed choice assessment
• Norm reference assessment

50. In----- Federal government appointed university grant commission


• 1960
• 1964
• 1976
• 1980

51. The type of essay item in which content are limited is:
• Restricted response question
• Extended response question
• Matching item
• M.C.Q items

52. Identifying relationship between two things is done in


• True/false items
• Matching item
• M.C.Q items
• Completion item

53. How many columns do matching items have?


• One
• Two
• three
• Four

54. The incorrect options in an MCQ are called:


• Answer
• Premise
• Response
• Distractor

55. The item in the column for which a match is sought is called
• Premise
• Response
• Distractor
• Answer key

56. Alternative response item is a type of ……..


• Matching List
• Completion List
• Correct/Incorrect
• All

57. In which type of question marking will be more reliable?


• Completion
• Short answer
• Multiple choice question
• Essay

58. In multiple choice items the stem of the items should be


• Large
• Small
• Meaningful
• None of the above

59. A multiple-choice item is composed of a question or statement referred to as the:


• Stem
• Distractor
• Foil
• Response

60. Which of the following is included in reproducing the test?


• Tests item from easy to hard
• Checking the answer key
• Keeping items and options on the same page
• Knowing the photocopying machine

61. "Types 40 words per minute without error" is an example of:


• Norm referenced interpretation
• Criterion referenced interpretation
• Classical test theory
• Item response modeling

62. Which of the following is not a basic step to design rubric?


• Identify a learning goal
• Give hints to a student’s directly
• Choose outcomes that may be measured
• Develop or adapt an existing rubric

63. A description of specific tasks that the student can perform


• Norm referenced interpretation
• Criterion referenced interpretation
• Classical test theory
• Item response modeling

64. Feature, trait or dimension to be measured is decided in the---element of rubric


• Score
• Criteria
• Level of performance
• Descriptors

65. Advantages of a holistic rubric include:


• Quick scoring
• Detailed feed back
• Scoring more consistent across students and grades
• All of the above

66. Scoring of essay items is a ………


• Easy task
• Time consuming
• Manageable
• All of the above

67. Which of the following should be school's professional security practices?


• Materials must be kept in a locked storage
• Leave a testing room unsupervised
• Give hints to student directly
• Making any changes to students' responses

68. Holistic rubric is good for evaluating


• A single person tests
• Overall performance on a task
• performance on test
• None of the above

69. There are----- main types of rubric


• Two
• Three
• Four
• Five

70. Descriptors are………..


• Detailed descriptions for each level of performance
• Adjectives to describe the performance levels
• A system of number or values assigned to a work
• All of the above
71. A rubric is………..
• Used for assessing a particular type of work or performance
• Test used to assess student achievement at the end
• Used to compare student scores to one another
• Used for assessing student while giving instruction

72. When you are ready to package and reproduce the test………
• Aim and goals of the test
• Objective and test blue print
• Syllabus for the test
• None of the above

73. A flatter item characteristics curve shows that item is


• Good discriminator
• Poor discriminator
• Easy
• Difficult

74. When the test is ready, the next step is to ----- it.


• Prepare
• Administer
• Assemble
• Score

75. Instrument used for measuring sample of behavior is?


• Test
• Measurement
• Assessment
• Evaluation

76. Limited to quantitative description of pupil’s performance is?


• Test
• Measurement
• Assessment
• Evaluation
77. The purpose of evaluation is to make judgment about educational?
• Quantity
• Quality
• Time period
• Age

78. A formal and systematic procedure of getting information is?


• Test
• Measurement
• Assessment
• Evaluation

79. The process of obtaining numerical value is?


• Test
• Measurement
• Assessment
• Evaluation

80. A sum of questions is?


• Test
• Measurement
• Assessment
• Evaluation

81. The first step in measurement is?


• Decision of what to measure
• Development of the test
• Administering the test
• None

82. The broadest of all in scope is?


• Test
• Measurement
• Assessment
• Evaluation
83. The least in scope is?
• Test
• Measurement
• Assessment
• Evaluation

84. Broader in meaning is?


• Aims
• Objectives
• Instructional objectives
• Specific Objectives

85. Facility value of less than 0.20 means?


• Item is too easy
• Item is difficult
• Item is acceptable
• Item is easy

86. Objective type question have advantage over essay type because such questions?
• Are easy to prepare
• Are easy to solve
• Are easy to mark
• None

87. Discrimination value of more than 0.4 means


• Item is good
• Item is acceptable
• Item is weak
• None

88. Test involving the construction of certain patterns are called?


• Intelligence test
• Performance tests
• Scholastic test
• None
89. Which appropriate verb will you use to make an objective behavioral?
• To know
• To appreciate
• To understand
• To construct

90. Objectives representing the purposes of instruction of a teacher are called?


• Performance
• Instructional
• Attainment
• None

91. Running description of active behavior of a student as observed by the teacher is?
• Anecdotal record
• Autobiography
• Interview
• None

92. A test very popular with class room teacher is?


• True false test
• Multiple choices
• Matching
• Completion test

93. The most commonly used guessing correction formula to predict and control is?
• S = R − W
• S = R − W/(N − 1)
• S = R − W/2 − 1
• None
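For reference, the standard correction-for-guessing formula (the second option above, with parentheses made explicit) is shown below together with a worked example; the numbers R = 40, W = 12 and n = 4 choices per item are illustrative assumptions, not taken from the source:

\[
S = R - \frac{W}{n-1}, \qquad S = 40 - \frac{12}{4-1} = 40 - 4 = 36.
\]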

94. The difference between maximum and minimum values is?


• Mean
• Mode
• Range
• None
95. The number of score lying in a class interval is?
• Mid-point
• Quartiles
• Class
• Frequencies

96. In a norm referenced test which item is best?


• Item difficulty is near zero
• Item difficulty is near 100
• Item difficulty is near 70
• Item difficulty is near 50

97. Which question has increasing objectivity of marking?


• Unstructured essay
• Structured essay
• Short answer
• Multiple type questions

98. The most widely used format on standardized test in USA is?
• Unstructured essay
• Structured essay
• Short answer
• Multiple type questions

99. Which questions are difficult to mark with reliability?


• Unstructured essay
• Structured essay
• Short answer
• Multiple type questions

100. Projective techniques are used to measure?


• Aptitude
• Intelligence
• Knowledge
• Personality
101. Test meant for prediction on a certain criterion are called?
• Aptitude test
• Intelligence
• Knowledge
• Personality

102. Kuder Richardson method is used to estimate?


• Reliability
• Validity
• Objectivity
• Usability

103. Value that divides the data into two equal parts is?
• Mean
• Median
• Mode
• None

104. The test measures what we intend to measure. This quality of the test is called?
• Reliability
• Validity
• Objectivity
• Usability

105. The length of a test is an important factor in obtaining a representative?


• Mean
• Median
• Mode
• Sample

106. The test made to compare the performance of student with the other students is
called?
• Criterion reference
• Norm reference
• Achievement
• None
107. The summative evaluation is used?
• At the end of the program
• At the middle of the program
• At the start of the program
• None

108. The appearance of normal curve resembles with?


• U
• Bell
• V
• None

109. The alternative name of the “table of specification” is?


• Test Blue Print
• Test Construction
• Test Administration
• Test Scoring

110. ” Table of specification” helps in?


• Test development
• Test Construction
• Test Administration
• Test Scoring

111. The supply type test item is?


• True / False items
• Matching items
• M.C.Q items
• Completion items

112. The statement of problem in M.C.Qs is?


• Premise
• Response
• Stem
• None
113. The correct option in M.C.Q is?
• Answer
• Premise
• Response
• Destructor

114. The incorrect options in M.C.Q are?


• Answer
• Premise
• Response
• Destructor

115. The most widely applicable test item is?


• True / False items
• Matching items
• M.C.Q items
• Completion items

116. The ability to select organize, integrate and evaluate ideas is demonstrated by?
• Restricted Response Questions
• Extended Response Questions
• Matching items
• M.C.Q items

117. The Analysis of items is necessary in?


• Standardized Test
• Essay Type Test
• Objective type test
• Norm referenced test

118. Which one is not the type of test by purpose?


• Standardized Test
• Essay Type Test
• Objective type test
• Norm referenced test
119. The type of the test by method is?
• Standardized Test
• Essay Type Test
• Objective type test
• Norm referenced test

120. Student performance is compared with clearly defined learning tasks in?
• Standardized Test
• Essay Type Test
• Criterion reverenced test
• Norm referenced test

121. Test that measures learning outcome of students is


• Achievement test
• Aptitude test
• Criterion reverenced test
• Norm referenced test

122. The tests designed to predict future performance is?


• Achievement test
• Aptitude test
• Criterion reverenced test
• Norm referenced test

123. The founder of modern intelligent tests was?


• Alfred Binet
• Stern
• Gulford
• None

124. The formula to determine I.Q was presented by?


• Alfred Binet
• Stern
• Gulford
• None
125. I.Q of a student having same physical and mental age will be?
• A. 100
• B. 120
• C. 50
• D. 140

126. The I.Q of a student having twelve years mental age and ten years physical age
will be?
• A. 100
• B. 120
• C. 50
• D. 140
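Items 125-126 rest on the classical ratio-IQ formula; the arithmetic below is a worked check:

\[
IQ = \frac{\text{mental age}}{\text{physical (chronological) age}} \times 100,
\qquad
IQ = \frac{12}{10} \times 100 = 120.
\]

Equal mental and physical ages therefore give an IQ of 100.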

127. The quality of test that measures “what it claims to measure” is?
• Validity
• Differentiability
• Objectivity
• Reliability

128. The characteristic of a test to discriminate between high achievers and low
achievers is?
• Validity
• Differentiability
• Objectivity
• Reliability

129. If the scoring of the test is not effected by any factor, quality of test is called?
• Validity
• Differentiability
• Objectivity
• Reliability

130. The quality of test to give same scores when administered at different occasions
is?
• Validity
• Differentiability
• Objectivity
• Reliability

131. If the sample of the question in the test is sufficiently large enough, the quality of
test is?
• Adequacy
• Differentiability
• Objectivity
• Reliability

132. The quality of test showing ease of time, cost, administration and interpretation is
called?
• Usability
• Differentiability
• Objectivity
• Reliability

133. Facility index of an item determines?


• Ease or difficulty
• Discrimination power
• Objectivity
• Reliability

134. High and low achievers are sorted out by?


• Ease or difficulty
• Discrimination power
• Objectivity
• Reliability

135. A test item is acceptable when its facility index/difficulty level ranges from:
• 30-70 %
• 70 %
• 30%
• None
136. A test item is very easy when the value of the facility index/difficulty level is higher than:
• 30-70 %
• 70 %
• 30%
• None

137. Test item is very difficult when value of facility index/ difficulty level is less
than?
• 30-70 %
• 70 %
• 30%
• None

138. Discrimination power of an item is acceptable when its value ranges from?
• 0.5 - 1
• 0.2-1.00
• 0.6 - 1
• 0.7 - 1

139. Test item discriminates 100% when its value for discrimination is?
• 0.30 – 1
• 1
• 0.30
• None

140. Test item cannot discriminate low achievers and high achievers when its value is
lower than?
• 0.8
• 0.7
• 0.3
• 0.6
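The facility (difficulty) and discrimination indices referred to in items 133-140 are commonly computed with the upper/lower scoring-group method. The sketch below assumes that method; the function names and the example counts are illustrative, not taken from the course material.

```python
# Classical item analysis: facility (difficulty) index and discrimination index
# using the upper/lower scoring-group method.

def facility_index(correct_upper, correct_lower, n_upper, n_lower):
    """Proportion of all examinees (upper + lower groups) who answered the item correctly."""
    return (correct_upper + correct_lower) / (n_upper + n_lower)

def discrimination_index(correct_upper, correct_lower, n_per_group):
    """Difference between the upper-group and lower-group proportions correct."""
    return (correct_upper / n_per_group) - (correct_lower / n_per_group)

if __name__ == "__main__":
    # Illustrative data: 30 students in each group; 24 of the upper group and
    # 12 of the lower group answered the item correctly.
    p = facility_index(24, 12, 30, 30)    # 0.60 -> inside the 30-70% acceptable band
    d = discrimination_index(24, 12, 30)  # 0.40 -> a good discriminator per the notes above
    print(f"facility index p = {p:.2f}, discrimination D = {d:.2f}")
```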

141. Which of the following method is a measure of stability?


• Split-half
• Equivalent form
• Cronbach alpha
• Test-retest

142. Persistent difficulties in learning are investigated in ------.


• Placement assessment
• Formative assessment
• Summative assessment
• Diagnostic assessment

143. To give an answer or make a judgement about something without being sure of all
facts is called:
• Option
• Guess
• Answer
• Notation

144. Extended response items are helpful to determine:


• Length and complexity of response
• Basic level of knowledge
• One-to-two word response
• Association between concepts

145. MCQs can Not be used to assess the knowledge of


• Specific facts
• Principle
• Methods
• Illusions

146. Identify the value of item difficulty for most difficult test item.
• 0.1
• 0.3
• 0.6
• 0.8

147. True-false items are also known as


• Extended form items
• Optional items
• Alternative form items
• Matching items

148. The domain which deals with physical abilities and coordination objectives is
• Cognitive
• Affective
• Psychomotor
• Intellectual

149. Which of the following tests has a time limit so strict that no one is expected to complete all the
items?
• Speed
• Verbal
• Power
• Objective

150. “What is your country name?” is example of which level of SOLO taxonomy.
• Pre-structural
• Uni-structural
• Multi-structural
• Relational

151. In what kind of assessment, recurrent difficulties in learning are investigated?


• Placement
• Formative
• Diagnostic
• Summative

152. The degree to which an instrument measures what it is supposed to be measured is called
• Reliability
• Validity
• Objectivity
• Usability
153. In SOLO taxonomy: classify, list, combine are keywords for learning level:
• Pre-structural
• Uni-structural
• Multi-structural
• Relational

154. In SOLO taxonomy; identify, memorize, are keywords for learning level:
• Pre-structural
• Uni-structural
• Multi-structural
• Relational

155. Aptitude test is example of which type of Assessment


• Formative
• Placement
• Summative
• Diagnostic

156. The way of judging how well students are doing by looking at their work during
educational process is called
• Test
• Measurement
• Assessment
• Evaluation

157. In DOK taxonomy; argue, critique, formulate are key words for learning level:
• Recall
• Skill/concept
• Strategic thinking
• Extended thinking

158. The process of obtaining a numerical description of the degree to which an
individual possesses a particular characteristic is called:
• Test
• Measurement
• Assessment
• Evaluation

159. Which type of assessment is designed to measure whether the students have
mastered the skills presented in instruction?
• Placement
• Summative
• Formative
• Diagnostic

160. Hina Exceeded 85% of the sixth graders on the mathematics test. The interpretation of
this statement is:
• Norm referenced
• Criterion referenced
• Typical performance
• Maximum performance

161. Which of the following provides a framework of categories with different hierarchical
levels of outcomes?
• Content
• Questions
• Taxonomy
• Domain

162. Which of following learning level of DOK taxonomy demands extended use of higher
order thinking processes such as synthesis, reflection, assessment?
• Recall
• Skill/concept
• Strategic thinking
• Extended thinking

163. Identify the highest learning level of Solo taxonomy:


• Pre-structured
• Uni-structured
• Relational
• Extended abstract
164. Objective related to affective domain is that student can:
• Make a painting
• Draw a graph
• Values honesty
• Write a letter

165. Identify the learning level in national curriculum that specifies the skills and
attitudes that students will acquire:
• Competency
• Benchmarks
• Standards
• Student learning outcomes

166. A test designed to measure the number of items an individual can attempt correctly in a
given time is referred to as:
• Power
• Speed
• Achievement
• Supply

167. To put ideas together to form a new whole is called:


• Evaluation
• Synthesis
• Analysis
• Application

168. In revised Bloom taxonomy; classify, explain, illustrate are key words for level:
• Remembering
• Understanding
• Applying
• Analyzing

169. Student's ability of writing an essay can be measured through:


• Fixed choice assessment
• Formative assessment
• Complex choice assessment
• Summative assessment

170. SOLO stands for:


• Subject of observed learning outcomes
• Structure of observed learning outcomes
• Structure of observed learning objectives
• Subject of observed learning objectives

171. Reliability within an assessment itself is known as:


• Stability
• Equivalence
• Internal consistency
• Objectivity

172. ----- tests are designed to measure broad national objectives and have a uniform set of
instructions that are adhered to during each administration.
• Power
• Standardized
• Achievement
• Speed

173. When two or more scorers can easily agree on whether the answer is correct or
incorrect is called:
• Usability
• Differentiability
• Objectivity
• Reliability

174. Identify the assessment used to find the root cause of continuous or constant problem in
students learning
• Diagnostic
• Placement
• Formative
• Summative
175. Achievement tests and aptitude tests are example of:
• Typical performance assessment
• Maximum performance assessment
• Complex performance assessment
• Fixed choice assessment

176. Norm reference assessment and criterion reference assessment are classifications
of assessment on the basis of:
• Nature of assessment
• Format of assessment
• Use in classroom instruction
• Method of interpreting results

177. In which type of assessment, the monitoring learning progress provides feedback
to reinforce and correct learning errors?
• Summative
• Diagnostic
• Placement
• Formative

178. Type of assessment used for identifying the problems of students is called?
• Diagnostic
• Formative
• Summative
• Placement

179. To judge the worth or value of material for a given purpose is known as:
• Analysis
• Application
• Knowledge
• Evaluation

180. In an art class a student paints a new and original painting. What is the cognitive
level of this outcome according to Bloom’s taxonomy?
• Analysis
• Synthesis
• Evaluation
• Application

181. Measure of student’s performance against a certain criterion is:


• Typical performance
• Maximum performance
• Norm referenced
• Criterion referenced

182. Identify the excellent item from given values of item discrimination.
• 0.17
• 0.26
• 0.35
• 0.48

183. Identify the good item from given values of item discrimination.
• 0.17
• 0.26
• 0.35
• 0.48

184. Good distractor is that


• Attracts high achievers more than lower achievers
• Does not attract
• Attracts low achievers more than high achievers
• Attracts equally low and high achievers

185. Identify the types of objective type items


• Selection and supply
• Selection and matching columns
• Supply and MCQs
• Supply and fill in the blanks

186. Multiple test forms should be assembled so that -----.


• Students don’t cheat
• Similar to one another
• Need to monitor students
• All of the above

187. Which of the following is an example of supply response items?


• True/false
• MCQ
• Short answer
• Matching items

188. A steeper item characteristics curve shows that item is ----.


• Easy
• Difficulty
• Good discriminator
• Bad discriminator

189. Which of the following are major threats to the validity of test score
interpretations?
• Parent’s behavior
• Teacher's behavior
• Student's mood
• Student's motivation

190. Difficulty level of an item determines?


• Ease or difficulty
• Discrimination power
• Effectiveness of distractors
• All above

191. Using a set of criteria and standards, educators can assess each student's performance on
a wide variety of work. This describes a:
• Rubric
• Test
• Marking
• Test scoring
192. A limited number of content areas needs to be tested in:
• Completion items
• Restricted response items
• Fill in the blanks
• Extended response items

193. The problem stated in the form of question or incomplete statement in MCQs is
called:
• Sentence
• Stem
• Question
• Alternatives

194. Guessing parameter can be measured in:


• IRT
• CCT
• NRT
• CRT

195. A list of three or more choices from which an examinee is required to choose the correct
one is given in:
• True/ false items
• Matching items
• M.C.Q items
• Short answer

196. Under Item Response Theory, the difficulty of an item describes where the item
functions along the
• Ability scale
• Likert scale
• Intelligence scale
• Thurston scale

197. The list of suggested solutions in MCQs in the form of word, symbol, phrase etc.,
are called:
• Options
• Alternatives
• Answers
• Responses

198. Item analysis focuses to find out


• Difficulty level
• Effectiveness of distractors
• Discrimination power
• All above

199. Bad distractor is that:


• Attracts high achievers more than lower achievers.
• Attracts low achievers more than high achievers
• Does not attract
• Attracts equally low and high achievers

200. A test form is statistically equated to another test form to make the resulting test
scores ----.
• Comparable
• Fair
• Valid
• Accurate

201. Which test requires the students to match a series of responses with corresponding
items in stimulus list?
• True/ false items
• Matching items
• MCQ items
• Short answer

202. Identify the poor item from given values of item discrimination
• 0.15
• 0.22
• 0.36
• 0.49
203. The following are uses of completion questions EXCEPT:
• Student complete large number of items in short time
• Minimize chance of guessing
• Measure complex learning outcomes
• Easy to mark

204. Ali's writing score placed near the bottom of the class. The interpretation of this
statement is:
• Criterion referenced
• Norm referenced
• Typical referenced
• Maximum performance

205. The assessment that determines what individuals can do when performing at their best is
called:
• Maximum performance assessment
• Typical performance assessment
• Norm referenced assessment
• Criterion referenced assessment

206. In revised Bloom taxonomy; assess, defend, appraise are key words for level:
• Remembering
• Understanding
• Applying
• Evaluating

207. The procedure used to gain information about student learning and the formation of value
judgement concerning learning progress is called:
• Test
• Measurement
• Assessment
• Evaluation

208. Final term examination is an example of which of the following type of


assessment:
• Formative
• Placement
• Summative
• Diagnostic

209. The degree of consistency with which an instrument measures the attributes, it is
supposed to measure is called:
• Reliability
• Validity
• Usability
• Objectivity

210. A teacher wishes to assess students prior to instruction to measure their current
level of skills. What kind of assessment is this teacher using?
• Placement
• Formative
• Diagnostic
• Summative

211. Student creatively applies knowledge to construct an overall theory called:


• Evaluation
• Synthesis
• Analysis
• Application

212. A student is asked to critique a painting based on explicit criteria. What is the
cognitive level of this outcome according to Bloom's taxonomy?
• Analysis
• Application
• Synthesis
• Evaluation

213. Measure of performance which interprets an individual’s relative standing in some


known group is:
• Criterion referenced
• Norm referenced
• Maximum performance
• Typical performance

214. Identify the highest learning level of DOK taxonomy:


• Recall
• skill/Concept
• Strategic thinking
• Extended thinking

215. Which of the following decision determines prerequisite skills, degree of mastery
of course goals and modes of learning?
• Diagnostic
• Grading
• Placement
• Formative

216. Which evidence of validity can be confirmed by factor analysis?


• Content validity
• Criterion validity
• Construct validity
• Predictive validity

217. In what kind of assessment, a teacher uses information on learners’ progress


during a course to adapt their teaching or to give learners feedback on their learning?
• Placement
• Formative
• Summative
• Diagnostic

218. In revised Bloom taxonomy; define, name, select are key words for level:
• Remembering
• Understanding
• Applying
• Analyzing

219. Reliability of a test is a necessary but not sufficient condition for:


• Validity
• Usability
• Guessing
• Accuracy

220. The Degree to which a test measures an intended hypothetical construct is called:
• Reliability
• Content validity
• Construct validity
• Face validity

221. How well the sample of assessment tasks represents the domain of the tasks to be
measured is known as
• Content validity
• Criterion validity
• Construct validity
• Predictive validity

222. In SOLO taxonomy, analyze, relate, apply are key words for learning level:
• Pre-structural
• Uni-structural
• Multi-structural
• Relational

223. Which type of reliability refers to the consistency of scores on two equivalent forms of a
test designed to measure the same characteristics?
• Split-half
• Test-retest
• Alternative-test form
• content

224. In national curriculum, key learning areas are defined as:


• Competency
• Benchmarks
• Standards
• Student learning outcomes
225. The Spearman-Brown formula is used to estimate:
• Split-half
• Test-retest
• Internal consistency
• Equivalent forms
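For reference, item 225 points to the Spearman-Brown prophecy formula, which estimates full-test reliability from the split-half correlation; the numeric value below is only an illustration:

\[
r_{\text{full}} = \frac{2\, r_{\text{half}}}{1 + r_{\text{half}}}
\]

For example, if the two halves correlate at 0.60, the estimated reliability of the whole test is 2(0.60)/(1 + 0.60) = 0.75.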

226. Identify the lowest learning level of DOK taxonomy:


• Recall
• Skill/concept
• Strategic thinking
• Extended thinking

227. The classification of cognitive domain was presented by:


• Benjamin S. Bloom
• Karthwohl
• Skinner
• Simpson

228. To use the previous learned material in new situation is called


• Comprehension
• Application

229. If you achieved the 97th percentile on a test, then your result has been assessed
according to:
• Summative
• Criterion reference
• Formative
• Norm reference

230. Cognitive domain was divided into sub subgroups by bloom in


• 1954
• 1964
• 1974
• 1956
231. When viewing instructional objectives in terms of learning outcomes, we are
concerned with:
• Material
• Input
• Process
• product

232. Bring together scientific ideas to form a unique idea is called:


• Application
• Analysis
• Synthesis
• evaluation

233. The collection of productive work used to evaluate the performance of students
is called:
• Portfolio
• Assignment
• Quiz
• Test

234. "A student can design a laboratory according to certain specifications" falls in which


category of objective?
• Analysis
• Application
• Synthesis
• evaluation

235. Identify the learning level according to national curriculum that indicates what the
students will accomplish at the end of each of the five developmental levels in order to
meet the standard.
• Competency
• Benchmarks
• Principles
• Student learning outcomes
236. Reliability over different forms of assessment is known as:
• Stability
• Equivalence
• Internal consistency
• objectivity

237. Observation in a classroom is an example of which of the following type of


assessment
• Formative
• Placement
• Summative
• Diagnostic

238. To grasp the meaning of the material is:


• Comprehension
• Application
• Knowledge
• Synthesis

239. Attitudes, values and interests are reflected by


• Cognitive domain
• Affective domain
• Psychomotor domain
• Both A and B

240. Educational objectives have been divided into:


• Two domains
• Three domains
• Four domains
• Five domains

241. Which of the following is NOT an evidence of reliability?


• Split half
• Construct
• Cronbach alpha
• Test-retest
242. --------refers to the assessment of participants and summarizes their development at a
particular time.
• Formative assessment
• Diagnostic assessment
• Summative assessment
• Placement assessment

243. Which type of assessment determines the student’s prerequisite skills to begin instruction
• Placement
• Summative
• Formative
• Diagnostic

244. In reliability, Consistency over period of time is known as:


• Stability
• Equivalent
• Internal consistency
• Objectivity

245. If you have been declared as a master of content in a specific domain after a test, then
your result has been assessed according to:
• Summative
• Criterion reference
• Formative
• Norm reference

246. Test retest is the method to estimate:


• Reliability
• accuracy

247. Degree to which a test measures an intended content area is called-------


• Reliability
• Face validity
• Content validity
• Construct validity
248. Which type of validity involves testing a group of subjects for a certain construct, and then
comparing them with results obtained at some point in the future?
• Content validity
• Criterion validity
• Construct validity
• Predictive validity

249. A type of assessment done at the end of a course where the focus is on the learners
receiving a grade for their work rather than receiving feedback on their progress is called -------.
• Formative
• Placement
• Diagnostic
• summative

250. Which type of assessment measures entry level performance of student?


• Formative
• Summative
• Diagnostic
• Placement

251. Which item format requires the students to structure a rather long written
response of up to several paragraphs?
• Restricted
• Short answer items
• Essay
• Both A and B

252. Guessing is eliminated in:


• Completion items
• Multiple choice items
• Fill in the blanks
• Essay type items

253. Scoring is a time-consuming process in marking of:


• Multiple choice items
• Matching columns
• Essay type answers
• Fill in the blanks

254. The pattern of students' errors and misconceptions that emerges after reading a few
papers helps to develop a scoring rubric for extended response items, which is then
applied to score all papers. This describes:
• Analytic scoring
• Holistic scoring
• Partial credit scoring
• Complete credit scoring

255. Analytical rubric works best for:


• Restricted response items
• Matching items
• Extended response items
• Fill in the blanks

256. Which of the following is best to measure higher order learning skills?
• Matching items
• Objective test items
• Completion items
• Subjective items

257. Which of the following type of test emphasized on reading and speaking?
• Speed
• Verbal
• Power
• Objective

258. The Scores of students in papers are:


• Test
• Measurement
• Evaluation
• Assessment
259. Which of the following method is a measure of internal consistency?
• Split half
• Equivalent form
• Cronbach alpha
• Test-retest

260. A test designed to measure learning with a liberal time allowance that allows each student
to attempt each item is a:
• Power
• Supply
• Achievement
• Speed

261. Student judges information based upon standards, criteria, values and opinions is
known as:
• Analysis
• Application
• Knowledge
• Evaluation

262. The cognitive level where components of a whole are broken into parts to know
its organizational structure is called:
• Comprehension
• Application
• Analysis
• Synthesis

263. In SOLO taxonomy, theorize, generalize, hypothesize, reflect, generate are key
words for learning level:
• Pre-structural
• Uni-structural
• Multi-structural
• Extended abstract

264. Identify the lowest learning level of Bloom’s cognitive domain


• Knowledge
• Comprehension
• Recall
• Application

265. Identify the highest learning level of Bloom’s cognitive domain.


• Application
• Analysis
• Synthesis
• Evaluation

266. Which evidence of validity can be confirmed by correlation coefficient?


• Content validity
• Criterion validity
• Construct validity
• Predictive validity

267. How well a test measures up to its claims, is known as:


• Content validity
• Criterion validity
• Construct validity
• Predictive validity

268. Which of the following is NOT an evidence of validity of a test?


• Content
• Construct
• Cronbach
• Criterion

269. Monitoring the outcomes with reference to the objectives is called:


• Assessment
• Measurement
• Test
• Evaluation
270. In SOLO taxonomy; hypothesize, reflect, generalize are key words for learning
level:
• Pre-structured
• Uni-structured
• Multi-structured
• Extended abstract

271. Identify the lowest learning level of SOLO taxonomy.


• Pre-structured
• Uni-structured
• Relational
• Extended abstract

272. Knowing/ memorizing and recalling is concerned with:


• Comprehension
• Knowledge
• Application
• Evaluation

273. In revised Bloom taxonomy: build, identify, develop are key words for level:
• Remembering
• Understanding
• Applying
• Analyzing

274. In DOK taxonomy: graph, separate, narrate are key words for learning level:
• Recall
• Skill/ concept
• Strategic thinking
• Extended thinking

275. The domain in which intellectual skills are reflected:


• Cognitive
• Affective
• Psychomotor
• None of them
276. Which teaching method is based on John Dewey’s philosophy of pragmatism.
• Heuristic method
• Laboratory method
• Project method
• Problem solving method

277. Pakistan and ----- have failed by large margins to meet 100% primary education.
• China
• Maldives
• Afghanistan
• Sri Lanka

278. Equity means ----- of treatment


• Sameness
• Fairness
• Absence

279. Qualitative approach challenges ------.


• Subject's perspective
• Facts
• Objectivity
• Issues

280. ----- variable(s) should be changed at a time when conducting a single-case


design.
• 4
• 3
• 2
• 1

281. Which of the following are three domains of Blooms’ taxonomy?


• Cognitive, affective, psychomotor
• Cognitive, relational, psychomotor
• Psychomotor, affective, strategic thinking
• Affective, cognitive, relational
282. Holistic rubric is good for evaluating -----.
• A single person tests
• Overall performance on a task
• Specific performance on a task
• None of the above

283. How many columns do matching items have?


• One
• Two
• Three
• Four

284. Test item is very difficult when value of difficulty level is less than?
• 0.8
• 0.7
• 0.3
• 0.6

285. Identify the value of item difficulty for easiest test item.
• 0.2
• 0.4
• 0.5
• 0.8


287. Identify the acceptable item from given values of item discrimination.
• 0.17
• 0.26
• 0.36
• 0.48
288. Which of the following types of items is not a selection item?
• True/ false
• M.C.Q
• Matching
• Short answer

289. Scoring is a time consuming process in marking of:


• Multiple choice items
• Matching columns
• Essay type answers
• Fill in the blanks

290. Identify the excellent item from given values of item discrimination
• 0.29
• 0.24
• 0.38
• 0.50
