

Herald NAMSCA 1, 2018 Ung L. Ling, Tammie C. Saibin, Nasrah Naharu, Jane Labadin,
Norazila A. Aziz
__________________________________________________________________________________________________

AN EVALUATION TOOL TO MEASURE COMPUTATIONAL THINKING SKILLS: PILOT INVESTIGATION

Ung L. Ling1*, Tammie C. Saibin2, Nasrah Naharu3, Jane Labadin4, Norazila A. Aziz5
*1, 2, 3 Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA (UiTM), 88999
Kota Kinabalu, Sabah, Malaysia
4, 5 Institute of Social Informatics and Technological Innovations (ISITI), Universiti Malaysia
Sarawak (UNIMAS), 94300 Kota Samarahan, Sarawak, Malaysia

Abstract. This pilot study presents an evaluation rubric to measure Computational Thinking (CT)
skills at the primary school level. The rubric is designed based on criteria drawn from prominent CT
trainers and researchers, and it produces measurable CT evidence from students' performed tasks and
written tests. The experiment kit was sent to thirty schools, and nine Primary 1 teachers responded to
this pilot investigation. They recorded and reported the learning outcomes of 359 students using the
pre-designed evaluation as pre- and post-tests over a semester of teaching and learning activities.
Our results demonstrate a positive attitude from the teachers towards the proposed evaluation tool,
which successfully measured the CT skill learning outcomes. Teachers' concerns on CT skill
evaluation were also obtained from this investigation.
Keywords: computational thinking (CT), teaching and learning (TL), primary school student,
evaluation rubric, curriculum.

Introduction. The Prime Minister of Malaysia has announced the integration of Computational Thinking (CT)
skills into all subjects, starting in 2017 with Primary 1 students 10. With this announcement, there is an urgent need to
equip the schools, especially the teachers, for the teaching and learning (TL) of CT skills in their daily lessons. The
newly revised curricula have been uploaded to the Ministry of Education (MOE) website, which describes the revised
curriculum as incorporating content improvements informed by global trends and international benchmarking. The
teaching and learning pedagogy will concentrate on learning in depth, contextually and effectively, and the development
of student learning is assessed on an ongoing basis 45.
CT skills expose students to logical thinking, problem solving and lifelong learning skills, as well
as Information and Communication Technology (ICT) skills 3, 13, 69. Previous studies have shown that CT skills can
improve an individual's higher-order thinking skills, a crucial element for surviving in the 21st century, especially
for excelling in the workforce 13, 35, 38, 48, 61. Despite the numerous studies embarked on to understand CT skills, there
are still many concerns that need to be addressed to ensure an effective CT teaching and learning process 22, 62,
especially for Malaysia, a beginner in this field. The key question in this study is: How can teachers effectively assess
the outcome of their teaching and learning (TL)?
CT Concepts. Wing 67 stressed that CT is as important as reading, writing and counting and should be
included as part of the school curriculum. Researchers have reported that CT enhances one's higher-order thinking skills
and improves problem-solving skills 23, 31. Experiments have shown that learners exposed to CT scored better not only in
computing lessons, but also in mathematics, languages and sciences, compared to those who were not 1, 6, 28. CT is
claimed to be a must-have skill for living and working in today's challenging world 63. It is defined by Cuny 16 as "the
thought processes involved in formulating problems and their solutions so that the solutions are represented in a form
that can be effectively carried out by an information-processing agent". However, the adaptation of CT concepts, and
their delivery by teachers in everyday school practice, will not be easy 26 and will require thorough study to determine
the most effective ways of teaching and learning CT 2, 22, 62, 69.
Despite all these studies, CT still lacks a common definition, and the skill-set elements differ across
perspectives, possibly owing to the premature investigations carried out so far 24, 25, 57, 65. The prominent operational
definitions of CT are by the Computer Science Teachers Association (CSTA) and the International Society for Technology
in Education (ISTE), which define CT skills as the human ability to formulate solutions in a computational manner such
that the solution can be carried out by a computer 15. Table 1 shows different definitions of CT from different
institutions, organizations and bodies. Various publications have also proposed different CT definitions for different
fields 4, 7, 13, 19, 53, 57, 67. CT is described as comprising a variety of skills, namely logical, algorithmic, modeling,
simulation and abstraction thinking 37, 51, 53, 54, 66, 68. 57 and 17 suggested that the core concepts of CT are
algorithmic thinking, evaluation, decomposition, abstraction and generalization, while continuing work on Scratch 12 has
developed a definition of CT involving three key dimensions: computational concepts, computational practices and
computational perspectives. There are also reports investigating the relevance of CT to Mathematics and Science 44, 50,
55, 56, 58. According to 21, CT has shared a long history with computer science since the 1950s, sharing the same skill
components such as algorithmic thinking, conditional logic and modeling, but CT is not all about computer
programming 37, 52, 65, 67. Some researchers have associated CT skills with attitudes such as


communication, perseverance, teamwork and persistence 6, 7, 20.


CT Curriculum. Many countries have taken steps to bring the CT concept into their respective curricula. It is
found that many countries do not use the term CT in curriculum documents, using ICT and technology instead. The concept
of CT has been applied to the curriculum especially in programming lessons, and CT concepts have been introduced to
students through either formal or informal classes. CT skills have also been introduced through various learning
techniques in science and mathematics subjects, programming, robotics, game programming and computing unplugged. Among
the tools used to carry out the CT teaching and learning process are Scratch, Alice, Logo programming, block games and
Bee-Bot. While many tools are used to assess the learning outcome, most are tested on programming/coding and hands-on
activities 9, 27, 29, 32-34, 52.
The CT concept has been implemented in the K-12 curriculum in the United States 3. It is introduced to students at
the secondary school level through computer science subjects, and a framework has been specially designed to introduce
CT skills via coding. Within the framework, 6 practices have been identified as the basis for CT. Although early
definitions of CT referred to the use of a computer, CT has come to be regarded as a skill that can be mastered and
practiced without computers 2. New Zealand is expanding its technology subject by injecting CT concepts into the core
curriculum for all children, from Year 1 to Year 10 of school, starting in 2018 14. The government is pushing for a new
curriculum based on six themes, namely algorithms, data representation, digital applications, digital devices and
infrastructure, humans and computers, and programming. In Australia, CT skills are introduced in a curriculum area
called Digital Technologies, and Australian teachers use computer science platforms to introduce CT skills. Another
popular approach, CS Unplugged 18, introduces computing without using technology.
The MOE has been working closely with the Malaysia Digital Economy Corporation (MDEC) to design and
prepare a new curriculum 10 to support the TL of CT. Teachers have been undergoing training on how to integrate
computational thinking into their daily classroom teaching and learning practices. Six institutions have been appointed
(Universiti Teknologi Malaysia - UTM, Universiti Sains Malaysia - USM, University of Malaya - UM, Universiti Teknologi
MARA - UiTM, Universiti Malaysia Sarawak - UNIMAS and Universiti Malaysia Sabah - UMS) to carry out training to prepare
in-service teachers in understanding CT and TL approaches. A training module designed and produced by MDEC is used as
the reference in conducting the training. The module lists six main CT concepts, namely abstraction, pattern,
decomposition, algorithm, logical reasoning and evaluation; other concepts such as parallelization, simulation, data
representation, data analysis, data collection and automation are listed but not emphasized. CT skills are integrated
across all subjects in Malaysia, including languages and arts. In this paper we present an evaluation rubric to assist
primary school teachers in assessing CT components after a lesson. A rubric was selected mainly because it provides a
more objective method of scoring and is more student-centred 32, 36, 46. The evaluation rubric is designed to tailor the
evaluation of CT skills across Malaysian primary school subjects, focusing in particular on the CT components involved
in completing an educational activity. In addition, the designed rubric is tested for evaluating CT components based on
written tests.
Research aim and methodology. The aim of the study is to determine students' CT skill learning outcomes after
completion of a lesson by addressing the following research questions:
a. Is there any difference in the learning outcome before and after a lesson is conducted?
b. Is the evaluation rubric able to measure students' CT skills after a lesson?
c. What are the comments from the teachers about the evaluation rubric?
The objectives of this research are to test the designed evaluation rubric for measuring CT skills acquired after a
lesson and to investigate students' learning outcomes regarding CT skills.
Plenty of studies have been carried out on the teaching and learning of CT 11, 30, 49, 64. To carry out
this investigation, the ISTE/CSTA standards, the Next Generation Science Standards (NGSS), CAS, the MDEC CT training
module, Google for Education's CT operational resources, and the Malaysian 2017 revised curriculum were consulted. A
preliminary investigation was carried out in 2016 into Malaysian teachers' perception of the newly introduced CT
concepts in the revised curriculum 62. One outcome of that investigation was that teachers want an easy, trouble-free
pedagogy for conducting CT TL, including its assessment method. Therefore, a holistic assessment rubric was designed for
this study. The performance standard applied in the rubric is based on the reporting standards in the Malaysian
curriculum 5, 39, 42. The CT skill set was chosen based on the concepts in the teaching manual prepared by MDEC.


Table 1: Definitions of CT skill sets by CAS Barefoot, CSTA, Google for Education, MDEC and K-12, US

Computing at School Barefoot (CAS): Computational Thinking is the thought processes involved in formulating a problem and expressing its solution in a way that a computer, human or machine can effectively carry out 8.

Computer Science Teachers Association (CSTA): Computational thinking is a problem-solving process that includes: formulating problems logically; organizing and analyzing data; representing data; identifying, analyzing, and implementing possible solutions; and generalizing and transferring this problem-solving process to a wide variety of problems 18.

Google for Education: CT is a problem-solving process that includes a number of characteristics, such as logically ordering and analyzing data and creating solutions using a series of ordered steps (or algorithms), and dispositions, such as the ability to confidently deal with complexity and open-ended problems 17.

Malaysia Digital Economy Corporation (MDEC): "CT is the ability to dissect problems and formulate solutions by drawing from concepts in computer science…" 59. Comprises 6 main concepts: decompose, pattern, abstraction, algorithm, logical reasoning and evaluation.

K-12, US: CT elements used to assess its development include: abstractions and pattern generalizations (including models and simulations), systematic processing of information, symbol systems and representations, algorithmic notions of flow of control, structured problem decomposition (modularizing), iterative, recursive, and parallel thinking, conditional logic, efficiency and performance constraints, debugging and systematic error detection 30.

Participants. Written permission was obtained from the Malaysian Ministry of Education (MOE), the State
Education Department (Jabatan Pendidikan Negeri - JPN) and the school principals. Teachers from 10 schools across
Malaysia (Sabah, Sarawak, Selangor, Pulau Pinang, Johor and Kelantan) were randomly picked for this study. The
participants were teachers who teach Primary 1 students and volunteered to take part in this investigation. The
participating teachers, who teach different subjects, executed the experiment with their students. They conducted their
TL activities in the classroom according to their teaching plan over a semester, approximately 20 weeks. Each teacher
carried out a pre-test before any lesson commenced, then set another post-test after completing a lesson to measure the
results of the TL conducted over the semester. The method of assessment was not set in the curriculum, so the teachers
were free to decide the type of assessment to be carried out.
Data Collection Instruments. The data collection instruments consisted of several items: a set of information on
CT skills prepared by referring to available online resources, mainly from Google for Education and My Digital Maker by
MDEC; the instructions for the experiments; and the CT evaluation rubric (in Malay and English versions). The CT skills
selected in the rubric are based on the CT skills mentioned in the MDEC teaching module for trainers 43. The designed
rubric has 6 performance indicators, namely 1 - Very Limited, 2 - Limited, 3 - Fair, 4 - Moderate, 5 - Good and
6 - Excellent, designed based on the reporting standard in the MOE template 5, 39-41. Each performance indicator is
described with the expected CT learning outcome. As part of this study, a feedback form was shared (via post) and the
teachers were required to fill in the form each time they conducted an assessment. The form required participating
teachers to record the following information: the number of students, a short summary of the lesson, the CT concepts
intended to be covered (a list of possible selections was provided), the mode of assessment (selected from the choices
oral test, written test, games, presentation, role-play and project, with an "other" option to state any other
assessment method), and a column for any other comments or observations they wished to record (free text). To acquire
teachers' perception of the proposed rubric, 5 Likert-type scale questions were asked in the form, adapted and modified
from multiple resources 25, 36, 46. An open-ended section was provided for the teachers to state their concerns on the
proposed rubric as a tool to assess their TL outcomes.
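The 6-point performance scale above maps naturally to a small lookup structure. The sketch below encodes only the band names given in this paper; the per-indicator learning-outcome descriptors from the MOE template are not reproduced, and the helper name `band_label` is hypothetical.

```python
# Hypothetical encoding of the rubric's 6-point performance scale.
# Band names come from the paper; the full MOE descriptors are omitted.
PERFORMANCE_BANDS = {
    1: "Very Limited",
    2: "Limited",
    3: "Fair",
    4: "Moderate",
    5: "Good",
    6: "Excellent",
}

def band_label(score: int) -> str:
    """Return the rubric band name for a score of 1-6."""
    if score not in PERFORMANCE_BANDS:
        raise ValueError(f"score must be 1-6, got {score}")
    return PERFORMANCE_BANDS[score]
```

A teacher's recorded score for a task (e.g. 4) would thus be reported as "Moderate" on the feedback form.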
Data Analysis. A descriptive quantitative method was applied to analyze the data collected during the pre-test
and the post-test. The data were analyzed using SPSS. Cronbach's alpha was applied to measure the internal consistency
of the questionnaire items in the survey form used to acquire teachers' perception of the proposed rubric 60.
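The study's reliability analysis was done in SPSS; as a minimal sketch of the same computation, Cronbach's alpha for a respondents-by-items matrix can be written directly from the standard sample-variance formula. The response matrix below is illustrative only, not the study's raw data.

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for a list of respondent rows, one score per item."""
    k = len(scores[0])                      # number of items
    items = list(zip(*scores))              # transpose: one tuple per item
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 5-point Likert responses from 9 teachers on 5 statements
# (illustrative numbers, not the study's data).
responses = [
    [5, 5, 5, 5, 5],
    [4, 4, 4, 4, 4],
    [4, 3, 4, 4, 3],
    [5, 4, 5, 4, 4],
    [3, 2, 2, 3, 2],
    [4, 4, 4, 3, 4],
    [5, 5, 4, 5, 4],
    [4, 3, 4, 4, 3],
    [3, 3, 3, 2, 3],
]
alpha = cronbach_alpha(responses)  # high alpha -> internally consistent items
```

Because the illustrative rows vary together across items, the resulting alpha is high, mirroring the pattern reported in Table 9.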
Results. A total of 9 teachers participated in the experiment voluntarily, with 359 students involved in the TL
activities. The assessment methods carried out by the teachers were written tests, games, singing and calculation
exercises. Table 2 shows the number of students involved in the respective assessments.

Table 2: Assessments/tests information

Teacher | Number of students | Subject | Type of assessment (pre-test / post-test) | Assessed CT skills
1 | 24 | Malay Language | Singing / Written test | Pattern recognition
2 | 27 | Malay Language | Written test | Abstraction, Pattern recognition
3 | 41 | Science | Written test | Logical reasoning, Pattern recognition
4 | 41 | Science | Written test | Logical reasoning, Pattern recognition
5 | 36 | Science | Written test | Pattern recognition
6 | 36 | Science | Written test | Logical reasoning, Pattern recognition
7 | 41 | Mathematics | Calculation exercise | Logical reasoning, Decomposition
8 | 36 | Mathematics | Written test | Algorithm, Pattern recognition
9 | 40 + 37 | Mathematics | Game | Logical reasoning, Decomposition

1st research question: Is there any difference in the learning outcome before and after a lesson was conducted?
Table 3: Pre-Test by Subject

Subject | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 | Score 6
Mathematics | 29 (18.8%) | 38 (24.7%) | 44 (28.6%) | 26 (16.9%) | 14 (9.1%) | 3 (1.9%)
Science | 41 (26.6%) | 35 (22.7%) | 50 (32.5%) | 13 (8.4%) | 10 (6.5%) | 5 (3.2%)
Malay Language | 7 (13.7%) | 22 (43.1%) | 9 (17.6%) | 8 (15.7%) | 5 (9.8%) | 0 (0.0%)

Table 3 shows the marks for the pre-test by subject. For Mathematics, 28.6% (44) of the students attained a score
of 3, followed by a score of 2 (24.7%) and a score of 1 (18.8%). Only 9.1% and 1.9% of the students attained the higher
scores of 5 and 6 for this subject. For Science, 32.5% of the students attained a score of 3, followed by a score of 1
(26.6%) and a score of 2 (22.7%). Only 6.5% and 3.2% of the students attained scores of 5 and 6 respectively. For Malay
Language, 43.1% of the students attained a score of 2, followed by a score of 3 (17.6%), a score of 4 (15.7%), a score
of 1 (13.7%) and a score of 5 (9.8%).
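The percentage rows in the tables can be recomputed directly from the raw counts. A quick check for the Mathematics pre-test row (counts taken from Table 3; the 154 total matches the Mathematics class sizes in Table 2):

```python
# Mathematics pre-test counts for scores 1-6, from Table 3.
math_counts = [29, 38, 44, 26, 14, 3]
total = sum(math_counts)  # 154 Mathematics students (41 + 36 + 40 + 37 in Table 2)
pcts = [round(100 * c / total, 1) for c in math_counts]
# pcts == [18.8, 24.7, 28.6, 16.9, 9.1, 1.9], matching the table's percentages
```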
Table 4: Post-Test by Subject

Subject | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 | Score 6
Mathematics | 1 (0.6%) | 7 (4.5%) | 33 (21.4%) | 63 (40.9%) | 40 (26.0%) | 10 (6.5%)
Science | 3 (1.9%) | 10 (6.5%) | 35 (22.7%) | 48 (31.2%) | 39 (25.3%) | 19 (12.3%)
Malay Language | 0 (0.0%) | 3 (5.9%) | 18 (35.3%) | 23 (45.1%) | 7 (13.7%) | 0 (0.0%)

Table 4 shows the marks for the post-test by subject. For Mathematics, 40.9% of the students managed to get a
score of 4, followed by a score of 5 (26.0%), a score of 3 (21.4%), a score of 6 (6.5%) and a score of 2 (4.5%). It was
noted that the modal score attained by the students shifted upward (from 3 to 4). For Science, 31.2% of the students
attained a score of 4, followed by a score of 5 (25.3%), a score of 3 (22.7%), a score of 6 (12.3%), and scores of 2
and 1 with 6.5% and 1.9% respectively. The number of students attaining higher scores for Science also increased (the
modal score shifted from 3 to 4). For Malay Language, 45.1% of the students attained a score of 4, followed by a score
of 3 (35.3%), a score of 5 (13.7%) and a score of 2 (5.9%). For this subject, the modal score increased markedly from 2
in the pre-test to 4 in the post-test.


Table 5: Mean Scores by Subject

Subject Pre-Test Post-Test


Mathematics 2.79 4.06
Science 2.55 4.08
Malay Language 2.65 3.67

Table 5 presents the mean scores from pre-test to post-test by subject. For Mathematics, the mean score of 2.79 during
the pre-test increased to 4.06 in the post-test. For Science, the mean score increased from 2.55 in the pre-test to 4.08
in the post-test. For Malay Language, the mean score of 2.65 during the pre-test increased to 3.67 in the post-test.
Overall, we observed that the mean scores for all subjects increased, and more students attained higher marks in the
post-test than in the pre-test. The finding suggests that the students managed to learn and understand the respective
subjects after TL commenced, and that the proposed rubric successfully assessed students' performance in the assessed
subjects.
2nd research question: Is the evaluation rubric able to measure students' CT skills after a lesson?
Table 6: Pre-Test by CT concepts

CT concept | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 | Score 6
Logical Reasoning | 53 (27.2%) | 51 (26.2%) | 49 (25.1%) | 23 (11.8%) | 14 (7.2%) | 5 (2.6%)
Algorithm | 2 (5.6%) | 6 (16.7%) | 11 (30.6%) | 12 (33.3%) | 5 (13.9%) | 0 (0.0%)
Pattern Recognition | 18 (17.6%) | 28 (27.5%) | 39 (38.2%) | 8 (7.8%) | 6 (5.9%) | 3 (2.9%)
Abstraction | 4 (15.4%) | 10 (38.5%) | 4 (15.4%) | 4 (15.4%) | 4 (15.4%) | 0 (0.0%)

Table 6 shows the scores during the pre-test by the CT skills involved. Four of the six CT skills were assessed in this
research. For logical reasoning, 27.2% of the students attained the lowest score of 1, followed by a score of 2 (26.2%),
a score of 3 (25.1%), a score of 4 (11.8%), a score of 5 (7.2%) and a score of 6 (2.6%). For the CT skill involving
algorithms, 33.3% of the students attained a score of 4, followed by a score of 3 (30.6%), a score of 2 (16.7%), a score
of 5 (13.9%) and a score of 1 (5.6%). For pattern recognition, 38.2% of the students attained a score of 3, followed by
a score of 2 (27.5%), a score of 1 (17.6%), a score of 4 (7.8%), a score of 5 (5.9%) and a score of 6 (2.9%). For
abstraction, 38.5% of the students attained a score of 2. No students attained a score of 6.

Table 7: Post-Test by CT concepts

CT concept | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 | Score 6
Logical Reasoning | 2 (1.0%) | 12 (6.2%) | 44 (22.6%) | 75 (38.5%) | 46 (23.6%) | 16 (8.2%)
Algorithm | 0 (0.0%) | 1 (2.8%) | 4 (11.1%) | 16 (44.4%) | 13 (36.1%) | 2 (5.6%)
Pattern Recognition | 2 (2.0%) | 4 (3.9%) | 28 (27.5%) | 32 (31.4%) | 25 (24.5%) | 11 (10.8%)
Abstraction | 0 (0.0%) | 3 (11.5%) | 10 (38.5%) | 11 (42.3%) | 2 (7.7%) | 0 (0.0%)

Table 7 presents the marks attained by students in the post-test according to the CT skills involved. For logical
reasoning, 38.5% of the students attained a score of 4, followed by a score of 5 (23.6%), a score of 3 (22.6%), a score
of 6 (8.2%), a score of 2 (6.2%) and a score of 1 (1.0%). For algorithms, 44.4% of the students attained a score of 4,
followed by a score of 5 (36.1%), a score of 3 (11.1%), a score of 6 (5.6%) and a score of 2 (2.8%). For pattern
recognition, 31.4% of the students attained a score of 4, followed by a score of 3 (27.5%), a score of 5 (24.5%), a
score of 6 (10.8%), a score of 2 (3.9%) and a score of 1 (2.0%). For abstraction, 42.3% of the students attained a score
of 4, followed by a score of 3 (38.5%), a score of 2 (11.5%) and a score of 5 (7.7%). No students attained scores of 1
or 6.

Table 8: Mean Scores by Computational Thinking Skills

CT concepts Pre-test Post-test Difference

Logical Reasoning 2.53 4.02 + 1.49

Algorithm 3.33 4.31 + 0.98

Pattern Recognition 2.66 4.05 + 1.39

Abstraction 2.77 3.46 + 0.69

Table 8 presents the mean scores achieved by the students according to the CT skills assessed. Overall, the mean scores
for all assessed CT skills increased. The CT skill with the highest difference from pre-test to post-test is logical
reasoning, which increased by 1.49, followed by pattern recognition (+1.39), algorithms (+0.98) and abstraction (+0.69).
The finding suggests that students were able to learn and acquire CT skills after TL of the subjects was conducted, and
the proposed rubric managed to record students' CT understanding before and after a lesson.
3rd research question: What are the comments from the teachers about the evaluation rubric?

Table 9: Constructs reliability

Cronbach's Alpha N of Items

.936 5

The teachers were then asked to answer 5 statements regarding their perception of the evaluation rubric. To test
whether the statements measure the same underlying concept, Cronbach's alpha was used. 47 provided guidance on
interpreting the reliability coefficient, stating that a value of 0.70 is sufficient for the early stages of a research
project. As shown in Table 9, the questionnaire achieved excellent reliability with α = 0.936. Therefore, the questions
in the survey form are reliable and the questionnaire is internally consistent.
Table 10: Summary of teachers' perception on the proposed rubric

Statement | Strongly disagree | Disagree | Not sure | Agree | Strongly agree
I managed to measure the achievement of students' CT skills well. | 0 | 0 | 1 | 5 | 3
I did not face any difficulty in measuring students' CT learning outcome. | 0 | 2 | 1 | 3 | 3
This tool/rubric is suitable for measuring CT learning outcome and achievement in this subject. | 0 | 1 | 0 | 5 | 3
This assessment tool/rubric can help me to plan the next student teaching and learning strategy. | 0 | 1 | 1 | 4 | 3
I will continue using this assessment rubric for future CT skills assessment. | 0 | 1 | 2 | 4 | 2

Table 10 shows the teachers' perception of the evaluation rubric. 5 of the respondents agreed and 3 strongly
agreed that they could measure the students' CT skills using the newly introduced rubric. On the other hand, 2 of the
respondents had difficulties using the evaluation rubric to assess students' performance. Despite these difficulties, 8
of the respondents agreed or strongly agreed that the newly introduced rubric was suitable for assessing students' CT
skills in their respective subjects. The majority of the teachers also believed that the rubric can help them in
planning their TL strategy. 2 of the respondents were not sure whether they would use the evaluation rubric in future,
and 1 respondent disagreed with using it. Nevertheless, 6 respondents agreed or strongly agreed that they would use the
rubric as a method to assess students' CT skills in future. In the form, the teachers were also required to state their
concerns on the integration of the proposed rubric into their daily TL practices. Only 7 of the 9 participating
teachers stated their concerns. 2 teachers were concerned about the duration of assessment required if a bigger group
of students is involved. 2 teachers were concerned about the usage of the proposed rubric in written tests; these were
the teachers who conducted non-written assessments. 2 respondents stated that they will use the rubric if measuring CT
skills is made compulsory in their assessment


practices. Another 3 teachers stated that they were satisfied with the rubric but stressed that training should be
provided to teachers before application of the rubric commences.
Conclusion. The present study was designed to examine the usability of the proposed rubric as an assessment tool
for measuring CT TL outcomes. Positive outcomes were found in terms of the rubric's usability in measuring CT TL
outcomes, even though 4 different assessment methods were carried out.
The results showed that most of the students developed a better understanding of CT skills after the TL
activities commenced, with the majority improving from a very limited understanding of CT concepts to a relatively
uniform distribution of understanding across levels 4, 5 and 6. The teachers indicated the flexibility of the proposed
rubric across the 4 different assessment methods. The proposed rubric also proved practical: 7 out of 9 participating
teachers conducted written tests as assessment, and 4 teachers conducted assessments with more than 30 students in a
class.
This small-scale study has important implications for incorporating CT in Malaysian classrooms: while
researchers have suggested methods to evaluate the learning outcomes of CT skills, most of those methods are applicable
only to programming environments 8, 12, 33, 56. As CT skills are to be integrated from Primary 1 and continued to
secondary level, teachers will need a mechanism to measure TL outcomes that aligns with the existing curriculum design,
and this rubric might be one of the tools applicable in the Malaysian education environment. Future studies will include
a bigger number of participants to examine teachers' use of the proposed rubric.

References

1 Sheikh Iqbal Ahamed, Dennis Brylow, Rong Ge, Praveen Madiraju, Stephen J Merrill, Craig A Struble, and
James P Early, 'Computational Thinking for the Sciences: A Three Day Workshop for High School Science Teachers', in
Proceedings of the 41st ACM Technical Symposium on Computer Science Education (ACM, 2010), pp. 42-46.
2 Aman Yadav, Chris Stephenson, and Hai Hong, 'Computational Thinking for Teacher Education', Communications of
the ACM (2017).
3 Aman Yadav, Jon Good, Joke Voogt, and Petra Fisser, 'Computational Thinking as an Emerging Competence Domain',
in Technical and Vocational Education and Training: Issues, Concerns and Prospects, ed. by M. Mulder, Vol. 23
(Switzerland: Springer International Publishing, 2017).
4 Gabriella Anton, 'Power of Play: Exploring Computational Thinking through Game Design', The Velvet Light
Trap - A Critical Journal of Film and Television (2013), 74-75.
5 Bahagian Pembangunan Kurikulum, Kementerian Pendidikan Malaysia, TMK Dalam Kurikulum Standard Sekolah Rendah
(Semakan 2017), 2017.
6 David Barr, John Harrison, and Leslie Conery, 'Computational Thinking: A Digital Age Skill for Everyone',
Learning & Leading with Technology, 38 (2011), 20-23.
7 Valerie Barr, and Chris Stephenson, 'Bringing Computational Thinking to K-12: What Is Involved and What Is
the Role of the Computer Science Education Community?', ACM Inroads, 2 (2011), 48-54.
8 Ashok Basawapatna, Kyu Han Koh, Alexander Repenning, David C. Webb, and Krista Sekeres Marshall,
'Recognizing Computational Thinking Patterns', in Proceedings of the 42nd ACM technical symposium on Computer
science education (Dallas, TX, USA: ACM, 2011), pp. 245-50.
9 Ashok R Basawapatna, Kyu Han Koh, and Alexander Repenning, 'Using Scalable Game Design to Teach
Computer Science from Middle School to Graduate School', in Proceedings of the fifteenth annual conference on
Innovation and technology in computer science educationACM, 2010), pp. 224-28.
10 BERNAMA, 'Pemikiran Komputasional, Sains Komputer Akan Diajar Di Sekolah Tahun Depan', Utusan
ONLINE, 11 August 2016.
11 Matt Bower, and Katrina Falkner, 'Computational Thinking, the Notional Machine, Pre-Service Teachers, and
Research Opportunities', in Proceedings of the 17th Australasian Computing Education Conference (ACE 2015), 2015,
p. 30.
12 Karen Brennan, and Mitchel Resnick, 'New Frameworks for Studying and Assessing the Development of
Computational Thinking', in Proceedings of the 2012 annual meeting of the American Educational Research
Association (Vancouver, Canada, 2012).
13 Sheryl Buckley, 'The Role of Computational Thinking and Critical Thinking in Problem Solving in a Learning
Environment', (Kidmore End: Academic Conferences International Limited, 2012), pp. 63-XI.
14 Simon Collins, 'All Kiwi Kids to Learn 'Computational Thinking' for 10 Years', NZ Herald, 28 June 2017.
15 Computer Science Teachers Association (CSTA), 'CSTA: The Voice of K–12 Computer Science Education and Its
Educators', in CSTA Voice: Computational Thinking (2016).
16 Jan Cuny, Larry Snyder, and Jeannette M. Wing, 'Demystifying Computational Thinking for Non-Computer
Scientists', unpublished manuscript, http://www.cs.cmu.edu/~CompThink/resources/TheLinkWing.pdf (2010).
17 Paul Curzon, Mark Dorling, Thomas Ng, Cynthia Selby, and John Woollard, 'Developing Computational
Thinking in the Classroom: A Framework' (2014).
18 Paul Curzon, Peter W. McOwan, Nicola Plant, and Laura R. Meagher, 'Introducing Teachers to Computational
Thinking Using Unplugged Storytelling', in Proceedings of the 9th Workshop in Primary and Secondary Computing
Education (Berlin, Germany: ACM, 2014), pp. 89-92.
19 Betul C. Czerkawski, and Eugene W. Lyman, 'Exploring Issues About Computational Thinking in Higher
Education', TechTrends, 59 (2015), 57-65.
20 Chris Dede, Punya Mishra, and Joke Voogt, 'Working Group 6: Advancing Computational Thinking in 21st
Century Learning', (2013).
21 Peter J. Denning, 'The Profession of It: Beyond Computational Thinking', Commun. ACM, 52 (2009), 28-30.
22 Caitlin Duncan, Tim Bell, and James Atlas, 'What Do the Teachers Think?: Introducing Computational Thinking
in the Primary School Curriculum', in Proceedings of the Nineteenth Australasian Computing Education Conference
(Geelong, VIC, Australia: ACM, 2017), pp. 65-74.
23 G. Fessakis, E. Gouli, and E. Mavroudi, 'Problem Solving by 5-6 Years Old Kindergarten Children in a
Computer Programming Environment: A Case Study', Computers & Education, 63 (2013), 87-97.
24 Filiz Kalelioğlu, Yasemin Gülbahar, and Volkan Kukul, 'A Framework for Computational Thinking
Based on a Systematic Research Review', Baltic J. Modern Computing, 4 (2016), 583-96.
25 Takam Djambong, and Viktor Freiman, 'Task-Based Assessment of Students' Computational Thinking Skills
Developed through Visual Programming or Tangible Coding Environments', in 13th International Conference on
Cognition and Exploratory Learning in Digital Age (CELDA 2016), 2016, p. 13.
26 Robert Glaser, 'Education and Thinking: The Role of Knowledge', American psychologist, 39 (1984), 93.
27 Lindsey Ann Gouws, Karen Bradshaw, and Peter Wentworth, 'Computational Thinking in Educational Activities:
An Evaluation of the Educational Game Light-Bot', in Proceedings of the 18th ACM conference on Innovation and
technology in computer science education (Canterbury, England, UK: ACM, 2013), pp. 10-15.
28 Shuchi Grover, and Roy Pea, 'Computational Thinking in K–12 a Review of the State of the Field', Educational
Researcher, 42 (2013), 38-43.
29 Andri Ioannidou, Vicki Bennett, Alexander Repenning, Kyu Han Koh, and Ashok Basawapatna, 'Computational
Thinking Patterns', Online Submission (2011).
30 Maya Israel, Jamie N. Pearson, Tanya Tapia, Quentin M. Wherfel, and George Reese, 'Supporting All Learners in
School-Wide Computational Thinking: A cross-Case Qualitative Analysis', Computers & Education, 82 (2015), 263-79.
31 Yasmin B. Kafai, and Quinn Burke, 'Computer Programming Goes Back to School', Phi Delta Kappan, 95
(2013), 61-65.
32 Kyu Han Koh, 'Computational Thinking Pattern Analysis: A Phenomenological Approach to Compute
Computational Thinking' (Ph.D., University of Colorado at Boulder, 2014), p. 157.
33 Kyu Han Koh, Ashok Basawapatna, Vicki Bennett, and Alexander Repenning, 'Towards the Automatic
Recognition of Computational Thinking for Adaptive Visual Language Learning', in Visual Languages and Human-
Centric Computing (VL/HCC), 2010 IEEE Symposium on (IEEE, 2010), pp. 59-66.
34 Kyu Han Koh, Hilarie Nickerson, Ashok Basawapatna, and Alexander Repenning, 'Early Validation of Computational
Thinking Pattern Analysis', in ITiCSE '14: Proceedings of the 2014 conference on Innovation & technology in computer
science education (Uppsala, Sweden: ACM, 2014), pp. 213-18.
35 Annette Lamb, and Larry Johnson, 'Scratch: Computer Programming for 21st Century Learners', Teacher
Librarian, 38 (2011), 64-68.
36 Michael G Lovorn, and Ali Reza Rezaei, 'Assessing the Assessment: Rubrics Training for Pre-Service and New
in-Service Teachers', Practical Assessment, Research & Evaluation, 16 (2011), 1-18.
37 James J. Lu, and George H. L. Fletcher, 'Thinking About Computational Thinking', in ACM SIGCSE Bulletin
(ACM, 2009), pp. 260-64.
38 Mahsa Mohaghegh, and Michael McCauley, 'Computational Thinking: The Skill Set of the 21st Century', (IJCSIT)
International Journal of Computer Science and Information Technologies, 7 (2016), 1524-30.
39 Kementerian Pelajaran Malaysia, 'Dokumen Standard Dunia Sains Dan Teknologi Tahun 1', ed. by
Bahagian Pembangunan Kurikulum (No. 22, Jalan Sri Ehsan Satu, Taman Siri Ehsan, Kepong, 52100 Kuala Lumpur:
Bahagian Pembangunan Kurikulum, Kementerian Pelajaran Malaysia, 2011).
40 ———, 'Teknologi Maklumat Dan Komunikasi Tahun Enam', ed. by Bahagian Pembangunan Kurikulum,
Kementerian Pendidikan Malaysia (Pusat Pentadbiran Kerajaan Persekutuan, 62604
Putrajaya: Kementerian Pelajaran Malaysia, 2014).
41 Kementerian Pendidikan Malaysia, 'Bahasa Cina Sekolah Jenis Kebangsaan (C) Tahun 1 Dokumen Standard
Kurikulum Dan Pentaksiran', ed. by Bahagian Pembangunan Kurikulum (Kompleks Pentadbiran Kerajaan Persekutuan,
62604 Putrajaya: Kementerian Pendidikan Malaysia, 2016).
42 ———, 'Kurikulum Standard Sekolah Rendah KSSR (Semakan)', 2016.
43 MDEC, 'Duta Penggerak Digital', ed. by Kementerian Pendidikan Malaysia.
44 Lio Moscardini, 'Developing Equitable Elementary Mathematics Classrooms through Teachers Learning About
Children's Mathematical Thinking: Cognitively Guided Instruction as an Inclusive Pedagogy', Teaching and Teacher
Education, 43 (2014), 69-79.
45 Mun Y. Yi, and Yujong Hwang, 'Predicting the Use of Web-Based Information Systems: Self-Efficacy,
Enjoyment, Learning Goal Orientation, and the Technology Acceptance Model', International journal of
human-computer studies, 59 (2003), 431-49.
46 Andrea Niosi, 'Creating Rubrics for Assessment' (2012).
47 Jum C Nunnally, and IH Bernstein, 'The Assessment of Reliability', Psychometric theory, 3 (1994), 248-92.
48 Kian L. Pokorny, and Nathan White, 'Computational Thinking Outreach: Reaching across the K-12 Curriculum',
J. Comput. Sci. Coll., 27 (2012), 234-42.
49 Dylan J. Portelance, 'Code and Tell: An Exploration of Peer Interviews and Computational Thinking with
Scratchjr in the Early Childhood Classroom' (M.A., Tufts University, 2015), p. 77.
50 Siwarak Promraksa, Kiat Sangaroon, and Maitree Inprasitha, 'Characteristics of Computational Thinking About
the Estimation of the Students in Mathematics Classroom Applying Lesson Study and Open Approach', in Journal of
Education and Learning (Toronto: Canadian Center of Science and Education, 2014), pp. 56-66.
51 Christie Lee Lili Prottsman, 'Computational Thinking and Women in Computer Science' (M.S., University of
Oregon, 2011), p. 51.
52 Alexander Repenning, David Webb, and Andri Ioannidou, 'Scalable Game Design and the Development of a
Checklist for Getting Computational Thinking into Public Schools', in Proceedings of the 41st ACM technical
symposium on Computer science education (ACM, 2010), pp. 265-69.
53 Mitchel Resnick, John Maloney, Andrés Monroy-Hernández, Natalie Rusk, Evelyn Eastmond, Karen Brennan,
Amon Millner, Eric Rosenbaum, Jay Silver, and Brian Silverman, 'Scratch: Programming for All', Communications of
the ACM, 52 (2009), 60-67.
54 Justin Richards, 'Computational Thinking: A Discipline with Uses Outside the Computer Lab?', Computer
Weekly (2007), 52.
55 Se Young Park, and Young Jeon, 'Teachers’ Perception on Computational Thinking in Science Practices',
Vol. 9 (2015), p. 5.
56 Linda Seiter, and Brendan Foreman, 'Modeling the Learning Progressions of Computational Thinking of Primary
Grade Students', in Proceedings of the ninth annual international ACM conference on International computing
education research (San Diego, California, USA: ACM, 2013), pp. 59-66.
57 Cynthia Selby, and John Woollard, 'Computational Thinking: The Developing Definition' (2013).
58 Pratim Sengupta, John S. Kinnebrew, Satabdi Basu, Gautam Biswas, and Douglas Clark, 'Integrating
Computational Thinking with K-12 Science Education Using Agent-Based Computation: A Theoretical Framework',
Education and Information Technologies, 18 (2013), 351-80.
59 Karamjit Singh, 'Computational Thinking Comes to the Fore in Malaysian Schools', Digital News Asia (DNA),
12 August 2016.
60 Mohsen Tavakol, and Reg Dennick, 'Making Sense of Cronbach's Alpha', International journal of medical
education, 2 (2011), 53.
61 Allen Tucker, 'A Model Curriculum for K–12 Computer Science', in Final Report of the ACM K–12 Task Force
Curriculum Committee, ed. by Computer Science Teachers Association (New York: Computer Science
Teachers Association, 2003).
62 L. Ling Ung, Tammie C. Saibin, Jane Labadin, and Norazila Abdul Aziz, 'Preliminary Investigation: Teachers'
Perception on Computational Thinking Concepts', Journal of Telecommunication, Electronic and Computer
Engineering (JTEC) (2017).
63 Viswanath Venkatesh, Michael G Morris, Gordon B Davis, and Fred D Davis, 'User Acceptance of Information
Technology: Toward a Unified View', MIS quarterly (2003), 425-78.
64 Garret Walliman, 'Genost: A System for Introductory Computer Science Education with a Focus on
Computational Thinking' (M.S., Arizona State University, 2015), p. 374.
65 Andrea Elizabeth Weinberg, 'Computational Thinking: An Investigation of the Existing Scholarship and
Research' (Ph.D., Colorado State University, 2013), p. 99.
66 Michael Philetus Weller, Ellen Yi-Luen Do, and Mark D. Gross, 'Escape Machine: Teaching Computational
Thinking with a Tangible State Machine Game', in Proceedings of the 7th international conference on
Interaction design and children (ACM, 2008), pp. 282-89.
67 Jeannette M. Wing, 'Computational Thinking', Communications of the ACM, 49 (2006), 33-35.
68 A. Yadav, C. Mayfield, N. E. Zhou, S. Hambrusch, and J. T. Korb, 'Computational Thinking in Elementary and
Secondary Teacher Education', Acm Transactions on Computing Education, 14 (2014), 1-16.
69 Aman Yadav, Ninger Zhou, Chris Mayfield, Susanne Hambrusch, and John T. Korb, 'Introducing Computational
Thinking in Education Courses', in Proceedings of the 42nd ACM technical symposium on Computer science
education (ACM, 2011), pp. 465-70.
