
DIFFERENTIATED ASSESSMENT OF UNDERGRADUATE STUDENTS’ SCIENCE PROCESS SKILLS

CLARISSE P. CACAPIT

DON MARIANO MARCOS MEMORIAL STATE UNIVERSITY
SOUTH LA UNION CAMPUS
COLLEGE OF GRADUATE STUDIES
AGOO, LA UNION

DOCTOR OF PHILOSOPHY IN SCIENCE EDUCATION

JULY 2023

Rationale

Science Process Skills (SPS) are practical skills essential to developing scientific knowledge, allowing students to explore and address relevant issues and problems around them. Chiappetta and Koballa (2002) classified science process skills into two categories: basic and integrated. Basic science process skills entail observing, measuring, inferring, predicting, and communicating, while integrated science process skills include controlling variables, hypothesizing, experimenting, and interpreting data.

The nature of science education involves students in conducting scientific investigations. Science process skills such as inferring, hypothesizing, and interpreting data are among the major skills required to carry out investigative tasks. Moreover, these skills help students understand scientific phenomena, gather significant information, and develop a sense of responsibility for their own learning (Kim, 2018).

Apart from acquiring and developing science process skills, it is also necessary to measure these skills, evaluate their development, and interpret them in a meaningful way (Fugarasti et al., 2019). This is where assessment comes in. Assessment in education involves the use of a wide variety of methods or tools to evaluate, measure, and document the academic readiness, learning progress, skill acquisition, or educational needs of students. The importance and integrity of assessment as part of instruction have long been established in the literature (Shepard, 2019; Archer, 2017; Coates, 2015), but several studies indicate an imbalance in the use of assessment tools: educators are more likely to use selected-response formats owing to factors such as large class sizes, grading pressure, and limited access to different measurement tools (Monteiro et al., 2021; Rawlusyk, 2018; Mekonen & Fitiavana, 2021; Karakaya & Yilmaz, 2022).

Individual differences among students are personal differences specific to each student. These include variables such as physical characteristics (height, weight), intelligence, interest, perception, gender, ability, learning/cognitive styles, and personality traits (Ari & Deniz, 2008, as cited by Kubat, 2018). Associations between individual-difference factors such as gender (Kristyasari et al., 2018), personality traits (Yu, 2021), motivation (Butcher et al., 2021), and cognitive styles (Ceran & Ates, 2020) and outcomes such as student perceptions, science literacy, learning efficiency, learning performance, and learning outcomes have been documented. Notably, however, these studies each used only one type of measurement or assessment tool. Hence the need emerges to examine students’ science process skills using different types of tests.

The observed scarcity of studies in this field suggests that the evaluation of science process skills with only one assessment type is an issue worth examining. Moreover, using assessment instruments in different formats can help mitigate the disadvantages that individual-difference factors may cause.

In light of the foregoing, this study aims to describe the level of science process skills of students as measured through different test formats. The results of this study can serve as a resource for anyone working in science education. In particular, it will benefit educators and learners and underscore the importance of using different assessment tools to counter discriminating factors that may affect the measurement of students’ performance.

Objectives/Statement of the Problem

This research aims to describe the level of science process skills of BSE-Science students of Pangasinan State University as measured through different test formats, and to serve as an input for the development of a differentiated assessment plan. Specifically, it aims to answer the following questions:

1. What individual differences exist among the BSE-Science students of Pangasinan State University in terms of:
   a. Sex;
   b. Motivation;
   c. Personality traits; and
   d. Cognitive style?

2. What is the BSE-Science students’ level of science process skills (SPS) as evaluated using different assessment tools, namely:
   a. Multiple-choice tests;
   b. Open-ended questions; and
   c. Performance checklists?

3. Is there a significant difference in the level of science process skills of BSE-Science students as measured by different test formats and as affected by their individual-difference factors?

4. What assessment plan could be proposed to evaluate the science process skills of BSE-Science students while catering to individual differences?

Literature Cited

Archer, E. (2017). The assessment purpose triangle: Balancing the purposes of educational assessment. Frontiers in Education, 2(41). https://doi.org/10.3389/feduc.2017.00041

Butcher, R. P., Stock, R., Lynam, S., & Cachia, M. (2021). Academic success and individual differences. New Vistas, 7(1), 37-42. https://doi.org/10.36828/newvistas.24

Ceran, S., & Ates, S. (2020). Measuring scientific process skills with different test formats: A research from the perspective of cognitive styles. Journal of Education in Science, Environment and Health, 6(3), 220-230. https://doi.org/10.21891/jeseh.703442

Chiappetta, E., & Koballa, T. (2002). Science Instruction in the Middle and Secondary Schools (5th ed.). Merrill Prentice Hall.

Coates, H. (2015). Assessment of learning outcomes. In The European Higher Education Area. Springer Cham.

Fugarasti, H., Ramli, M., & Muzzazinah (2019). Undergraduate students’ science process skills: A systematic review. American Institute of Physics Conference Proceedings, 2194(1). https://doi.org/10.1063/1.5139762

Karakaya, F., & Yilmaz, M. (2022). Teachers’ views on assessment and evaluation methods in STEM education: A science course example. Journal of Pedagogical Research, 6(2), 61-71. https://dx.doi.org/10.33902/JPR.202213526

Kim, M. (2018). Understanding children’s science identity through classroom interactions. International Journal of Science Education, 40(1), 24-45. https://doi.org/10.1080/09500693.2017.1395925

Kristyasari, M., Yamtinah, S., Utomo, S., Ashadi, & Indriyanti, N. (2018). Gender differences in students’ science literacy towards learning on integrated science subject. Journal of Physics: Conference Series, 1097(1). https://doi.org/10.1088/1742-6596/1097/1/012002

Kubat, U. (2018). Identifying the individual differences among students during learning and teaching process by science teachers. International Journal of Research in Education and Science, 4(1), 30-38. https://doi.org/10.21890/ijres.369746

Mekonen, Y. K., & Fitiavana, A. (2021). Assessment of learning outcomes in higher education: Review of literature. International Journal of Research Publications, 71(1), 69-76. https://doi.org/10.47119/IJRP100711220211766

Monteiro, V., Mata, L., & Santos, N. (2021). Assessment conceptions and practices: Perspectives of primary school teachers and students. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.631185

Rawlusyk, P. E. (2018). Assessment in higher education and student learning. Journal of Instructional Pedagogies, 21, 1-34. https://eric.ed.gov/?id=EJ1194243

Shepard, L. A. (2019). Classroom assessment to support teaching and learning. The ANNALS of the American Academy of Political and Social Science, 683(1), 183-200. https://doi.org/10.1177/0002716219843818

Stangor, C. (2011). Research Methods for the Behavioral Sciences (4th ed.). Cengage.

Wang, X., & Cheng, Z. (2020). Cross-sectional studies: Strengths, weaknesses, and recommendations. CHEST Journal, 158(1), 65-71. https://doi.org/10.1016/j.chest.2020.03.012

Yu, Z. (2021). The effects of gender, educational level, and personality on online learning outcomes during the COVID-19 pandemic. International Journal of Educational Technology in Higher Education, 18(14). https://doi.org/10.1186/s41239-021-00252-
