
Studies in Educational Evaluation 68 (2021) 100976


Measuring classroom climate: Validation of the CCQ-P for primary, middle school and high school levels, on the Romanian population

Mihaela Cimpian a,1, Laurenţiu P. Maricuţoiu a,c,1, Marian D. Ilie a,b,*,1

a Center of Academic Development at the West University of Timișoara, Romania
b Teacher Training Department at the West University of Timișoara, Romania
c Department of Psychology at the West University of Timișoara, Romania

Keywords: Classroom climate; Primary school; Middle school; High school; Invariance analysis; Student perceptions

Abstract

Several high-quality instruments are used for investigations on classroom climate. Out of these, none is valid on all three pre-tertiary educational levels (primary, middle school and high school). This situation limits our understanding about classroom climate along different educational levels. We collected responses on a large Romanian student sample (1003 students, 49.3 % boys), and we investigated the internal validity of the Classroom Climate Questionnaire – Primary (CCQ-P, Aldridge & Galos, 2018) using confirmatory factor analyses and measurement invariance analyses. The original model obtained a good fit and partially maintained its structure across the three educational levels. Thus, this study opens the door for the investigation of this questionnaire for the three pre-university levels in other national contexts. This could impact how we look at the transition of students between levels and it could offer new input about the practical actions related to this process.

Abbreviations: CCQ-P, Classroom Climate Questionnaire – Primary; ToM, theory of mind; CFA, confirmatory factor analysis; ECV, explained common variance; IECV, item explained common variance; MI, measurement invariance.
* Corresponding author at: West University of Timișoara, Teacher Training Department, Center of Academic Development, No. 4 Vasile Pârvan Blvd., 300223, Timișoara, Romania. E-mail address: marian.ilie@e-uvt.ro (M.D. Ilie).
1 The three authors contributed equally to this paper.
https://doi.org/10.1016/j.stueduc.2021.100976
Received 28 November 2019; Received in revised form 22 November 2020; Accepted 30 December 2020; Available online 22 January 2021.
0191-491X/© 2021 Elsevier Ltd. All rights reserved.

1. Introduction

Human well-being is influenced by psychological and social factors such as social support and good social relationships (Holder & Coleman, 2009; Kawachi & Berkman, 2001). In the educational field, research on psycho-social factors focuses on concepts such as the learning environment. When referring to the learning environment, Fraser (2012b) differentiates between the school-level approach and the classroom-level approach. The first focuses on the relations between teachers and the administrative personnel (e.g., the head of department, secretaries); this concept can be associated with educational administration. At the classroom level, the term learning environment is synonymous with classroom climate (Fraser, 2012b). Classroom climate refers to the quality of the relationships between students, and between students and teachers, and emphasizes social and psychological classroom components such as equity and involvement (Aldridge & Galos, 2018; Barr, 2016). Several terms with the same meaning as classroom climate can also be found in the international literature, such as instructional/learning environment, classroom context, classroom environment or learning context (e.g., Ilie, 2014). In this paper, given the terminology used in the questionnaire (Aldridge & Galos, 2018) selected for our research, we use the term classroom climate.

The classroom climate is a collection of notions related to the school experience rather than a single general concept. For example, Aldridge and Galos (2018) used nine constructs to describe its structure (e.g., Cohesiveness, Teacher support, Involvement). Classroom climate is a popular classroom-level concept that correlates with various positive consequences (Fraser, 2019). A positive classroom climate is associated with student outcomes such as successful student self-regulation (Velayutham & Aldridge, 2013), study enjoyment (Martin-Dunlop & Fraser, 2008; Bell & Aldridge, 2014; Ogbuehi & Fraser, 2007), academic achievement (Chionh & Fraser, 2009; Wolf & Fraser, 2008), and cognitive executive functions (Vandenbroucke, Spilt, Verschueren, Piccinin, & Baeyens, 2018). Regarding the social influences of a positive classroom climate, it has been associated with lower rates of bullying (Roland & Galloway, 2002) and greater social competence (Wilson, Pianta, & Stuhlman, 2007).
Moreover, perceived positive climate is also related to enhanced quality of life for both teachers and students, lower dropout rates, higher attendance, increased engagement and a deep-learning approach to studying (Evans, Harvey, Buckley, & Yan, 2009). In the opposite direction, a poor classroom climate has been shown to be related to poor social relations, higher student aggression and poor academic focus (Barth, Dunlap, Dane, Lochman, & Wells, 2004).

Taken together, previous research findings emphasize the role of classroom climate in ensuring high-quality and successful instruction. In the last decades, researchers have invested substantial effort to evaluate and understand the implications of classroom climate. Unfortunately, there are multiple perspectives regarding the concept of classroom climate, and numerous questionnaires that use different definitions and components to assess this concept (Fraser, 1998). Because all these apparently different perspectives describe the same phenomenon using different labels, it is difficult to identify a common approach. Furthermore, it is not uncommon to find different perspectives on classroom climate at different educational levels. For example, the My Class Inventory (Fraser & Fisher, 1982) was developed for elementary students, while the scale What Is Happening In My Classroom (Fraser, McRobbie, & Fisher, 1996) was developed for middle-school students. Because there are different instruments for different school levels, comparisons regarding the classroom climate and its implications at different educational levels are difficult to make (Alansari & Rubie-Davies, 2019; Fraser, 1989). The biggest problem that results from the use of parallel questionnaires is the delay of the scientific process due to the impossibility of comparing conclusions of different studies that analyzed the same phenomenon using different assessment methods (Skinner, Edge, Altman, & Sherwood, 2003). Consequently, Alansari and Rubie-Davies (2019) highlighted the necessity of having transferable, suitable and relevant instruments across different educational levels.

In this study, we aim to address this issue by investigating whether the Classroom Climate Questionnaire – Primary (CCQ-P, Aldridge & Galos, 2018) can be used at different educational levels as well. As a preliminary analysis, we investigated the internal validity of the CCQ-P on the Romanian population. After that, we used invariance analysis to test the internal validity of the CCQ-P at three different educational levels (i.e., primary, secondary, high school).

2. An overview of classroom climate instruments

There are many valid and robust instruments which have been developed and used for investigating students' perceptions of classroom climate (Fraser, 1998). The most popular are: What Is Happening In My Classroom? (WIHIC - Fraser et al., 1996), Constructivist Learning Environment Survey (CLES - Taylor & Fraser, 1991), Technology-Rich Outcomes-Focused Learning Environment Inventory (TROFLEI - Aldridge, Dorman, & Fraser, 2004), My Class Inventory (MCI - Fraser & Fisher, 1982), Questionnaire on Teacher Interaction (QTI - Wubbels & Levy, 1991), Science Laboratory Environment Inventory (SLEI - Fraser, McRobbie, & Giddings, 1993), Individualized Classroom Environment Questionnaire (ICEQ - Rentoul & Fraser, 1979), and College and University Classroom Environment Inventory (CUCEI - Fraser & Treagust, 1986).

These popular scales are used at various educational levels, although each was initially developed for a particular level. For example, the WIHIC questionnaire (Fraser et al., 1996) has been used in numerous contexts (Skordi & Fraser, 2019). The evidence presented in Table 1 shows that the WIHIC was used in at least 9 educational contexts that cover all educational levels. This practice is problematic because, in the absence of evidence regarding its invariance from one educational level to another, we cannot interpret the similar or different results reported by various studies in an integrative manner.

The studies summarized in Table 1 highlight some aspects regarding the use of climate questionnaires in educational research. First, each study used different statistical techniques (e.g., confirmatory factor analysis, exploratory factor analysis, principal components analysis, etc.). These different statistical approaches can yield different conclusions and provide different evidence regarding the internal validity of these scales. For example, without a confirmatory approach one cannot conclude that the original structure of a questionnaire provides enough fit to the dataset. Therefore, we still know very little regarding the configural invariance of these scales from one school level to another. In a similar vein, each study used samples from different countries and cultures (e.g., Macedonia, South Africa, Australia, Taiwan, etc.). Each nationality comes with certain variables that are dependent on culture and context, and this makes it impossible to compare the internal validity of the questionnaire without proper statistical analyses.

Secondly, another important aspect is the form of the survey. Three different forms are used in the field: teacher-focused, personal-focused, or classroom-focused. When researchers use the first form, the questions are addressed to the teachers (e.g., "In this class, students learn about the world outside of school", Johnson & McClure, 2004). When the instrument has the personal or the classroom form, the questions are addressed to the students. The personal form assesses the individual perception of the student within the classroom (e.g., "My ideas are used during classroom discussions", Aldridge & Galos, 2018), while the classroom form assesses the perception that the student has of the classroom overall (e.g., "Students' ideas and suggestions are used during class discussions", Majeed, Fraser, & Aldridge, 2002). Thus, using different forms of instruments to investigate the same concept will lead to different results (Fraser et al., 1996) and will make it difficult to carry out valid comparisons or syntheses. The perspective that there is a unique learning environment in the classroom (i.e., classroom-focused), and that variations when measuring the learning environment are caused by errors, has been challenged since the 1980s (Fraser et al., 1996). In the context of the same classroom, various students and groups of students can be more engaged, and that can lead them to perceive a more positive climate. Thus, instruments that adopted the personal evaluation form represent a more appropriate way to assess the classroom climate, especially for studies that aim to investigate individual students' perceptions and/or to analyse the perspectives of within-classroom subgroups of students such as males and females (Fraser & Tobin, 1991).

Finally, a third issue is the fact that studies used different versions of the instruments that we mentioned, with different numbers of items or dimensions. Thus, it is hard to talk about the same instrument being used across all the educational levels; we can rather observe different versions of the same instrument for different educational levels. For example, the TROFLEI questionnaire (Aldridge et al., 2004) has been used in numerous contexts (e.g., Earle & Fraser, 2017; Gunawardena, 1988). These studies covered all the educational levels (see Table 1), and their authors used four different versions of the TROFLEI (Aldridge et al., 2004) in five contexts. When researchers use different versions of the same measure in different studies, they reduce the generalizability of their results, because the internal structure of their measure is not similar from one study to another.

Measurement invariance (MI) investigates the compatibility between groups and whether the items have the same contribution across these groups (Meredith, 1993). According to Schmitt and Kuljanin (2008), MI testing has become a growing practice when testing the validity of an instrument on different populations, using confirmatory factor analyses. One advantage of MI is that it allows researchers to check whether a certain measurement difference between two or more groups is the result of measurement variance across the studied groups (Sass, 2011). When measurement invariance is ignored, researchers cannot compare the groups, because the observed differences between the means obtained can be the result of measurement variance or the result of a real difference between groups.

Moreover, MI is important when longitudinal designs are used. An example in this case could be the evaluation of classroom climate changes when students transition from one level to another.


Table 1
Overview of WIHIC, TROFLEI, MCI, QTI, SLEI, ICEQ and CUCEI previous validations in primary, secondary and tertiary education. For each validation study, the entries give: instrument version; sample; analysis method; study; instrument form.

What is happening in my classroom? (WIHIC; Fraser et al., 1996)
- 4 scales, 19 items (WIHIC-Primary; actual and preferred version); N = 1,077 students from South Africa; principal axis factoring with oblique rotation; (Aldridge, Fraser, & Ntuli, 2009); personal form.
- 7 scales, 45 and 43 items (old + new; G-EWIHIC); N = 1,488 students from Northern Macedonia and Greece; mixed method, EFA + CFA; (Charalampous & Kokkinos, 2017); classroom form.
- 7 scales, 56 items; N = 1,076 students from Romania; Cronbach's alpha; (Ilie et al., 2014); personal form.
- 7 scales, 52 items; N = 1,434 students from Australia and Taiwan; principal components factor analysis followed by varimax rotation; (Wolf & Fraser, 2008); personal form.
- 7 scales, 56 items (actual and preferred version); N = 3,980 students from Canada and Australia; model fit, model comparison and model parsimony; (Dorman, 2003); personal form.
- 7 scales, 56 items; N = 978 students from Queensland; confirmatory factor analysis; (Dorman, 2008); personal form.
- 7 scales, 54 items; N = 355 students from Australia; Cronbach's alpha, confirmatory factor analysis, ANOVA, discriminant validity; *(Fraser et al., 1996); personal and classroom form.
- 7 scales, 56 items; N = 375 students from Southern California; principal axis factor analysis with direct oblimin rotation and Kaiser normalization; (Skordi & Fraser, 2019); personal form.
- 5 scales, 30 items; N = 290 teachers from the USA; exploratory factor analysis, principal axis factoring with oblimin rotation, Cronbach's alpha; (Johnson & McClure, 2004); teacher form.

Constructivist learning environment survey (CLES; Taylor & Fraser, 1991)
- 3 scales, 18 items; N = 1,081 students from Singapore; principal axis factoring with varimax rotation and Kaiser normalization, Cronbach's alpha, ANOVA; (Peer, 2011); personal form.
- 4 scales, 28 items; N = 508 students from Australia; principal components factor analysis followed by varimax rotation, Cronbach's alpha; *(Taylor & Fraser, 1991); personal form.
- 5 scales, 30 items; N = 1,081 students from Australia and 1,879 students from Taiwan; principal components factor analysis followed by varimax rotation, Cronbach's alpha, ANOVA; (Aldridge, Fraser, Taylor, & Chen, 2000); personal form.

Technology-Rich Outcomes-Focused Learning Environment Inventory (TROFLEI; Aldridge et al., 2004)
- 64 items, 8 dimensions (actual and preferred version), plus 31 additional items, 4 dimensions; N = 122 students from New Zealand; Cronbach's alpha; (Benson, 2012); personal form.
- 66 items, 10 dimensions (actual version); N = 914 students from Florida; ANOVA, Cronbach's alpha; (Earle & Fraser, 2017); personal form.
- 80 items, 10 dimensions (actual and preferred version); N = 1,249 students from Western Australia and Tasmania; Cronbach's alpha, exploratory factor analysis, multitrait method; *(Aldridge et al., 2004); personal form.
- 80 items, 10 dimensions (actual and preferred version); N = 322 students from New Zealand; discriminant validity (correlations); (Gunawardena, 1988); personal form.

My Class Inventory (MCI; Fraser & Fisher, 1982)
- 19 items, 3 dimensions; N = 588 students from North Texas; principal components factor analysis with varimax rotation, Cronbach's alpha, ANCOVA; (Houston, Fraser, & Ledbetter, 2008); classroom form.
- 25 items, 5 dimensions (actual and preferred version); N = 120 students from Miami; Cronbach's alpha, mean correlation; (Mink & Fraser, 2005); classroom form.
- 38 items, 4 dimensions; N = 1,565 students in Darussalam; principal components factor analysis with varimax rotation; (Majeed et al., 2002); classroom form.
- 38 items, 5 dimensions; N = 2,305 students from Australia and Tasmania; Cronbach's alpha, mean correlation, ANOVA; *(Fraser & Fisher, 1982); classroom form and teacher form.

Questionnaire on teacher interaction (QTI; Wubbels & Levy, 1991)
- 48 items, 8 dimensions; N = 3,104 students from Malaysia; Cronbach's alpha, ANOVA; (Scott & Fisher, 2004); classroom form.
- 64 items, 8 dimensions; N = 614 students from Northern Italy; Leary circumplex model; (Passini, Molinari, & Speltini, 2015); classroom form.
- 48 items, 8 dimensions; N = 439 students from Korea; Cronbach's alpha, MANOVA; (Lee, Fraser, & Fisher, 2003); classroom form.
- 64 items, 8 dimensions (actual and ideal version); N = 1,606 students from the USA; Cronbach's alpha, analysis of variance, factor analyses with varimax rotation; *(Wubbels & Levy, 1991); classroom form.
- 40 items, 8 dimensions; N = 422 students from Indonesia; confirmatory factor analysis, Cronbach's alpha, ANOVA; (Fraser, Aldridge, & Soerjaningsih, 2010); classroom form.

Science Laboratory Environment Inventory (SLEI; Fraser et al., 1993)
- 35 items, 5 dimensions; N = 60 students from Malaysia; Cronbach's alpha; (Chua & Karpudewan, 2017); personal form.
- 22 items, 4 dimensions; N = 761 students from the USA; principal components factor analysis with varimax rotation, Cronbach's alpha, ANOVA; (Lightburn & Fraser, 2007); personal form.
- 23 items, 5 dimensions; N = 439 students from Korea; principal components method (with varimax rotation), Cronbach's alpha, ANOVA, mean correlation; (Fraser & Lee, 2009); personal form.
- 72 items, 8 dimensions; N = 3,727 students from Australia, the USA, Canada, England, Israel and Nigeria; item analyses, Cronbach's alpha, factor analyses with varimax rotation, discriminant validity; *(Fraser et al., 1993); classroom form.

Individualized Classroom Environment Questionnaire (ICEQ; Rentoul & Fraser, 1979)
- 25 items, 5 dimensions (actual and preferred version); N = 484 students from Australia; Burnett and Dart (1997) procedure, Cronbach's alpha; (Dart et al., 1999); classroom form.
- 50 items, 5 dimensions; N = 230 students from Sydney; Cronbach's alpha, discriminant validity; *(Rentoul & Fraser, 1979); classroom form.
- 25 items, 5 dimensions; N = 495 boys from the USA; Rasch model, ANOVA; (Yates, 2011).

College and University Classroom Environment Inventory (CUCEI; Fraser & Treagust, 1986)
- 49 items, 7 dimensions (actual and preferred version); N = 165 students from New Zealand; item analysis, principal components analysis; (Logan, Crump, & Rennie, 2006); personal form.
- 49 items, 7 scales (preferred and actual version); N = 372 students from Australia and the USA; Cronbach's alpha, mean correlation, ANOVA; *(Fraser & Treagust, 1986); personal form.
- 49 items, 7 scales (actual and preferred version); N = 504 students from Canada and Australia; Cronbach's alpha, mean correlation, ANOVA; (Nair & Fisher, 1999); classroom form.

Note: In the table, the studies were classified by the educational level that authors specified, based on each educational system they belonged to.
*Marks the original instrument development studies.
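Many of the validations listed in Table 1 relied on exploratory techniques or reliability coefficients alone. As a point of reference for the confirmatory approach mentioned above, the following minimal R/lavaan sketch shows what a confirmatory check of a hypothesised factor structure looks like; it uses lavaan's built-in HolzingerSwineford1939 example data and its classic three-factor model as stand-ins, since none of the instruments or datasets from Table 1 are reproduced here.

  library(lavaan)

  # Hypothesised measurement model (three factors, nine indicators);
  # the factor and item names come from lavaan's example data, not from
  # any classroom-climate instrument.
  model <- '
    visual  =~ x1 + x2 + x3
    textual =~ x4 + x5 + x6
    speed   =~ x7 + x8 + x9
  '

  # Fit with a robust estimator and inspect robust fit indices, as is
  # common practice with non-normal item distributions
  fit <- cfa(model, data = HolzingerSwineford1939,
             estimator = "MLR", missing = "ml")
  fitMeasures(fit, c("chisq.scaled", "cfi.robust", "rmsea.robust", "srmr"))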

In most educational systems, the students remain in the same classroom (i.e., with the same colleagues) during primary and secondary school. In these situations, it is more reliable and accurate to test the assumption that the measure we use reflects the same construct at each point in time (Widaman, Ferrer, & Conger, 2010).

Given the above-mentioned limitations of the current practice in the field, we tested one version of the CCQ-P (Aldridge & Galos, 2018) for all three pre-tertiary educational levels (primary, middle school and high school) on the same population (i.e., Romanian). We opted for an instrument (i.e., the CCQ-P) that assesses individual student perception. Also, we investigated its invariance across these school levels.

3. Classroom climate questionnaire-primary (CCQ-P)

In a recent paper, Aldridge and Galos (2018) concluded that the instruments used to study the classroom climate at the primary level either lacked evidence of factorial validity or were limited in scope. Poor evidence regarding factorial validity represents a serious threat to the quality of the measurement because it does not support the researchers' decision to aggregate responses into super-ordinate scales. In consequence, Aldridge and Galos (2018) designed the CCQ-P specifically to cover this gap. Also, the authors formulated the items of the instrument using the personal evaluation form, not the classroom form.

The CCQ-P has nine components that belong to all three basic types of dimensions (Table 2) described in Moos' scheme for classifying human environments (Moos, 1974). In the development of the CCQ-P, Aldridge and Galos (2018) followed a framework used in another of their validation studies (Velayutham, Aldridge, & Fraser, 2011). This process includes the following steps: identifying and defining the scales, creating the individual items for each scale, having the items evaluated by a panel of experts and then conducting a pilot study. The pilot study tested the response time needed to answer the items and the degree to which students understood the proposed items. The authors concluded that students were able to respond meaningfully to the questionnaire. In order to validate the scale, the authors used Trochim and Donnelly's (2006) model as a guide. Criterion validity was analyzed: predictive, concurrent, convergent and divergent. The results indicated that the validity of the questionnaire was overall satisfactory, with the exception of 2 items (Item 4 and Item 35), which had low loadings and did not meet the necessary convergent validity criteria for the actual version. Nonetheless, these items were kept as they strengthened the internal consistency of the scale overall.

Table 2
CCQ-P scales classified by Moos' scheme (Moos, 1974) for classifying human environments.

Relationship dimensions: Student Cohesiveness, Teacher support, Collaboration.
Personal development dimensions: Involvement, Task orientation, Personal Relevance, Responsibility for learning.
System maintenance and system change dimensions: Equity, Task Clarity.

The final version of the CCQ-P includes nine scales. Out of these, six (i.e., Student Cohesiveness, Teacher Support, Involvement, Equity, Task Orientation, Collaboration) are based on the scales of the WIHIC (Fraser et al., 1996), one (i.e., Personal Relevance) is taken from the CLES (Taylor & Fraser, 1991) and two scales (i.e., Task Clarity, Responsibility for Learning) are new, designed by the authors. Thus, the CCQ-P has the following scales:

• The Student Cohesiveness scale stresses the importance of creating friendships, and of receiving support and acceptance from peers, thus leading towards a favorable environment for pupils to express their thoughts.
• The Teacher support scale assesses the relationship between teachers and students.
• The Involvement scale evaluates the perceptions of the students regarding their opportunities to get directly involved in the learning process.
• The Equity scale refers to the equality of chances given to the learners.
• The Task orientation scale offers a perspective on how involved students are in completing their tasks.
• The Personal Relevance scale measures the extent to which the content studied by the learners relates to everyday activities and the importance that it holds for them.
• The Collaboration scale measures the perceptions of the students concerning the opportunities they have to work collaboratively within the learning process.
• The Task Clarity scale assesses whether pupils understand the instructions they are given and the goals that are to be achieved.
• The Responsibility for learning scale measures the degree to which students feel that they are responsible for their own learning.

Consequently, considering the evidence of factorial validity of the CCQ-P and its structure and theoretical foundation, we decided to use this instrument to achieve the aim of the current study. In the original study (Aldridge & Galos, 2018), the CCQ-P also included 5 scales containing a total of 19 items, which measure motivation and engagement. These scales (self-efficacy, learning goal orientation, self-regulation, enjoyment of class, and enjoyment of school) were used in order to establish the convergent validity of the questionnaire. In the present study, these additional scales were not used. We used only the 45 items from the first part of the questionnaire, as we were only interested in the investigation of the internal validity across three educational levels (i.e., primary, secondary, high school).

Initially, we also considered the application of the CCQ-P at the tertiary level of education, but we limited our study to the pre-tertiary educational levels, taking into consideration evidence from the literature.
This idea is confirmed by Alansari and Rubie-Davies (2019). The two authors state that, even if the problem of transferability of class climate instruments across the school years remains a big challenge, conceptualizing, measuring and evaluating tertiary learning environments requires a different perspective in comparison with that used in pre-tertiary education. Thus, the aim of this study is to contribute to this issue by testing the internal validity of the CCQ-P across the three pre-tertiary levels mentioned above. To the best of our knowledge, the CCQ-P has not been validated on other populations before, except the Australian sample in the original study.

4. Methodology

4.1. Participants

Our sample was a convenience sample and consisted of 1003 participants (47.9 % girls, 49.3 % boys) from all the pre-tertiary educational levels (342 primary school students in 25 classes belonging to three schools, 294 middle school students in 25 classes belonging to five schools, and 358 high school students in 25 classes belonging to two schools) with ages ranging from 7 to 20 years (M = 13.03; SD = 3.21). The schools that participated in this study were from both the urban and the rural area of the West region of Romania. It is also important to mention that in the Romanian national educational system, the primary education level is composed of grades 1-4. The secondary educational level consists of middle school, from grade 5 to grade 8, and high school, from grade 9 to grade 12. A more detailed preview of the participants can be found in Table 3.

Table 3
Participants characteristics.

Variable | Primary school | Middle school | High school | Unspecified
Age: mean | 9.94 | 12.56 | 16.5 |
Age: standard deviation | 1.12 | 1.19 | 1.39 |
Gender: girls | 167 | 140 | 173 |
Gender: boys | 163 | 149 | 182 |
Gender: unspecified | 12 | 5 | 3 | 9
Total | 342 | 294 | 358 | 9

4.2. Instrument

The CCQ-P (Aldridge & Galos, 2018) was used, as described above. The instrument consists of 45 items measured on a 5-point Likert scale ranging from "almost never" to "almost every time". In order to translate the instrument, we used the back-translation procedure. First, the original version of the questionnaire was translated into Romanian. This version of the questionnaire was then translated back into English by an independent translator. The two English versions were compared in order to see whether the meaning of the items was similar, and final adjustments were debated and applied.

The CCQ-P, in its original form, contains items with response options split into 2 categories: first, for the present moment (the actual situation, as it is perceived by the students) and, second, options for the preferred situation (how the students would like the situation to be). The analyses conducted by Aldridge and Galos (2018) on the original form of the questionnaire indicated that the actual version and the preferred version had similar factorial structures. For this study we only used the version for the actual situation. We considered the actual version to be easier to complete by the students, especially those belonging to the primary level. Also, this version of the questionnaire was better suited to our goal of finding a questionnaire to use in common for assessing the classroom climate at the primary, middle school and high school educational levels. Therefore, we considered it to be sufficient for our study.

4.3. Procedure

In each classroom, all the students present were asked to complete the paper-and-pencil form of the CCQ-P. Before completing it, instructions were read to the participants regarding the way they should fill in the questionnaire. It is important to note that participants from the middle and high school levels were asked to complete the questionnaire based on their experience in general (not only for a specific course or teacher). For the primary school level this was not an issue, because the majority of classes at this level are taught by one teacher. The anonymity of the students' answers was highlighted during the instruction procedure. Also, the teacher was asked to leave the room in order to eliminate the effect that his/her presence might have on the students' answers. In order to facilitate the application procedure, the items were read aloud to the second-grade students, and emphasis was placed on understanding the way in which such a questionnaire should be completed. In all the cases mentioned above, we obtained approval for the students to participate in our study from both the headmaster of the school and the teacher.

4.4. Statistical analyses

The analyses were performed using the package lavaan (Rosseel, 2012), implemented in R. Individual participants' subscale means were used as the unit of analysis. Because of the asymmetrical distributions of some of the subscales, especially those obtained from the primary level participants (skewness between -1.245 and -.543), we decided to use a robust estimator for incomplete data (MLR). We examined our database for missing values and eliminated the participants who had more than 5 missing answers (N = 13). Of the remaining participants, 252 had incomplete data sets, with an average of 2 answers missing from each set. There were also 9 participants with an unspecified level of education; in these cases, we only used their responses for calculating the general mean and standard deviation of each item. Goodness-of-fit indices were: absolute indices, namely the chi-square goodness-of-fit statistic and the Comparative Fit Index (CFI); and relative indices, namely the Root Mean Square Error of Approximation (RMSEA) and the Standardized Root Mean Square Residual (SRMR). For the CFI, values over .90 indicate an acceptable fit (Byrne, 2001). For the RMSEA, values under .05 indicate a good fit, while values under .08 indicate an acceptable fit (Browne & Cudeck, 1993). In the case of the SRMR indicator, a good fit is indicated by values under .08 (Hu & Bentler, 1999).

High correlations between some subscales of the questionnaire determined us to consider the existence of a common source of unexplained variance. In order to address this problem, we decided to compare 3 models for participants from all educational levels (primary + middle school + high school): a single-factor model, a 9 correlated factors model and a bifactor model. We did this in order to examine the dimensionality of the construct as measured by the CCQ-P. The single-factor model assumes the existence of a single general classroom climate factor that accounts for the variance of the items. The 9 correlated factors model proposes the existence of 9 separate factors that explain the variance, as stated by the original questionnaire, and the bifactor model implies the existence of both a general classroom climate factor and the 9 correlated factors. Also, we analyzed the explained common variance (ECV) for the bifactor model and for the 9 factors model. In addition, for the bifactor model we calculated the item explained common variance (IECV). Finally, in order to see whether the 9 factors model maintains its structure within the 3 studied groups, the analysis of invariance was performed.

5. Results

The purpose of the present study is to test for the invariance of the CCQ-P across different education cycles. Therefore, we will first present results regarding the internal validity of the CCQ-P on a Romanian student sample, and then we will report the results of the invariance analyses.
Table 4 presents the means and the standard deviations of the 45 items of the CCQ-P, from all educational levels, together with the components they belong to. CFA results are shown in Table 5. In the case of the single-factor model, the fit indices have poor values and none of them falls within the needed parameters. For both the 9 correlated factors model and the bifactor model we obtained good model fit, with a CFI of over .90. Between the 2 models there are no significant differences. From this point of view, the 9-factor model is superior to the single-factor model, which indicates the presence of a multifaceted construct.

Correlation values between the 9 factors of the CCQ-P questionnaire are presented in Table 6. Between some scales we can notice high positive correlations (r > .5, p < .05). For example, the Equity scale is strongly correlated with Student Cohesiveness, Teacher support, and Involvement, as well as with Personal Relevance, which in turn correlates with Teacher support, Collaboration and Task orientation. Although the authors of the original article mention the natural tendency of the dimensions that compose the classroom climate to overlap (Aldridge & Galos, 2018), these strong correlations suggested the existence of a common variance source.

5.1. Alternative models

Table 7 contains standardized factor loadings and explained common variance for the 3 models. Regarding factor loadings, the 9 correlated factors model had a mean loading of .66, with loadings ranging between .01 and .67. This may suggest that, when we kept the variance of the general factor under control, some factors from the 9 factors model did not maintain their specificity. Also, in the bifactor model, the loadings were higher on the general factor than on the group factors for 32 items. This supports our argument that these items could belong to scales which did not maintain their structure.

Concerning the explained common variance, this dropped for most of the scales belonging to the bifactor model (between 2.74 % and 7.80 %), in contrast to that of the scales belonging to the 9 factors model (between 6.61 % and 13.43 %). In the case of the bifactor model, 58 % of the ECV was explained by the general factor, and the remaining 42 % by the 9 group factors. This indicates that the model contains scales that could be better explained by the general factor, which showed a drop in both ECV and loadings of specific items. Such scales are Teacher support (ECV from 13.43 % to 2.70 %), Equity (ECV from 10.36 % to 3.74 %) and Involvement (ECV from 9.39 % to 3.74 %). There are also scales which maintain their factorial structure when the general factor is controlled for (Student Cohesiveness, Task orientation, Personal Relevance, Responsibility for learning). These scales contain items which have high loadings on both the general factor and the specific factors. The IECV allowed us to check which items' responses are accounted for, at an individual level, by the variation of the general dimension alone (Stucky, Thissen, & Edelen, 2013). The items with the highest IECV were Item 11 (The teacher cares about my feelings, IECV = 0.87), Item 20 (The teacher listens to me, IECV = 0.88) and Item 29 (The teacher is interested in how I am going, IECV = 0.86), belonging to the Teacher Support scale, and Items 3 (I get as much say as other students, IECV = 0.88) and 12 (I get the same encouragement from the teachers as other students do, IECV = 0.99), belonging to the Equity scale.

5.2. Invariance analyses

For the 9 factors model we analyzed the invariance of the model across the educational levels. The results are shown in Table 8. When we estimated the models together, the CFI was low, although the RMSEA was good (CFI = .88, RMSEA = .04). This led us to further analyze the RMSEA of the null model, which had a value of 0.141. For the RMSEA of the null model, a value of less than 0.158 indicates that the incremental fit indices become distorted (Kenny, 2015).

Table 4
Means and standard deviations for all the items of the CCQ-P. For each item, values (N, M, SD) are given for the whole sample (All) and for the primary, secondary (middle school) and high school subsamples.

Task Clarity (C)
5. I know what I need to do to complete my school work. | All: 989, 4.13, .967 | Primary: 332, 4.38, .887 | Secondary: 289, 4.15, .929 | High school: 352, 3.92, .993
14. The instructions for tasks are clear. | All: 993, 3.82, 1.041 | Primary: 334, 4.30, .871 | Secondary: 289, 3.85, 1.040 | High school: 354, 3.37, .974
23. I know how to complete tasks successfully. | All: 992, 3.94, .993 | Primary: 333, 4.17, .995 | Secondary: 292, 3.89, .971 | High school: 353, 3.78, .955
32. I understand how to do a good job in my tasks. | All: 984, 3.91, 1.017 | Primary: 330, 4.29, .903 | Secondary: 290, 3.89, 1.020 | High school: 352, 3.55, .989
41. I understand the instructions that are given. | All: 978, 3.83, 1.044 | Primary: 329, 4.23, .919 | Secondary: 285, 3.80, 1.034 | High school: 353, 3.47, 1.028

Collaboration (CL)
9. We work in groups (or pairs) in this class. | All: 998, 2.93, 1.066 | Primary: 334, 3.14, 1.180 | Secondary: 293, 2.69, .973 | High school: 355, 2.95, .986
18. In this class, there is teamwork. | All: 995, 3.45, 1.230 | Primary: 332, 4.08, 1.054 | Secondary: 292, 3.33, 1.222 | High school: 356, 2.97, 1.133
27. I work with other students. | All: 984, 3.42, 1.154 | Primary: 328, 3.73, 1.201 | Secondary: 293, 3.41, 1.087 | High school: 353, 3.15, 1.096
36. I share with other students when doing class work. | All: 994, 3.73, 1.097 | Primary: 333, 3.96, 1.097 | Secondary: 291, 3.75, .963 | High school: 355, 3.47, 1.153
45. Working with other students helps me to learn. | All: 995, 3.50, 1.263 | Primary: 333, 3.97, 1.202 | Secondary: 293, 3.49, 1.212 | High school: 355, 3.06, 1.201

Equity (E)
3. I get as much say as other students. | All: 983, 3.76, 1.220 | Primary: 333, 4.00, 1.211 | Secondary: 287, 3.69, 1.146 | High school: 347, 3.59, 1.267
12. I get the same encouragement from the teachers as other students do. | All: 997, 3.27, 1.336 | Primary: 334, 4.04, 1.141 | Secondary: 291, 3.00, 1.223 | High school: 355, 2.77, 1.263
21. I get the same opportunity to ask questions as other students. | All: 992, 3.87, 1.169 | Primary: 333, 4.01, 1.176 | Secondary: 289, 3.81, 1.164 | High school: 355, 3.79, 1.154
30. I get the same opportunity to take part in discussions as other students. | All: 991, 3.93, 1.094 | Primary: 335, 4.15, 1.093 | Secondary: 291, 3.85, 1.159 | High school: 351, 3.77, 1.006
39. I get the same opportunity to answer questions as other students. | All: 990, 3.84, 1.135 | Primary: 332, 4.09, 1.105 | Secondary: 291, 3.82, 1.157 | High school: 353, 3.63, 1.105
Table 4 (continued ) Table 4 (continued )


No Item Component Level N M SD No Item Component Level N M SD

High High
school school
I 991 3.16 1.248 SC 999 4.26 .932
I Primary 331 3.45 1.344 SC Primary 335 4.31 .911
I get on well with
6 I discuss ideas in class I Secondary 291 3.12 1.206 1 SC Secondary 292 4.36 .844
students in this class.
High High
I 355 2.93 1.137 SC 355 4.14 .986
school school
I 993 3.17 1.286 SC 994 4.19 1.097
I give my opinions I Primary 334 3.46 1.316 SC Primary 333 4.29 1.095
Students in this class
15 during class I Secondary 291 3.12 1.226 10 SC Secondary 290 4.33 1.023
are my friends
discussions High High
I 352 2.93 1.251 SC 354 3.97 1.133
school school
I 978 3.61 1.045 SC 989 4.12 1.065
I Primary 330 3.87 1.064 SC Primary 330 4.43 .960
The teacher asks me I get to know the
24 I Secondary 286 3.55 1.007 19 SC Secondary 292 4.32 .864
questions students in this class.
High High
I 352 3.44 1.005 SC 355 3.68 1.147
school school
I 977 3.03 1.177 SC 987 3.87 1.114
My ideas are used I Primary 326 3.49 1.192 SC Primary 327 3.93 1.130
Students in this class
33 during classroom I Secondary 291 2.96 1.109 28 SC Secondary 291 3.79 1.127
are nice to me.
discussions High High
I 349 2.66 1.070 SC 355 3.87 1.089
school school
I 980 3.47 1.152 SC 990 4.09 1.173
I Primary 330 3.77 1.135 SC Primary 334 4.47 .973
I explain my ideas to I feel welcome in this
42 I Secondary 286 3.40 1.137 37 SC Secondary 292 4.09 1.171
other students class
High High
I 352 3.24 1.121 SC 352 3.74 1.240
school school
PR 999 3.28 1.300 TO 989 4.38 .918
PR Primary 335 4.20 .922 Getting my work TO Primary 333 4.74 .628
I use what I learn in
8 PR Secondary 292 3.30 1.157 7 done is important to TO Secondary 288 4.38 .911
my everyday life.
High me High
PR 356 2.39 1.083 TO 353 4.08 1.017
school school
I can make PR 995 3.15 1.317 TO 995 3.54 1.304
connections between PR Primary 335 4.03 1.054 I work hard even if I TO Primary 335 3.95 1.288
17 what I learn in this PR Secondary 288 3.09 1.187 16 do not like what I am TO Secondary 292 3.65 1.197
class to my life High doing. High
PR 355 2.39 1.130 TO 352 3.09 1.243
outside of school. school school
PR 987 3.61 1.323 TO 992 4.09 1.017
PR Primary 332 4.62 .721 TO Primary 331 4.57 .732
What I learn in this I pay attention during
26 PR Secondary 292 3.60 1.213 25 TO Secondary 293 4.12 1.025
class is useful. class.
High High
PR 352 2.62 1.002 TO 355 3.63 1.026
school school
PR 995 3.42 1.364 TO 984 4.28 .900
What I learn is PR Primary 334 4.47 .919 TO Primary 330 4.60 .782
I try to understand the
35 important to my life PR Secondary 293 3.47 1.221 34 TO Secondary 288 4.32 .857
work.
outside of school. High High
PR 353 2.39 1.028 TO 353 3.97 .916
school school
PR 985 3.19 1.321 TO 981 4.12 .991
I use what I learn in PR Primary 330 4.16 .999 TO Primary 330 4.39 .946
I know how much
44 my life outside of PR Secondary 292 3.17 1.196 43 TO Secondary 287 4.24 .891
work I have to do.
school. High High
PR 352 2.27 .998 TO 353 3.80 1.001
school school
R 992 3.67 1.146 TS 990 3.59 1.114
R Primary 334 3.80 1.191 TS Primary 334 4.14 1.087
I am expected to work The teacher helps me
4 R Secondary 290 3.66 1.187 2 TS Secondary 290 3.51 1.066
independently with my work.
High High
R 352 3.57 1.041 TS 351 3.16 .957
school school
R 991 3.49 1.174 TS 986 3.12 1.491
R Primary 334 3.85 1.148 TS Primary 334 4.51 .844
I am given The teacher cares
13 R Secondary 289 3.39 1.251 11 TS Secondary 287 2.70 1.285
responsibility about my feelings
High High
R 353 3.23 1.054 TS 349 2.17 1.185
school school
R 993 4.14 .980 TS 986 3.82 1.166
R Primary 333 4.24 1.032 TS Primary 330 4.32 1.011
I am expected to think The teacher listens to
22 R Secondary 291 4.10 .967 20 TS Secondary 292 3.75 1.168
for myself me.
High High
R 356 4.08 .928 TS 352 3.41 1.126
school school
R 980 3.79 1.090 TS 991 3.58 1.228
I am given the R Primary 332 3.97 1.118 The teacher is TS Primary 333 4.38 .900
31 opportunity to be R Secondary 288 3.74 1.062 29 interested in how I am TS Secondary 290 3.41 1.176
independent High going. High
R 349 3.65 1.060 TS 353 2.97 1.133
school school
R 986 3.73 1.123 TS 983 3.77 1.163
I am encouraged to R Primary 329 4.00 1.144 The teacher helps me TS Primary 333 4.43 .924
40
work independently. R Secondary 291 3.69 1.133 38 to understand my TS Secondary 287 3.69 1.185
R 354 3.51 1.049 work. High
TS 352 3.21 1.039
school


Note: C = Clarity of instructions; CL = Collaboration; E = Equity; I = Involvement; PR = Personal Relevance; SC = Student Cohesiveness; TO = Task Orientation; TS = Teacher Support.

Table 5
Fit indices for the CCQ-P.

Model | Robust chi-square | df | Robust RMSEA | SRMR | Robust CFI
Single-factor | 6095.275** | 945 | 0.083 [.081–.085] | .076 | .670
Nine-factors | 2228.817** | 909 | 0.042 [.040–.044] | .051 | .914
Two-factors | 2214.534** | 990 | 0.042 [.00–.044] | .050 | .918
** p < .01.

Thus, in order to analyze the fit of the models and the measurement invariance, we carefully interpreted the incremental indices alongside the RMSEA and delta RMSEA indices. From this point of view, the results suggest good configural invariance. Therefore, we can point out that the questionnaire holds its factorial structure for the primary, middle school and high school educational levels. When we added the constraint on the loadings, the model did not change significantly (delta CFI = .004, delta CFI < .01; delta RMSEA = .00, delta RMSEA < .01), which indicates good metric invariance. The constraints on the intercepts, however, produced a discrepancy between the delta CFI and the delta RMSEA (delta CFI = .045, delta CFI > .01; delta RMSEA = .006, delta RMSEA < .01). In this case, although the delta RMSEA indicated good scalar invariance, the differences in the CFI values and the partial invariance results indicate that the model changes. Because the value of the CFI is biased in these models, the fact that different indices provided divergent conclusions was difficult to interpret.

6. Discussions

There are many high-quality assessment instruments that could be used to investigate the classroom climate (Fraser, 1998). Unfortunately, all of these instruments exist in somewhat different versions for different educational levels (e.g., primary, middle or high school). Thus, investigations of the classroom climate across different educational levels, without running into methodological issues, are hard to make (Alansari & Rubie-Davies, 2019; Fraser, 1989). This situation limits our understanding (Skinner et al., 2003) of classroom climate and its implications. Consequently, Alansari and Rubie-Davies (2019) presented the necessity of having suitable and relevant tools across different educational levels as an important challenge in the field. In this study, we addressed this issue by testing the internal validity of the CCQ-P (Aldridge & Galos, 2018) on the Romanian population at three educational levels: primary, middle school and high school. We opted for the CCQ-P because it has good evidence of factorial validity and, at the same time, it touches upon the most important components used to describe human environments (Moos, 1974). Moreover, the CCQ-P uses the personal evaluation form, which is preferable in comparison with the general classroom evaluation form for assessing the classroom climate (Fraser & Tobin, 1991; Fraser et al., 1996).

The factor structure of the CCQ-P has not been verified until now on any other population. Our results supported the factorial validity of the CCQ-P when we analyzed the responses provided by students from all three educational levels. The 9 factors model proved to have a good fit, although our analysis suggested that there might be a source of common variance that generated high correlations between some factors. This common variance is not an unusual situation when we talk about classroom climate, because the dimensions that compose the concept have a natural tendency to overlap (Aldridge & Galos, 2018). Thus, the present study brings additional information regarding the way in which the questionnaire is structured, and regarding the way it can be used at the three school levels on the Romanian population. In addition, we could not find another instrument that assessed classroom climate at the primary, middle school and high school educational levels on the same population. Consequently, our study offers the premise for a comprehensive perspective on classroom climate.

The ECV of the model dropped for all scales when accounting for the general factor. This means that some scales maintained their factorial structures, while some scales dropped in both ECV and loadings and could have been influenced by a source of common variance. The scales that showed the most notable differences regarding the mean loading of the items were Teacher Support (mean loadings dropped from 0.74 to 0.33 when the general factor was added), Equity (mean loadings dropped from 0.64 to 0.33 when the general factor was added) and Involvement (mean loadings dropped from 0.61 to 0.37 when the general factor was added). When we analyzed these scales, we noticed that they contain items that explicitly state the involvement of the teacher (e.g., The teacher listens to me). The items that changed their loadings significantly are the following: Item 11 (The teacher cares about my feelings - Teacher support), Item 29 (The teacher is interested in how I am going - Teacher support), Item 33 (My ideas are used during classroom discussions - Involvement), and Item 12 (I get the same encouragement from the teachers as other students do - Equity). Items 11, 20 and 29 also have high loadings on the general factor and high IECV, thus reflecting the content of the general dimension (Stucky & Edelen, 2015). It can be noticed that all these items, beyond the explanations offered by the description of each scale, indicate a relational implication of the teacher. They also suggest an open communication of feelings and opinions, in contrast to a focus on student results (e.g., Item 29: The teacher helps me with my work - Teacher support). This could indicate that a possible source of item variance is represented by the students' needs (e.g., to be encouraged by the teacher to communicate ideas and feelings that are not only related to school tasks) and/or by changes in how teachers approach the relationship with their students across different educational cycles (e.g., primary school teachers could consider that their students need more attention in comparison with middle or high school teachers), rather than by a general factor. Also, following the testing of the bifactor model, only some of these scales failed to maintain their factorial structure. Therefore, we argue that the source of common variance might be caused by particularities of these scales, not by the existence of a general factor influencing the whole model. Thus, we decided that the 9 factors model is a more accurate representation than the single-factor model. Therefore, our analysis provides evidence that sustains the interpretation of classroom climate by creating scores for each of its attributes.
Table 6
Correlation matrix of the 9 factors composing the CCQ-P.
Variable SC TS E R C I TO PR CL

SC 1
TS .389** 1
E .504** .608** 1
R .239** .347** .393** 1
C .316** .533** .487** .396** 1
I .367** .500** .546** .383** .446** 1
TO .258** .466** .365** .331** .479** .355** 1
PR .307** .651** .399** .308** .524** .469** .531** 1
CL .475** .467** .446** .252** .326** .454** .305** .441** 1
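The ECV and IECV values discussed in Section 5.1 and reported in Table 7 follow directly from the standardized bifactor loadings. The small R sketch below shows the computation on a hypothetical set of loadings (the numbers are illustrative placeholders, not values from Table 7); here 'general' is an item's loading on the general climate factor and 'specific' its loading on its own group factor.

  # Hypothetical standardized bifactor loadings, one row per item
  loads <- data.frame(general  = c(.70, .55, .40),
                      specific = c(.25, .45, .60))

  # ECV: share of the common variance explained by the general factor
  ecv_general <- sum(loads$general^2) /
    (sum(loads$general^2) + sum(loads$specific^2))

  # IECV: for each item, the proportion of its common variance that is
  # attributable to the general factor (Stucky, Thissen, & Edelen, 2013)
  loads$iecv <- loads$general^2 / (loads$general^2 + loads$specific^2)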

Table 7
Standardized factor loading and explained common variance.
Nine factors Bifactor

Item G SC TS E R C I TO PR CL G SC TS E R C I TO PR CL

1 .37 .72 .35 .68


10 .42 .75 .40 .66
19 .54 .66 .53 .40
28 .39 .70 .40 .56
37 .52 .78 .53 .56
2 .57 .64 .54 .41
11 .70 .75 .68 .26
20 .67 .73 .68 .25
29 .73 .79 .72 .28
38 .72 .80 .71 .45
3 .44 .51 .45 .16
12 .62 .59 .64 .02
21 .50 .68 .51 .55
30 .55 .71 .57 .38
39 .57 .75 .59 .57
4 .14 .36 .13 .65
13 .47 .58 .48 .25
22 .21 .39 .20 .53
31 .44 .59 .45 .25
40 .44 .63 .45 .38
5 .49 .63 .46 .45
10

14 .61 .66 .60 .27


23 .51 .65 .49 .44
32 .62 .78 .61 .50
41 .63 .75 .62 .43
6 .47 .62 .45 .56
15 .46 .64 .46 .56
24 .44 .51 .44 .21
33 .61 .70 .61 .28
42 .51 .62 .52 .25
7 .50 .66 .46 .50
16 .36 .49 .32 .42
25 .60 .74 .58 .40
34 .47 .59 .46 .38
43 .47 .58 .45 .38



8 .65 .79 .58 .55
17 .69 .77 .63 .43
26 .70 .83 .64 .53
35 .74 .87 .68 .55
44 .74 .88 .67 .57
9 .26 .45 .24 .51
18 .54 .71 .53 .52
27 .51 .74 .50 .58
36 .46 .58 .46 .30
45 .53 0.62 .53 .26
ECV (%) | 12.69 13.43 10.3 6.61 11.81 9.39 14.96 16.75 9.63 | 57.98 7.83 2.70 3.74 4.52 4.20 3.73 4.10 6.45 4.75
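For readers who want to reproduce an invariance sequence of the kind summarised in Table 8 below, a schematic R/lavaan sketch is given here. The model syntax, data and grouping variable are taken from lavaan's built-in HolzingerSwineford1939 example (grouping by school), standing in for the nine-factor CCQ-P model grouped by educational level; this illustrates the general procedure only and is not the authors' actual script.

  library(lavaan)

  model <- '
    visual  =~ x1 + x2 + x3
    textual =~ x4 + x5 + x6
    speed   =~ x7 + x8 + x9
  '

  # Configural: same structure in every group, all parameters free
  configural <- cfa(model, data = HolzingerSwineford1939,
                    group = "school", estimator = "MLR")

  # Metric: factor loadings constrained to equality across groups
  metric <- cfa(model, data = HolzingerSwineford1939,
                group = "school", estimator = "MLR",
                group.equal = "loadings")

  # Scalar: loadings and intercepts constrained across groups
  scalar <- cfa(model, data = HolzingerSwineford1939,
                group = "school", estimator = "MLR",
                group.equal = c("loadings", "intercepts"))

  # Compare steps: chi-square difference tests plus changes in CFI/RMSEA
  # (a drop in CFI larger than about .01 is often read as non-invariance)
  lavTestLRT(configural, metric, scalar)
  sapply(list(configural = configural, metric = metric, scalar = scalar),
         fitMeasures, fit.measures = c("cfi.robust", "rmsea.robust"))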
Table 8
Invariance indices for group analysis (primary, middle and high school).

Measure     Configural invariance   Metric invariance   Scalar invariance   Strict invariance
CFI         .88                     .87                 .83                 .77
Delta CFI   NA                      .004                .04                 .05
RMSEA       .042                    .042                .048                .056

… general score.
When we inspected the invariance of the model, we found good configural invariance: the model with 9 correlated factors obtained acceptable fit indices when it was tested for each group separately as well as when the groups were considered together. The metric invariance results also suggest that the item loadings remained similar for all 3 tested groups. This means that the CCQ constructs are measured in a similar manner at different educational levels, and that one can compare results obtained in studies that used students from different educational levels. Scalar invariance is needed in order to compare latent means (Milfont & Fischer, 2010); therefore, if scalar invariance does not hold, one cannot compare the responses of students from different pre-tertiary levels. Unlike the results for configural and metric invariance, we found inconclusive results regarding scalar invariance: some indices supported the scalar invariance of the CCQ at different educational levels, while other indices suggested that the intercepts were not equal. Therefore, we cannot formulate strong conclusions regarding the equivalence of the CCQ factor means, and we encourage future studies to investigate this matter further.
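In practical terms, this sequence of nested models (configural, metric, scalar, strict) can be specified directly in standard SEM software. The sketch below, written for the lavaan package cited in the references (Rosseel, 2012), is only an illustration under assumed variable names, not the syntax used in this study: the item names (ccq1 … ccq45), the grouping variable `level`, and the abbreviated two-factor structure are placeholders. It fits the four models across the three educational levels, collects CFI, RMSEA and Delta CFI values of the kind reported in Table 8, and shows one way of probing which intercepts might be responsible for inconclusive scalar results.

```r
# Minimal, hypothetical sketch (not the authors' actual syntax): multi-group
# invariance tests for a multi-factor CFA using lavaan (Rosseel, 2012).
# Item names, the grouping variable `level`, and the factors shown here are
# assumptions for illustration; the full CCQ-P model has nine correlated factors.
library(lavaan)

ccq_model <- '
  cohesiveness    =~ ccq1 + ccq10 + ccq19 + ccq28 + ccq37
  teacher_support =~ ccq2 + ccq11 + ccq20 + ccq29 + ccq38
  # ... the remaining seven factors would be specified analogously ...
'

# Each step adds equality constraints across the three educational levels.
fit_configural <- cfa(ccq_model, data = ccq_data, group = "level")
fit_metric     <- cfa(ccq_model, data = ccq_data, group = "level",
                      group.equal = "loadings")
fit_scalar     <- cfa(ccq_model, data = ccq_data, group = "level",
                      group.equal = c("loadings", "intercepts"))
fit_strict     <- cfa(ccq_model, data = ccq_data, group = "level",
                      group.equal = c("loadings", "intercepts", "residuals"))

# Fit indices of the kind reported in Table 8, plus the drop in CFI between
# successive models.
fits    <- list(configural = fit_configural, metric = fit_metric,
                scalar = fit_scalar, strict = fit_strict)
indices <- sapply(fits, fitMeasures, fit.measures = c("cfi", "rmsea"))
delta_cfi <- c(NA, -diff(indices["cfi", ]))
rbind(indices, delta_cfi = delta_cfi)

# When scalar invariance is questionable, score tests can point to the
# intercept constraints that contribute most to the misfit (partial invariance).
lavTestScore(fit_scalar)
```

A drop in CFI larger than roughly .01 between successive models is a commonly used, though not universal, signal that the added constraints do not hold.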
The need for more evidence regarding the invariance of the CCQ across educational levels is highlighted by the Theory of Mind (ToM) literature. Theory of mind is the ability to understand mental states, and it involves inferences about others' cognitive (cognitive theory of mind) and emotional (affective theory of mind) mental states (Vetter, Altgassen, Phillips, Mahy, & Kliegel, 2013). In turn, affective ToM is conceptualized as a combination of cognitive ToM and empathy (Shamay-Tsoory, Harari, Aharon-Peretz, & Levkovitz, 2010). As behavioral and neuroimaging evidence suggests, affective ToM development does not happen only during childhood, as cognitive ToM does, but spans the adolescence period (Vetter et al., 2013). This could make it more difficult for younger children to understand complex emotional states. Items such as "the teacher cares about my feelings" and "the teacher is interested in how I am going" require not only the observation of direct, explicit behaviors, but also subtle inferences about how the teacher feels. Therefore, such items could require psychological capacities described by ToM that are not yet fully developed in primary school.

6.1. Implications

The present study could have implications for both research and practice in the field. First, our study confirmed that the CCQ-P has adequate internal validity at three pre-tertiary educational levels (i.e., primary, middle and high school). Thus, further research that aims to investigate classroom climate in the context of students' transition between education levels could be approached more easily. Second, the present study is the first validation of the CCQ-P on a population (i.e., Romanian) different from the one on which it was developed (i.e., Australian; Aldridge & Galos, 2018). This evidence could be interpreted as a premise for the utility of this scale in cross-cultural studies.

Beyond the contribution that our study can make to research on classroom climate, we also consider it to have practical implications. As Fraser (2012a) states, the assessment of classroom climate is important in order to: i) provide information about the classroom that is subtler than that obtained from student learning outcome measures, ii) compare teachers' perceptions with the feedback obtained from the children, and iii) evaluate the effectiveness of educational and curricular innovations. These implications of the assessment of classroom climate could now be extended beyond the context of a single educational level.

6.2. Limits

There are a few limitations that must be taken into account when interpreting the results of our study. As we showed in the results section, the CCQ-P showed an invariant structure across the primary, middle school and high school levels, except for scalar invariance (which provided inconclusive results).

Another limit is the lack of questionnaires validated on the Romanian population that assess the concept of classroom climate. This prevented us from checking whether the results obtained with our survey correlated with the results of other questionnaires that measure similar concepts; because of this, we were not able to examine the convergent validity of the CCQ-P.

A third limit that should be considered is the generalization of our results. We found promising results for the use of the CCQ-P at all three educational levels (primary school, middle school and high school), but it should not be forgotten that our study only includes a Romanian population. Moreover, the sample that we used was a convenience sample rather than a random, representative sample, so the results should not be generalized beyond the scope of this study. In addition, in this research we focused on a single possible source of measurement variance (i.e., the education cycles). Future research could also investigate whether the CCQ-P is invariant from one classroom to another (i.e., within the same school) or from one school to another.
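A brief, equally hypothetical sketch of such a follow-up, under the same assumed data and model as above: the invariance sequence stays the same and only the grouping variable changes (with many small groups, the sample size available per classroom or school would also need attention).

```r
# Hypothetical follow-up (same assumed data and model as the sketch above):
# re-run the invariance sequence with the classroom, rather than the
# educational level, as the grouping variable.
fit_class_configural <- cfa(ccq_model, data = ccq_data, group = "classroom")
fit_class_metric     <- cfa(ccq_model, data = ccq_data, group = "classroom",
                            group.equal = "loadings")
fitMeasures(fit_class_metric, c("cfi", "rmsea"))
```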
A final limitation could be related to the phrasing and the adaptation of the questionnaire. At the primary level, classes are taught by a single teacher, but when students transition to middle school and high school each discipline becomes a subject of its own and is taught by a separate teacher. This raised the question of whether to analyze classroom climate as a subject-specific concept or as a general classroom concept. As Evans et al. (2009) state, the literature points to three differentiable components: the academic component (pedagogical and curricular elements), the management component (discipline styles for maintaining order) and the emotional component (affective interactions within the classroom). These components often overlap, and so we decided to treat classroom climate as a general concept. Accordingly, for the middle school and high school versions we transformed the items that contained the singular "teacher" into the plural "teachers" (e.g., "The teachers ask me questions"). When searching the literature, we could not find articles comparing the general classroom climate with the climate recorded for a particular subject; we propose that future research be done in this direction.

7. Conclusions

Our study was designed to address the problem of parallel instruments, which slows down the research process in the field of classroom climate. To do this, we chose the CCQ-P and analyzed its internal structure at three educational levels: primary, middle school and high school. This is the first cultural adaptation of the CCQ-P, and we confirmed the original model of the CCQ-P at the three educational levels on which it was investigated. We also conducted measurement invariance analyses to check whether the structure of the questionnaire varies between levels. We suggest that future research could consider validating the CCQ-P across educational levels in other populations, as well as testing the convergent validity of the questionnaire. Also, a closer look should be taken at the interpretation of classroom climate as a general classroom climate and/or as a case of subject-specificity.


Funding

This work was partially supported by three grants of the Romanian Ministry of National Education, CNFIS-UEFISCDI, project numbers CNFIS-FDI-2017-0518, CNFIS-FDI-2018-0063 and CNFIS-FDI-2019-0408. This organization had no role in the design and implementation of the study.

Appendix A. Supplementary data

Supplementary material related to this article can be found, in the online version, at doi: https://doi.org/10.1016/j.stueduc.2021.100976.

References

Alansari, M., & Rubie-Davies, C. (2019). What about the tertiary climate? Reflecting on five decades of class climate research. Learning Environments Research, 1–25. https://doi.org/10.1007/s10984-019-09288-9.
Aldridge, J. M., & Galos, S. (2018). Development and validation of an instrument to assess primary school students' perceptions of the learning environment. Learning Environments Research, 21, 349–368. https://doi.org/10.1007/s10984-017-9248-7.
Aldridge, J. M., Dorman, J. P., & Fraser, B. J. (2004). Use of multitrait-multimethod modelling to validate actual and preferred forms of the technology-rich outcomes-focused learning environment inventory (TROFLEI). Australian Journal of Educational & Developmental Psychology, 4, 110–125. https://doi.org/10.1007/s10984-017-9248-7.
Aldridge, J., Fraser, B., & Ntuli, S. (2009). Utilising learning environment assessments to improve teaching practices among in-service teachers undertaking a distance-education programme. South African Journal of Education, 29(2), 147–170. Retrieved from https://www.ajol.info/index.php/saje/article/view/44147.
Aldridge, J. M., Fraser, B. J., Taylor, P. C., & Chen, C. C. (2000). Constructivist learning environments in a crossnational study in Taiwan and Australia. International Journal of Science Education, 22, 37–55. https://doi.org/10.1080/095006900289994.
Barr, J. J. (2016). Developing a positive classroom climate. IDEA Paper #61, October 2016, 1–9. Retrieved from https://eric.ed.gov/?id=ED573643.
Barth, J. M., Dunlap, S. T., Dane, H., Lochman, J. E., & Wells, K. C. (2004). Classroom environment influences on aggression, peer relations, and academic focus. Journal of School Psychology, 42, 115–133. https://doi.org/10.1016/j.jsp.2003.11.004.
Bell, L. M., & Aldridge, J. M. (2014). Investigating the use of student perception data for teacher reflection and classroom improvement. Learning Environments Research, 17, 371–388. https://doi.org/10.1007/s10984-014-9164-z.
Benson, J. (2012). An investigation into the effectiveness of an IT-based learning management system to support learning in a New Zealand primary school. Doctoral dissertation, Curtin University. Retrieved from https://espace.curtin.edu.au/handle/20.500.11937/2400.
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In M. W. Brown, & R. Cudek (Eds.), Testing structural equations models (pp. 136–163). California, USA: Sage focus editions.
Burnett, P. C., & Dart, B. C. (1997). Conventional versus confirmatory factor analysis: Methods for validating the structure of existing scales. Journal of Research and Development in Education, 30, 126–132. Retrieved from http://eprints.qut.edu.au/27860/1/c27860.pdf.
Byrne, B. M. (2001). Structural equation modeling with Amos: Basic concepts, applications, and programming. Mahwah, New Jersey: Erlbaum.
Charalampous, K., & Kokkinos, C. M. (2017). The Greek elementary "What Is Happening In this Class?" (G-EWIHIC): A three-phase multi-sample mixed-methods study. Studies in Educational Evaluation, 52, 55–70. https://doi.org/10.1016/j.stueduc.2016.12.005.
Chionh, Y. H., & Fraser, B. J. (2009). Classroom environment, achievement, attitudes and self-esteem in geography and mathematics in Singapore. International Research in Geographical and Environmental Education, 18, 29–44. https://doi.org/10.1080/10382040802591530.
Chua, K. E., & Karpudewan, M. (2017). The role of motivation and perceptions about science laboratory environment on lower secondary students' attitude towards science. Asia-Pacific Forum on Science Learning and Teaching, 18(2), 1–16. Retrieved from https://bit.ly/2qO6GZp.
Dart, B., Burnett, P., Boulton-Lewis, G., Campbell, J., Smith, D., & McCrindle, A. (1999). Classroom learning environments and students' approaches to learning. Learning Environments Research, 2, 137–156. https://doi.org/10.1023/A:1009966107233.
Dorman, J. P. (2003). Cross-national validation of the What Is Happening In this Class? (WIHIC) questionnaire using confirmatory factor analysis. Learning Environments Research, 6, 231. https://doi.org/10.1023/A:1027355123577.
Dorman, J. P. (2008). Use of multitrait-multimethod modelling to validate actual and preferred forms of the What Is Happening In this Class? (WIHIC) questionnaire. Learning Environments Research, 11, 179–193.
Earle, J. E., & Fraser, B. J. (2017). Evaluating online resources in terms of learning environment and student attitudes in middle-grade mathematics classes. Learning Environments Research, 20, 339–364. https://doi.org/10.1007/s10984-016-9221-x.
Evans, I. M., Harvey, S. T., Buckley, L., & Yan, E. (2009). Differentiating classroom climate concepts: Academic, management, and emotional environments. Kōtuitui: New Zealand Journal of Social Sciences Online, 4, 131–146. https://doi.org/10.1080/1177083X.2009.9522449.
Fraser, B. J. (1989). Twenty years of classroom climate work: Progress and prospect. Journal of Curriculum Studies, 21, 307–327. https://doi.org/10.1080/0022027890210402.
Fraser, B. J. (2019). Milestones in the evolution of the learning environments field over the past three decades. In D. Zandvliet, & B. J. Fraser (Eds.), Thirty years of learning environments: Looking back and looking forward (pp. 1–18). Leiden: Brill Sense.
Fraser, B. J. (2012a). Classroom environment (vol. 234). London and New York: Routledge, Taylor and Francis Group.
Fraser, B. J. (2012b). Classroom learning environments: Retrospect, context and prospect. In B. Fraser, K. Tobin, & C. McRobbie (Eds.), Second international handbook of science education (pp. 1191–1239). Dordrecht: Springer.
Fraser, B. J., & Fisher, D. L. (1982). Predictive validity of my class inventory. Studies in Educational Evaluation, 8, 129–140. https://doi.org/10.1016/0191-491X(82)90004-9.
Fraser, B. J., & Lee, S. S. (2009). Science laboratory classroom environments in Korean high schools. Learning Environments Research, 12, 67–84. https://doi.org/10.1007/s10984-008-9048-1.
Fraser, B. J., & Tobin, K. (1991). Combining qualitative and quantitative methods in classroom environment research. In B. J. Fraser, & H. J. Walberg (Eds.), Educational environments: Evaluation, antecedents and consequences (pp. 271–292). London: Pergamon.
Fraser, B. J., & Treagust, D. F. (1986). Validity and use of an instrument for assessing classroom psychosocial environment in higher education. Higher Education, 15, 37–57. https://doi.org/10.1007/BF00138091.
Fraser, B. J. (1998). Classroom environment instruments: Development, validity and applications. Learning Environments Research, 1, 7–33. https://doi.org/10.1023/A:1009932514731.
Fraser, B. J., McRobbie, C. J., & Giddings, G. J. (1993). Development and cross-national validation of a laboratory classroom environment instrument for senior high school science. Science Education, 77, 1–24. https://doi.org/10.1002/sce.3730770102.
Fraser, B. J., Aldridge, J. M., & Soerjaningsih, W. (2010). Instructor-student interpersonal interaction and student outcomes at the university level in Indonesia. The Open Education Journal, 3, 21–33. https://doi.org/10.2174/1874920801003010021.
Fraser, B. J., McRobbie, C. J., & Fisher, D. L. (1996). Development, validation and use of personal and class forms of a new classroom environment instrument. Paper presented at the annual meeting of the American Educational Research Association, New York.
Gunawardena, K. M. (1988). Technology-rich learning environments in New Zealand ITPs. The National Advisory Committee on Computing Qualifications was formed in 1988, 35.
Holder, M. D., & Coleman, B. (2009). The contribution of social relationships to children's happiness. Journal of Happiness Studies, 10, 329–349. https://doi.org/10.1007/s10902-007-9083-0.
Houston, L. S., Fraser, B. J., & Ledbetter, C. E. (2008). An evaluation of elementary school science kits in terms of classroom environment and student attitudes. Journal of Elementary Science Education, 20(4), 29–47. https://doi.org/10.1007/BF03173675.
Hu, L. T., & Bentler, P. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55. https://doi.org/10.1080/10705519909540118.
Ilie, M. D. (2014). An adaption of Gagné's instructional model to increase the teaching effectiveness in the classroom: The impact in Romanian Universities. Educational Technology Research and Development, 62, 767–794. https://doi.org/10.1007/s11423-014-9353-6.
Ilie, M. D., Crașovan, M., Petrescu, P., Matichescu, M. L., Sârbescu, P., & Florea, C. I. (2014). EU citizen: Handbook of instructional strategies on evidence-based foundation for teaching in primary schools. Cluj-Napoca: Eikon.
Johnson, B., & McClure, R. (2004). Validity and reliability of a shortened, revised version of the Constructivist Learning Environment Survey (CLES). Learning Environments Research, 7, 65–80. https://doi.org/10.1023/B:LERI.0000022279.89075.9f.
Kawachi, I., & Berkman, L. F. (2001). Social ties and mental health. Journal of Urban Health, 78, 458–467. https://doi.org/10.1093/jurban/78.3.458.
Kenny, D. A. (2015). Measuring model fit. Retrieved from http://davidakenny.net/cm/fit.htm.
Lee, S. S., Fraser, B. J., & Fisher, D. L. (2003). Teacher–student interactions in Korean high school science classrooms. International Journal of Science and Mathematics Education, 1, 67–85. https://doi.org/10.1023/A:1026191226676.
Lightburn, M. E., & Fraser, B. J. (2007). Classroom environment and student outcomes among students using anthropometry activities in high-school science. Research in Science & Technological Education, 25, 153–166. https://doi.org/10.1080/02635140701250576.
Logan, K. A., Crump, B. J., & Rennie, L. J. (2006). Measuring the computer classroom environment: Lessons learned from using a new instrument. Learning Environments Research, 9, 67–93. https://doi.org/10.1007/s10984-005-9004-2.
Majeed, A., Fraser, B. J., & Aldridge, J. M. (2002). Learning environment and its association with student satisfaction among mathematics students in Brunei Darussalam. Learning Environments Research, 5, 203–226. https://doi.org/10.1023/A:1020382914724.
Martin-Dunlop, C., & Fraser, B. J. (2008). Learning environment and attitudes associated with an innovative science course designed for prospective elementary teachers. International Journal of Science and Mathematics Education, 6, 163–190. https://doi.org/10.1007/s10763-007-9070-2.
Meredith, W. (1993). Measurement invariance, factor analysis and factorial invariance. Psychometrika, 58, 525–543. https://doi.org/10.1007/BF02294825.


Milfont, T. L., & Fischer, R. (2010). Testing measurement invariance across groups: Applications in cross-cultural research. International Journal of Psychological Research, 3, 111–121. https://doi.org/10.21500/20112084.857.
Mink, D. V., & Fraser, B. J. (2005). Evaluation of a K–5 mathematics program which integrates children's literature: Classroom environment and attitudes. International Journal of Science and Mathematics Education, 3, 59–85. https://doi.org/10.1007/s10763-004-2975-0.
Moos, R. H. (1974). The social climate scales: An overview. Palo Alto, CA: Consulting Psychologists Press.
Nair, C. S., & Fisher, D. L. (1999). A learning environment study of tertiary classrooms. Proceedings Western Australian Institute for Educational Research Forum 1999. http://www.waier.org.au/forums/1999/nair.html.
Ogbuehi, P. I., & Fraser, B. J. (2007). Learning environment, attitudes and conceptual development associated with innovative strategies in middle-school mathematics. Learning Environments Research, 10, 101–114. https://doi.org/10.1007/s10984-007-9026-z.
Passini, S., Molinari, L., & Speltini, G. (2015). A validation of the Questionnaire on Teacher Interaction in Italian secondary school students: The effect of positive relations on motivation and academic achievement. Social Psychology of Education, 18, 547–559. https://doi.org/10.1007/s11218-015-9300-3.
Peer, J. (2011). Gender, grade-level and stream differences in learning environment and student attitudes in primary science classrooms in Singapore. Doctoral thesis, Curtin University, Perth, Australia. Retrieved from http://hdl.handle.net/20.500.11937/1158.
Rentoul, A. J., & Fraser, B. J. (1979). Conceptualization of enquiry-based or open classroom learning environments. Journal of Curriculum Studies, 11, 233–245. https://doi.org/10.1080/0022027790110306.
Roland, E., & Galloway, D. (2002). Classroom influences on bullying. Educational Research, 4, 299–312. https://doi.org/10.1080/0013188022000031597.
Rosseel, Y. (2012). Lavaan: An R package for structural equation modeling and more. Version 0.5–12 (BETA). Journal of Statistical Software, 48(2), 1–36.
Sass, D. A. (2011). Testing measurement invariance and comparing latent factor means within a confirmatory factor analysis framework. Journal of Psychoeducational Assessment, 29, 347–363. https://doi.org/10.1177/0734282911406661.
Schmitt, N., & Kuljanin, G. (2008). Measurement invariance: Review of practice and implications. Human Resource Management Review, 18, 210–222. https://doi.org/10.1016/j.hrmr.2008.03.003.
Scott, R. H., & Fisher, D. L. (2004). Development, validation and application of a Malay translation of an elementary version of the Questionnaire on Teacher Interaction. Research in Science Education, 34, 173–194. https://doi.org/10.1023/B:RISE.0000033759.09807.50.
Shamay-Tsoory, S. G., Harari, H., Aharon-Peretz, J., & Levkovitz, Y. (2010). The role of the orbitofrontal cortex in affective theory of mind deficits in criminal offenders with psychopathic tendencies. Cortex, 46, 668–677. https://doi.org/10.1016/j.cortex.2009.04.008.
Skinner, E. A., Edge, K., Altman, J., & Sherwood, H. (2003). Searching for the structure of coping: A review and critique of category systems for classifying ways of coping. Psychological Bulletin, 129, 216–269. Retrieved from https://psycnet.apa.org/record/2003-01977-005.
Skordi, P., & Fraser, B. J. (2019). Validity and use of the What Is Happening In this Class? (WIHIC) questionnaire in university business statistics classrooms. Learning Environments Research, 22, 275–295. https://doi.org/10.1007/s10984-018-09277-4.
Stucky, B. D., & Edelen, M. O. (2015). Using hierarchical IRT models to create unidimensional measures from multidimensional data. In S. P. Reise, & D. A. Revicki (Eds.), Handbook of item response theory modeling: Applications to typical performance assessment (pp. 183–206). New York: Routledge.
Stucky, B. D., Thissen, D., & Edelen, M. O. (2013). Using logistic approximations of marginal trace lines to develop short assessments. Applied Psychological Measurement, 37, 41–57. https://doi.org/10.1177/0146621612462759.
Taylor, P. C., & Fraser, B. J. (1991, April). CLES: An instrument for assessing constructivist learning environments. Paper presented at the annual meeting of the National Association for Research in Science Teaching (NARST).
Trochim, W. M., & Donnelly, J. P. (2006). The research methods knowledge base (3rd ed.). Cincinnati: Atomic Dog.
Vandenbroucke, L., Spilt, J., Verschueren, K., Piccinin, C., & Baeyens, D. (2018). The classroom as a developmental context for cognitive development: A meta-analysis on the importance of teacher–student interactions for children's executive functions. Review of Educational Research, 88, 125–164. https://doi.org/10.3102/0034654317743200.
Velayutham, S., & Aldridge, J. M. (2013). Influence of psychosocial classroom environment on students' motivation and self-regulation in science learning: A structural equation modeling approach. Research in Science Education, 43, 507–527. https://doi.org/10.1007/s11165-011-9273-y.
Velayutham, S., Aldridge, J. M., & Fraser, B. J. (2011). Development and validation of an instrument to measure students' motivation and self-regulation in science learning. International Journal of Science Education, 33, 2159–2179. https://doi.org/10.1080/09500693.2010.541529.
Vetter, N. C., Altgassen, M., Phillips, L., Mahy, C. E., & Kliegel, M. (2013). Development of affective theory of mind across adolescence: Disentangling the role of executive functions. Developmental Neuropsychology, 38, 114–125. https://doi.org/10.1080/87565641.2012.733786.
Widaman, K. F., Ferrer, E., & Conger, R. D. (2010). Factorial invariance within longitudinal structural equation models: Measuring the same construct across time. Child Development Perspectives, 4, 10–18. https://doi.org/10.1111/j.1750-8606.2009.00110.x.
Wilson, H. K., Pianta, R. C., & Stuhlman, M. (2007). Typical classroom experiences in first grade: The role of classroom climate and functional risk in the development of social competencies. The Elementary School Journal, 108, 81–96. Retrieved from https://www.journals.uchicago.edu/doi/abs/10.1086/525548.
Wolf, S. J., & Fraser, B. J. (2008). Learning environment, attitudes and achievement among middle-school science students using inquiry-based laboratory activities. Research in Science Education, 38, 321–341. https://doi.org/10.1007/s11165-007-9052-y.
Wubbels, T., & Levy, J. (1991). A comparison of interpersonal behavior of Dutch and American teachers. International Journal of Intercultural Relations, 15, 1–18. https://doi.org/10.1016/0147-1767(91)90070-W.
Yates, S. M. (2011). Single sex schoolboys' perceptions of coeducational classroom learning environments. Learning Environments Research, 14, 1–10. https://doi.org/10.1007/s10984-011-9079-x.
