Higher Education Research & Development

ISSN: 0729-4360 (Print) 1469-8366 (Online) Journal homepage: www.tandfonline.com/journals/cher20

The Relationship between Students' Course Perception and their Approaches to Studying in Undergraduate Science Courses: A Canadian experience

Carolin Kreber

To cite this article: Carolin Kreber (2003) The Relationship between Students' Course
Perception and their Approaches to Studying in Undergraduate Science Courses: A
Canadian experience, Higher Education Research & Development, 22:1, 57-75, DOI:
10.1080/0729436032000058623

To link to this article: https://doi.org/10.1080/0729436032000058623

Published online: 14 Jul 2010.

Higher Education Research & Development, Vol. 22, No. 1, 2003

The Relationship between Students' Course Perception and their Approaches to Studying in Undergraduate Science Courses: A Canadian experience
CAROLIN KREBER
University of Alberta

ABSTRACT Research on students' approaches to learning in higher education has consistently demonstrated strong relationships between approaches to studying and perceptions of the learning environment. The vast majority of these studies have been carried out with Australian and British students. Using the Approaches and Study Skills Inventory for Students (ASSIST) and the Course Experience Questionnaire (CEQ), this study investigated these relationships with a large sample of Canadian undergraduates. The factor structure of the ASSIST was confirmed at the main scale level. The factor structure of the CEQ was largely confirmed, though some small changes were noted. Previous findings of significant correlations between approaches and CEQ scales were supported. The strongest relationships were found between heavy workload/inappropriate assessment and surface approach, and between generic skills and deep approach. Consistent with other studies, age was found to be a significant variable with regard to approaches. Implications for the practice of higher education staff development are discussed.

Introduction: Approaches to Studying and Learning Environments


The study draws on two perspectives well established in the post-secondary learning literature. The first is the notion of approaches to studying, as developed by Biggs (1987), Entwistle (1998), Entwistle and Ramsden (1983), Tait, Entwistle, and McCune (1997) and many others. In this situation-specific construct (approaches to studying are not considered stable across courses), three approaches to learning are distinguished: (1) a deep approach, characterised by a desire to understand underlying principles; (2) a surface approach, characterised by a desire to cope with content or task sets by memorising detail to get by; and (3) a strategic approach, characterised by strong attention to study organisation, time management, and a desire to excel in a course by understanding what is expected and doing what is needed to achieve good marks. The second, related perspective is the idea of perception of the learning environment within a course (Entwistle & Ramsden, 1983; Ramsden, 1979, 1991, 1992), and how both approaches to studying and
ISSN 0729-4360 print; ISSN 1469-8366 online/03/010057-19 © 2003 HERDSA
DOI: 10.1080/0729436032000058623
course perception influence student learning (e.g., Donald, 1997; Wilson & Lizzio, 1997). While these relationships are well established in conceptual as well as empirical terms in the higher education teaching and learning literature, the present study sought to contribute by investigating them with a large sample of Canadian undergraduate science students. Furthermore, since one of the two questionnaires administered to the sample was a relatively new instrument (Entwistle, Tait, & McCune, 2000), the study also tests the reliability of this instrument with a Canadian sample. Though the second instrument has been widely tested (Wilson & Lizzio, 1997; Ramsden, 1991, 1999), the present study explored whether its reported factor structure would be confirmed with a Canadian sample.

Deep-level Learning
When Ramsden (1992) reviewed the findings from several research studies asking university instructors what they considered to be important goals of their teaching, he discovered that virtually all studies showed that instructors hoped their students would develop critical thinking skills and apply them to the subjects they were studying.
It is clear from several studies that the ideas expressed by teachers in higher
education … can be summarized as understanding the key concepts; an
ability to go beyond the orthodox and the expected so that hitherto unmet
problems can be tackled with spirit; a facility with typical methods of
approaching a problem in the discipline and … an awareness of what
learning and understanding in the discipline consist of. In other words,
lecturers describe content-related versions, with a substantive and a procedural or syntactic element, of the general principles of "critical thinking"
and understanding. (p. 21)
What seems to be underlying these instructors’ comments is a concern with
deep-level learning on the part of students.

Why Instructors Should Be Concerned with Deep-level Learning


Deep-level learning, typically understood as learning characterised by a motivation to seek meaning, understand underlying principles and identify relationships between ideas or concepts, has been shown to be an important prerequisite for self-directed learning (e.g., Candy, 1991). In a globalised world characterised by rapid technological, economic, cultural and political changes, the need for students to acquire skills associated with lifelong learning has been repeatedly highlighted (e.g., Bandura, 1997; Boud, 1997; Candy, 1991; Knapper & Cropley, 2000; Kreber, 1998, 2002; Kreber, Cranton, & Allen, 2000; Little, 1983; Lengrand, 1986). Candy (1991) showed how the concept of self-directed learning is linked to both deep-level learning and lifelong learning. In order to be able to learn throughout the lifespan, people need to become self-directed learners who are capable of
managing their own learning as well as thinking critically. Such self-directed learning
requires a deep-level approach.

Factors That Support or Hinder the Encouragement of Deep-level Learning


One way in which people may acquire the skills necessary for self-directed and, by
extension, lifelong learning is through post-secondary education. Consequently,
institutions of higher education have a responsibility to create learning environments
that promote such deep-level learning. The extent to which this actually happens
depends not least on the individual instructors teaching certain courses, but it is also
linked to the structure of the programme and the culture of the department.
Teachers in higher education operate in a challenging environment characterised by
rising pressures to remain competitive in the research arena in addition to teaching
a steadily increasing and diverse student population in times of decreasing resources
(e.g., Evans, Forney, & Guido-DiBrito, 1998; Martin & Ramsden, 2000). An
environment that values research over teaching, however, is not very likely to
provide a great many incentives for academic staff to approach their teaching with
the same level of engagement, rigour and commitment as other aspects of their
scholarship. It has been argued that these conditions could be changed by promoting
a work environment within the institution or department that encourages good
teaching, and by extension, promotes student learning (Knight & Trowler, 2000).
A different approach to the enhancement of student learning is taken in the
United States. The American Carnegie Foundation for the Advancement of Teaching sponsors a nationwide programme, called the Pew Scholar Fellowship Program
(Cambridge, 2000), aimed at enhancing student learning through the study of
teaching and learning. Specifically, academic teaching staff actively explore teaching
and learning issues within their own discipline as they turn to pedagogical literature
to familiarise themselves with the existing knowledge on the issue, use this as a
baseline for their own research and subject their work to peer-review. Of particular
interest are the criteria by which the work is reviewed. A few years ago, the Foundation released a report suggesting six criteria by which to evaluate academic work in any of Boyer's (1990) four scholarships: the scholarship of discovery, the scholarship of teaching, the scholarship of integration and the scholarship of application (Glassick, Huber, & Maeroff, 1997). Irrespective of the domain of scholarship, the scholar is expected to demonstrate (1) clear goals, (2) adequate preparation, (3) appropriate methods, (4) significant results, (5) effective presentation and (6) reflective critique (Glassick, Huber, & Maeroff, 1997, p. 25). Though these criteria have not remained unquestioned (e.g., Kreber & Cranton, 2000; Kreber, 2002), they are useful for the purpose of the present study as they highlight the relationship between educational goals, methods and results. The study reported in this article explored these relationships in an indirect way. Working from the premise, supported by Ramsden's (1992) findings, that instructors approach their teaching with the intent to promote critical thinking and deep-level learning (their educational goal), it was investigated whether students' perception of the learning environment experienced
in a particular course (this perception, it is argued, is shaped not least by the pedagogical methods used by the teacher) might explain the approaches to studying (the result) they adopt for that same course. Specifically, the objective of this study was to explore the relationship between students' perception of the learning environment created in a semester-long course and their approaches to studying in this course.

Data Collection
The sample for this study consisted of a total of 1080 undergraduate science
students of whom 591 were female. Seventy-five per cent of the sample were
between 18 and 22 years of age, sixteen per cent were between 23 and 27, five per
cent between 28 and 35, and the rest (2.8 per cent) 36 or older. The study was part
of a larger investigation into the relationships between science instructors' knowledge as well as conceptions of teaching and their students' learning. Data were
collected at several Canadian universities between the fall of 2000 and winter of
2002 from a total of forty-three different classes during the last week of the term.
Though the Canadian higher education system is characterised by relatively little
diversity with respect to the university sector (as compared to, for example, the
United States), three kinds of institutions are typically distinguished among Canada's seventy universities: (1) so-called "primarily undergraduate institutions" (universities with fewer graduate programmes and usually no doctoral programmes), (2) "comprehensive universities" (many master's and doctoral programmes but no medical school) and (3) so-called "medical/doctoral universities", research-intensive institutions with many graduate programmes and large doctoral/medical faculties (Maclean's Magazine, 2000). While all universities offer undergraduate programmes,
the major difference lies in the number and types of graduate programmes they
provide. Students in this study came from all three types of universities.
According to Biglan’s (1973) classic categorisation system of disciplines, the seven
disciplines included in this study could be grouped as follows: earth and atmospheric
sciences (hard, pure, non-life systems), biology (hard, pure, life systems), chemistry
(hard, pure, non-life systems), computer science (hard, pure or applied, non-life
systems), mathematics (hard, pure, non-life systems), physics (hard, pure, non-life
systems), and psychology (either hard or soft, either pure or applied, life systems).
Note that in this study psychology departments, too, were associated with Faculties
of Science.
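The Biglan grouping above can be sketched as a simple lookup table. The encoding below is purely illustrative (the dictionary and variable names are mine, not an instrument used in the study); the dimension values are taken directly from the classification as just described.

```python
# Illustrative encoding of the Biglan (1973) grouping described above;
# each tuple gives the (hard/soft, pure/applied, life/non-life) dimensions.
BIGLAN = {
    "earth and atmospheric sciences": ("hard", "pure", "non-life"),
    "biology": ("hard", "pure", "life"),
    "chemistry": ("hard", "pure", "non-life"),
    "computer science": ("hard", "pure or applied", "non-life"),
    "mathematics": ("hard", "pure", "non-life"),
    "physics": ("hard", "pure", "non-life"),
    "psychology": ("hard or soft", "pure or applied", "life"),
}

# e.g., the disciplines classified here as life systems:
life_systems = sorted(d for d, (_, _, systems) in BIGLAN.items() if systems == "life")
```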
Data were gathered anonymously via two questionnaires that were completed
outside of class time and collected three days later in class. For the purpose of
correlating the data, the questionnaires the students received had been coded. The
first questionnaire was a section of the ASSIST (Approaches and Study Skills
Inventory for Students) (Tait & Entwistle, 1996; Entwistle, Tait, & McCune, 2000).
The ASSIST inventory consists of seven sections, of which the fourth addresses approaches to studying. This section of the ASSIST is a more recent version of the Approaches to Studying Inventory (ASI) originally developed by Entwistle and Ramsden (1983), which has been used in a large number of studies. Consistent with
the underlying construct, this fifty-two-item instrument consists of three main scales
measuring a deep approach, a surface approach, and a strategic approach to
learning, respectively. Each main scale is divided into four to five corresponding
sub-scales. The first main scale, measuring a deep approach, consists of the four sub-scales Seeking Meaning, Relating Ideas, Use of Evidence, and Interest in Ideas, the latter indicating the associated motive for learning. The second main scale, measuring a strategic approach, is made up of the five sub-scales Organised Studying, Time Management, Monitoring Effectiveness, Alertness to Assessment Demands and Achievement Orientation, the latter being the associated motive. Finally, the third main scale, measuring a surface approach, comprises the four sub-scales Lack of Understanding (or Unrelated Memorisation), Lack of Purpose, Syllabus Boundness and Fear of Failure, the latter again being the associated motive for learning. Sub-scale-level factor analysis, as recommended by Tait, Entwistle, and McCune (1997), confirmed the three-factor solution (deep, strategic, and surface) and the association of the learning motives with their corresponding scales (see also Entwistle, Tait & McCune, 2000). When all fifty-two items were subjected to principal component factor analysis, however, the thirteen-factor solution reported by Tait, Entwistle, and McCune (1997) was not confirmed. Consequently, all further analyses were carried out at the main scale level. Not surprisingly, Entwistle, Tait and McCune (2000) report a moderate positive correlation between the deep and strategic factors and negative correlations between the deep and surface factors, as well as between the strategic and surface factors; these findings were confirmed in the present study. Table 1 shows the factor loadings and reliability measures for each sub-scale and the correlations between the three ASSIST factors.
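To make the analytic step just described concrete, the following is a minimal, hypothetical sketch of principal component factor extraction with the Kaiser eigenvalue-greater-than-one criterion, run on synthetic data rather than the study's responses; the three-latent-factor structure is an assumption built into the simulation to mirror the deep/strategic/surface solution.

```python
# Minimal sketch of principal component factor extraction (Kaiser criterion),
# on synthetic data: three assumed latent factors each drive four observed
# "sub-scales" plus measurement noise. Not the study's data or software.
import numpy as np

rng = np.random.default_rng(42)
n = 1080                                  # sample size matching the study
latent = rng.normal(size=(n, 3))          # three uncorrelated latent factors
observed = np.repeat(latent, 4, axis=1) + 0.3 * rng.normal(size=(n, 12))

corr = np.corrcoef(observed, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]         # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int(np.sum(eigvals > 1.0))    # Kaiser eigenvalue > 1 rule
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```

With this construction the eigenvalue-greater-than-one rule retains exactly three components, one per simulated latent factor.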
The second instrument was Ramsden's Course Experience Questionnaire (CEQ). Development of this questionnaire began in the 1980s at the University of Lancaster, and it has been refined through many subsequent studies (e.g., Ramsden, 1991). The theory underlying the CEQ is that students' approaches to learning and the quality of their learning outcomes are determined by their perception of curriculum, instruction and assessment. In the Australian context, the CEQ is employed as a measure of perceived teaching quality in degree programmes in national annual surveys of all graduates in the higher education system, and it is increasingly being used as a measure of the quality of teaching in universities in the UK (Wilson & Lizzio, 1997). The instrument has been tested on well over 65,000 students, and measures of validity and reliability have been reported as satisfactory (e.g., Wilson & Lizzio, 1997; Ramsden, 1999). The CEQ has two versions: a longer one containing thirty-six items and a more widely used, shorter one with twenty-five items. The version of the CEQ used in this study had twenty-five items, including the item "Overall, I was satisfied with the quality of the course" (item 25). While the shorter version of the CEQ does not include the Emphasis on Independence scale, which comprises six items of the longer version, it has a new scale added to address the need to develop lifelong learning skills in students. Such skills include "problem-solving, analytic skills, teamwork, confidence in tackling unfamiliar situations, ability to plan work and written communication skills" (Wilson & Lizzio, 1997, p. 36). This scale was labelled "Generic Skills". Principal components
TABLE 1. Factor Loadings and Cronbach Alpha Coefficients for each ASSIST Sub-Scale (n = 1080)

Approaches to Studying                Factor I   Factor II   Factor III   (alpha)
Deep Approach
  Seeking meaning                       0.73                              (0.62)
  Relating ideas                        0.80                              (0.59)
  Use of evidence                       0.79                              (0.51)
  Interest in ideas                     0.70                              (0.73)
Surface Approach
  Lack of understanding                            0.81                   (0.60)
  Lack of purpose                                  0.65                   (0.72)
  Syllabus boundness                               0.61                   (0.59)
  Fear of failure                                  0.75                   (0.75)
Strategic Approach
  Organized studying                                           0.83       (0.59)
  Time management                                              0.87       (0.80)
  Monitoring effectiveness                                     0.65       (0.62)
  Achievement orientation                                      0.81       (0.67)
  Alertness to assessment demands                              0.48       (0.60)

Correlations between factors:
                           I        II       III
Factor I (Deep)           1.00
Factor II (Surface)      -0.43     1.00
Factor III (Strategic)    0.47    -0.28     1.00

factor analysis yielded the following five factors for the shorter version of the CEQ:
Good Teaching, Clear Goals and Standards, Appropriate Workload, Appropriate
Assessment and Generic Skills (e.g., Wilson & Lizzio, 1997; Ramsden, 1999). Both
Wilson and Lizzio (1997) as well as Ramsden (cited in Richardson, 1994a) found
evidence that the CEQ comprises a higher-order, two-factor structure. The first
higher-order factor includes the scales Good Teaching, Clear Goals and Standards,
Generic Skills and Appropriate Assessment. Appropriate Workload makes up the
second higher-order factor (Wilson & Lizzio, 1997).
Note that the Course Experience Questionnaire was initially designed to assess
students’ experiences with a programme (“course”). The way the term “course” is
used in this article corresponds to the North American interpretation of a course as
a semester-long seminar or lecture usually comprising thirty-six hours of class time
and taught by one instructor. This is very different from an understanding of a course as a programme of study, comprising several semester-long seminars and lectures taught by several instructors and leading to a degree. Though this is somewhat different from what Ramsden had in mind when designing the instrument, the questionnaire was still considered appropriate for the purpose of this study. Some
items had to be slightly reworded to make them more suitable for a semester-long
course.
The decision was made to collect students’ responses to eleven further items to
address the notion of “fostering independence within the discipline”, a dimension
shown to be very relevant in assisting student learning (Donald, 1997). Ten of these
eleven items were adapted directly from Donald’s (1997) list of recommendations of
how to promote effective learning environments. They included:

• Students had a great deal of choice over how they were going to learn
• The instructor showed us how the content of this course fits into the larger
discipline
• Students were given lots of choice in the work they had to do
• We discussed with our instructor how we were going to learn
• The instructor was approachable and willing to help with problems
• The instructor made me aware that learning required thinking on my part
• In this course I was shown how to think within the discipline
• The instructor encouraged student participation
• The instructor helped us to set goals for this course
• Success in this course was a matter of how well I studied

We added one generic item.

• Class time was used effectively

Principal component factor analyses were conducted separately for the twenty-four items of the CEQ (not including the overall satisfaction item) and the eleven new items. Wilson and Lizzio (1997) recommend oblique rather than varimax rotation when factor analysing the CEQ, the rationale being the reported significant moderate positive intercorrelations between the CEQ scales (Ramsden, 1991; Ainley & Long, 1994), so both rotations were carried out. Principal components factor analysis using first oblique rotation and then, for comparison purposes, varimax rotation yielded a six-factor structure in each case. It was therefore decided to base further analyses on varimax rotation. Table 2 shows the six-factor solution for the CEQ and the variance accounted for by each factor.
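For readers unfamiliar with the rotation step, a generic sketch of the SVD-based varimax algorithm follows. This is not the routine actually used for the analyses reported here (which were presumably run in a standard statistics package); `loadings` stands for any unrotated factor-loading matrix.

```python
# Generic sketch of the SVD-based varimax rotation (illustrative only;
# Kaiser normalisation is omitted for brevity).
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonally rotate a p x k loading matrix to the varimax criterion."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # gradient of the varimax simplicity criterion
        target = rotated ** 3 - (gamma / p) * rotated * np.sum(rotated ** 2, axis=0)
        u, s, vt = np.linalg.svd(loadings.T @ target)
        rotation = u @ vt
        criterion = np.sum(s)
        if criterion - criterion_old < tol:
            break
        criterion_old = criterion
    return loadings @ rotation
```

Because the rotation matrix is orthogonal, the rotated solution preserves each item's communality while concentrating its loading on fewer factors.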
With the exception of Ramsden's Good Teaching scale, the analysis confirmed the factor solution reported by Ramsden (1999) and Wilson and Lizzio (1997). Items that loaded on the Good Teaching scale in previously reported factor analyses of the CEQ, however, loaded clearly on two separate factors in the present study. This remained the case when, for comparison purposes, the factor analysis was redone without item 16. This item ("Feedback on my work was usually given only in the form of marks or grades"), which was not included in Ramsden's (1999) and Wilson and Lizzio's (1997) analyses, loaded highly on factor three (Good Teaching A) in our study and had no double loadings. Since all other items confirmed the factor structure reported elsewhere (e.g., Ramsden, 1999; Wilson & Lizzio, 1997), the two factors were labelled "Good Teaching A" (which basically refers to "Feedback on, and concern for, student learning") and "Good Teaching B" (which
TABLE 2. Factor Analysis of the Original 24 Items of the Course Experience Questionnaire (N = 1035)

Factors: 1 = Generic Skills; 2 = Clear Goals and Standards; 3 = Good Teaching A; 4 = Heavy Workload; 5 = Good Teaching B; 6 = Facts-oriented Assessment

Item                                                                                         1       2       3       4       5       6
5. This course sharpened my analytic skills                                                0.743   0.070  -0.004  -0.025   0.208  -0.140
10. As a result of this course, I feel confident about tackling unfamiliar problems        0.742   0.115   0.139  -0.137   0.101  -0.101
2. This course developed my problem solving skills                                         0.712   0.147   0.064   0.001   0.187  -0.207
22. The course helped me to develop the ability to plan my own work                        0.697   0.083   0.098   0.117   0.173   0.043
11. The course improved my skills in written communication                                 0.521  -0.055   0.405  -0.061   0.054   0.086
9. This course helped me develop my ability to work as a team member                       0.488   0.221   0.324   0.064  -0.286  -0.097
1. It was always easy to know the standard of work expected in this course                 0.151   0.790   0.066  -0.137   0.109   0.018
24. The instructor made it clear right from the start what s/he expected from students     0.143   0.733   0.118  -0.004   0.229  -0.005
6. I usually had a clear idea of where I was going and what was expected of me in
   this course                                                                             0.186   0.726   0.022  -0.197   0.222   0.014
13. It was often hard to discover what was expected of me in this course                  -0.059   0.689   0.058  -0.235   0.021  -0.178
17. The instructor normally gave me feedback on how I was doing                            0.246   0.129   0.750  -0.089   0.145  -0.037
7. The instructor put a lot of time into commenting on my work                             0.185   0.193   0.738  -0.013   0.216   0.000
16. Feedback on my work was usually given only in the form of marks or grades             -0.027  -0.081   0.678  -0.098  -0.001  -0.166
15. The instructor made real effort to understand difficulty I might be having with
   my work                                                                                 0.157   0.190   0.512  -0.100   0.421  -0.098
4. The workload in this course was too heavy                                               0.042  -0.173  -0.032   0.774  -0.135   0.087
23. The sheer volume of work to get through in this course meant it couldn't all be
   thoroughly comprehended                                                                -0.147  -0.073  -0.027   0.755  -0.112   0.181
14. I was generally given enough time to learn the things I had to learn                  -0.087  -0.182  -0.146   0.648  -0.148  -0.107
21. There was a lot of pressure on me to do well in this course                            0.135  -0.066  -0.063   0.590   0.146   0.134
20. The instructor worked hard on making the subject interesting                           0.195   0.130   0.109  -0.034   0.763  -0.146
18. The instructor was extremely good in explaining things                                 0.151   0.243   0.149  -0.175   0.728   0.000
3. The instructor of this course motivated me to do my best work                           0.322   0.326   0.285  -0.013   0.614  -0.066
8. To do well in this course all you really needed was a good memory                      -0.247   0.024  -0.116   0.163   0.034   0.720
12. The instructor seemed more interested in testing what I had memorized than what
   I had understood                                                                       -0.071  -0.170  -0.148   0.146  -0.208   0.697
19. The instructor asked me questions just about facts                                    -0.012   0.003   0.007   0.007  -0.025   0.675
% of Variance                                                                              12.9%   11.0%   9.7%    9.2%    9.0%    7.2%
basically refers to "Classroom teaching"). Cronbach alpha reliability coefficients for the six scales ranged from 0.78 for factor one to 0.59 for factor six.
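The alpha coefficients reported for the ASSIST sub-scales and the CEQ factors can be computed as follows; the function is a textbook sketch applied to hypothetical data, not to the study's responses.

```python
# Textbook sketch of Cronbach's alpha for one scale (hypothetical data).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array with rows = respondents, columns = items of one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```

For instance, respondents answering three perfectly consistent items yield an alpha of 1.0, while unrelated items drive alpha towards zero.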
Principal components factor analysis of the added eleven items yielded two factors. The first factor included items 31, 30, 33, 36, 32, 34, 27 and 35. This factor comprises a wide range of items and was tentatively labelled "Independent Thinking and Learning within the Discipline". Items loading on the second factor included 28, 29, and 26. This factor was labelled "Choice in Learning". Cronbach alpha reliability coefficients for these two scales were 0.77 for factor one and 0.71 for factor two. Entwistle and Ramsden (1983, p. 185) investigated relationships between students' approaches and the context of learning in academic departments using previous versions of the instruments employed here (the Course Perception Questionnaire and the Approaches to Studying Inventory) through factor analysis. In order to explore whether our results resembled their earlier findings, we conducted a principal components factor analysis of the three approaches, the six CEQ scales and the two new scales. The analysis yielded three factors, whereby "Choice in Learning", "Good Teaching B", "Good Teaching A", "Independent Thinking and Learning within the Discipline", "Generic Skills" and "Clear Goals and Standards" loaded on the first factor. "Appropriate Workload", "Surface Approach", and "Appropriate Assessment" loaded on the second factor. Strategic and deep approach made up factor three. Two observations can be made regarding these findings. First, the results suggest a higher-order two-factor structure for the CEQ, with Workload and Assessment making up one of the two higher-order factors. Though Wilson and Lizzio (1997) also report a two-factor structure for the CEQ (see also Ramsden, 1991), in their study "Appropriate Workload" was the only scale loading on the second factor. Second, based on the Course Perception Questionnaire, Entwistle and Ramsden (1983) reported the "Freedom in Learning" scale, which was part of this earlier instrument, to load on the same factor as "Good Teaching". This "Freedom in Learning" scale shares similarities with our "Choice in Learning" scale, which is further confirmed by the fact that our "Choice in Learning" scale loaded on the same factor as the two "Good Teaching" scales (A and B).
Step-wise multiple regression analyses were then carried out to investigate whether approaches to studying could be predicted from students' course experience. More specifically, three analyses were conducted whereby each time one of the three ASSIST main scales served as the criterion variable and the six CEQ factors and two additional factors served as predictor variables. The overall satisfaction CEQ item (item 25), gender, age and years of academic experience were added as four further predictor variables. To ensure greater reliability of the results, the analyses were carried out separately for each split half of the total sample (odd-even split) to explore whether the overall variance accounted for by the predictor variables would be considerably different for each sub-sample. The overall variance accounted for in the prediction of deep approach was 20.3 per cent for the entire sample and 19.3 and 21.5 per cent for each sub-sample, respectively. For strategic approach the overall variance accounted for by the entire and two sub-samples ranged from 16.3 to 16.7 per cent, and for surface approach from 35.3 to 37.7 per cent. For all three approaches, the main predictor variables remained the same
TABLE 3. Pearson Correlations between Course Experience and Approaches to Learning (N = 1032)

Variables: 1 = Deep Approach; 2 = Strategic Approach; 3 = Surface Approach; 4 = Generic Skills; 5 = Clear Goals and Standards; 6 = Good Teaching A; 7 = Heavy Workload; 8 = Good Teaching B; 9 = Facts-oriented Assessment; 10 = Independence within Discipline; 11 = Choice in Learning; 12 = Gender; 13 = Age; 14 = Years post-sec. education

                                  1         2         3         4         5         6         7         8         9        10        11        12        13
2. Strategic Approach          0.472**
3. Surface Approach           -0.430**  -0.285**
4. Generic Skills              0.378**   0.340**  -0.317**
5. Clear Goals and Standards   0.160**   0.211**  -0.333**   0.294**
6. Good Teaching A             0.209**   0.177**  -0.226**   0.450**   0.314**
7. Heavy Workload             -0.167**  -0.118**   0.485**  -0.091*   -0.366**  -0.240**
8. Good Teaching B             0.299**   0.235**  -0.290**   0.455**   0.484**   0.476**  -0.248**
9. Facts-oriented Assessment  -0.161**  -0.060     0.375**  -0.274**  -0.171**  -0.264**   0.289**  -0.236**
10. Independence within Disc.  0.329**   0.273**  -0.250**   0.529**   0.457**   0.512**  -0.192**   0.706**  -0.274**
11. Choice in Learning         0.184**   0.141**  -0.159**   0.459**   0.276**   0.407**  -0.182**   0.362**  -0.176**   0.432**
12. Gender                    -0.061     0.130**   0.019     0.016     0.062     0.009    -0.024     0.032    -0.084*    0.031     0.013
13. Age                        0.174**   0.112**  -0.116**   0.050*    0.075     0.090*   -0.037     0.079    -0.074     0.057     0.012    -0.069
14. Years post-sec. education  0.088*    0.053    -0.077    -0.002    -0.071     0.092*   -0.030     0.034    -0.071    -0.030    -0.091*   -0.055     0.435**

**Correlation is significant at the 0.001 level (2-tailed).
*Correlation is significant at the 0.01 level (2-tailed).
when analyses were done for each split half of the sample, though the order in which subsequent predictor variables entered the regression equation changed slightly in some cases. Perhaps most importantly, in the prediction of strategic approach, gender and age were the second and third best predictors (though together they accounted for only a small percentage of the variance) for the total sample, but these results were not confirmed with half of the sample. Similarly, in the prediction of deep approach, gender was the last variable to enter the regression analysis for the entire sample (accounting for only a very small amount of variance), and these results were not confirmed when the sample was split. Overall, however, these results enhance our confidence in the findings based on the entire sample reported here, particularly with regard to the main predictors.
To facilitate the reader’s understanding of the regression results reported, the
scales “appropriate workload” and “appropriate assessment” from here on will be
called “Heavy Workload” and “Facts-Oriented Assessment” as this is what the two
scales actually measure.
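The stepwise procedure described above can be sketched as a forward selection by R-squared change. The code below is illustrative only: the predictor names are placeholders echoing the study's variables, the data are synthetic, and the entry criterion (a minimum R-squared gain) is a simplification of the F-to-enter/remove tests used by standard statistical packages.

```python
# Sketch of forward stepwise selection by R-squared change (synthetic data;
# not the study's dataset or its actual software routine).
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X, with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    residuals = y - X1 @ beta
    return 1.0 - residuals.var() / y.var()

def forward_stepwise(X, y, names, min_gain=0.01):
    """Enter predictors one at a time while R^2 improves by at least min_gain."""
    selected, remaining, r2 = [], list(range(X.shape[1])), 0.0
    while remaining:
        gains = [(r_squared(X[:, selected + [j]], y) - r2, j) for j in remaining]
        best_gain, best_j = max(gains)
        if best_gain < min_gain:
            break
        selected.append(best_j)
        remaining.remove(best_j)
        r2 += best_gain
    return [names[j] for j in selected], r2
```

A removal test, as in step eight of Table 5 where "Clear Goals and Standards" drops out, would additionally re-check already-entered predictors after each step; it is omitted here for brevity.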

Results
Table 3 shows the correlations between the variables considered in this study.
Tables 4 to 6 report the results of the three regression analyses.
The main predictor for deep approach was “Generic Skills”, which accounted for
14.3 per cent of the total variance (Table 4).
Age, “Independent Thinking and Learning within the Discipline”, and “Heavy
Workload” (Beta is negative), together, explained an additional 5.6 per cent of the
variance. Gender entered the regression equation last and accounted only for a very
small percentage of the variance (0.4 per cent). As noted earlier, gender was shown
not to be a reliable predictor.
In the prediction of strategic approach, “Generic Skills” again explained most of
the variance (11.6 per cent) (Table 5).
Though the next variable to enter the regression equation was gender, accounting
for an additional 1.6 per cent of the variance, the same result was not found for
each half of the sample. Though "Clear Goals and Standards" entered the
regression equation at the third step, this variable was removed in step eight and
hence did not really account for any additional variance. Age, though accounting
for an additional 1.0 per cent, was shown not to be a reliable predictor.

TABLE 4. Results of stepwise regression predicting deep approach (n = 1032)

Step  Variable         Multiple R square  F change  Beta    p
1     Generic Skills   0.143              171.28     0.28   < 0.0001
2     Age              0.167               29.73     0.14   < 0.0001
3     Independence     0.188               27.14     0.15   < 0.0001
4     Heavy Workload   0.199               14.24    −0.11   < 0.0001
5     Gender           0.203                5.07    −0.06   < 0.05

TABLE 5. Results of stepwise regression predicting strategic approach (n = 1032)

Step  Variable                             Multiple R square  F change  Beta    p
1     Generic Skills                       0.116              135.05     0.28   < 0.0001
2     Gender                               0.132               18.49     0.14   < 0.0001
3     Clear Goals and Standards            0.143               14.14
4     Age                                  0.153               11.44     0.10   < 0.0001
5     Independence                         0.158                5.80     0.12   < 0.001
6     Facts-oriented Assessment            0.163                6.26     0.10   < 0.01
7     Heavy Workload                       0.167                5.35    −0.09   < 0.01
8     Clear Goals and Standards (removed)  0.165                2.67

TABLE 6. Results of stepwise regression predicting surface approach (n = 1032)

Step  Variable                   Multiple R square  F change  Beta    p
1     Heavy Workload             0.235              316.42     0.36   < 0.0001
2     Generic Skills             0.310              112.18    −0.21   < 0.0001
3     Facts-oriented Assessment  0.342               50.47     0.19   < 0.0001
4     Clear Goals and Standards  0.351               13.80    −0.09   < 0.01
5     Choice                     0.356                8.48     0.09   < 0.01
6     Age                        0.361                7.65    −0.07   < 0.01
7     Overall                    0.364                4.60    −0.07   < 0.05
8     Gender                     0.367                4.11     0.05   < 0.05

"Independent Thinking and Learning within the Discipline", "Facts-Oriented
Assessment" and "Heavy Workload" accounted for the remaining 1.4 per cent.
Clearly, the only strong predictor for strategic approach was “Generic Skills”.
Lastly, in the prediction of surface approach, “Heavy Workload” entered
the regression equation first, accounting for 23.5 per cent of the overall variance
(Table 6).
“Generic Skills” (Beta is negative) contributed an additional 7.5 per cent and
“Facts-Oriented Assessment” accounted for another 3.2 per cent. Together, the
remaining variables, including age, accounted only for an additional 2.5 per cent of
the total variance.
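For readers less familiar with the procedure, the stepwise regressions summarised in Tables 4 to 6 follow a greedy variable-selection logic: at each step the predictor producing the largest increase in multiple R square is entered, until no remaining predictor adds a worthwhile increment. The Python sketch below is a forward-only simplification of that idea (the study's actual procedure, run in a statistical package, can also remove variables, as in step 8 of Table 5). The variable names, effect sizes, and the 1 per cent entry threshold are invented for illustration and do not reproduce the study's data or software.

```python
import numpy as np

def forward_stepwise(X, y, names, r2_entry=0.01):
    """Greedy forward selection: at each step, add the predictor that most
    increases R-squared; stop when the best gain falls below r2_entry."""
    n = len(y)
    selected, remaining = [], list(range(X.shape[1]))
    current_r2, steps = 0.0, []
    total_ss = float((y - y.mean()) @ (y - y.mean()))
    while remaining:
        best_gain, best_j = 0.0, None
        for j in remaining:
            # Fit OLS with an intercept plus the candidate set of predictors.
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1.0 - float(resid @ resid) / total_ss
            if r2 - current_r2 > best_gain:
                best_gain, best_j = r2 - current_r2, j
        if best_j is None or best_gain < r2_entry:
            break  # no remaining predictor earns its entry
        selected.append(best_j)
        remaining.remove(best_j)
        current_r2 += best_gain
        steps.append((names[best_j], round(current_r2, 3)))
    return steps

# Synthetic data loosely mimicking the pattern in Table 4: one strong positive
# predictor, one weaker negative predictor, and one irrelevant variable.
rng = np.random.default_rng(42)
n = 1000
generic = rng.normal(size=n)     # hypothetical "Generic Skills" score
workload = rng.normal(size=n)    # hypothetical "Heavy Workload" score
irrelevant = rng.normal(size=n)
deep = 0.4 * generic - 0.2 * workload + rng.normal(size=n)

steps = forward_stepwise(np.column_stack([generic, workload, irrelevant]),
                         deep, ["Generic Skills", "Heavy Workload", "Irrelevant"])
print(steps)  # Generic Skills should enter first; Irrelevant should not enter
```

With data of this shape, "Generic Skills" enters first because it yields the largest initial R square, "Heavy Workload" enters second with a smaller increment, and the irrelevant variable never clears the entry threshold, mirroring the ordering logic behind the tables above.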

Discussion
Gender was shown not to be a reliable predictor for any of the three approaches.
This is not unexpected since Richardson (1993), in a study exploring whether male
and female students should be regarded as distinct populations with regard to their
orientations and approaches to studying in higher education, found no clear evi-
dence of differences between the two groups based on ASI scores. He suggested that
“this outcome is consistent with the broad patterns of findings obtained using other,
similar instruments in order to examine the possibility of gender differences in
student learning (see Richardson & King, 1991)” (Richardson, 1993, p. 3 plus four
electronic pages).
Age was shown to be a reliable predictor for deep approach. It is a significant,
though perhaps again not surprising, finding that the deep approach is associated
with older learners, whereas the surface approach is associated with younger learners
(though in the prediction of surface approach, age accounted for only a very small
percentage of the variance). Following his extensive review of the research literature
exploring whether there is any evidence that more mature students adopt different
approaches to studying from younger students, Richardson (1994b) concluded:
On the basis of the rather limited research evidence to date, one can put
forward the tentative hypothesis that mature students are more likely than
younger students to adopt a deep approach or meaning orientation towards
their academic studies and, conversely, that they are less likely than
younger students to adopt a surface approach or a reproducing orientation.
Such a hypothesis is prompted by findings obtained using three different
research instruments: the SPQ (see Watkins & Hattie, 1981; Biggs, 1985,
1987, pp. 55–58); the ILP (see Watkins & Hattie, 1981); and the ASI (see
Watkins, 1982, 1983; Harper & Kember, 1986; and cf. Watkins & Hattie,
1985; Clennell, 1987, 1990) (Richardson, 1994b, p. 316).
The results from the present study, based on a more recent version of the ASI (a
section of the ASSIST), confirm this hypothesis. This finding is also highly consist-
ent with what we know from the adult education literature, namely that mature
learners are far more likely to look for relevance and meaning in their learning
activities (e.g., Brundage & Mackeracher, 1980; Knowles, 1980; Knox, 1986;
Merriam & Caffarella, 1991).
Of the CEQ Scales, only “Generic Skills” and “Heavy Workload” appeared to be
strong predictors, though Facts-Oriented Assessment and Clear Goals and Stan-
dards accounted for a small percentage of the variance. Of particular interest is that
"Generic Skills" was a significant predictor for all three approaches. While "Generic
Skills" was the main predictor for both deep and strategic approach, accounting for
14.3 and 11.6 per cent of the variance respectively, it explained an additional 7.5 per
cent of the variance for surface approach (the main predictor, we saw, was Heavy Workload).
Since beta is negative, one can infer that if “Generic Skills” are not encouraged, the
student is more likely to adopt a surface approach. These results are consistent with
the conclusions offered by Ramsden (1992). However, it is also conceivable
that, depending on the approach to studying a student has adopted within a
particular course, he or she will perceive quite differently the extent to which the
course helped him or her acquire generic skills. In other words, students with
a deep approach may perceive, to a much greater extent than their peers with a
surface approach, that the course sharpened their analytical skills, that they are more
confident to tackle unfamiliar problems as a result of the course and so forth.
“Heavy Workload” appeared to be a strong predictor for surface approach and
accounted for some of the variance of deep and strategic approach. Since in the
prediction of deep approach beta is negative, this means that the less heavy the
workload in a given course, the more likely a student is to adopt a deep approach.
Conversely, the heavier the workload, the more likely the student is to adopt a
surface approach. It is worth noting that in the prediction of surface approach,
“Heavy Workload” was the main predictor variable explaining 23.5 per cent of the
overall variance. "Facts-Oriented Assessment" also contributed 3.2 per cent of the
variance. Previous research showed strong positive relationships between a surface
approach and both heavy workload and inappropriate assessment (e.g., Wilson &
Lizzio, 1997).
Several studies identified a relationship between perceived workload and ap-
proaches to studying (e.g., Ramsden & Entwistle, 1981; Entwistle & Ramsden,
1983; Dahlgren, 1984). These studies investigated the effect of workload on
approaches to studying. Kember and Ng (1996), however, suggested that how an
individual student perceives the workload is a function of his or her characteristics,
approaches to studying, as well as contextual variables related to the course taken,
such as motivational ability of the teacher, volume of content covered and so forth.
Kember and Leung (1998) argued that the widely reported effect of workload on
approaches could be in the opposite direction, in the sense that people's approach to
studying may influence their perception of the contextual factors of the environ-
ment, such as workload. Findings from their own study, conducted with a sample
of students enrolled in an undergraduate Engineering course at a university in Hong
Kong using Biggs’ (1987) Study Process Questionnaire, suggested the possibility of
a reciprocal relationship between approaches to studying and perceived workload.
They concluded that “students who utilise a reproducing approach are more likely
to perceive the workload as high. Also, when workloads are perceived as high, the
students may be more inclined to resort to a reproducing approach. The two
directional influences may even reinforce each other” (Kember & Leung, 1998,
p. 293 plus six electronic pages). In sum, though “Generic Skills” and “Heavy
Workload” were shown to be strong predictors of approaches to studying, the
previous discussion raised the possibility that the effect could be in the opposite
direction or reciprocal.
Of the two other scales, only “Independent Thinking and Learning within the
Discipline” was shown to be a predictor for deep and strategic approach accounting
for a small percentage of the overall variance (2.1 per cent for deep approach).
Though the variance accounted for is small, one may conclude that if students
perceive their independent thinking within the discipline is encouraged, this is
positively linked to them adopting a deep and strategic approach. If these conditions
are not in place, that is to say if students do not feel they are encouraged to think
independently, this, to some extent, explains why they might opt for a surface
approach to their studying (note that there was a moderate significant negative
correlation between surface approach and "Independent Thinking and Learning
within the Discipline", and moderate significant positive correlations between this
scale and both deep and strategic approach; see Table 3).

Conclusions
What has been learned from this study and what are some implications? This study
with a large sample of Canadian undergraduate students confirmed the factor
structure underlying the ASSIST questionnaire at the main scale level. The factor
structure Ramsden reported on the CEQ was largely confirmed, though two small
changes were noted. First, principal components factor analysis of the CEQ yielded
six rather than five factors. However, with the exception of the Good Teaching scale,
which split into two factors, items loaded as reported by other researchers (e.g.,
Ramsden 1999; Wilson & Lizzio, 1997). Second, item 16 loaded highly on one of
the two Good Teaching factors.
Of the eleven new items, five formed a factor called “Independent Thinking and
Learning within the Discipline” and three a factor called “Choice in Learning”.
Only “Independent Thinking and Learning within the Discipline” was shown to
moderately correlate with all three approaches.
Previous research using the CEQ and ASI was based on samples of Australian and
British undergraduates. The results of this study with a large sample of Canadian
students clearly confirm previous findings showing a link between students’ percep-
tions of the learning environment in a given course and their approaches to studying.
The strongest predictor overall in this study was “Heavy Workload” in the
prediction of a surface approach. Although it has been argued for many years that
a heavy workload discourages students from adopting a deep approach to learning,
many instructors interviewed for the larger study of which the present investigation
formed a part continued to comment that not having enough time to get through
the course content was a major concern for them. Staff development
initiatives that inform instructors about findings from studies such as this one (and
many others that have been cited in this article) might be a first step in helping them
change first their perspectives and then their approaches to teaching. However,
perspectives or conceptions of teaching are deeply held and firmly rooted beliefs
about practice, and these, as several scholars were able to show (e.g., Argyris &
Schön, 1974; Brookfield, 1987; Mezirow, 1991), are not easily challenged and
changed. Trigwell and Prosser (1996) commented in this context: “If teachers’
conceptions of teaching and/or learning are related to their approaches to teaching,
the task of improving teaching may be significantly more difficult than anticipated”
(p. 276). It would seem essential that instructors be provided with opportunities
that allow them to become aware of their conceptions of teaching. Once they are
aware of these, and how their conceptions of teaching are linked to approaches to
teaching and students’ approaches to studying (Trigwell, Prosser, & Waterhouse,
1999), they will be in a better position to understand the dynamics of the teaching
and learning interactions they are involved in. They might be more inclined to
change their teaching approach if they have been provided with research-based
information that challenges their assumptions or conceptions of teaching.


Staff development initiatives in higher education that emphasise the development
of desirable teaching styles but neglect to link these to any theoretical framework
and research-based evidence are likely not to have much credibility in the eyes of
most academics, who by virtue of their chosen profession, value scholarship and
research.
While a deep approach to studying is a goal in and of itself, particularly at a
time when the ability to learn throughout the lifespan has been recognised as
an important prerequisite for a successful career and a fulfilled life, teachers who
care about their profession are also interested in whether their students actually do
well in their courses as a result of having taken a deep approach. Clearly, the
relationships between approaches to studying and academic achievement in the
course are as important as the relationships between course perception and
approaches to studying. Wilson and Lizzio (1997) report significant positive
correlations between academic achievement and several CEQ scales, whereby the
strongest correlations were found for both the Good Teaching and Clear Goals and
Standards scales. Further studies with different samples investigating the overall
relationships between course perception, approaches to studying and student
achievement are encouraged.
It was noted that this study largely confirmed previous findings. While such
studies have their place, future research should perhaps turn to less well-known
territory. For example, deep-level learning was discussed as a prerequisite of lifelong
learning. However, preparing lifelong learners through higher education means more
than helping them learn how to learn. Candy (1991), in his book Self-Direction for
Lifelong Learning, discussed this latter goal as self-management in learning. Lifelong
learning has been associated not only with self-management but also with much
broader educational goals such as developing in learners a sense of personal
autonomy and social responsibility (see also Candy, 1991). The extent to which
perceptions of the learning environment and approaches to studying are related to
outcome measures that go beyond knowledge of the discipline, to include aspects of
personal, moral and intellectual development, remains an important question that
would clearly benefit from further research.

Acknowledgements
The author would like to acknowledge the support of the Social Sciences and
Humanities Research Council of Canada. The author wishes to thank Chris Prokop
for her advice and assistance with the analysis of the data. Finally, the author wishes
to extend her thanks to the graduate research assistants helping with data collection:
Heather Castleden, Nina Erfani, Joan Lim, and Tarah Wright.

Address for correspondence: Carolin Kreber, Adult and Higher Education, Department
of Educational Policy Studies, University of Alberta, Edmonton, AB, Canada T6G
2G5. E-mail: carolin.kreber@ualberta.ca

References
Ainley, J., & Long, M. (1994). The Course experience survey: 1992 graduates. Canberra: Australian
Government Publishing Service.
Argyris, C., & Schön, D. (1974). Theory in practice: Increasing professional effectiveness. San
Francisco: Jossey-Bass.
Bandura, A. (Ed.) (1997). Self-efficacy in changing societies. Cambridge: Cambridge University
Press.
Biggs, J. (1985). The role of metalearning in study processes. British Journal of Educational
Psychology, 55, 381–394.
Biggs, J.B. (1987). Student approaches to learning and studying. Hawthorne, Victoria: Australian
Council for Educational Research.
Biglan, A. (1973). Characteristics of subject matter in different academic fields. Journal of Applied
Psychology, 57(3), 195–203.
Boud, D. (1997). Providing for lifelong learning through work-based study. Challenges for policy and
practice. Papers presented at the International Conference on Lifelong Learning, Guildford,
Surrey, UK.
Boyer, E. (1990). Scholarship Reconsidered. Washington, DC: The Carnegie Foundation.
Brookfield, S. (1987). Developing critical thinkers. San Francisco: Jossey-Bass.
Brundage, D., & MacKeracher, D. (1980). Adult learning principles and their application to program
planning. Toronto: Ontario Institute for Studies in Education.
Cambridge, B.L. (2000). The scholarship of teaching and learning: A national initiative. In M.
Kaplan & D. Lieberman (Eds.), To Improve the Academy, 18 (pp. 55–68). Bolton, MA:
Anker.
Candy, P. (1991). Self-direction for lifelong learning. San Francisco: Jossey-Bass.
Clennell, S. (Ed.) (1987). Older students in Adult Education. Milton Keynes: Open University.
Clennell, S. (Ed.) (1990). Older students in Europe: A survey of older students in four European
countries. Milton Keynes: Open University.
Dahlgren, L.O. (1984). Outcomes of learning. In F. Marton, D. Hounsell, & N. Entwistle (Eds.),
The experience of learning (pp. 19–35). Edinburgh: Scottish Academic Press.
Donald, J.G. (1997). Improving the environment for learning. San Francisco: Jossey-Bass.
Entwistle, N. (1998). Conceptions of learning, understanding and teaching in higher education.
SCRE Fellowship, November 5th.
Entwistle, N., & Ramsden, P. (1983). Understanding Student Learning. London & Canberra:
Croom Helm.
Entwistle, N., Tait, H., & McCune, V. (2000). Patterns of response to approaches to studying
inventory across contrasting groups and contexts. European Journal of the Psychology of
Education, 15, 33–48.
Evans, N.J., Forney, D.S., & Guido-DiBrito, F. (1998). Student development in college. Theory,
research, and practice. San Francisco: Jossey-Bass.
Glassick, C.E., Huber, M.T., & Maeroff, G.I. (1997). Scholarship assessed. Evaluation of the
professoriate. San Francisco: Jossey-Bass.
Harper, G., & Kember, D. (1986). Interpretation of factor analyses from the approaches to
studying inventory. British Journal of Educational Psychology, 59, 66–74.
Kember, D., & Leung, D. (1998). Influences upon students’ perceptions of workload. Educational
Psychology, 18(3), 293–308.
Kember, D., & Ng, S. (1996). An examination of the interrelationships between workload, study
time, learning approaches and academic outcomes. Studies in Higher Education, 21(3),
347–359.
Knapper, C., & Cropley, A. (2000). Lifelong learning in higher education. London: Kogan Page.
Knight, P.T., & Trowler, P.R. (2000). Department-level cultures and the improvement of
learning and teaching. Studies in Higher Education, 25, 69–83.
Knowles, M.S. (1980). The modern practice of adult education: From pedagogy to andragogy. New
York: Cambridge Books.
Knox, A. (1986). Helping adults learn. San Francisco: Jossey-Bass.


Kreber, C. (1998). The relationship between self-directed learning, critical thinking, and psycho-
logical type and some implications for teaching in higher education. Studies in Higher
Education, 23(1), 71–86.
Kreber, C. (2002). Embracing a philosophy of lifelong learning in higher education. Starting with
faculty’s beliefs about their role as educators. To Improve the Academy, 21, 288–301.
Kreber, C., & Cranton, P.A. (2000). Exploring the scholarship of teaching. Journal of Higher
Education, 71(4), 476–495.
Kreber, C., Cranton, P.A., & Allen, K. (2000). If lifelong learning is important: The relationships
between students’ self-directed learning readiness, their psychological type, learning style,
and creative thinking and logical reasoning abilities. In H. Long (Ed.), Theory and practice
in self-directed learning (pp. 97–115). Schaumburg: Motorola Press.
Lengrand, P. (1986). Areas of learning basic to lifelong education. Oxford: Pergamon.
Little, T.C. (1983). The institutional context for experiential learning. In T.C. Little (Ed.),
Making sponsored experiential learning standard practice. San Francisco: Jossey-Bass.
Martin, E., & Ramsden, P. (2000). Introduction. Higher Education Research and Development,
19(2), 133–135.
Maclean's Magazine (2000). http://www.macleans.ca/contents/universities.asp
Merriam, S.B., & Caffarella, R.S. (1991). Learning in adulthood. San Francisco: Jossey-Bass.
Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco: Jossey-Bass.
Ramsden, P. (1979). Student learning and perceptions of the academic environment. Higher
Education, 8, 411–428.
Ramsden, P. (1991). A performance indicator of teaching quality in higher education. Studies in
Higher Education, 16, 129–150.
Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.
Ramsden, P. (1999). The 1999 CEQ Interim Report.
Ramsden, P., & Entwistle, N. (1981). Effects of academic departments on students’ approaches
to studying, British Journal of Educational Psychology, 51, 368–383.
Richardson, J.T.E. (1993). Gender differences in responses to the approaches to studying inventory.
Studies in Higher Education, 18(1), 3–14.
Richardson, J.T.E. (1994a). A British evaluation of the course experience questionnaire. Studies
in Higher Education, 19, 59–68.
Richardson, J.T.E. (1994b). Mature students in higher education: I. A literature survey on
approaches to studying. Studies in Higher Education, 19(3), 309–326.
Richardson, J.T.E., & King, E. (1991). Gender differences in the experience of higher education:
Quantitative and qualitative approaches. Educational Psychology, 11, 363–382.
Tait, H., & Entwistle, N. (1996). Identifying students at risk through ineffective study strategies.
Higher Education, 31, 99–118.
Tait, H., Entwistle, N., & McCune, V. (1997). ASSIST: A reconceptualization of the Approaches
to Studying Inventory. In C. Rust (Ed.) Improving student learning: Improving students as
learners. Oxford: The Oxford Centre for Staff and Learning Development.
Trigwell, K., & Prosser, M. (1996). Changing approaches to teaching: A relational perspective.
Studies in Higher Education, 21(3), 275–284.
Trigwell, K., Prosser, M., & Waterhouse, F. (1999). Relations between teachers’ approaches to
teaching and student learning, Higher Education, 37, 57–70.
Watkins, D. (1982). Identifying the study process dimensions of Australian university students.
Australian Journal of Education, 26, 76–85.
Watkins, D. (1983). Assessing tertiary study processes. Human Learning, 2, 29–37.
Watkins, D., & Hattie, J. (1981). The learning processes of Australian university students:
Investigations of contextual and personological factors. British Journal of Educational Psy-
chology, 51, 384–393.
Watkins, D., & Hattie, J. (1985). A longitudinal study of the approaches to learning of Australian
tertiary students. Human Learning, 4, 127–141.
Wilson, K.L., & Lizzio, A. (1997). The development, validation and application of the course
experience questionnaire. Studies in Higher Education, 22(1), 33–53.
