The Relationship Between Students' Course Perception and Their Approaches to Studying in Undergraduate Science Courses: A Canadian Experience
Carolin Kreber
To cite this article: Carolin Kreber (2003) The Relationship between Students' Course
Perception and their Approaches to Studying in Undergraduate Science Courses: A
Canadian experience, Higher Education Research & Development, 22:1, 57-75, DOI:
10.1080/0729436032000058623
course perception influence student learning (e.g., Donald, 1997; Wilson & Lizzio,
1997). While these relationships are well established on conceptual as well as
empirical terms in the higher education teaching and learning literature, the present
study sought to make a contribution by investigating these relationships with a large
sample of Canadian undergraduate science students. Furthermore, since one of two
questionnaires administered to the sample was a rather newly established instrument
(Entwistle, Tait, & McCune, 2000), the study contributes by testing the reliability
of this instrument with a Canadian sample. Though the second instrument used has
been widely tested (Wilson & Lizzio, 1997; Ramsden, 1991, 1999), the present
study explored whether the reported factor structure would be confirmed with a
Canadian sample.
Deep-level Learning
When Ramsden (1992) reviewed the findings from several research studies asking
university instructors what they considered to be important goals of their teaching,
he discovered that virtually all studies showed that instructors hoped their students
would develop critical thinking skills and apply them to the subjects they were
studying.
It is clear from several studies that the ideas expressed by teachers in higher
education … can be summarized as understanding the key concepts; an
ability to go beyond the orthodox and the expected so that hitherto unmet
problems can be tackled with spirit; a facility with typical methods of
approaching a problem in the discipline and … an awareness of what
learning and understanding in the discipline consist of. In other words,
lecturers describe content-related versions, with a substantive and a pro-
cedural or syntactic element, of the general principles of “critical thinking”
and understanding. (p. 21)
What seems to be underlying these instructors’ comments is a concern with
deep-level learning on the part of students.
managing their own learning as well as thinking critically. Such self-directed learning
requires a deep-level approach.
in a particular course (this perception, it is argued, is shaped not least by the
pedagogical methods used by the teacher) might explain the approaches to studying
(the result) they adopt for this same course. Specifically, the objective of this study
was to explore the relationship between students’ perception of the learning environ-
ment created in a semester-long course and their approaches to studying in this
course.
Data Collection
The sample for this study consisted of a total of 1080 undergraduate science
students of whom 591 were female. Seventy-five per cent of the sample were
between 18 and 22 years of age, sixteen per cent were between 23 and 27, five per
cent between 28 and 35, and the rest (2.8 per cent) 36 or older. The study was part
of a larger investigation into the relationships between science instructors’ knowl-
edge as well as conceptions of teaching and their students’ learning. Data were
collected at several Canadian universities between the fall of 2000 and winter of
2002 from a total of forty-three different classes during the last week of the term.
Though the Canadian higher education system is characterised by relatively little
diversity with respect to the university sector (as compared to, for example, the
United States), three kinds of institutions are typically distinguished among
Canada’s seventy universities: (1) so-called “primarily undergraduate institutions”
(universities with fewer graduate programmes and usually no doctoral programmes),
(2) “comprehensive universities” (many master’s and doctoral programmes but no
medical school) and (3) so-called “medical/doctoral universities”, research-intensive
institutions with many graduate programmes and large doctoral/medical faculties
(Maclean’s Magazine, 2000). While all universities offer undergraduate programmes,
the major difference lies in the number and types of graduate programmes they
provide. Students in this study came from all three types of universities.
According to Biglan’s (1973) classic categorisation system of disciplines, the seven
disciplines included in this study could be grouped as follows: earth and atmospheric
sciences (hard, pure, non-life systems), biology (hard, pure, life systems), chemistry
(hard, pure, non-life systems), computer science (hard, pure or applied, non-life
systems), mathematics (hard, pure, non-life systems), physics (hard, pure, non-life
systems), and psychology (either hard or soft, either pure or applied, life systems).
Note that in this study psychology departments, too, were associated with Faculties
of Science.
Data were gathered anonymously via two questionnaires that were completed
outside of class time and collected three days later in class. For the purpose of
correlating the data, the questionnaires the students received had been coded. The
first questionnaire was a section of the ASSIST (Approaches and Study Skills
Inventory for Students) (Tait & Entwistle, 1996; Entwistle, Tait, & McCune, 2000).
The ASSIST inventory consists of seven sections, of which the fourth addresses
approaches to studying. This section of the ASSIST is a more recent version of the
Approaches to Studying Inventory (ASI) originally developed by Entwistle and
Ramsden (1983), which has been used in a large number of studies. Consistent with
the underlying construct, this fifty-two-item instrument consists of three main scales
measuring a deep approach, a surface approach, and a strategic approach to
learning, respectively. Each main scale is divided into four to five corresponding
sub-scales. The first main scale, measuring a deep approach, consists of the four
sub-scales Seeking Meaning, Relating Ideas, Use of Evidence, and Interest in Ideas,
the latter indicating the associated motive for learning. The second main
scale, measuring a strategic approach, is made up of the five sub-scales Organised
Studying, Time Management, Monitoring Effectiveness, and Alertness to Assess-
ment Demands, with Achievement Orientation as the associated motive. Finally, the
third main scale, measuring a surface approach, comprises the four sub-scales Lack
of Understanding (or Unrelated Memorisation), Lack of Purpose, Syllabus Bound-
ness and Fear of Failure, the latter being the associated motive for learning.
Sub-scale level factor analysis, as recommended by Tait, Entwistle, and McCune
(1997), confirmed the three-factor solution (deep, strategic, and surface) and the
association of the learning motives with their corresponding scales (see also
Entwistle, Tait & McCune, 2000). When all fifty-two items were subjected to
principal component factor analysis, the thirteen-factor solution reported by Tait,
Entwistle, and McCune (1997) was not confirmed, however. Consequently, all
further analyses were carried out on the main scale level. Not surprisingly, Entwistle,
Tait and McCune (2000) report a moderate positive correlation between the deep
and strategic factors and a negative correlation between the deep and surface factors,
as well as strategic and surface factors, and these findings were confirmed in the
present study. Table 1 shows the factor loadings and reliability measures for each
sub-scale and the correlations between the three ASSIST factors.
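To make the reliability coefficients in Table 1 concrete, the following is a minimal sketch of how a Cronbach alpha coefficient is computed from raw item scores. It is a generic illustration of the statistic, not the software actually used for the analyses reported here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the sub-scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

Alpha rises when the items within a sub-scale co-vary strongly relative to the variance of the total score, which is why it is reported per sub-scale in Table 1.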
The second instrument was Ramsden’s Course Experience Questionnaire (CEQ).
Development of this questionnaire began in the 1980s at the University of
Lancaster, and it has been refined through many subsequent studies (e.g., Ramsden,
1991). The theory underlying the CEQ is that students’ approaches to learning and
the quality of their learning outcomes are determined by their perception of
curriculum, instruction and assessment. In the Australian context, the CEQ is
employed as a measure of perceived teaching quality in degree programmes in
national annual surveys of all graduates in the higher education system and it is
increasingly being used as a measure of the quality of teaching in universities in the
UK (Wilson & Lizzio, 1997). The instrument has been tested on well over 65,000
students, and measures of validity and reliability have been reported as satisfactory
(e.g., Wilson & Lizzio, 1997; Ramsden, 1999). The CEQ has two versions—a longer
one containing thirty-six items and a shorter, more widely used one with
twenty-five items. The version of the CEQ used in this study had twenty-five items,
including the item “Overall, I was satisfied with the quality of the course” (item 25).
While the shorter version of the CEQ does not include the Emphasis on Indepen-
dence scale, which consists of six items of the longer version, it had a new scale
added to address the need to develop lifelong learning skills in students. Such skills
include “problem-solving, analytic skills, teamwork, confidence in tackling unfam-
iliar situations, ability to plan work and written communication skills” (Wilson &
Lizzio, 1997, p. 36). This scale was labeled “Generic Skills”. Principal components
TABLE 1. Factor Loadings, with Cronbach Alpha Coefficients in Parentheses, for each ASSIST Sub-Scale (n = 1080)

Approaches to Studying

Deep Approach
  Seeking meaning                   0.73 (0.62)
  Relating ideas                    0.80 (0.59)
  Use of evidence                   0.79 (0.51)
  Interest in ideas                 0.70 (0.73)

Surface Approach
  Lack of understanding             0.81 (0.60)
  Lack of purpose                   0.65 (0.72)
  Syllabus boundness                0.61 (0.59)
  Fear of failure                   0.75 (0.75)

Strategic Approach
  Organized studying                0.83 (0.59)
  Time management                   0.87 (0.80)
  Monitoring effectiveness          0.65 (0.62)
  Achievement orientation           0.81 (0.67)
  Alertness to assessment demands   0.48 (0.60)
factor analysis yielded the following five factors for the shorter version of the CEQ:
Good Teaching, Clear Goals and Standards, Appropriate Workload, Appropriate
Assessment and Generic Skills (e.g., Wilson & Lizzio, 1997; Ramsden, 1999). Both
Wilson and Lizzio (1997) and Ramsden (cited in Richardson, 1994a) found
evidence that the CEQ comprises a higher-order, two-factor structure. The first
higher-order factor includes the scales Good Teaching, Clear Goals and Standards,
Generic Skills and Appropriate Assessment. Appropriate Workload makes up the
second higher-order factor (Wilson & Lizzio, 1997).
Note that the Course Experience Questionnaire was initially designed to assess
students’ experiences with a programme (“course”). The way the term “course” is
used in this article corresponds to the North American interpretation of a course as
a semester-long seminar or lecture usually comprising thirty-six hours of class time
and taught by one instructor. This is very different from an understanding of a
course as a programme of study comprised of several semester-long seminars and
lectures taught by several instructors leading to a degree. Though this is somewhat
different from what Ramsden had in mind when designing the instrument, the
questionnaire was still considered appropriate for the purpose of this study. Some
items had to be slightly reworded to make them more suitable for a semester-long
course.
The decision was made to collect students’ responses to eleven further items to
address the notion of “fostering independence within the discipline”, a dimension
shown to be very relevant in assisting student learning (Donald, 1997). Ten of these
eleven items were adapted directly from Donald’s (1997) list of recommendations of
how to promote effective learning environments. They included:
• Students had a great deal of choice over how they were going to learn
• The instructor showed us how the content of this course fits into the larger
discipline
• Students were given lots of choice in the work they had to do
• We discussed with our instructor how we were going to learn
• The instructor was approachable and willing to help with problems
• The instructor made me aware that learning required thinking on my part
• In this course I was shown how to think within the discipline
• The instructor encouraged student participation
• The instructor helped us to set goals for this course
• Success in this course was a matter of how well I studied
Principal component factor analyses were conducted separately for the
twenty-four items of the CEQ (not including the overall satisfaction item) and the
eleven new items. Wilson and Lizzio (1997) recommend oblique rather than
varimax rotation in factor analyzing the CEQ, given the reported significant
moderate positive intercorrelations between the CEQ scales (Ramsden, 1991;
Ainley & Long, 1994); both rotations were therefore carried out. Principal
components factor analysis using first oblique rotation and then, for comparison
purposes, varimax rotation yielded a six-factor structure in each case. It was
therefore decided to base further analyses on varimax rotation. Table 2 shows
the six-factor solution for the CEQ and the variance accounted for by each factor.
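To make the extraction-plus-rotation procedure concrete, here is a minimal numpy-only sketch of principal components extraction followed by orthogonal varimax rotation. It illustrates the generic technique; it is not the statistical package actually used for the analyses reported here.

```python
import numpy as np

def pca_loadings(data: np.ndarray, n_factors: int) -> np.ndarray:
    """Principal-component loadings extracted from the item correlation matrix."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1][:n_factors]       # largest components first
    return eigvecs[:, order] * np.sqrt(eigvals[order])  # scale vectors to loadings

def varimax(loadings: np.ndarray, max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Orthogonal varimax rotation of a (p items x k factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    prev = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # choose the rotation that maximises the variance of squared loadings
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        )
        rotation = u @ vt
        if s.sum() < prev * (1 + tol):
            break
        prev = s.sum()
    return loadings @ rotation
```

An oblique rotation would relax the orthogonality constraint and let factors correlate; the study ran both kinds of rotation and obtained a six-factor structure in each case.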
With the exception of Ramsden’s Good Teaching scale, the analysis confirmed the
factor solution reported by Ramsden (1999) and Wilson and Lizzio (1997). Items
that loaded on the Good Teaching scale in previously reported factor analyses of the
CEQ, however, loaded clearly on two separate factors in the present study. This
also remained the case when, for comparison purposes, the factor analysis was
redone without item 16. This item (“Feedback on my work was usually given only
in the form of marks or grades”), which was not included in Ramsden’s (1999) and
Wilson and Lizzio’s (1997) analyses, loaded highly on factor three (Good Teaching
A) in our study and had no double-loadings. Since all other items confirmed the
factor structure reported elsewhere (e.g., Ramsden, 1999; Wilson & Lizzio, 1997),
the two factors were labeled “Good Teaching A” (which basically refers to
“Feedback on, and concern for, student learning”) and “Good Teaching B” (which
TABLE 2. Six-Factor Solution for the CEQ Items (Varimax Rotation): Loadings on Factors 1 to 6

 5. This course sharpened my analytic skills: 0.743, 0.070, -0.004, -0.025, 0.208, -0.140
10. As a result of this course, I feel confident about tackling unfamiliar problems: 0.742, 0.115, 0.139, -0.137, 0.101, -0.101
 2. This course developed my problem solving skills: 0.712, 0.147, 0.064, 0.001, 0.187, -0.207
22. The course helped me to develop the ability to plan my own work: 0.697, 0.083, 0.098, 0.117, 0.173, 0.043
11. The course improved my skills in written communication: 0.521, -0.055, 0.405, -0.061, 0.054, 0.086
 9. This course helped me develop my ability to work as a team member: 0.488, 0.221, 0.324, 0.064, -0.286, -0.097
 1. It was always easy to know the standard of work expected in this course: 0.151, 0.790, 0.066, -0.137, 0.109, 0.018
24. The instructor made it clear right from the start what s/he expected from students: 0.143, 0.733, 0.118, -0.004, 0.229, -0.005
 6. I usually had a clear idea of where I was going and what was expected of me in this course: 0.186, 0.726, 0.022, -0.197, 0.222, 0.014
13. It was often hard to discover what was expected of me in this course: -0.059, 0.689, 0.058, -0.235, 0.021, -0.178
17. The instructor normally gave me feedback on how I was doing: 0.246, 0.129, 0.750, -0.089, 0.145, -0.037
 7. The instructor put a lot of time into commenting on my work: 0.185, 0.193, 0.738, -0.013, 0.216, 0.000
16. Feedback on my work was usually given only in the form of marks or grades: -0.027, -0.081, 0.678, -0.098, -0.001, -0.166
15. The instructor made real effort to understand difficulty I might be having with my work: 0.157, 0.190, 0.512, -0.100, 0.421, -0.098
 4. The workload in this course was too heavy: 0.042, -0.173, -0.032, 0.774, -0.135, 0.087
23. The sheer volume of work to get through in this course meant it couldn’t all be thoroughly comprehended: -0.147, -0.073, -0.027, 0.755, -0.112, 0.181
14. I was generally given enough time to learn the things I had to learn: -0.087, -0.182, -0.146, 0.648, -0.148, -0.107
21. There was a lot of pressure on me to do well in this course: 0.135, -0.066, -0.063, 0.590, 0.146, 0.134
20. The instructor worked hard on making the subject interesting: 0.195, 0.130, 0.109, -0.034, 0.763, -0.146
18. The instructor was extremely good in explaining things: 0.151, 0.243, 0.149, -0.175, 0.728, 0.000
 3. The instructor of this course motivated me to do my best work: 0.322, 0.326, 0.285, -0.013, 0.614, -0.066
 8. To do well in this course all you really needed was a good memory: -0.247, 0.024, -0.116, 0.163, 0.034, 0.720
12. The instructor seemed more interested in testing what I had memorized than what I had understood: -0.071, -0.170, -0.148, 0.146, -0.208, 0.697
19. The instructor asked me questions just about facts: -0.012, 0.003, 0.007, 0.007, -0.025, 0.675
 % of variance: 12.9, 11.0, 9.7, 9.2, 9.0, 7.2
when analyses were done for each split half of the sample, though the order in
which subsequent predictor variables entered the regression equation changed slightly in
some cases. Perhaps most importantly, in the prediction of strategic approach,
gender and age were the second and third best predictors (though together they
accounted for only a small percentage of the variance) for the total sample, but these
results were not confirmed with half of the sample. Similarly, in the prediction of
deep approach, gender was the last variable to enter the regression analysis for the
entire sample (accounting for only a very small amount of variance) and these results
were not confirmed when the sample was split. Overall, however, these results
enhance our confidence in the findings based on the entire sample reported here,
particularly with regard to the main predictors.
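The split-half check described above can be sketched in a few lines. This is a generic illustration (the study does not specify its software), using a simple correlation as the statistic whose replication across the two halves is examined:

```python
import numpy as np

def split_half_correlations(x: np.ndarray, y: np.ndarray, seed: int = 0):
    """Correlate predictor x with outcome y separately in two random halves."""
    rng = np.random.default_rng(seed)        # fixed seed: reproducible split
    idx = rng.permutation(len(y))
    half = len(y) // 2
    first, second = idx[:half], idx[half:]
    r = lambda sel: float(np.corrcoef(x[sel], y[sel])[0, 1])
    return r(first), r(second)
```

On this logic, a predictor is treated as reliable only when its association with the outcome holds in both halves, as with the main predictors here, and is discounted when it appears in one half only, as happened with gender and age.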
To facilitate the reader’s understanding of the regression results reported, the
scales “appropriate workload” and “appropriate assessment” from here on will be
called “Heavy Workload” and “Facts-Oriented Assessment” as this is what the two
scales actually measure.
Results
Table 3 shows the correlations between the variables considered in this study.
Tables 4 to 6 report the results of the three regression analyses.
The main predictor for deep approach was “Generic Skills”, which accounted for
14.3 per cent of the total variance (Table 4).
Age, “Independent Thinking and Learning within the Discipline”, and “Heavy
Workload” (Beta is negative), together, explained an additional 5.6 per cent of the
variance. Gender entered the regression equation last and accounted only for a very
small percentage of the variance (0.4 per cent). As noted earlier, gender was shown
to not be a reliable predictor.
In the prediction of strategic approach, “Generic Skills” again explained most of
the variance (11.6 per cent) (Table 5).
Though the next variable entering the regression equation was Gender, accounting
for an additional 1.6 per cent of the variance, the same result was not found for
each half of the sample. Though “Clear Goals and Standards” entered the
regression equation at the third step, this variable was removed in step eight, and
hence, did not really account for any additional variance. Age, though accounting
for an additional 1.0 per cent, was shown to not be a reliable predictor.
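The stepwise regressions reported in Tables 4 to 6 enter predictors one at a time and record the change in R square at each step. A minimal sketch of such a forward-selection procedure follows; the variable names are hypothetical and this is not the authors' actual analysis software.

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R square of an ordinary least squares fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def forward_stepwise(X: np.ndarray, y: np.ndarray, names: list):
    """Greedy forward selection; returns (variable name, R square change) per step."""
    chosen, steps, r2 = [], [], 0.0
    for _ in range(X.shape[1]):
        remaining = [j for j in range(X.shape[1]) if j not in chosen]
        # enter the unused predictor that raises R square the most
        best = max(remaining, key=lambda j: r_squared(X[:, chosen + [j]], y))
        new_r2 = r_squared(X[:, chosen + [best]], y)
        steps.append((names[best], new_r2 - r2))
        chosen.append(best)
        r2 = new_r2
    return steps
```

The per-step R square change is what allows statements such as "Generic Skills accounted for 14.3 per cent of the total variance" for the first variable entered.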
(Tables 4 to 6: columns are Step, Variable, Multiple R square, F change, Beta, and p.)
Discussion
Gender was shown to not be a reliable predictor for any of the three approaches.
This is not unexpected since Richardson (1993), in a study exploring whether male
and female students should be regarded as distinct populations with regard to their
orientations and approaches to studying in higher education, found no clear evi-
dence of differences between the two groups based on ASI scores. He suggested that
“this outcome is consistent with the broad patterns of findings obtained using other,
similar instruments in order to examine the possibility of gender differences in
student learning (see Richardson & King, 1991)” (Richardson, 1993, p. 3 plus four
electronic pages).
Age was shown to be a reliable predictor for deep approach. It is a significant,
though perhaps again not surprising, finding that the deep approach is associated
with older learners, whereas the surface approach is associated with younger learners
(though in the prediction of surface approach, age accounted for only a very small
percentage of the variance). Upon his extensive review of research literature to
explore whether there is any evidence that more mature students adopt different
approaches to studying from younger students, Richardson (1994b) concluded:
On the basis of the rather limited research evidence to date, one can put
forward the tentative hypothesis that mature students are more likely than
younger students to adopt a deep approach or meaning orientation towards
their academic studies and, conversely, that they are less likely than
younger students to adopt a surface approach or a reproducing orientation.
Such a hypothesis is prompted by findings obtained using three different
research instruments: the SPQ (see Watkins & Hattie, 1981; Biggs, 1985,
1987, pp. 55–58); the ILP (see Watkins & Hattie, 1981); and the ASI (see
Watkins, 1982, 1983; Harper & Kember, 1986; and cf. Watkins & Hattie,
1985; Clennell, 1987, 1990) (Richardson, 1994b, p. 316).
The results from the present study, based on a more recent version of the ASI (a
section of the ASSIST), confirm this hypothesis. This finding is also highly consist-
ent with what we know from the adult education literature, namely that mature
learners are far more likely to look for relevance and meaning in their learning
activities (e.g., Brundage & MacKeracher, 1980; Knowles, 1980; Knox, 1986;
Merriam & Caffarella, 1991).
Of the CEQ Scales, only “Generic Skills” and “Heavy Workload” appeared to be
strong predictors, though Facts-Oriented Assessment and Clear Goals and Stan-
dards accounted for a small percentage of the variance. Of particular interest is that
“Generic Skills” was a significant predictor for all three approaches. While “Generic
Skills” was the main predictor for both deep and strategic approach, accounting for
14.6 and 11.6 per cent respectively, it explained an additional 7.5 per cent of the
variance for surface approach (the main predictor, we saw, was Heavy Workload).
Since beta is negative, one can infer that if “Generic Skills” are not encouraged, the
student is more likely to adopt a surface approach. These results are consistent with
the conclusions offered by Ramsden (1992). However, it would also be conceivable
that depending on what approaches to studying a student has adopted within a
particular course, he or she will perceive the extent to which he or she has acquired
generic skills as a result of the course quite differently. In other words, students with
a deep approach may perceive, to a much greater extent than their peers with a
surface approach, that the course sharpened their analytical skills, that they are more
confident to tackle unfamiliar problems as a result of the course and so forth.
“Heavy Workload” appeared to be a strong predictor for surface approach and
accounted for some of the variance of deep and strategic approach. Since in the
prediction of deep approach beta is negative, this means that the less heavy the
workload in a given course, the more likely a student is to adopt a deep approach.
Conversely, the heavier the workload, the more likely the student is to adopt a
surface approach. It is worth noting that in the prediction of surface approach,
“Heavy Workload” was the main predictor variable explaining 23.5 per cent of the
overall variance. “Facts-Oriented Assessment” also contributed 3.2 per cent of the
variance. Previous research showed strong positive relationships between surface
approach and heavy workload and inappropriate assessment (e.g., Wilson & Lizzio,
1997).
Several studies identified a relationship between perceived workload and ap-
proaches to studying (e.g., Ramsden & Entwistle, 1981; Entwistle & Ramsden,
1983; Dahlgren, 1984). These studies investigated the effect of workload on ap-
proaches to studying. Kember and Ng (1996), however, suggested that how an
individual student perceives the workload is a function of his or her characteristics,
approaches to studying, as well as contextual variables related to the course taken,
such as motivational ability of the teacher, volume of content covered and so forth.
Kember and Leung (1998) argued that the widely reported effect of workload on
approaches could be in the opposite direction, in the sense that people’s approach to
studying may influence their perception of the contextual factors of the environ-
ment, such as workload. Findings from their own study, conducted with a sample
of students enrolled in an undergraduate Engineering course at a university in Hong
Kong using Biggs’ (1987) Study Process Questionnaire, suggested the possibility of
a reciprocal relationship between approaches to studying and perceived workload.
They concluded that “students who utilise a reproducing approach are more likely
to perceive the workload as high. Also, when workloads are perceived as high, the
students may be more inclined to resort to a reproducing approach. The two
directional influences may even reinforce each other” (Kember & Leung, 1998,
p. 293 plus six electronic pages). In sum, though “Generic Skills” and “Heavy
Workload” were shown to be strong predictors of approaches to studying, the
previous discussion raised the possibility that the effect could be in the opposite
direction or reciprocal.
Of the two other scales, only “Independent Thinking and Learning within the
Discipline” was shown to be a predictor for deep and strategic approach accounting
for a small percentage of the overall variance (2.1 per cent for deep approach).
Though the variance accounted for is small, one may conclude that if students
perceive their independent thinking within the discipline is encouraged, this is
positively linked to them adopting a deep and strategic approach. If these conditions
are not in place, that is to say if students do not feel they are encouraged to think
independently, this, to some extent, explains why they might opt for a surface
approach to their studying (note that there was a significant negative moderate
correlation between surface approach and “Independent Thinking and Learning
within the Discipline”, and a significant positive moderate correlation between each
deep and strategic approach and “Independent Thinking and Learning within the
Discipline”, see Table 3).
Conclusions
What has been learned from this study and what are some implications? This study
with a large sample of Canadian undergraduate students confirmed the factor
structure underlying the ASSIST questionnaire on the main scale level. The factor
structure Ramsden reported on the CEQ was largely confirmed, though two small
changes were noted. First, principal components factor analysis of the CEQ yielded
six rather than five factors. However, with the exception of the Good Teaching scale,
which split into two factors, items loaded as reported by other researchers (e.g.,
Ramsden 1999; Wilson & Lizzio, 1997). Second, item 16 loaded highly on one of
the two Good Teaching factors.
Of the eleven new items, five formed a factor called “Independent Thinking and
Learning within the Discipline” and three a factor called “Choice in Learning”.
Only “Independent Thinking and Learning within the Discipline” was shown to
moderately correlate with all three approaches.
Previous research using the CEQ and ASI was based on samples of Australian and
British undergraduates. The results of this study with a large sample of Canadian
students clearly confirm previous findings showing a link between students’ percep-
tions of the learning environment in a given course and their approaches to studying.
The strongest predictor overall in this study was “Heavy Workload” in the
prediction of a surface approach. Although it has been argued for many years that
a heavy workload discourages students from adopting a deep approach to learning,
many instructors interviewed as part of the larger study to which the present
investigation belonged continued to comment that not having enough time to get
through the course content was a major concern for them. Staff development
initiatives that inform instructors about findings from studies such as this one (and
many others that have been cited in this article) might be a first step in helping them
change first their perspectives and then their approaches to teaching. However,
perspectives or conceptions of teaching are deeply held and firmly rooted beliefs
about practice, and these, as several scholars were able to show (e.g., Argyris &
Schön, 1974; Brookfield, 1987; Mezirow, 1991), are not easily challenged and
changed. Trigwell and Prosser (1996) commented in this context: “If teachers’
conceptions of teaching and/or learning are related to their approaches to teaching,
the task of improving teaching may be significantly more difficult than anticipated”
(p. 276). It would seem essential that instructors are provided with opportunities
that allow them to become aware of their conceptions of teaching. Once they are
aware of these, and how their conceptions of teaching are linked to approaches to
teaching and students’ approaches to studying (Trigwell, Prosser, & Waterhouse,
1999), they will be in a better position to understand the dynamics of the teaching
and learning interactions they are involved in. They might be more inclined to
change their teaching approach if they have been provided with research-based
Acknowledgements
The author would like to acknowledge the support of the Social Sciences and
Humanities Research Council of Canada. The author wishes to thank Chris Prokop
for her advice and assistance with the analysis of the data. Finally, the author wishes
to extend her thanks to the graduate research assistants helping with data collection:
Heather Castleden, Nina Erfani, Joan Lim, and Tarah Wright.
Address for correspondence: Carolin Kreber, Adult and Higher Education, Department
of Educational Policy Studies, University of Alberta, Edmonton, AB, Canada T6G
2G5. E-mail: carolin.kreber@ualberta.ca
References
Ainley, J., & Long, M. (1994). The Course experience survey: 1992 graduates. Canberra: Australian
Government Publishing Service.
Argyris, C., & Schön, D. (1974). Theory in practice: Increasing professional effectiveness. San
Francisco: Jossey-Bass.
Bandura, A. (Ed.) (1997). Self-efficacy in changing schools. Cambridge: Cambridge University
Press.
Biggs, J. (1985). The role of metalearning in study processes. British Journal of Educational
Psychology, 55, 381–394.
Biggs, J.B. (1987). Student approaches to learning and studying. Hawthorne, Victoria: Australian
Council for Educational Research.
Biglan, A. (1973). Characteristics of subject matter in different academic fields. Journal of Applied
Psychology, 57(3), 195–203.
Boud, D. (1997). Providing for lifelong learning through work-based study. Challenges for policy and
practice. Papers presented at the International Conference on Lifelong Learning, Guildford,
Surrey, UK.
Boyer, E. (1990). Scholarship Reconsidered. Washington, DC: The Carnegie Foundation.
Brookfield, S. (1987). Developing critical thinkers. San Francisco: Jossey-Bass.
Brundage, D., & MacKeracher, D. (1980). Adult learning principles and their application to program
planning. Toronto: Ontario Institute for Studies in Education.
Cambridge, B.L. (2000). The scholarship of teaching and learning: A national initiative. In M.
Kaplan & D. Lieberman (Eds.), To Improve the Academy, (pp. 18, 55–68). Bolton, MA:
Anker.
Candy, P. (1991). Self-direction for lifelong learning. San Francisco: Jossey-Bass.
Clennell, S. (Ed.) (1987). Older students in Adult Education. Milton Keynes: Open University.
Clennell, S. (Ed.) (1990). Older students in Europe: A survey of older students in four European
countries. Milton Keynes: Open University.
Dahlgren, L.O. (1984). Outcomes of learning. In F. Marton, D. Hounsell, & N. Entwistle (Eds.),
The experience of learning (pp. 19–35). Edinburgh: Scottish Academic Press.
Donald, J.G. (1997). Improving the environment for learning. San Francisco: Jossey-Bass.
Entwistle, N. (1998). Conceptions of learning, understanding and teaching in higher education.
SCRE Fellowship, November 5th.
Entwistle, N., & Ramsden, P. (1983). Understanding Student Learning. London & Canberra:
Croom Helm.
Entwistle, N., Tait, H., & McCune, V. (2000). Patterns of response to approaches to studying
inventory across contrasting groups and contexts. European Journal of the Psychology of
Education, 15, 33–48.
Evans, N.J., Forney, D.S., & Guido-DiBrito, F. (1998). Student development in college. Theory,
research, and practice. San Francisco: Jossey-Bass.
Glassick, C.E., Huber, M.T., & Maeroff, G.I. (1997). Scholarship assessed. Evaluation of the
professoriate. San Francisco: Jossey-Bass.
Harper, G., & Kember, D. (1986). Interpretation of factor analyses from the approaches to
studying inventory. British Journal of Educational Psychology, 59, 66–74.
Kember, D., & Leung, D. (1998). Influences upon students’ perceptions of workload. Educational
Psychology, 18(3), 293–308.
Kember, D., & Ng, S. (1996). An examination of the interrelationships between workload, study
time, learning approaches and academic outcomes. Studies in Higher Education, 21(3),
347–359.
Knapper, C., & Cropley, A. (2000). Lifelong learning in higher education. London: Kogan Page.
Knight, P.T., & Trowler, P.R. (2000). Department-level cultures and the improvement of
learning and teaching. Studies in Higher Education, 25, 69–83.
Knowles, M.S. (1980). The modern practice of adult education: From pedagogy to andragogy. New
York: Cambridge Books.