To cite this article: Grace Wiebe & Kaori Kabata (2010): Students' and instructors' attitudes toward
the use of CALL in foreign language teaching and learning, Computer Assisted Language Learning,
23:3, 221-234
Computer Assisted Language Learning
Vol. 23, No. 3, July 2010, 221–234
Introduction
It has often been argued that successful implementation of computers for
instructional purposes depends on three factors: students, teachers, and
infrastructure/developers (Jamieson, Chapelle, & Preiss, 2005). However, a quick review of
recent evaluation studies in the area of computer assisted language learning (CALL)1
reveals that very little work has been done to link these three elements in this field of
research. While a number of studies have looked at students’ perceptions of
CALL (e.g. Ayres, 2002; Christie, 2001; Heller, 2005; Holmes, 1998; Hwu, 2003;
Stepp-Greany, 2002), and a few have looked at teachers’ perceptions (e.g. Kadel,
2005), very few have compared them (Jamieson et al., 2005; Wiebe & Kabata, 2006;
Wiebe, Kabata, & Okamoto, 2008), and even fewer, if any, have examined whether
teachers have a good understanding of how students perceive CALL. The present
study, therefore, aims to fill the gap in the literature by directly comparing the
perceptions and attitudes of students and instructors.
Previous studies of students' perceptions of CALL (e.g. Ayres, 2002; Christie, 2001;
Heller, 2005; Holmes, 1998; Ma & Kelly, 2006; Stepp-Greany,
2002) mostly indicate positive outcomes, as illustrated by a student's comment
reported by Holmes (1998):
I think that the technology is changing the education system and the main purpose of
education is that people become fit for the modern technology society. (p. 1)
Kang-Mi and Shen (2006) found that CALL-integrated instruction does not
necessarily lead to better performance than traditional instruction, but that it does
improve students' perceptions of their learning environments. Similar
positive effects of CALL materials were found by Christie (2001), who reported that
the CALL materials affected learners' perceptions of their language learning
(Italian), as well as of computers in general. Christie (2001) also
argued that the CALL materials have an impact on the instructors' attitudes toward
the classroom since ''more class time can be devoted to cognitively more demanding
and engaging tasks’’ (p. 511). As demonstrated by Conole (2008), students today
work in ‘‘a complex and multifaceted’’ environment where technology is ‘‘at the
heart of all aspects of their lives,’’ and they expect an appropriate, rather than
extensive, integration of technologies in their instruction (2008, p. 136).
Despite the importance of instructors' roles in a successful CALL implementation
(e.g. Kadel, 2005; Stepp-Greany, 2002), very few studies have examined how
instructors perceive CALL effectiveness, or how aware instructors are of students'
perceptions of and attitudes toward CALL.
In response to the need for a better understanding of the steps and decision-
making processes necessary for CALL implementation, Coryell and Chlup (2007)
conducted a survey study and held focus group discussions to examine what the
challenges and successful experiences were for instructors and their program during
CALL implementations. The data overwhelmingly identified a need for a dedicated
computer laboratory, updated hardware, projector systems, and Internet connectiv-
ity. It was also suggested that all faculty and staff needed to believe in e-learning in
order for its use to be successful. At the same time, some programs were found to still
delineate job functions separately between language instruction and computer
instruction.
Jamieson et al. (2005) examined how three groups of stakeholders of CALL
materials – students, a teacher, and developers – perceive CALL, and whether they
agree on their evaluation. The results indicated overall positive perception in all three
groups, and a strong agreement in the judgment of the CALL appropriateness in six
criteria: language learning potential, meaning focus, learner fit, authenticity, positive
impact, and practicality. Jamieson et al. (2005) argued that the level of agreement as
found in their study should ‘‘give some confidence about the use of the summary of
their collective decisions to arrive at an overall evaluation of the CALL material’’ (p.
123). Wiebe and Law (2005), on the other hand, were interested in how well
instructors understand the patterns of students’ use of computer laboratories. They
found some discrepancy between the students’ and instructors’ responses in terms of
the frequency of use of computer laboratories, the instructors’ emphasis on the
importance of laboratory activities, and students’ awareness of instructors’ goals for
the use of computer laboratories in their courses. These results were in part expected
as instructors are often under the impression that students spend more time on their
assignments than they actually do, and students are not necessarily aware of
instructors’ goals in their classes. However, the importance of this should not be
224 G. Wiebe and K. Kabata
Building on Wiebe and Law (2005) and Jamieson et al. (2005), and on the
premise that students need to be aware of course objectives as suggested by Field
(2002), this study is an in-depth investigation and comparison of students' and
instructors' perceptions of and attitudes toward CALL.
Methodology
Over the course of two semesters, both quantitative and qualitative data were
collected and analyzed. Quantitative data were based on both students’ and teachers’
responses in surveys, and students’ actual usage pattern of the CALL materials
through the WebCT tracking system. These data were collected at the University of
Alberta in the winter and spring semesters. Participants were students enrolled in
JAPAN 102 and 202 (Basic Japanese II and Intermediate Japanese II), in the winter
semester, and JAPAN 101 and 201 (Basic Japanese I and Intermediate Japanese I),
in the spring semester. These courses utilized CALL materials as part of the
curricula: WebCT, a course management system, was used to provide students with
short instructor-designed quizzes and class notes, and Wimba, an audio tool, was
used to facilitate students’ pronunciation and reading practices. One hundred and
fifty-six students and five instructors participated in the first data collection or first
phase, and 27 students and two instructors in the second phase (see Table 1). The
smaller number of participants in the second phase was due to the much smaller
number of students enrolled in the spring semester. Some of the JAPAN 201
(Intermediate Japanese I) students in the second phase were in JAPAN 102 (Basic
Japanese II) in Phase I, but it was decided that this did not pose any problems to
our study since there were no ‘‘learning’’ effects from being exposed to the Phase I
study.
During class time, instructors and students were asked to fill in surveys designed
to measure instructors’ and students’ perceptions about the use of IT in their courses.
The survey questionnaires were concerned with the effectiveness of CALL materials
including the online course management system (WebCT) and the online audio tool
(Wimba) used in a number of lessons and assignments, the frequency of use, time
spent on them, the effect on workload, the goal for using IT, and so on. Instructors
were asked about their perception of students’ usage patterns, as well as their own
opinions about CALL and the students were asked similar questions about the
instructors’ perceptions and reasons for CALL use, as well as their own perceptions
and usage patterns. The surveys were composed of multiple questions and a set of
statements asking students and instructors to rate their perceptions, using a scale of 1
(strongly disagree) to 5 (strongly agree). Sample statements include ‘‘For being
successful in this course, it is important to practice and study using WebCT quizzes’’
and ‘‘I would be able to practice the course material more if I were given more
materials on the WebCT quizzes.’’ The instructors’ and students’ responses were
then compared, and their perceptions and attitudes toward IT were examined.
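The comparison of instructors' and students' Likert ratings described above can be sketched in code. The sketch below is purely illustrative: the item names and rating values are hypothetical placeholders, not the study's data. It simply averages the 1 (strongly disagree) to 5 (strongly agree) responses per item for each group and reports the instructor-student gap.

```python
# Sketch of an instructor/student rating comparison on a 1-5 Likert scale.
# Item names and ratings below are illustrative placeholders, not study data.

def mean(xs):
    return sum(xs) / len(xs)

def rating_gaps(student_ratings, instructor_ratings):
    """Return per-item mean ratings for each group and the instructor-student gap."""
    gaps = {}
    for item in student_ratings:
        s = mean(student_ratings[item])
        i = mean(instructor_ratings[item])
        gaps[item] = {"students": round(s, 1),
                      "instructors": round(i, 1),
                      "gap": round(i - s, 1)}
    return gaps

# Hypothetical responses: each list holds one rating per respondent.
students = {"lecture_notes": [3, 2, 3, 4, 2], "wimba": [4, 3, 4, 3, 4]}
instructors = {"lecture_notes": [5, 5], "wimba": [4, 4]}

g = rating_gaps(students, instructors)
print(g)
```

A positive gap for an item means the instructors rated it higher than the students did, which is the kind of discrepancy the surveys were designed to surface.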
The second phase of the study was conducted with 27 students and two
instructors in the spring semester to collect additional data on students' actual
usage patterns of the CALL materials. With the permission of both instructors and
students, the 27 students' frequency of access and time spent on WebCT were
collected through the WebCT tracking system. The same survey questionnaires as
were given in Phase I were also presented to the student group in the second phase,
but the results were not used in the study because of the small sample size. In order
to obtain a record of in-class instructions about the WebCT materials, the two
instructors were also asked to keep a journal to record when and what they
mentioned about WebCT and CALL materials in the class. Instructors’ journals and
students’ usage data were then compared with each other to determine how students
responded to their instructors’ encouragement. In this study and previous ones, we
have repeatedly observed a discrepancy between students’ and instructors’
perceptions of students’ use of IT materials, and of instructors’ encouragement of
the use of IT materials, both in surveys and in focus groups. For this reason, by
looking at both the students’ log-in frequencies and the average time per week spent
on WebCT per class, and comparing these with the instructors’ journals for each
class, we were interested in what effect the instructors’ behavior had on students’
pattern of WebCT use.
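The comparison of instructors' journals with students' log-in data can be sketched as follows. The dates and counts here are hypothetical placeholders; the real data came from the WebCT tracking system and the instructors' journal entries. The sketch contrasts the mean number of student log-ins on days when the journal records a mention of WebCT with the mean on days without one.

```python
# Illustrative sketch: compare WebCT log-in counts on days when the
# instructor's journal records a mention of WebCT vs. days with no mention.
# Dates and counts are hypothetical placeholders, not study data.
from datetime import date

# date -> total student log-ins recorded that day
logins = {
    date(2006, 1, 9): 40, date(2006, 1, 10): 22,
    date(2006, 1, 11): 35, date(2006, 1, 12): 18,
}
# days on which the journal records encouragement to use WebCT
mentions = {date(2006, 1, 9), date(2006, 1, 11)}

def mean_logins(days):
    counts = [logins[d] for d in days]
    return sum(counts) / len(counts)

with_mention = mean_logins([d for d in logins if d in mentions])
without_mention = mean_logins([d for d in logins if d not in mentions])
print(with_mention, without_mention)
```

The same structure extends to average minutes per week by summing session durations instead of counting log-ins.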
Qualitative data were also collected during Phase I of the study from instructors’
and students’ responses to open-ended survey questions as well as through a focus
group interview. Two students in an introductory level Japanese language course
(JAPAN 101) and two at the intermediate level (JAPAN 201) volunteered to
participate in the focus group discussion, which lasted for an hour. A set of open-
ended questions was prepared to solicit students’ candid opinions on overall
experiences of the course. Questions largely focused on four areas: students’
perceptions of CALL, the students’ awareness of the instructors’ encouragement of
using WebCT, the influence of this research on the students’ use of CALL, and
suggestions for the use of CALL.
Results
Survey responses show that instructors and students have generally positive perceptions of the usefulness of IT.3
However, while all of the instructors found the use of IT useful (67%) or very
useful (33%), just over half (52%) of the students rated it useful (17%) or very useful
(35%). The fact that fewer students than instructors found IT to be useful
corroborates our earlier findings (Wiebe & Kabata, 2006; Wiebe & Law, 2005).
Instructors and students were also asked which multimedia materials were most
effective for use in their course. As can be seen in Figure 2, students rated their
textbook as the most effective material in their course. The four other
choices were given similar ratings by the students, with WebCT quizzes the highest at
3.7, Wimba (3.5) and audio4 (3.5) the next highest, and lecture notes the lowest
at 2.9. The instructors, on the other hand, rated their lecture notes as most
effective (5.0) and Wimba as least useful (4.0), yet all their ratings were high
(average rating, 4.5), indicating they felt that each of the course materials was
relatively effective. The discrepancies between instructors’ and students’ ratings were
greatest for lecture notes and least for Wimba. The discrepancy of the ratings for the
lecture notes could be a reflection of how much the lecture notes duplicated material
already in the textbook and the style of the class presentations. The closeness of the
ratings for Wimba, on the other hand, may be due to the importance placed on
Wimba for the completion of assignments.
In a previous study by Wiebe and Kabata (2006), instructors and students were
asked similar questions about the effectiveness of their course materials without the
inclusion of the textbook as a choice. Interestingly, in both the present study and
the one done in 2006, the instructors rated most components higher than the
students, except the lecture notes, to which the instructors gave a lower usefulness
rating than the multimedia components (audio, Wimba, or WebCT) of the course.
The introduction of a fifth choice, the textbook, in this study may have changed
their ratings of the other components. However, a more likely reason for the
higher ratings of the multimedia components by the instructors in the 2006 study
may be that these technologies were new and more exciting. Now that the
instructors and students have become more familiar with technologies like Wimba
and WebCT, their ratings may reflect their familiarity with IT, showing the
possibility of a familiarity effect (see Clark, 1983; Kabata & Wiebe, under revision;
Wiebe et al., 2008).
As some of our instructors had felt that students might not always appreciate
the value of using CALL materials in their classes, or the value of doing so for
their learning, we decided to ask students to rank the reasons they thought their
instructor chose to use IT for their courses. The choices they were given to rank were:
(a) to decrease the instructor’s workload, (b) for instructional effectiveness, (c) for
classroom preparation, (d) to be consistent, (e) to give students skills, and (f) to teach
complex materials. Instructors were given a similar question asking what their goal
for using IT was, with the same list of choices, to see whether there were similarities
or differences between their own and their students’ ratings. As can be seen from
Figure 3, students and instructors did not necessarily agree. For instance, 36% of the
students felt that their instructors chose to use IT to decrease their course
preparation time (c), but no instructors felt that it decreased their preparation time.
On the other hand, both instructors and students rated (b), instructional effectiveness,
highest. All instructors (100%) indicated that their use of IT increased their
instructional effectiveness (b), while somewhat fewer of the students (66%) felt it
increased instructional effectiveness. Both instructors (67%) and students (45%)
assigned the second highest rating to (e), giving students skills. All other choices had
lower ratings from both instructors and students.
The large differences observed between students and instructors for some of the
ratings were likely due to the students not being aware of why they were using IT and
what outcome students should expect from using it. If instructors and course
coordinators are interested in changing students’ attitudes and perceptions of the
usefulness of IT, then they might consider spending some class time on why the
technology will benefit the students. However, this is no small task as can be seen in
Figure 4.
Discrepancies were also noticed when instructors were asked whether they had
explained to their students why they were using technology and students were asked
whether their instructors had done so. Figure 4 shows a comparison of the answers
of students whose instructors reported that they had explained why they used
technology (Instructors explained why) with those students whose instructors did not
(Instructors did not explain why). The answers from students were similar whether or
not their instructors had given them explanations for why they used technology in
their teaching. Interestingly, only 21% of students in the Instructors explained why
group thought their instructors had told them why they were using technology in
their class, whereas 32% of the students in the Instructors did not explain why group
thought their instructors had done so.
Figure 4. Comparison of student responses in classes where instructors reported they did or
did not explain why they were using IT in their classes.
This gives us some insight into what students are taking away from their classes, and
suggests that students are not always listening to their instructors.
Owing to this discrepancy, we asked the students in the focus group interview,
‘‘Has your instructor explained why you are using WebCT?’’ Students in the
introductory class indicated that their instructors explained their goal for using
WebCT, but the intermediate-level students maintained that their instructors
focused more on the mechanical aspects of using WebCT than on its
pedagogical purposes. While the introductory-level students
answered that their instructors had explained to them that WebCT is for practice
and participation and for developing more skill with the language, the
intermediate-level students indicated that their instructors had mainly talked about
their due dates for WebCT assignments, explained how to log on, and asked
students their reasons for missing the assignments. It appears, then, that at least
some of the students were aware that their instructors had explained why they
were using IT and in the focus group were able to articulate their instructors’
reasons.
Note: *p < 0.05.
In the introductory class, the number of student log-ins did not necessarily increase
in response to any mention or encouragement by the instructor (2nd, 3rd, and 5th W; 3rd S or
Saturday).
In the intermediate-level class, we found similar results. The instructor
encouraged the students to use the computer four days every week, but the students
did not necessarily respond to the instructor's encouragement with an increased
number of log-ins (see Figure 6). Moreover, although the frequencies of log-ins are
higher over the whole period for the intermediate class when compared to the
introductory class, the instructors’ reminder to use WebCT in this intermediate class
did not affect the students’ average time spent on WebCT. In both classes, there is
some evidence of an increase in average time spent on WebCT a day or so prior to an
assessment or exam. There is also a particularly interesting spike in average time
spent on WebCT in the introductory class after the first exam. Any number of
reasons for this spike come to mind, such as increased enthusiasm, lower-than-
expected marks in the exam, and so on. However, these would all be speculation on
the authors’ part.
We wondered whether the students who were very responsive to the instructor's
encouragement might have somewhat different WebCT usage patterns
compared to those who were less responsive. Therefore, we selected from each class
the students who logged into WebCT most frequently (five introductory-level
students and three intermediate-level students). When we compared the average
time that the eight ‘‘very responsive’’ students spent on WebCT per week with the
amount of time that the other ‘‘less responsive’’ students spent on WebCT, we
found that the ''very responsive'' students did not always stay logged in longer on
WebCT than the other students. In other words, although the ''very responsive''
students appeared to be more attuned to the instructors' encouragement and did
log in more often, this did not increase the length of their log-ins. This indicates that
instructors' encouragement may increase log-in frequency, but students may
not do more work, as indicated by the duration of their log-ins compared to
other students in their classes.
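The split between ''very responsive'' and ''less responsive'' students can be sketched as follows. The student identifiers and numbers are hypothetical placeholders; the sketch ranks students by log-in frequency, takes the most frequent as the responsive group, and compares mean minutes per log-in between the two groups.

```python
# Sketch of the "very responsive" vs. "less responsive" comparison:
# rank students by log-in frequency, take the top few, and compare mean
# time spent per log-in. Student IDs and values are hypothetical.

def split_by_responsiveness(records, top_n):
    """records: {student_id: (login_count, total_minutes)}.
    Returns (mean minutes/log-in for top group, same for the rest)."""
    ranked = sorted(records, key=lambda s: records[s][0], reverse=True)
    top, rest = ranked[:top_n], ranked[top_n:]

    def mean_minutes_per_login(group):
        logins = sum(records[s][0] for s in group)
        minutes = sum(records[s][1] for s in group)
        return minutes / logins

    return mean_minutes_per_login(top), mean_minutes_per_login(rest)

# Hypothetical tracking data: (number of log-ins, total minutes on WebCT)
records = {"s1": (50, 200), "s2": (45, 150), "s3": (20, 180), "s4": (15, 160)}
top_avg, rest_avg = split_by_responsiveness(records, top_n=2)
print(round(top_avg, 2), round(rest_avg, 2))
```

In these placeholder numbers the frequent log-in group actually spends fewer minutes per session, which mirrors the pattern reported above: more log-ins need not mean more time on task.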
Conclusions
Our results indicate that students' and instructors' perceptions of IT use and
behaviors do not always match. This result is not surprising to anyone who has
spent any time in a classroom or with students. What these results do suggest is that
instructors do not always have a good understanding of their students’ use of IT nor
do students necessarily understand their instructors’ goal for using technology-
enhanced materials in their classes. It would appear that instructors’ encouragement
affects the number of times students log into or visit the course website on WebCT,
but this does not mean that students spend more time logged on. It was not possible
in this study, however, to determine what kind of online learning activities students
were engaged in while logged on.
We also observed that in many instances students were not receptive to
encouragement for using CALL materials. This was evident when students were
asked whether or not their instructors had explained why they were using
technology. It did not seem to matter whether instructors had or had not done so;
the students’ answers were the same, with students whose instructors has talked
about why they were using technology saying that their instructors did not mention
their reasons and vice versa. Further, repetitively mentioning online course materials
while focussing on the mechanical aspects, as the intermediate-level instructor did,
rather than on the pedagogical purposes for using these materials, does not increase
students’ overall time on task. In fact, an instructor’s daily mentioning that students
should be using WebCT without clear goals for using the online materials, had little
or no effect on the students’ online behavior. This constant repetition may have had
the effect of white-noise; it receded into the background of students’ attention. The
question remains, then, whether an instructor can map his or her intentions
and purposes for using IT onto the learner, because they may not share
common goals for using IT.
In this article we maintain that the instructor would benefit from knowing that
students’ perceptions of their use of CALL materials and IT for their classes would
differ from their own and that the implication for the instructor might simply be to
inform students of what they are doing and why they are doing it. Further, we found
evidence for the importance of judiciously placed reminders and encouragement
from the instructor for increased student performance and participation. We also
demonstrate how students’ and teachers’ perceptions match and if they do not,
where they do not match and why. Future studies should explore how such
discrepancies reflect usage patterns of students and whether they are dependant on
instructors’ and students’ attitudes and behaviors.
Acknowledgments
The authors would like to acknowledge Drs Stanley Varnhagen and Jason S. Daniels, from
Learning Solutions in the Faculty of Extension, for their assistance in the collection of data;
the instructors and students from the Department of East Asian Studies for their participation
in the surveys and focus group; the instructors for keeping a daily journal; and those students
who allowed us to access their WebCT usage patterns. This research was funded in part by a
Teaching and Learning Enhancement Fund Grant from the University of Alberta.
Notes
1. The term CALL is used to refer to computer-based tools or materials for language
learning and is used in the study to refer to its use in blended or hybrid learning situations.
2. For the purposes of this article, we have used the terms IT and CALL interchangeably,
because IT was the more common and familiar term for the students being surveyed.
3. Data discussed and displayed in graphs were collected in the first phase of the study.
Similar results were observed in the second phase of the study.
4. In Figure 2, the item audio refers to exercises in which students listened to recorded items
and answered questions, either written or orally, and Wimba is used for extemporaneous
speech practice such as simulated telephone conversations and for students to record their
reading/pronunciations for assessment/practice.
Notes on contributors
Grace Wiebe (MA, Germanic linguistics; PhD, psycholinguistics, University of Alberta) is the
director of the Arts Resource Centre (2000), which supports all aspects of the incorporation of
technology in teaching and research in the humanities, social sciences and fine arts, and
adjunct professor in the Department of Linguistics (1998), Faculty of Arts, University of
Alberta, Canada. Her research interests include psycholinguistics, syllable structure, language
acquisition, and CALL.
Kaori Kabata (PhD, psycholinguistics, University of Alberta) is the director of the Prince
Takamado Japan Centre and an associate professor in the Department of East Asian Studies,
University of Alberta, Canada. She teaches Japanese language and Japanese linguistics. Her
research interests are in lexical semantics, first and second language acquisition, and the
development and evaluation of CALL materials.
References
Angelo, T.A., & Cross, K.P. (1993). Classroom assessment techniques: A handbook for college
teachers. San Francisco: Jossey-Bass.
Ayres, R. (2002). Learner attitudes towards the use of CALL. Computer Assisted Language
Learning, 15(3), 241–249.
Christie, K.N. (2001). Web-based multimedia in business Italian: A longitudinal evaluation of
learner experiences and attitudes. Italica, 78(4), 499–525.
Clark, R. (1983). Reconsidering research on learning from media. Review of Educational
Research, 53, 445–459.
Conole, G. (2008). Listening to the learner voice: The ever changing landscape of technology
use for language students. ReCALL, 20(2), 124–140.
Coryell, J.D., & Chlup, D.T. (2007). Implementing e-learning components with adult English
language learners: Vital factors and lessons learned. Computer Assisted Language
Learning, 20(3), 263–278.
Field, M.H. (2002). Toward a CALL pedagogy: Student uses and understanding. In P. Lewis
(Ed.), The changing face of CALL. A Japanese perspective (pp. 3–17). Lisse: Swets and
Zeitlinger.
Heller, I. (2005). Learner experiences and CALL-tool usability – Evaluating the Chemnitz
Internet Grammar. Computer Assisted Language Learning, 18(1–2), 119–142.
Holmes, B. (1998). Initial perceptions of CALL by Japanese university students. Computer
Assisted Language Learning, 11(4), 397–409.
Hwu, F. (2003). Learners’ behaviors in computer-based input activities elicited through
tracking technologies. Computer Assisted Language Learning, 16(1), 5–29.
Jamieson, J., Chapelle, C., & Preiss, S. (2005). CALL evaluations by developers, a teacher, and
students. CALICO Journal, 22(1), 93–138.
Kabata, K., & Wiebe, G. (under revision). Normalization or fossilization of CALL: The state
of CALL after 30 years.
Kadel, R. (2005). How teacher attitudes affect technology integration. Learning and Leading
with Technology, 32(2), 26–31.
Kang-Mi, L., & Shen, H.Z. (2006). Integration of computers into an EFL reading classroom.
ReCALL, 18(2), 212–229.
Ma, Q., & Kelly, P. (2006). Computer assisted vocabulary learning: Design and evaluation.
Computer Assisted Language Learning, 19(1), 15–45.