
Computer Assisted Language Learning

ISSN: 0958-8221 (Print) 1744-3210 (Online) Journal homepage: https://www.tandfonline.com/loi/ncal20

Using google translate in EFL drafts: a preliminary investigation

Shu-Chiao Tsai

To cite this article: Shu-Chiao Tsai (2019): Using google translate in EFL drafts: a preliminary
investigation, Computer Assisted Language Learning, DOI: 10.1080/09588221.2018.1527361

To link to this article: https://doi.org/10.1080/09588221.2018.1527361

Published online: 14 Feb 2019.

COMPUTER ASSISTED LANGUAGE LEARNING
https://doi.org/10.1080/09588221.2018.1527361

Using google translate in EFL drafts: a preliminary investigation
Shu-Chiao Tsai
Department of Applied Foreign Languages, National Kaohsiung University of Science and
Technology, Kaohsiung, Taiwan, R.O.C

ABSTRACT
This study investigates the impact on extemporaneous English-language first drafts of using Google Translate (GT) in three different tasks assigned to Chinese sophomore, junior, and senior students of English as a Foreign Language (EFL) majoring in English. Students wrote first in Chinese (Step 1), then drafted corresponding texts in English (Step 2), translated the Chinese into English using the 2016 GT version (Step 3), and finally compared their self-written (SW) English texts drafted in Step 2 with their GT English texts translated from the Chinese texts in Step 3. Both English drafts were analyzed using two types of online computational assessments to compare and evaluate grammatical components of writing quality and lexical features. Results indicate that the GT English texts presented a number of components of significantly higher writing quality than the students' SW texts: more words, fewer mistakes in spelling and grammar, and fewer errors per word. In addition, there were more advanced-level words in the students' GT texts than in their SW ones. A follow-up questionnaire survey indicated that EFL students were satisfied with using Google Translate in their English writing, especially for finding vocabulary items and enhancing the completion of English writing.

KEYWORDS
English as a Foreign Language (EFL); Google Translate; writing; machine translation; computerized writing assessment

Introduction
In an artificial intelligence era, it is important for EFL learners to use
advanced tools of technology such as software for learning English. This
study investigates possible impact on EFL drafting from Google Translate
enhanced by the Google Neural Machine Translation (GNMT) system
(Le & Schuster, 2016). The purpose of the study is twofold: (i) investigating the potential usefulness of the updated Google Translate-GNMT for producing English translations from Chinese texts written by EFL students in various contexts, in comparison with their unaided English writing performance for the same contexts, and (ii) discussing Chinese EFL students' current perceptions of uses for Google Translate.

CONTACT Shu-Chiao Tsai achiao@nkust.edu.tw Department of Applied Foreign Languages, National Kaohsiung University of Science and Technology, No. 415, Chien-Kung Rd, Kaohsiung, 80778, Taiwan, R.O.C.
© 2019 Informa UK Limited, trading as Taylor & Francis Group
Fluency in communication skills such as writing is needed in order to
properly convey ideas and concepts in different contexts. Effective writ-
ing skills are required in communications in both academic and business
contexts. However, writing in English is a complex and difficult skill for
students of English as a Foreign Language (EFL) because it is a nonlin-
ear, exploratory, and generative process (Sokolik, 2003) and requires
them to use a foreign language as they plan, draft, organize, and revise
throughout the writing process. Additional challenges face EFL writers:
the possible influence of their first language, linguistic problems with
content, vocabulary, organization, conventions and purpose (Okasha &
Hamdi, 2014), their prior experience and knowledge, cultural influences,
and their degree of audience awareness (Bakry & Alsamadani, 2015;
Santoso, 2010).
For example, EFL students are often influenced by their mother lan-
guages and generally cannot write in the same way as do native writers
during the writing process. They find writing the most difficult skill in
learning English (Okasha & Hamdi, 2014; Salma, 2015). In fact, L1 (first
language) and L2 (second language) are naturally linked to each other in
the mind of EFL writers (Cook, 2010; Druce, 2012; Weijen, Bergh,
Rijlaarsdam, & Sanders, 2009) and this mental activity should be har-
nessed rather than rejected (Leonardi, 2010). As EFL writers attempt to
use a broader vocabulary and larger set of phrases, consistent with L1
expression, the translation approach from L1 writing would be expected
to produce L2 texts with cohesion, syntactic complexity and breadth of
expression (Cohen & Brooks-Carson, 2001; Uzawa, 1996).
In addition, it is hard for EFL teachers to help students produce qual-
ity writing (Abdel-Hack, 2002; Alqurashi, 2015). In addition to lacking
writing skills and knowledge about writing processes, insufficient practice
and inadequate feedback within classes are other reasons for EFL
students’ negative attitudes towards writing (Anwar, 2000; Calhoun &
Hale, 2003). Using a dictionary is another important strategy in EFL
writing when EFL writers have uncertainty about the choice of a word
or cannot find the appropriate words (Levy & Steel, 2015; McAlpine &
Myles, 2003). Using a bilingual dictionary is reported as having a positive
role in the writing performance (Roohani & Khosravi, 2011). However, it
is still difficult for EFL writers to find the appropriate word from the
dictionary entries to use within the context of the sentences they are
writing, which would produce unnoticed errors or failures in the process
of writing (Ard, 1982; Chun, 2004; Jin & Deifell, 2013; Nesi & Meara,
1994). A corpus can also be used for a range of learning activities con-
nected with writing (Aston, 2001; Corder, 1981; Hunston, 2002; Sinclair,
2004). A corpus is defined as a large, principled collection of naturally
occurring written or spoken texts that are electronically stored on a com-
puter (O’Keeffe, McCarthy, & Carter, 2007; Reppen, 2009).
With the advanced and innovative development of information and
communication technologies (ICTs), various Internet-based tools such as
Facebook, wikis, and blogs provide a social and interactive platform on
which students have more opportunities to practice their writing, create
discussion, share their thoughts, and receive instant feedback—formal or
informal—to their writing. Accordingly, these social network platforms
are often incorporated into EFL writing classes (Ahmed, 2016;
Alsamadani, 2017; Carrió-Pastor & Romero-Forteza, 2014; Klimova, 2011; Ozdemir & Aydin, 2015; Vurdien, 2013; Yusof, Manan, & Alias,
2012). Google, used as an online corpus-derived engine, has also been
widely used for simple and quick information searches.
Machine translation (MT) works through referencing a source text to
a corpus with a large amount of the source language paired with its
translation in the target language. The possibility of achieving MT of
high quality has been questioned or criticized. Therefore, a common
practice in MT is to use it to produce raw or rough drafts, so that targeted human paraphrasing or post-editing is needed to improve translation quality (Garcia, 2011; Hu, Bederson, & Resnik, 2010; Koponen, 2016; Resnik et al., 2010; van Rensburg, Snyman, & Lotz, 2012).
However, given the advance in artificial intelligence, new interest has
been generated in using MT in various contexts, including language
learning, health education, and business (Aiken & Ghosh, 2009; Chen,
Acosta, & Barry, 2016; Dhakar, Sinha, & Pandey, 2013; Jimenez-Crespo,
2017). Based on a massive corpus of texts, Google Translate is freely
available and is improving its translation quality with grammatical accur-
acy. Its level of accuracy was approaching the minimum needed for uni-
versity admission at many institutions (Groves & Mundt, 2015). Students
can use this MT to circumvent traditional processes of learning language,
which may result in a major transformation and influence on the EFL
teaching and learning at universities. For example, it helped EFL begin-
ners communicate more and better in their L2 writing (Garcia &
Pena, 2011).
A new version of Google Translate with the GNMT system was
launched in November 2016 (Schuster, Johnson, & Thorat, 2016), and is
widely used for Chinese to English MTs, accounting for around 18 mil-
lion translations per day (Statt, 2016). It works from a translated version
of a set of isolated simple sentences and has fewer translation errors by
an average of 60% compared to Google's previous phrase-based translation system (Wu et al., 2016). This updated version of Google Translate
with the GNMT system applies a method of example-based MT (EBMT)
to improve the quality of translation by learning from millions of exam-
ples and can undertake interlingual MT by encoding the semantics of
the sentence, rather than by ‘memorizing’ phrase-to-phrase translations
(Schuster et al., 2016). Such an advanced method leads to better, more
natural translations in translating whole sentences.
Although Google Translate with the GNMT system still makes errors
that a human translator would never make and its translation accuracy
remains a concern (Groves & Mundt, 2015; Le & Schuster, 2016), there
is no doubt that progress has gradually been made with the improvement
of artificial intelligence and machine learning technology. Given its wide
availability, it is important for institutions of higher education to develop
an understanding of and code of practice for using this technology
(Mundt & Groves, 2016). Meanwhile, instructors and students need to
investigate what such increasingly powerful software can and cannot do
and think about how to make further use of it and other digital online
tools while learning English.

Methods
This study was conducted with Chinese EFL students at three different
levels who were majoring in English at a national science and technology
university in Taiwan. Student’ English proficiency was determined by an
online TOEIC-like test with a total score of 990. The TOEIC means of
the sophomore, junior, and senior students were respectively 679.8,
713.6, and 733.4, a level between B1 (equivalent to TOEIC 550) and B2
(equivalent to TOEIC 785) of CEFR (Common European Framework of
Reference for Languages).
Google Translate-GNMT, a convenient and useful online tool for trans-
lation, was used in this study to translate the students’ Chinese texts to
English ones. The study procedure included five steps, as shown in Figure
1. In Step 1, the students were assigned a writing task based on the con-
tent emphasis in the courses they took, and asked to write a draft in
Chinese (L1). For example, after watching a 5-minute passage from a
movie, 25 senior EFL students were assigned a reflective writing task
regarding the timing and fairness of applying justice; 49 junior EFL stu-
dents were asked to describe a line graph combined with a corresponding
pie graph when furnished with a marketing context; and 50 EFL sopho-
mores had to write an essay targeting the importance of participating in
international trade fairs for local trade companies in Taiwan.

Figure 1. Study procedure.

Senior and sophomore students were given only a title to complete their writing. In contrast, junior students were given clear and specific information on how to describe the graphs. Compared with the junior students, it was therefore more difficult for the senior and sophomore students to complete the writing process.
Based on the theme of each individual task, students at each level were
required to extemporaneously write a text in Chinese for 30 minutes on
their own and then upload their texts to the server of the teacher. Since
the students could think more deeply and better express their ideas,
thoughts and opinions in their native language (Cohen & Brooks-Carson, 2001), this was expected to allow them more time to prepare
their writing in English in the following step. After writing the draft in
Chinese in Step 1, the students were required to compose a correspond-
ing text in English based on the same theme assigned in Step 1, in
30 minutes, in order that they could first express their personal thoughts
on the target issue in Chinese and then organize a coherent set of ideas
and present them in a comprehensive order in English.
Instead of being asked to translate their first Chinese drafts, students
were required to download their Chinese text completed in Step 1, and
translate it into English by using Google Translate in Step 3. That is
because the present study focused on comparing the use of Google
Translate-GNMT in conducting English translation tasks from Chinese
texts written by the EFL students in various contexts, in comparison
with their unaided self-written (SW) English writing performance for the
same contexts. By comparing their SW versions completed in Step 2
with the GT versions translated from their Chinese texts written in Step
1, they could identify problems encountered in expressing themselves in
their direct English writing and potentially could benefit in content
improvement through vocabulary expansion and enhanced writing com-
pletion from implementing Google Translate-GNMT.
Splitting the production of the English texts between Steps 2 and 3
aimed to prevent students from referring to any text other than their
own text in Chinese before drafting their SW texts in Step 2. Then, after
uploading a file with their GT and SW English texts, students were asked
to complete a questionnaire survey with an open-ended question as the
fourth step. Finally, students’ uploaded GT and SW English texts were
analyzed and measured by two types of online computational
assessments.
The study was conducted in the multimedia laboratory. Students were
asked to accomplish each assigned writing task on their individual com-
puters. While writing, students were advised not to surf on the Internet
but instead to write on their own. The teacher served as a facilitator of
the students’ writing process, encouraging them to become independent
and responsible writers. Assessment of the two English versions was
done by using two different computer-based freeware programs.
Computational writing assessment can offer an accessible and theoret-
ically sound approach for quantitative analysis, and reduces human falli-
bility and the subjective nature of intuitive judgments (Crossley &
McNamara, 2009; Tsai, 2013, 2017). All the students’ English texts were
measured by two types of online assessment freeware, 1Checker (www.1checker.com) and VocabProfiler (VP, http://www.lextutor.ca/vp/eng), to evaluate students' writing quality and lexical features (Tsai, 2017).
1Checker is a web-based automated writing evaluation tool that can
immediately provide the count of mistakes in spelling and grammar.
Another parameter analyzed in this study was the probability of writing
errors, meaning the total count of students’ written errors in spelling
and grammar divided by the total word count of the text.
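This error-probability parameter is straightforward to compute once 1Checker's counts are in hand. A minimal sketch in Python (the function name and inputs are illustrative, not part of 1Checker's interface):

```python
def writing_error_rate(word_count: float, spelling_mistakes: float,
                       grammar_mistakes: float) -> float:
    """Probability of writing errors: total spelling and grammar
    errors divided by the total word count of the text."""
    if word_count == 0:
        return 0.0
    return (spelling_mistakes + grammar_mistakes) / word_count

# A text of 100 words with 2 spelling and 3 grammar mistakes
print(writing_error_rate(100, 2, 3))  # 0.05
```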
Lexicon was also considered. Based on the frequency of words appear-
ing in very large text corpora, the online VP allows the user to classify
the words of the analyzed text into different categories: K1 (the most fre-
quent 1000 words), K2 (the second most frequent thousand words),
AWL (academic word list), and off-list words. In general, words in the
K2 category are considered as being at a more advanced vocabulary level
than those in the K1 category. For this study, the AWL and off-list
words were classified as one group because they generally consisted of a
more professional vocabulary with which students were not familiar. In
addition, the numbers and frequencies of different words and the lexical
density of the text were also measured. With regard to writing, lexical density refers to the proportion of lexical (content) words among all words of the text. Thus, in the
case of written texts, lexical density is a measure of the amount of infor-
mation the text tries to convey.
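The VP-style band classification and the lexical-density measure described above can be sketched in Python. The word sets below are toy stand-ins for VocabProfiler's actual K1/K2 frequency lists, and the density function assumes the common definition of lexical words over total words:

```python
from typing import Dict, List, Set

def vp_profile(tokens: List[str], k1: Set[str], k2: Set[str]) -> Dict[str, int]:
    """Count tokens falling into the K1 band, the K2 band, or the
    combined AWL/off-list group (grouped together, as in this study)."""
    counts = {"K1": 0, "K2": 0, "AWL_offlist": 0}
    for tok in tokens:
        word = tok.lower()
        if word in k1:
            counts["K1"] += 1
        elif word in k2:
            counts["K2"] += 1
        else:
            counts["AWL_offlist"] += 1
    return counts

def lexical_density(lexical_words: int, total_words: int) -> float:
    """Share of lexical (content) words among all words of the text."""
    return lexical_words / total_words if total_words else 0.0

# Toy frequency bands (illustrative only)
k1 = {"the", "trade", "company", "is"}
k2 = {"fair"}
print(vp_profile(["The", "trade", "fair", "analysis"], k1, k2))
# {'K1': 2, 'K2': 1, 'AWL_offlist': 1}
```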
In addition, a questionnaire was administered to elicit students’ per-
ceptions of the use of Google Translate, using a 5-point Likert scale rang-
ing from 1 (‘strongly disagree’) to 5 (‘strongly agree’), or Very Satisfied
(5), Satisfied (4), Neutral (3), Not Satisfied (2), Disliked (1). The satisfac-
tion questionnaire (QF) was reviewed by two experienced teachers in the
Applied Foreign Languages department to ensure the content validity of
the survey in this study. All returned questionnaires were analyzed using
SPSS (IBM Corp. Released 2011. IBM SPSS Statistics for Windows,
Version 20.0. Armonk, NY: IBM Corp.). The questionnaire included
eight items and investigated whether or not students perceived Google
Translate as a useful tool for improving EFL writing with respect to
vocabulary, sentence pattern, expression, speed of writing, and grammar.
It also tested their willingness to continue using Google Translate. In
addition, an open-ended question was given to allow students to freely
record their comments or opinions about Google Translate.
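The internal-consistency check applied to this questionnaire (Cronbach's α) can also be reproduced outside SPSS. A stdlib-only sketch of the standard formula, assuming each respondent's item scores are collected in one row (illustrative code, not the study's actual SPSS procedure):

```python
import statistics
from typing import List, Sequence

def cronbach_alpha(responses: List[Sequence[float]]) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of total scores), where `responses` has one row per respondent and
    one column per questionnaire item."""
    k = len(responses[0])
    item_variances = [
        statistics.pvariance([row[i] for row in responses]) for i in range(k)
    ]
    total_variance = statistics.pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_variances) / total_variance)

# Three respondents answering two perfectly consistent items
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```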

Results
Based on the research procedure, the EFL English major students at
three different academic levels attempted different types of writing tasks,
producing writing in three genres: reflection (or opinion), scientific
description, and persuasion. The English writing performance of the students' GT versions and their self-written (SW) versions was assessed by the two types of online computational assessment freeware, VP and 1Checker. The results are shown in Tables 1 and 2.

Table 1. Analysis of results by 1Checker.

Means of students' writing parameters
Grade                Version   Number of words   Spelling mistakes   Grammar mistakes   Writing errors per word
Senior (N = 25)      SW        107.5             1.40                3.20               0.052
                     GT        139.5             0.48                0.48               0.008
Junior (N = 50)      SW        145.2             3.29                5.16               0.066
                     GT        143.8             0.22                1.34               0.014
Sophomore (N = 49)   SW        125.6             0.66                2.33               0.026
                     GT        160.9             0.51                2.13               0.019

p < .05 and p < .01: significant difference in performance between the SW and GT groups.

Table 2. Analysis of results by VocabProfiler (VP).

Means of students' writing parameters (number of words)
Grade                Version   K1      K2    AWL and off-list   Different words   Lexical density (p value)
Senior (N = 25)      SW        89.8    4.2   12.5               68.8              0.464
                     GT        114.6   5.7   19.2               84.2              0.474 (0.27)
Junior (N = 50)      SW        117.3   7.4   20.5               68.0              0.551
                     GT        114.4   8.8   20.6               68.8              0.561 (0.17)
Sophomore (N = 49)   SW        105.8   7.2   12.6               73.7              0.539
                     GT        129.0   8.9   23.0               90.4              0.549 (0.11)

p < .05 and p < .01: significant difference in writing performance between the SW and GT groups.
As shown in Table 1, the parameters measured by 1Checker in the
students’ GT texts generally presented significantly better writing quality
than those of their SW versions: a greater number of words and fewer mistakes in spelling and grammar, resulting in fewer writing errors per word.
In addition, based on the analysis measured by VP shown in Table 2,
the words in the students’ GT texts were significantly more advanced
than in their SW ones, especially for senior and sophomore students
who were given more complicated tasks. These results suggest that the
nature and genre of tasks may influence the effects of Google Translate
in this study, corresponding to similar effects found in an earlier study
by Chang and Sun (2009). Students’ GT versions had a greater lexical
density than their SW versions in all three different writing tasks; their p
value analyzed by an independent sample t-test indicated that students’
GT version texts not only conveyed more information than their SW
version, but also had a better grammatical accuracy and more words at a
higher level, which could be used to promote EFL students’ writing con-
fidence as well as the quality of their writing (Yoon, 2008).
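The p values for lexical density reported in Table 2 come from an independent-samples t-test; the underlying statistic can be illustrated with a pooled-variance Student's t. This is a sketch of the test itself, not the authors' actual analysis script:

```python
import math
import statistics
from typing import Sequence

def independent_t(a: Sequence[float], b: Sequence[float]) -> float:
    """Student's t statistic for two independent samples, using the
    pooled (equal-variance) estimate of the standard error."""
    na, nb = len(a), len(b)
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    return (mean_a - mean_b) / math.sqrt(pooled * (1 / na + 1 / nb))

# Identical samples yield t = 0 (no difference in means)
print(independent_t([0.46, 0.55, 0.54], [0.46, 0.55, 0.54]))  # 0.0
```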
Table 3. Results of the questionnaire survey.


N (student number) = 84                                                              Mean   STD
Q1. I'm satisfied with English writing translated by Google Translate.               3.52   0.814
Q2. Google Translate is helpful for content improvement in English writing.          3.23   0.923
Q3. Google Translate is helpful for vocabulary use in English writing.               3.80   0.741
Q4. Google Translate is helpful for the use of sentence pattern in English writing.  2.93   0.979
Q5. Google Translate is helpful for expression in English writing.                   3.17   0.929
Q6. Google Translate enhances the completion of English writing.                     3.76   0.801
Q7. Google Translate is accurate in the grammar of English writing.                  2.36   0.790
Q8. I will continue using Google Translate.                                          3.64   0.849
Overall mean                                                                         3.30

The Cronbach’s a reliability of the questionnaire was 0.744, greater


than 0.7, a reliable level. The questionnaire results are shown in Table 3.
Four questions out of eight scored higher than 3.30, the overall mean of the questionnaire, which was considered the dividing line between higher and lower scores.
Some issues are highlighted as follows:

1. Q3 (M = 3.80, vocabulary use) scored the highest, which indicates that most students thought the greatest benefit from Google Translate lay in vocabulary use.
2. Q6 (M = 3.76, enhancing the completion of English writing) scored the second highest, which suggests that students thought Google Translate could help with completing the assigned writing task.
3. Q1 (M = 3.52, satisfaction with English writing translated by Google Translate) and Q8 (M = 3.64, continuing use of Google Translate), with scores higher than the overall mean of the questionnaire, suggest that students were satisfied with their GT texts and would like to keep using the tool.
4. Q7 (M = 2.36, English grammar accuracy of Google Translate), with the lowest score, indicates that the students thought Google Translate could not provide accurate grammar. A closer investigation of the students' GT versions indicated that some of the problems in grammar and syntax came from incorrect use of punctuation marks in their Chinese texts. For example, commas were sometimes used where periods were needed, so sentence and paragraph breaks could not be appropriately rendered by Google Translate, which resulted in grammatical mistakes identified by the computational assessment of 1Checker.

As for the open-ended question, 80 students wrote comments or suggestions. Fifty-one students (63.8%) mentioned that Google Translate was helpful for their English writing, and 37 students (46.3%) thought Google Translate could help improve their English vocabulary. Meanwhile, 22 students (27.5%) and 21 students (26.3%) thought there was still room for improvement in grammar and syntax, respectively, and 7 students (8.8%) commented that students should complete English writing by themselves, instead of using translation software.

Table 4. Sample statements from the SW version compared with the GT version.

Senior student A: reflection writing
SW … uses power to protect people, it causes some hurt to them at the end because of his stubborn.
GT … spent his lifetime to follow the law to save the living beings, but he was too tough in the process of implementing the law, so that many innocent people hurt.

Sophomore student B: essay writing
SW … to be more familiar due to this trade fair and to improve the revenue so that can have a more stable foundation on the international stage.
GT … to become a major brand in the international market, get much attention in the world, and have a further opportunity to stand on the international stage.

NB: sample grammatical mistakes or inaccurate use of vocabulary in the SW version are underlined.

Discussion
Two examples of corresponding paragraphs from the two more difficult writing tasks, showing sophomore and senior students' GT and SW versions, are given in Table 4. The first example concerns the attitude of the
police while executing the law, and the second relates to the advantages for Taiwanese trade companies of participating in international fairs. Based on Table 4, in addition to fewer grammar mistakes, increased
propositions or ideas and a more conventional or accurate use of
vocabulary and expression are delivered in the GT version.
In addition to vocabulary, several factors should be considered in pro-
ducing EFL writing of better quality, such as an emphasis on content,
organization, conventions and purpose. In the process of writing, stu-
dents have to brainstorm, plan, write, and revise based on the theme of
the assigned topic. It is more challenging for EFL students to complete a
prompted English draft with accurate use of vocabulary, grammar and
structure in a limited time. The 2016 version of Google Translate has the
additional advantage of creating translations of whole sentences, rather
than focusing on vocabulary only. According to the results obtained in
this study, in addition to better English writing performance in the GT
texts measured by VP and 1Checker, high scores for Q1 (M = 3.52, satisfaction with English writing translated by Google Translate) and Q6 (M = 3.76, enhancing the completion of English writing) imply that
Google Translate can be used in a supportive approach in EFL writing.
In addition to the teacher who is usually the primary or even the sole
audience for student writing, using Google Translate can provide EFL
students with a 'second audience'. For example, when having difficulty in
expressing their thoughts and ideas in English, EFL students can use
Google Translate to immediately translate those thoughts and ideas into
English. The GT texts can offer students initial advice on word usage and
sentence structures for further reference and revision in their English writing. However, the incorporation of Google Translate is a learner-centered approach which may not be appropriate for beginning or low-level students (Aston, 2001; Granath, 2009; Hunston, 2002). This study was conducted with EFL students majoring in English, whose means on the online TOEIC-like test ranged from 679.8 to 733.4, above the B1 level of the CEFR. Indeed, students' language ability, both in Chinese and English,
would be another interesting and important issue to be further studied in
terms of deciding if, when, and how to incorporate Google Translate into
Chinese EFL writing. In addition, by further comparing the difference
between the GT and SW texts, EFL students can be prompted to learn the
use of new vocabulary, grammatical patterns and sentence structures based
on the translated example, and then try to discover differences between
the texts, understand their usage, and even propose possible solutions to
better express their thoughts in English.
According to the questionnaire results in this study, the two highest
scores for Q3 (M ¼ 3.80) and Q6 (M ¼ 3.76) revealed that the students
thought Google Translate could help them choose more appropriate
words to convey their intended meaning and enhance the completion of
their English writing. However, we cannot yet say that GT versions are
better than those produced with the help of dictionaries or corpora. It is
expected that the present discussion will encourage further research in
this area. For example, other supportive corpus tools or digital diction-
aries can also be incorporated and even compared with Google Translate
in order to gain a wider and deeper understanding of whether and how such online referencing tools could improve EFL writing.
The results shown in Table 2 indicate that, compared with the SW version, the GT versions of the reflective and essay writing tasks had more, and significantly better, writing parameters in all the categories of the VP analysis than did the graph description. The genre of the assigned tasks clearly
played a role in the translated version of Google Translate based on
students’ Chinese texts. At the beginning of the writing process, specific
instructions shown in the graph were given to junior students so that they
clearly understood how and what to describe. However, for the reflective
writing and essay writing, students were given only a title, and therefore
needed more time to brainstorm and draft. Thus, it was more difficult for
senior and sophomore students to complete the assigned English writing
tasks of reflection and essay. While completing a more difficult writing
task, the Chinese EFL students in this study felt more confident to express
their ideas and thoughts in Chinese than in English, which allowed them
to write more words or ideas in Chinese. This difference could in turn
lead to more information to convey in the translated English texts than
that of their self-written English texts, as shown in Tables 1 and 2. These
results were also reinforced by students' positive responses to Questions 1 and 6 on the questionnaire: the completion of the assigned writing task was enhanced with the use of Google Translate (Q6, M = 3.76) and they were satisfied with the English writing translated by Google Translate (Q1, M = 3.52), as shown in Table 3. However, according to the results of
the questionnaire, most of the students thought Google Translate could
not provide accurate grammar, although the computational assessment
suggested that there were fewer grammar mistakes in student GT versions
than in their SW writing as shown in Table 1. Thus, further research is
needed to investigate this gap.

Conclusions
Advanced technologies inevitably influence language learning. This article is a very preliminary study investigating the translation proficiency
of Google Translate incorporating the GNMT system by comparing the
writing parameters of English GT texts translated from Chinese texts
written by EFL students in various contexts with those of their unaided
English texts for the same contexts. Students’ perceptions of using Google
Translate have been also elicited and discussed. The results indicate that
the translated English GT texts presented a number of writing parame-
ters with significantly higher writing proficiency than those of students’
self-written (SW) texts. In addition, the results of the questionnaire sur-
vey revealed that EFL students were satisfied with using Google Translate
in their English writing.
Since using Google Translate in producing a translated passage and
incorporating Google Translate in EFL writing are different, it is import-
ant to carefully investigate if and how students’ writing ability can be
enhanced by the incorporation of Google Translate in EFL writing. There
are still some interesting issues to be further studied, both qualitatively
and quantitatively, to better understand the possible influence on English
learning resulting from the L1 and English proficiency of the students,
the tool’s incorporation to other courses or genres, and student reception
across different contexts. Studies should be conducted for EFL students
at higher English levels or other dominant languages in addition to
Chinese. Case studies combining longitudinal investigations of GT-
assisted writing with individual interviews of students are needed to
determine whether EFL students actually do improve their writing and


how they perceive their own skills. The students will use this tool: as
teachers, we need to see what it does with, for, and to them.
With the rapid development of artificial intelligence technologies, many innovative software programs for English learning have been created and can be easily accessed on computers and mobile phones, making it easy for students to try them out. For example, in addition to Google Translate, Google has recently launched new wireless headphones called Pixel Buds that work with Google Assistant on mobile phones to translate between 40 different languages in real time. Google Assistant was also released in 2016; like Siri (Speech Interpretation and Recognition Interface), it can answer questions and hold two-way conversations. Apple recently developed voice-assistant translation software to add to its popular Siri for mobile phones, so that users can immediately learn how to say popular English phrases in a variety of languages, including Chinese, French, German, Italian, and Spanish. Facing the continued advance of artificial intelligence technologies, it is essential that EFL teachers take these new directions into serious consideration and investigate how to make good use of such advanced software within classes, providing students with more opportunities to be exposed to natural language use and further helping them improve their English skills.

Disclosure statement
No potential conflict of interest was reported by the author.

Funding
This work was partially supported by the Ministry of Science and Technology (MOST), Taiwan, ROC, under Grant MOST 106-2410-H-151-012.

Notes on contributors
Shu-Chiao Tsai received the PhD degree in materials science from Paris-Sud (XI) University in 1996. After working in an optoelectronics company as a marketing and technical administrator for several years, he is currently Dean of Humanities and Social Sciences and a professor in the Department of Applied Foreign Languages at National Kaohsiung University of Applied Sciences, Taiwan. For the past few years, he has focused on the development of technical and commercial ESP (English for Specific Purposes) courseware and its application in the classroom to help university students and adult learners augment language skills and knowledge applicable to the job market in Taiwan.
ORCID
Shu-Chiao Tsai https://orcid.org/0000-0002-0541-5593
