
Journal of Accounting Education 58 (2022) 100769


Effect of a summer school on formative and summative assessment in accounting education

Evelien Opdecam*, Patricia Everaert
Ghent University, Faculty of Economics and Business Administration, Department of Accounting, Corporate Finance and Taxation, Sint-Pietersplein 7,
9000 Gent, Belgium

Article history: Received 27 December 2019; Received in revised form 25 January 2022; Accepted 25 January 2022; Available online 17 February 2022

Keywords: Summer school; Accounting; Formative assessment; Summative assessment; Higher education

Abstract

This paper investigates the effect of a voluntary accounting summer school on formative and summative assessments in accounting education. A quasi-experiment with a questionnaire design was conducted in a first-year undergraduate accounting course. The sample consisted of 497 first-time higher education students. The effectiveness of voluntary summer school for performance was examined by comparing the attendees’ and non-attendees’ scores on two formative assessments and one summative assessment to determine whether attendees scored higher. The results of multiple one-way analyses of covariance, controlling for different variables, reveal that attendance at summer school has a significantly positive effect on short-term performance (the first formative assessment), but this effect disappears over time. No significant differences are found for the second formative assessment or the summative assessment. Stepwise linear regression analyses show that the scores on the test the students completed at the end of summer school are significantly positively related to the summative assessment score (the final exam at the end of the semester). Implications and recommendations for further research and the organization of summer schools are covered.

© 2022 Elsevier Ltd. All rights reserved.

1. Introduction

As early as 1922, James Egbert wrote a paper on university summer schools in the United States. Summer schools put the long summer break to good use as a period during which knowledge can be gained (e.g., Alexander, Entwisle, & Olson, 2001; Cooper, Nye, Charlton, Lindsay, & Greathouse, 1996; Schacter & Jo, 2005). Subsequently, many papers describing the concepts of summer schools have been published (e.g., Denton, Solari, Ciancio, Hecht, & Swank, 2010; Xu & De Arment, 2017; Xu, Padilla, & Silva, 2014), with summer schools gaining interest over time. Nonetheless, it must be noted that there is no such thing as the summer school per se: its only indisputable, if obvious, characteristic is that it takes place during the summer holidays. Other reasons for the interest in summer school can be found in the low passing rates of first-year
higher education students (Declercq & Verboven, 2010). There is an abrupt transition between secondary school and higher
education in terms of atmosphere and expectations (Brooman & Darwent, 2014; Byrne & Flood, 2005). Summer school has
the potential to make this transition occur more smoothly.
The literature review on summer schools reveals interesting points. First, previous research on summer schools seems to
focus primarily on learning outcomes, especially the difference between studying course content in an intensive summer

* Corresponding author.
E-mail addresses: Evelien.Opdecam@UGent.be (E. Opdecam), Patricia.Everaert@UGent.be (P. Everaert).

https://doi.org/10.1016/j.jaccedu.2022.100769
0748-5751/© 2022 Elsevier Ltd. All rights reserved.

school format and studying it during the school year at the standard pace. In this paper, summer school is viewed as a means to bridge the gap between secondary and higher education. Consequently, in this study summer school is organized as a voluntary supplementary information module prior to the start of the first semester. During the semester, a regular weekly accounting course is organized, and performance differences between the summer school attendees and non-attendees are examined. Second, this study differs from others because it takes into account four different points of assessment. Generally, the moment at which performance is assessed can vary: in the longer term, at a single moment (e.g., Inglis, Broadbent, & Dall’alba, 1993; Schacter & Jo, 2005; Zvoch & Robertson, 2017), or at multiple moments. Accounting is a chal-
lenging course; without systematic practice of the material covered in class, it is often difficult for students to master and
retain the fundamental concepts and techniques (Rotenstein, Davis, & Tatum, 2009). A formative assessment provides the
students an opportunity to learn from their mistakes and to obtain feedback on how well they are doing in terms of the
intended learning outcomes. Thus, feedback is important for student learning (Watty et al., 2013). Our study’s assessments
at different times during the semester in combination with comparing attendees with non-attendees will enrich the litera-
ture on summer schools, because most prior studies include only summative assessments.
Consequently, the first objective of the current study is to investigate the effect of voluntary supplementary summer
school on performance, measured at different times. In particular, this paper examines whether attending a voluntary sum-
mer school for accounting has a beneficial effect on either formative or summative assessments in a subsequent accounting
course.
In this study, the summer school is organized within an open-gate university context. The study was conducted at a large
research-oriented university in the Flanders region of Belgium, during the first undergraduate year of the economics/busi-
ness economics/business engineering program. This setting is very different from those of the Anglo-Saxon institutions
where many of the studies discussed above were conducted. In those institutions, freshman students typically pass a rigor-
ous selection procedure based on both academic criteria (e.g., prior scores on national standardized exams) and non-
academic criteria (e.g., personal statements or selection interviews). Unlike in these Anglo-Saxon countries and most others, the transition to higher education in Belgium involves no formal selection method or entrance criteria¹ (Pinxten, van Soom, Peeters, & Al, 2019). In Belgium, higher education is completely publicly financed, with low tuition fees (i.e., less than EUR 1,000 or USD 1,200 per annum at the time of this writing, in 2021). Access to higher education is open, and there have been no “selection at the gate” admission procedures so far. A high school diploma is sufficient to enter university.
Consequently, there are large differences between incoming students in terms of prior knowledge, attitudes, and skills
(Pinxten et al., 2019). For example, a substantial number of students enroll without any knowledge of accounting. More than
half of incoming students’ secondary school curricula did not include accounting, because their major (e.g., mathematics/sciences) did not include courses such as economics or business economics, or because the school/teacher did not offer these as an extracurricular topic. In addition, some students enroll with a weak background in mathematics. This issue is particularly challenging, since mathematics and statistics are core courses of the first-year program, alongside economics and accounting.² Consequently, there is an ongoing debate on whether the open-gate system should be continued in the future. If admission
tests are to be required, then the question is what types of topics need to be included, given that students have different back-
grounds (and can be successful, even without prior knowledge of accounting). Therefore, the second objective of this study is to
scrutinize the predictive value of the formative assessment, taken at the end of the voluntary summer school, in explaining the
final exam score at the end of the semester (summative assessment). Certainly, the summer school has the potential to bridge
the gap between secondary and higher education; however, the predictive value of the summer school has still to be proven.
This paper makes four main contributions. First, despite the fact that any conceivable topic can be covered during sum-
mer school, summer schools in languages and mathematics are much more researched than summer schools in accounting
(e.g., Denton et al., 2010; Gorard, Siddiqui, & See, 2015; Inglis et al., 1993; Schacter & Jo, 2005; Xu et al., 2014; Zvoch &
Robertson, 2017). Consequently, research on accounting summer schools is scarce. This study fills this first gap by investi-
gating whether learning accounting principles during a summer school is helpful for freshman students, especially in con-
texts in which accounting is not part of the secondary education curriculum. Second, unlike many summer schools, which
are organized for underperforming students, this summer school is organized for all students, in a voluntary supplementary
format. Since all students are investigated, this study compares attendees with non-attendees in a between-subjects control
group design. Third, it involves a longitudinal study on the effect of an accounting summer school on performance, where
several assessment points during the semester are taken into account. Hence, the potential beneficial effect of attending a
summer school is linked to a range of the students’ accounting assessment scores, collected during their first semester at
university. Fourth, this study uses a data set of 497 students in a Central European university setting. Compared to other
research, this study investigates a large group of students, with a control group.
The remainder of this paper is organized as follows. Section 2 discusses prior research on the key concepts of this study
and develops the hypotheses. Section 3 describes the methodology, followed by a description of the sample and an analysis

¹ Except for study programs in medicine, dentistry, and arts education (Pinxten et al., 2014).
² The full first year of the economics/business economics/business engineering program consists of economics (eight European Credit Transfer and Accumulation System, or ECTS, credits), accounting (eight ECTS credits), business administration (four ECTS credits), mathematics (eight ECTS credits), statistics (eight ECTS credits), computer systems (five ECTS credits), production technology (five ECTS credits), human sciences (four ECTS credits), economic English (three ECTS credits), economic French (three ECTS credits), and law (four ECTS credits).


of the results in Section 4. Section 5 discusses the results and provides an overview of the limitations of the research and
recommendations.

2. Literature review

2.1. Summer school

The term summer school refers to a course that takes place during the summer holidays. Summer school can be organized
by educational institutions or other parties, and the courses can be free or not. Some summer schools require students to
apply and be accepted, whereas others accept all students who want to attend, and still others are specifically designed
for only certain students. A distinction can be made between summer school that is part of a curriculum and summer school
provided as an extra learning opportunity. Xu et al. (2014) discuss the former, where students receive credits in exchange for
attendance in summer school. Most of the time, students take the course in a shorter time frame than the regular semester.
In these cases, summer school illustrates a modular approach toward educational courses. Bekman, Aksu-Koç, & Erguvanlı-
Taylan (2011) and Zvoch & Robertson (2017) provide examples of the latter, i.e. summer school as an extra learning
opportunity.
Further, there is no clear, unambiguous definition of the concept of summer school. There are as many interpretations as
there are organized summer schools, differing in terms of form, content, duration, target group, purpose, and so forth. The
most remarkable characteristics are the target group, purpose, and approach.
First, the literature shows that the target group differs among summer schools. Summer schools are organized for all ages,
from pre-kindergarten, kindergarten, and primary school to secondary school and higher education (e.g., Ho & Karagiannidis,
2007; Xu & De Arment, 2017; Zvoch & Robertson, 2017). Summer schools are already offered in kindergarten because early
intervention in reading difficulties has proven to be effective (Mathes & Denton, 2002; Torgesen, 2000). When provided as an
extra learning opportunity, summer school can be either open to all students or restricted to underrepresented, underper-
forming, or economically disadvantaged students, as compensatory education (Bekman et al., 2011; Ellis & Sawyer, 2009;
Zvoch & Robertson, 2017; Zvoch & Stevens, 2013). The results on the extent of knowledge retention during the summer hol-
idays for high- and low-income students are mixed, with most of the research concluding a greater need for summer school
among economically disadvantaged students, who are the ones who experience the greatest loss of knowledge during the summer holidays (Alexander et al., 2001; Cooper et al., 1996; Schacter & Jo, 2005). Prior literature concludes that
all students, regardless of their economic situation, experience some degree of summer learning loss, but the effect is greater
for lower-income students (Cooper et al., 1996; Schacter & Jo, 2005). The results of prior research are reflected in the ample
prevalence of summer schools for economically disadvantaged students (Denton et al., 2010; Xu & De Arment, 2017).
Second, besides differing by target group, summer schools can differ in terms of their purpose. Enriching summer schools
give students a chance to get ahead in specific materials, whereas remedial summer schools target underperforming stu-
dents to help them keep up with their fellow students (e.g., Bekman et al., 2011; Zvoch & Robertson, 2017; Zvoch &
Stevens, 2013). Online summer schools at Maastricht University, for example, illustrate this type of summer school, where
the objective is to equalize students’ prior knowledge before the start of the school year. Therefore, students who performed
below standard on a pre-test are encouraged to attend summer school (Rienties, Rehm & Dijkstra, 2005). Another type of
summer school targets recent graduates to prepare them for higher education. This preparation can involve either the
knowledge and skill level or a more social level. One reason for the latter is the issue of summer melt, which is the phe-
nomenon of (registered) students not enrolling in school after the summer break because of lack of motivation or other
motives. Again, lower-income students have been shown to be more vulnerable (Roderick, Nagaoka, Coca, & Moeller,
2008). A variant of the preparatory type of summer school gives students the opportunity to learn about less popular sub-
jects. Students can thus sample subjects before registering, to avoid ending up in a major they dislike. Generally, summer
schools can be organized around any subject, often involving languages, but mathematics, science, engineering, and so forth,
are also popular.
Third, many different approaches to summer schools exist. This is not surprising, since summer schools have much more
freedom in organizing educational activities than during the normal school year. Summer schools can make substantial use
of guest lectures, tutorials, active learning, group work, and web-based lectures (e.g., Jones & Iredale, 2006; Rienties et al.,
2005). Skills can be practiced outside the school building to directly link theory and practice. The Durham Summer School
is an example of a summer school that focuses on teaching young people entrepreneurial skills (Jones & Iredale, 2006). A
school building is not even a requisite, since summer school can take place online, using the newest technology (Issroff &
Eisenstadt, 1997; Rienties et al., 2005). Moreover, the duration of summer school and the number of hours per day differ
greatly (e.g., Bekman et al., 2011; Denton et al., 2010; Egbert, 1922; Jones & Iredale, 2006; Zvoch & Robertson, 2017). The
Durham Summer School, for example, lasts one week, whereas an intervention program for disadvantaged children can last
10 weeks (Bekman et al., 2011; Jones & Iredale, 2006). In practice, however, any combination is possible. Because of their
flexibility in duration, summer schools can be organized with an accent on the content, adopting the duration that best fits it and thus increasing or reducing learning intensity. Some summer schools stress learning, while others provide a balance
between learning and cultural activities. The latter often involve summer school abroad.


In our study’s setting, all the students were offered the opportunity to attend a summer school for accounting, offered to
enrolled freshmen, just before they enter the university. Making the summer school voluntary is consistent with the move-
ment toward greater autonomy in the workplace (Lewis & Hayward, 2003). Although students are only rarely given the
opportunity to choose for themselves during their training programs (Frymier, Shulman & Houser, 1996), they are con-
fronted with multiple choices when they enter the work field. Graduate students are increasingly faced with entering orga-
nizations that require empowerment, self-determination, and self-management (Lewis & Hayward, 2003). In the psychology
literature, self-determination theory (SDT) has been described as a promising theory to study people’s motivation (Ryan &
Deci, 2000). The principles of SDT have recently been applied in the field of education to study the motivation of learners
(Aelterman, Vansteenkiste, & Haerens, 2018; Liu, Fang, Wan, & Zhou, 2016). In particular, the SDT framework postulates that
intrinsic motivation will be enhanced through the satisfaction of three basic psychological needs: autonomy (i.e., acting volitionally), competence (i.e., experiencing mastery), and relatedness (i.e., a sense of belonging) (Ryan & Deci, 2000). According to SDT, the need for autonomy can be fulfilled by providing students with real choices. The need for
autonomy refers to the need to feel a sense of full volition and “choicefulness” regarding one’s activities and goals, a feeling
that can arise when students are given the option of whether or not to attend the summer school. Research based on SDT has
shown that autonomy-supportive contexts improve both intrinsic motivation and the well-being of students (Ryan & Deci,
2000). Offering students a choice is considered a practice aimed at supporting autonomy. Ryan and Deci (2000) also argue
that intrinsic motivation can only arise when students feel that all three basic psychological needs are met. Consequently, if
choice is provided, it has to be offered in a way that satisfies all needs to a meaningful degree. We posit that this is the case in the
current study, as the students’ main objective of attending the summer school is to become more competent on the subject
of accounting. In addition, at the end of the summer school a test is included to support students’ sense of competence, as an
initial assessment of students’ knowledge (which is not graded) (Katz & Assor, 2007). In addition, feedback is provided to the
students, so that students (and the instructor) receive information about their learning process, their mistakes and their
learning efforts (Watty et al., 2013). In this competence-supporting context, students can fully commit themselves to the
summer school that they have opted for, without worrying about their achievement level and possibility of negative eval-
uations. In terms of relatedness, the summer school offers students plenty of opportunities to collaborate with each other,
to provide meaningful help and to receive help from peers and from the instructor when it is needed; and to discuss and
reflect on the first university experiences (Katz & Assor, 2007).
Studies thus far on the potential negative (and positive) effects of summer schools have not focused on feelings of auton-
omy; that is, they have involved merely situations in which the instructor decides the students should attend summer
school. In the current study, the students are given the opportunity to voluntarily attend a summer school in accounting dur-
ing their vacation.

2.2. Summer schools in accounting

Research about summer schools in the field of accounting is scarce. One paper on summer schools in the accounting field
is that of Inglis et al. (1993). These researchers compare the performance of management accounting students who enrolled
in a four-week intensive summer school with that of students who took the course during the first semester. Short-term
results on performance and passing rates were collected, as well as long-term results on other subjects. A detailed overview
of the results is provided in the next section, on assessment.

2.3. Formative assessment

Since the literature on summer schools does not cover the link between formative and summative assessments, we focus
on the distinction between short- and longer-term assessments. Formative and summative assessments are two different
types of assessment that can be combined to cover a student’s entire learning process. They are discussed separately only
because they have different functions, one supporting the learning process and the other reporting performance (Harlen,
2005).
Many definitions of formative assessment are available in the literature, with the most important definitions for this paper
discussed here. Scriven (1967) coined the term formative evaluation, and its definition is related to the evaluation of curricula
during their development and testing phase. Furthermore, Scriven defines assessment as a judgment made, for rationaliza-
tion, in relation to weighted targeted objectives, to obtain a (relative) score. Bloom (1968) involves the students’ learning
process in the definition. The author defines formative assessment as interfering in the learning process to allow improve-
ments in teaching and learning by making it clear to students what materials they have mastered and what materials they
still need to study to perform well in a summative assessment. Ramaprasad (1983) frames the difference
between current and targeted performance as a gap. Moreover, the author states that feedback is used to close that gap,
meaning that a formative assessment improves future learning performance (Black & Wiliam, 1998). Harlen (2005) adds that a formative assessment is an assessment for learning; it gives students feedback for future learning.
Consequently, in this paper, the following definition of formative assessment is used: a formative assessment is a short-
term (maximum three months) formative offline or online assessment that gives students information about their learning
process. It provides feedback on what the students have mastered and what they still need to learn to close the gap between
their current and targeted performance. There can be multiple formative assessments.

Researchers have studied the effect of attending summer school on short-term learning performance, with mixed results.
Some report that summer school has no significant effect on short-term performance (e.g., McCombs et al., 2014). However,
most studies in disciplines other than accounting find that summer school yields positive results, with significant knowledge
gains for the attendees (Lütgendorf-Caucig et al., 2017) and increases in performance (e.g., Inglis et al., 1993; Xu et al., 2014; Xu & De Arment, 2017; Zvoch & Robertson, 2017; Zvoch & Stevens, 2013). Based on this prior literature, the following
hypothesis is drawn.
H1: Students who attended summer school will score better on formative assessment tests during the semester, compared to
students who did not attend.

2.4. Summative assessment

Like formative assessment, summative assessment is thoroughly discussed in prior literature. The most important defi-
nitions are presented here. Again, Scriven (1967) introduced the term summative evaluation as the final evaluation of an edu-
cational program. Bloom (1968) connects summative assessment to students’ learning process by defining it as the grading
of their final performance. However, Harlen (2005) discloses that the reason for the grading differs, for example, to verify
performance evolution, to inform parties, and so forth. There can only be one summative assessment. A formative assess-
ment is only possible if there is a summative assessment too. However, Harlen (2005) again combines formative and sum-
mative assessment by using formative assessment information for summative assessment. Our paper uses the following
definition: summative assessment is a long-term (more than three months) assessment by which students’ final perfor-
mance is graded. In most cases, this means that there can only be one summative assessment. However, it is possible to have
multiple summative assessments, with weighted scores to obtain one final score (Marriott, 2009).
Schacter and Jo (2005) find a long-term improvement in performance among summer school attendees. The authors verify
the impact of a summer school nine months afterward. Although satisfied with the results, they note that the difference in
performance between attendees and non-attendees decreases over time. Zvoch and Robertson (2017) also check the influ-
ence of a summer school on long-term performance. They assess performance in January and May of the following academic
year and confirm the positive results of Schacter and Jo (2005). Roberts et al. (2018) examine how summer school in science,
technology, engineering, and mathematics (STEM) could influence student performance by providing opportunities to access
and extend STEM content learning. In accounting education specifically, Inglis et al. (1993) assess performance not only at the
end of a course, but also in the longer term. They report that attendees’ performance in the other first-semester subjects was
not worse than that of non-attendees. This result is explained by the fact that attendees are already in study mode, know
what is expected of them, and have more time to study. Further, Inglis et al. (1993) show that the performance of summer
school attendees continues to be better two years after summer school, with the attendees still demonstrating a more thor-
ough comprehension of the key material.
Based on the literature, the following hypotheses are established.
H2: Students who attended summer school will score better on the summative test (at the end of the semester), compared to
students who did not attend summer school.
H3: The formative test score (collected at the end of summer school) is positively related to the summative assessment score (at
the end of the semester).

3. Methodology

3.1. Design

The objective of this study is to examine the effect of summer school on formative and summative assessments in
accounting education. A quasi-experiment (see Fig. 1) consisting of multiple post-tests with a control group is conducted dur-
ing the academic year 2018/2019 with first-year accounting undergraduates enrolled in the field of study of economics/busi-
ness economics/business engineering at the faculty of Economics and Business Administration. Only students enrolled for
the first time in higher education in 2018 and who took the final examination of the introductory accounting course
(Accounting A) are included in the study.
Each year in September, before the start of the academic year, a summer school is organized. The target group of this sum-
mer school consists of university students in the field of study of economics/business economics/business engineering. The
purpose is to help students who believe they have insufficient accounting knowledge to start the course Accounting A, as well
as students who want to get ahead. In addition, the students become acquainted with bigger groups and the university atmo-
sphere. During four days (five and a half hours per day), the volunteer students take an introductory accounting course (ap-
proach). Lectures on theory are held in the morning, followed by active learning by means of exercises in the afternoon. All
the topics covered during the week are covered again during Accounting A (one hour and 15 min of theory and one hour and
15 min of exercises per week, during 12 weeks). Since students can choose to either attend summer school or not, the study
distinguishes between two groups. To address the hypotheses, the following is investigated. First, by conducting multiple
one-way analyses of covariance (ANCOVAs), we compare the two groups (control group and experimental group) to examine


Fig. 1. Research design.

whether the performance of summer school attendees is higher than that of non-attendees in formative and summative
assessments. Second, by conducting regressions, we measure the impact of different formative assessment scores on the
summative assessment (final exam).
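To make the analysis strategy concrete, the following minimal sketch shows how one of these one-way ANCOVAs could be run in Python with pandas and statsmodels. The file name, column names (attended, formative1, pct_secondary, gpa1w, prior_acc), and variable coding are illustrative assumptions, not the authors' actual materials.

```python
# Minimal ANCOVA sketch (assumed file and column names; not the authors' code).
# Compares attendees vs. non-attendees on formative test 1 while controlling
# for the covariates of model "C" in the paper.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("students.csv")  # hypothetical file, one row per student

# ANCOVA expressed as an OLS model: group factor plus covariates.
model = smf.ols(
    "formative1 ~ C(attended) + C(gender) + pct_secondary + gpa1w + C(prior_acc)",
    data=df,
).fit()

# F-test for each term, adjusted for the others (Type II sums of squares).
print(anova_lm(model, typ=2))
```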

3.2. Formative and summative assessments

Learning performance is measured at four different times: once at the end of summer school, twice with a formative
assessment during the first semester of the academic year (2018), and once in January 2019 (i.e., the exam). The first test
(summer school test) was administered on the last day of summer school. Consequently, only the scores of attendees who
were present that day were collected. In the summer school test, one (closed book) integration exercise was asked during
class where the students needed to translate different events (e.g., the start of a company, obtaining a loan, making an invest-
ment, purchasing goods, selling goods, depreciation, distributing shareholder dividends) into financial statements. The
objective is to calculate the profit/loss and to present a balance sheet, after the period-end transactions. Afterward, a feed-
back session is provided for the test attendees.
The formative tests (formative test 1 and formative test 2) are given to the students in the first and last weeks of November, respectively. The tests are online (hence open book) and consist of multiple-choice questions, journal entries, and filling out (parts of) financial statements. Participation is voluntary but rewarded with half a point per fully completed test, regardless of the correctness of the answers. Thus, by properly completing both tests, students can earn one point of the possible 20 on their exam upfront. Students are given four days to complete each test at home, and, after each
test, feedback is given during a plenary session, with the possibility of asking questions afterward.
The summative assessment, that is, the exam, consists of two parts. The first comprises 20 multiple-choice questions con-
sisting of three types: (1) calculations, (2) definitions and concepts, and (3) statements. The second part consists of four
extensive exercises, two similar to the exercises covered in class, with 10 journal entries. Two of the exercises involve more
elaborate integration exercises of greater complexity, focused mainly on filling out and understanding the balance sheet and
income statement. We use the total score on the exam as the performance measure. A short overview of the different assess-
ments is given in Table 1.

Table 1
Overview of the different assessments.

| | Summer school test | Formative test 1 | Formative test 2 | Summative test (Accounting exam) |
|---|---|---|---|---|
| Chapters | Basics from 3 to 7 and 10 to 12 | 3 to 6 | 7 to 10 | 1 to 13 |
| Type | Closed book | Open book | Open book | Closed book |
| Location | In class | Take-home | Take-home | In class |
| Duration | 45 min (fixed time) | 180 min (estimated) | 180 min (estimated) | 225 min (fixed time) |
| Format | 1 integration exercise | All sorts of questions | All sorts of questions | All sorts of questions |
| Week | Before semester starts (week 0) | During semester (week 5) | During semester (week 9) | After study period (week 15) |


3.3. Procedures

Potentially confounding variables need to be included in the analysis because of the non-randomization of the two com-
pared groups (summer school attendees versus non-attendees). Therefore, other variables were collected via a questionnaire
(see the Appendix) administered at the beginning of the second semester (February 26, 2019) as part of a large educational
survey. The students were given time to fill in the questionnaire during class, to maximize completions. The instructor was
present during the administration of the instrument; however, the instructor did not intervene in the data gathering process.
Students entered their student ID code, but did not write their names on the questionnaires. The study was approved by the
university. All the students were assured that neither the teacher nor university administration would have access to indi-
viduals’ data and that all personal information would be treated confidentially and used solely for research purposes.

3.4. Sample and flow chart

The sample consists of 497 students who were officially enrolled by the administration of the university. Different steps
were followed to collect the data. First, 218 of the students attended the summer school, while 279 did not, and 137 of the
attendees took the voluntary summer school test. Second, assessment data were collected during the semester: 484 students
participated in formative test 1 and 467 students participated in formative test 2. The exam scores for all 497 students were
obtained. Third, additional data were collected from the survey. Missing, incorrect, and multiple answers for a question were
treated as missing values. Students who answered the control questions with missing, incorrect, or multiple answers were
removed from the sample, as well as students who could not be linked to a score. These criteria resulted in a subsample of
373 students who completed the questionnaire. A summary of the different data flows is given in the flow chart of Fig. 2.

3.5. Measurement methods

An overview of the variables and their measurement methods is given in Table 2.

Fig. 2. Flow chart.

Table 2
Overview of the variables and their measurement methods.

| Variable | Operationalization |
|---|---|
| Independent variable | |
| Summer school attendance | Obtained by the administration (1 = attendee, 0 = non-attendee) |
| Dependent variables | |
| Summer school test (formative assessment) | Test on the last day of summer school (score out of 20) |
| Formative test 1 (formative assessment) | Online test 1 (score out of 20) |
| Formative test 2 (formative assessment) | Online test 2 (score out of 20) |
| Accounting exam (summative assessment) | Exam score in January (score out of 20) |
| Control variables | |
| Gender | Obtained by the administration (1 = male, 0 = female) |
| Prior experience accounting | Measured with a questionnaire (1 = yes, 0 = no) |
| Grades secondary school | Measured with a questionnaire (score out of 100) |
| GPA1W (grade point average, or GPA, for semester 1, without the accounting score) | Obtained by the administration (score out of 20) |


3.5.1. Dependent variables


The learning outcomes of all the students in the sample are measured by the number of correct answers on the two for-
mative assessments during the semester (formative test 1 and formative test 2) and the grades obtained on the summative
assessment, that is, the accounting exam at the end of the semester. An extra formative assessment was administered during the four-day summer school, and attendees’ scores on this summer school test were gathered on its last day.

3.5.2. Control variables


Because of the non-randomization of the two groups, the following control variables are included.
Gender: Prior research about the effect of gender on academic performance is mixed. A number of studies indicate no sig-
nificant relation between gender and performance (e.g., Byrne & Flood, 2008; Gist, Goedde, & Ward, 1996; Guney, 2009;
Naser & Peel, 1998; Paver & Gammie, 2005). Other scholars in accounting education find that gender has a significant effect
on academic performance (e.g., Fogarty & Goldwater, 2010; Garkaz, Banimahd, & Esmaeili, 2011; Koh & Koh, 1999). These
mixed results can be explained by the fact that gender in itself might not be decisive in determining academic performance.
The differing impacts of gender on academic performance can be explained by the different learning approaches of the stu-
dents (Booth, Luckett, & Mladenovic, 1999; Byrne, Flood, & Willis, 2002; Everaert, Opdecam, & Maussen, 2017; Jones &
Hassall, 1997), study effort (Arquero, Byrne, Flood, & Gonzalez, 2009; Everaert et al., 2017), and the format of the assessment
(Arthur & Everaert, 2012). Therefore, we include gender in the analyses to control for its potential effect.
Prior experience accounting: Previous studies suggest that prior experience with accounting positively impacts first-year
performance (Gist et al., 1996). However, Koh and Koh (1999) find no significant relation in the first year. Furthermore, in the
second and third years, the relation even becomes negative, meaning that students with prior experience do not perform as
well as students without it, because the students exert less study effort and prior knowledge tends to be insufficient when
the accounting curriculum in higher education differs greatly from that in secondary school.
% secondary school: A positive relation between ability and formative and summative assessments seems obvious. Ability
is measured by the score obtained in the last year of secondary school (Duff, 2004). This is a self-reported measure, collected via the questionnaire.
GPA1W: Following Phillips (2015), we use the GPA as a second proxy for ability. Two modifications are made: (1) we use
the mean score of the student’s first semester courses only, and (2) we exclude the grade of the first-semester accounting
course. This results in the measure for the GPA of semester 1 without accounting (GPA1W). It is the mean grade for all
the courses (minus Accounting A), for a total maximum score of 20.
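As an illustration of the GPA1W construction, the sketch below computes the mean first-semester grade per student while excluding Accounting A; the DataFrame layout and column names are assumptions made for the example.

```python
# Sketch of the GPA1W measure (assumed data layout, not the authors' code):
# mean first-semester grade per student, excluding the Accounting A course.
import pandas as pd

grades = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 2],
    "course": ["Accounting A", "Mathematics", "Economics"] * 2,
    "grade": [12, 14, 10, 9, 11, 13],  # each graded out of 20
})

gpa1w = (
    grades[grades["course"] != "Accounting A"]  # drop the accounting grade
    .groupby("student_id")["grade"]
    .mean()                                     # mean over the remaining courses
    .rename("GPA1W")
)
print(gpa1w)  # student 1: (14 + 10) / 2 = 12.0; student 2: (11 + 13) / 2 = 12.0
```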

4. Results

4.1. Descriptive statistics

Table 3 provides an overview of the descriptive statistics: 66% (326 of 497) of the students identified themselves as male,
and 34% (171 of 497) as female. The average age is 18.06 years (SD = 0.54). This is not surprising, since most students are 18
when they start university. The self-reported percentage of the last year of secondary school is 72.16%, on average
(SD = 5.96).

Table 3
Descriptive statistics and comparison between attendees and non-attendees.

Panel A: Categorical variables

| Variable | N | N G1* | N G2* | Frequency | Frequency G1* | Frequency G2* | χ² | p-value |
|---|---|---|---|---|---|---|---|---|
| Gender | 497 | 218 | 279 | | | | 0.037 | 0.850 |
| Male | | | | 66% (326/497) | 66% | 65% | | |
| Female | | | | 34% (171/497) | 34% | 35% | | |
| Prior experience accounting | 373 | 171 | 202 | | | | 30.503 | 0.000 |
| Yes | | | | 41% (151/373) | 26% | 54% | | |
| No | | | | 59% (279/373) | 74% | 46% | | |

Panel B: Continuous variables

| Variable | N | Min | Max | Mean (SD) | N G1* | N G2* | Mean G1* | Mean G2* | t-value | p-value |
|---|---|---|---|---|---|---|---|---|---|---|
| Age | 372 | 17 | 21 | 18.06 (0.54) | 170 | 202 | 18.01 | 18.11 | 2.384 | 0.018 |
| % secondary school | 327 | 52 | 90 | 72.16 (5.96) | 157 | 170 | 71.82 | 72.45 | 0.972 | 0.332 |
| GPA1W | 497 | 2.17 | 17.17 | 9.64 (3.12) | 218 | 279 | 10.03 | 9.30 | 2.021 | 0.044 |
| Summer school test** | 137 | 2 | 20 | 8.85 (3.83) | | | | | | |
| Formative test 1 | 484 | 3.36 | 20 | 15.18 (2.67) | 215 | 267 | 15.52 | 14.91 | 2.521 | 0.012 |
| Formative test 2 | 467 | 1.75 | 20 | 12.80 (2.95) | 207 | 260 | 12.93 | 12.70 | 0.849 | 0.397 |
| Accounting exam | 497 | 3 | 19 | 11.74 (3.54) | 218 | 279 | 12.09 | 11.46 | 2.002 | 0.046 |

* G1 = summer school attendees; G2 = non-attendees.
** Summer school test not available for non-attendees.


The average scores on the summer school test and the first and second formative assessment tests are 8.85 out of 20
(SD = 3.83), 15.18 out of 20 (SD = 2.67), and 12.8 out of 20 (SD = 2.95). The average accounting exam score is 11.74 out
of 20 (SD = 3.54).

4.2. Attendees compared with non-attendees

Table 3 also compares summer school attendees and non-attendees. No significant difference in gender is found between the two groups (χ² = 0.037, p = 0.850). In contrast, the results show a significant association between summer school attendance and prior accounting experience (χ² = 30.503, p = 0.000). A total of 74% of the attendees have no prior experience, a much higher percentage than for the non-attendees (46%). It seems that students with less prior experience try to compensate for their lack of knowledge by attending summer school. Concerning the continuous variables, we find that the attendees are
generally a little younger than the non-attendees (with means of 18.01 years for the attendees and 18.11 for the non-
attendees). The attendees and non-attendees report equal scores for secondary education; however, the value of GPA1W
is significantly different between the two groups. The mean of GPA1W is higher for the attendees (mean = 10.03) than
for the non-attendees (mean = 9.3). In addition, there are significant differences regarding Test 1 (with means of 15.52
for the attendees and 14.91 for the non-attendees). The same favorable direction is found for the accounting exam, where
the mean of the attendees (12.09) is higher than the mean of the non-attendees (11.46).
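The group comparisons in Table 3 are standard independent-samples tests; a minimal sketch follows, with the file and column names (attended, exam) assumed for illustration.

```python
# Sketch: attendee vs. non-attendee comparison of exam scores, as in Table 3
# (assumed column names; not the authors' code).
import pandas as pd
from scipy import stats

df = pd.read_csv("students.csv")
g1 = df.loc[df["attended"] == 1, "exam"].dropna()  # attendees
g2 = df.loc[df["attended"] == 0, "exam"].dropna()  # non-attendees

t, p = stats.ttest_ind(g1, g2)  # pooled-variance t-test
print(f"t = {t:.3f}, p = {p:.3f}")  # Table 3 reports t = 2.002, p = 0.046
```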

4.3. Correlations

Table 4 provides an overview of the Pearson correlations between the variables. Summer school attendance is positively
correlated with formative test 1 (r = 0.114, p = 0.012) and the accounting exam (r = 0.089, p = 0.048). When looking at the
relation between the formative and summative tests, we find a positive correlation between the summer school test and the
accounting exam (r = 0.265, p = 0.002). Students who already performed well during summer school also performed well a
few months later, on the exam. Further, formative test 1 is positively linked to formative test 2 (r = 0.392, p < 0.001) and the
accounting exam (r = 0.293, p = 0.000). Similarly, formative test 2 is positively correlated to the accounting exam (r = 0.189,
p = 0.000). These results illustrate that regular study of the material is beneficial.
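A matrix like Table 4 can be reproduced in a few lines; the column names below are assumed for the sketch.

```python
# Sketch: Pearson correlations and a pairwise p-value (assumed column names).
import pandas as pd
from scipy import stats

df = pd.read_csv("students.csv")
cols = ["attended", "gender", "pct_secondary", "formative1", "formative2", "exam", "gpa1w"]
print(df[cols].corr(method="pearson"))  # coefficients only

# p-value for one pair, e.g., formative test 1 vs. the exam, dropping missing rows.
pair = df[["formative1", "exam"]].dropna()
r, p = stats.pearsonr(pair["formative1"], pair["exam"])
print(f"r = {r:.3f}, p = {p:.3f}")  # Table 4 reports r = 0.293 for this pair
```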

4.4. Hypothesis testing

In H1, summer school attendance is expected to positively influence formative assessment scores. In H2, attendance in
summer school is expected to positively influence summative assessment scores. These hypotheses are investigated by com-
paring the mean scores of the two groups, that is, summer school attendees and non-attendees, at three different points in
time (formative test 1, formative test 2, and the accounting exam). Multiple one-way ANCOVAs are thus conducted. Further-
more, for H3, expecting a positive relation between the three formative and summative tests, a detailed analysis concerning
the influence of the scores for the summer school tests and the two formative tests on the accounting exam is conducted by
means of stepwise linear regression analyses.

4.4.1. Attendees versus non-attendees (H1 and H2)


The analyses are shown in Table 5. For each performance test, three one-way ANCOVAs are conducted: in analysis A, gen-
der is a control variable; in analysis B, gender and ability measure percentage secondary school and GPA1W, respectively, as
control variables; and, in analysis C, gender, the percentage of secondary school, GPA1W, and prior accounting experience
are the control variables.
For formative test 1, as shown in Table 5, Panel A, in all three ANCOVAs, a significant effect of attending a summer school is
found (A, F = 6.292, p = 0.012; B, F = 4.795, p = 0.029; C, F = 5.191, p = 0.023). The mean score of the formative test 1 attendees
is significantly higher than that of non-attendees, even when controlling for all the covariates (gender and ability). In

Table 4
Correlations.

| | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| 1 Summer school attendance | 1 | | | | | | | |
| 2 Gender | 0.009 | 1 | | | | | | |
| 3 % secondary school | 0.054 | 0.233** | 1 | | | | | |
| 4 Summer school test | –ᵃ | 0.080 | 0.107 | 1 | | | | |
| 5 Formative test 1 | 0.114* | 0.034 | 0.209** | 0.10 | 1 | | | |
| 6 Formative test 2 | 0.039 | 0.009 | 0.148** | 0.040 | 0.392** | 1 | | |
| 7 Accounting exam | 0.089* | 0.050 | 0.461** | 0.265** | 0.293** | 0.189** | 1 | |
| 8 GPA1W | 0.111* | 0.010 | 0.470** | 0.206* | 0.283** | 0.174** | 0.826** | 1 |

* Correlation significant at the 0.05 level (two-tailed).
** Correlation significant at the 0.01 level (two-tailed).
ᵃ The correlation cannot be computed, because one of the variables is constant (i.e., only attendee data available).


Table 5
Hypothesis testing.

Panel A: Dependent variable = formative test 1 (out of 20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 215 | 269 | 15.52 | 14.91 | 6.292 | 0.012 |
| | Gender | | | | | 0.519 | 0.472 |
| B | Summer school attendance | 155 | 167 | 15.80 | 15.20 | 4.795 | 0.029 |
| | Gender | | | | | 0.051 | 0.821 |
| | % secondary school | | | | | 2.986 | 0.085 |
| | GPA1W | | | | | 14.866 | 0.000 |
| C | Summer school attendance | 155 | 167 | 15.81 | 15.18 | 5.191 | 0.023 |
| | Gender | | | | | 0.102 | 0.749 |
| | % secondary school | | | | | 2.842 | 0.093 |
| | GPA1W | | | | | 15.245 | 0.000 |
| | Prior experience accounting | | | | | 0.411 | 0.522 |

Panel B: Dependent variable = formative test 2 (out of 20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 207 | 260 | 12.93 | 12.70 | 0.729 | 0.394 |
| | Gender | | | | | 0.048 | 0.827 |
| B | Summer school attendance | 150 | 161 | 13.02 | 13.18 | 0.262 | 0.609 |
| | Gender | | | | | 0.664 | 0.416 |
| | % secondary school | | | | | 2.591 | 0.108 |
| | GPA1W | | | | | 2.816 | 0.094 |
| C | Summer school attendance | 150 | 161 | 13.05 | 13.16 | 0.098 | 0.755 |
| | Gender | | | | | 0.838 | 0.361 |
| | % secondary school | | | | | 2.449 | 0.119 |
| | GPA1W | | | | | 3.182 | 0.075 |
| | Prior experience accounting | | | | | 0.577 | 0.448 |

Panel C: Dependent variable = accounting exam (out of 20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 218 | 279 | 12.09 | 11.46 | 3.974 | 0.047 |
| | Gender | | | | | 1.305 | 0.254 |
| B | Summer school attendance | 157 | 170 | 12.36 | 12.13 | 0.949 | 0.331 |
| | Gender | | | | | 0.656 | 0.419 |
| | % secondary school | | | | | 5.172 | 0.024 |
| | GPA1W | | | | | 498.569 | 0.000 |
| C | Summer school attendance | 157 | 170 | 12.17 | 12.33 | 0.046 | 0.830 |
| | Gender | | | | | 0.142 | 0.706 |
| | % secondary school | | | | | 4.603 | 0.033 |
| | GPA1W | | | | | 518.898 | 0.000 |
| | Prior experience accounting | | | | | 8.900 | 0.003 |

* G1 = summer school attendees; G2 = non-attendees.
** EMMean = estimated marginal mean.

ANCOVA C, where gender, ability, and prior experience in accounting are included in the analysis, the mean score for formative
test 1 is 15.81 out of 20 for the attendees, compared to 15.18 out of 20 for the non-attendees. Hence, the data support
hypothesis 1 for formative test 1.
For formative test 2, as shown in Table 5, Panel B, no significant difference is found between the scores of the attendees
and non-attendees on formative test 2, regardless of which covariates are controlled for (A, F = 0.729, p = 0.394; B, F = 0.262, p = 0.609;
C, F = 0.098, p = 0.755). It is also clear that the scores are much lower than for formative test 1, with means around 12–13 out
of 20. Hence, hypothesis 1 cannot be supported for formative test 2.
For the final exam, the results are shown in Table 5, Panel C. A significant difference between attendees’ and non-
attendees’ scores is found when controlling for gender (A, F = 3.974, p = 0.047). The attendees’ mean score is above 60%, with
12.09 out of 20, while that of the non-attendees is under 60%, with only 11.46 out of 20. This means that attendees of the
summer school score higher on the final exam than the non-attendees, finding support for hypothesis 2. However, when the
percentage secondary school and GPA1W are included as control variables, the significant effect of attendance disappears (B,
F = 0.949, p = 0.331). When prior accounting experience is also included, again, we no longer find a significant difference in
the score on the final exam between the attendees and the non-attendees of the summer school (C, F = 0.046, p = 0.830).
To further refine the significant difference in scores on formative test 1 between the attendees and non-attendees (hypothesis 1), a distinction between high-, moderate-, and low-ability students is made based on their results for GPA1W. High-ability students are defined as scoring a minimum of 11 out of 20 for GPA1W (i.e., at or above the 66.67th percentile); low-ability students are defined as scoring below eight out of 20 for GPA1W (i.e., beneath the 33.33rd percentile); and the moderate-ability group is defined as scoring from eight to below 11 out of 20 for GPA1W. The chi-squared test reveals significant
differences between these three ability groups and between summer school attendees and non-attendees (χ² = 9.595, p = 0.008). As described earlier, the attendees scored higher on GPA1W than the non-attendees, which means that attendance at the summer school is not independent of ability.
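The ability grouping and independence test could be implemented as follows; the cut points (below 8, 8 to below 11, 11 and above, out of 20) come from the paper, while the file and column names are assumptions.

```python
# Sketch: ability groups from GPA1W and a chi-squared test of independence
# against summer school attendance (assumed column names; not the authors' code).
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("students.csv")

# Cut points from the paper: <8 = low, 8 to <11 = moderate, >=11 = high.
df["ability"] = pd.cut(
    df["gpa1w"],
    bins=[0, 8, 11, 21],                 # upper bound 21 so a score of 20 is included
    labels=["low", "moderate", "high"],
    right=False,                         # left-inclusive bins: [0,8), [8,11), [11,21)
)

table = pd.crosstab(df["ability"], df["attended"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # the paper reports chi2 = 9.595, p = 0.008
```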
Of all the summer school attendees, 33.4% are high-ability students, 33.2% moderate-ability students, and 33.4% low-
ability students. The previous tests are therefore redone for formative test 1 for the three ability groups separately, to exam-
ine whether the effect of summer school on performance differs depending on students’ level of ability. An overview of the
results is given in Table 6.
Instead of three significant results, as for the whole sample, only one significant result is found when we run the ANCO-
VAs in each ability group separately. Remarkably, only significant differences for formative test 1 are found in the low-ability
group. The scores of summer school attendees are significantly higher than those of non-attendees (ANCOVA A, F = 7.043,
p = 0.009; ANCOVA B, F = 9.952, p = 0.002; ANCOVA C, F = 6.807, p = 0.011). For the last-mentioned test, the mean attendee
score amounts to 15.58 out of 20, and the mean non-attendee score is 13.75 out of 20.
To conclude, attendance in summer school positively influences a student’s formative assessment if this assessment takes
place close to summer school (finding support for H1). The formative test in week 5 after summer school shows a significant beneficial effect of attendance. In contrast, the formative test in week 9 no longer reveals any significant beneficial effect of summer school attendance. In addition, looking at the results of the summative assessment of the
accounting exam at the end of the semester (week 15), we do not find summer school attendance to have a positive influ-
ence. Therefore, H2 is not supported by the data. These results show that the effect of attending a summer school does not hold
over the longer term; however, positive effects are found in the first part of the semester, where the scores on the formative
assessment test of attendees are higher than those of non-attendees.

4.4.2. Summer school test, formative tests, and the accounting exam (H3)
To test H3, we use stepwise regression. The results are shown in Table 7, second column. Stepwise regression is conducted for students who took the summer school test (n = 137) using the following regression model:

AccountingExam = β₀ + β₁·Gender + β₂·SummerSchoolTest + β₃·FormativeTest1 + β₄·FormativeTest2
In the third column, stepwise regression is run for the students who did not take the summer school, using the following
model.

Table 6
Hypothesis testing by ability group.

Dependent variable = formative test 1 (out of 20)

High ability (GPA1W ≥ 11/20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 74 | 89 | 15.72 | 16.02 | 0.728 | 0.395 |
| | Gender | | | | | 0.363 | 0.548 |
| B | Summer school attendance | 58 | 65 | 15.84 | 16.23 | 1.028 | 0.313 |
| | Gender | | | | | 0.332 | 0.565 |
| | % secondary school | | | | | 4.719 | 0.032 |
| C | Summer school attendance | 58 | 65 | 16.00 | 16.08 | 0.041 | 0.840 |
| | Gender | | | | | 0.673 | 0.414 |
| | % secondary school | | | | | 6.095 | 0.015 |
| | Prior experience accounting | | | | | 6.043 | 0.015 |

Moderate ability (8/20 ≤ GPA1W < 11/20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 84 | 77 | 15.56 | 15.17 | 0.994 | 0.320 |
| | Gender | | | | | 0.556 | 0.552 |
| B | Summer school attendance | 63 | 49 | 15.90 | 15.38 | 1.421 | 0.236 |
| | Gender | | | | | 0.080 | 0.778 |
| | % secondary school | | | | | 7.989 | 0.006 |
| C | Summer school attendance | 63 | 49 | 15.90 | 15.38 | 1.356 | 0.247 |
| | Gender | | | | | 0.074 | 0.786 |
| | % secondary school | | | | | 7.799 | 0.006 |
| | Prior experience accounting | | | | | 0.001 | 0.982 |

Low ability (GPA1W < 8/20)

| Model | Independent variables | N G1* | N G2* | EMMean** attendees | EMMean** non-attendees | F-value | p-value |
|---|---|---|---|---|---|---|---|
| A | Summer school attendance | 55 | 103 | 15.13 | 13.76 | 7.043 | 0.009 |
| | Gender | | | | | 0.009 | 0.923 |
| B | Summer school attendance | 33 | 53 | 15.79 | 13.62 | 9.952 | 0.002 |
| | Gender | | | | | 0.318 | 0.575 |
| | % secondary school | | | | | 0.218 | 0.642 |
| C | Summer school attendance | 33 | 53 | 15.58 | 13.75 | 6.807 | 0.011 |
| | Gender | | | | | 0.637 | 0.427 |
| | % secondary school | | | | | 0.145 | 0.704 |
| | Prior experience accounting | | | | | 2.991 | 0.088 |

* G1 = summer school attendees; G2 = non-attendees.
** EMMean = estimated marginal mean.


AccountingExam = β₀ + β₁·Gender + β₂·FormativeTest1 + β₃·FormativeTest2
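Stepwise selection is not built into statsmodels, so the sketch below implements a simple forward version; the entry rule (add the predictor with the smallest p-value below 0.05) and the column names are assumptions, as the paper does not state its stepwise settings.

```python
# Sketch: greedy forward stepwise OLS for the attendee subsample
# (assumed column names and entry rule; not the authors' code).
import pandas as pd
import statsmodels.formula.api as smf

def forward_stepwise(df, dep, candidates, alpha=0.05):
    """Repeatedly add the candidate with the smallest p-value below alpha."""
    selected = []
    while True:
        best_var, best_p = None, alpha
        for var in (v for v in candidates if v not in selected):
            formula = f"{dep} ~ {' + '.join(selected + [var])}"
            p = smf.ols(formula, data=df).fit().pvalues[var]
            if p < best_p:
                best_var, best_p = var, p
        if best_var is None:
            break
        selected.append(best_var)
    rhs = " + ".join(selected) if selected else "1"
    return smf.ols(f"{dep} ~ {rhs}", data=df).fit()

attendees = pd.read_csv("students.csv").query("attended == 1")
model = forward_stepwise(
    attendees, "exam", ["gender", "summer_test", "formative1", "formative2"]
)
print(model.summary())  # coefficients, p-values, adjusted R-squared
```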


Table 7 shows the results of a stepwise linear regression with the accounting exam score as the dependent variable. We computed the variance inflation factors (VIFs) of all independent variables in all models, for the attendees (column two) and the non-attendees (column three). The VIFs ranged from 1.003 to 1.259, which indicates that multicollinearity is not a concern; VIF values over 10 are normally considered to indicate a high degree of collinearity between variables (Hair, Anderson, Tatham, & Black, 1998).
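The VIF check can be reproduced directly from the design matrix; the column names are again assumed.

```python
# Sketch: variance inflation factors for the predictors of the fullest model
# (assumed column names; a VIF above 10 would flag strong collinearity).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("students.csv").query("attended == 1")
X = sm.add_constant(df[["gender", "summer_test", "formative1", "formative2"]].dropna())

for i, name in enumerate(X.columns):
    if name != "const":  # skip the intercept column
        print(f"{name}: VIF = {variance_inflation_factor(X.values, i):.3f}")
```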
All regression models show that the summer school test score is significantly and positively related to the final exam score.
In particular, regression C shows that only the summer school test score significantly influences the accounting exam score
(p = 0.004). When the summer school test scores increase by one point out of 20, exam scores increase by 0.438 points out of
20. In this case, 7.8% of the variance in exam scores is explained by gender, the summer school test, and formative tests 1 and
2. The highest adjusted R2 value is found in regression B, where formative test 2 is not included. Here, 8.8% of the variance in
the exam score is explained by gender, the summer school test, and formative test 1. Again, the score on the summer school
test is significantly positively related to the exam score (p = 0.003). In regression A, where only gender and the summer school test are included, the adjusted R2 value amounts to 7.6%, again with a significant summer school test score (p = 0.003). In sum, all the regressions support H3. The test score at the end of summer school is positively related to the accounting exam
score. The results also show that the summer school test is a better predictor of accounting exam scores than formative test 1
or 2.
Table 7 also shows the results of the same analysis for non-attendees in the third column. Regression C, which has the
most complete set of variables, reveals that both formative test 1 (p = 0.000) and formative test 2 (p = 0.012) have a signif-
icant effect on accounting exam scores. The adjusted R2 value is 0.142, meaning that 14.2% of the variance in exam scores is
explained by gender and formative tests 1 and 2. When formative test 2 is not included, the adjusted R2 value drops to 11.8%.
Hence, for non-attendees, the scores on both formative tests 1 and 2 are significantly related to the exam scores, whereas, for
summer school attendees, the summer school test is a better predictor of the accounting exam score than either formative
test 1 or 2.

5. Discussion, limitations, and conclusion

5.1. Discussion

The first goal of this study is to examine the effect of summer school on formative and summative assessments in account-
ing education. A quasi-experiment was conducted with students in a first-year undergraduate accounting course during the
academic year 2018/2019. While controlling for other variables, we compare the scores on three tests (two formative online
tests and one summative exam) of the summer school attendees and non-attendees, to assess the impact of this intensive
learning program on performance. To refine the results, we distinguish between three ability groups: high, moderate, and
low. The second goal of this study is to investigate if the summer school test, taken at the end of summer school, that is,
before the semester starts, is related to exam performance at the end of the semester.
The summer school was organized as a voluntary supplementary week of study immediately before the start of the aca-
demic year. Only accounting topics were addressed and students took a formative test at the end of the summer school, dur-
ing class time. During the semester, the same content was covered again in a regular weekly course (but covering more
topics, with a more in-depth focus). In weeks 5 and 9, open-book formative assessment tests were administered (each counting for one of the 20 marks), while the final exam was organized as a closed-book exam in week 15.

Table 7
Results of stepwise linear regression with the accounting exam score as the dependent variable (B = unstandardized coefficient).

Dependent variable = accounting exam score

                          Summer school attendees              Non-attendees
Variables                 N     B       p-Value  Adj. R2       N     B       p-Value  Adj. R2
Regression A              137                    0.076
  Gender                        1.009   0.091
  Summer school test            0.445   0.003
Regression B              135                    0.088         266                    0.118
  Gender                        0.910   0.122                        0.013   0.824
  Summer school test            0.435   0.003
  Formative test 1              0.049   0.054                        0.353   0.000
Regression C              127                    0.078         249                    0.142
  Gender                        0.833   0.187                        0.015   0.795
  Summer school test            0.438   0.004
  Formative test 1              0.056   0.072                        0.299   0.000
  Formative test 2              0.010   0.674                        0.160   0.012

Note: The VIFs are all below 2.


Since attendance in the summer school was voluntary and all students took the same courses during the semester, this study can compare the results of attendees with those of non-attendees. Below, the results are discussed stepwise.
First, almost half of the students took the voluntary summer school in accounting. The group of attendees differs signifi-
cantly from the group of non-attendees in various aspects. Summer school attendees have less experience in accounting than
the non-attendees, whereas no difference is found in terms of their grades in secondary education. However, looking at the
attendees’ performance measures at university, we find they have higher ability (measured as the GPA for all the other
courses in the first semester) and higher scores on both formative test 1 and the final exam in accounting, which is explained
in detail in the next paragraph. Since the summer school was open to all students, we cannot conclude that it attracted especially underprivileged students, as reported in previous literature (Alexander et al., 2001; Cooper et al., 1996; Schacter & Jo, 2005). However, we can conclude that the summer school might be a good way to equalize students' prior knowledge before the start of the academic year (Rienties et al., 2005), as most attendees have less experience in accounting. Furthermore, the attendees of the summer school in our study did not differ in initial ability (measured as the percentage attained at high school). There is an abrupt transition between secondary school and higher education in terms of atmosphere and expectations (Declercq & Verboven, 2010). Summer school has the potential to smooth this transition (Brooman & Darwent, 2014; Byrne & Flood, 2005), which is confirmed by our results: attendees of the summer school obtained higher scores on the accounting tests and a higher first-semester GPA.
Second, based on previous studies, summer school was expected to have a beneficial impact on students’ learning. This
first hypothesis is supported: attendance has a positive influence on formative test 1 scores five weeks after the end of summer school. This beneficial effect on the first formative assessment test is surprising, since the attendees had significantly less prior knowledge of accounting from their high school curricula. Hence, attendees' decision to attend the summer school to prepare for the upcoming course appears justified. Additionally, offering a single week during the summer holiday seems to be very effective. This finding confirms the research of Gorard et al. (2015) and Inglis et al. (1993), who also find
summer school to have a beneficial effect. This effect can be explained by the fact that the attendees are better prepared for
the university atmosphere and already in study mode when the academic year starts, as explained by Inglis et al. (1993). The
authors find that students who take a summer school in management accounting (and thus have one fewer course during the semester) also do better in their other courses during the semester. Inglis et al. (1993) also show that attendees have more time during the semester, since they have to study for one fewer course than non-attendees. In the current study, this cannot be the explanation, since there is no difference in the number of courses between attendees and non-attendees. Furthermore, given the initial knowledge gap between attendees and non-attendees, the summer school does more than close that gap: it raises attendees' accounting knowledge above that of non-attendees. These positive outcomes may be the result of the intensive teaching approach (Crispin et al., 2016). To organize a summer school well, these authors advise scheduling intensive teaching in a period with a good balance between study and family obligations on the one hand, and between standard units (organized during the academic year) and intensive units on the other (Crispin et al., 2016). This is the case in our study: the summer school is the only course organized during the vacation, so it puts no pressure on family commitments and cannot interfere with other courses. Crispin et al. (2016) also argue that a course delivered in an intensive manner (like the summer school) should implement team-based activities and exercises, allowing students the opportunity to discuss concepts and to learn from each other's experiences. This is indeed the case in our format.
For us, the explanation for the beneficial effect of the summer school on performance (while controlling for GPA) could
also be linked to the fact that the summer school was offered as a voluntary course, which means that students had the
autonomy to decide whether or not to enroll in the summer school. Providing choice gives students control over the situation and their own behavior (Ryan & Deci, 2000). This sense of control is a central feature of self-determination theory (SDT). Based on SDT, choice is considered a practice that promotes the learner's need for autonomy (i.e., acting volitionally), one of the three basic psychological needs, alongside the needs for competence (i.e., experiences of mastery) and relatedness (i.e., a sense of belongingness) (Ryan & Deci, 2000). According to SDT, students gain a sense of control over the (first-year) situation when they are provided with choices (here, attending summer school or not). In addition, previous research in education settings has found that when instructors apply an autonomy-supportive teaching style (by implementing choice), they are more attuned to students' perspectives, which in turn makes them more open and responsive to students' relatedness and competence needs (Ryan & Deci, 2020). Subsequently, the fulfilment of these three needs enhances intrinsic motivation, an important determinant of student achievement (Hattie, 2009). It would be very interesting to investigate whether the same results would be obtained if students were randomly assigned to attend the summer school or not.


Third, the results of this study show that the beneficial effect of the summer school diminishes over time. A significant difference is no longer found for formative assessment test 2 or for the summative test (accounting exam) at the end of the semester, once the differences in GPA between attendees and non-attendees are taken into consideration. Hence, the hypothesis that attendance in summer school positively influences summative assessment scores is not supported by our data.
Summer school attendance thus has an effect, but only on the short-term formative assessment, not on the summative assessment at the end of the semester. This finding is contrary to the research of Inglis et al. (1993), who found an impact even after two years, much longer than the 15 weeks in our study. One explanation could be that their study involves an entire course, whereas, in our study, only the basic concepts of half of the main chapters are covered during the summer school. As the semester progresses, many more new topics are introduced that were not covered during summer school. In addition, Inglis et al. (1993) note that part of the significance could result from the different teaching approaches between summer school and the regular semester (seminars compared to tutorials and lectures). Other studies, such as
that of Schacter and Jo (2005) concerning reading abilities, find an effect in the short and longer term, but also that the influ-
ence diminishes over time. This is in line with our finding that the initial significant results in the very short term (five weeks
after summer school) are no longer found in the longer term (nine weeks or 15 weeks after summer school).
Fourth, the beneficial short-term impact of the summer school differs depending on the ability level of the student. When we split the sample into high-, moderate-, and low-ability students (based on their GPA for all first-semester courses except the accounting course), we find attendance in the summer school to have a significantly positive effect on formative test 1 only among low-ability students: attendees score higher than non-attendees. Similar to Rienties et al. (2005), this result demonstrates that it is important to encourage students, especially low-ability students, to attend the summer school. However, we believe that other students should also be encouraged to attend, since attendance in summer school does not yield any negative effects and prepares students for the academic setting. Previous literature confirms that a summer school can be beneficial for all students (Cooper et al., 1996; Schacter & Jo, 2005), since it provides opportunities to meet new friends in an informal environment, to obtain feedback on expectations of this new life, and to become familiar with the university environment.
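One plausible way to operationalize such an ability split is a tercile division of the GPA distribution, as the sketch below illustrates on synthetic data; the paper does not report its exact cut-offs, so both the column name gpa and the tercile rule are assumptions.

```python
# Hypothetical tercile split into ability groups; the cut-off rule and the
# synthetic GPA values are assumptions for illustration only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({"gpa": rng.uniform(8, 18, 497)})  # GPA excl. accounting

df["ability"] = pd.qcut(df["gpa"], q=3, labels=["low", "moderate", "high"])
print(df["ability"].value_counts())  # roughly equal-sized groups
```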
Fifth, the three formative assessment tests are good predictors of the summative assessment test (at the end of the semester). In particular, we find a positive correlation between the summer school test and accounting exam
scores. It seems that students who perform well during summer school (after four days at university) also perform well in the
final exam (in week 15). Additionally, we find that the scores of the formative tests in weeks 5 and 9 are positively correlated
with the final exam score. Hence, students score in a consistent way on the accounting assessments throughout the
semester.
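This consistency can be checked with pairwise Pearson correlations, as sketched below on synthetic data with hypothetical column names; with random scores the printed correlations will be near zero, whereas the study reports positive correlations of each test with the exam.

```python
# Hypothetical sketch of the correlation check among assessment scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 137
scores = pd.DataFrame({
    "summer_test": rng.uniform(0, 20, n),
    "formative1": rng.uniform(0, 20, n),
    "formative2": rng.uniform(0, 20, n),
    "exam": rng.uniform(0, 20, n),
})

# Correlation of each assessment with the final exam score.
print(scores.corr(method="pearson")["exam"].round(2))
```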
Lastly, in comparing the different formative assessment tests to explain the summative test at the end of the semester, the
following conclusions can be drawn. Linear regression analyses for attendees who took the summer school test reveal that
the scores on formative tests 1 and 2 provide no additional explanation for the accounting exam score. Only the summer
school test score turns out to be a significant predictor of the accounting exam score, offering support for H3. The fact that
formative tests 1 and 2 are open-book exams, whereas the summer school test is closed book, can be mooted as an expla-
nation. For the non-attendees, where no summer school test scores are available, the results show that formative tests 1 and
2 are significant predictors of the accounting exam score. Consequently, the higher students score on the tests (either the
summer school test or formative tests 1 and 2, depending on the group), the higher they score on the accounting exam
and the greater their chances of passing the course. Already at the end of summer school, it is clear to the lecturers which
students will require extra help to succeed. For those students who did not take the summer school test, this information is
obtained after formative tests 1 and 2. Consequently, the fact that the summer school test and tests 1 and 2 provide infor-
mation about the scores students are likely to obtain on the accounting exam is advantageous: it gives lecturers and students
a chance to start remediation as soon as possible, to increase the number of students who will pass the course. This could be
even more interesting in light of an open-gate system. In a higher education system where the transition to higher education
has no formal selection method or entrance criteria, there are large differences between incoming students in terms of
knowledge, attitudes, and skills. The summer school test and formative tests could take on a signaling function for the
instructors and students. Remediation and reorientation can thus start, even before the academic year begins.

5.2. Conclusion and recommendations

The goal of this study is to examine the effect of summer school on formative and summative assessments in accounting
education. We conclude that attendance in a summer school for accounting has a beneficial effect on students’ performance.
However, this effect is only observed for the first test period. We advise other instructors to implement a summer school, for
three reasons. First, our study shows that attendance in summer school has a positive effect on short-term performance.
Attending students appear to gain a head start at the beginning of the academic year. Second, a summer school can facilitate the transition between secondary and tertiary education. Summer school offers a safe learning environment in which students get to know the atmosphere of the institution, easing the move from smaller (secondary education)


groups to bigger groups (tertiary education). Third, summer school attendance can reduce the number of early school leavers
and increase passing rates among first-year undergraduates, since summer school can be seen as a test to assess students’
abilities for the discipline in question before the start of the academic year. Our study shows that a test during summer
school is a very good predictor of accounting performance. Of the 218 students who attended the summer school, 137 took the test. Given the predictive power of the summer school test, we advise all students attending the summer
school to take the test and advise educators to think about including a test as part of the summer school. This gives lecturers
and students the chance to start remediation as soon as possible and to increase the number of students who will pass the
course. This issue is of particular interest in an open-gate system. If a university has no selection procedure or intake exams,
the summer school test can offer an opportunity for immediate remediation or even reorientation.
In addition, we find that summer school attendees outperform non-attendees. Given that the summer school has a positive effect in our setting, one could ask whether it should be provided for all students. We believe that offering voluntary attendance is a good way to organize a summer school; we are not sure that the result would be the same if all students were forced to attend. Some students will always learn, no matter what they are offered, whereas others need the extra time during summer school to adapt and learn. Our study shows that especially lower-ability students are helped by this extra effort. We also believe in this market-based offer: by providing students a choice, only students who believe in the added value of the summer school are attracted to it. This results in only a limited investment by the instructors, which must also be weighed against the effect on student learning outcomes and on the transition between secondary and tertiary education.
Finally, since the performance measures for all three formative assessments are correlated with the summative assessment, each of these points in the semester can serve as a signal of student progress. This means that valuable feedback can be given to students from the summer school week onward.

5.3. Limitations and recommendations for future research

First, we conducted a quasi-experiment in which students could choose whether or not to attend the summer school. For ethical reasons and to offer all students the opportunity to attend, no randomization was applied. Consequently, there could be selection bias in the current study. Therefore, further research should consider control variables, beyond those represented in this study, that could mask or amplify the effect of summer school on formative and summative assessments. Adding more control variables might also increase the explanatory power of the models.
Note that the variance explained (adjusted R-squared) in the performance of the accounting exam is quite low, about 10%,
indicating that other factors related to the students and the educational context, not considered in this study, may affect
performance.
Second, prior experience in accounting and grades in secondary education are self-reported in the questionnaire. Given this limitation, the GPA of the first-semester courses (excluding the accounting course) is included in the analyses.
Third, data are gathered only for one first-year undergraduate accounting course at one university in one year. Conse-
quently, prudence is advised in generalizing the results. Additionally, the analyses conducted to refine the results with
regard to the positive influence of attendance in summer school on formative test 1 show a positive effect for the lower-
ability group. Further research could investigate which variables and benchmarks are best suited to define this group. A
pre-test for every student enrolled in the field of study could potentially give the best information. Marketing research
can identify the most effective methods to reach students and convince them to attend. Certainly lower-ability students need
to be targeted, since they experience higher short-term improvements in performance, but all students benefit in some ways
from attendance.
Fourth, this study investigates the beneficial effect of the summer school in both the short and long term, without empir-
ically investigating the underlying reasons. For instance, can the short-term effect be explained by the four days of extra
teaching during the summer school, by an earlier start on studying the material, or by both? The latter explanation is in line with the literature on procrastination. Academic procrastination means that students delay studying or completing academic assignments (Rotenstein et al., 2009). This is especially interesting for accounting students, since the importance of
regular homework and exercises in accounting classes cannot be overstated (Rotenstein et al., 2009). Without systematic
practice of the material covered in class, it is often difficult for students to master and retain the fundamental accounting
concepts and techniques.
Fifth, future research could involve intervention studies, dedicated to students with low scores in summer school tests.
These test results could flag potential problems and allow for the detection of students who need extra support to under-
stand the material.

Acknowledgements

We offer sincere appreciation to Pauline Heugebaert for her suggestions in the development of the paper. We also
acknowledge the helpful comments we received from the participants at the BAFA Conference in Ghent (2019).

Appendix. Questionnaire.


References

Aelterman, N., Vansteenkiste, M., & Haerens, L. (2018). Correlates of students' internalization and defiance of classroom rules: A self-determination theory perspective. British Journal of Educational Psychology.
Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2001). Schools, achievement, and inequality: A seasonal perspective. Educational Evaluation and Policy Analysis,
23(2), 171–191.
Arquero, J. L., Byrne, M., Flood, B., & Gonzalez, J. M. (2009). Motives, expectations, preparedness and academic performance: A study of students of accounting at a Spanish university. Revista de Contabilidad – Spanish Accounting Review, 12(2), 279–299. https://doi.org/10.1016/S1138-4891(09)70009-3.
Arthur, N., & Everaert, P. (2012). Gender and performance in accounting examinations: Exploring the impact of examination format. Accounting Education,
21(5), 471–487. https://doi.org/10.1080/09639284.2011.650447.
Bekman, S., Aksu-Koç, A., & Erguvanlı-Taylan, E. (2011). Effectiveness of an intervention program for six-year-olds: A summer-school model. European Early
Childhood Education Research Journal, 19(4), 409–431. https://doi.org/10.1080/1350293X.2011.623508.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–74.
Bloom, B. S. (1968). Learning for mastery. Evaluation Comment, 1(2), 1–12.
Booth, P., Luckett, P., & Mladenovic, R. (1999). The quality of learning in accounting education: The impact of approaches to learning on academic
performance. Accounting Education, 8(4), 277–300. https://doi.org/10.1080/096392899330801.
Brooman, S., & Darwent, S. (2014). Measuring the beginning: A quantitative study of the transition of higher education. Studies in Higher Education, 39(9),
1523–1541. https://doi.org/10.1080/03075079.2013.801428.
Byrne, M., & Flood, B. (2005). A study of accounting students’ motives, expectations and preparedness for higher education. Journal of Further and Higher
Education, 29(2), 111–124. https://doi.org/10.1080/03098770500103176.
Byrne, M., & Flood, B. (2008). Examining the relationships among background variables and academic performance of first year accounting students at an
Irish university. Journal of Accounting Education, 26(4), 202–212. https://doi.org/10.1016/j.jaccedu.2009.02.001.
Byrne, M., Flood, B., & Willis, P. (2002). The relationship between learning approaches and learning outcomes: A study of Irish accounting students.
Accounting Education, 11(1), 27–42. https://doi.org/10.1080/09639280210153254.
Cooper, H., Nye, B., Charlton, K., Lindsay, J. J., & Greathouse, S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66(3), 227–268. https://doi.org/10.3102/00346543066003227.
Crispin, S., Hancock, P., Male, S., Baillie, C., MacNish, C., Leggoe, J., et al. (2016). Threshold capability development in intensive mode business units. Education and Training, 58(5).
Declercq, K., & Verboven, F. (2010). Chances of success at Flemish universities: Time to adjust the policy? Leuven: Catholic University of Leuven.
Denton, C., Solari, E., Ciancio, D. J., Hecht, S., & Swank, P. (2010). A pilot study of a kindergarten summer school reading program in high-poverty urban
schools. The Elementary School Journal, 110(4), 423–439. https://doi.org/10.1086/651190.
Duff, A. (2004). Understanding academic performance and progression of first-year accounting and business economics undergraduates: The role of
approaches to learning and prior academic achievement. Accounting Education, 13(4), 409–430. https://doi.org/10.1080/0963928042000306800.
Egbert, J. (1922). University summer schools. New York City: Department of the Interior, Bureau of Education.
Ellis, B., & Sawyer, J. (2009). Regional summer schools: Widening learning opportunities through intensive courses. Education in Rural Australia, 19(1), 35–52.
Everaert, P., Opdecam, E., & Maussen, S. (2017). The relationship between motivation, learning approaches, academic performance and time spent.
Accounting Education, 26(1), 78–107. https://doi.org/10.1080/09639284.2016.1274911.
Fogarty, T., & Goldwater, M. (2010). Beyond just desserts: The gendered nature of the connection between effort and achievement for accounting students.
Journal of Accounting Education, 28(1), 1–12. https://doi.org/10.1016/j.jaccedu.2010.09.001.
Frymier, A., Shulman, G., & Houser, M. (1996). The development of a learner empowerment measure. Communication Education, 45(3), 181–199. https://doi.
org/10.1080/03634529609379048.
Garkaz, M., Banimahd, B., & Esmaeili, H. (2011). Factors affecting accounting students’ performance: The case of students at the Islamic Azad University.
Procedia - Social and Behavioral Sciences, 29, 122–128. https://doi.org/10.1016/j.sbspro.2011.11.216.
Gist, W. E., Goedde, H., & Ward, B. H. (1996). The influence of mathematical skills and other factors on minority student performance in principles of
accounting. Issues in Accounting Education, 11(1), 49–60.
Gorard, S., Siddiqui, N., & See, B. H. (2015). How effective is a summer school for catch-up attainment in English and maths? International Journal of
Educational Research, 73, 1–11. https://doi.org/10.1016/j.ijer.2015.07.003.
Guney, Y. (2009). Exogenous and endogenous factors influencing students’ performance in undergraduate accounting modules. Accounting Education, 18(1),
51–73. https://doi.org/10.1080/09639280701740142.
Hair, J. F., Jr., Anderson, R. E., Tatham, R. L., & Black, W. C. (1998). Multivariate data analysis (3rd edition). Upper Saddle River, NJ: Prentice-Hall International.
Harlen, W. (2005). Teachers' summative practices and assessment for learning – tensions and synergies. Curriculum Journal, 16(2), 207–223. https://doi.org/10.1080/09585170500136093.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge.
Ho, H., & Karagiannidis, V. (2007). Summer school teaching and learning: Some thoughts from undergraduate business students. College Quarterly, 10(2).
Inglis, R., Broadbent, A., & Dall’alba, G. (1993). Comparative evaluation of a teaching innovation in accounting education: Intensive learning in a seminar
format. Accounting Education, 2(3), 181–199. https://doi.org/10.1080/09639289300000027.
Issroff, K., & Eisenstadt, M. (1997). Evaluating a virtual summer school. Journal of Computer Assisted Learning, 13(4), 245–252. https://doi.org/10.1046/j.1365-
2729.1997.00027.x.
Jones, C., & Hassall, T. (1997). Approaches to learning of first year accounting students: Some empirical evidence. In G. Gibbs & C. Rust (Eds.), Improving
student learning through course design (pp. 431–438). Oxford: The Oxford Centre for Staff and Learning Development.
Jones, B., & Iredale, N. (2006). Developing an entrepreneurial life skills summer school. Innovations in Education and Teaching International, 43(3), 233–244.
https://doi.org/10.1080/14703290600618522.
Katz, I., & Assor, A. (2007). When choice motivates and when it does not. Educational Psychology Review, 19(4), 429–442. http://www.jstor.org/stable/23363842.
Koh, M. Y., & Koh, H. C. (1999). The determinants of performance in an accountancy degree programme. Accounting Education, 8(1), 13–29. https://doi.org/
10.1080/096392899331017.
Lewis, L. K., & Hayward, P. A. (2003). Choice-based learning: Student reactions in an undergraduate organizational communication course. Communication
Education, 52(2), 148.
Liu, Q.-X., Fang, X.-Y., Wan, J.-J., & Zhou, Z.-K. (2016). Need satisfaction and adolescent pathological internet use: Comparison of satisfaction perceived online
and offline. Computers in Human Behavior, 55, 695–700.
Lütgendorf-Caucig, C., Kaiser, P., Machacek, A., Waldstein, C., Pötter, R., & Loeffler-Stastka, H. (2017). Vienna Summer School on Oncology: How to teach
clinical decision making in a multidisciplinary environment. BMC Medical Education, 17(100). https://doi.org/10.1186/s12909-017-0922-3.
Marriott, P. (2009). Students’ evaluation of the use of online summative assessment on an undergraduate financial accounting module. British Journal of
Educational Technology, 40(2), 237–254. https://doi.org/10.1111/j.1467-8535.2008.00924.x.
Mathes, P. G., & Denton, C. A. (2002). The prevention and identification of reading disability. Seminars in Pediatric Neurology, 9(3), 185–191. https://doi.org/
10.1053/spen.2002.35498.
McCombs, J. S., Pane, J. F., Augustine, C. H., Schwartz, H. L., Martorell, P., & Zakaras, L. (2014). Ready for fall? Near-term effects of voluntary summer learning programs on low-income students' learning opportunities and outcomes. Santa Monica, CA: RAND Corporation.


Naser, K., & Peel, M. J. (1998). An exploratory study of the impact of intervening variables on student performance in a principles of accounting course.
Accounting Education, 7(3), 209–223. https://doi.org/10.1080/096392898331153.
Paver, B., & Gammie, E. (2005). Constructed gender, approach to learning and academic performance. Accounting Education, 14(4), 427–444. https://doi.org/
10.1080/06939280500347142.
Phillips, J. F. (2015). Accounting majors finish first–results of a five-year study of performance in introductory accounting. The Accounting Educators’ Journal,
25, 25–38.
Pinxten, M., Marsh, H. W., De Fraine, B., van den Noortgate, W., & Van Damme, J. (2014). Enjoying mathematics or feeling competent in mathematics?
Reciprocal effects on mathematics achievement and perceived math effort expenditure. British Journal of Educational Psychology, 84, 152–174.
Pinxten, M., van Soom, C., Peeters, C., et al. (2019). At-risk at the gate: Prediction of study success of first-year science and engineering students in an open-admission university in Flanders—any incremental validity of study strategies? European Journal of Psychology of Education, 34, 45–66.
Ramaprasad, A. (1983). On the definition of feedback. Behavioural Science, 28, 4–13.
Rienties, B., Rehm, M., & Dijkstra, J. (2005). Remedial online teaching in theory and practice online summer course: Balance between summer and course.
Industry and Higher Education, 20(5), 327–336. https://doi.org/10.5367/000000006778702300.
Roberts, T., Jackson, C., Mohr-Schroeder, M. J., et al. (2018). Students' perceptions of STEM learning after participating in a summer informal learning experience. International Journal of STEM Education, 5(35). https://doi.org/10.1186/s40594-018-0133-4.
Roderick, M., Nagaoka, J., Coca, V., & Moeller, E. (2008). From high school to the future: Potholes on the road to college. Research report retrieved from the University of Chicago Consortium on Chicago School Research website: https://consortium.uchicago.edu/sites/default/files/publications/CCSR_Potholes_Report.pdf.
Rotenstein, A., Davis, H., & Tatum, L. (2009). Early birds versus just-in-timers: The effect of procrastination on academic performance of accounting students. Journal of Accounting Education, 27(4), 223–232.
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67. https://doi.org/10.1006/ceps.1999.1020.
Ryan, R. M., & Deci, E. L. (2020). Intrinsic and extrinsic motivation from a self-determination theory perspective: Definitions, theory, practices, and future directions. Contemporary Educational Psychology, 61, 101860. https://doi.org/10.1016/j.cedpsych.2020.101860.
Schacter, J., & Jo, B. (2005). Learning when school is not in session: A reading summer day-camp intervention to improve the achievement of exiting first-
grade students who are economically disadvantaged. Journal of Research in Reading, 28(2), 158–169. https://doi.org/10.1111/j.1467-9817.2005.00260.x.
Scriven, M. (1967). The Methodology of Evaluation. In R. Tyler, R. Gagné & M. Scriven (Eds.), Perspectives of Curriculum Evaluation: AERA Monograph Series on
Curriculum Evaluation 1 (pp. 39–83). Chicago: Rand McNally.
Torgesen, J. (2000). Individual differences in response to early interventions in reading: The lingering problem of treatment resisters. Learning Disabilities
Research & Practice, 15(1), 55–64. https://doi.org/10.1207/SLDRP1501_6.
Watty, K., de Lange, P., Carr, R., O’Connell, B., Howieson, B., & Jacobsen, B. (2013). Accounting students’ feedback on feedback in Australian universities:
They’re less than impressed. Accounting Education, 22(5), 467–488. https://doi.org/10.1080/09639284.2013.823746.
Xu, Y., & De Arment, S. (2017). The effects of summer school on early literacy skills of children from low-income families. Early Child Development and Care,
187(1), 89–98. https://doi.org/10.1080/03004430.2016.1151419.
Xu, X., Padilla, A. M., & Silva, D. (2014). The time factor in Mandarin language learning: The four-week intensive versus the regular high school semester. The
Language Learning Journal, 42(1), 55–66. https://doi.org/10.1080/09571736.2012.677054.
Zvoch, K., & Robertson, M. C. (2017). Multivariate summer school effects. Studies in Educational Evaluation, 55, 145–152. https://doi.org/10.1016/j.
stueduc.2017.10.003.
Zvoch, K., & Stevens, J. J. (2013). Summer school effects in a randomized field trial. Early Childhood Research Quarterly, 28(1), 24–32.
