A Repository of Modern Software Testing Laboratory Courseware –
An Experience Report

Vahid Garousi
Software Quality Engineering Research Group (SoftQual)
Department of Electrical and Computer Engineering, Schulich School of Engineering
University of Calgary, Alberta, Canada
vgarousi@ucalgary.ca

Abstract
In order to effectively teach software testing students how to test real-world software, the software tools, exercises, and lab projects chosen by testing educators should be practical and realistic. However, there is not much publicly-available, realistic testing courseware for software testing educators to adapt and customize. Even the existing testing lab exercises developed and/or used by educators have various drawbacks, e.g.: (1) they are not usually kept up-to-date with the most recent testing tools and technologies, e.g., JUnit; (2) they are not built based on realistic/real-world Systems Under Test (SUTs), but rather use “toy” examples (SUTs). The above needs were the main motives for the author and his team at the University of Calgary to modernize the lab exercises of an undergraduate software testing course. This paper presents the designed lab courseware and the experiences learned from using it at the University of Calgary. It is hoped (and expected) that other software testing educators will adopt this laboratory courseware and find it useful for their instruction and training needs.

1. Introduction
In order to effectively teach software engineering students how to solve real-world
problems, the software tools, exercises, projects and assignments chosen by testing
educators should be practical and realistic.
In the context of software testing education, the above need implies the use of realistic and relevant Systems Under Test (SUTs) and of realistic commercial testing tools. Otherwise, the skills that students learn in those courses will not prepare them to test large-scale industrial software systems after graduation.
In this paper, the choices of testing tools and SUTs used in seven selected software testing courses (referred to simply as “courses” hereafter) delivered at a few North American universities are first discussed (Section 2). Observations are made w.r.t. teaching and student learning, based on the size and complexity of the SUTs and the testing tools used by students. After extensive research, it was observed that there is not much publicly-available testing courseware for software testing educators to adapt and customize. Even the existing testing lab exercises developed and/or used by educators have various drawbacks, e.g.:
• They are not usually kept up-to-date with the most recent testing tools and technologies, e.g., JUnit
• They are not built based on realistic/real-world Systems Under Test (SUTs), but rather use “toy” examples (SUTs)

The above needs were the main motives for the author and his team at the University of
Calgary to modernize and re-design the labs of an undergraduate software testing course. The
designed lab courseware is presented in Section 3. Experiences learned from using the lab courseware at the University of Calgary are presented in Section 4. Finally, Section
5 concludes this paper.

2. Testing Lab/Activity Alternatives Currently in Use, and Observations


The choices of testing tools and SUTs in seven randomly-selected courses [1-7] are
shown in Table 1. Note that a (computer-aided) testing tool in this study is defined as a
tool which assists testers with different test activities, such as test case determination,
test data generation, test execution, test (coverage) measurement and monitoring, and
test evaluation [8].
According to Table 1, it is interesting to observe that the courses differ in terms of both the breadth and depth of the practical tools they use and also in the scale of the SUTs used by students. The courses in [1, 3, 5] involve students solely in practical unit testing using JUnit and CPPUnit, but not in other testing levels, e.g., GUI testing. Furthermore, a quick look at Table 1 reveals that JUnit is the most popular testing tool used in this list of courses.
According to the syllabus of the Florida Institute of Technology’s course [2],
and to the best of the author’s knowledge, no specific testing tool was used in that
course. Students apparently used the black-box techniques manually (without the help
of any testing tool).
Table 1 - The choices of testing tools and Systems Under Test (SUTs) in seven randomly-selected testing courses in North America.

Carleton University [1]
   Testing tool(s): JUnit
   SUT(s): A simple hypothetical landing gear safety system, and an Automatic Teller Machine

Florida Institute of Technology [2]
   Testing tool(s): None (students apparently used the black-box techniques manually)
   SUT(s): Presentation tool, part of the OpenOffice suite

University of Waterloo [3]
   Testing tool(s): JUnit, CPPUnit
   SUT(s): A VoIP application developed by students in a previous course

Purdue University [4]
   Testing tool(s) and corresponding SUT(s):
   • JUnit: a simple Triangle class
   • xSuds (coverage analysis tool): a 73-LOC C program
   • Telcordia AR Greenhouse Efficient Test Case Generation Service (AETG): a hypothetical library checkout system
   • JMeter: www.google.com
   • ProteumIM-2.0 (mutation tool): a 49-LOC C program

University of Maryland [5]
   Testing tool(s): JUnit
   SUT(s): A small-scale office suite called TerpOffice (developed at UMD)

Queen's University [6]
   Testing tool(s): Tests were automated using Unix command-line scripts
   SUT(s): A small library book loan system

University of Calgary (by the author) [7]
   Testing tool(s): Various open-source or commercial tools (one tool per student), e.g., Bugzilla, JUnit, CodeCover, IBM Rational Functional Tester, MuClipse
   SUT(s): Various open-source or commercial systems (one SUT per student), e.g., a realistic ATM software, JFreeChart, Plazma ERP + CRM tool
According to Table 1, some of the courses use small-scale (“toy”) or hypothetical SUTs in their exercises. For example, in Purdue University’s course [4], students learn code coverage through a lab exercise by using a lesser-known tool called xSuds to test a 73-LOC C program (SUT). It seems to the author that SUTs of this scale will not prepare students well enough for real-world, large-scale industrial settings.
In Queen's University’s testing course [6], students automated their test cases using Unix command-line scripts. However, it seems much better to use industry-standard testing frameworks (e.g., JUnit) instead of ad-hoc Unix scripts.
Also, some of the courses (such as Purdue’s [4]) use simple, research-oriented testing tools, e.g., ProteumIM-2.0 (a mutation tool). Perhaps it is a better idea to adopt and use more recent and powerful tools instead, e.g., MuClipse, a mutation testing plug-in for the Eclipse IDE.
In terms of breadth of tools used, the courses at Purdue University [4] and the
University of Calgary (taught by the author of this article) [7] seem to be more
comprehensive than the other five courses. Purdue’s course involves the students in
working with five different testing tools, each for a specific type of testing, e.g., JMeter
for performance testing, and xSuds for coverage analysis.

3. The Lab Courseware


As shown in the last row of Table 1, our goal was to design practical lab exercises that use real testing tools and realistic SUTs. For a 13-week term, we found that the following five labs provide students with good practical experience in testing. We also found, by experience, that this courseware imposes just the right workload on students, i.e., it does not overload them, while teaching them the most practical testing skills.
• Lab 1 - Introduction to Testing and Defect Tracking
• Lab 2 - Requirements-Based Test Generation
• Lab 3 - Code Coverage, Adequacy Criteria and Test Case Correlation
• Lab 4 - Functional GUI Testing
• Lab 5 - Mutation Testing and Analysis
Extensive effort was spent to prepare clear, concise, and “interesting” lab documents. Involving a student who had recently passed the course in the preparation of the lab documents helped a lot in this respect.
This entire repository is available online for test educators’ use for free under a Creative
Commons Canada License at www.softqual.ucalgary.ca/projects/testing_labs.

3.1. Lab 1- Introduction to Testing and Defect Tracking

The objectives of this lab are: (1) getting hands-on experience in ad-hoc and manual testing, (2) using industrial defect tracking practices and tools (Bugzilla), (3) comparing ad-hoc testing with manual and regression testing, (4) experiencing that ad-hoc testing is not always very effective, and (5) following defects through their lifecycle. The SUT was a realistic ATM simulation system [9]. The test support tool used is Bugzilla. A set of realistic defects was manually injected into the SUT.
To illustrate the pedagogy followed in designing the lab documents, the tables of contents for lab documents 1 and 2 are provided as an example in Table 2. The general design template in the lab documents is to start with an Introduction section. The main part of each lab document is “Instructions”, which first familiarizes students with what they are going to practice and learn. Then comes the main activity part, which is to be carried out by the students. The grading criteria and several useful appendices conclude the lab documents. The tone and style of the documents were designed carefully to keep students motivated while enjoying the learning experience.

Table 2 - Tables of contents for lab documents 1 and 2

Lab 1 document:
1 INTRODUCTION
  1.1 Objectives
  1.2 Group-work Specification
  1.3 Due Date and Late Marking Policy
  1.4 Pre-lab
    1.4.1 Definition of Ad-Hoc Testing
    1.4.2 Defect (Bug) Tracking Tool
    1.4.3 Example Repositories of Real Defects
    1.4.4 System Under Test
2 INSTRUCTIONS
  2.1 Familiarization with the System Under Test
    2.1.1 Perform a deposit
  2.2 Familiarization with Bugzilla
    2.2.1 Logging in
    2.2.2 Reporting a defect
    2.2.3 Displaying all defects for a product
    2.2.4 Editing a product
  2.3 Ad-Hoc Functional Testing
  2.4 Manual Functional Testing
  2.5 Defect Correction Verification and Regression Testing
  2.6 Summary
3 DELIVERABLES AND GRADING
  3.1 Bugzilla Defect Reports (30%)
  3.2 Lab Report (70%)
4 INTERESTING QUOTES AND WEB LINKS
5 ACKNOWLEDGEMENTS
6 REFERENCES
7 APPENDIX A – USER INTERFACE FOR ATM SIMULATION SYSTEM
8 APPENDIX B – DEFECT LIFECYCLE FROM INITIAL REPORT TO CLOSED
9 APPENDIX C – REQUIREMENTS FOR THE ATM SIMULATION SYSTEM
  9.1 High Level Requirements
  9.2 Use Case Diagram
10 APPENDIX D – GUIDELINES FOR REVIEWING A DEFECT REPORT
  10.1 First impressions
  10.2 Replicate the Report
  10.3 Follow-Up Tests
  10.4 Tester’s Speculation or Evaluation
11 APPENDIX E – FUNCTIONAL TEST SUITE

Lab 2 document:
1 INTRODUCTION
  1.1 Objectives
  1.2 This lab is a Group work
  1.3 Due Date and Late Marking Policy
  1.4 Overview
  1.5 Testing Tools
  1.6 System Under Test
    1.6.1 Purpose of the System
    1.6.2 Usage of the System
2 INSTRUCTIONS
  2.1 Familiarization
    2.1.1 Create an Eclipse Project
    2.1.2 Create a Simple JUnit Test
    2.1.3 Navigate Javadoc API Specifications
  2.2 Test Suite Generation
3 SUMMARY
4 DELIVERABLES AND GRADING
  4.1 JUnit Test Suite (40%)
  4.2 Lab Report (50%)
  4.3 Demonstration of Results (10%)
5 ACKNOWLEDGEMENTS
6 REFERENCES
7 APPENDIX A – ASSERTIONS AVAILABLE IN JUNIT
8 APPENDIX B – JAVADOC EXAMPLE


3.2. Lab 2- Requirements-Based Test Generation

The objectives of this lab are twofold: (1) deriving unit tests from Javadoc API specifications, and (2) using JUnit. The SUT chosen is JFreeChart, a free open-source library for creating charts on the Java platform.
The SUT (JFreeChart) is in fact a real, large-scale piece of software, consisting of 177,450 LOC in 583 classes. To keep students’ workload manageable, the lab document asks students to use the Javadoc API specifications provided to them and to develop JUnit test cases for all five methods of a selected class and for one method of another class.
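For illustration only, the following sketch shows what such a Javadoc-derived JUnit 4 test class might look like, assuming JFreeChart’s org.jfree.data.Range class as the class under test; the particular class, methods, and expected values are chosen here purely for illustration and are not necessarily those assigned in the lab.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.jfree.data.Range;
import org.junit.Test;

// Requirements-based unit tests derived from the (paraphrased) Javadoc of
// org.jfree.data.Range, one possible JFreeChart class under test.
public class RangeTest {

    // Javadoc (paraphrased): getLength() returns the length of the range.
    @Test
    public void lengthIsUpperBoundMinusLowerBound() {
        Range range = new Range(2.0, 6.0);
        assertEquals(4.0, range.getLength(), 0.000001);
    }

    // Javadoc (paraphrased): contains(value) returns true if the range
    // contains the specified value.
    @Test
    public void containsIsInclusiveAtBothBoundaries() {
        Range range = new Range(2.0, 6.0);
        assertTrue(range.contains(2.0));
        assertTrue(range.contains(6.0));
        assertFalse(range.contains(6.0001));
    }

    // Javadoc (paraphrased): constrain(value) returns the value within the
    // range that is closest to the specified value.
    @Test
    public void constrainClampsValuesOutsideTheRange() {
        Range range = new Range(2.0, 6.0);
        assertEquals(6.0, range.constrain(9.0), 0.000001);
        assertEquals(2.0, range.constrain(-1.0), 0.000001);
        assertEquals(3.5, range.constrain(3.5), 0.000001);
    }
}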

3.3. Lab 3-Code Coverage, Adequacy Criteria and Test Case Correlation

The SUT is again JFreeChart, and students are asked to enhance the code coverage of the test suites they developed in lab 2. The testing tools are JUnit, CodeCover and CoverLipse; the latter two are different code coverage plug-ins for the Eclipse IDE. The reason two code coverage tools are used is that only CoverLipse supports data-flow coverage metrics; CodeCover, however, is more user-friendly and more reliable to work with.
To keep students’ workload manageable, the lab document asks students to enhance their lab 2 test suites to achieve the following measures for each of the classes under test: 90% statement coverage, 70% branch coverage, 50% loop coverage, 60% condition coverage, and 50% uses (data-flow) coverage.
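To make the distinction between these coverage criteria concrete, the following sketch uses a small hypothetical method (not part of JFreeChart) to show how one test can leave a branch uncovered and how a single additional boundary test closes the gap; the method and tests are purely illustrative.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class CoverageIllustrationTest {

    // Hypothetical method, used only to illustrate coverage criteria;
    // it is not part of JFreeChart.
    static double safePercentage(double part, double total) {
        if (total == 0.0) {
            return 0.0;                     // only reached when total == 0
        }
        return 100.0 * part / total;
    }

    // With only this test, the "total == 0" branch and its return statement
    // are never executed, so statement and branch coverage stay below 100%.
    @Test
    public void coversOnlyTheNonZeroBranch() {
        assertEquals(25.0, safePercentage(1.0, 4.0), 0.000001);
    }

    // Adding this boundary test exercises the remaining branch, bringing
    // both statement and branch coverage of safePercentage() to 100%.
    @Test
    public void coversTheZeroTotalBranch() {
        assertEquals(0.0, safePercentage(3.0, 0.0), 0.000001);
    }
}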

The interesting part of this lab is to help students practically understand and
appreciate how black-box (lab 2) and white-box (lab 3) testing differ from and
complement each other.

3.4. Lab 4- Functional GUI Testing

After experimenting with code-based techniques in the last two labs, this lab
introduces students to GUI testing, a challenging but needed task in the software
industry. The SUT is an open-source Enterprise Resource Planning (ERP) and Customer
Relationship Management (CRM) software named Plazma. The testing tool is IBM Rational Functional Tester (version 7.0.1). Note that although this tool is commercial, university academics can get it for free for use in their teaching and research through the IBM Academic Initiative (www.ibm.com/university).
This lab asks students to record GUI test cases based on the requirements given in
the UML use cases of the SUT. They then should execute (play back) the test cases on
the same version of the SUT.

3.5. Lab 5- Mutation Testing and Analysis

Similar to labs 2 and 3, the SUT is still JFreeChart, and students apply the mutation testing technique to the test suites they created in labs 2 and 3 to assess the fault detection effectiveness of those test suites. The testing tool is MuClipse (a plug-in for the Eclipse IDE). The test adequacy criterion is to reach a specific mutation score.
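To illustrate the idea, the following sketch shows a hypothetical method (not part of JFreeChart), a typical mutant of the kind a mutation tool such as MuClipse can generate, and a boundary-value test that kills it; the method, mutant and test are purely illustrative.

import static org.junit.Assert.assertFalse;

import org.junit.Test;

public class MutationIllustrationTest {

    // Hypothetical method under test (not part of JFreeChart).
    static boolean isPositive(int x) {
        return x > 0;
    }

    // A typical relational-operator-replacement (ROR) mutant changes
    // "x > 0" into "x >= 0". Tests that only use values far from the
    // boundary (e.g., 5 and -5) leave this mutant alive.

    // This boundary test kills the mutant: the original method returns
    // false for 0, while the mutant would return true.
    @Test
    public void zeroIsNotPositive() {
        assertFalse(isPositive(0));
    }
}

Here, the mutation score is the ratio of killed mutants to generated, non-equivalent mutants; students strengthen their lab 2 and 3 test suites until this score reaches the specified threshold.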

4. Experience using the Lab Courseware


The lab courseware was used during the Winter 2009 term, in a fourth-year
undergraduate Software Reliability and Testing course (SENG 521) at the University of
Calgary. The course was taught by the author and had 27 students and 1 TA.
To show the students the importance of practical exercises (especially in software testing), the lab activities and reports were assigned 30% of the course’s final mark.
A template for lab reports was prepared for each lab and given to the students, who were asked to fill it out to prepare their lab reports. This template helped students be consistent, and made it easier for the students to develop their lab reports and for the instructor and the TA to mark them. An example is given in Figure 1.

Figure 1 - An example template for the Lab 1 report
The students were generally extremely happy with the lab exercises and found them very useful and good for their learning experience. Here is some of the students’ feedback, grouped by lab number:
• Lab 1:
o “This was a very realistic lab. It clearly involves work and concepts useful outside of school.
o This lab has shown the complications involved in conducting software testing, and that it is no trivial task.
o This lab was quite the learning adventure.
o Not only was I educated on writing a proper bug report, but I got some good experience in writing them.
o It also gave me the exposure to challenges with testing and learning the system efficiently (i.e., unclear requirements), which prepared me to deal with the future ones [systems].
o The lab instructions were well written and easy to follow.”
• Lab 2:
o Overall, the whole experience working with unit testing and bug reporting was
great. We learned how to test a class and execute unit testing using JUnit testing
based on requirements. It allowed us to learn the Java language again and the use
of Eclipse. It also allowed us to deal with inexplicit requirements and helped
prepare ourselves for future unit testing challenges.
o We learned the importance of peer review in creating effective test suites and bug reports. It also allowed us to put the idea of equivalence class analysis into real-situation practice. We used it in organizing our test cases and ensuring that we had
test adequacy with respect to requirements. We learned that boundary value
analysis is not always applicable when testing software since we found that the
equivalence class analysis gave us more benefit in what we were trying to
accomplish.
o This lab was extremely useful as far as thrusting us into the foray of integrated unit
testing, and the tools that we may be using in the future. It seems like this lab is
centered around a very pragmatic principle that can be used in industry, and is
therefore invaluable to us as students on the fringe of being deployed in the
software engineering and computer science industries.
o The lab itself was interesting, and a reasonable amount of work.
• Lab 3:
o Overall, this lab was a great experience as we got to learn to use the CodeCover
and Coverlipse tools. Also, we were able to get some practice in determining the
defined, c-use and p-use variables in each method we have worked with as well as
drawing the data flow graph and def-use pairs.
o Also, our knowledge of testing has been greatly enhanced while performing this
lab. Getting to know testing in a more structured and systematic way is more
effective in numerous aspects. Performing code coverage definitely has a great
impact in enhancing or creating more effective test suites.
o This lab perfectly replicates an industry situation where inadequate time is
available for testing. Hence, careful consideration was devoted in prioritizing
which test cases to develop.
o Finally, it is important to note that in this lab, we learned many important
practices in testing, and using tools to enhance the testing effort. This lab should
by all means be conducted as we felt as a team, this lab was one of the most
important labs we have done to date.
o The lab was very useful in allowing insight on how to test code when the code base
is known and gives us a contrast of views on how effective the black box testing
that was done in the previous lab was. The correlation between test cases was also
a useful tool in allowing identification of redundant tests that could be removed to
provide a much cleaner set of test cases as well.
o Both of us [group members] found that this lab helped us to get a real life
perspective of how testing strategies could be used to achieve coverage within a
piece of software. We both learnt a lot by using JUnit, Coverlipse and CodeCover
to do the tests and it will be a great skill to know for future testing needs. We both
found the CodeCover tool to be useful, although it was slow to execute.
o By writing tests for software that had defects in it, we got a realistic perspective on
how involved it is to write coverage tests on a piece of software, and the benefits of
unit testing in revealing these defects.
o We felt that, despite the pressures of other classes, there was enough time to
complete this lab, and it was easy enough to follow.
• Lab 4:
o It has been an enjoyable experience to run the automated test cases created for this
lab. The data and verification points have been another powerful feature that I think
is very useful.
o The lab is quite useful in teaching us how to use an automated test tool and gives
good insight on how it works. It was a quite interesting lab and has real
applicability to how these tools would be used to be able to save manpower on
trying to regression test an application.
o Considering the complexity of Rational Functional Tester, we were happy to be
able to learn a little about it outside of the industry, making this a very worthwhile
lab in spite of its difficulties.
• Lab 5:
o The lab was useful as it built on previous work done in lab 2 and lab 3. I really
enjoyed that this lab built on previous work, I would almost be tempted to suggest
that other labs (such as lab 4) be changed to incorporate this building approach.
Mutation is a very good way to test the effectiveness of a given test suite. The lab is
useful, if everything runs smoothly. MuClipse is also a powerful tool that saves the
developer/tester time to generate the different mutations for a given program.
o The lab was very educational when looking at mutation testing, and the manual
part was very good practice. The JUnit part was marred by very picky input
requirements and unexplained errors, but it was very convenient compared to
manual testing when it began working. The workload was acceptable.
o Overall, the whole experience with mutation testing was great. We learned to test
the suites to determine their effectiveness as well as creating mutants using a
mutation testing tool called MuClipse. We also learned how to work with
MuClipse. It allowed us to learn how to manually calculate mutation scores. It also
allowed us to understand the importance of mutation testing. It allowed us to understand the need for a mutation testing tool when executing a large number of mutants and test cases. This lab is another step in allowing us to see how effective
we are as testers. Effective testers produce effective test suites.
o This lab can be used in industry. It was extremely useful as it allows us to have
some experience with real situation mutation testing tools and mutation testing
process. Although, I believe we need more time to actually work with this tool and
understand the unexpected such as errors, redundant killed mutants, etc.

5. Conclusions
A repository of modern software testing laboratory courseware, and an experience report on using that courseware, were presented. The author hopes that other
software testing educators find this laboratory courseware useful for their instruction
and training needs.
This paper also briefly analyzed the practical exercises used in seven selected testing courses, and reported on lessons learned from a new laboratory courseware used at the University of Calgary, in which industry-strength testing tools and large-scale SUTs were employed. It is hard to generalize the findings and observations, but the author would recommend that testing educators align their choices of SUTs and tools with the ultimate goal of the particular course at hand, the type of students they are working with, and also the time and resources available to the students in the given course.
For example, since the city of Calgary has a very active testing industry, and it was expected that the author’s students would work in that industry soon after the course, the author decided to use industry-strength testing tools and large-scale SUTs in his course.
As another example, if a course is meant to train a company’s engineers who will conduct performance testing after the training, the instructor should clearly use relevant commercial performance testing tools, and SUTs similar to those the engineers will work on afterwards.

Acknowledgements

This work was partially supported by the Discovery Grant no. 341511-07 from the
Natural Sciences and Engineering Research Council of Canada (NSERC), and also the
Alberta Ingenuity New Faculty Award no. 200600673.

References
[1] S. Bashardoust, "Assignments of the course Software Quality Assurance (COMP 4004),
Carleton University," http://www.scs.carleton.ca/~sbtajali/4004/assign-4004.html, Last
accessed: Nov. 2008.
[2] C. Kaner and J. Bach, "Project specification of the course: Black Box Software Testing (CSE
3411), Florida Institute of Technology," http://www.testingeducation.org/BBST, Last
accessed: Nov. 2008.
[3] K. Kontogiannis, "Course project specification of course: Software Testing and Quality
Assurance (E&CE 453), University of Waterloo," http://swen.uwaterloo.ca/~kostas/ECE453-
06, Last accessed: Nov. 2008.
[4] A. Mathur, "Lab exercises of the course (CS 490M) Software Testing, Purdue University,"
http://www.cs.purdue.edu/homes/apm/courses/CS490M-fall06/, Last accessed: August 2008.
[5] A. M. Memon, "Course project specification of course: Fundamentals of Software Testing
(CMSC737), University of Maryland,"
http://www.cs.umd.edu/~atif/Teaching/Spring2008/CMSC737.html, Last accessed: Nov. 2008.
[6] M. Zulkernine, "Project component of course Software Quality Assurance (CISC 327),
Queen's University," http://research.cs.queensu.ca/~cisc327/, Last accessed: Nov. 2008.
[7] V. Garousi, "Course outline of Advanced Software Testing (SENG 607.22), University of
Calgary,"
http://enel.ucalgary.ca/~vgarousi/downloads/teaching/SENG_607_22/08_W_SENG_607_22_
Advanced_Software_Testing_Course_Outline.pdf, Last accessed: Nov. 2008.
[8] J. Wegener, R. Pitschinetz, K. Grimm, and M. Grochtmann, "TESSY - Yet Another
Computer-Aided Software Testing Tool?," Proc. of the 2nd European Int. Conf. on Software
Testing Analysis & Review, 1994.
[9] R. C. Bjork, "Example ATM Simulation System," http://www.math-
cs.gordon.edu/local/courses/cs211/ATMExample, Last accessed: October 2009.
