COS6028-B CW2 2022-23 - Supplementary


FACULTY OF ENGINEERING AND INFORMATICS

DEPARTMENT OF COMPUTER SCIENCE

COS6028-B Software Systems Design and Testing


Coursework 2
Weighting: 60%
Date Issued: 6 July 2023
Date Due: Friday 28 July 2023, 15:00

Learning outcomes
LO1: Demonstrate a critical awareness and ability to use practical techniques for software
development with regard to design, implementation and testing.
LO2: Demonstrate an understanding of practical challenges associated with the
development of a significant software system.
LO3: Demonstrate an ability to use testing methods appropriate in finding errors in different
stages of the software development life cycle.

Submission guidelines
• Submit your coursework to Canvas (Turnitin submission)
• Upload
1) PDF file containing your report:
• Please follow the structure (sections) given
2) ZIP archive containing ALL your code and JUnit tests
3) ZIP archive containing ALL your files related to Mutation Testing (any mutants
generated, plus any report/statistics files and screenshots)
4) Video recording of your demonstration (in .mp4, .mov or similar format)

Important note
Plagiarism is an offence against the University and it will not be tolerated. No plagiarised
work will receive a mark or credit. All cases in which plagiarism is suspected will be
investigated further according to the University’s policies on plagiarism. For the details see
http://www.bradford.ac.uk/library/help/plagiarism

This coursework consists of 3 parts. You are required to complete ALL the tasks,
answer ALL the questions, and prepare a video recording in which you
demonstrate your work.

Scenario (from CW1)


The University of Bradford has recently introduced MFA (multi-factor authentication). MFA
is an extra security measure to ensure secure access to accounts during the sign-in
process. This helps to keep our data safe. MFA implements a two-step verification process
which prompts you to approve your sign-in using a different method or on a different
device. Most people use their phone to do this (via an app notification, an authenticator
app, a code sent by a text message, or a call). This means we can be sure that the person
trying to access the University’s services is really you, and not somebody who has
managed to steal your account details - even if they have your password, the odds are that
they don't have your device, too.

PART A [40 points]

TASKS

1. Design [15 points]


Extend your design of the sign-in functionality using MFA delivered for CW1 and add a
cross-platform authentication dialogue for, e.g., Android, iOS, Windows, or web
applications. The dialogue will use different UI controls to render its window. Under various
platforms, these controls may look a little different, but they should still behave
consistently. It should not be necessary to rewrite the logic of the dialogue for each
platform.

Use suitable design patterns and follow relevant SOLID principles. Ensure that:
• The creator of the authentication dialogue and the UI controls of the platform are not
tightly coupled.
• It must be possible to use different types of the authentication dialogue for the various
platforms with minimal changes to the code that uses them.
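For illustration only, one way to satisfy these constraints is the Abstract Factory pattern. The sketch below is not a model answer; all names (DialogFactory, WindowsFactory, AuthDialog, etc.) are invented for the example:

```java
// Illustrative sketch: an Abstract Factory keeps the dialogue logic
// decoupled from platform-specific UI controls.

// Abstract product: platform-independent button behaviour.
interface Button {
    String render();   // returns a platform-specific representation
}

// Concrete products for two example platforms.
class WindowsButton implements Button {
    public String render() { return "[Windows button: Sign in]"; }
}
class WebButton implements Button {
    public String render() { return "<button>Sign in</button>"; }
}

// Abstract factory: creates a family of related controls.
interface DialogFactory {
    Button createButton();
}
class WindowsFactory implements DialogFactory {
    public Button createButton() { return new WindowsButton(); }
}
class WebFactory implements DialogFactory {
    public Button createButton() { return new WebButton(); }
}

// The dialogue logic depends only on the abstractions, so it is
// written once and reused unchanged on every platform.
class AuthDialog {
    private final DialogFactory factory;
    AuthDialog(DialogFactory factory) { this.factory = factory; }
    String show() { return factory.createButton().render(); }
}

public class Demo {
    public static void main(String[] args) {
        System.out.println(new AuthDialog(new WindowsFactory()).show());
        System.out.println(new AuthDialog(new WebFactory()).show());
    }
}
```

Because AuthDialog names only the DialogFactory and Button interfaces, adding a new platform means adding new classes rather than modifying existing ones.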

Provide the following:


1a. Class diagram illustrating all the classes and their relationships, together with a
description of the design patterns used.
1b. Rationale for the solution. What are the benefits of the chosen design patterns? Which
SOLID principles are supported and how?

2. Implementation [15 points]


Develop and demonstrate a Java application that implements your design. Provide a
working prototype for a platform of your choice, integrating the developments of CW1.

Provide the following:


2a. Java package containing all the classes, including a Demo class that demonstrates
how to create and use an authentication dialogue.
2b. Documentation of the package, including all the classes and the public methods of
their interfaces. Screenshots of the execution of the Demo class with associated
descriptions.

3. Questions [10 points]
3a. “Factory” is an ambiguous term that stands for a function, method, or class that
produces something. Most commonly, factories produce objects, but they may also
produce files, records in databases, etc. Discuss factory design patterns, their
pros and cons, and indicate in which scenarios they are useful. [5 points]

3b. Discuss the role of dependencies in a software architecture and explain why modules
should depend on abstract interfaces and not concrete implementations.
[5 points]
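As a small illustration of the theme of question 3b (the class names here are invented for the sketch, not prescribed): a high-level module that depends on an abstract interface can swap concrete implementations without being changed itself.

```java
// Hypothetical sketch of depending on abstractions rather than
// concrete implementations. CodeSender is the abstraction; the
// sign-in logic never names a concrete delivery mechanism.
interface CodeSender {
    String send(String code);
}
class SmsSender implements CodeSender {
    public String send(String code) { return "SMS: " + code; }
}
class AppSender implements CodeSender {
    public String send(String code) { return "App notification: " + code; }
}

// High-level module: depends only on the CodeSender interface.
class MfaService {
    private final CodeSender sender;
    MfaService(CodeSender sender) { this.sender = sender; }
    String challenge(String code) { return sender.send(code); }
}

public class DipDemo {
    public static void main(String[] args) {
        // Swapping the implementation requires no change to MfaService.
        System.out.println(new MfaService(new SmsSender()).challenge("123456"));
        System.out.println(new MfaService(new AppSender()).challenge("123456"));
    }
}
```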

PART B [40 points]

4. Unit Testing [20 points]


For all the classes implemented in Part A, provide JUnit tests. These should be included in
the Java project package, in a separate Tests folder and uploaded to Canvas, together
with the source code. The tests should aim to achieve very good code coverage
(statement and branch). Use a plugin to measure the code coverage and provide in the
report screenshots showing: (i) test report statistics (from JUnit runner) and (ii) the
percentage of code coverage achieved by the tests (measured by the plugin).
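For instance, a method containing a two-way branch needs at least one test per outcome to reach full branch coverage. The sketch below is illustrative only (CodeVerifier is an invented class) and uses plain assertions for self-containment; an actual submission would put the same assertions inside JUnit @Test methods and measure coverage with a plugin:

```java
// Hypothetical class under test: verifies a 6-digit MFA code.
class CodeVerifier {
    boolean isValid(String code) {
        if (code == null || code.length() != 6) {
            return false;                                  // branch 1
        }
        return code.chars().allMatch(Character::isDigit);  // branch 2
    }
}

// Plain-assertion stand-in for JUnit tests. Exercising both outcomes
// of every condition is what drives branch coverage towards 100%.
public class CodeVerifierTest {
    public static void main(String[] args) {
        CodeVerifier v = new CodeVerifier();
        if (!v.isValid("123456")) throw new AssertionError("valid code rejected");
        if (v.isValid("12345"))   throw new AssertionError("short code accepted");
        if (v.isValid("12a456"))  throw new AssertionError("non-digit accepted");
        if (v.isValid(null))      throw new AssertionError("null accepted");
        System.out.println("All tests passed");
    }
}
```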

5. Mutation Testing [20 points]
Use a mutation testing system to generate mutants, apply mutation analysis to evaluate
the quality of existing tests and then to improve them by designing new test cases.

5a. Briefly present the tool that you will use for mutation testing. For all the classes
implemented in Part A generate mutants using all the mutation operators available in the
mutation tool selected and provide a summary: for each mutation operator show the
number of generated mutants. Run the JUnit tests developed at Question 4 against the
mutants and report the number of live mutants vs. killed mutants and the mutation score
(percentage of killed mutants), including screenshots for all the statistics.
[5 points]

5b. Targeting the live mutants that are not equivalent, write additional JUnit tests to kill
them. Provide some examples of new test cases and the mutants that were detected by
them, explaining how mutation testing is helping to improve the quality of your tests. For
comparison, provide evidence of the effectiveness of the improved test suite (screenshots
with the new statistics: number of live vs. killed mutants, and the mutation score).
[10 points]
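To illustrate the idea (the class and the mutant below are invented for this sketch, not taken from any particular tool's output): a common "conditionals boundary" operator replaces `>=` with `>`, and only a test at the exact boundary value distinguishes the mutant from the original, i.e. kills it.

```java
// Hypothetical illustration of killing a mutant at a boundary.
class LoginGuard {
    static final int MAX_ATTEMPTS = 3;

    // Original: the account locks once attempts reach the maximum.
    boolean isLocked(int attempts) {
        return attempts >= MAX_ATTEMPTS;
    }

    // A "conditionals boundary" mutant would replace '>=' with '>'.
    // Shown here as a second method for comparison; a tool would
    // rewrite the original in place.
    boolean isLockedMutant(int attempts) {
        return attempts > MAX_ATTEMPTS;
    }
}

public class MutantDemo {
    public static void main(String[] args) {
        LoginGuard g = new LoginGuard();
        // Boundary test: passes on the original...
        System.out.println(g.isLocked(3));        // true
        // ...but fails on the mutant, so the mutant is killed.
        System.out.println(g.isLockedMutant(3));  // false
    }
}
```

Tests that only use inputs such as 0 or 10 would leave this mutant alive, which is exactly the gap mutation analysis exposes.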

5c. Which of the mutation operators used in your experiments seem most useful?
Which mutation operators generate the most equivalent mutants? Give some examples
of equivalent mutants and justify why they cannot be killed by any test. Are there any
lessons learnt from this experiment? [5 points]
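A classic example of an equivalent mutant (sketched here with invented code): in a loop that counts up by one from zero, replacing `i < n` with `i != n` changes the syntax but not the observable behaviour for any n ≥ 0, so no test can kill it.

```java
// Hypothetical equivalent mutant: for every n >= 0 the two loops
// execute identically, so no test can tell them apart.
public class EquivalentMutant {
    static int sumOriginal(int n) {
        int sum = 0;
        for (int i = 0; i < n; i++) sum += i;   // original condition
        return sum;
    }
    static int sumMutant(int n) {
        int sum = 0;
        for (int i = 0; i != n; i++) sum += i;  // mutated condition
        return sum;
    }
    public static void main(String[] args) {
        // Identical results on every non-negative input.
        System.out.println(sumOriginal(5) == sumMutant(5));  // true
    }
}
```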

PART C [20 points]

6. Demonstration [20 points]

Prepare a 5-minute video recording in which you demonstrate your prototype
implementation and the tests that you carried out.
