
Automated Software Testing

Major Project (CC4270) Report

Submitted in partial fulfillment of the requirements for the award of

Bachelor of Technology

in

Computer and Communication Engineering

By:

Dev Rohit Pandhare


(209303106)

Under the supervision of:

Dr. Sourabh Singh Verma

May 2024

Department of Computer and Communication Engineering


School of Computing and Intelligent Systems
Manipal University Jaipur
VPO. Dehmi Kalan, Jaipur, Rajasthan, India – 303007
Department of Computer and Communication Engineering
School of Computing and Intelligent Systems, Manipal University Jaipur,
Dehmi Kalan, Jaipur, Rajasthan, India- 303007

STUDENT DECLARATION

I hereby declare that this project, "Automated Software Testing", is my own work and that,
to the best of my knowledge and belief, it contains no material previously published or written
by another person, nor material which has been accepted for the award of any other degree or
diploma of the University or any other institute, except where due acknowledgement has been
made in the text.

Place: Jaipur
Date: 23 May 2024

Dev Rohit Pandhare
(209303106)
B.Tech. (CCE) 8th Semester

Department of Computer and Communication Engineering
School of Computing and Intelligent Systems, Manipal University Jaipur,
Dehmi Kalan, Jaipur, Rajasthan, India- 303007

Date: 23 May 2024

CERTIFICATE FROM SUPERVISOR

This is to certify that the work entitled “Automated Software Testing” submitted by
Dev Rohit Pandhare (209303106) to Manipal University Jaipur for the award of the degree
of Bachelor of Technology in Computer and Communication Engineering is a bonafide
record of the work carried out by him under my supervision and guidance from 01/02/2024 –
31/05/2024.

Dr. Sourabh Singh Verma


Internal Supervisor,
Department of Computer and Communication Engineering
Manipal University Jaipur

Project Completion Certificate

ACKNOWLEDGEMENT
I would like to express my sincere gratitude to the people who have supported and guided me
throughout my internship experience at Rubiscape Pvt. Ltd.

First and foremost, I am incredibly thankful to my esteemed mentor, Dr. Sourabh Singh
Verma, for providing me with the invaluable opportunity to intern at Rubiscape Pvt. Ltd.
Your guidance, encouragement, and support throughout the internship have been instrumental
in my learning and development. Your willingness to share your knowledge and expertise has
significantly enhanced my understanding of the Software Testing industry and its practices. I
am particularly grateful for your patience in explaining complex concepts and for always
being available to answer my questions. Your belief in my abilities and constant motivation
have been a driving force behind my success during this internship.

I would also like to extend my deepest appreciation to Dr. Sunil Kumar, Head of the
Department of Computer and Communication Engineering at Manipal University Jaipur.
Your support in allowing me to pursue this internship and for accommodating my academic
schedule has been crucial. Your encouragement to gain practical work experience has been
invaluable, and I truly appreciate your understanding during this time.

My sincere gratitude goes out to the entire team at Rubiscape Pvt. Ltd. for creating a
welcoming and supportive work environment. I am thankful for the opportunity to collaborate
with such talented individuals and learn from their expertise. Their willingness to share their
knowledge and answer my questions has been instrumental in my professional growth.

ABSTRACT

This internship report documents my contributions as a Software Testing Intern within the
Quality Assurance department at Rubiscape Pvt. Ltd. My primary objective was to facilitate
the company's transition from manual to automated software testing by establishing a robust
automation testing framework.

The report delves into the details of the framework development process. I implemented the
Page Object Model (POM) for structuring the framework code. This approach promotes code
reusability and maintainability and simplifies future test script development. The report
elaborates on the rationale behind this choice and its benefits for long-term framework
sustainability. Furthermore, the report details the utilization of Selenium WebDriver and
Pytest as the core tools for conducting automated tests. It explains the functionalities offered
by Selenium WebDriver, such as browser interaction and element manipulation, which are
crucial for simulating user actions during testing. Additionally, the report highlights the
advantages of using Pytest, a popular Python testing framework, for its simplicity, flexibility,
and ability to streamline test execution and reporting.

The report continues by outlining the development of multiple Python-based automated
testing scripts within the PyCharm IDE. These scripts demonstrate the framework's capability
to automate various testing scenarios, such as login functionality, form validation, and data
verification. Specific examples of the developed scripts can be referenced within the report to
showcase the framework's practical application.

The report summarizes the key learnings and achievements gained during the internship. It
emphasizes the enhanced understanding of software testing methodologies, including manual
and automated testing practices. Additionally, it highlights the importance of automation
testing frameworks for improving software quality, development efficiency, and reducing
overall testing time. Finally, the report acknowledges the significant contribution of this
framework in laying the groundwork for Rubiscape Pvt. Ltd.'s successful transition towards
automated software testing.

TABLE OF CONTENTS

Student declaration i
Certificate from Supervisor ii
Certificate from Company (For external students only) iii
Acknowledgement iv
Abstract v
List of figures vii

1. Introduction 1
2. Pre-requisites 3
3. Objective 5
4. Technologies Used 7
5. Methodology 8
6. Workflow Execution 17
7. Work Done 25
8. Future Scope 28
9. Conclusion 30
10. Bibliography 31

LIST OF FIGURES

Figure 1: Flowchart for dataset creation 10

Figure 2: Flowchart for File dataset creation 11

Figure 3: Flowchart for dashboard creation 13

Figure 4: Flowchart for table and schema validation 15

Figure 5: Flowchart for explore dataset 16

Figure 6: Open browser and navigate to URL 17

Figure 7: Navigated to landing page 17

Figure 8: Clicked on plus icon 18

Figure 9: Clicked on RubiConnect icon 18

Figure 10: Clicked on PostgreSQL 19

Figure 11: Select schema 19

Figure 12: Select tables 20

Figure 13: Navigated to landing page after dataset creation 20

Figure 14: After plus icon, click on RubiSight card 21

Figure 15: Select dataset for dashboard 21

Figure 16: Dashboard Created 22

Figure 17: Chart plotting 22

Figure 18: Plotted Chart 23

Figure 19: Applying Global Filter 23

Figure 20: Filter Configuration 24

Figure 21: After applying filters 24

INTRODUCTION

1.1 Internship Overview


This report details my internship experience as a Software Testing Intern at Rubiscape Pvt.
Ltd. The internship commenced on February 1st, 2024, and provided me with a valuable
opportunity to contribute to the company's Quality Assurance (QA) department. Throughout
this enriching experience, I actively participated in the development and implementation of an
automation testing framework, paving the way for Rubiscape's transition from manual to
automated software testing.

1.2 Company and Department Introduction


Rubiscape Pvt. Ltd. is a company specializing in data science and machine learning. During
my week-long induction program, I gained a comprehensive understanding of the company's
structure, operations, and core values. Following this period, I was officially assigned to the
QA department, where I began collaborating with experienced professionals in the field of
software testing.

1.3 The Rise of Automation Testing


The software development industry is constantly evolving, demanding efficient and reliable
testing practices. Manual testing, while a crucial approach, can become time-consuming and
prone to human error, especially with complex software applications. Recognizing this,
Rubiscape Pvt. Ltd. embarked on the important initiative of transitioning towards automated
software testing.

1.4 Internship Objective


My primary objective within the QA department was to support this transition by assisting in
the development of a robust automation testing framework. This framework would streamline
the testing process, minimize manual interventions, and ultimately enhance the overall quality
and efficiency of software development at Rubiscape Pvt. Ltd.

1.5 Tools and Technologies

1.5.1 Programming Language

Rubiscape Pvt. Ltd. utilizes Python as its primary programming language for both software
development and testing purposes. This decision was based on Python's extensive libraries,
readability, and ease of use, which proved beneficial for my internship experience.

1.5.2 Automation Testing Framework

A core component of my internship involved implementing a well-structured automation
testing framework. This framework is designed to facilitate the creation and execution of
automated testing scripts, enabling efficient testing of various software functionalities. The
specific details of the chosen framework and its benefits will be elaborated upon in the
following sections.

1.5.3 Selenium WebDriver and Pytest

Within the framework, I utilized Selenium WebDriver as a primary tool for simulating user
interactions with the web application under test. Selenium WebDriver allows me to control
web browsers, interact with web elements, and automate various actions necessary for testing
purposes. This enables the creation of robust test scripts that replicate real-world user
scenarios.

Furthermore, I leveraged Pytest, a popular Python testing framework, for its simplicity and
flexibility. Pytest streamlines the process of writing test cases, executing them, and generating
comprehensive reports that provide valuable insights into the testing process. These reports
facilitate the identification and analysis of any software defects encountered during testing.

1.5.4 Data-Driven Testing with Pandas

One of the key aspects of the automated testing framework involved data-driven testing. This
approach involves utilizing external data sources to provide test input for various scenarios. In
this instance, I employed the Pandas library, a powerful data manipulation tool in Python, to
effectively manage and extract test data stored within Excel spreadsheets. By utilizing Pandas,
I ensured that the developed testing scripts were adaptable to various test data sets, promoting
the framework's versatility and robustness.

PRE-REQUISITES

A successful internship in automation testing necessitates a strong foundation in the relevant
tools and techniques.

Prior to delving into the specifics of Rubiscape's automation testing framework development,
this section explores the three key prerequisites I actively mastered during the initial stages of
my internship: Page Object Model (POM), Selenium WebDriver, and Pytest.

2.1 Page Object Model (POM): Organizing the Testing Landscape


The Page Object Model (POM) serves as a design pattern that structures the interaction
between test scripts and the web application under test. Its core principle lies in creating a
separate class for each web page within the application. This class encapsulates all the
elements, functionalities, and actions specific to that particular page.

Benefits of POM:

Improved Maintainability: By isolating page elements and actions within dedicated classes,
the POM promotes code reusability and simplifies maintenance. Modifications to a web page
element only require changes within its corresponding class, minimizing the need to alter
numerous test scripts.

Enhanced Readability: POM fosters a clear separation of concerns, making the test code
more readable and understandable. Developers can easily identify the interactions with
specific pages, improving collaboration and code comprehension.

Reduced Test Script Redundancy: POM eliminates the need to repeatedly locate and
interact with web elements throughout the test scripts. This reduces code redundancy and
promotes efficient test script development.
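
To make the pattern concrete, the following is a minimal sketch of a page object for a hypothetical login page; the class name, locators, and method are illustrative placeholders rather than the actual Rubiscape classes.

from selenium.webdriver.common.by import By


class LoginPage:
    """Page object encapsulating the elements and actions of a login page."""

    # Locators live in one place, so a UI change only touches this class.
    USERNAME_INPUT = (By.ID, "username")        # illustrative locator
    PASSWORD_INPUT = (By.ID, "password")        # illustrative locator
    LOGIN_BUTTON = (By.XPATH, "//button[@type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        """Fill in the credentials and submit the login form."""
        self.driver.find_element(*self.USERNAME_INPUT).send_keys(username)
        self.driver.find_element(*self.PASSWORD_INPUT).send_keys(password)
        self.driver.find_element(*self.LOGIN_BUTTON).click()

A test script then calls LoginPage(driver).login(...) instead of repeating locator lookups, which is what keeps locator changes confined to a single class.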

2.2 Selenium WebDriver: Bridging the Gap Between Script and Browser
Selenium WebDriver acts as a powerful web automation framework that allows us to control
web browsers programmatically. It facilitates simulating user actions like clicking buttons,
entering text into fields, and navigating web pages.

Key Functionalities of Selenium WebDriver:

Browser Interaction: Selenium WebDriver enables us to launch various web browsers,
manipulate browser windows, and navigate through web pages.

Web Element Identification: It provides methods for locating specific web elements within a
page using various strategies such as ID, name, class name, or XPath.

User Action Simulation: Selenium WebDriver allows us to simulate user interactions with
web elements, including clicking buttons, entering text into input fields, and submitting
forms.

JavaScript Execution: We can leverage Selenium WebDriver to execute JavaScript code
within the web browser, allowing for more complex testing scenarios.
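
As a brief illustration, the snippet below exercises each of these capabilities against a placeholder page; the URL and element locators are illustrative, not taken from the Rubiscape application.

from selenium import webdriver
from selenium.webdriver.common.by import By

# Launch a Chrome browser, maximize it, and navigate to a page (placeholder URL).
driver = webdriver.Chrome()
driver.maximize_window()
driver.get("https://example.com/form")

# Identify web elements by ID or XPath and simulate user actions.
driver.find_element(By.ID, "name").send_keys("Automation Tester")
driver.find_element(By.XPATH, "//button[@type='submit']").click()

# Execute JavaScript inside the browser, e.g. to scroll to the bottom of the page.
driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")

driver.quit()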

2.3 Pytest: Streamlining Test Execution and Reporting


Pytest is a popular Python testing framework that offers a user-friendly and powerful
approach to writing, executing, and reporting on automated tests.

Advantages of Pytest:

Simple Test Definition: Pytest utilizes a convention-based approach for defining test cases,
making it easy to write clear and concise test scripts.

Flexible Test Discovery: Pytest automatically discovers test files whose names match
"test_*.py" or "*_test.py" and collects functions and methods prefixed with "test",
simplifying test organization and execution.

Comprehensive Reporting: Pytest generates detailed reports that provide information on test
execution status, passed and failed tests, and error messages. This aids in identifying and
debugging issues encountered during testing.

Fixtures for Shared Setup and Teardown: Pytest supports the use of fixtures, which are
functions that provide reusable setup and teardown functionalities for test cases, promoting
code reusability and improved test maintainability.
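
The short sketch below shows these ideas together: a fixture that provides shared browser setup and teardown, and a test function that Pytest discovers by its "test_" prefix. The URL and the title assertion are illustrative placeholders.

import pytest
from selenium import webdriver


@pytest.fixture
def driver():
    """Shared setup/teardown: start a browser for each test and close it afterwards."""
    driver = webdriver.Chrome()
    yield driver
    driver.quit()


def test_login_page_title(driver):
    # Discovered automatically because the function name starts with "test".
    driver.get("https://example.com/login")   # placeholder URL
    assert "Login" in driver.title

Running "pytest -v" from the project root collects and executes such tests and prints a detailed pass/fail summary.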

By mastering these three crucial tools – Page Object Model, Selenium WebDriver, and Pytest
– I equipped myself with the necessary foundation for actively contributing to Rubiscape's
automation testing framework development. The following sections will detail the specific
implementation of these tools within the framework and its application for efficient software
testing at Rubiscape Pvt. Ltd.

OBJECTIVES

My internship at Rubiscape Pvt. Ltd. within the Quality Assurance (QA) department was
focused on achieving a central objective: to establish a robust automation testing framework
and initiate the automation process for the company's software testing endeavors. This
objective aimed to streamline testing procedures, minimize manual interventions, and
ultimately elevate the overall quality and efficiency of software development at Rubiscape.

3.1 Framework Development: Setting the Stage for Automation


The primary objective involved actively participating in the development of a well-structured
automation testing framework. This framework would serve as the foundation for creating and
executing automated testing scripts, enabling efficient testing of diverse software
functionalities. Integral to this objective was the selection of an appropriate framework, the
design of a sustainable architecture, and the implementation of essential tools and libraries.

3.2 Collaborative Script Development: Working Together for Effective Testing
Another critical objective entailed fostering close collaboration with both software developers
and fellow QA testers. This collaboration was instrumental in developing comprehensive test
case scenarios suitable for automation. Through collaborative efforts, we identified key
functionalities, user interactions, and potential edge cases that necessitated thorough testing.
By leveraging the combined expertise of developers and QA testers, we ensured that the
developed automated testing scripts effectively covered a wide range of testing scenarios.

3.3 Initiating the Automation Journey: Taking the First Steps


A crucial objective involved embarking on the automation process itself. This entailed
utilizing the developed framework and the identified test case scenarios to create the initial set
of automated testing scripts. The objective focused on automating a selection of essential test
cases, demonstrating the framework's capability and paving the way for its future expansion.

3.4 Laying the Groundwork for Continuous Improvement


The overall internship objective aimed to establish a solid foundation for Rubiscape's
transition towards automated software testing. This included ensuring the framework's
flexibility and scalability, allowing for the incorporation of additional test cases and
functionalities as the software development process progresses. By laying this groundwork,
the objective aimed to empower the QA team to continuously enhance the automation
framework and its effectiveness in achieving comprehensive software quality assurance.

The internship objectives at Rubiscape Pvt. Ltd. revolved around setting up the infrastructure
for successful automated testing. Through the collaborative development of a robust
framework and the creation of initial testing scripts, this internship aimed to initiate a
significant shift in the company's testing practices, paving the way for improved quality and
efficiency in future software development endeavors.

TECHNOLOGIES USED

4.1 Development Machine: The Foundation for Automation


Hardware: The primary development platform for this internship project was a laptop
equipped with an Intel i5 processor (10th generation) and 8 GB of RAM. This configuration
provided sufficient processing power and memory resources for effectively running the
development environment, including the PyCharm IDE, web browsers for testing purposes,
and the execution of automated test scripts.

Software: The underlying operating system for the development machine was Windows.
While other operating systems are viable options for software development using Python,
Windows offered a familiar and comfortable environment for this project.

4.2 Integrated Development Environment (IDE): Streamlining the Development Workflow
PyCharm IDE: PyCharm, developed by JetBrains, served as the primary Integrated
Development Environment (IDE) for this internship. PyCharm offers comprehensive support
for Python development, including syntax highlighting, code completion, debugging tools,
and integration with various testing frameworks, including Pytest. These features significantly
enhanced the development process, promoting code quality, efficiency, and ease of
collaboration.

4.3 Automation Testing Framework Essentials


Selenium WebDriver: As previously mentioned, Selenium WebDriver played a pivotal role
in the automation testing framework. It functioned as the core library for interacting with web
browsers and simulating user actions during automated test execution. Selenium WebDriver's
capabilities for browser control, web element identification, and user action simulation were
instrumental in automating various software testing scenarios.

Pytest: Pytest, a popular Python testing framework, was employed to streamline the
automation testing process. Pytest's functionalities for test case discovery, execution, and
reporting proved invaluable. It facilitated the creation of clear and concise test scripts,
automated test execution, and the generation of comprehensive reports that provided insights
into the testing process and identified any errors encountered during test runs.

METHODOLOGY

This section details the methodology employed to develop an automation testing framework
for Rubiscape Pvt. Ltd. The framework utilizes Selenium WebDriver and Pytest to automate
various functionalities within the company's software, focusing on three key scenarios:

Creating an RDBMS Dataset via RubiConnect

Creating a File Dataset via RubiConnect

Creating a Dashboard via RubiSight

Flowcharts for each scenario are provided at the end of this section.

Tools and Libraries


The following tools and libraries were utilized for framework development and test script
creation:

Python 3.x: The programming language for developing the automation scripts.

Selenium WebDriver: A library for interacting with web browsers and simulating user
actions.

Pytest: A testing framework for writing, executing, and reporting test cases.

Pandas: A library for data manipulation and reading data from Excel sheets.

Openpyxl: A library for reading and writing data to Excel files.

Framework Design
The automation testing framework is designed with a modular approach to promote code
reusability and maintainability. Key components include:

Page Object Model (POM): Each web page within the Rubiscape application has a
dedicated Python class encapsulating all its elements and functionalities.

Base Test Class: A base class containing common functionalities like browser setup,
teardown, and login procedures.

Data Provider: A function that retrieves test data (username, password, dataset details) from
an Excel sheet using Pandas.

Test Script Development
1. Base Test Class:

The BaseTestClass serves as the foundation for all test scripts. It includes methods for:

Launching a web browser (e.g., Chrome) using Selenium WebDriver.

Maximizing the browser window.

Navigating to the Rubiscape login page URL.

Locating and interacting with login form elements (username, password) using POM.

Submitting the login form and verifying successful navigation to the landing page.
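
A hedged sketch of what such a base class could look like is shown below; the pages.login_page module, the URL, the credentials, and the landing-page check are assumptions for illustration, not the exact in-house implementation.

import pytest
from selenium import webdriver
from pages.login_page import LoginPage        # hypothetical POM module


class BaseTestClass:
    """Common setup and teardown shared by all test scripts."""

    @pytest.fixture(autouse=True)
    def setup(self):
        # Launch Chrome, maximize the window, and open the login page.
        self.driver = webdriver.Chrome()
        self.driver.maximize_window()
        self.driver.get("https://rubiscape.example/login")   # placeholder URL

        # Log in through the page object and verify landing-page navigation.
        LoginPage(self.driver).login("user", "password")      # placeholder credentials
        assert "landing" in self.driver.current_url
        yield
        self.driver.quit()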

2. Data Provider:

The get_data_pandas function fetches test data from an Excel sheet using Pandas. This
function returns a dictionary containing user credentials, dataset details (type, name, host,
etc.), chart selection, configurations, and filters based on the current test case.
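
A minimal sketch of such a data provider is given below; the workbook path, sheet name, and column headings are assumptions chosen for illustration.

import pandas as pd


def get_data_pandas(excel_path, sheet_name, test_case_id):
    """Return the row of test data for the given test case as a dictionary."""
    df = pd.read_excel(excel_path, sheet_name=sheet_name)
    row = df[df["TestCaseID"] == test_case_id].iloc[0]   # assumed ID column
    return row.to_dict()


# Example usage (illustrative file, sheet, and column names):
data = get_data_pandas("test_data.xlsx", "rdbms_dataset", "TC_001")
# data might then contain keys such as "username", "password", "host", "database".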

3. Specific Test Scripts:


Each scenario has a dedicated test script inheriting from the BaseTestClass. These scripts
utilize the data_provider function to access test data and implement the following
functionalities:

Scenario 1: Creating RDBMS Dataset (RubiConnect)

• After landing page navigation, locate and click the "plus" icon using POM.
• Click on the "RubiConnect" icon.
• Based on the dataset type retrieved from the Excel sheet (e.g., MySQL,
PostgreSQL), identify and click the corresponding RDBMS option.
• Locate input fields for name, host, username, password, and database and populate
them with values obtained from the Excel sheet.
• Click the "Test Connection" button.
• Verify the toaster message.
• If the message is "Database connection successful!", proceed to select schema and
tables for dataset creation.
• If the message is "Invalid credentials!", fail the test case.
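
Bringing these steps together, a Scenario 1 test script might look like the sketch below. The module and page-object names (base_test, data_provider, DatasetPage) and the toaster text handling are assumptions based on the steps above, not the verbatim Rubiscape code.

from base_test import BaseTestClass           # hypothetical module names
from data_provider import get_data_pandas
from pages.dataset_page import DatasetPage


class TestCreateRdbmsDataset(BaseTestClass):

    def test_create_postgres_dataset(self):
        data = get_data_pandas("test_data.xlsx", "rdbms_dataset", "TC_001")

        page = DatasetPage(self.driver)
        page.open_rubiconnect()                       # plus icon -> RubiConnect
        page.select_rdbms_type(data["dataset_type"])  # e.g. "PostgreSQL"
        page.fill_connection_details(
            name=data["name"], host=data["host"],
            username=data["username"], password=data["password"],
            database=data["database"],
        )
        page.click_test_connection()

        # Fail the test unless the success toaster appears.
        message = page.get_toaster_message()
        assert message == "Database connection successful!", message
        page.select_schema_and_tables(data["schema"], data["tables"])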

Figure 1: Flowchart for dataset creation

Scenario 2: Creating File Dataset (RubiConnect)
• Follow steps 1 and 2 from Scenario 1.
• Based on the file type retrieved from the Excel sheet (text, CSV, JSON, Excel),
identify and click the corresponding file dataset option.
• Locate input fields specific to the chosen file type (e.g., file path for text, delimiter
for CSV) and populate them with values obtained from the Excel sheet.
• Click the "Create Dataset" button.
• Verify successful creation of the file dataset based on UI elements or feedback
messages.

Figure 2: Flowchart for File dataset creation

Scenario 3: Creating Dashboard (RubiSight)
• Follow step 1 from Scenario 1.
• Click on the "plus" icon.
• Click on the "RubiSight" icon.
• Locate the "Create Dashboard" section and fill in details like dashboard name,
project name, and dataset (selected from the retrieved Excel data) using POM.
• Click the "Create Dashboard" button.
• Verify successful dashboard creation based on UI elements or feedback messages.
• Based on the chart selection retrieved from the Excel sheet, identify and click the
corresponding chart option (e.g., bar chart, line chart).
• Utilize the retrieved configurations from the Excel sheet to customize the chart
(e.g., axis labels, data series).
• Apply filters to the plotted chart based on the filters retrieved from the Excel sheet.

Figure 3: Flowchart for dashboard creation

Visual and Data Validations for Automation Scripts
These validations utilize a combination of visual comparisons and data verification to ensure
the successful execution of test cases and the creation of valid datasets.

Visual Validation with validate_visual Function

The scripts leverage a validate_visual function for visual validation. Here's an explanation of
its functionality:

• Baseline Capture: During the initial test run, the validate_visual function captures a
screenshot of the relevant webpage section after a successful action (e.g., dataset
creation). This screenshot serves as the baseline for future comparisons.
• Subsequent Test Runs: In subsequent test runs, the function captures a screenshot of
the same webpage section.
• Comparison: The function then compares the newly captured screenshot with the
baseline screenshot using image comparison algorithms. These algorithms identify
pixel-level differences between the images.
• Pass/Fail Criteria: If the identified differences fall within a predefined tolerance
threshold (e.g., minor layout adjustments), the test case passes. However, if
significant discrepancies exist, indicating a potential UI change or error, the test case
fails.

This approach ensures that the visual appearance of the application remains consistent after
performing actions like dataset creation. Deviations from the baseline could signify UI bugs
or unexpected behavior.
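
A minimal sketch of how such a validate_visual function could be implemented with Pillow's ImageChops is shown below; the baseline directory, function signature, and pixel-ratio tolerance are assumptions rather than the exact in-house implementation.

import os
from PIL import Image, ImageChops


def validate_visual(driver, name, tolerance=0.01, baseline_dir="baselines"):
    """Capture a screenshot and compare it pixel-wise against a stored baseline."""
    os.makedirs(baseline_dir, exist_ok=True)
    current_path = f"{name}_current.png"
    baseline_path = os.path.join(baseline_dir, f"{name}.png")
    driver.save_screenshot(current_path)

    if not os.path.exists(baseline_path):
        # First run: store the screenshot as the baseline and pass.
        os.replace(current_path, baseline_path)
        return True

    baseline = Image.open(baseline_path).convert("RGB")
    current = Image.open(current_path).convert("RGB")
    diff = ImageChops.difference(baseline, current)

    # Fraction of pixels that differ; pass if it stays below the tolerance threshold.
    differing = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    ratio = differing / (diff.width * diff.height)
    return ratio <= tolerance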

Data Verification for RDBMS Datasets


For RDBMS datasets created via RubiConnect, the scripts implement additional data
validations:

Schema and Table Verification:

• The script retrieves the schema and table names associated with the newly
created dataset from the application interface (e.g., by reading text elements or
extracting data from specific HTML attributes).
• These retrieved values are then compared to the corresponding schema and table
names stored in the Excel sheet used for test data.
• If the retrieved values match the Excel sheet data, the test case proceeds.
• If a mismatch occurs, the test case fails, indicating an issue with the dataset
creation process.

Explore Option and Data Visibility:

• After dataset creation, the script can optionally utilize the "Explore" feature
within RubiConnect.
• Clicking the "Explore" button opens a dedicated page for visualizing the dataset
contents.
• The script can capture screenshots of this "Explore" page, providing a visual
verification of data presence within the newly created dataset.
• The script can further analyze the HTML structure and content of the "Explore"
page to identify specific elements or attributes associated with data tables.
• Based on this analysis, the script can update the Excel sheet with a success or
failure message regarding the visibility of explored tables.
• In case of missing tables, the script can record the names of those tables within
the Excel sheet for further investigation.

This data verification ensures that the created RDBMS dataset contains the intended schema,
tables, and data, enhancing the reliability of the automation testing process.
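
A hedged sketch of these checks is given below: one helper compares the schema and table names shown in the UI against the expected values from the Excel sheet, and another writes the outcome back to the workbook with openpyxl. The CSS selectors, sheet name, and column positions are assumptions for illustration.

from openpyxl import load_workbook
from selenium.webdriver.common.by import By


def verify_schema_and_tables(driver, expected_schema, expected_tables):
    """Compare the schema/table names shown in the UI with the expected values."""
    shown_schema = driver.find_element(By.CSS_SELECTOR, ".dataset-schema").text
    shown_tables = {el.text for el in
                    driver.find_elements(By.CSS_SELECTOR, ".dataset-table-name")}
    missing = set(expected_tables) - shown_tables
    return shown_schema == expected_schema and not missing, missing


def record_result(excel_path, row_number, status, missing_tables):
    """Write the verification outcome back into the test-data workbook."""
    wb = load_workbook(excel_path)
    ws = wb["rdbms_dataset"]                                     # assumed sheet name
    ws.cell(row=row_number, column=10, value=status)             # assumed result column
    ws.cell(row=row_number, column=11, value=", ".join(missing_tables))
    wb.save(excel_path)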

Figure 4: Flowchart for table and schema validation

Figure 5: Flowchart for explore dataset

WORKFLOW EXECUTION

Figure 6: Open browser and navigate to URL

Figure 7: Navigated to landing page

Figure 8: Clicked on plus icon

Figure 9: Clicked on RubiConnect icon

Figure 10: Clicked on PostgreSQL

Figure 11: Select schema

Figure 12: Select tables

Figure 13: Navigated to landing page after dataset creation

Figure 14: After plus icon, click on RubiSight card

Figure 15: Select dataset for dashboard

Figure 16: Dashboard Created

Figure 17: Chart plotting

Figure 18: Plotted Chart

Figure 19: Applying Global Filter

Figure 20: Filter Configuration

Figure 21: After applying filters

WORK DONE

Daily Scrum Meetings: Collaborative Planning and Progress Tracking


A cornerstone of the agile development methodology at Rubiscape involved daily scrum
meetings. These brief, focused sessions served as a platform for:

• Understanding Project Goals: Each meeting began with a review of the overall project
objectives, ensuring team alignment and a clear understanding of priorities.
• Sharing Individual Tasks: Team members, including myself, shared their planned
tasks for the day, fostering transparency and collaboration.
• Identifying Roadblocks: The meetings provided an opportunity to discuss any
roadblocks encountered during the development or testing process. By collectively
brainstorming solutions, we strived to maintain efficient progress.
• Celebrating Achievements: Recognizing successful test script development or the
resolution of critical issues nurtured a positive and collaborative team environment.

Participation in these daily scrum meetings played a significant role in my integration into the
QA team, fostering clear communication, and keeping me informed about the project's
evolving landscape.

Knowledge Transfer Sessions: Learning from the Experts


Rubiscape prioritized knowledge sharing and continuous learning. This was evident through
the dedicated knowledge transfer sessions organized within the QA department. During these
sessions, I had the privilege of learning from experienced QA professionals:

• Understanding Rubiscape's Software: Senior team members provided in-depth insights
into the functionalities and intricacies of Rubiscape's software, equipping me with a
solid foundation for test script development.
• Automation Testing Best Practices: Experts shared valuable knowledge on best
practices for writing efficient and maintainable automation scripts. These sessions
covered topics like Page Object Model (POM) design, selecting appropriate testing
frameworks, and implementing robust test case validations.
• Exploring Testing Tools and Technologies: I gained exposure to various testing tools
and frameworks like Selenium WebDriver, Pytest, and tools for visual validation. This
knowledge broadened my understanding of the automation testing landscape.

These knowledge transfer sessions significantly accelerated my learning curve and
empowered me to contribute meaningfully to the development of the automation testing
framework.

Demos on New Developments: Staying Ahead of the Curve
The development team at Rubiscape embraced a culture of continuous innovation. To keep
the QA team abreast of the latest software developments, they regularly conducted
demonstrations of new features and functionalities. These demos were instrumental in:

• Understanding Testing Requirements: By witnessing the new features in action, I
could effectively identify the testing scope and develop targeted test cases for
comprehensive automation.
• Providing Early Feedback: The demos provided an opportunity to offer early feedback
on the functionality and user experience from a testing perspective. This early
collaboration helped ensure the delivery of high-quality software.
• Adapting Test Scripts: As new features were integrated into the software, the demos
aided in adapting existing test scripts or creating new ones to maintain comprehensive
test coverage.

Scripting for Efficiency: Building the Automation Framework


Equipped with the knowledge gained from the training sessions, I delved into the
development of automation scripts using Python, Selenium WebDriver, and Pytest. The focus
of my scripting efforts revolved around:

• Creating Reusable Components: Adhering to the Page Object Model (POM) design
principle, I developed reusable Python classes for interacting with various web page
elements within the Rubiscape application. This modular approach promoted code
maintainability and facilitated the efficient creation of complex test scripts.
• Automating Key Functionalities: Based on test cases identified in collaboration with
the QA team lead, I developed scripts to automate essential functionalities like user
login, dataset creation via RubiConnect (including RDBMS and file-based datasets),
and basic dashboard creation through RubiSight. These scripts streamlined the testing
process, reducing manual workloads and enhancing efficiency.
• Incorporating Data-Driven Testing: To ensure broader test coverage and flexibility, I
implemented data-driven testing practices. This involved utilizing Excel sheets to
store test data (user credentials, dataset details, etc.), enabling the scripts to execute
tests using various data sets without code modifications.
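
One common way to wire such Excel data into Pytest is through parametrization, sketched below; the workbook path, sheet name, and column names are assumptions, and the approach is shown as one possibility rather than the exact project setup.

import pandas as pd
import pytest

# Load every row of the test-data sheet once at collection time.
TEST_DATA = pd.read_excel("test_data.xlsx", sheet_name="rdbms_dataset").to_dict("records")


@pytest.mark.parametrize("data", TEST_DATA, ids=lambda d: str(d.get("TestCaseID")))
def test_dataset_creation(data):
    # Each Excel row becomes one test case without any code modification.
    assert data["username"], "username must be provided in the test data"
    # ... drive the dataset-creation flow with the values in `data` ...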

The process of script development proved to be both challenging and rewarding. It required a
blend of technical understanding, problem-solving skills, and a meticulous approach to ensure
accurate and reliable test execution.

Testing with Confidence: Putting Scripts to the Test
Following script development, I actively participated in executing the automated test scripts.
This involved:

• Running Tests Locally: Initially, I ran the scripts on my local development machine to
identify and rectify any syntax errors or logical issues.
• Executing Tests on Test Environment: Once the scripts functioned as intended locally,
I executed them on the dedicated test environment at Rubiscape. This provided a
realistic testing context and ensured the scripts functioned seamlessly with the actual
software deployment.
• Analyzing Test Results: Following test execution, I meticulously analyzed the
generated reports using Pytest. These reports provided detailed information on test
case execution status (passed/failed), captured screenshots upon failures, and error
messages. By analyzing these reports, I could identify areas requiring script
refinement or potential software bugs necessitating further investigation.
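
The failure screenshots mentioned above are typically produced by a small hook in conftest.py; a hedged sketch follows, assuming the browser is exposed through a fixture named "driver".

# conftest.py
import os
import pytest


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    """After each test phase, save a screenshot if the test body failed."""
    outcome = yield
    report = outcome.get_result()
    if report.when == "call" and report.failed:
        driver = item.funcargs.get("driver")        # assumes a "driver" fixture
        if driver is not None:
            os.makedirs("screenshots", exist_ok=True)
            driver.save_screenshot(os.path.join("screenshots", f"{item.name}.png"))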

Through active test execution, I gained valuable insights into the testing process and the
importance of robust script development for achieving comprehensive test coverage.

Collaboration is Key: Working with the QA Team Lead


Effective communication and collaboration with the QA team lead were integral aspects of
my internship. This teamwork manifested in several ways:

• Seeking Guidance: The QA team lead provided invaluable guidance and mentorship
throughout the internship. I regularly consulted with them to clarify testing
requirements, discuss script functionalities, and troubleshoot any challenges
encountered during script development or test execution. Their expertise significantly
aided my learning and problem-solving abilities.
• Providing Feedback: I actively participated in discussions and brainstorming sessions
with the team lead. This allowed me to contribute my perspectives on testing
strategies, script improvements, and potential edge cases that needed consideration.
• Joint Responsibility: As I progressed, I collaborated with the QA team lead in
identifying new automation opportunities within the Rubiscape software. This joint
effort ensured that the automation testing framework continued to evolve and
encompass a wider range of functionalities.

FUTURE SCOPE

The automation testing framework developed during this internship lays a solid foundation for
Rubiscape Pvt. Ltd. to enhance its software testing practices. However, there's immense
potential to further expand its capabilities and functionalities. Here are some key areas for
future development:

1. Expanding Test Coverage:


Advanced Functionalities: Currently, the framework focuses on core functionalities like user
login, dataset creation, and basic dashboard creation. Future iterations can encompass more
advanced functionalities within RubiConnect and RubiSight, such as complex data
manipulation within datasets, advanced chart configurations in dashboards, and integration
with external data sources.

API Testing: In addition to UI automation, exploring API testing can further improve test
coverage. Automating API calls allows for testing the backend functionalities and validating
data integrity throughout the application.

Edge Case Testing: The framework can be strengthened by incorporating test cases that cover
edge scenarios and potential error conditions. This could involve testing invalid user
credentials, handling data manipulation errors, and simulating unexpected user interactions.

2. Framework Enhancements:
Self-Healing Capabilities: Implementing self-healing mechanisms can enhance the
framework's robustness. These mechanisms could involve identifying and recovering from
minor UI changes or temporary application glitches, ensuring uninterrupted test execution.

Parallelization: Exploring parallel test execution techniques can significantly reduce overall
testing time. This can involve running independent test cases on multiple machines
concurrently.

Reporting and Analysis: Expanding the reporting capabilities of the framework can provide
more granular insights into test results. This could include generating reports with detailed
test logs, performance metrics, and visualization of test case trends over time.

3. Integration and Deployment:
CI/CD Pipeline Integration: Integrating the automation testing framework with a continuous
integration and continuous delivery (CI/CD) pipeline can automate test execution as part of
the software development lifecycle. This ensures that new code changes are thoroughly tested
before deployment.

Cloud-Based Execution: Deploying the framework to a cloud environment can offer
scalability and accessibility benefits. This allows for on-demand execution of test scripts from
any location and facilitates collaboration across geographically distributed teams.

CONCLUSION

The internship at Rubiscape Pvt. Ltd. proved to be a transformative experience, immersing me
in the world of automation testing. It equipped me with valuable technical skills, fostered
problem-solving abilities, and instilled the importance of teamwork in a collaborative
environment.

The core contributions of this internship revolved around scripting for efficiency, conducting
tests with confidence, and collaborating effectively with the QA team lead. Script
development using Python, Selenium WebDriver, and Pytest laid the foundation for the
automation testing framework. This framework automates essential functionalities like login,
dataset creation, and basic dashboard creation, streamlining the testing process.

Beyond technical expertise, the internship emphasized the significance of continuous
learning. Daily scrum meetings ensured clear communication and project alignment, while
knowledge transfer sessions offered invaluable insights from experienced QA professionals.
Additionally, demos on new developments kept me abreast of the ever-evolving Rubiscape
software.

Looking forward, the automation testing framework possesses immense potential for growth.
Expanding test coverage to encompass advanced functionalities, API testing, and edge cases
will further strengthen its capabilities. Framework enhancements like self-healing
mechanisms, parallelization, and advanced reporting can bolster its robustness and efficiency.
Ultimately, integration with a CI/CD pipeline and cloud-based execution can revolutionize the
testing process at Rubiscape.

This internship journey has been a gateway to a world of possibilities within software quality
assurance. The knowledge and experience gained will prove invaluable in my future
endeavors, empowering me to contribute meaningfully to the development of high-quality
software solutions.

BIBLIOGRAPHY

[1] Page Object Model: A Design Pattern for Improving Web Application Test Automation by Boni García (https://saucelabs.com/resources/video/boni-garcia-page-object-model-open-source-software-selenium-and-more)

[2] Selenium WebDriver Documentation (https://www.selenium.dev/documentation)

[3] Pytest Documentation (https://docs.pytest.org/en/latest/contents.html)

