
Bhumika Ancha

Email: bhumika.a2304@gmail.com
Ph: +1 682-215-3427
_____________________________________________________________________________________________________________________

SUMMARY:

 Over 6 years of diverse experience in Quality Assurance (manual and automated) testing, with expertise in requirements gathering, analysis, design, application testing, and Quality Assurance of web-based and client-server applications.
 Expertise in Test Planning, Test Case Design, Test Environment Setup, Test Data Setup, Defect Management, and Configuration Management.
 Execution of automated test scripts using Mercury tools (Test Director/Quality Center, LoadRunner, and QTP) and JMeter, based on business/functional specifications.
 Proficient in different phases of testing like Functional Testing, GUI Testing, Regression Testing,
Integration Testing, System Testing, Performance Testing, User Acceptance Testing (UAT).
 Performed Black Box, OBIEE testing, Performance Testing, Regression, Disaster Recovery and
Validation testing during the testing life cycle of the product release.
 Good experience with run time settings/recording options and general options in IBM Rational
Performance Tester.
 Proficient in plotting and implementing scenarios and loading IBM Rational Performance Tester
scripts into a controller.
 Participated in Integration, System, Smoke and User Acceptance Testing.
 Prepared Test Plan, Effort Estimation and Regression Checklist.
 Prepared User Manuals for Projects/ Modules level.
 Specialized in performance testing applications using load-testing tools such as Load Runner,
Performance Center.
 Expert in Automation testing Using QTP.
 Extensive experience in using LoadRunner in Web and Web services.
 Expertise in Test Management and Bug Tracking tools like Quality Center, Test Director.
 Strong knowledge of single- and multiple-protocol scripting in LoadRunner VuGen, including Web HTTP/HTML, Web Services, and Ajax TruClient.
 Expert in Leading QA Projects and mentoring Team Members.
 Knowledge on Visio diagrams and Mainframe technology.
 Performed effort estimation for new projects and prepared the corresponding reports.
 Ability to identify root causes and derive corrective actions to meet short and long term
business requirements using resourceful approaches.
 Exceptional ability to build productive relationships with business users, test teams,
development teams and clients across functional and technical disciplines and thus generating
accurate and detailed business requirements.
 Ability to successfully manage multiple deadlines and multiple projects effectively through a
combination of business and technical skills.
 Strong business analysis skills and thorough understanding of software development life cycle.
 Strong ability to understand and document critical data through effective data collection, data
analysis and data interpretation.
 Strong problem solving skills and very good time management skills.
 Ability to master new technologies quickly.

TECHNICAL EXPERTISE:

Operating Systems: AIX, HP-UX, Linux, Windows 2003/2005


Scripting Languages: XSL, XSLT, HTML, JavaScript, jQuery, AJAX, C, C++
Database/Tools: Oracle, MS Access, DB2 and MS SQL server
Testing Tools: Load Runner v.12.0, Performance Center v.12, IBM Rational Performance Tester,
HP Site Scope, QTP, Perfmon, OBIEE, JUnit, Clear Case
Web/App Servers: Tomcat, IIS, WebSphere, MQ Series and WebLogic.
Web Technologies: Struts Framework, Java Servlets, Java Beans, J2EE, JMS, JDBC, RMI, EJB,
Swing, and Mainframe.
XML and Web services: XML, DOM, XPATH, XSD, WSDL and SOAP.
Defect Tracking Tools: Quality Center, Performance Center, Bugzilla
Other Tools: Neo Load Tool, Toad, Visio, Wily Introscope

PROFESSIONAL EXPERIENCE:

Client: State Farm, Irving, TX Feb 2016 – May 2016


Role: Sr. Performance Analyst

State Farm is a mutual company whose primary focus is its policyholders. Its more than 65,000 employees and more than 18,000 agents service 82 million policies and accounts throughout the U.S.
Responsibilities:
 Generated, validated, and tested reports produced by the product quality testing division for review by higher-level management.
 Participated in all phases of planning, such as defining requirements, defining the types of tests
to be performed, and scenario creation
 Participated in meetings with company executives such as business analysts, developers,
managers, supervisors, and executive officers in order to understand the product and the testing
phases more thoroughly
 Worked on different protocols such as Web (HTTP/HTML), Ajax TruClient, Web Services, and Windows Sockets.
 Developed VuGen scripts for load testing with 1,000 vusers to find bottlenecks in the server and deadlocks in the database.
 Tested performance of Web Portal using HP Load Runner 12.01
 Generated scripts in Virtual User Generator, which included Parameterization of the required
values.
 Ran the scripts via the LoadRunner Controller to perform stress, volume, and component testing.
 Configured and used the SiteScope performance monitor to monitor and analyze server performance, generating reports on CPU utilization, memory usage, load average, etc.
 Executed load, stress, and endurance tests simulating more than 1,000 virtual users.
 Analyzed scalability, throughput and load testing metrics against test servers.
 Monitored the performance of portal based on various Load Runner operating system, network,
middleware, and firewall monitors
 Highly involved in verifying test results with various manual unit, integration, smoke, and sanity
testing documents
 Executed regression cycles of the test cases in order to ensure the product quality and
performance after each stage of the code changes
 Performed bug tracking and triaged defects.
 Extensively operated HP Performance Center remotely, coordinating among colleagues to meet tight project deadlines.
 Performed analysis and reported required data to my managers.
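
The throughput and response-time analysis described in the bullets above can be sketched in plain Python; the timings and function names below are hypothetical illustrations, not data from the actual project (real runs would export timings from the LoadRunner Analysis session):

```python
# Minimal sketch of load-test metrics analysis: average response time,
# 90th-percentile response time, and transactions per second (TPS).
# Sample data and names are hypothetical.
from statistics import mean

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = max(0, int(round(p / 100.0 * len(sorted_vals))) - 1)
    return sorted_vals[k]

def summarize(response_times, duration_s):
    """Compute avg/90th-percentile response time and TPS for one scenario."""
    ordered = sorted(response_times)
    return {
        "avg_rt": round(mean(ordered), 3),
        "p90_rt": percentile(ordered, 90),
        "tps": round(len(ordered) / duration_s, 2),
    }

# Hypothetical transaction timings (seconds) from a 10-second window.
timings = [0.8, 1.1, 0.9, 2.4, 1.0, 0.7, 1.3, 0.9, 1.8, 1.2]
stats = summarize(timings, duration_s=10)
print(stats)
```

In practice these numbers are compared against the SLAs defined in Performance Center rather than a fixed threshold in code.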

Environment: HP LoadRunner 12.01, Performance Center 12.01, HP SiteScope, IBM Rational Performance Tester, HP Diagnostics, SQL, JIRA, Perfmon, C/C++, Java, PHP

Client: American Express, Phoenix, AZ Sep 2014 – Jan 2015


Role: Sr. Performance Test Engineer

American Express offers asset-based loans for small, growing businesses and opportunities for the small businesses it finances. The project was a web-based application automating the loan origination process, beginning with opening an application. The system performed all the business functions of the loan process, such as account-information setup, new-loan setup, appraisal, credit, and income. The project involved modules for Pre-Approval, Application, Underwriting, Pricing, Processing, and Closing.
Responsibilities:
 Gathered the requirements and compiled them into a Test Plan.
 Followed Agile Methodologies (scrum)
 Responsible for implementing LoadRunner-, Performance Center-, and JMeter-based infrastructure, including architecting the load-testing infrastructure and integrating hardware and software with LoadRunner.
 Prepared test cases, VuGen scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results.
 Prepared complex scripts in IBM RPT using the Web HTTP/HTML protocol.
 Expert in creating Next Generation Usage Pattern Analysis from the Production Logs to generate
Performance Load.
 Installed SiteScope, and configured monitors for analysis.
 Used SiteScope to get metrics from servers.
 Extensive backend knowledge of Oracle 10g/11g, SAP, SOA, Java, J2EE, and .NET application servers.
 Gathered and finalized specs, defining business and functional requirements for BI reporting by conducting workshops and completing BI report specifications; guided the disposition of reports between ECC and BI.
 Set up the QA environment, installing LoadRunner, Silk Performer, QTP, and batch processes.
 Performed complex usage-pattern analysis.
 Used Performance Center to define performance requirements, such as SLAs, in tests.
 Interfaced with developers, project managers, and management in the development, execution, and reporting of test-automation results.
 Identified and eliminated performance bottlenecks during the development lifecycle.
 Produced accurate, regular project status reports for senior management to ensure on-time project launch.
 Performed Black Box, White Box, Performance testing, Regression, and Validation testing during
the testing life cycle of the product release.
 Participated in Integration, System, Smoke and User Acceptance Testing.
 Wrote User Acceptance Test (UAT) Plans and User Acceptance Test Cases.
 Verified that new or upgraded applications met specified performance requirements.
 Identified queries that were taking too long and optimized them to improve performance, diagnosing degradation by examining resources such as Available Bytes and Private Bytes.
 Inserted GUI, bitmap and database checkpoints to check the functionality and data integrity.
 Involved in updating the whole Oracle based application on UNIX platform.
 Independently developed LoadRunner test scripts according to test specifications/requirements.
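
The slow-query identification described above can be illustrated with a minimal sketch; sqlite3 stands in for the actual Oracle backend, and the table, data, and threshold are hypothetical:

```python
# Minimal sketch of timing queries to flag slow ones for optimization.
# sqlite3 is a stand-in for the real Oracle database under test.
import sqlite3
import time

def run_timed(conn, sql, threshold_s=0.5):
    """Execute a query; return (rows, elapsed seconds, is_slow flag)."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    elapsed = time.perf_counter() - start
    return rows, elapsed, elapsed > threshold_s

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO loans (amount) VALUES (?)",
                 [(1000.0 * i,) for i in range(1, 6)])

rows, elapsed, is_slow = run_timed(conn, "SELECT COUNT(*), SUM(amount) FROM loans")
print(rows, is_slow)
```

On a real Oracle system this timing data would come from the database's own instrumentation (e.g. V$ views) rather than client-side stopwatches, but the flag-then-optimize workflow is the same.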

Environment: Windows 2003/NT, UNIX, SAP Web, Web HTML/HTTP, LoadRunner, IBM Rational Performance Tester, JUnit, Oracle 10g, XML/SOAP, BMC, Java, Performance Center, JavaScript.

Client: EOrdering / Verizon.Com, Irving, TX Mar 2013 - Sep 2014


Role: Performance Analyst

eOrdering enables Consumer and General Business customers to order Verizon products. The eOrdering interface offers various Verizon products, such as voice, data, TV, and additional equipment.
Responsibilities:
 Followed Agile Scrum methodology
 Prepared Test Plan based on the requirements.
 Developed Scenarios in Controller based on the User Load and Transaction Volume
 Developed and executed formal test plans to ensure the delivery of quality software
applications.
 Enhanced the test scripts using the Web HTTP, ODBC, and Java protocols in LoadRunner.
 Executed stress/load/rendezvous scenarios and regression testing for various operations and
performed detailed test analysis reports and perform Disaster Recovery.
 Added performance measurements for UNIX, Oracle, and WebLogic servers in the LoadRunner Controller and monitored online transaction response times, web hits, TCP/IP connections, throughput, CPU, memory, heap sizes, various HTTP requests, etc. Monitored Oracle database V$ session and system table stats.
 Used Site Scope Performance monitors and LoadRunner graphs to analyze the results.
 Extensively worked on UNIX and executed various programs on C Shell.
 Determined the web-interface protocols before generating the test scripts.
 Sniffer traces were analyzed for Network Bottlenecks.
 Analyzed average CPU usage, response time, and TPS for each scenario.
 Performed backend testing on Oracle, executing various DDL and DML statements.
 Developed various reports and metrics to measure and track testing effort.
 Created Test Matrix, Traceability Matrix and performed Gap Analysis.
 Participate in Weekly Meetings with the management team and Walkthroughs.
 Interacted with analyst, system staff and developers.
 Detected defects and classified them based on the severity in Quality Center.
 Provided Screenshots to identify & reproduce the bugs in QC.
 Interacted with the development team to fix the defects as per the defect report
 Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages, and other features of the LoadRunner Controller.
 Identified Real World Scenarios and Day in Life Performance Tests
 Performed complex usage-pattern analysis.
 Developed complex C libraries and utility functions for code reusability and modularity.
 Independently developed LoadRunner test scripts according to test specifications/requirements.
 Developed baseline scenarios, scripts and measurements
 Worked with Software Development in creating, executing, and documenting automated test
scripts.
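
The backend DDL/DML testing mentioned above can be sketched as follows; sqlite3 stands in for the actual Oracle database, and the orders table and its contents are hypothetical:

```python
# Minimal sketch of backend testing: run DDL to build the schema, DML to
# mutate data as the application would, then validation queries to confirm
# the data landed consistently. sqlite3 stands in for Oracle here.
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: create the schema under test.
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    product TEXT NOT NULL,
    status TEXT NOT NULL DEFAULT 'NEW')""")

# DML: insert and update rows as the application would.
conn.execute("INSERT INTO orders (product) VALUES ('Voice')")
conn.execute("INSERT INTO orders (product) VALUES ('TV')")
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE product = 'TV'")

# Validation: confirm the DML left the data consistent.
new_count = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'NEW'").fetchone()[0]
shipped = conn.execute(
    "SELECT product FROM orders WHERE status = 'SHIPPED'").fetchall()
print(new_count, shipped)
```

Against Oracle, the equivalent checks would be run through a client such as Toad or SQL*Plus, with the validation SELECTs recorded as expected results in the test cases.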

Environment: Documentum, Performance Center, LoadRunner, Windows 2000/NT, Citrix 9.2 OEM, RUP, Test Director, Introscope, Oracle 10g, DB2, WebSphere, SOA, Struts, EJB, IIS, XML/SOAP, and T-SQL.

Client: Progressive Insurance, Cleveland, OH Jan 2011 - Jan 2013


Role: QA Test Engineer
Responsibilities:
 Involved in writing Test Plan for the testing effort of the module.
 Developed and executed Test Cases and Test Scripts using QTP based on the requirement
documents and managed it using Quality Center.
 Prepared data for data-driven testing using the Data Driver Wizard in WinRunner, as required by the corporate customers.
 Conducted functionality testing manually prior to automated testing
 Maintained Requirement Traceability Matrix (RTM) to make sure that test cases were written
for all the requirements.
 Performed Back End Testing using SQL.
 Involved in Configuration Testing.
 Actively participated in walkthroughs and team meetings.
 Conducted Security, Performance, and Regression testing during the various phases of the
development.
 Investigated software bugs and interacted with developers to resolve technical issues using
Quality Center.
 Identified test cases to be run for regression testing and conducted regression testing as and when new builds were made.
 Administered Quality Center for bug tracking and reporting, generating customized graphs and reports.

Environment: QTP, Quality Center, Windows.


Client: Dell, Austin, TX Feb 2010 - Dec 2010
Role: QA Test Engineer

Responsibilities:
 Involved in writing Test Plan for the testing effort of the module.
 Formulated detailed Test Plan, Test Scripts and Test Cases after analyzing business rationale.
 Involved in documenting test cases for Functionality, Security, and Performance Testing
 Performed Backend Testing of the Oracle Database manually to ensure Order details and
requests were correctly inserted.
 Performed Functionality Testing Manually.
 Analyzed software test plans, co-ordinated automated and manual test cases.
 Performed Manual Functional and Regression Testing.
 Conducted backend-testing on the Oracle database using SQL queries to ensure integrity and
consistency of the data.
 Performed Configuration Testing on various hardware platforms.
 Experienced in Backend Testing using SQL Queries.
 Maintained Test Matrix and Requirements Traceability Matrix. Performed Gap Analysis on the
same
 Used Quality Center to analyze, track and report defects.
 Worked on uploading all the Test cases to the Quality Center for the current and prior releases.

Environment: Windows, Oracle, UNIX, IIS, Quality Center.

EDUCATION:
Bachelor of Science in Computer Science.
