Yagna - Performance Lead


Yagna Narayana Sastry

OBJECTIVE

To secure a position in the Information Technology industry that offers professional growth, where I can apply my skills and abilities while being resourceful, innovative, and flexible.

EXPERIENCE SUMMARY

Expert in performance testing using VSTS, LRE, JMeter, BlazeMeter, and Azure Load Testing, with good expertise in manual and automation testing tools such as Selenium and LeanFT

 Worked with more than 15 performance testing protocols, such as HTTP/S, Mobile, Citrix, Ajax, Web Services, Siebel, etc.
 Led the performance testing team of a product engineering group with 3 different products and developed utilities to improve product performance
 Deep expertise in application profiling tools such as Dynatrace, JProfiler, AWS CloudWatch, and Azure Monitor, and database performance tools such as Query Store, dbForge, Splunk, etc.
 Experienced in Agile SDLC and Scrum methodologies
 Proficient in writing SQL queries and analyzing AWR reports
 Experience in shell scripting and various monitoring and analysis tools on Windows and Linux platforms
 Strong expertise in modernization work, especially for applications migrating from mainframe to Java/.NET technologies
 Experience in Core and Advanced Java programming; developed various utilities using the Java programming language
 Performed Section 508 compliance verification testing and reviewed technical requirements of software cases using tools such as JAWS
 Well versed in DevOps tools and technologies such as Git, Ansible, Jenkins, continuous testing, Nagios, Splunk, containerization, etc.
 Experience enabling Omni-Channel Service
 Good knowledge of the gaming, e-commerce, public sector, and healthcare domains
 Proficient in building CI/CD pipelines using Jenkins, Ansible, Docker, and Kubernetes
 Well versed in the software development and testing life cycles
 Worked on multiple cloud migration projects leveraging AWS and Azure platforms
 Proficient across the complete project life cycle using methodologies such as Waterfall, Agile, and V-model
 Strong expertise in client-side analysis using tools such as WebPageTest, the Chrome DevTools Network and Performance panels, and Lighthouse; JavaScript minification; and the built-in browser APIs to capture FCP, LCP, and other metrics
 Excellent experience in executing different types of performance tests, such as smoke, baseline, soak, stress, volume, scalability, and other support tests
 Sound technical skills, good communication and interpersonal skills, and zeal to learn and adapt to new technologies
 Good educational background and work experience

TECHNICAL EXPERTISE

Performance Testing Tools : LRE, Performance Center, JMeter, MS VSTS, RPT, Azure LT
APM Tools : JProfiler, Dynatrace, SiteScope, Splunk
Automation Tools : Selenium Webdriver, UFT, LeanFT
Test Management Tools : JIRA, JAMA, ALM, LoadRunner
DevOps Tools : Jenkins, Docker, Kubernetes, Maven, Ansible, Continuous Monitoring,
SonarQube, Jacoco, JFrog, Nexus, GIT, SVN, etc.
Programming Languages : Java, .NET, Python (basics), PL/1, C, C++
Databases : Oracle AWR, SQL Server Profiling, InfluxDB
Scripting Languages : Shell, Groovy
Others : Eclipse MAT, JConsole, PerfMon, SQLServer Query Store, DB Forge,
Chrome Dev Tools, Gremlin Chaos Engineering, etc.
Operating Systems : Windows, Linux

CERTIFICATIONS

 AWS Certified Solutions Architect, AWS Cloud Practitioner (A2X)
 Microsoft Azure Fundamentals (AZ-900) certified
 Certified Scrum Master
 Certified Scrum Product Owner
 Gremlin Certified Chaos Engineer
 Certified DevOps Foundation from DevOps University
 Application Tester Certified from MIT (Massachusetts Institute of Technology)

AWARDS AND RECOGNITIONS

 Received the Accenture Excellence Award (ACE Award)
 Received the monetary award for being a good “Business Operator”
 Awarded for developing a Re-usable Asset for the management
 Awarded for creating “Better Client Value Creation”
 Received many “SPOT” and “APPLAUSE” Awards for outstanding work.

Project Description:
The State has various applications that are being developed, re-designed, and/or maintained. PERLSS is an LTSS
application developed for a particular category of people in the state.

Responsibilities:
 Plan and design complex scripts (> 30k lines of code) spanning 5 different applications
 Execute load and stress tests and work closely with the development team to resolve application
bottlenecks
 Analyze the performance test results and make recommendations to the development team
 Prepare and share the report with the relevant stakeholders
 Assist the performance test team for any blockers and define the best practices
 Validate the errors using Splunk and monitor the servers using Dynatrace
 Develop custom utilities to save the functional testing team time on daily tasks
 Perform Splunk admin tasks such as creating indexes, dashboards, etc., and utilize them during performance
executions
 Monitor both on-prem and cloud components during the performance executions
 Execute Section 508 compliance checks on web sites
 Document compliance against the Section 508 technical standards
 Test software, Internet applications, and hardware as needed for Section 508 compliance
 Apply wide knowledge of Section 508 testing tools, including JAWS
 Assist the training team in the data seeding work and deliver KT
 Fine-tune client-side performance issues, such as JavaScript minification and other front-end issues,
using Lighthouse and other tools
Environment: LRE, Splunk, Dynatrace, JIRA, StormRunner, AWS cloud, Linux, Java, etc.
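The result analysis described above typically involves computing percentile response times from raw samples. A minimal Java sketch of that post-processing step (class and method names are illustrative, not from the actual project tooling):

```java
import java.util.Arrays;

// Nearest-rank percentile calculation over raw response-time samples (ms),
// the kind of post-processing done when analyzing load test results.
public class PercentileReport {
    // Smallest sample such that at least p% of samples are <= it.
    static long percentile(long[] samplesMs, double p) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length); // 1-based rank
        return sorted[Math.max(rank, 1) - 1];
    }

    public static void main(String[] args) {
        long[] samples = {120, 95, 310, 150, 101, 99, 2050, 130, 140, 160};
        System.out.println("p50=" + percentile(samples, 50)); // p50=130
        System.out.println("p90=" + percentile(samples, 90)); // p90=310
        System.out.println("p95=" + percentile(samples, 95)); // p95=2050
    }
}
```

Note how the single 2050 ms outlier dominates p95 but leaves p50 untouched, which is why percentile-based SLAs are preferred over averages in load test reports.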
Project Description:
Worked on the mining phase, one of the crucial phases in application modernization. This internal
project validates the features pushed onto the customized Eclipse IDE using the LeanFT automation tool.
Developed an in-house performance profiling tool that captures runtime method response times,
jvm_metrics, and os_metrics. Created a dashboard in Grafana that pulls the jvm_metrics and os_metrics stored in
InfluxDB. The tool outputs the results to an external .csv file as well as displaying the metrics on the
workbench UI. Also added new features to compare the runtime method performance of the nightly build with
any other build, outputting the methods that degraded by 1 second.
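The build-to-build comparison idea can be sketched as follows; this is a simplified illustration, not the actual tool (the real implementation read per-method metrics from InfluxDB rather than in-memory maps, and the method names below are made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Given per-method response times (ms) for two builds, report the methods
// whose runtime degraded by at least the given threshold (e.g. 1 second).
public class BuildComparison {
    static Map<String, Long> regressions(Map<String, Long> baselineMs,
                                         Map<String, Long> nightlyMs,
                                         long thresholdMs) {
        Map<String, Long> degraded = new LinkedHashMap<>();
        for (Map.Entry<String, Long> e : nightlyMs.entrySet()) {
            Long base = baselineMs.get(e.getKey());
            if (base != null && e.getValue() - base >= thresholdMs) {
                degraded.put(e.getKey(), e.getValue() - base); // store the delta
            }
        }
        return degraded;
    }

    public static void main(String[] args) {
        Map<String, Long> baseline = Map.of("parseSource", 400L, "buildModel", 1200L);
        Map<String, Long> nightly  = Map.of("parseSource", 450L, "buildModel", 2600L);
        // buildModel degraded by 1400 ms (>= 1000 ms threshold), so it is reported.
        System.out.println(regressions(baseline, nightly, 1000));
    }
}
```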

Responsibilities:
 Review the latest features pushed into the mining framework
 Prepare automation workflows using Lean FT tool
 Validate the script on a remote server using Jenkins
 Develop utilities to simplify the performance testing work and test performance profiling tool features
and log performance metrics of online and batch applications
 Analyze the results and identify the performance hotspots
 Provide tuning recommendations and rerun the tests after performance improvements
 Present performance test report to the stakeholders
 Ensure there are no performance bottlenecks for the different in-house products
 Maintain the Jenkins pipeline jobs for the daily transformation test project executions
 Assist the client teams related to performance issues and define the best practices

Environment: JMeter, JProfiler, Jenkins, GitLab, SVN, Crucible, Eclipse, LeanFT, Groovy, InfluxDB, Grafana,
Dynatrace, JIRA, AWS cloud, Linux, Java, C#, etc.

Project Description:
This is one of the application modernization projects, which transformed code from COBOL to .NET.
Performance testing was performed on the migrated application, from the performance testing
requirement gathering phase through to closure.

Responsibilities:
 Gather performance testing requirements from the client
 Understand key performance workflows and gather test data
 Prepare performance test plan and test strategy
 Prepare workload model based on the production usage statistics
 Execute dry runs and a series of load tests
 Monitor server metrics with SiteScope and Dynatrace, and DB performance using the Query Store and dbForge
tools
 Create LoadRunner scripts to test the customer interface, the WSS (Web Self Service) application
 Evaluate and analyze Section 508 testing results and oversee implementation of test plans
 Analyze the load test results and provide code improvement areas
 After the code is fine-tuned, repeat execution cycles until application performance improves
 Publish the performance test reports to the relevant stakeholders
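Preparing a workload model from production usage statistics usually comes down to sizing virtual users from target throughput and think time via Little's Law. A hedged sketch with made-up numbers (the rates and timings are illustrative, not production data):

```java
// Sizing virtual users for a workload model via Little's Law:
//   concurrent users = throughput (tx/sec) * (response time + think time) (sec)
public class WorkloadModel {
    static long usersNeeded(double targetTps, double avgResponseSec, double thinkTimeSec) {
        return (long) Math.ceil(targetTps * (avgResponseSec + thinkTimeSec));
    }

    public static void main(String[] args) {
        // e.g. 20 tx/sec at peak, 2 s average response, 8 s think time -> 200 users
        System.out.println(usersNeeded(20.0, 2.0, 8.0));
    }
}
```

The same relation works in reverse during analysis: if a load tool holds users constant and response times climb, achieved throughput must fall, which is the usual first symptom of a saturated server.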

Environment: Performance Center, SVN, Crucible, Visual Studio, LoadRunner, Query Store, SiteScope,
Dynatrace, JIRA, Windows, C#, etc.

Project Description:
As part of the project, Deloitte was involved in providing testing solutions for the TCOE. This project includes multiple
applications and releases throughout the health system for patient registration and maintenance, order
placement and resulting, etc.

Responsibilities:
 Analyze the data coming from multiple systems (customer, product, etc.)
 Prepare the performance test plan and test strategy
 Identify the test data requirements for performance testing
 Track and follow-up the deliverables status on a regular basis
 Review performance test deliverables
 Develop performance test scripts using the Citrix, Web, and TrueClient protocols
 Install and setup HP ALM & HP Performance Center tools
 Develop performance scenarios
 Utilize best practices to implement omni-channel service
 Execute a series of load tests
 Analyze performance test results and identify bottlenecks
 Publish the test summary report with a Go/No-Go decision and conclusion

Environment: HP ALM, Performance Center, SiteScope, MOFW, several medical products, etc.
