
Karuna Sree Reddy Matta

karunasreereddy3366@gmail.com
508-417-8488

PROFESSIONAL SUMMARY:

 Around 9 years of experience as an IT professional, with strong experience in AWS Cloud services
design and implementation in Infrastructure-as-a-Service.
 Worked on data Extraction, Transformation, and Loading (ETL) using SAP BODS.
 Expertise in Information Technology and worked at different levels on infrastructure application
deployment, operations, and management projects.
 Ran Terraform scripts to provision AWS infrastructure and deploy target cloud environments.
 Set up and configured Jenkins to run automation.
 Worked with DevOps tools including Git, GitHub, Jenkins, Terraform, and Ansible.
 Proven experience deploying configuration management with Ansible to install cloud
management tools such as Zabbix, Graylog, Trend Micro, and Grafana.
 Expertise in designing and deploying scalable and redundant infrastructure on the Amazon Web
Services cloud platform using services such as Amazon EC2, S3, EBS, CloudWatch, Auto Scaling, ELB,
RDS, SNS, Amazon VPC, and Route 53.
 Set up and configured Jenkins to run automation via Jenkinsfiles using Groovy, Bash, and shell
scripts.
 Monitored builds and deployments with Zabbix to ensure that failed builds or deployments were
addressed promptly.
 Expertise in infrastructure for SAP and Java applications.
 Experience using Jenkins to run automated deployments.
 Experience managing large Linux infrastructure.
 File system administration using LVM on Linux.
 Executed technical feasibility assessments, solution estimations, and proposal development for
moving identified workloads to AWS.
 Migrated applications and databases using the CloudEndure tool.
 Integrated test scripts into continuous integration using Jenkins.
 Prepared effort estimations and managed timelines.
 Actively participated in sprint meetings such as Planning, Refinement, and Retrospective meetings.
 Experienced in build and release automation of Java-based applications.
 Managed routine system backups; scheduled jobs such as enabling and disabling cron jobs; and
enabled system and network logging of servers for maintenance and performance tuning.

EDUCATION:

 Bachelor’s Degree in Computer Science Engineering (2009) from Jawaharlal Nehru Technological
University, India.
 SAP BusinessObjects Data Services.
 Agile and ASAP methodologies.
 In-house training in AWS infrastructure.

TECHNICAL SKILLS:

 Cloud Technologies: AWS cloud.


 Migration Tool: CloudEndure
 Operating Systems: Windows and Linux
 CI/CD Tools: Git, Jenkins, Docker, Terraform
 ETL: SAP BODS
 Virtualization: VMware vSphere 6.0
 ITSM/Collaboration: ServiceNow, JIRA, Confluence
 Monitoring Tools: Zabbix, Nagios
 Database: SQL Server

BUSINESS EXPERIENCE:

Client: Unilever, Englewood Cliffs, NJ Feb 2020 to Present


Role: Cloud Engineer and SRE
Domain: Logistics and Supply Chain Management

Description:

Unilever is one of the world’s leading suppliers of beauty & personal care, home care, and foods &
refreshment products with sales in over 190 countries and reaching 2.5 billion consumers a day.

Roles & Responsibilities:

 Delivered new application hosting solutions on AWS Cloud and supported existing ones.
 Hands-on experience with various AWS services, including EC2, VPC, Route 53, S3, EBS, RDS,
DynamoDB, IAM, CloudTrail, CloudWatch, ELB, Auto Scaling, SNS, and SES.
 Experience creating machine images (AMIs).
 Deployed AWS stacks through CloudFormation templates.
 Created load balancers (ELB), configured Auto Scaling, and used Route 53 with failover and latency
routing options for high availability and fault tolerance.
 Set up and configured Jenkins to run automation.
 Worked with DevOps tools Git, GitHub, Jenkins, Terraform, and Ansible.
 Ran Terraform scripts to deploy target cloud environments on AWS.
 Experience with monitoring and alerting using CloudWatch, AWS Config, and CloudTrail.
 Experience taking snapshots for backups.
 Inspected the AWS environment for cost savings, system performance improvements, and security
gaps using AWS Trusted Advisor, GuardDuty, and Security Hub.
 Configured log management with CloudWatch, VPC Flow Logs, and S3 buckets.
 Provided detailed architecture documentation such as high-level (HLD) and low-level (LLD) designs.
 Good experience with the AWS Well-Architected Framework.
 Architected multiple cloud-based backup and recovery solutions by utilizing AWS as a DR and/or
backup solution.
 Worked closely with customers on all aspects of the Agile methodology.
 Configured Git and Jenkins on a server to run a Terraform pipeline and secure the project.
 Built jobs in Jenkins, including pre-build and post-build activities and maintenance.
 Created Ansible playbooks and made regular changes to host inventory entries for configuration
management.
 Migrated existing virtual Windows and Linux server infrastructure to AWS using CloudEndure.
 Suggested ways to optimize the AWS cloud landscape by re-architecting lift-and-shift workloads to
make use of cloud-native and fully managed AWS solutions.

Client: SC Johnson, Racine, WI July 2017 to Feb 2020


Role: Cloud Engineer
Domain: Manufacturing

Description:

A leading manufacturer of household cleaning products and products for home storage, air care, and
pest control, as well as professional products. SC Johnson is committed to sustainability efforts that
deliver lasting change.

Roles & Responsibilities:
 Launched EC2 instances for specific applications.
 Configured storage on S3 buckets.
 Secured infrastructure using VPCs and security groups.
 Configured Auto Scaling and Elastic Load Balancers.
 Monitored EC2 instances using CloudWatch (CPU, memory, disk, and network utilization).
 Configured snapshots based on customer requirements.
 Configured images based on Linux requirements.
 Worked on AWS administration: EC2, S3, CloudWatch, Route 53, and security groups.
 Ran and deployed configuration management with Ansible to install cloud management tools
such as Zabbix, Graylog, Trend Micro, and Grafana.
 Designed and deployed scalable and redundant infrastructure on the Amazon Web Services cloud
platform using services such as Amazon EC2, S3, EBS, CloudWatch, Auto Scaling, ELB, RDS, SNS,
Amazon VPC, and Route 53.
 Set up and configured Jenkins to run automation via Jenkinsfiles using Groovy, Bash, and shell
scripts.
 Monitored builds and deployments with Zabbix to ensure that failed builds or deployments were
addressed promptly.
 File system administration using LVM on Linux.
 Executed technical feasibility assessments, solution estimations, and proposal development for
moving identified workloads to AWS.
 Migrated applications and databases using the CloudEndure tool.
 Integrated test scripts into continuous integration using Jenkins.
 Prepared effort estimations and managed timelines.
 Actively participated in sprint meetings like Planning, Refinement, and Retrospective meetings.
 Experienced in build and release automation of Java-based applications.
 Managed routine system backups; scheduled jobs such as enabling and disabling cron jobs; and
enabled system and network logging of servers for maintenance and performance tuning.

Client: NBC, Englewood Cliffs, NJ May 2016 to July 2017


Role: Consultant
Domain: Telecommunication
Description:

The National Broadcasting Company is an American English-language commercial broadcast television
and radio network. It is the flagship property of the NBC entertainment division of NBCUniversal, a
division of Comcast.

Roles & Responsibilities:

 Created ETL jobs and modified ETL logic per new and changed requirements.
 Implemented complex business logic using QUERY, MERGE, SQL, PIVOT, CASE, VALIDATION,
LOOKUP_EXT, TABLE COMPARISON, and HISTORY PRESERVATION transforms.
 Built transforms using BODS functions such as date, string, and miscellaneous functions.
 Part of the core production support team; worked on daily monitoring of daily and monthly load issues.
 Involved in data analysis, data mapping, data extraction, data verification, and data validation.
 Optimized SAP BODS batch jobs using performance tuning techniques like parallel processing and
bulk loading.
 Extensively worked with the secured central repository, Job Server, and Management Console.
 Responsible for validating and approving logical and physical data models to ensure they support
gathered requirements, as well as providing guidance on data sourcing and ETL processes.
 Developed frameworks for BODS Designer objects using heterogeneous data sources such as SAP ECC,
SAP BW, Oracle, Excel, XML, and SQL Server.
 Involved in data loads during the testing phase for different environments such as SIT, QA, and Pre-prod.

Client: AAA Cooper Transportation June 2010 to Sept 2013


Role: Associate Engineer (IGate)

Description:

AAA Cooper Transportation is an American, non-union, less-than-truckload freight carrier based in
Dothan, Alabama. The company also provides dedicated, port, and international freight transportation.

Roles & Responsibilities:
 Understood project business requirements, created technical requirements/specifications, and
produced end-to-end (E2E) design documents.
 Prepared a production support plan and ensured a smooth handover to the support teams.
 Worked with different transforms such as Query, Key Generation, Pivot, Map Operation, and Table
Comparison in Data Integrator.
 Worked on performance improvement of the ETL process and played an effective role in reducing the
job execution times.
 Involved in consolidating the user stories from the product owner.
 Involved in several meetings like Daily scrum calls, Retrospective meetings, and sprint meetings.
 Worked on change data capture at both source and target levels and implemented Slowly Changing
Dimensions (SCD) Types 1, 2, and 3.
 Designed, developed, and debugged complex ETL jobs using Data Integrator scripts, workflows, data
flows, datastores, transforms, and functions.
 Promoted ETL objects to different environments (Integration, QA, and Pre-prod) using the central
repository.
 Extensively worked on scheduling jobs on the Management Console, monitoring them, checking
errors, and rectifying them.
 Involved in defining validation rules and binding them to specific columns based on requirements.
 Involved in discussions with multiple teams (Functional, Legacy, and Application) to create
mappings per requirements, and worked with the QA team to resolve defects/bugs.
