Azure

PROFESSIONAL SUMMARY:

 Over 9+ years of IT experience as a DevOps Engineer, AWS Architect & Developer, Azure Developer & Administrator, Linux System Administrator, and application developer, working on server-based operating system kernel configurations on Red Hat Linux, CentOS, SUSE, and Ubuntu 12.x/13.x; tuning kernel parameters and troubleshooting system and performance issues.
 Have experience with Serverless/PaaS technologies (API Gateway, Lambda, DynamoDB, S3, etc.).
 In-depth knowledge of AWS cloud services like EC2, S3, RDS, VPC, CloudFront, Route53, CloudWatch, OpsWorks, IAM, SQS, SNS, and SES.
 Expertise in DevOps, Release Engineering, Configuration Management, and Cloud Infrastructure Automation, including Amazon Web Services (AWS), Ant, Maven, Jenkins, Chef, SVN, and GitHub.
 Worked on the Microsoft Azure (public) cloud to provide IaaS support to clients; created virtual machines through PowerShell scripts and the Azure Portal.
 Implemented DevOps tool suites like Git, Ant, Maven, Jenkins, JFrog Artifactory, CircleCI, Docker, Docker Swarm, Kubernetes, Nexus repository, Chef, Ansible, CloudWatch, and Nagios in traditional and cloud environments.
 Experience in deploying applications (JAR, WAR, RAR, and EAR) and related troubleshooting in clustered environments.
 Extensively worked on Jenkins and Docker for continuous integration and end-to-end automation of all builds and deployments.
 Expert in Azure Kubernetes Service (AKS); implemented end-to-end creation, configuration, and deployments to AKS.
 Utilized the Azure APIM DevOps Resource Kit to extract ARM templates of APIs created in Azure APIM.
 Installed and configured Foreman with Puppet automation for auto-provisioning Linux machines in OpenStack and VMware environments.
 Used Azure, Python, and Ansible for various configuration management activities after the infrastructure-as-code activities were completed.
 Data analysis, Data modeling and implementation of enterprise class systems spanning Big Data, Data Integration, Object
Oriented programming.
 Configured Selenium WebDriver, TestNG, and Maven, and created Selenium automation scripts in Java using TestNG prior to the next quarterly release.
 Utilized Bash scripts to wrap Java and Python modules to schedule jobs in CA AutoSys.
 Extensively worked on Jenkins, NoSQL, uDeploy, OpenShift, Guidewire, and Docker for continuous integration and end-to-end automation of all builds and deployments.
 Experience in application development, debugging, implementation, dev team support, and testing of an Oracle-based ERP using SQL and database triggers.
 Implemented a Continuous Delivery framework using Chef, Jenkins, and Maven in a Linux environment on the AWS public cloud.
 Extensively used Ruby scripting on Chef automation for creating cookbooks comprising all resources, data bags, templates, and attributes.
 Experience with AWS cloud services like EC2, S3, RDS, ELB, EBS, VPC, Route53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Amazon images for server migration from physical to cloud.
 Implemented Team Foundation Server (TFS), Subversion, Git, and GitHub as version control tools. Have good knowledge of Bitbucket.
 Set up and maintained logging and monitoring subsystems using tools like Elasticsearch, Fluentd, Kibana, Prometheus, Grafana, and Alertmanager.
 Automated the deployment processes of GraphQL APIs using CI/CD pipelines, ensuring efficient and consistent releases.
 Expertise in source code management using Git and CVS.
 Heavily involved in Data Architecture and Application Design using Cloud and Big Data solutions on AWS and Microsoft Azure.
 Expertise in automation tools such as Selenium WebDriver, Selenium IDE/RC, Selenium Grid, Java, Jenkins (continuous integration, regression tests), Maven (regression tests), Eclipse, Cucumber, TestNG (regression tests), and JUnit.
 Experience in writing code in Perl to develop and deploy continuous test cases, in combination with CI tools like Jenkins.
 Experience working with Software Development Life Cycles and Agile Programming & Agile Ops methodologies.
 Hands-on experience with monitoring tools like Prometheus and Dynatrace, and worked with Apache Kafka and ZooKeeper.
 Automated configuration management and deployments using Ansible playbooks and YAML. Experienced in developing Ansible roles and playbooks for server configuration and deployment activities.
 Experience in Configuration Management, Change/Release/Build Management, Support and Maintenance under Unix/Linux platforms (Red Hat and CentOS).
 Extensive experience with modern front-end template frameworks and libraries for JavaScript including jQuery, AngularJS, etc.
 Hands-on experience in using message brokers such as ActiveMQ and RabbitMQ.
 Expertise in Job Scheduling environments CA Workload Automation AE (AutoSys Edition), CA-7 Scheduler, UC11, JOBTRAC.
 Experienced in scaling Amazon RDS, MySQL, MongoDB, and DynamoDB instances vertically and horizontally for high availability (a minimal scaling sketch follows this list).
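
A minimal sketch of the RDS/DynamoDB scaling mentioned above, assuming boto3 credentials are already configured; the instance identifier, table name, and capacity values are hypothetical:

    import boto3

    # Vertical scaling: move a hypothetical RDS instance to a larger class.
    rds = boto3.client("rds")
    rds.modify_db_instance(
        DBInstanceIdentifier="orders-db",        # hypothetical instance name
        DBInstanceClass="db.r5.xlarge",          # target instance class
        ApplyImmediately=True,
    )

    # Horizontal scaling: register a hypothetical DynamoDB table for auto scaling.
    autoscaling = boto3.client("application-autoscaling")
    autoscaling.register_scalable_target(
        ServiceNamespace="dynamodb",
        ResourceId="table/orders-table",         # hypothetical table
        ScalableDimension="dynamodb:table:ReadCapacityUnits",
        MinCapacity=5,
        MaxCapacity=100,
    )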

Technical Skills:
Operating Systems UNIX, Red Hat Enterprise Linux, Ubuntu, Windows 98/NT/XP/Vista/7/8.
SCM Tools Subversion, GIT and Perforce.
Build Tools Apache Ant, Maven, Bazel, CMake, Gradle
Infrastructure as Code Terraform, CloudFormation
CI/CD Tools Jenkins, Apache Anthill Pro, Bamboo.
Repositories Nexus, Artifactory.
Configuration Management Tool Chef, Puppet, Ansible.
Web Service Tools JBoss, Apache Tomcat, IntelliJ IDEA, Oracle WebLogic, IBM WebSphere, IIS Server, Tibco FTL
Languages/Utilities Shell Script, Apache Ant Script, Batch Script, Ruby, Python, Perl, C, Core Java.
Networking TCP/IP, NIS, NFS, DNS, DHCP, WAN, SMTP, LAN, FTP/TFTP.
Technologies AWS (EC2, S3, EBS, ELB, Elastic IP, RDS, SNS, SQS, IAM, VPC, Lambda, CloudFormation, Route53, CloudWatch), Microsoft Azure and Rackspace OpenStack.
Databases SQL Server, Oracle, DB2 and Teradata.
Cloud Platform AWS, Microsoft Azure, GCP, PCF.
Monitoring and profiling tools Nagios and Splunk.
Containerization Tools Docker, Docker Swarm, Kubernetes
Cyber Security Vendor Equipment/Technology Palo Alto Firewalls configuration and management at scale, Cortex XDR, GlobalProtect, WildFire, Prisma Access, Prisma Cloud.

Professional Experience:
Client: Epicor, Rochester Hills, MI Sept 2021 – Present
Role: Site Reliability Engineer (SRE) / Azure DevOps Engineer.

Responsibilities:
 Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and S3 buckets to ensure successful deployment of web applications and database templates.
 Proficient in writing CloudFormation Templates (CFT) in YAML and JSON format to build AWS infrastructure following the Infrastructure as Code paradigm (a minimal stack-deployment sketch follows this list).
 Used Kubernetes for container management that runs Docker containerized applications in a cluster of EC2 instances in
Linux Environment.
 Integrated DevOps toolchains into Agile workflows, encompassing version control, build automation, testing, and
deployment.
 Used Azure resources like App Service, IoT Hub, Event Hub, Application Insights, Azure Data Factory, Azure Databricks, Key Vault, VNet, VMs, etc.
 Built and deployed Docker images on AWS ECS and automated the CI/CD pipelines.
 Implemented configuration management tools (e.g., Ansible, Chef, Puppet) to ensure consistent and reproducible
configurations across Azure servers hosting the EDH.
 Working knowledge of and experience with Jira, Visual Studio, and Node.js.
 Experienced in upgrading a Neo4j graph DB from 2.2.0 to 3.2.5 using Chef cookbooks.
 Worked on Datadog monitoring tool automation using Ansible.
 Used Splunk APM for log aggregation and analysis on different application servers, integrating Splunk with single sign-on authentication and the ServiceNow ticketing tool.
 Integrated VSCode with version control systems (e.g., Git) to manage source code repositories.
 Created and administered multiple Kubernetes AKS clusters on the Microsoft Azure platform.
 Integrated AWS Step Functions into CI/CD pipelines to automate and manage the deployment process, ensuring smooth
transitions between different stages of the software development lifecycle.
 Designed AWS EC2 security groups for databases to restrict connectivity to databases like Cassandra, Couchbase, MongoDB, Oracle, Aurora PostgreSQL, and AWS Elasticsearch.
 Worked on Amazon EKS using infrastructure-as-code tools like Terraform or AWS CloudFormation.
 Implemented Mutual SSL in microservices architectures, enhancing security between inter-service communication.
 Used Amazon EKS to manage Kubernetes clusters to ensure scalable and reliable deployment of microservices.
 Integrated Databricks with Azure DevOps for seamless CI/CD pipeline orchestration and version control integration.
 Integrated Ansible with monitoring tools (e.g., Nagios, Prometheus) to track infrastructure changes and performance.
 Automated the provisioning and configuration of Kafka clusters using infrastructure-as-code tools like Ansible or Terraform.
 Automated the build and deployment processes of Angular applications using CI/CD pipelines.
 Worked on AWS infrastructure using services like EC2, VPC, S3, RDS, and more.
 Experience working with monitoring tools like SolarWinds and New Relic.
 Experience with Azure Databricks to organize data into notebooks, making it easy to visualize the data through dashboards.
 Experienced in object-oriented programming languages like Java and C#.
 Used VSCode's built-in Git integration, which supports popular source control features.
 Designed, installed, configured, and integrated Sensu Enterprise monitoring for a PaaS cloud environment.
 Worked on IaC tools and scripts using Golang (Go) to automate the provisioning and management of infrastructure resources; tools like Terraform and Packer, which are widely used in the DevOps space, have Golang (Go) support.
 Configured CI pipelines in Visual Studio to automate builds and tests using tools like Azure DevOps or Jenkins.
 Launched and configured Amazon EC2 (AWS) cloud servers using AMIs (Linux) and configured the servers for the specified applications.
 Worked on serverless applications using AWS Lambda, allowing for event-driven, scalable, and cost-effective execution of
code.
 Created EC2 machines, installed and configured the Neo4j graph database, and authenticated users in Kibana to load data from CSV files stored in an S3 bucket.
 Experienced with deploying, maintaining, and troubleshooting applications on AKS.
 Automated cloud deployments using Ansible, Python, and AWS CloudFormation templates.
 Integrated reliability testing into the CI/CD pipeline to catch potential issues early in the development lifecycle.
 Implemented Azure Private Link to securely connect Azure services, like Azure Storage and Azure SQL Database, to virtual
networks.
 Implemented handling for OIDC logout so that when a user logs out of a DevOps tool, they are also logged out of the OIDC provider to terminate the session securely.
 Implemented Infrastructure as Code (IaC) practices using tools like Terraform, AWS CloudFormation, or Ansible for automated infrastructure provisioning.
 Designed and deployed Extract, Transform, Load (ETL) pipelines in Azure Data Factory for efficient data integration across
various sources.
 Configured Splunk to build and maintain log analysis for various systems and developed Splunk queries.
 Strong experience with SQL and MySQL as well as NoSQL databases, specifically MongoDB, PostgreSQL, and Cassandra.
 Experienced with Jira, Visual Studio, Node.js, Git, GitHub, GitLab, Maven, SonarQube, Ansible, Docker, Jenkins, Kubernetes, JUnit, and JFrog.
 Worked with Terraform to provision infrastructure in AWS components like EC2, IAM, VPC, ELB, Cognito, API Gateway, ECS, and EKS.
 Experienced in cloud-based data ecosystems using Azure HDInsight, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure DevOps, Azure Data Lake, and Azure Storage.
 Implemented logging and monitoring solutions for AWS Step Functions, providing visibility into workflow execution,
identifying bottlenecks, and aiding in troubleshooting.
 Developed scripts, automation, and applications using PowerShell, C#, and Python.
 Utilized Harness for defining deployment pipelines as code, promoting version control and traceability of changes.
 Implemented Apache Airflow for orchestrating complex workflows, enabling the automation of data processing, ETL tasks, and job scheduling.
 Utilized VSCode's debugging capabilities to troubleshoot and debug code.
 Significant experience with ITIL/Six Sigma standards and Waterfall/Agile methodologies.
 Expertise in release engineering, configuration management, cloud infrastructure, and automation, including Amazon Web Services, GitHub, Veracode, SonarQube, Gradle, Jenkins, Ansible, Chef, Xcode, Docker, Tomcat, and Linux.
 Implemented Lambda functions to automate routine tasks, such as data processing, file transformations, or infrastructure
provisioning.
 Used Infrastructure as Code principles to define and manage the infrastructure components required for MuleSoft
applications.
 Utilized AWS ECR as the Docker image repository and used AWS ECS to deploy Docker containers to AWS Fargate and EC2.
 Designed IaaS and PaaS solutions for new clients migrating from on-site infrastructure to the cloud.
 Developed and implemented automated scaling strategies for Couchbase clusters to handle varying workloads.
 Integrated AWS Lambda with other AWS services to build end-to-end automated workflows within the AWS ecosystem.
 Utilized Lambda triggers, such as API Gateway, S3 events, or CloudWatch Events, to initiate serverless function executions in response to specific events (an S3-triggered handler sketch follows this list).
 Utilized Infrastructure as Code tools (e.g., Azure Resource Manager templates, Terraform) to define and manage the Azure
infrastructure needed for EDH.
 Worked on the setup of a Kubernetes sandbox on Azure Kubernetes Service (AKS) for testing different features.
 Used AWS Identity and Access Management (IAM) roles and policies to enforce secure and least-privilege access.
 Experienced in provisioning, configuring, and troubleshooting various AWS cloud services: EC2, S3, RDS, ELB, EBS, VPC, Route53, Auto Scaling groups, CloudWatch, CloudFront, and IAM.
 Built and deployed scaled machine learning models with AWS SageMaker.
 Collaborated with development teams to implement Azure DevOps security features for secure CI/CD pipelines.
 Integrated Cassandra with CI/CD pipelines for automated testing and deployment of database changes.
 Integrated Databricks with Azure DevOps for version control, continuous integration (CI), and continuous delivery (CD) of
notebooks and jobs.
 Used tools like Terraform, CloudFormation, or AWS CLI to automate EMR cluster creation, configuration, and termination
based on requirements.
 Created Docker images for microservices to run in AWS ECS and configured application load balancers and Auto Scaling groups for high availability of applications in the cloud.
 Implemented a Continuous Delivery framework using Jenkins, Puppet, Maven & Nexus in a Linux environment.
 Implemented Infrastructure as Code (IaC) practices to manage Couchbase infrastructure, ensuring consistency and
repeatability.
 Experienced in designing AWS cloud models for Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
 Used Linux servers based on project requirements, utilizing distributions such as Ubuntu, CentOS, or Red Hat.
 Implemented a Continuous Delivery framework using Jenkins and Maven in multiple environments.
 Utilized Go's ability to easily cross-compile binaries for different platforms, making it convenient for creating tools that run consistently across various operating systems.
 Implemented SSO solutions using Okta to streamline user authentication across various DevOps tools and platforms.
 Used Lambda functions for custom automation tasks within CI/CD workflows, enhancing the overall efficiency of the software delivery pipeline.
 Integrated Azure Log Analytics with Azure VMs to monitor log files, store them, and track metrics.
 Expertise in installing, configuring, troubleshooting, and maintaining the Dynatrace Managed APM tool.
 Integrated Harness with infrastructure provisioning tools like Terraform or Ansible for automated environment setup and
teardown.
 Integrated PowerShell scripts into CI/CD pipelines for automated deployments, tests, and infrastructure updates.
 Used Jenkins pipelines to automate the continuous integration of Angular code, ensuring early identification of integration
issues.
 Integrated Apache Airflow with CI/CD pipelines, facilitating seamless deployment and execution of workflows as part of the
continuous integration process.
 Used Anypoint Platform to provide comprehensive API management capabilities, allowing organizations to design, publish,
secure, and analyze APIs.
 Experienced in working with Docker, Kubernetes, and ECS container services; successfully deployed images in cloud environments for managing applications.
 Integrated Azure DevOps Services for version control, continuous integration, and continuous deployment (CI/CD) pipelines.
 Utilized Okta for managing the complete identity lifecycle, including onboarding, offboarding, and role changes.
 Integrated Azure network entities with Azure Active Directory for secure identity-based access controls.
 Worked on implementing Azure Security Center recommendations for vulnerabilities and security best practices.
 Implemented and maintained CI/CD pipelines for containerized applications on Amazon EKS, utilizing tools such as Jenkins,
GitLab CI/CD, or AWS Code Pipeline.
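
A minimal sketch of deploying one of the YAML CloudFormation templates described above with Python and boto3, assuming credentials are already configured; the stack name, template file, and parameter values are hypothetical:

    import boto3

    cfn = boto3.client("cloudformation")

    # Read a hypothetical CFT (VPC, subnets, S3 bucket) written in YAML.
    with open("web-app-stack.yaml") as f:
        template_body = f.read()

    # Create the stack with illustrative parameters.
    cfn.create_stack(
        StackName="web-app-stack",                 # hypothetical stack name
        TemplateBody=template_body,
        Parameters=[
            {"ParameterKey": "VpcCidr", "ParameterValue": "10.0.0.0/16"},
            {"ParameterKey": "Environment", "ParameterValue": "dev"},
        ],
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )

    # Block until stack creation finishes.
    cfn.get_waiter("stack_create_complete").wait(StackName="web-app-stack")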
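
And a minimal sketch of an S3-triggered Lambda handler of the kind referenced above; the processing logic is illustrative only:

    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        """Invoked by an S3 event notification; inspects each new object."""
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            head = s3.head_object(Bucket=bucket, Key=key)
            print(f"New object s3://{bucket}/{key} ({head['ContentLength']} bytes)")
        return {"statusCode": 200, "body": json.dumps("processed")}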
Environment: AWS, EC2, S3, IAM, CloudFormation, CloudWatch, SNS, Jenkins, GIT, Ansible, Kubernetes, Microservices, .NET, Nexus, Docker, Apache Webserver, KVM, Windows, Solaris, Tomcat, Red Hat Linux, Apache, RESTful, Java, Python, Shell, Agile, SQL Server.
Client: Ahead, Chicago, IL Jun 2019 – Aug 2021
Role: Sr. DevOps/AWS Engineer

Responsibilities:

 Experience in creating and managing user and group accounts, passwords, permissions, logging, disk space usage, and processes via disk quotas, PAM limits, ACLs, and LDAP.
 Configuration and administration of DNS, Web, Mail, DHCP and FTP Servers. Managed users and groups on a large scale in
NIS and LDAP environments.
 Used Git platforms (e.g., GitLab, GitHub, Bitbucket), managing repositories, users, and access controls.
 Integrated Databricks with MLflow for end-to-end machine learning lifecycle management, including experimentation,
deployment, and monitoring.
 Used Datadog and Nagios to maintain server uptime along with API monitoring.
 Integrated AWS SageMaker hyperparameter tuning into the CI/CD pipeline to automatically find optimal hyperparameter values for models.
 Installed, configured, and integrated Red Hat CloudForms, Ansible Tower, and a PostgreSQL database for automation within the PaaS cloud.
 Implemented Cypher queries to manipulate data in the Neo4j database (a minimal query sketch follows this list).
 Created indexes and installed forwarders for new log analytics using Splunk.
 Experienced with popular programming, scripting, and markup languages like Python, Shell, C, C#, etc.
 Implemented CI/CD pipelines for Databricks notebooks, automating testing, deployment, and versioning to streamline the
development process.
 Implemented Jenkins pipelines into Azure Pipelines to drive all microservice builds out to the Docker registry, then deployed them to Kubernetes; created pods and managed them using AKS.
 Integrated Spark with CI/CD pipelines using tools like Jenkins, Airflow, or Livy for automated data processing on-demand.
 Integrated Angular applications with cloud services (e.g., AWS, Azure) for hosting, storage, and other cloud-related
functionalities.
 Implemented error handling and recovery mechanisms in MuleSoft applications.
 Designed and customized dashboards in Datadog to visualize key performance metrics and KPIs for easy monitoring.
 Integrated Data Lake with data catalog solutions to enhance metadata management, discoverability, and data lineage.
 Implemented auditing mechanisms for Azure Data Factory pipelines to track changes, monitor execution history, and
comply with data governance standards.
 Developed AWS CloudFormation templates for infrastructure as code (IaC), enabling repeatable and version-controlled
deployments.
 Implemented an Elastic Stack composed of Elasticsearch, Kibana, Logstash, and APM Server on a Kubernetes environment using Helm to monitor and perform log analysis on the production environment.
 Designed and managed clusters/workloads running on self-managed Kubernetes, Amazon EKS, Amazon ECS, and AWS Fargate.
 Used CI/CD pipeline configurations in popular formats such as YAML using VSCode.
 Expertise in deploying Couchbase, Tomcat, and Elasticsearch clusters using Docker.
 Designed integration flows in Mule ESB using visual tools, allowing developers to create and modify integrations through a graphical interface.
 Integrated OIDC authentication into DevOps tools and systems; most modern DevOps tools, such as Jenkins, GitLab, or Kubernetes, have OIDC integration capabilities.
 Utilized Step Functions' conditional branching capabilities to create dynamic workflows that adapt based on the outcomes
of specific tasks or conditions.
 Implemented security configurations for Couchbase, including authentication, authorization, and encryption of data in
transit and at rest.
 Used Datadog monitoring metrics at build time using Terraform.
 Implemented automated A/B testing for different model versions using AWS SageMaker, allowing for controlled experimentation with new models.
 Modified web services using C# to interact with other applications and exposed them using SOAP and HTTP.
 Worked on Anypoint Platform which provides governance and control features to ensure compliance with organizational
policies, security standards, and regulatory requirements.
 Developed scripts for build, deployment, maintenance and related tasks using Jenkins, Docker, Maven, Python and Bash.
 Utilized Okta SDKs to integrate Okta functionality directly into custom DevOps applications.
 Evaluated Chef Recipes with the concept of Test-Driven Development for Infrastructure as a Code.
 Implemented AWS Step Functions to automate and coordinate workflows involving multiple AWS services, such as Lambda functions, AWS Batch, or ECS (Elastic Container Service).
 Implemented network-related tasks and tools using Golang, such as developing networking utilities, monitoring network traffic, or interacting with network devices.
 Implemented microservices application development and migration using AWS/Azure services such as Azure DevOps, Azure Kubernetes Service (AKS), Container Registry, Cosmos DB, Grafana, Azure Pipelines, Azure Monitor, AWS EKS, and the Kubernetes API to run workloads on EKS clusters.
 Created Neo4j graph database nodes and relationships.
 Experienced with PaaS/IaaS development using AngularJS, Docker, and Ansible.
 Leveraged tools like Docker or Kubernetes to containerize Snowflake tasks for portability and scalability.
 Created documentation for XML configurations, detailing the structure, elements, and purpose of each configuration file.
 Integrated Okta APIs with custom scripts and automation tools to extend identity and access management capabilities.
 Designed an ELK (Elasticsearch, Logstash, Kibana) system to monitor and search enterprise alerts and configured the ELK stack in conjunction with AWS, using Logstash to output data to AWS S3.
 Implemented rapid provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts.
 Worked on sandbox functionality, uploading patterns and quality policies in Veracode.
 Implemented automated testing for MuleSoft applications to ensure the reliability of integration flows.
 Used lambda functions in functional programming paradigms to facilitate operations such as mapping, filtering, and
reducing data sets.
 Integrated AWS services such as Amazon RDS, S3, and IAM with EKS to enhance the functionality and capabilities of Kubernetes workloads.
 Integrated AWS Step Functions with other AWS services such as S3, DynamoDB, or SNS to build end-to-end solutions for
various use cases.
 Authored Azure Resource Manager (ARM) templates for Infrastructure as Code (IaC) deployments, ensuring consistent and
repeatable infrastructure setups.
 Automated the provisioning of development, testing, and production environments for Web Apps using Infrastructure as
Code (IaC) tools.
 Created automation scripts (e.g., PowerShell, Azure CLI) to automate the deployment and configuration of EDH components
on Azure servers.
 Experienced in container-based environments using Docker and Amazon ECS.
 Result-driven consultant with good experience in the area of UNIX/Linux system administration.
 Utilized DevOps methodologies and best practices to create infrastructure automation and continuous delivery.
 Integrated applications with advanced monitoring tools such as Prometheus, Datadog, ELK, etc.
 Used C# .NET to develop code-behind business logic.
 Implemented cluster services using Docker and Azure Kubernetes Service (AKS) to manage local deployments in Kubernetes by building a self-hosted Kubernetes cluster using Jenkins CI/CD pipelines.
 Experience with Neo4j graph databases to design graph-structured applications.
 Implemented automated capacity planning to ensure Couchbase clusters can handle growing data volumes and user loads.
 Integrated Lambda functions into CI/CD pipelines to automate code deployment, testing, and other DevOps processes.
 Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation.
 Working with AWS services such as EC2, VPC, RDS, CloudWatch, CloudFront, Route53, etc.
 Integrated XML processing tasks into build tools (e.g., Jenkins, Azure DevOps) to automate build and deployment workflows.
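
A minimal sketch of running a Cypher query against Neo4j from Python (neo4j 5.x driver), as mentioned above; the connection URI, credentials, and node label are hypothetical:

    from neo4j import GraphDatabase

    # Hypothetical connection details for a local Neo4j instance.
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def upsert_person(tx, name, team):
        # MERGE creates the node only if it does not already exist.
        tx.run(
            "MERGE (p:Person {name: $name}) SET p.team = $team",
            name=name,
            team=team,
        )

    with driver.session() as session:
        session.execute_write(upsert_person, "Alice", "platform")
    driver.close()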

Environment: Azure, Jenkins, Windows, Kubernetes, TFS, GIT, Gradle, MSBuild, Docker, Docker Swarm, Ansible, SonarQube, Tomcat, Microservices, Java, C++, C#, Shell, Groovy, Agile, SQL Server.

Client: Finitas Pharma, Piscataway, NJ Jan 2017 – May 2019


Role: DevOps Engineer

Responsibilities:

 Worked on Keystone service on OpenStack to manage the users and groups by assigning the role and policies as per project.
 Monitored and fine-tuned system and network performance for server environments running Red Hat Linux, Ubuntu, and Solaris.
 Integrated CloudCheckr, Datadog, and Splunk dashboards with AWS accounts.
 Worked on creating the Docker containers and Docker consoles for managing the application life cycle.
 Worked on Git implementations containing various remote repositories for a single application.
 Integrated Data Lake with analytics platforms (e.g., Databricks, Azure Synapse Analytics) to enable advanced data processing and analysis.
 Integrated MuleSoft applications with monitoring and logging solutions, capturing and analyzing logs, metrics, and events to monitor the health and performance of integration flows.
 Utilized Kanban practices to eliminate distractions and improve team focus on completing tasks efficiently.
 Utilized tools like RabbitMQ Management Plugin, Prometheus, and Grafana to monitor queue health, identify bottlenecks,
and troubleshoot message delivery problems.
 Integrated Couchbase-related tasks into the CI/CD pipeline, including database migrations and schema changes.
 Utilized mocking and stubbing techniques to isolate unit tests from external dependencies, promoting test independence.
 Seeking DevOps opportunities to extend expertise in Continuous Integration & Deployment practices; experience in DevOps, builds, AWS services, and Salesforce in Linux and Windows environments.
 Worked with developers to resolve the git conflicts while merging the feature branches.
 Set up SQL Azure Firewall and created and managed Azure SQL Server databases.
 Created a complete release process doc, which explains all the steps involved in the release process.
 Updated the database tables by running database scripts.
 Integrated Bitbucket with issue tracking tools (e.g., Jira) for seamless collaboration between development and project
management teams.
 Implemented Step Functions to automate resource cleanup tasks, ensuring efficient and cost-effective management of
resources when they are no longer needed.
 Implemented monitoring and logging solutions for EKS clusters, integrating with tools like Prometheus and Grafana to gain
insights into cluster health and performance
 Used Quantum Metric APIs for custom integrations and automation within the DevOps toolchain.
 Implemented CI pipelines using tools like Jenkins or Azure DevOps to automate the build and test processes for .NET Core
applications.
 Designed Directed Acyclic Graphs (DAGs) in Airflow, ensuring streamlined task dependencies and efficient execution of tasks within the workflow (an illustrative DAG sketch follows this list).
 Integrated Splunk REST API for automation, enabling programmatic access and interaction with Splunk features.
 Integrated Web Apps with DevOps tools (e.g., Azure DevOps, Jenkins) for streamlined CI/CD pipelines and release
automation.
 Integrated Airflow with monitoring tools such as Prometheus and Grafana to track and visualize workflow performance and
identify potential bottlenecks.
 Implemented linked ARM template deployments for modular and reusable infrastructure components.
 Documented commonly used Kusto queries, best practices, and troubleshooting tips for knowledge sharing within the team.
 Integrated Okta with DevOps tools such as Jenkins, GitLab, or Bitbucket for centralized identity management.
 Used VMware configurations, procedures, and troubleshooting steps for knowledge sharing within the team.
 Integrated static code analysis tools (e.g., ESLint) into the CI/CD pipeline for enforcing coding standards in React codebases.
 Provisioned the servers (RHEL/Ubuntu) as per the request of the development and operations.
 Configured and maintained Jenkins to implement the CI process and integrated the tool with GIT and Maven to schedule the
builds.
 Used IaC tools like Terraform or ARM templates to provision and manage infrastructure resources required for .NET Core
applications.
 Experience in migration of on-premises RHEL and Windows servers to cloud platforms including AWS EC2 and Azure.
 Setup/Managing Linux Servers on EC2, EBS, ELB, SSL, Security Groups, RDS, and IAM.
 Installation, configuration, and maintenance of Samba, Apache Tomcat, WebSphere, and JBoss servers in AIX and Linux environments.
 Developed command-line interface (CLI) tools in Golang for automating common tasks, facilitating interactions with APIs,
and simplifying workflows in a DevOps environment.
 Developed this application using HTML5, CSS, and Material controls to render HTML, and used TypeScript to code the application logic.
 Used AWS Identity and Access Management (IAM) roles and policies to grant the necessary permissions to Step Functions,
following the principle of least privilege.
 Developed infrastructure as code with Terraform to deploy VMware and AWS infrastructure.
 Designed and created multiple deployment strategies using CI/CD pipelines in Jenkins; installed multiple plugins in Jenkins and configured a proxy to get auto updates.
 Utilized dependency management tools like NuGet to manage and version dependencies for .NET Core projects.
 Extensively worked with Scheduling, deploying, managing container replicas onto a node using Kubernetes and experienced
in creating Kubernetes clusters work with Helm charts running on the same cluster resources.
 Integrated ARM templates into CI pipelines (e.g., Azure DevOps, Jenkins) for automated testing and validation before
deployment.
 Implemented versioning strategies for Web APIs, ensuring backward compatibility and smooth transitions during updates.
 Integrated Web APIs with API gateways to manage and control access, monitor traffic, and enhance security.
 Utilized tools such as MySQL Enterprise Monitor or Prometheus to monitor and analyze MySQL database performance.
 Administered and implemented the CI tools Hudson/Jenkins for automated builds with the help of build tools like Ant, Maven, and Gradle.
 Bootstrapped automation scripting for virtual servers using VMware clusters.
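
A minimal sketch of an Airflow DAG of the kind described above (Airflow 2.x); the DAG id, schedule, and shell commands are illustrative only:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="etl_example",                # hypothetical DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extracting")
        transform = BashOperator(task_id="transform", bash_command="echo transforming")
        load = BashOperator(task_id="load", bash_command="echo loading")

        # Linear dependency chain: extract -> transform -> load.
        extract >> transform >> load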

Environment: VMware, Terraform, Splunk, GIT, AWS, CI/CD (Jenkins), Docker, Maven, Kubernetes, Python, Shell Scripting, JSON, WebSphere, WebLogic, Tomcat, MySQL.

Client: TransUnion, Chicago, IL Feb 2014 – Dec 2016


Role: Linux Administrator

Responsibilities:

 Installed, Configured and Maintained Debian/RedHat Servers at multiple Data Centers.


 Configured Red Hat Kickstart for installing multiple production servers.
 Installation of Red Hat Satellite 6 Server
 Installation, configuration, and administration of DNS, LDAP, NFS, NIS, and Sendmail on Red Hat Linux/Debian servers.
 Installed, Configured and Maintained Active Directory Windows Server 2008
 Migration from SUSE Linux to Ubuntu Linux
 Created Audit Report using aureport
 Configured, Maintained OVM
 Configured, Maintained, Installed OEL
 Developed automation scripting in Python (core) using Puppet to deploy and manage Java applications across Linux servers.
 Installation, configuration, and administration of VMware on IBM Blade servers
 Experience working with production servers at multiple data centers.
 Experience using the BMC BladeLogic Client Automation tool
 Installed Check Point Firewall to secure networks
 Experience in migration of consumer data from one production server to another production server over the network with
the help of Bash and Perl scripting.
 Used Puppet for Monitoring system and automation.
 Installed and configured monitoring tools such as munin and nagios for monitoring the network bandwidth and the hard
drives status.
 Developed and supported the Red Hat Enterprise Linux based infrastructure in the cloud environment.
 Experience in AWS.
 Integrated PowerShell scripts into CI/CD pipelines for automated deployments, tests, and infrastructure updates.
 Configured Azure VMs for Windows Systems
 Managing and scaling complex SaaS for web services
 Experience with using Centrify
 Diagnosed failed disk drives using shell scripting for automated tasks (an illustrative health-check sketch follows this list).
 Logged events from forced crash dumps and troubleshot server issues.
 Troubleshot production servers with the IPMI tool, connecting over SOL.
 Experience with system imaging tools Clonezilla and SystemImager for data center migration.
 Configured yum repository server for installing packages from a centralized server.
 Installed Fuse to mount the keys on every Debian production server for password-less authentication.
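
A minimal sketch of the scripted disk-drive diagnosis mentioned above, shown here in Python for illustration; it assumes smartmontools is installed and the device list is hypothetical:

    import subprocess

    DEVICES = ["/dev/sda", "/dev/sdb"]  # hypothetical device list

    def smart_health(device):
        """Return True if smartctl reports the drive's overall health as PASSED."""
        result = subprocess.run(
            ["smartctl", "-H", device],
            capture_output=True,
            text=True,
        )
        return "PASSED" in result.stdout

    for dev in DEVICES:
        status = "OK" if smart_health(dev) else "CHECK DRIVE"
        print(f"{dev}: {status}")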
