Pavan Aluri - Profile
Email: aluri888@gmail.com
Summary of Experience:
Working as a Test Lead in Infosys Limited, Bangalore from May’2019 till date.
Worked as a Test Lead in Tech Mahindra Limited, Bangalore from May’2016 to May’2019.
Worked as a Sr. Software Engineer in Mphasis Limited, Bangalore from April’2011 to May’2016.
Worked as a Software Engineer in Infotrack-Systems, Hyderabad from Feb’2009 to March’2011.
Qualification:
Achievements:
Skills:
Skills        Tools
BI Tools      Informatica 9.6/8.6, Talend, Ab Initio and Tableau
Databases     Oracle, Teradata, Netezza, MS SQL Server, MySQL
Technical Profile:
Project#1:
Description:
The TRM M&T Program focuses on the Visa technology stack. We have identified 30+ programs
supported by 2-3 platforms/systems each. Extracting, Transforming and Loading (ETL) data is
considered within the scope of this project. Also within scope is the normalization of fields
from all the systems so that reporting can be generated across multiple systems in a consistent
manner. Helper functions and data arrangements to support normalization and standardization
are considered within scope.
Responsibilities:
Project#2:
Description:
Optus is an Australian leader in integrated communications and a wholly owned
subsidiary of SingTel, Asia’s leading communications group. With over 130 years of operating
experience, SingTel drives the Group’s efforts to lead in communication services throughout Asia
and Africa, as well as to front the local digital and enterprise markets.
As the Australian arm of SingTel, Optus serves over ten million customers each day. It provides
a broad range of communications services, including mobile, telephony, business network
services, internet and satellite services, and subscription television.
Customer Master Data Management holds the entire customer contact information,
preferences and subscriptions from all the Optus source systems. Based on the match and
merge rules, it merges the contact information, generates the Party ID and sends the
information to downstream systems.
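The match-and-merge flow described above can be sketched as follows. This is a simplified, hypothetical illustration only: the matching rule (normalized email), the `PARTY-` ID format and all record fields are assumptions for the example, not the actual Optus MDM rules.

```python
# Hypothetical sketch of a match-and-merge step: contacts that match on a
# normalized email are merged into one group, and each merged group is
# assigned a surrogate Party ID for downstream systems.

def normalize(email):
    """Normalize an email address for matching (trim and lowercase)."""
    return email.strip().lower()

def match_and_merge(contacts):
    """Group contact records by normalized email and assign Party IDs.

    Returns a dict mapping each generated party ID to its merged
    contact records.
    """
    groups = {}
    for contact in contacts:
        key = normalize(contact["email"])
        groups.setdefault(key, []).append(contact)
    # Assign a sequential surrogate Party ID per merged group.
    return {
        f"PARTY-{i:05d}": records
        for i, (_, records) in enumerate(sorted(groups.items()), start=1)
    }

contacts = [
    {"name": "A. Smith", "email": "a.smith@example.com"},
    {"name": "Alice Smith", "email": " A.Smith@Example.com "},
    {"name": "B. Jones", "email": "b.jones@example.com"},
]
merged = match_and_merge(contacts)
# The first two contacts merge on the same normalized email.
```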
Responsibilities:
Validating data in target tables according to business rules.
End to End Testing of Data Flow from Source to Data warehouses and then to Data mart.
Creating SQL queries to validate the data present in the tables after load, based on the
business requirements.
Scheduling the defect meetings and following up with the developers and Business team.
Verifying the fixed bugs and subsequently closing them.
Sending the daily status report to the respective ITPM and team members.
Internal Review of Test Cases, Involved in Test case execution and bug reporting in the
Bug tracking tool.
Coordinating with development team in tracing errors and tracking it till closures with
respect to the test execution done.
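The post-load validation responsibilities above can be illustrated with a minimal sketch. It uses an in-memory SQLite database; the table names (`stg_orders`, `dw_orders`) and columns are invented for the example and do not come from any real project, but the two checks shown (row-count reconciliation and a source-minus-target comparison) are standard ETL validation queries.

```python
import sqlite3

# Minimal sketch of validating target tables after an ETL load,
# assuming a staging (source) table and a warehouse (target) table.
# Table and column names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.5), (3, 7.25);
""")

# Count reconciliation: the target must hold the same number of rows.
src_count = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

# Minus/EXCEPT check: rows present in source but missing in target.
missing = conn.execute("""
    SELECT order_id, amount FROM stg_orders
    EXCEPT
    SELECT order_id, amount FROM dw_orders
""").fetchall()
```

A clean load yields equal counts and an empty `missing` list; any row in `missing` points at data dropped or altered between source and target.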
Project#3:
Description:
External Commissions: AMDOCS ODS provides ordering and billing details for commission
calculations (Apollo); AMDOCS MEC provides rate plan and other configuration details. Internal
Commissions: Performance Centre and Firefly calculate internal commission payments and
send outputs to the SAP payroll system.
The data (structure, format) from AMDOCS (via ODS) is vastly different from the data (structure,
format) available via Legacy applications; existing systems receiving information from Legacy
systems need to be remediated to consume AMDOCS data. AMDOCS data needs to be
re-formatted to be “readable” by Apollo and Performance Centre to ensure commissions can be
calculated.
Responsibilities:
Analyzing and understanding user requirements through the Use Cases and the knowledge
transfer sessions with the client.
Preparing the Test agreement/Test Plan Document.
Review the Test Object Matrix and Test case document.
Well versed in preparing process documents such as Test Plans.
Preparation of Test Scenarios, Test Cases and exporting to ALM.
Internal Review of Test Cases, Involved in Test case execution and bug reporting in the
Bug tracking tool.
Preparing Test Summary and Defect Aging Reports and sharing them with the client test manager.
Coordinating with development team in tracing errors and tracking it till closures with
respect to the test execution done.
Guiding the team by providing timely solutions for various issues faced.
Interacting with customers for clarity on requirements.
Maintain Status Report and Issue log.
Following Defect Life cycle through Quality Center.
Follow up on Internal Test issues with Onshore.
Project#4:
Description:
Card Data Warehouse (CDW) is one of the well-established and reliable data
warehouses in Capital One. The main objective of CDW is to meet the US business analytic needs
while adhering to data architecture guidelines and principles. CDW has a typical data warehouse
architecture: multiple source files as input feeds, a cleansing environment (staging)
and a distribution area (enriched files and load-ready files), together called the DDE (Data
Distribution Environment), for the Capital One standard DQ check, clean file creation and data
aggregation using defined transformation logic; finally, the data is loaded into the Card Data
Warehouse, which resides on a Teradata database.
Responsibilities:
Project #5
Environment: Ab Initio, Unix, DB2, Oracle, Teradata, HP ALM.
Overview:
The Capital One Auto Finance analytical platform process is unable to meet all business needs of
the division and COF enterprise. The Auto Finance Data Warehouse Remodeling project will
resolve existing gaps in the Capital One Auto Finance Analytics process.
The Auto Finance Data Warehouse Data Excellence Program (DEP) consists of three phases. This
solution represents Phase-3 of the project to deliver a re-modeled Auto Finance Data
Warehouse.
The Phase 1 process included unloading the data from files to the SQL database. Currently working on the 2nd
phase. Phase 2 consisted of forklifting all data supporting current business intelligence needs
from the SQL Server analytical environment to the enterprise Teradata environment and converting all
legacy SQL Server DTS packages into DDE for a simplified and efficient data acquisition and
integration environment. This will enable data sharing across divisions and will improve the overall
BI capabilities of COAF. The ADW environment will contain two major components, a Base Layer
and a Presentation Layer. The Base Layer will contain the staging and base tables, and the
Presentation Layer will contain aggregate and summary tables and database views. The
Presentation Layer will be the access layer through which the users and the BI tool will access the
data. SDRs, source-to-target mappings and the Data Model will be the inputs for the testing activity.
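Testing against a source-to-target mapping, as described above, amounts to deriving the expected target row from each source row per the mapping rules and comparing it with what was actually loaded. The sketch below is a hypothetical illustration: the mapping rule (concatenated, upper-cased full name) and all field names are invented for the example; real rules come from the mapping document.

```python
# Hypothetical source-to-target mapping check: derive expected target
# rows from source rows and diff them against the loaded target rows.

def apply_mapping(src_row):
    """Derive the expected target row from a source row per the
    (illustrative) mapping: id passes through, names are concatenated
    and upper-cased."""
    return {
        "cust_id": src_row["id"],
        "full_name": f'{src_row["first"]} {src_row["last"]}'.upper(),
    }

source_rows = [{"id": 1, "first": "Pavan", "last": "Aluri"}]
target_rows = [{"cust_id": 1, "full_name": "PAVAN ALURI"}]

# Any (expected, actual) pair that differs is a mapping defect to log.
mismatches = [
    (expected, actual)
    for expected, actual in zip(map(apply_mapping, source_rows), target_rows)
    if expected != actual
]
```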
Responsibilities:
Project #6:
Overview:
A data repository with generic data structures to hold HC-wide data, addressing all
non-transactional business reporting needs. It supports the design, development and deployment of
HC Global and P&L-specific business reporting solutions (irrespective of the number and type of
ERP/OLTP systems). It supports downstream application design to reduce/eliminate the need for a
parallel stream of redundant data transformation jobs, reduces/eliminates the need for additional
redundant data extract jobs from ERP to support business reporting or non-transactional
application load, and reduces/eliminates the need for redundant databases required to support
semi-operational / non-transactional business applications.
Responsibilities: