
Anil Kumar.

Location: Plano, TX
aniletl888@gmail.com 469-287-7814

Summary: Highly motivated, solutions-driven professional with 12+ years of experience in Data
Warehousing, with strong skills in Analysis, Design, Production Support, Data Modeling,
Mapping, Extraction, Migration, and Development of ETL components and their integration with
external systems using DataStage.
• Extensively worked with IBM InfoSphere Information Server DataStage, and Ascential
DataStage.
• Worked on DataStage tools- DataStage Designer, DataStage Director, and DataStage
Administrator.
• Extensive experience in developing strategies for Extraction, Transformation, Loading
(ETL) data from various sources into Data Warehouse/Data Marts using DataStage Parallel
Extender.
• Experienced in Production and Technical Support activities and managing production
rollouts.
• Experienced in writing Use Case documents, Module Specifications, and Technical
Specifications, and in translating user requirements into functional specifications.
• Excellent knowledge of data dependencies in DataStage; prepared job sequences for
existing jobs to facilitate scheduling of multiple jobs with appropriate trigger
activities.
• Good understanding of data warehousing principles and dimensional modeling, including
Star and Snowflake schemas.
• Implemented multi-node declaration using configuration files (APT_Config_file) for
performance enhancement.
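A multi-node declaration of this kind lives in the parallel-engine configuration file referenced by APT_CONFIG_FILE. The two-node sketch below is illustrative only; the host name and disk paths are assumptions, not taken from any actual project:

```
{
  node "node1" {
    fastname "etlhost"          /* illustrative server host name */
    pools ""
    resource disk "/data/ds/node1" {pools ""}
    resource scratchdisk "/scratch/ds/node1" {pools ""}
  }
  node "node2" {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/node2" {pools ""}
    resource scratchdisk "/scratch/ds/node2" {pools ""}
  }
}
```

Adding nodes in this file increases the degree of partition parallelism for jobs run with that configuration, without any change to the job designs themselves.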
• Experienced in exposing ETL jobs as Web Services (SOA Architecture).
• Extensive experience in Unit Testing, Functional Testing, System Testing, Integration
Testing, Regression Testing, User Acceptance Testing (UAT) and Performance Testing.
• Extensive experience in building data warehouse architecture and Data Cleansing.
• Efficient in integration of various data sources like Teradata, DB2, Oracle, flat files,
XML files, Webservices and MS Access into staging area.
• Developed SCDs to populate Type I, Type II, and Type III slowly changing dimension
tables from several operational source files.
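Of the three, Type II is the variant that preserves history by expiring the current row and inserting a new one. As an illustrative sketch only (not the actual DataStage implementation), that expire-and-insert logic can be shown against an SQLite dimension table; the table and column names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table with SCD Type II housekeeping columns
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )
""")

def apply_scd2(customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed,
    then insert a new current row (SCD Type II)."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] == city:
        return  # no change: nothing to do
    if row:
        # close out the old version of the row
        cur.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (load_date, customer_id))
    # insert the new current version
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date))

apply_scd2(101, "Plano",  "2016-01-01")
apply_scd2(101, "Dallas", "2016-06-01")   # city changed: old row expired

cur.execute("SELECT city, is_current FROM dim_customer "
            "WHERE customer_id=101 ORDER BY eff_date")
print(cur.fetchall())   # [('Plano', 0), ('Dallas', 1)]
```

Type I would instead overwrite `city` in place with no history, and Type III would keep the prior value in an extra column such as `previous_city`.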
• Knowledge on entire life cycle of Data warehouse design, development and
Implementation.
• Experienced with UNIX operating system.
• Experienced in writing complex queries for Reporting.
• Excellent communication skills; quick to learn new technologies; a good team player with
strong team-coordination skills.
• Able to support all SDLC phases, including technical design, coding, testing, and
technical documentation, with hands-on Production Support experience in live environments.
• Worked extensively in migrations of DataStage 7.5.2 to DataStage 8.5, and DataStage
8.5 to DataStage 9.1.
• Extensively worked on the Oracle 10g to Oracle 11g upgrade.
• Participated in migrations from Teradata v13 to Teradata v14, and from Teradata v14 to
Teradata v15.1.
• Extensively worked on migrating DataStage legacy stages to DataStage Connector stages
(Oracle, MQ, Teradata, DB2, and ODBC).
• Migrated SSRS 2008 to SSRS 2012.
• Worked on the Autosys migration from R10 to R11.
• Migrated Business Objects from version XI R3.0 to 4.0.
• Multilingual, highly organized, detail-oriented professional with strong technical skills.
• Received multiple awards for best performance on the team and for successful project
launches, including Best Team Player recognition.

SOFTWARE SKILLS:

ETL Tools : IBM InfoSphere Information Server DataStage v9.1, v8.5, and v8.1,
DataStage Enterprise Edition v7.5, Ascential DataStage 7.1, QualityStage.
RDBMS : Oracle 11g/10g/9i/8i, Teradata v14.0/13.0, SQL Server 2012/2005,
DB2, and MS Access.
Operating Systems : UNIX (AIX 5.2), Solaris 10, and Windows NT/2000/XP/Vista.
Languages : SQL & PL/SQL, HTML, XML, UNIX shell scripts.
Scheduling Tools : Autosys R11, Control-M Scheduler, Orchestrator, Maestro 8.3.
Other Tools : Citrix Server 6p/5p, AccuRev 6.0, PVCS, Beyond Compare 3,
Notepad++, HOD.
Reporting Tools : Business Objects XI 3.1.

PROFESSIONAL EXPERIENCE:

Client: FORD, NA
Duration: Aug '16 to Present
Role: Technical Lead (Development and Production Support)

Responsibilities:
• Monitor both the Sales and LMR application processes (Daily, Weekly, Monthly, and
Yearly) running in the Production environment.
• Identify and resolve SMW issues quickly and migrate fixes to Production.
• Work with the Business to resolve data issues.
• Plan and assign development work to the offshore team.
• Run and participate in meetings with the Business about new implementations.
• Design new data models and the architecture of data warehouses and data marts.
• Design, develop, modify, and debug data models and databases across projects.
• Analyze, design, develop, and implement ETL jobs using stages such as Flat File,
Transformer, Lookup, Funnel, Join, Copy, Aggregator, ODBC Connector, and Oracle Connector,
and conduct unit testing.
• Test the designed code in upper lanes, verify the data, and ready the code for
deployment.
• Develop Use Case documents, Module Specifications, and Technical Specifications,
translating user requirements into functional specifications; review with users and obtain sign-off.
• Coordinate with onsite and offshore ETL and Business teams.
• Create and test SFTP connections with other servers and deliver the required files to
them.
• Manage the offshore and onshore teams and assign work.
• Work with offshore and onshore teams to create complex designs and improve job
performance.
• Create the required Nexus requests to move code from lower lanes to upper lanes.
• Coordinate with COE teams to find and resolve issues.
• Create BMC change requests to capture both small code changes and major new
implementations.
• Inspect, validate, and compare data from the source file systems against the data loaded
into tables before generating reports.
• Monitor database performance, recommend performance-tuning activities, and resolve
complex issues.

Environment: IBM InfoSphere Information Server DataStage v9.1, Oracle 11g, SQL Server
2012, DB2, UNIX, TFS, Microsoft Visio, BMC Remedy, Autosys R11, Notepad++, and
WinSCP.

Client: FORD, NA
Project: SNURK 2.0 (Agile Framework)
Duration: July '14 to Aug '16
Role: Technical Lead/DW Architect

Description: SNURK is an Agile-model system: an easy-to-use, web-based tool designed to
provide the Plant Controller's Office with a set of common core reports that can be used by all
plants globally, with the most up-to-date information. It collects information (Timekeeping
(Hourly and Salary) and Industrial Materials) from a suite of applications around the globe
(SA, NA, EU, and APA regions), converts it into the required format, and loads it into the
SNURK warehouse. The system hosts the data in a single DWH, eliminating the need to retrieve
data from a variety of different systems. It is used to increase positive cash flow and as a
tool for GAO auditors to audit plants.

Responsibilities:

• Follow an agile framework process to progress all plant projects on a daily basis.
• Plan work in each weekly iteration and assign it to the developers, BAs, and QC testers.
• Run iteration-planning meetings along with Plant Controllers and product-line supervisors.
• Design new data models and the architecture of data warehouses and data marts.
• Design, develop, modify, and debug data models and databases in the project.
• Analyze, design, develop, and implement ETL jobs using stages such as Flat File,
Transformer, Lookup, Funnel, Join, Copy, Aggregator, Teradata Connector, and Oracle
Connector, and conduct unit testing.
• Worked on SCDs to populate Type I, Type II, and Type III slowly changing dimension
tables from several operational source files.
• Develop Use Case documents, Module Specifications, and Technical Specifications,
translating user requirements into functional specifications; review with users and obtain sign-off.
• Coordinate with Plant Controllers to resolve data issues.
• Coordinate with the onsite and offshore ETL teams.
• Identify and resolve SMW issues quickly and migrate the code to Production.
• Coordinate with the QC testing team in preparing test cases and performing UAT for the
various plant project modules.
• Implement and manage the data warehouse and related data marts.
• Design and develop complex ETL jobs in DataStage using stages such as Sequential File,
Transformer, Lookup, Aggregator, MQ Series, Teradata Connector, CDC, Join, Funnel, Merge,
and Oracle Connector.
• Inspect, validate, and compare data from the source file systems against the data loaded
into tables before generating reports.
• Monitor database performance, recommend performance-tuning activities, and resolve
complex issues.
• Redesign the database based on the updated data models.
• Perform Unit Testing, Functional Testing, System Testing, Integration Testing,
Regression Testing, User Acceptance Testing (UAT), and Performance Testing.
• Capture SIT and UAT results, generate BO reports, review with AM and Business teams,
and obtain sign-offs.

Environment: IBM InfoSphere Information Server DataStage v9.1 and v8.5, Oracle 11g,
Teradata v14.0, UNIX, Windows XP, Microsoft Visio, Beyond Compare 3, AccuRev 6.0,
SQL Server 2012, Autosys R11, RallyOne, Notepad++, and HOD.

Client: FORD, NA
Project: FST Tech Refresh (DataStage v8.5, Teradata v14.0, Oracle 11g, DataStage
Connectors, DataStage v9.1, and Teradata v15.10)
Duration: February '11 to July '14
Role: Technical Lead

Description: FST (Finance System Tools) is a key service-marketing tool of FCSD NA and
Europe that provides the platform for multiple CRM initiatives. It extracts the Customer,
Vehicle, and Financial information, transforms it by different standard business rules, and
loads it into the data warehouse.

Responsibilities:

• Migrated the ETL code from the older version to the newer version environment and
compiled all project jobs.
• Migrated and converted the Oracle, Teradata, and MQ Series stages into Connector stages
using the migration tool.
• Helped prepare the environment (DataStage, UNIX, Teradata, Oracle, WAS, Autosys) for
the newer version.
• Designed and developed sample ETL jobs in DataStage using stages such as Sequential
File, Transformer, Lookup, Funnel, Copy, Aggregator, MQ Series, Teradata, and Oracle
Enterprise, and conducted testing.
• Recorded and tuned performance by comparing the processes across both versions.
• Generated System Testing and User Acceptance Testing documents for all project modules
and reviewed them with AD, AM, and Business teams.
• Implemented the new features of the newer version in the existing jobs to reduce
complexity and increase performance.
• Coordinated with onsite and offshore ETL teams.
• Coordinated with the QC testing team in preparing test cases and performing testing for
all project modules.
• Captured SIT and UAT results, generated BO reports, reviewed them with AD, AM, and
Business teams, and obtained their sign-offs.
• Documented all issues and resolutions encountered while testing the newer version and
shared them with the other teams.
• Helped migrate the newer-version code to the Production environment and coordinated
with the Production team in resolving issues.

Environment: IBM InfoSphere Information Server DataStage v9.1, IBM InfoSphere
Information Server DataStage v8.1/v8.5, DataStage Enterprise Edition v7.5, UNIX, Oracle
10g/11g, Teradata v13/v14, PVCS, Autosys R11, and Business Objects XI 3.1.
Client: FORD, NA
Project: F@ST/FST
Duration: January '09 to January '11
Role: Technical Analyst

Description: F@ST! (Finance at Speed of Thought) is a key service-marketing tool of FCSD-NA
that provides the platform for multiple CRM initiatives and enables analysis of sales potential
and performance. It extracts the Customer, Vehicle, and Financial information, transforms it by
different standard business rules, and loads it into the data warehouse; rejected data is loaded
into rejection tables for analysis.

Responsibilities:

• Designed Project modules using DataStage parallel Jobs- Developed jobs using different
types of stages like Sequential file, Transformer, Lookup, Funnel, Copy, Aggregator, Stored
Procedure and Oracle enterprise stages.
• Developed parallel jobs using various development/debug stages (Peek, Head & Tail,
Row Generator, Column Generator, and Sample stages).
• Generated HLD, LLD, Source to Target mapping, test cases and miscellaneous
documents.
• Prepared test cases for Unit testing and coordinated the review of the same by Business
Analysts.
• Developed SCDs to populate Type I,Type II, and Type III slowly changing dimension
tables from several operational source files.
• Travelled to Germany and worked with Business users and Plant Supervisors.
• Created some routines (Before-After, Transform function) used across the project.
• Wrote assembly test cases and prepared a test-data document to aid the testing
team.
• Involved in testing in various phases like Unit Testing, SIT and UAT.
• Involved in migrating the DataStage jobs using DataStage to System test and Production
environments.
• Coordinated with On-site, and Off-shore ETL teams.

Environment: DataStage Enterprise Edition v7.5, Oracle 9i, Teradata v13.0, UNIX, Microsoft
Visio, Test Director 8.0, PVCS, and Windows XP.

Client: FORD, NA


Project: CKS-Rearchitecture
Duration: April '08 to December '08
Role: Specialty Developer
Description: CKS-A (Customer Knowledge System - Analytical - Extract/Transform/Load) unloads
the Customer, Vehicle, and Financial information stored by CKS-I and transforms it by different
business rules; records that satisfy these rules are loaded into the Teradata warehouse, and
the rest are loaded into the rejection tables. Business users access the data through Business
Objects reports and Campaign Management.

Responsibilities:

• Designed ETL processes using DataStage parallel jobs; developed jobs using stages such
as Sequential File, Transformer, Lookup, Sort, Remove Duplicates, Pivot, Join, Change Capture,
Funnel, Copy, Aggregator, Stored Procedure, Data Set, and Oracle Enterprise.
• Worked on SCD (Type I, Type II, and Type III) by using several operational source files.
• Designed several jobs to extract, transform and load data from flat files into the target
using DataStage.
• Generated Source to Target mapping, Unit test cases and miscellaneous documents.
• Developed sequencer jobs using Job sequence for controlling the data flow within the
source system.
• Identified source systems, their connectivity, and the related tables and fields, and
ensured data suitability for mapping.
• Handled Dependencies of both Business, and Technical requirements.
• Prepared test cases for Unit testing and coordinated in the review of the same by Business
Analysts.
• Prepared detailed design and program specifications documents based on the Functional
specification document.
• Involved in testing of various phases like Unit Testing, SIT and UAT.
• Helped migrate the DataStage jobs to the Test and Production environments.
• Raised E-Trackers for GIRS issues and followed up with the teams to get them fixed.
• Participated in Peer Review.

Environment: DataStage Enterprise Edition v7.5, Oracle 10g, Teradata v13, Windows XP,
Solaris 10, SQL Server 2005, Team Connect site.

Client: Royal and Sun Alliance, UK


Project: Royal and Sun Alliance Consolidated Reporting
Duration: March '07 to March '08
Role: ETL Developer

Description: Royal and Sun Alliance is the second-largest general insurance company in the
UK, with many insurance groups across the world. Each group offers different types of insurance
products, such as Vehicle, Items, Life Insurance, Retirement services, and investment products.
There is no guarantee that reports from these groups use the same definitions for elements
like line of business or premium written. This effort aims to achieve consensus on these
definitions so that management can compare the performance of all insurance groups.

Responsibilities:

• Created many ETL jobs using DataStage parallel jobs; developed jobs using stages such
as Data Set, Transformer, Lookup, Join, Merge, Aggregator, and DB2 using DataStage Enterprise
Edition.
• Handled dependencies (Business and Technical); involved in programming iLOG, which
maintains the dependencies among Dimensions and Facts.
• Reconciliation – Created reconciliation jobs to check the amounts.
• Prepared Test cases for Unit Testing.
• Prepared detailed design and program-specification documents based on the high-level
design document, Functional Specification document, and BRD.
• Involved in testing of various phases, like Unit Testing, SIT, and UAT.
• Helped the Business Users resolve their queries and issues.
• Involved in the Data Synchronization exercise.
• Involved in preparing the database scripts for newly added systems and countries.
• Involved in Migrating the DataStage jobs from old version to new version.
• Involved in exporting the DataStage jobs using DataStage Manager.

Environment: DataStage 7.5 PX, Oracle 9i/8i, UNIX, Windows NT.

Client: Charter One Bank, Chicago


Project: Charter One Bank Management System
Duration: December '06 to February '07
Role: ETL Developer

Description: This project involved the development of a Data Warehouse with four kinds of Data
Marts: Accounts, Loans, Credit Cards, and Insurance. Each Data Mart represents a collection
of data pertaining to a single business process. The company requires different levels of
analysis regarding loan amount, type of loans, type of customers, type of payment schedules,
and interest rate. The Data Warehouse captures data from their transactional database,
maintained under a Client/Server architecture, for Data Extraction, Transformation, and Loading.

Responsibilities:

• Used DataStage Designer to develop processes for extracting, cleansing, transforming,
integrating, and loading data into the Oracle Data Warehouse.
• Used OCI, Aggregator, Transformer, and Hashed File stages, and functions like
ICONV/OCONV, while loading data into the targets.
• Validated and scheduled the jobs using DataStage Director.
• Created Various Local Containers and Shared Containers.
• Worked with Link Partitioner and Link Collector to improve performance of jobs.
• Used Job Sequencer for job Automation.

Environment: DataStage 7.5, Windows 2000/XP, Oracle 8i, Test Director 8.0

Client: Albertson's, USA


Project: Albertson's Sales Mart
Duration: April '04 to November '06
Role: ETL Developer

Description: Albertson's Inc. operates supermarkets throughout many US states and offers
online grocery shopping. Albertson's discount schemes offer two distinct offers, ''Multi-Buys''
and ''Linked Items''. These details are sent to the purchase-order system at store level. When
the items within the offers are sold at stores, the POS sends item sales data, and daily item
sales get aggregated into weekly item sales. The offer records are kept separately, and the
system currently does not apportion the discount amounts to the items sold within offers.
Albertson's wants system capabilities to capture accurate discounted item sales data to
calculate true profitability, as the current data is distorted and overstates the sales figures.

Responsibilities:

• Designed and developed ETL modules using DataStage Server jobs; developed jobs using
stages such as Sequential File, Transformer, Hashed File, Link Collector, and DB2.
• Designed several jobs to extract, transform and load data from flat files into data
warehouse using DataStage Designer.
• Developed sequence jobs using Job sequence for controlling the jobs.
• Prepared test cases for Unit testing.
• Prepared detailed design document based on the Functional specification document and
BRD.
• Supported the system in Production runs.
• Resolved numerous runtime errors during Daily, Weekly, and Monthly runs.
• Involved in testing of various phases like Unit Testing, SIT and UAT.
• Involved in Migrating the DataStage jobs from old version to new version.
• Involved in exporting the DataStage jobs using DataStage Manager.

Environment: DataStage (Server Edition), Oracle 8i and Windows 2000 server.

EDUCATION: Master's in Engineering from S.V. University, India.
