Senior ETL / Informatica Developer

Professional Summary
• Over 7 years of IT experience with emphasis on business requirements analysis,
application design, data modeling, development, implementation, testing and project
coordination of applications.
• Experience in development and design of ETL methodology for supporting data
transformations using Informatica Power Center 6.x/7.x/8.x.
• Highly skilled in Planning, Designing, developing and deploying Data Warehouses / Data
Marts.
• Extensively used ETL methodology for supporting data extraction, transformation and
loading processing, using Informatica (Power Mart / Power Center - Meta Data
Repository, Designer, Server Manager).
• Extensively used various active Transformations like Filter Transformation, Router
Transformation, Joiner Transformation and Aggregator Transformation.
• Worked with various Informatica client tools like Source Analyzer, Warehouse Designer,
Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository
Manager and Workflow Manager.
• Hands on experience in handling the Informatica Power Exchange to capture the near real
time change data capture(CDC) data.
• Experienced with different classes of ODS file systems and extracted the files using ETL
tool Informatica PowerCenter.
• Experience in relational database development using PL/SQL and SQL in various relational
databases like Oracle, SQL Server and DB2.
• Experienced in the HealthCare Insurance, Financial and Pharmaceutical sectors using
ETL methodology.
• Expertise in creating mappings, Mapplets and reusable transformations using
Informatica.
• Experienced in analyzing business requirements and translating requirements into
functional and technical design specifications.
• Designed and developed Data Marts by following Star Schema and Snowflake Schema
Methodology, using Data Modeling tool Erwin.
• Experienced in installing and maintaining PowerCenter/Power Exchange CDC on UNIX, for
Oracle 10g RAC.
• Good understanding of relational database management systems and extensively worked
on data extraction, transformation and loading with Oracle, DB2, SQL Server 2000 using
Informatica.
• Extensively used various Functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, Decode,
Substr, Instr and IIF function.
• Experienced in migrating Informatica mappings from Oracle to DB2 and other
relational databases.
• Hands on experience in handling large volumes of data in production environments.
• Worked with different data sources like Flat Files, XML files and Databases.
• Hands on experience in writing, testing and implementation of triggers, stored
procedures, functions and packages at database level and form level using SQL, PL/SQL.

• Design Source to Target maps, Code Migration, Version control, scheduling tools, Auditing,
shared folders, data movement, naming in accordance with ETL Best Practices,
Standards and Procedures.
• Coordination with the off-shore development team to fix release issues.
• SCD Management including Type 1, 2, 3, Hybrid Type 3, De-normalization, Cleansing,
Conversion, Aggregation, Performance Optimization, Audit, etc.
• Coordination in preparing project plans with project manager and development team to
make sure project plans are correct.
• Excellent aptitude to understand and meet client needs.
• Excellent communication, presentation, project management skills and a very good team
player and self-starter with ability to work independently and as part of a team.

TECHNICAL SKILLS:
RDBMS: Oracle 10g/9i/8i, DB2, MS SQL Server, MS Access 9.0/7.0, FoxPro
ETL Tools: Informatica (Power Center 8.5.1/8.1.1/7.1.4/6.2, Power Exchange 8.1.1, Power Mart 7.1/6.2), SQL*Loader
BI Tools: Business Objects, Web Intelligence, Cognos 7.0, MicroStrategy
Operating Systems: UNIX (Sun Solaris, HP-UX), Windows NT/98/95, MS DOS
Data Modeling: Erwin 3.4/3.5.5/3.5
Languages: SQL, PL/SQL, shell scripts, Perl, HTML, DHTML, C, C++, COBOL, VB5
Scheduler: Autosys, Cronacle, Maestro, JCL (Mainframes)
Version Control: WinCVS, VSS
Other Tools: Toad 7.6/8.5, PL/SQL Developer, Mercury QC 9.1, MS-Office, MS Project Plan

Professional Experience
Standard & Poor, NY City, NY 
Dec ’09 – Present
Sr. ETL/Informatica Developer
S&P is a diversified financial service company that provides a broad range of banking, asset
management, wealth management, and corporate and investment banking products and
services. Corporate Repository or Core is the primary data source from which data is fetched
to load into various databases that support various products. The ratings repository
data warehouse is the target from which various reports are generated. The Data Services
department deals with the extraction, transformation and loading of data from the Core
(Corporate Repository) to the RRDW (Ratings Reporting Data Warehouse) and maintains all
the data involved in the process.

Responsibilities:
• Developed complex Informatica mappings to load the data from various sources using
different transformations like source qualifier, connected and unconnected look up,
expression, aggregator, joiner, filter, normalizer, rank and router transformations.
• Worked with Informatica PowerCenter tools like Source Analyzer, Mapping Designer,
Mapplet Designer and transformations.
• Developed Informatica mappings and tuned them for better performance.

• Worked with MLOAD, FASTLOAD, TPUMP and BTEQ utilities of Teradata for faster
loading and to improve the performance.
• Extensively used Informatica to load data from flat files, Oracle, Teradata database.
• Teradata views were developed against the Departmental database and claims
engine database to get the required data.
• Extensively used various Functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE,
Decode, Substr, Instr and IIF function.
• Responsible for Performance Tuning at the Mapping Level and Session level.
• Worked with SQL Override in the Source Qualifier and Lookup transformation.
• Extensively worked with both Connected and Unconnected Lookup Transformations.
• These views are built through the Harvest Change Manager tool in the desired schema in
the Teradata Warehouse and used as one of the sources for Informatica.
• Load balancing of ETL processes, database performance tuning and capacity monitoring.
• Involved in Unit testing and System testing of the individual mappings.
• Analyzed existing system and developed business documentation on changes required.
• Used UNIX to create Parameter files and for real time applications.
• Developed shell scripts.
• Extensively involved in testing the system from beginning to end to ensure the quality of
the adjustments made to accommodate the source system upgrade.
• Worked with many existing Informatica mappings to produce correct output.
• Prepared detailed design documentation for the production support department to use
as a guide for future production runs before code was migrated.
• Prepared a Unit Test plan and efficient unit test documentation along with unit test
cases for the developed code.
• Created detailed system defect records to keep the project team informed of status
throughout the process.
• Extensively worked with various lookup caches like Static Cache, Dynamic Cache and
Persistent Cache.
• Used Update Strategy DD_INSERT, DD_UPDATE to insert and update data for
implementing the Slowly Changing Dimension Logic.
• Developed Re-Usable Transformations and Re-Usable Mapplets.
• Developed Slowly Changing Dimensions Mapping for Type 1 SCD and Type 2 SCD.
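Parameter files like the ones referenced above are plain key/value text generated per run; a minimal sketch of such a generator, where the folder, workflow and parameter names are hypothetical examples rather than the actual project names:

```shell
#!/bin/sh
# Generate a per-run Informatica parameter file.
# Folder, workflow and parameter names below are hypothetical examples.
RUN_DATE=${RUN_DATE:-20240101}               # business date for this run
PARAM_FILE="wf_load_rrdw_${RUN_DATE}.par"

cat > "$PARAM_FILE" <<EOF
[FOLDER_RRDW.WF:wf_load_rrdw]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SCHEMA=CORE
\$\$TGT_SCHEMA=RRDW
EOF

echo "wrote $PARAM_FILE"
```

A scheduler can run this just before the workflow so every session picks up the correct business date through its `$$` mapping parameters.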
Environment: Informatica Power Center 8.6.1/8.1.3, Teradata V2R12/V2R6, Oracle 10g,
Teradata SQL Assistant and Administrator, IMS Data, XML, LINUX, UNIX Shell Scripting,
Harvest Change Manager, Rational Clear Quest, Windows, Autosys

T-Mobile, Frisco, TX
Oct’08 – Nov’09
Sr.ETL/Informatica Developer
 T-Mobile is a wireless service provider that operates several GSM networks in the United
States. The financial data warehouse represents the balances and transaction data
information extracted and integrated from various source systems. The objective of Financial
Data Warehouse is to represent a financial reporting infrastructure that centralizes access to
information, data, applications, reporting and analytical systems for the required financial
content dealing with different products and services.

Responsibilities:
• Interacted with the Business Users to analyze the Business Requirements and transform
the business requirements into the technical requirements.

• Extracted data from various heterogeneous sources like Oracle, Teradata and Flat Files.
• Worked on PowerCenter Tools like Designer, Workflow Manager, Workflow Monitor and
Repository Manager.
• Worked on Designer Tools like Source Analyzer, Warehouse Designer, Transformation
Developer, Mapplet Designer and Mapping Designer.
• Extensively used various active Transformations like Filter Transformation, Router
Transformation, Joiner Transformation and Aggregator Transformation.
• Extensively worked with various passive Transformations like Expression
Transformation and Lookup Transformation.
• Responsible for Data Cleansing and Data Validation using Expression Transformation and
Lookup Transformation.
• Analyzed business process workflows and developed ETL procedures to move data from
various source systems to target systems.
• Developed database Schemas like Star schema and Snowflake schema used in relational,
dimensional and multidimensional data modeling using ERWIN.
• Worked with various Informatica client tools like Source Analyzer, Warehouse
designer, Mapping designer, Mapplet Designer, Transformation Developer,
Informatica Repository Manager and Workflow Manager.
• Involved in development of Informatica Mappings using transformations like Source
qualifier, Aggregator, Connected & unconnected Lookups, Filter, Update
Strategy, Rank, Stored Procedure, Expression and Sequence Generator and
Reusable transformations.
• Created PL/SQL procedures to transform data from staging to Data Warehouse Fact and
summary tables.
• Extensively used Stored Procedures, Functions and Packages using PL/SQL for creating
Stored Procedure Transformations.
• Developed, Modified and Tested UNIX Shell scripts and necessary Test Plans to ensure the
successful execution of the data loading process.
• Created various UNIX Shell scripts for automation of events, File Validation, File Archiving.
• Developed scripts using PMCMD command to run the Workflows, Sessions from UNIX
environment which are called in Autosys jobs.
• Developed batch file to automate the task of executing the different workflows and
sessions associated with the mappings on the development server.
• Involved in Testing, Debugging, Data Validation and Performance Tuning of the ETL
process, and helped develop optimum solutions for data warehouse deliverables.
• Performance tuning of Informatica mappings for large data files by managing the block
sizes, data cache sizes, sequence buffer lengths and commit interval.
• Developed a Business Objects Universe which acts as an interface between the database
and end users.
• Worked on Unit Testing, Integration testing and Regression testing to verify load order,
time window and lookup with full load.
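The PMCMD-driven runs described above are typically wrapped in a small driver script that the Autosys job invokes; a sketch under that assumption, with the domain, service, folder and workflow names all hypothetical placeholders:

```shell
#!/bin/sh
# Driver an Autosys job can invoke to start one PowerCenter workflow.
# Domain/service/folder/workflow names are hypothetical placeholders.
INFA_USER=etl_user
INFA_PWD=secret      # in practice read from a protected file, never hardcoded
DOMAIN=Domain_FDW
INT_SVC=IS_FDW
FOLDER=FDW_LOAD
WORKFLOW=${1:-wf_load_gl_balances}

# -wait makes pmcmd block until the workflow finishes, so this script's
# exit code reflects workflow success/failure and Autosys can act on it.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -u $INFA_USER -p $INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "would run: $CMD"

# Guarded so the sketch is runnable without a PowerCenter installation.
if [ "${DRY_RUN:-1}" -eq 0 ]; then
    $CMD || exit 1
fi
```

Because the workflow name is the first argument, one script can serve every scheduled workflow in the folder.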
Environment: Informatica Power Center 8.6.1, Teradata V2R6, Business Objects XIR2, Erwin
7.2, DB2, Oracle 10g, SQL Server, Flat Files, Transact-SQL, PL/SQL, UNIX, TOAD, AutoSys,
Microsoft VSS.

PNA Insurance, Chicago, IL


May’07 – Aug’08
Sr.ETL/ Informatica Developer
PNA Insurance is a full-service insurance, financial, and auto registration agency that has been
in business since 1988. This is a Claim Data warehouse (CDW) providing Policy Transactions
by credit cards, Premium and Claims Transactions data to the business users. Users want to
measure profit over time by coverage, coverage item type, and geographic, demographic and
sales distribution channels. The system is fully integrated, enabling the entry and sharing of
policy and claims data among all components of the system and capturing data to complete
policy administration, claims administration, policy issuance, billing, reinsurance accounting,
and bureau reporting and management reports.

Responsibilities:
• Involved in gathering business scope and technical requirements and writing
technical specifications.
• Involved in all phases of software development life cycle (SDLC) in RUP framework.
• Fine-tuned existing Informatica maps for performance optimization.
• Developed complex mappings in Informatica to load the data from various sources using
different transformations like Source Qualifier, Look up (connected and unconnected),
Expression, Aggregate, Update Strategy, Sequence Generator, Joiner, Filter, Rank and
Router transformations.
• Involved in massive data cleansing and data profiling of the production data load.
• Created synonyms for copies of time dimensions, used the sequence generator
transformation type to create sequences for generalized dimension keys, stored procedure
transformation type for encoding and decoding functions and Lookup transformation to
identify slowly changing dimensions.
• Created FTP scripts to transfer data from consumer systems to HCA systems, and
Conversion scripts to convert data into flat files to be used for Informatica sessions.
• Worked on Informatica tool –Source Analyzer, Data warehousing designer, Mapping
Designer & Mapplet and Transformations.
• Involved in capturing slowly changing dimensions of all varieties including type-I, type-
II, type-III.
• Used ready wizards in Informatica to capture and implement SCD Type-II (changing
data) and tuned it to the requirement.
• Involved in the development of Informatica mappings and tuned them for better
performance.
• Extensively used ETL to load data from flat files (Excel/Access) and the Teradata database.
• Load balancing of ETL processes, database performance tuning and Capacity monitoring.
• Involved in the Unit testing, Event & Thread testing and System testing of the individual
mappings.
• Analyzed existing system and developed business documentation on changes required.
• Developed and executed load scripts using Teradata client utilities MULTILOAD, FASTLOAD
and BTEQ.
• Written JCL scripts and created stored procedures in TERADATA SQL Assistant.
• Used UNIX to run FTP scripts, which transferred data from the UNIX server to the
Windows NT server.
• Made adjustments in Data Model and SQL scripts to create and alter tables.
• Scheduled and monitored automated weekly jobs under UNIX environment.
• Extensively involved in testing the system from beginning to end to ensure the quality of
the adjustments made to accommodate the source system upgrade.
• Involved in tracking and reporting defects using Clear Quest and was also responsible for
communicating the status to the team in bug meetings.
• Worked with many existing Informatica Mappings to produce correct output.
• Made use of Server Manager to create and monitor sessions and batches.
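Flat files landed by FTP scripts like those above are usually sanity-checked before an Informatica session picks them up; a minimal sketch, assuming a header/trailer layout where the trailer carries the data row count (the file name and layout here are illustrative, not from any actual feed):

```shell
#!/bin/sh
# Validate an incoming delimited extract before handing it to a session:
# the last line is assumed to be a trailer holding the data row count.
FILE=${1:-claims_extract.dat}

# Build a tiny sample feed if none exists, so the sketch is self-contained.
if [ ! -f "$FILE" ]; then
    printf 'H|20240101\nC1001|500.00\nC1002|75.25\nT|2\n' > "$FILE"
fi

expected=$(tail -1 "$FILE" | cut -d'|' -f2)     # count from trailer record
actual=$(($(wc -l < "$FILE") - 2))              # total minus header and trailer

if [ "$expected" -eq "$actual" ]; then
    echo "VALID: $actual data records"
else
    echo "INVALID: trailer says $expected, file has $actual" >&2
    exit 1
fi
```

Failing the script before the session starts keeps a truncated transfer from silently loading a partial file.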
Environment: Informatica PowerCenter 8.5.1/7.1.2, Custom programming using Perl, IMS
Data (Xponent, Xponent PlanTrak, Rx Data), Teradata V2R5, Oracle 10g, MS SQL
Server 2005/2000, RTMS, HP UNIX Shell Scripting, PL/SQL, Syncsort for UNIX, Rational Clear
Quest, Windows 2000, HP-UX.

AIG, NY City, NY 


Oct 05 – Mar 07
Data Warehouse/ Informatica Developer
The ERMT Data Repository (ERMT) serves as the authoritative source of timely, accurate, and
consistent data on AIG’s holdings, transactions, reference data, Financial Guarantee/LOC and
other obligations that carry embedded market and credit risks. The project created a reliable
mechanism that guarantees the delivery of this data between systems, and developed a web
application for the MEDS and CONTROLLERS workstations to create/search/modify financial
guarantees/LOCs.

Responsibilities:
• Created new mappings and updated old mappings according to changes in business logic.
• Designed and deployed overall ETL strategy including CDC, SCD’s, Partition Management,
Materialized Views and other complex mapping logics.
• Performed major role in understanding the business requirements and designing and
loading the data into data warehouse (ETL).
• Used ETL (Informatica) to load data from source to ODS. Built two administrative utilities
(Execution Broker and Workflow Monitor) in Informatica.
• Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer,
Mapplet Designer, and Transformation Developer for defining Source & Target definitions
and coded the process of data flow from source system to data warehouse.
• Logical and Physical design of databases using Erwin. Extensively used Erwin for data
modeling and dimensional data modeling.
• Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the
requirements.
• Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor
session status.
• Used Debugger to troubleshoot the mappings.
• Developed mappings in Informatica to load the data from various sources into the Data
Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank,
Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
• Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide
maximum efficiency and performance.
• Defined Target Load Order Plan for loading data into Target Tables.
• Prepared Unit Test Cases.
• Generated various reports using Business Objects functionalities like Queries, Slice and
Dice, Drill down, Functions, Cross Tab, Master/Detail and Formulas as per client
requirements.
• Used shell scripts for post session operations.
• Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate
conditions, parsing of objects and hierarchies.
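Post-session shell operations like those mentioned above often amount to log housekeeping; a sketch that stamps and compresses a session log after the run, where the paths and session name are illustrative rather than from the actual project:

```shell
#!/bin/sh
# Post-session housekeeping: stamp and archive the session log after a run.
# Paths and the session name are illustrative examples.
LOG_DIR=${LOG_DIR:-.}
ARCHIVE_DIR=${ARCHIVE_DIR:-./log_archive}
SESS=${1:-s_m_load_positions}
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$ARCHIVE_DIR"

# Create a placeholder log so the sketch runs standalone.
[ -f "$LOG_DIR/$SESS.log" ] || echo "session completed" > "$LOG_DIR/$SESS.log"

# Timestamp the log so repeated runs never overwrite each other, then compress.
mv "$LOG_DIR/$SESS.log" "$ARCHIVE_DIR/${SESS}_${STAMP}.log"
gzip "$ARCHIVE_DIR/${SESS}_${STAMP}.log"
echo "archived ${SESS}_${STAMP}.log.gz"
```

Hooked in as a post-session command, this keeps the log directory clean while preserving a dated history for support.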
Environment: Informatica Power Center 8.1.1, Sybase 12.x/11.x, Oracle 10g, Facets, PL/SQL,
Business Objects 6.5, TOAD, Erwin 4.5, UNIX Shell Script, Windows 2003.

Kernex Microsystems And Software Ltd, INDIA


 Jan’03 – Jul’05

ETL Developer, Report Generator.
Lucent developed a central data warehouse for sales. The data warehouse development
significantly enhanced Lucent’s flexibility, proprietary competitive advantage, and improved
service levels in serving the critical needs of the sales team. The project involved developing an
Enterprise Subject Data Warehouse to implement a centralized database that collects,
organizes and stores data from different operational data sources to provide a single source of
integrated and historical data.

Responsibilities:
• Identified business rules for data migration, parsing high-level design spec to simple ETL
coding and mapping standards.
• Designed new database tables to meet business information needs. Designed mapping
document, which is a guideline to ETL coding.
• Loading the Data from Flat Files and COBOL copybooks into the staging area using
Informatica Power Center 6.2
• Implemented SDLC concept in extraction, Transformation and Loading of data using
Informatica.
• Worked on Informatica Power Center 6.2 tool - Source analyzer, warehouse designer,
mapping designer, workflow manager, mapplets, and reusable transformations.
• Using Informatica Designer designed mappings that populated the data into the target star
schema on Oracle 8i instance.
• Extensively used XML for metadata exchange and data portability.
• Optimized query performance, session performance and reliability.
• Extensively used Router, Lookup, Aggregator, Expression and Update Strategy
transformations.
• Maintained shared folders and shortcuts, global repositories and user security via
Repository Manager.
• Repository backup and maintenance.
• Used database objects like materialized views, sequence generators, parallel partitioning
and stored procedures to handle complex logical situations.
• Handled the migration process from development, test and production environments.
• Used Perl database modules for error checking and logging of data loads.
• Used Perl scripts to send emails with attachments of erroneous records to support team.
• Set Standards for naming conventions and best practices for Informatica mapping
development.
• Tuned the mappings for optimum performance, dependencies and batch design.
• Scheduled and ran extraction and load process and monitored sessions using Informatica
workflow manager.
• Created the batch design and scheduled the batches to run under the window time.
• Designed the slowly changing dimension strategy for the warehouse.
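Running batches "under the window time", as described above, can be reduced to a small guard at the top of the batch driver; the window bounds below are illustrative, not from the actual schedule:

```shell
#!/bin/sh
# Guard that keeps a batch run inside its agreed load window.
# Window bounds (8 PM through 11 PM) are illustrative examples.
in_window() {
    hour=$1                          # hour of day, 0-23
    [ "$hour" -ge 20 ] && [ "$hour" -le 23 ]
}

if in_window "$(date +%H)"; then
    echo "inside load window, starting batch"
else
    echo "outside load window, deferring batch"
fi
```

Deferring rather than failing lets the scheduler retry the batch at its next slot without paging anyone.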
Environment: Informatica Power Center 6.2, Perl scripts, Sun Solaris 2.7, Oracle 8i, DB2,
Intelligence Server, Cognos 6.0, PL/SQL, SQL*Plus, SQL*Loader, UDB, Erwin, Windows NT 4.0
