
VINEET KUMAR CHAUBEY

Phone: +91-7411820183
E-Mail: vineetchaturvedi345@gmail.com

OBJECTIVE

Achievement-driven professional seeking assignments in an organization where I can apply my experience and
contribute efficiently to the company’s growth.

PROFILE SYNOPSIS

● A competent professional with nearly 6 years of experience in gap analysis, understanding user requirements,
coordination, data analysis and integration, and ETL methodology for data transformation in an offshore-onshore
engagement at a product-based company.
● Excellent exposure to industry-standard ETL concepts, data warehousing concepts, and data integration with data
quality maintenance, along with conceptual and hands-on experience with Business Intelligence tools for building
decision support systems.
● Very good understanding of Data Models and Data Warehousing concepts.
● Interpret end-to-end data flow across domains and coordinate data purging and cleansing activities to maintain the
best quality of data for the team.
● Experience in the SDLC process using project-maintenance tools such as JIRA, ServiceNow, Confluence, TopTeam,
Elvis, and Crucible.
● Good analytical, written, verbal, and problem-solving skills, along with organizational project planning and time
management.
● Excellent communication & interpersonal skills with the ability to work in a multicultural environment.

CORE COMPETENCIES

● In-depth knowledge of Informatica tools: Source Analyzer, Workflow Manager, Mapping Designer, Mapplet Designer,
Transformation Developer, and Informatica Repository.
● Designed and developed complex mappings to move data from disparate source systems into common target areas
such as data marts and the data warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator,
Joiner, Normalizer, Sequence Generator, and Update Strategy transformations in Informatica.
● Strong hands-on experience in extracting data from various source systems, ranging from mainframe sources such as
flat files and VSAM files to RDBMS such as Oracle and SQL Server.
● Extensively used the Slowly Changing Dimension (SCD) technique in various applications (a minimal SQL sketch of
the SCD Type 2 pattern follows this list). Expertise in OLTP/OLAP system study, analysis, E-R modelling, and
developing dimensional models using star-schema and snowflake-schema techniques in relational, dimensional, and
multidimensional modelling.
● Worked on optimizing mappings by creating reusable transformations and Mapplets, and on performance tuning of
sources, targets, mappings, transformations, and sessions.
● Worked on the StreamSets tool to ingest data into Hadoop.
● Experience with big data on Hadoop, using Sqoop to load data into the Hadoop environment and Spark to perform
transformations.
● Worked on Hadoop file-level security using ACLs.
● Good understanding of creating Hive tables, loading them with data, and writing Hive queries, which run internally
as MapReduce jobs (see the Hive sketch after this list).
● Good understanding of performance tuning in Hive; regularly tuned Hive queries to improve data processing and
retrieval.
● Worked on loading and transforming large sets of structured and semi-structured data.
● Worked on Trifacta to create flows for data wrangling and cleansing.
● Good understanding of user- and group-level security on HDFS files and directories.
● Worked on visualizations, data functions, and analytical functions using the Spotfire reporting tool.
● Good understanding of Spotfire visualizations such as bar graphs, line charts, cross tables, filters, pie charts,
and scatter plots.
● Implemented several challenging scenarios using document properties, data functions in R, and IronPython scripting.
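
As an illustration of the SCD Type 2 pattern noted above, the minimal SQL sketch below first expires changed dimension rows and then inserts new current versions. The table, column, and sequence names (DIM_CUSTOMER, STG_CUSTOMER, DIM_CUSTOMER_SEQ) are hypothetical; in Informatica the same logic is typically expressed through Lookup and Update Strategy transformations rather than hand-written SQL.

    -- Step 1: close out current dimension rows whose tracked attributes
    -- changed in the latest staging load (the SCD Type 2 "expire" step).
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.city <> d.city OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed and brand-new
    -- customers (any business key with no open row after step 1).
    INSERT INTO dim_customer
           (customer_key, customer_id, city, segment,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.city, s.segment,
           SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');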
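Likewise, a minimal HiveQL sketch of the Hive usage described above, with hypothetical table names (sales, stg_sales): partitioning by load date keeps the MapReduce jobs behind each query from scanning the full dataset, which is one of the routine tuning techniques mentioned.

    -- Hypothetical partitioned Hive table; ORC storage plus partitioning
    -- by load date limits how much data each MapReduce job must scan.
    CREATE TABLE sales (
        order_id   BIGINT,
        product_id STRING,
        amount     DOUBLE
    )
    PARTITIONED BY (load_date STRING)
    STORED AS ORC;

    -- Load one day's data into its own partition from a staging table.
    INSERT OVERWRITE TABLE sales PARTITION (load_date = '2021-01-15')
    SELECT order_id, product_id, amount
    FROM   stg_sales
    WHERE  load_dt = '2021-01-15';

    -- Partition pruning: only the 2021-01-15 files are read.
    SELECT   product_id, SUM(amount) AS total_amount
    FROM     sales
    WHERE    load_date = '2021-01-15'
    GROUP BY product_id;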

TOOLS

● ETL Tools: Informatica PowerCenter 8.x, 9.x, and 10.4
● Big Data: Cloudera, Sqoop, Spark, Hive, Trifacta, and StreamSets
● Database: Oracle, SQL Server
● DB Tools: SQL*Plus, TOAD, SQL Developer
● Languages: SQL, PL/SQL, UNIX shell scripting, Python
● Visualization Tools: Spotfire

PROFESSIONAL EXPERIENCE

Company Name: IQVIA
Duration: July 2015 to present

PROJECT DETAILS

Project Title: Marketed Product Maintenance
Period: Ongoing
Team Size: 25
Designation: ETL Developer
Technical Stack: Oracle Database 11g, SQL Developer Data Modeler, Informatica.
Description: The MPM Oversight Dashboard is designed to lay the groundwork to support a new offering from
Pharmacovigilance, Regulatory, and Benefit Risk Management. The dashboard is intended to provide insights into key
performance metrics and portfolio management, revealing data typically not attainable in current pharmaceutical delivery
models. Data will be integrated from approximately 15 sources and visualized through a presentation layer.
The MPM Oversight Dashboard will be broadly divided into two areas: ‘Performance Oversight’ and ‘Portfolio
Management’. Performance Oversight is intended for internal and prospective customer use, to oversee how a company is
delivering service to customers against predefined SLAs and KPIs. It will provide an easy view of where performance
might be trending off course and allow for remediation. Portfolio Management is intended to give customers insights
into how their product portfolio is performing globally and to let them make key informed decisions, such as
increasing the focus on marketing a product in a certain country or, conversely, withdrawing a product from a market.

Responsibilities:

• Analysis of source systems and business requirements, and identification of business rules.
• Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes
using Informatica PowerCenter.
• Worked on complex mappings, Mapplets, and workflows to meet business needs, ensuring transformations are
reusable to avoid duplication.
• Extensively used ETL concepts to extract data from source files (over 15 sources) and load the data into the
target database.
• Documented mapping and transformation details, reviewed user requirements, and validated the implementation plan
and schedule.
• Designed and developed efficient error-handling methods and implemented them throughout the mappings.
Responsible for data quality analysis to determine cleansing requirements.
• Worked with several facets of the Informatica PowerCenter tool: Source Analyzer, Warehouse Designer, Mapping and
Mapplet Designer, and Transformation Developer. Developed Informatica mappings for better performance.
• Configured sessions using Workflow Manager with multiple partitions on source data to improve performance.
Understood the business needs and implemented them in a functional database design.
• Prepared unit/system test plans and test cases for the developed mappings.
• Responsible for developing, supporting, and maintaining the Spotfire reports.
• Responsible for junior team members’ work assignment and tracking, along with a knowledge-base development
plan for freshers.

Project Title: Data Exploration and Wrangling
Period: Ongoing
Team Size: 7
Designation: Hive, Spark, and Trifacta Developer
Technical Stack: Cloudera Hadoop, Hive, Trifacta, Sqoop
Description: Provides data users with end-to-end data acquisition, preparation, transformation, visualization, and
collaboration capabilities, so that business users can perform more up-front data discovery and test out various
business rules, logic, and theories before IT is involved in creating production-ready features. This gives users a
way to interact directly with data using self-serve tools to construct and execute experiments, explorations,
discovery, and analytics.
Responsibilities:

● Responsible for creating flows in Trifacta for data wrangling and cleansing.
● Analysis of source systems and business requirements, and identification of business rules.
● Responsible for creating the folder structure in Hadoop and providing/changing access.
● Creating Hive tables for analysis on top of the output files produced by Trifacta flows (see the sketch after
this list).
● Responsible for implementing permission requirements for specific users or groups using extended ACLs.
● Prepared unit/system test plans and test cases for the developed mappings.
● Creating StreamSets pipelines and jobs to load data from Salesforce to HDFS.
● Worked on the Hive database (Hive tables to access the data loaded into HDFS).
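
As an illustration of the Hive work in this project, the sketch below defines an external table over cleansed output files that a wrangling flow has written to HDFS; the path, file format, and schema are hypothetical. An external table is a natural fit here, since dropping it leaves the underlying files untouched for other users.

    -- Hypothetical external table over comma-delimited output files
    -- written by a wrangling flow; the data stays in place on HDFS.
    CREATE EXTERNAL TABLE wrangled_claims (
        claim_id     STRING,
        product_name STRING,
        claim_amount DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/exploration/wrangled_claims/';

    -- Analysts can then query the cleansed output directly.
    SELECT   product_name, COUNT(*) AS claim_count
    FROM     wrangled_claims
    GROUP BY product_name;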

ACADEMIC CREDENTIALS

• Graduation: Bachelor of Engineering and Technology (Electronics and Communication) from Aligarh College of
Engineering and Technology (UPTU), 2014, securing 73.24%
• X Grade: Government Inter College, Allahabad, 2007, securing 69.5%
• XII Grade: Government Inter College, Allahabad, 2009, securing 70.80%

PERSONAL DETAILS

Date of Birth: 15 June 1991
Address: 46, 25th Main Road, HSR Layout Sector 1, Bangalore - 560102
Languages Known: English, Hindi
Marital Status: Married

Declaration: I hereby declare that the information furnished above is true to the best of my knowledge.
