Vineetchaubey345
OBJECTIVE
Achievement-driven professional seeking assignments in an organization where I can apply my experience and
contribute effectively to the company's growth.
PROFILE SYNOPSIS
● A competent professional with nearly 6 years of experience in gap analysis, user requirement gathering,
coordination, data analysis and integration, and ETL-based data transformation in an offshore-onshore
engagement at a product-based company.
● Excellent exposure to industry-standard ETL concepts, data warehousing concepts, and data integration with data
quality maintenance, with both conceptual and hands-on knowledge of Business Intelligence tools for building
decision support systems.
● Very good understanding of data models and data warehousing concepts.
● Interpret end-to-end data flow across domains and coordinate data purge and cleansing activities to deliver the
best quality of data for the team.
● Experience in the SDLC process using project maintenance tools such as JIRA, ServiceNow, Confluence, TopTeam,
Elvis, and Crucible.
● Good analytical, written, verbal, and problem-solving skills, along with organizational, project-planning, and time-management abilities.
● Excellent communication & interpersonal skills with the ability to work in a multicultural environment.
CORE COMPETENCIES
● In-depth knowledge of Informatica tools: Source Analyzer, Workflow Manager, Mapping Designer, Mapplet Designer,
Transformation Developer, and Informatica Repository.
● Designed and developed complex mappings to move data from disparate source systems into common target areas
such as data marts and the data warehouse, applying varied transformation logic in Informatica with Lookup,
Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, and Update
Strategy transformations.
● Strong hands-on experience extracting data from a range of source systems, from mainframe sources such as flat
files and VSAM files to RDBMS platforms such as Oracle and SQL Server.
● Extensively used the Slowly Changing Dimension (SCD) technique in various applications (a minimal SCD Type 2
sketch appears after this list). Expertise in OLTP/OLAP system study and analysis, E-R modelling, and developing
dimensional models with star schema and snowflake schema techniques in relational, dimensional, and
multidimensional modelling.
● Optimized mappings by creating reusable transformations and mapplets, and performance-tuned sources, targets,
mappings, transformations, and sessions.
● Worked with the StreamSets tool to ingest data into Hadoop.
● Experience with big data on Hadoop, using Sqoop to load data into the Hadoop environment and Spark to perform
transformations.
● Worked on Hadoop file-level security using ACLs (see the ACL sketch after this list).
● Good understanding of creating Hive tables, loading them with data, and writing Hive queries, which run
internally as MapReduce jobs (a minimal Hive sketch appears after this list).
● Good understanding of performance tuning in Hive; regularly tune Hive queries to improve data processing and
retrieval.
● Loaded and transformed large sets of structured and semi-structured data.
● Worked with Trifacta to create flows for data wrangling and cleansing.
● Good understanding of user- and group-level security on HDFS files and directories.
● Worked on visualizations, data functions, and analytical functions using the Spotfire reporting tool.
● Good understanding of Spotfire visualizations such as bar graphs, line charts, cross tables, filters, pie charts, and scatter plots.
● Implemented several challenging scenarios using document properties and data functions with R and IronPython
scripting (a minimal IronPython sketch appears after this list).
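The SCD work above was done in Informatica mappings; purely as a rough illustration of the Type 2 pattern, here is a minimal PySpark sketch (all table and column names are hypothetical, not from any actual project):

    # SCD Type 2: expire changed rows, append fresh versions.
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("scd2_sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Assumed schema: customer_id, city, effective_date, end_date, is_current.
    dim = spark.table("dw.customer_dim").alias("d")           # hypothetical dimension
    stg = spark.table("staging.customer_updates").alias("s")  # hypothetical staging feed

    # Current dimension rows whose tracked attribute (here: city) has changed.
    changed = (dim.filter(F.col("d.is_current") == 1)
                  .join(stg, F.col("d.customer_id") == F.col("s.customer_id"))
                  .filter(F.col("d.city") != F.col("s.city")))

    # 1) Expire the old versions: close the date range, clear the current flag.
    expired = (changed.select("d.*")
                      .withColumn("end_date", F.current_date())
                      .withColumn("is_current", F.lit(0)))

    # 2) Build the new versions, effective from today, still open-ended.
    fresh = (changed.select(F.col("s.customer_id").alias("customer_id"),
                            F.col("s.city").alias("city"))
                    .withColumn("effective_date", F.current_date())
                    .withColumn("end_date", F.lit(None).cast("date"))
                    .withColumn("is_current", F.lit(1)))

    # expired.unionByName(fresh), merged back with the untouched rows,
    # becomes the new state of the dimension.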
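For the Hive bullets, a minimal sketch of creating and querying a Hive table from PySpark (database, table, and HDFS path names are hypothetical):

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive_sketch")
             .enableHiveSupport()   # connects Spark SQL to the Hive metastore
             .getOrCreate())

    # External table over data already landed in HDFS (e.g. by Sqoop).
    spark.sql("""
        CREATE EXTERNAL TABLE IF NOT EXISTS sales.orders (
            order_id BIGINT,
            customer STRING,
            amount   DOUBLE
        )
        PARTITIONED BY (load_date STRING)
        STORED AS PARQUET
        LOCATION '/data/landing/orders'
    """)
    spark.sql("MSCK REPAIR TABLE sales.orders")  # register existing partitions

    # The same HiveQL submitted through the Hive CLI compiles to MapReduce
    # jobs; run here, Spark executes it on its own engine instead.
    spark.sql("""
        SELECT load_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM sales.orders
        GROUP BY load_date
    """).show()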
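The HDFS ACL work comes down to the hdfs dfs -setfacl / -getfacl commands; a minimal sketch driving them from Python (paths, users, and groups are hypothetical):

    import subprocess

    def hdfs(*args):
        """Run an `hdfs dfs` subcommand, raising if it fails."""
        subprocess.run(["hdfs", "dfs", *args], check=True)

    # Grant a named user read/execute and a named group read-only access;
    # the `default:` entry propagates to files created later in the directory.
    hdfs("-setfacl", "-m", "user:analyst1:r-x", "/data/project")
    hdfs("-setfacl", "-m", "group:reporting:r--", "/data/project")
    hdfs("-setfacl", "-m", "default:group:reporting:r--", "/data/project")

    # Verify the resulting extended ACL entries.
    hdfs("-getfacl", "/data/project")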
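And for the Spotfire scripting bullet, a minimal IronPython sketch of the document-property pattern (it runs inside Spotfire's script editor, where `Document` is provided by the environment; the property names are hypothetical):

    # Read a document property, e.g. one bound to a drop-down control.
    region = Document.Properties["SelectedRegion"]

    # Write another property that a title expression or a data function consumes.
    Document.Properties["ReportTitle"] = "Sales report - " + str(region)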
TOOLS
PROFESSIONAL EXPERIENCE
PROJECT DETAILS
Responsibilities:
● Responsible for creating flows in Trifacta for data wrangling and cleansing.
● Analyzed source systems and business requirements, and identified business rules.
● Responsible for creating the folder structure in Hadoop and granting or changing access.
● Created Hive tables on top of the output files produced by Trifacta flows for analysis (a minimal sketch appears after this list).
● Responsible for implementing permission requirements for specific users or groups using extended ACLs.
● Prepared unit/system test plans and test cases for the developed mappings.
● Created StreamSets pipelines and jobs to load data from Salesforce to HDFS.
● Worked on the Hive database (Hive tables to access the data loaded into HDFS).
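As a rough sketch of exposing wrangled output files as a Hive table for analysis (hypothetical path and database; the CSV files stand in for a Trifacta flow's output):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Read the wrangled CSV output, inferring the schema from a header row.
    wrangled = (spark.read
                     .option("header", True)
                     .option("inferSchema", True)
                     .csv("/data/wrangled/customers"))

    # Persist it as a Hive table that analysts can query directly.
    wrangled.write.mode("overwrite").saveAsTable("analytics.wrangled_customers")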
ACADEMIC CREDENTIALS
• Graduation: Bachelor of Engineering and Technology (Electronics and Communication) from Aligarh College of
Engineering and Technology (UPTU), 2014, securing 73.24%
• X Grade: Government Inter College, Allahabad, 2007, securing 69.5%
• XII Grade: Government Inter College, Allahabad, 2009, securing 70.80%
PERSONAL DETAILS
Declaration: I hereby declare that the information furnished above is true to the best of my knowledge.