
RAMANAIAH

Email: dataengineer9966@gmail.com | Mobile: +91 9963813576

PROFESSIONAL SUMMARY:
 Having about 4.1 years of experience in developing DW-BI applications using
Microsoft technologies: SQL Server (SSIS (ETL)), cloud-based ETL with Azure Data
Factory, Snowflake, SSRS and Power BI.
 Having about 1.5 years of professional software development experience
in Power BI.
 Importing data from different data sources such as SQL Server Analysis Services,
Excel, flat files, Oracle, Azure Data Analytics, etc. into Power BI.
 Hands-on experience in creating data visualization reports such as
Line Chart, Pie Chart, Column Chart, Scatter Chart, Funnel Chart,
Matrix Chart, Gauge Chart, Slicers, KPI indicators, etc., using Power
BI.
 Worked on DAX queries covering different types of aggregate, mathematical and
date-time functions.
 Worked on measure groups and customized columns.
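The aggregate and date-time measures mentioned above can be sketched in DAX; the Sales table, Date table and column names here are hypothetical, purely for illustration.

```dax
-- Hypothetical Sales[Amount], Sales[OrderID] and 'Date'[Date] columns
Total Sales = SUM ( Sales[Amount] )

-- Date-time function: the same period last year, via CALCULATE
Sales LY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- Mathematical/aggregate mix: average order value with safe division
Avg Order Value = DIVIDE ( [Total Sales], DISTINCTCOUNT ( Sales[OrderID] ) )
```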
 Modelling data from different types of data sources such as SQL Server,
Excel, flat files and Oracle, and creating relationships among tables in Power
BI.
 Connecting on-premises data sources to the cloud through Power BI Gateway,
keeping data up to date for live connections.
 Modelling data in star schema and snowflake schema from SSAS
and creating relationships between fact tables and dimension tables in
Power BI.
 Worked on different types of filters at report level and page level
 Publishing reports on Power BI Service through Power BI
workspace.
 Creating cloud-based Extract, Transform & Load (ETL) pipelines using Azure Data
Factory transformations such as Copy, Aggregate, Sort, Lookup, Join,
Conditional Split, Derived Column, etc.
 Data ingestion from Microsoft Azure cloud services into Snowflake.
 Loading data into Snowflake using the COPY command from the local file
system and an Azure stage.
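A sketch of the two Snowflake loading paths described above; the table, stage, container and file names are hypothetical.

```sql
-- 1) From the local file system: upload with PUT to the table stage, then COPY
PUT file:///data/orders.csv @%orders;
COPY INTO orders FROM @%orders
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- 2) From an external Azure stage pointing at blob storage
CREATE OR REPLACE STAGE azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer'
  CREDENTIALS = (AZURE_SAS_TOKEN = '...');
COPY INTO orders FROM @azure_stage/orders/
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```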
 Worked on staging tables
 Worked with different file formats (CSV, JSON) while loading data.
 Working on data migration using the Copy activity from flat files, CSV, Excel and
SQL Server to Azure storage and Azure Data Lake, etc.
 Experience in extracting data from heterogeneous sources like flat files, CSV, Excel and
SQL Server, and in the transforming and loading (ETL) process using SQL Server 2014 Integration
Services.
 Hands-on experience in Extract, Transform & Load (ETL) development using SQL
Server Integration Services (SSIS), including Control Flow tasks (Execute SQL Task, For
Each Loop Container, File System Task, Data Flow Task, etc.) and transformations such
as Aggregate, Sort, Lookup, Merge Join, Multicast, Fuzzy Lookup, Conditional
Split, Derived Column, Character Map, etc.
 Experience in implementing Event Handlers, Logging, Checkpoints, Transactions and
package configurations.
 Implemented SCDs to maintain history in tables.
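One common way to implement the history maintenance mentioned above is a Type 2 SCD; a T-SQL sketch, assuming hypothetical dbo.DimCustomer and stg.Customer tables with only City tracked for changes.

```sql
-- Close out the current dimension row when a tracked attribute changes
UPDATE d
SET    d.EndDate = GETDATE(), d.IsCurrent = 0
FROM   dbo.DimCustomer AS d
JOIN   stg.Customer    AS s ON s.CustomerID = d.CustomerID
WHERE  d.IsCurrent = 1 AND d.City <> s.City;

-- Insert the new version of changed customers, plus brand-new customers
INSERT INTO dbo.DimCustomer (CustomerID, City, StartDate, EndDate, IsCurrent)
SELECT s.CustomerID, s.City, GETDATE(), NULL, 1
FROM   stg.Customer AS s
LEFT JOIN dbo.DimCustomer AS d
       ON d.CustomerID = s.CustomerID AND d.IsCurrent = 1
WHERE  d.CustomerID IS NULL;
```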
 Developed reports using SQL Server Reporting Services (SSRS) and utilizing complex
SQL queries and stored procedures.
 Generated drill-down, drill-through, sub-reports and cascading parameterized reports
from OLTP/OLAP using SSRS.
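Parameterized SSRS reports of the kind described above are typically backed by a parameterized stored procedure; a T-SQL sketch with hypothetical fact/dimension names, where @Region and @Year map to (possibly cascading) report parameters.

```sql
CREATE PROCEDURE rpt.SalesByRegion
    @Region NVARCHAR(50),
    @Year   INT
AS
BEGIN
    SET NOCOUNT ON;
    -- Dataset for the report body; SSRS binds @Region/@Year to parameters
    SELECT   r.RegionName, p.ProductName, SUM(f.SalesAmount) AS TotalSales
    FROM     dbo.FactSales  AS f
    JOIN     dbo.DimRegion  AS r ON r.RegionKey  = f.RegionKey
    JOIN     dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
    WHERE    r.RegionName = @Region
      AND    YEAR(f.OrderDate) = @Year
    GROUP BY r.RegionName, p.ProductName;
END;
```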
 Hands-on experience in SQL Server Analysis Services (SSAS), including
Named Queries, Named Calculations, MDX Queries, Partitions,
Perspectives and Translations.
 Involved in performance tuning at the SSAS cube level and of stored procedures.
 Generated reports by writing MDX queries against the cube.
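A minimal MDX sketch of a cube report query like those above; the cube, dimension and measure names are hypothetical.

```mdx
// Hypothetical [Sales Cube] with a Date dimension and Sales Amount measure
SELECT
    [Measures].[Sales Amount]      ON COLUMNS,
    [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Sales Cube]
WHERE ( [Geography].[Country].&[United States] )
```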
 Worked in production support activities for SSIS and SSRS.
 Hands-on experience in subqueries, tables, joins, set operations, views,
indexes, stored procedures and T-SQL.
 Good knowledge in all stages of Software Development Life Cycle.

SKILL SUMMARY:

Operating System : Windows Server 2014, Windows 10

Reporting Tools : Power BI & SSRS (SQL Server Reporting Services)
ETL Tools and
Cloud ETL Services : SSIS (SQL Server Integration Services) & Azure Data Factory
(Azure Storage, Azure Data Lake)
Analysis Tool : SSAS (SQL Server Analysis Services)
Databases : SQL Server 2016, Snowflake
Languages : SQL, T-SQL
EDUCATION
• Master of Computer Applications from JNTU.

PROFESSIONAL EXPERIENCE:
 Working as Senior Software Developer at HCL Technologies Pvt. Ltd., from Nov 2018
till date.

PROJECT#2:
Project name : Nestle
Environment : SQL Server 2014, Azure Data Factory, Power BI, SSAS.
Description: Nestle is a Swiss multinational food and drink processing conglomerate
headquartered in Vevey, Vaud, Switzerland. It is the largest publicly held
food company in the world and has different types of data marts in its data warehouse.

Responsibilities
 Importing data from different data sources such as SQL
Server, Excel, flat files, Oracle, etc. into Power BI, modelling the data
and creating relationships among tables in Power BI.
 Modelling data in star schema and snowflake schema from SSAS
and creating relationships between fact tables and dimension tables in
Power BI.
 Worked on different types of filters at report level and page level
 Worked on security roles for row-level security in Power BI.
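Row-level security in Power BI is typically a DAX filter defined on a role; a sketch assuming a hypothetical UserRegion(Email, Region) mapping table related to the Sales table on Region.

```dax
-- Hypothetical UserRegion table; filter attached to an RLS role.
-- With the UserRegion-to-Sales relationship set to apply the security
-- filter in both directions, each signed-in user sees only the Sales
-- rows for regions mapped to their login.
[Email] = USERPRINCIPALNAME()
```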
 Worked on DAX queries covering different types of aggregate,
mathematical and date-time functions.
 Worked on measure groups and customized columns.
 Creating data visualization reports such as Line Chart, Pie Chart, Column Chart,
Scatter Chart, Funnel Chart, Pivot Table, Cross Report, Drill-Through, Matrix Chart,
Gauge Chart, KPI, etc., using Power BI as per specifications.
 Publishing reports on Power BI Service through Power BI
workspace.
 Generating reports by writing MDX queries against the cube.
 Working in production support activities for Power BI and SSAS
 Creating cloud-based Extract, Transform & Load (ETL) pipelines using Azure Data
Factory transformations such as Copy, Aggregate, Sort, Lookup, Join,
Multicast, Conditional Split, Derived Column, etc.
 Working on data migration using the Copy activity from flat files, CSV, Excel and
SQL Server to Azure storage and Azure Data Lake, etc.

PROJECT#1:
Project name : Ford
Environment : SQL Server 2014, SSIS, SSRS, SSAS
Ford Motor Company (commonly known as Ford) is an American multinational
automobile manufacturer headquartered in Dearborn, Michigan, United States. It has
different types of data marts in its data warehouse, and data is fetched from
different data sources across the globe using the ETL tool.

Role and Responsibilities:

 Designing of ETL packages in SSIS as per ETL specifications


 Importing Source/Target tables from the respective databases by using Control Flow
Tasks in SQL Server Integration Services.
 Development of mappings according to the ETL specifications for the staging area data
load & warehouse data load using Integration Services.
 Created Transformations and Mappings using Data Flow Tasks.
 Used Control Flow tasks like For Loop Container, For Each Loop Container, Sequence
Container, Execute SQL Task, Send Mail Task, and Data Flow Task.
 Extensively used Derived Column, Data Conversion, Conditional Split, Lookup and Sort to
load data into the Data Warehouse.
 Worked on SSIS packages and import/export for transferring data between SQL Server instances.
 Scheduled the ETL Package (Daily, Weekly and Monthly) Using SQL Server 2014
Management Studio.
 Worked on enhancement of existing reports and on support activities (Dev, UAT, Testing
& Production).
 Developed reports using SQL Server Reporting Services (SSRS) and utilizing complex
SQL queries and stored procedures.
 Experienced in writing Parameterized Queries for generating Tabular reports and Sub-
reports using Global variables, Expressions and Functions, sorting the data, and defining Data
Sources and Subtotals for the reports using SSRS.
 Generated drill-down, drill-through, sub-reports and cascading parameterized reports
from OLTP/OLAP using SSRS.
 Involved in performance tuning at the SSAS cube level and of stored procedures.
 Generated reports by writing MDX queries against the cube.
 Created database Tables, Views and Stored Procedures.
