
NAGA VENKATA CHAITANYA MELIMI

Chaitanya.melimi@gmail.com
908-888-0008

SUMMARY:
 A data-driven professional with more than 18 years of experience in Business Intelligence, Data Science, Data
Analytics, Data Visualization, Data Warehousing, Advanced Analytics, Cloud Computing, Statistical Analysis, &
Predictive Analytics.
 As I progress in my career, I look to continue delivering innovative data strategies, using leading-edge technology,
that help organizations obtain profitable insights in areas that might otherwise be overlooked.
 Highly motivated, results-driven Microsoft Certified Solutions Expert (Data Management and Analytics).
 Experience across the entire Microsoft BI stack, including SQL Server 2017/2016/2012, SSIS, SSAS, SSRS, DAX, MDS, and
Power BI, from prototyping to deployment.
 Strong development skills with Azure Data Lake, Azure Data Factory, Azure Databricks, Azure ML Service, Azure ML
Studio, Azure DevOps, Azure Monitor, MASE, Azure SQL DW (Synapse), Azure Storage Explorer.
 Enhancing SQL logic by introducing new levels of data granularity and improving data standardization.
 Optimizing queries and run time, streamlining automation, and providing best practices in BI data integration.
 Experience implementing BI solutions using SQL/Azure Analysis Services (SSAS tabular), dashboards, scorecards
using Reporting Services (SSRS), Power BI, Tableau & Excel PowerPivot.
 Solutions-oriented ETL, data modeler, and Power BI report developer with proven expertise analyzing various large
datasets for the e-commerce, retail, healthcare, financial, and insurance industries.
 Proven qualifications in data modeling, analyzing requirements, planning, designing, and implementing innovative
database solutions that meet or exceed challenging business needs.
 Designed, developed, and implemented Power BI Dashboards, Scorecards & KPI Reports.
 Expertise in the design, development, and maintenance of SSIS packages; implemented SSIS packages with various
tasks and transformations and scheduled them.
 Strong knowledge of DWH and BI methodologies and concepts, including star and snowflake schemas.

EDUCATION: Bachelor of Electronics and Communications Engineering from Andhra University, India 2004
CERTIFICATIONS: Microsoft Certified Solutions Expert – Data Management and Analytics (Charter), with the completion
of the following certifications:
 70-461 - Querying Microsoft SQL Server 2012/2014
 70-462 - Administering Microsoft SQL Server 2012/2014 Databases
 70-463 - Implementing a Data Warehouse with Microsoft SQL Server 2012/2014
 70-467 - Implementing Data Models and Reports with Microsoft SQL Server
 70-468 - Designing Business Intelligence Solutions with Microsoft SQL Server
TECHNICAL SKILLS:
Technologies: Business Intelligence, Statistical Analysis, Machine Learning, Predictive Analytics, Cloud
Computing & Data Warehousing.
Database & ETL: SQL Server 2008/2012/2016/2017, NoSQL, MongoDB, Cosmos DB, SSIS, SSRS, Azure Analysis
Services, SQL Server Analysis Services (SSAS), SQL Enterprise Manager, SQL Profiler, Master
Data Services (MDS), DTS, Business Intelligence Development Studio (BIDS),
Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD and SQL*Loader.
Azure: Azure Data Factory, U-SQL, Azure Data Lake Analytics, Azure Data Lake Store, Azure ML Studio,
Azure Machine Learning Service, Azure Databricks, Azure DevOps and the Azure portal.
Big Data & ML: Apache Spark, Azure ML Studio
BI & Reporting Tools: Power BI, Power BI Desktop, Tableau, Tableau Desktop versions 7/8.2, Tableau Reader and
Tableau Server, Report Builder 3.0, PowerPivot, Power View, PerformancePoint,
MS Excel, Oracle SQL Developer and Oracle Report Designer.
Languages: SQL, U-SQL, Spark SQL, PySpark 2.0, Scala, Microsoft R, R, Python, MDX,
PowerShell & DAX.
Performance Monitoring Tools: Azure SQL Analytics, SentryOne (SQL Sentry), SQL Server Profiler, System Monitor, Tivoli
Monitoring, BMC Patrol.
Web design tools: HTML5, XHTML, JavaScript, .NET, and JSON.
Other tools: Microsoft Office 365, Serena TeamTrack, PVCS version control, CA AutoSys scheduler
Test and Management tools: WinRunner, QTP, Quality Center 9.2, Test Central, IBM Rational Portfolio Manager.
Workflow Tools: MS Office suite, MS Visio, Microsoft Project
Version Control: GitHub, Team Foundation Server including continuous integration, VSTS, SVN, PVCS Serena
Library Management, ChangeMan, Turnover.

EXPERIENCE:
July 2020 – present, Sr. Business Intelligence ETL Architect, North American Dental Group
Current Project: North American Dental Group, July 2020 – present
Senior Business Intelligence ETL Architect
 Leading the design, development, and modification of complex applications with ETL (SSIS), SQL, SSRS, and
Power BI.
 Analyzing user requirements and defining functional specifications using Waterfall and Agile methodologies.
 Analyzing, designing, and publishing Power BI reports, dashboards, and SSAS tabular models.
 Performing dimensional data modeling, star schema design, and SQL programming.
 Architecting and building data warehouses using ETL tools – SQL, SSIS, and SSAS.
 Experience across the entire Microsoft BI suite of products, including SQL Server 2016/2014/2012/2008 R2, SSIS,
SSAS, SSRS, MDS, MDX, DAX, PowerPivot, and Power BI, from prototyping to deployment.
 Created data integration and technical solutions for Azure Data Lake Analytics, Azure Data Lake Storage, Azure
Data Factory, Azure SQL databases and Azure SQL Data Warehouse for providing analytics and reports for
improving marketing strategies.
 Extracting, transforming, and loading data from source systems to Azure data storage services using a combination of
Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingesting data into one or more Azure
services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processing the data in Azure Databricks.
 Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from
different sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse, including write-back to those sources.
 Developed Databricks notebooks using PySpark, Scala, and Spark SQL for data extraction, transformation, and
aggregation from multiple file formats, analyzing and transforming the data to uncover insights into
customer usage patterns (a minimal illustrative sketch follows this list).
 Design data collection procedures for different data sources – legacy PMS cloud systems, SQL Server, SSAS,
Azure Data Lake Store, and Azure Data Warehouse.
 Design end to end ETL processes and mappings to different source systems.
 Architect and implement ETL and data movement solutions using Azure Data Factory.
 Analyzing test results and recommending modifications to the applications to meet project specifications.
 Develop and direct software system testing and validation procedures, programming, and documentation.
 Analyzing and identifying technical areas of improvement within existing applications.
 Responsible for testing and deployments in various environments.
 Developing and directing system testing, including analysis, design, development (coding), unit testing and
implementation.
 Researching and recommending new tools and technology frameworks that can drive innovation and
differentiation of software.
 Serving as technical architect and direct point of contact for team members during the project development, testing, and
implementation processes.
 Conferring with the project team and customer about data model design and to obtain information on project
limitations and capabilities, performance requirements, and interfaces.
 Documenting modifications and enhancements made to the applications as required by the project.
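Illustrative example (not actual project code): a minimal PySpark sketch of the kind of Databricks notebook transformation described above. The storage paths, column names, and aggregation logic are hypothetical placeholders.

# Minimal PySpark sketch (hypothetical paths and columns) of a Databricks-style
# notebook that ingests multiple file formats, transforms, and aggregates data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("usage-pattern-aggregation").getOrCreate()

# Ingest raw extracts landed in the data lake in different formats (hypothetical paths).
visits = spark.read.parquet("/mnt/datalake/raw/pms/visits/")
payments = spark.read.option("header", True).csv("/mnt/datalake/raw/payments/")

# Standardize types and join the sources.
payments = payments.withColumn("amount", F.col("amount").cast("double"))
combined = visits.join(payments, on="visit_id", how="left")

# Aggregate to surface usage patterns by office and month.
usage = (
    combined
    .withColumn("visit_month", F.date_trunc("month", F.col("visit_date")))
    .groupBy("office_id", "visit_month")
    .agg(
        F.countDistinct("patient_id").alias("patients"),
        F.sum("amount").alias("total_payments"),
    )
)

# Persist the curated result for downstream reporting (e.g., Power BI models).
usage.write.mode("overwrite").parquet("/mnt/datalake/curated/usage_by_office_month/")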
Skills & environment: Power BI Desktop, Power BI service, MDX, DAX, Master Data Services, Microsoft Azure, SQL
Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Azure
Data Lake Store, Azure DWH, PowerPivot, Power View, Oracle PL/SQL, Microsoft SQL Server 2016/2014/2012,
MySQL, Azure Data Factory, Azure Data Lake Analytics, Azure Storage Explorer, Azure DevOps.

Oct 2014 – July 2020, Azure Lead Architect, Key Business Solutions Inc

I worked on the following projects:


Project: Giant Eagle, Mar 2019 – July 2020
Location: Pittsburgh, Pennsylvania
BI Lead Architect
Application Area: Data Analytics and Visualization

Key Projects:

 Inventory and Merchandising Advanced Analytics Environment - (Microsoft Azure - Databricks)
 Real Time Sales Reporting - (Power BI model / Reports)
 Curbside Ordering - (Tabular model / Reports)
 Inventory Movement & Known Loss Report - (Tabular model / Reports)
 Gift Cards - (Databricks cluster as a source, PBI model / reports)
 BI Reports Migration - Excel-based Inventory Reporting deliverables - (Tabular model / Reports)

Roles & Responsibilities:

 Good technical expertise in DAX and SSAS tabular data modeling and data partitioning.
 Created SSAS tabular models for Sales, Inventory and Ordering analytics.
 Developed Power BI reports using query logic based on business rules and integrated custom visuals based
on business requirements using Power BI Desktop.
 Good understanding of Power BI Premium and Pro and Power BI gateways.
 Proven ability to work with business users to take requirements and translate them into technical designs that
facilitate BI analytics and reporting solutions.
 Responsible for analyzing, initiating, designing, executing, and controlling tabular data models using Azure Analysis
Services, including the generation of reports using Power BI Desktop.
 Developed two to three visual options for each report for discussion with Giant Eagle stakeholders
(product owners and leadership) and customers.
 Published 15 dashboards and 25 reports within a short span, received customer appreciation, and created
Power BI visualizations of dashboards & scorecards (KPIs) for the Finance and Marketing departments.
 Assisted supply chain analysts with automating reporting functionality, building Power BI reports,
dashboards & scorecards (KPIs) on Azure SQL, Azure Data Lake, and Azure SQL Data Warehouse data
sources.
 In mid-2019, I provided Inventory and Costing leadership with a proposal to create a fully dynamic data
modelling environment to expand the availability of value-added analytics across the enterprise. The result was an
advanced analytics environment on Microsoft Azure that configures, stores, and processes large amounts of
transactional data through robust data pipelines built with the PySpark 2.0 processing engine and with Scala,
in a Databricks notebook environment.
 Architect & implement medium to large scale BI solutions on Azure using Azure Data Platform services
(Azure Data Lake, Data Factory, Data Lake Analytics, Stream Analytics, Azure SQL DW, Databricks, NoSQL DB)
 Experience developing Spark applications using Spark SQL in Databricks for data extraction,
transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover
insights into customer usage patterns.
 Created regression and classification ML model POCs for sales forecasting; trained, scored, and evaluated
the models to identify the best-fit algorithm with good predictions (an illustrative sketch follows this list).
 Responsible for designing and developing objects (tables, views, stored procedures, TVFs) in Azure Data Lake.
 Worked on complex U-SQL scripts in Azure Data Lake Analytics for data transformation, loading tables
to the Data Lake Store, and report generation.
 Designed end-to-end ETL processes and mappings to data lakes and Blob storage that were used
enterprise-wide by multiple business intelligence solutions for all LOBs.
 Created CI/CD pipelines in Azure Pipelines, with solid experience in DevOps build and release pipelines.
 Performed data exploration and analysis of Sales & Marketing data using R.
 Creating ADL, Azure Storage, ADB, and resource groups in the Azure portal; preparing and uploading the source
data via Azure Storage Explorer.
 Improved the performance of statistical reporting by 25% through performance monitoring, tuning,
and index optimization.
 Consult with customers about software system and data model design and maintenance.
 Supervise the work of Business analytics programmers, technologists and technicians, and other engineering
and scientific personnel.
 Coordinate software system installation, Database maintenance and monitor equipment functioning to
ensure specifications are met.
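Illustrative example (not actual project code): a minimal Spark ML sketch of the kind of sales-forecasting regression POC described above. The table, features, and algorithm choice are hypothetical assumptions rather than the actual implementation.

# Baseline Spark ML regression POC (hypothetical data source and features)
# illustrating the train / score / evaluate workflow described above.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression
from pyspark.ml.evaluation import RegressionEvaluator

spark = SparkSession.builder.appName("sales-forecast-poc").getOrCreate()

# Hypothetical curated table of weekly sales with simple numeric features.
sales = spark.read.parquet("/mnt/datalake/curated/weekly_sales/")

# Assemble features and split into train/test sets.
assembler = VectorAssembler(
    inputCols=["prior_week_sales", "promo_flag", "store_traffic"],
    outputCol="features",
)
prepared = assembler.transform(sales).select("features", "weekly_sales")
train, test = prepared.randomSplit([0.8, 0.2], seed=42)

# Train a baseline linear regression model and score the holdout set.
model = LinearRegression(featuresCol="features", labelCol="weekly_sales").fit(train)
predictions = model.transform(test)

# Evaluate with RMSE to compare candidate algorithms for best fit.
rmse = RegressionEvaluator(
    labelCol="weekly_sales", predictionCol="prediction", metricName="rmse"
).evaluate(predictions)
print(f"Holdout RMSE: {rmse:.2f}")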
Skills & environment: Azure Data Factory, Azure Data Lake Analytics, Azure Data Lake Store, Azure Databricks,
Azure DevOps, Azure ML Studio, Azure Machine Learning Service, Azure CLI, U-SQL, JavaScript, PowerPivot, Power
BI Desktop, Power BI service, Apache Spark, Python, R, MDX, DAX, Master Data Services, Azure Analysis Services,
SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Oracle 11g, NoSQL, and Microsoft SQL
Server 2016/2014/2012.

Client: Arconic (Alcoa), Pittsburgh, PA October 2014 to February 2019


ETL Project Lead
 Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from
different sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse, including write-back to those sources.
 Developed Spark applications using PySpark and Spark SQL in Databricks for data extraction, transformation, and
aggregation from multiple file formats, analyzing and transforming the data to uncover insights into the customer usage
patterns.
 Responsible for estimating the cluster size, monitoring, and troubleshooting the Spark Databricks cluster.
 Participate in Daily scrum calls, Sprint Planning Session, Sprint Retrospective Session, Sprint Review Session and
Product Backlog Grooming sessions from a data structure / database perspective and reporting.
 Updated agile tracking systems to provide transparency on product and sprint backlogs.
 Developed complex calculated measures using Data Analysis Expressions (DAX) for PowerPivot and tabular data
models.
 Developed Power BI reports using query logic based on business rules, and visuals selected by Arconic
stakeholders.
 Developed two to three visual options for each report for discussion with Arconic stakeholders (product
owners and leadership) and customers.
 Integrated custom visuals based on business requirements using Power BI Desktop.
 Published 10 dashboards and 20 reports in Power BI for the top 20 performance quality metrics.
 Organize and facilitate project planning, daily stand-ups, reviews, retrospectives, sprint/release planning,
demos, and other Scrum-related meetings.
 Demonstrated expertise in utilizing ETL tools, including MS SQL Server Integration Services (SSIS) for ETL package
design and Master Data Services (MDS) for data enrichment.
 Expertise in Master Data Services to create entities and support the data enrichment and data mapping
hierarchy.
 Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data.
 Analyzed user needs and software/database requirements to determine the feasibility of designs within time and cost
constraints.
 Designed and developed ETL SSIS packages to facilitate incremental load process.
 Handled multiple SSIS ETL projects from designing the data mapping, applying transformations based upon
business logic and creating load strategies.
 Effectively handled the maintenance of SSIS packages, jobs and provided support for multiple applications.
 Proposed an automated SSIS load feature using a SharePoint list in one application; it was then used across 25
applications to handle downtime and weekend load failures caused by intermittent connection issues.
 Prepared the complete data mapping for all the migrated jobs using SSIS.
 Mastered the ability to design and deploy rich graphic visualizations with drill-down and drop-down menu
options and parameters using Tableau.
 Worked closely with Business users. Interacted with ETL developers, Project Managers, and members of the QA
teams.
 Converted existing BO reports to Tableau dashboards.
 Created different KPIs using calculated key figures and parameters.
 Developed Tableau data visualizations using cross tabs, heat maps, box-and-whisker charts, scatter plots,
geographic maps, pie charts, bar charts, and density charts.
 Developed donut charts and implemented complex chart features, such as bar charts within tooltips.
 Worked extensively with advanced analysis features – actions, calculations, parameters, background images, maps, trend
lines, statistics, and table calculations.
 Provided Operational Support to modify existing Tabular SSAS models to satisfy new business requirements.
 Created Data quality PowerPivot models to maintain Data integrity and provide a data error control mechanism
to the business users and commodity managers.
 Worked with Microsoft team on product limitations and communicate limitations and tool capabilities to the
Arconic IT and business stakeholders.
 Provided continued maintenance and development of bug fixes for the existing and new Power BI Reports.
 Involved in the migration of databases from SQL Server 2008 to SQL Server 2012 and led the data center move.
 Experience in the design and development of drill down & drill through reports, Summary and Detail reports and
Forecast reports using SQL Server Reporting Services (SSRS).
 Generated periodic reports based on statistical analysis of the data from various time frames and divisions
using SQL Server Reporting Services (SSRS).
 Advanced knowledge of Excel, including Pivot tables, Power Pivot, and complex DAX formulas.
 Actively participated in interacting with users, DBAs, and technical managers to fully understand the
requirements of the system.
 Experience in the conversion of PowerPivot models to Tabular models. Actively handled the conversion right
from the design to implementation.
 Responsible for translating functional specification requirements to technical specifications and guiding
development team to understand the flow.
 Daily interaction with offshore and onshore application team members to go through the status of development
and maintenance tasks and address any risks/concerns.
 Participate in Agile software development projects and tasks include coding, testing, debugging, documentation,
peer-review, deploying, monitoring, and support.
 Create and maintain technical documentation using defined templates.
Environment: Azure Data Factory, Azure Data Lake Analytics, Azure Data Lake Store, Azure Databricks, Azure DevOps,
Azure CLI, U-SQL, JavaScript, Power BI, Power BI Desktop, Tableau, Tableau Desktop, MS SQL Server
2012/2008 R2, SSIS, SSRS, Report Builder 3.0, SSAS, DAX, MDS, PowerPivot, Oracle 10g, SQL
Developer, Microsoft Visual Studio 2008 R2, Microsoft Visual Studio 2010/2012, Business Intelligence
Development Studio (BIDS).
June 2006 – Oct 2014, Project Lead-Business Intelligence, IBM India Pvt Limited
Client: PRUDENTIAL Group Insurance, Roseland, NJ January 2008 to October 2014
Lead MSBI Developer
 Microsoft BI Lead for the Business Analytics team, responsible for handling design, development, implementation,
QA, UAT, and PROD support.
 Gathered and documented business requirements for analytical reports. Participated in requirements meetings and
data mapping sessions to understand business needs. Identified and documented detailed business rules.
 Participated in biweekly sprint meetings to identify and log pending business requests within Team Foundation
Server.
 Performed database documentation including table relation diagrams and data dictionary updates.
 Performed database impact analysis.
 Experience converting legacy data storage mechanisms such as MS Access or mainframe flat storage to relational
databases.
 Scheduled daily, weekly, and monthly reports for executives, business analysts, and customer representatives
for various categories and regions based on business needs using SQL Server Reporting Services (SSRS).
 Created shared data source connections and PowerPivot data models in the PowerPivot Gallery on
the SharePoint BI site.
 Involved in doing a thorough analysis of the issues with the cube processing and query processing to identify
bottlenecks for the existing facts and dimensions.
 Created aggregates, partitions, attribute relationships, and user-defined hierarchies with the goal of reducing cube
processing time from over 2 hours to under 60 minutes and query processing time to less than 500 ms.
 Developed a Microsoft business intelligence solution for the users using a large amount of unstructured data from
multiple sources.
 Effectively handled the maintenance of SSIS packages, jobs and provided support for multiple applications.
 Developed PowerPivot model for the users and provided pivot-based reports.
 Experience in SharePoint integrated reporting environment and deployed different types of SSRS reports.
 Administered interface to organize reports and data sources, schedule report execution and delivery, and track
reporting history.
 Proficiency in creating and formatting different types of reports, such as crosstab, conditional, pie chart, drill-down,
top-N, summary, and sub-reports, using SSRS.
 Design, deployment, and maintenance of various SSRS reports in SQL Server 2008.
 Handled a stand-alone .NET application that produces daily letters for Long Term Care clients and was involved in
fine-tuning the app.
 As a lead, responsible for allocating work to the developers and for resource planning; proactively interacted with
the team to clarify business requirements and provide functional knowledge as and when required.
 Collaborated with data architects for data model management and version control.
 Conducted data model reviews with project team members.
Environment: Power BI, MS SQL Server 2012/2008 R2, SSIS, SSRS, SSAS, SharePoint 2013/2016 server, PowerPivot, TFS,
MDS, DAX, Microsoft Visual Studio 2008 R2, Microsoft Visual Studio 2010, BIDS, Oracle 10g/9i/8i, PL/SQL, SQL*Plus,
TOAD and SQL*Loader.

SUNTRUST Bank, Atlanta GA June 2006 to December 2007


Microsoft SQL Server Developer
Responsibilities
 Mainly involved in performance tuning of the Teller application for various modules.
 Actively coded and performed unit testing using AS/400 tables and SQL Server 2000.
 Worked on data migration from DB2/400 to SQL Server.
 Developed, maintained, and monitored SQL Server Agent jobs using SQL Enterprise Manager.
 Developed all the required stored procedures, UDFs, triggers, and views using T-SQL as per business logic.
 Developed DTS packages for data flow from source files such as Excel, tables, and views to other databases or files
with proper data mapping and data cleansing.
 Prepared Small Project Reports and RDDs (Requirement Definition Documents).
 Interacted with the client to fulfill the requirements; prepared high-level and low-level design documents.
 Supported various testing phases (regression testing, assembly testing, UAT, etc.).
 Involved in QA Meetings for the code and construction tasks with the Client.
 Reviewed Unit & Integration Test Plan and tested the Integration of different modules.
Environment: Microsoft SQL Server 2000, SQL Agent job scheduler, SQL Enterprise Manager, DTS, AS/400, Turnover, MS
Excel, Visio, DOS batch scripting.
Jan 2005 – June 2006, Software Trainee and Programmer, Keane India Pvt Ltd

Client: Hanover Insurance, Keane India Pvt Limited (now NTT Data), India
Jan 2005 – June 2006
Microsoft SQL Server Developer
 Prepared Functional and Technical Specification documents as deliverables to the client.
 Designed, developed, and distributed reports using DTS packages in an MS SQL Server 2000 environment.
 Responsible for ongoing maintenance of and change management for existing reports, and for optimizing report
performance.
 Created stored procedures to calculate pension and annuity information; when called, they provide the details for an
employee on the website.
 The task involved writing a large amount of SQL code and distributing ad hoc reports from the database.
 Tested for different test cases based on the requirements. Developed Test Cases and Test Scripts.
Environment: MS SQL Server 2000, DTS packages, Windows XP, T-SQL, Windows Server 2003, Visual Basic,
and Windows batch scripting.
