
Professional Summary

 7+ years of IT experience as a Data Analyst, primarily in the Banking domain, with experience in AML, Fraud Detection, and Trade Surveillance tools such as Actimize and a good understanding of related topics.
 Strong working experience in the Data Analysis, Design, Development, Implementation, and Testing of Data Warehouses, Data Marts, and ODS using Data Conversion, Data Extraction, Data Transformation, and Data Loading (ETL).
 Very good understanding of Data Warehousing concepts, Data Analysis, Data Warehouse Architecture, and Design.
 Performed Data Analysis and Statistical Analysis and produced Listings and Graphs using SAS tools for Claims, Inquiry, and Quality-related reports.
 Experience in using SQL queries and MS Excel (HLOOKUP, VLOOKUP, Macros, Pivot Tables, etc.) for Data Validation and Data Integration.
 Extensive experience in ETL testing with a thorough understanding of data warehouse concepts.
 Exposure to creating and analyzing Data Flow diagrams and Entity Relationship diagrams.
 Good understanding of the health care industry's claims management process, Medicaid and Medicare services, and the insurance sector.
 Provided project planning, milestones, and deliverables for implementation of document management system.
 Extensive work experience in ER Modeling, Dimensional Modeling (Star Schema, Snowflake Schema), Data Warehousing, and OLAP tools.
 Expertise in Dimensional data modeling, Star schema, Snowflake schema, Data Normalization, and Informatica utilities: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
 Knowledge of administration and support of SharePoint and Microsoft Project servers in multiple environments.
 Extensive experience in enterprise data flow analysis, data modeling, and analytics (SSRS, SSAS, SSIS, ETL, MDM, Entity Data Models, and data warehousing), incorporating transaction-oriented business/service-level data processes and metadata collection techniques to enhance informatics strategies.

Professional Experience

Client: WELLS FARGO Bank, Charlotte, NC Aug 21 – Present


Role: Sr. Data Analyst

Wells Fargo’s Consumer Lending Group (CLG) includes Home Lending, Consumer Credit Cards, Personal Loans and Lines, Direct Auto, Dealer Services, Commercial Auto, Retail Services, and Education Financial Services. Worked as a Business Analyst on Oracle EBS (Enterprise Resource Planning) implementations of the financial accounting modules. Record to report covers the entire process of collecting data and consolidating it into a report format, allowing financial organizations to deliver streamlined accounting integrity as part of the close process. Also worked with the client’s reporting teams to compile and publish quarterly reports generated in Tableau as the BI tool, supporting the client’s Digital Display Benchmarking Analytics project.

Responsibilities:
 Worked in the Agile-Scrum software development methodology to accommodate changing business requirements.
 Analyze the data requirements and understand the business needs for the Snowflake schema modeling. Work
closely with stakeholders to gather and document data requirements, including data sources, data entities,
relationships, and attributes.
 Involved in Requirement Analysis and requirement modeling using Unified Modeling Language (UML) and Object-Oriented Analysis. Experienced with Workflow, Data Analysis, Cost/Benefit, Cause-and-Effect Analysis, and Reports.
 Experienced in Data Mart design and creation of Cubes using Dimensional data modeling in Snowflake schema.
 Design the Snowflake schema, a multidimensional data model consisting of a central fact table surrounded by multiple dimension tables (a minimal schema sketch follows this list).
 Develop RPG (Report Program Generator) scripts for extracting data from IBM AS400 / IBM iSeries databases.
 Utilize CL (Control Language) programming for orchestrating data extraction processes on the AS400/IBM iSeries system.
 Implement ODBC (Open Database Connectivity) connections to facilitate data transfer between the AS400/IBM iSeries and Google Cloud Platform.
 Utilize DB2 SQL (Structured Query Language) for querying and manipulating data directly within the AS400 environment (see the sample extraction query after this list).
 Responsible for data mapping from raw tables to the mainframe and from the mainframe to adjudicated tables in IBM iSeries.
 Writing standard SQL queries and optimizing them to fetch data in the Snowflake schema model.
 Involved in the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema
Modeling, Fact and Dimension tables.
 Involved in data dictionary and Data Quality Management work, and in the extraction, transformation, and loading (ETL) of data from various sources. Participated in the ETL requirements process during data transition from source systems to target systems in the Snowflake schema.
 Performed Data Migration from the DB2 OLTP database to the Data Warehouse environment.
 Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data
Warehouse and Snowflake Schema.
 Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Snowflake schema.
 Contributing to software process re-engineering efforts aimed at evolving current software development practices to adopt Lean/Agile and Scrum practices.
 Developed dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to understand the data using quick filters for on-demand information.
 Created views in Tableau Desktop that were published to the internal team for review and further data analysis, and prepared dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies in Tableau.
 Used MS Excel for documenting various processes and procedures and preparing reports as dictated by the clients.
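
For illustration, a minimal sketch of the Snowflake schema pattern described above: a central fact table surrounded by dimension tables, with one dimension normalized out into a sub-dimension. All table and column names here are hypothetical, not the client's actual objects.

    -- Sub-dimension: the product dimension is normalized ("snowflaked")
    -- into a separate category table.
    CREATE TABLE dim_product_category (
        category_id    INTEGER PRIMARY KEY,
        category_name  VARCHAR(50)
    );

    CREATE TABLE dim_product (
        product_id     INTEGER PRIMARY KEY,
        product_name   VARCHAR(100),
        category_id    INTEGER REFERENCES dim_product_category (category_id)
    );

    CREATE TABLE dim_date (
        date_id        INTEGER PRIMARY KEY,   -- e.g. 20240131
        cal_date       DATE,
        cal_year       INTEGER,
        cal_month      INTEGER
    );

    -- Central fact table: one row per transaction, keyed to the
    -- surrounding dimension tables.
    CREATE TABLE fact_loan_txn (
        txn_id         BIGINT PRIMARY KEY,
        date_id        INTEGER REFERENCES dim_date (date_id),
        product_id     INTEGER REFERENCES dim_product (product_id),
        txn_amount     DECIMAL(15,2)
    );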
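A sample of the kind of DB2 SQL used for extraction on the AS400/IBM iSeries; LOANLIB and LOANMAST are hypothetical library and file names used only to show the pattern.

    -- On IBM i, DB2 tables live in libraries (schemas).
    SELECT l.LOANID,
           l.CUSTID,
           l.ORIGAMT,
           l.OPENDATE
    FROM   LOANLIB.LOANMAST AS l
    WHERE  l.OPENDATE >= '2021-01-01'
      AND  l.STATUS = 'A'              -- active loans only
    ORDER BY l.OPENDATE
    FETCH FIRST 1000 ROWS ONLY;        -- limit the extract batch size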

Client: Key Bank, Cleveland OH Feb 18 – July 21


Role: Data Analyst

Cleveland-based KeyCorp is one of the nation's largest bank-based financial services companies, providing investment management, retail and commercial banking, consumer finance, and investment banking products and services. Their mortgage system is a Windows-based software package designed to be a complete mortgage lending solution. The system allows processing, closing, and underwriting of all loan types for wholesale and retail production, and also provides a centralized repository for the underwriting and monitoring process.

Responsibilities:
 Involved with creating the ETL specification documents and contributing to the detailed design of the DW.
 Optimize RPG programs for efficient data extraction and processing on IBM AS400.
 Develop and maintain logical files and indexes to improve data retrieval performance on IBM AS400 and GCP
(Google Cloud Platform).
 Integrate IBM AS400 data sources with Google Cloud services using APIs and custom connectors.
 Implement job scheduling and batch processing using native IBM AS400 utilities like Job Scheduler.
 Perform data replication and synchronization between IBM AS400 and cloud-based storage or databases.
 Monitor AS400 system performance and resource utilization during data extraction operations. Track task progress through daily/weekly status calls and take part in bug triage meetings to keep defect aging under control.
 Performed database validation by executing SQL queries and testing of data correctness in Snowflake Schema.
 Involved with the request creation for the development and test environment set up.
 Performed extensive tasks including Data Quality Analysis, Data Lineage, and Data Standardization across data structures, database design, data warehouses, business intelligence/analytics tools, SQL, ETL tools, and data integration methods in the Snowflake schema and DB2 databases.
 Extensively involved in performing Gap Analysis in the upgrade of the existing Claims Processing system and
highlighting related issues.
 Performed Data Profiling to examine the data available in an existing database.
 Perform data profiling and data quality analysis to ensure the accuracy, completeness, and consistency of the data within the Snowflake schema; identify and address any data quality issues or anomalies and work with the relevant teams to resolve them (a profiling query sketch follows this list).
 Wrote SQL queries to validate source data against data in the data warehouse, including identification of duplicate records (see the validation sketches after this list).
 Tested the application by writing SQL queries and creating pivot views to perform back-end testing.
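
A minimal profiling query of the kind described above, run against a hypothetical dim_customer table to check completeness and consistency; the table and column names are illustrative only.

    -- Row count, null counts, and distinct values flag completeness
    -- and consistency issues before deeper analysis.
    SELECT COUNT(*)                                            AS total_rows,
           SUM(CASE WHEN cust_name IS NULL THEN 1 ELSE 0 END)  AS null_names,
           COUNT(DISTINCT state_cd)                            AS distinct_states,
           MIN(open_date)                                      AS earliest_open,
           MAX(open_date)                                      AS latest_open
    FROM   dim_customer;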
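Sketches of the source-versus-warehouse validation queries, assuming hypothetical src and dw schemas; the duplicate check groups on the business key rather than the surrogate key.

    -- Row counts that should reconcile between source and warehouse.
    SELECT (SELECT COUNT(*) FROM src.loan_txn)     AS source_rows,
           (SELECT COUNT(*) FROM dw.fact_loan_txn) AS warehouse_rows
    FROM   SYSIBM.SYSDUMMY1;   -- DB2's one-row dummy table

    -- Duplicate business keys loaded into the warehouse.
    SELECT loan_id, txn_date, COUNT(*) AS dup_count
    FROM   dw.fact_loan_txn
    GROUP  BY loan_id, txn_date
    HAVING COUNT(*) > 1;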

Client: Charles Schwab Inc, Phoenix, AZ Aug 15 – Dec 17


Role: Data Analyst

The project involved adding self-service features to the Schwab institutional platform for their clients, so the firm could divert operational traffic from the Schwab service centers and call centers to the web portal and attain overall operational efficiency. Also redesigned the existing brokerage account opening and maintenance processes to minimize time lag and human interaction.

Responsibilities:
 Managed the project using the Agile method of project development and implemented the Scrum development methodology.
 Documented and reported any variances, issues and conflicts after completion of every sprint execution.
 Set up and managed support functions covering planning, tracking, reporting, quality management, and internal communication.
 Produced consolidated reporting for the Project Steering Board, including milestone summary, key issues, risks, benefits, and a summary of costs incurred.
 Create the logical and physical data models for the Snowflake schema. Use data modeling tools to define the structure of the schema, including tables, columns, primary and foreign key relationships, and indexing strategies, together with the Data Architect and the data engineering team.
 Document the data model to provide a clear representation of the Snowflake schema design.
 Collaborate with data engineers and ETL (Extract, Transform, Load) developers to define the data integration
requirements for populating the Snowflake schema.
 Identify the necessary data transformations, mappings, and business rules to transform and load data from various sources into the Snowflake schema (a load sketch follows this list).
 Worked as a Data Analyst to gather requirements, meeting each and every deadline well ahead of the timeline.
 Understood business requirements from user interviews and then converted them into technical specifications.
 Involved in Data Validation, Data Modeling, Data Flows, Data Profiling, Data Mining, Data Quality, Data Integration,
Data Verification, and Data loading in Snowflake schema.
 Used MS Visio to represent the system under development using use case diagrams, activity diagrams, and workflow diagrams.
 Implemented Agile Scrum methodology for the project life cycle and scrum framework for team management.
 Involved in mentoring specific projects in applying the new SDLC based on Agile Scrum and the Rational Unified Process, especially from the project management, requirements, and architecture perspectives.
 Performed analysis on enterprise data/report integration & provided functional specification to development team
to build Enterprise Reporting Systems.
 Gathered and interpreted data and trends on health care industry, managed care and Medicaid health care policy
resulting in the identification of changes to state health care and Medicaid policy.
 Performed web testing of the application for browser dependency.
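
A minimal sketch of the transformation and load step described above: an INSERT ... SELECT from a hypothetical staging table into the customer dimension, applying simple standardization rules. Names and rules are illustrative, not the project's actual mappings.

    -- Load the customer dimension from staging, applying business rules:
    -- trim names, standardize state codes, derive an active flag,
    -- and reject rows that fail the key rule.
    INSERT INTO dw.dim_customer (cust_id, cust_name, state_cd, is_active)
    SELECT s.cust_id,
           TRIM(s.cust_name),
           UPPER(s.state_cd),
           CASE WHEN s.status = 'A' THEN 1 ELSE 0 END
    FROM   stage.customer AS s
    WHERE  s.cust_id IS NOT NULL;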

EDUCATION:
