
Jayant

Sr. Business Data Analyst


jayantibhai976@gmail.com
347-229-9978
________________________________________________________________________________________
Professional Summary:
 9+ years of professional experience in the Information Technology (IT) industry serving as a Data
Analyst in the Banking, Finance, Loan, and Investment Banking domains, with Master Data
Management (MDM), ETL development, and data modeling experience, coordinating with
stakeholders at every level, writing Use Cases, gathering requirements, and supporting the
end-to-end Software Development Life Cycle (SDLC).
 Experience in Data Analysis, Business Process Analysis/Modeling, Business Requirements
Gathering, Database Design, Data Warehousing, Data Mapping, and development of web-based
and client/server applications.
 Expertise in Financial Services Industry in mortgage administration, loan monitoring and
diverse Capital Markets domains in areas such as Fixed Income, Financial reporting
using Business Objects and Spreadsheets.
 Experienced in financial instruments within investment banking and brokerage environments
(Derivatives, Fixed Income, Hedge Funds, Capital Markets, Energy trading, FX, Futures,
Options, Commodities, Debt, Equity and Structured Products, Mutual Funds, Swaps, Asset
Backed Securities) and wealth management.
 Expertise in diverse retail banking sectors including lending, deposits, payments, cards, trade
finance, treasury, regulatory reporting (BI), KYC, AML, and payment systems.
 In-depth knowledge of Anti-Money Laundering (AML) and the Bank Secrecy Act (BSA). Worked on
Suspicious Activity Reports (SAR), OFAC reports, Currency Transaction Reports (CTR), Foreign
Bank and Financial Accounts (FBAR) reports, and CMIR reports.
 Experience with Data Stewards, Data Owners, Authoritative Originators, Data Consumers.
 Experienced in designing Star Schema (identification of facts, measures, and dimensions) and
Snowflake Schema for Data Warehouse and ODS architectures using data modeling tools like
ER Studio.
 Vast experience in generating regulatory documents including Technical Files, Declarations of
Conformity, Certificates to Foreign Government, and Certificates of Origin used for global product
registration.
 Experience working with data modeling tools like Erwin, Power Designer and ER Studio.
 Extensive experience in data warehousing tools and technologies such as Informatica
PowerCenter, PowerExchange, Metadata Manager, MDM, OBIEE, QlikView, SSIS, SSRS,
MicroStrategy, etc.
 Expertise in implementation processes for deploying, upgrading, managing, archiving and
extracting data for reporting using SQL Server Reporting Services (SSRS).
 Have considerable expertise in Metadata Management, Data Analysis, Data Profiling & Quality,
Data Governance and Master Data management (MDM).
 Expertise in data ingestion into big data platforms from different systems, covering both structured
and unstructured data.
 Expertise in designing the ETL around Aladdin, Bloomberg AIM, Eagle Pace and accounting
systems.
 Very proficient with Data Analysis, mapping source and target systems for data migration efforts
and resolving issues relating to data migration.
 Expert in Data Analytics, DW/ETL, BI report development, data mining, data cleansing, analysis
and database programming.
 Experienced in Dimensional and Relational Data Modeling using Star Schema/Snowflake
Modeling, Fact and Dimension Tables, and Physical and Logical Data Modeling.
 Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation,
Integration, Data Import, and Data Export through the use of MS SQL Server and
Informatica Data Quality tools.
 Experience in Reference Data Management (RDM) concepts, Methodologies and ability to apply
this knowledge in building enterprise reference data warehouse using Informatica RDM/MDM
accelerator and IDD tools.
 Expertise in creating Test Cases on the basis of product features, client requirements and technical
documents. Developed, executed and maintained Test Scripts.
 Good understanding of wealth management in the capital markets.
 Excellent understanding of Database management system, Data warehousing concepts, Business
Intelligence technologies and Data mining.
 Extensive experience with AFS and ACBS (Advanced Commercial Banking System) data
processing, system and business analysis, data mapping, testing, training, conversion, system
upgrade, General Ledger Account balancing, and post-conversion support involving strong
analytical and problem-solving skills. Experienced in full project life cycle.
 Experience in Data Scrubbing/Cleansing, Data Quality, Data Governance, Data Mapping, Data
Modeling, Data Profiling, Data Validation in ETL.
 Proficient in multiple Business Intelligence tools such as Power BI & Tableau.
 Extensive hands-on experience creating prescriptive data lineage, active data lineage
using Informatica Metadata Manager combined with Informatica Developer.
 Experience in creating and documenting Metadata for OLTP and OLAP when designing a
system.
 Experience in data transformation, data mapping from source to target database schemas and data
cleansing procedures using Informatica Power Center and SSIS.
 Expertise in providing metadata management, end to end data lineage solution of organizations
using Informatica metadata manager (IMM) and Analyst tool (IA).
 Experience with DataStage for extracting, transforming, and loading data from sources
including Oracle, DB2, and flat files.
 Experience in creating, writing, and executing Test Cases, Test Scripts from Business User
Requirement documents and Functional Design Documents.
 Advanced proficiency with Excel, Word, the BlackLine Financial Account Reconciliation application,
and Oracle & PeopleSoft General Ledger applications.
 Experience in converting legacy reports to more contemporary programs such as Crystal Reports,
Business Objects, MS Excel, MS Access, and SQL, and condensing the number of reports to
simplify processes.
 Demonstrated ability to work in both independent and team-oriented environments with well-
developed organizational skills.
 Strong knowledge of the procedures of Banking Regulatory Compliances especially in AML, KYC
and Basel Accords.
 Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases
and in designing and developing Test Plans and Test Scripts.

TECHNICAL SKILLS:

Business Skills Business Requirements, Business Process Analysis & Design, Financial
Analysis, Risk Analysis, Requirement Gathering, Use Case Modeling,
JAD/JRP Sessions, GAP Analysis & Impact Analysis, OSS, BSS
Databases SQL Server, Oracle, MySQL, NoSQL, Teradata, DB2
Cloud Technologies AWS, Microsoft Azure, GCP
Analytic Tools SAS (MRIMA, MRMA, Macro), SPSS, MS Excel, MS Access
ETL and BI Tools SSRS, Report Builder, Power BI, Tableau, Informatica
Languages SQL, PL/SQL, Python, UML, HTML, XML, VB, JavaScript
Reporting Tools SQL Server Reporting Services (SSRS), SSIS, Crystal Reports, OLAP,
Erwin

IT Processes Software Development Life Cycle (SDLC), Agile, Waterfall, Iterative

Professional Experience:
M&T Bank, Buffalo, NY April 2022 - present
Sr. Business Data Analyst
As a Senior Data Analyst, I documented the process to assess eligibility for financing and securitization of
Asset Based Lending loans. This involved sourcing data from multiple channels and establishing methods
for data acquisition and storage. The Structured Finance Group at M&T Bank utilizes the Stucky Application
software to track commercial collateral metrics, including Borrowing Base, Aging, Receivables, and
Inventory.

Responsibilities:
 Acquired experience in all aspects of retail business including sales tracking, financial reporting,
staffing, training & development, staff relations, marketing and inventory control.
 Created reports on Statutory Liquidity Ratio (SLR) or Liquidity Coverage Ratio (LCR), Cash
Reserve Ratio, Daily time and demand deposit statuses and Projected Provisions.
 Facilitated scrum meetings and acted as a buffer between the development team and non-
development distractions.
 Worked with the Validation Analyst to gather requirements, implement enhancements, releases and
identify risks for Global Specification System (GSS) and Enterprise Document Management (EDM).
 Worked on Oracle SQL scripts and Hive/Impala SQL scripts to load data.
 Defined the ingestion strategy and performed data acquisition with incremental data loads in Hive.
 Set up and maintained the Retail Banking sales incentive reporting system and developed a retail
banking application to provide true end-to-end visibility.
 Created data warehouse objects - Fact tables, dimension tables, and association tables.
 Executed SQL queries to test back-end data validation of DB2 database tables based on business
requirement.
 Worked with SQL Joins and SQL queries extensively to evaluate data process and end user
satisfaction.
 Performed Gap analysis and implemented updates for the Data Warehousing Application.
 Interacted with other data scientists and architected custom solutions for data visualization using
tools like Tableau, packages in R, and R Shiny.
 Implemented new test procedures in data warehouse QA environment like validating data
availability, incremental data load, data accuracy, data loss and naming standards.
 Reviewed Commercial loan documentation to ensure enhancements are meeting the proper audit
standards.
 Used Rational Rose to design and develop Use Case Scenarios, Use Case Models, Activity
Diagrams, Sequence Diagrams, and State Chart Diagrams (OOAD using UML).
 Wrote Test Cases in MS Excel for user registration, access to training material, and activity log-in;
reviewed and finalized the test cases.
 Worked on DataStage Manager for importing metadata from repository, new job categories and
creating new data elements.
 Involved in creating ETL using Informatica by applying best practices and error handling process.
 Worked extensively with Microsoft Excel (Macros, VLOOKUPs, and Pivot Tables); proficient in
Microsoft Word, PowerPoint, and Access.
 Facilitated Joint Requirement Planning (JRP) sessions with SMEs to understand the
requirements pertaining to Loan Origination through Loan Processing.
 Created requirements analysis and design phase artifacts using Requisite Pro, activity diagrams
and Sequential diagrams.
 Involved in Target Data Mapping, Data Profiling, and Transformation rules for OLAP.
 Analyzed Data Set with SAS programming, R and Excel.
 Extensive experience in text analytics, generating data visualizations using R and Python, and
creating dashboards using tools like Tableau.
 Worked on improving Data Governance policies and processes, data management, and data
management infrastructure, investing significant resources to guide and fulfill data needs.
 Involved in working with the System Engineering team on Data Governance to optimize their efforts.
 Used Power BI to design interactive reports and data visualization using Power BI Desktop.
 Developed Power BI reports and dashboards from multiple data sources using data blending.
 Ran data lineage analysis to perform impact analysis and root cause analysis (RCA).
 Implemented Front End to a Fixed Income Derivatives Trading System, including trade blotter and
interfaces to risk management, accounting, and middle and back office.
 Created use-case scenarios and storyboards in MS Word and MS PowerPoint for better
visualization of the application and managed them using Rational Requisite Pro.
 Worked with the backend data retrieval team, data stewards, data mart team to guide the proper
structuring of data for Power BI reporting.
 Wrote Test Plans in MS Word for Manual Testing, System Testing, Integration Testing,
Performance Testing, Regression Testing & reviewed their consistency with the business
requirements.
 Provided assistance to the Quality Analyst team with writing test scripts between Hive and Oracle,
and between Excel/CSV files and Hive tables, using Spark.
 Highly experienced in migrating data from legacy systems (Mainframes - VSAM, QSAM files, IMS
DB/DC hierarchical database, COBOL) to Oracle/SQL Server staging areas, and in developing
interfaces (ETL tools) between legacy systems and Oracle staging.
 Coded DB2 COBOL modules and worked on socket communication modules to connect to open
systems.
 Worked with the Enterprise Data Warehouse team and the Business Intelligence Architecture team
to understand repository objects that support the business requirements and processes.
 Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata
utilities such as MultiLoad (MLOAD), BTEQ, and FastLoad.
 Upgraded the present application by adding new functionalities and new reports; wrote
regression test cases and did smoke testing with users.
 Coordinated with business users, database administrators, the mainframe team, and the testing
team in mirror-to-production testing.
 Prepared Scripts in Python and Shell for Automation of administration tasks.
 Coordinated activities with project manager and various other teams using MS Project.
 Supported User Acceptance testing (UAT) and collaborated with the QA team to develop test plans,
test scenarios, test cases and test data to be used in testing, based on business requirements,
technical specifications and/or product knowledge.

JPMorgan Chase, New York, NY March 2020 - March 2022
Sr. Data Analyst
My assignment at JPMorgan Chase involved constructing an Enterprise Reporting System for asset
management and credit risk management. The Credit Risk Evaluator Project is a comprehensive,
customer-controlled platform designed for credit risk management, providing robust tools for measurement,
monitoring, and management of credit risk. The platform empowers institutions with analytical capabilities
and user control, allowing estimation and stress testing of credit risk measures and rating estimates
through scorecards/models.
Responsibilities:

 Contributed to the Data Management and Business Solutions team, engaging in tasks such as Data
Analysis, Requirement gathering, GAP analysis, Data Mapping, and Testing of ETL and BI
Reporting requirements.
 Generated source-to-target data mapping documents for reconciling data across different systems.
 Utilized Agile/Scrum methodology for gathering requirements, documenting user stories, and
discussing project specifications.
 Worked on business requirements for CIB and IHC Financial Reporting, Management Reporting,
and Regulatory Reporting work streams, aiming to enhance the quality and process of Board
reports, management reports, financial statements, and regulatory reports.
 Collaborated with users and IT to configure AXIOM for manual data input templates, plan Feed and
Line Automation.
 Exported large datasets to Excel, creating relational worksheets, pivot tables, and performing data
comparisons and searches.
 Developed Pricing Reports and created Pivot Tables for review and comparison.
 Managed multiple projects in the HR, Finance, and Revenue Accounting areas.
 Expert in managing, storing, and analyzing data using Microsoft Azure data services, including
Azure Synapse Analytics, Azure SQL Database, and Azure Data Lake Storage.
 Performed data engineering activities using Azure Databricks, including feature engineering, model
creation, and data preparation.
 Deployed a tool for analyzing products, leading to productivity savings, and developed a model to
forecast demand for different Exchanges.
 Created stored procedures, views, and custom SQL queries to import data from SQL Server to
Power BI.
 Utilized Power BI for creating dashboards and ad-hoc reports for senior management analysis.
 Provided ongoing maintenance and bug fixes for existing and new Power BI Reports.
 Performed CRUD operations in SQL Server and through HTTP/RESTful APIs.
 Operated SWIFT messaging service for financial messages between member banks worldwide.
 Conducted Data Profiling and Data Quality assessment for Siperian Data Warehouse, ODS, and
source systems using TOAD and IBM Information Analyzer tools.
 Collaborated with Cash Management Division and Compliance Office for AML requirements and to
provide Positive Pay service for check fraud reduction.
 Engaged in data cleansing, layout creation, and specifications for end-user reports, delivering a
complete Metadata Management solution.
 Created use cases, data flow models, and performed cost/benefit analysis.
 Identified customer marketing opportunities using data mining models, advanced T-SQL stored
procedures, and Python in data collection programming.
 Worked on performance tuning of mappings, sessions, targets, and sources.
 Applied advanced functions such as VLOOKUPs, Pivots, graphs, and the analytical and statistical
tool packs in Excel.
 Demonstrated a thorough understanding of various AML modules, including Watch List Filtering,
Suspicious Activity Monitoring, CTR, CDD, and EDD.
 Involved in Data extractions and mapping using customized PL/SQL code on Unix/Linux platforms.
 Designed and developed strategies for implementing SOAP and REST web services.
 Extensive experience with ETL & Reporting tools such as SQL Server Integration Services (SSIS)
and SQL Server Reporting Services (SSRS), including UML structure and behavior diagrams and
reporting using advanced Word/Excel skills.

Wells Fargo Bank, Phoenix, AZ May 2018 - Feb 2020


Data Analyst
The focus of the project was a significant conversion/migration from their old mainframe to a tailored loan
origination system with analytic capabilities. Wells Fargo was in the process of constructing a centralized
data warehouse to integrate their diverse businesses. This project specifically addressed linking the
mortgage business to the centralized warehouse, aiming to reduce loan origination processing times and
enhance overall productivity.
Responsibilities:
 Translated business requirements into detailed technical specifications, new features, and
enhancements for existing technical business functionality.
 Prepared and analyzed AS-IS and TO-BE architectures, conducted Gap Analysis, created workflow
scenarios, designed new process flows, and documented Business Processes and various
Business Scenarios from conceptual to procedural levels.
 Analyzed business requirements using Unified Modeling Language (UML), developing high-level
and low-level Use Cases, Activity Diagrams, Sequence Diagrams, Class Diagrams, Data-flow
Diagrams, Business Workflow Diagrams, and Swim Lane Diagrams with Rational Rose.
 Designed and developed Data Models and Data Marts supporting Business Intelligence Data
Warehouses using an Agile SDLC methodology.
 Utilized Scrum Work Pro and Microsoft Office software within the corporation's developed Agile
SDLC methodology.
 Created and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access,
producing performance reports and implementing changes for improved reporting.
 Implemented Actimize Anti-Money Laundering (AML) system to monitor suspicious transactions
and enhance regulatory compliance.
 Collaborated with region and country AML Compliance leads, supporting compliance-led projects,
defining phases, training, test scripts, data migration, and uplift strategy.
 Provided web analytics reports using Adobe Site Catalyst and Google Analytics on a weekly,
monthly, and ad-hoc basis.
 Developed Tableau data visualizations using various chart types for analysis.
 Demonstrated report development experience using SQL Server Reporting Services (SSRS),
Cognos Impromptu, and Microsoft Excel.
 Wrote PL/SQL procedures for processing business logic in the database and tuned SQL queries for
better performance.
 Utilized complex Excel functions such as pivot tables and VLOOKUPs for managing large datasets.
 Developed SQL-based data warehouse environments and created custom database applications
for data archiving, analysis, and reporting.
 Conducted data mapping, logical data modeling, created class diagrams, and ER diagrams, using
SQL queries to filter data within the Oracle database.
 Worked on the systems implementation project management plan, covering milestones and steps
from vendor procurement to project implementation and maintenance.
 Created dashboard analyses using Tableau Desktop for car rental records, covering customers,
days of rental, service region, travel type, etc.
 Conducted research and analysis and provided qualitative/quantitative insights to lead and direct
projects and initiatives meeting the Firm's AML/KYC standards.
 Assisted in integration testing, regression testing, and analyzed data, creating reports using SQL
queries.
 Performed extensive requirement analysis, including data and gap analysis.
 Designed and implemented basic SQL queries for QA Testing and Report/Data Validation.
 Created UML structure and behavior diagrams, utilizing advanced Word/Excel skills, and writing
basic SQL scripts for reporting.
UBS, Bethesda, MD Nov 2016 - April 2018
Data Analyst
The project aimed to design and populate the Reporting Data Store (RDS) using SSIS packages, stored
procedures, and views from the Central Data Warehouse (CDW). It also included the creation of drill-down
and drill-through reports using SSRS to showcase reconciliation variances from source to target.
Responsibilities:
 Built models and solved complex business problems where analysis of situations and/or data
required in-depth evaluation of variable factors.
 Involved in interactions with the Subject Matter Expert, Project Manager, Developers, and the End-
Users.
 Understood the requirements and developed the ETL process.
 Performed unit tests on the ETL code and wrote test cases.
 Generated data models using Erwin, developed a relational database system, and was involved in
logical modeling using Dimensional Modeling techniques such as Star Schema and Snowflake
Schema.
 Participated in documenting the data governance framework including processes for governing the
identification, collection, and use of data to assure accuracy and validity.
 Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
 Involved in creating Informatica mappings to populate staging tables and data warehouse tables
from various sources such as flat files, DB2, Netezza, and Oracle.
 Involved in loading data from vendor-submitted flat files into Oracle 12c external tables;
extensively used ETL to load data from the Oracle database, XML files, and flat files, and
imported data from IBM mainframes.
 Used Python to preprocess data and attempt to find insights.
 Involved with ETL team to develop Informatica mappings for data extraction and loading the data
from source to MDM Hub Landing tables.
 Involved in MDM Process including data modeling, ETL process, and prepared data mapping
documents based on graph requirements.
 Used Pivot tables, VLOOKUPS and conditional formatting to verify data uploaded to proprietary
database and online reporting.
 Involved in implementing metadata standards, data governance and stewardship, master data
management, ETL, ODS, data warehouse, data marts, reporting, dashboards, analytics,
segmentation, and predictive modeling; EDM (Enterprise Data Management) promotes data
integration between systems via standardized data interfaces.
 Coordinated with Data Architects and Data Modelers to create new schemas and views in Netezza
to improve report execution time; worked on creating optimized Data Mart reports.
 Independently owned IT support tasks related to Tableau reports on the server.
 Served as the primary subject matter expert in implementing Oracle's self-service OBIEE reporting
system, which allows real-time data to be accessible at all HR locations throughout the company.
 Evaluated, identified, and solved process and system issues utilizing business analysis and design
best practices, and recommended enterprise and Tableau solutions.
 Used advanced functions like VLOOKUPs, Pivots, graphs, and the analytical and statistical tool
packs in Excel.
 Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
 Tested Complex ETL Mappings and Sessions based on business user requirements and business
rules to load data from source flat files and RDBMS tables to target tables.
 Extensively used SQL Server 2014 tools to develop Oracle stored packages, functions, and
procedures for Oracle database back-end validations and web application development.
 Performed analyses such as regression analysis, logistic regression, discriminant analysis, cluster
analysis using SAS programming.
Frost Bank, Cincinnati, OH May 2015 - Oct 2016
Data Analyst
My responsibilities included evaluating the effects on downstream data management environments
resulting from processing various banking products through different systems and processes.

Responsibilities:
 Documented all data mapping and transformation processes in the Functional Design documents
based on the business requirements.
 Involved in Data mining, transformation and loading from the source systems to the target system.
 Involved in defining the source to target data mappings, business rules, and business and data
definitions.
 Provided quality data review for completeness, inconsistencies, erroneous and missing data
according to data review plan.
 Prepared High Level Logical Data Models using Erwin, and later translated the model into physical
model using the Forward Engineering technique.
 Made data chart presentations and coded variables from original data, conducted statistical
analysis as and when required and provided summaries of analysis.
 Was responsible for using MDM to decide how to manage the master data life cycle, cardinality,
and complexity, and to collect and analyze metadata about the master data.
 Loaded operational data from Oracle, flat files, XML files, Excel spreadsheets into Oracle
target data mart and used Informatica Power Exchange for mainframe sources, such
as COBOL files.
 Extensively used Informatica Power Center Designer to create a variety of mappings including
expression, sequence generator, source qualifier, router, joiner, update strategy, aggregator, stored
procedure, and lookup transformations with embedded business logic.
 Involved in extensive data validation using SQL queries and back-end testing.
 Contributed to the initial data mining work and development of tools and technology.
 Initiated detailed discussions/functional walkthroughs with stakeholders.
 Produced system documents, functional requirements, and ETL mapping and modeling deliverables.
Worked on root causes of data inaccuracies and improved data quality using DataFlux.
 Wrote complex SQL and T-SQL testing scripts for back-end testing of the data warehouse application.
 Assisted developers and testing teams to ensure that the requirements are communicated
accurately and that the system being developed is verifiable against the requirements.
 Wrote SQL queries to validate source data versus data in the data warehouse including
identification of duplicate records.
 Tested Ad-hoc reports, drill down and drill through reports using SSRS.
 Tested different detail, summary reports and on demand reports.
 Involved in pre and post session migration planning for optimizing data load performance.
 Facilitated periodic data loading by using reusable transformations, mapplets and worklets.
 Used parameter files to define values for mapping parameters and variables to provide flexibility.
 Implemented the Slowly Changing Dimensions (SCD) of type 1 and type 2 (flag, date) for the
purpose of incremental loading of target database.
 Worked on performance tuning of the SQL queries, Informatica mappings, sessions, and workflows
to achieve faster data loads to the data mart.
 Extensively used SQL DDL/DML commands with TOAD environment to create target tables and
perform different testing for accuracy & quality.
