Jayant
TECHNICAL SKILLS:
Professional Experience:
M&T Bank, Buffalo, NY April 2022 - present
Sr. Business Data Analyst
As a Senior Data Analyst, I documented the process to assess eligibility for financing and securitization of
Asset Based Lending loans. This involved sourcing data from multiple channels and establishing methods
for data acquisition and storage. The Structured Finance Group at M&T Bank utilizes the Stucky Application
software to track commercial collateral metrics, including Borrowing Base, Aging, Receivables, and
Inventory.
Responsibilities:
Acquired experience in all aspects of retail business, including sales tracking, financial reporting,
staffing, training and development, staff relations, marketing, and inventory control.
Created reports on Statutory Liquidity Ratio (SLR), Liquidity Coverage Ratio (LCR), Cash
Reserve Ratio, daily time and demand deposit statuses, and projected provisions.
Facilitated scrum meetings and acted as a buffer between the development team and non-
development distractions.
Worked with the Validation Analyst to gather requirements, implement enhancements, releases and
identify risks for Global Specification System (GSS) and Enterprise Document Management (EDM).
Wrote Oracle SQL scripts and Hive/Impala SQL scripts to load data.
Defined the ingestion strategy and performed data acquisition with incremental data loads in Hive.
Set up and maintained the Retail Banking sales incentive reporting system and developed a retail
banking application to provide true end-to-end visibility.
Created data warehouse objects - Fact tables, dimension tables, and association tables.
Executed SQL queries to test back-end data validation of DB2 database tables based on business
requirement.
Worked with SQL Joins and SQL queries extensively to evaluate data process and end user
satisfaction.
Performed Gap analysis and implemented updates for the Data Warehousing Application.
Interacted with other data scientists and architected custom solutions for data visualization using
tools such as Tableau, R packages, and R Shiny.
Implemented new test procedures in data warehouse QA environment like validating data
availability, incremental data load, data accuracy, data loss and naming standards.
Reviewed Commercial loan documentation to ensure enhancements are meeting the proper audit
standards.
Used Rational Rose to design and develop Use Case Scenarios, Use Case Models, Activity
Diagrams, Sequence Diagrams, and State Chart Diagrams (OOAD using UML).
Wrote Test Cases in MS Excel for user registration, access to training material, and activity log-in;
reviewed and finalized the test cases.
Worked on DataStage Manager for importing metadata from repository, new job categories and
creating new data elements.
Involved in creating ETL using Informatica by applying best practices and error handling process.
Worked extensively with Microsoft Excel (Macros, VLOOKUPs, and Pivot Tables); proficient in
Microsoft Word, PowerPoint, and Access.
Facilitated Joint Requirement Planning (JRP) sessions with SMEs to understand the
requirements pertaining to Loan Origination through Loan Processing.
Created requirements analysis and design phase artifacts using Requisite Pro, activity diagrams
and Sequential diagrams.
Involved in Target Data Mapping, Data Profiling, and Transformation rules for OLAP.
Analyzed Data Set with SAS programming, R and Excel.
Performed text analytics, generated data visualizations using R and Python, and created
dashboards using tools like Tableau.
Improved Data Governance policies and processes, data management practices, and data
management infrastructure, investing significant resources to guide and fulfill the organization's
data needs.
Involved in working with the System Engineering team on Data Governance to optimize their efforts.
Designed interactive reports and data visualizations using Power BI Desktop.
Developed Power BI reports and dashboards from multiple data sources using data blending.
Ran data lineage analysis to perform impact analysis and root cause analysis (RCA).
Implemented Front End to a Fixed Income Derivatives Trading System, including trade blotter and
interfaces to risk management, accounting, and middle and back office.
Created use-case scenarios and storyboards in MS Word and MS PowerPoint for better
visualization of the application and managed them using Rational Requisite Pro.
Worked with the backend data retrieval team, data stewards, data mart team to guide the proper
structuring of data for Power BI reporting.
Wrote Test Plans in MS Word for Manual Testing, System Testing, Integration Testing,
Performance Testing, Regression Testing & reviewed their consistency with the business
requirements.
Assisted the Quality Analyst team with writing test scripts between Hive and Oracle, and between
Excel/CSV files and Hive tables, using Spark.
Migrated data from legacy systems (Mainframes - VSAM, QSAM files, IMS DB/DC Hierarchical
Database, COBOL) to Oracle / SQL Server staging areas, developing ETL interfaces between the
legacy systems and Oracle staging.
Coded DB2 COBOL modules and worked on socket communication modules to connect to open
systems.
Worked with the Enterprise Data Warehouse team and Business Intelligence Architecture team to
understand repository objects that support the business requirements and processes.
Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata
utilities such as MultiLoad (MLOAD), BTEQ, and FastLoad.
Upgraded the present application by adding new functionalities and adding new reports and wrote
regression test cases, did smoke testing with users.
Coordinated with business users, the database administrator, the mainframe team, and the testing
team on mirror-to-production testing.
Prepared Scripts in Python and Shell for Automation of administration tasks.
Coordinated activities with project manager and various other teams using MS Project.
Supported User Acceptance testing (UAT) and collaborated with the QA team to develop test plans,
test scenarios, test cases and test data to be used in testing, based on business requirements,
technical specifications and/or product knowledge.
Contributed to the Data Management and Business Solutions team, engaging in tasks such as Data
Analysis, Requirement gathering, GAP analysis, Data Mapping, and Testing of ETL and BI
Reporting requirements.
Generated source-to-target data mapping documents for reconciling data across different systems.
Utilized Agile/Scrum methodology for gathering requirements, documenting user stories, and
discussing project specifications.
Worked on business requirements for CIB and IHC Financial Reporting, Management Reporting,
and Regulatory Reporting work streams, aiming to enhance the quality and process of Board
reports, management reports, financial statements, and regulatory reports.
Collaborated with users and IT to configure AXIOM for manual data input templates, plan Feed and
Line Automation.
Exported large datasets to Excel, creating relational worksheets, pivot tables, and performing data
comparisons and searches.
Developed Pricing Reports and created Pivot Tables for review and comparison.
Managed multiple projects in the HR, Finance, and Revenue Accounting areas.
Managed, stored, and analyzed data using Microsoft Azure data services, including
Azure Synapse Analytics, Azure SQL Database, and Azure Data Lake Storage.
Performed data engineering activities using Azure Databricks, including feature engineering, model
creation, and data preparation.
Deployed a tool for analyzing products, leading to productivity savings, and developed a model to
forecast demand for different Exchanges.
Created stored procedures, views, and custom SQL queries to import data from SQL Server to
Power BI.
Utilized Power BI for creating dashboards and ad-hoc reports for senior management analysis.
Provided ongoing maintenance and bug fixes for existing and new Power BI Reports.
Performed CRUD operations against SQL Server and over HTTP via RESTful APIs.
Operated SWIFT messaging service for financial messages between member banks worldwide.
Conducted Data Profiling and Data Quality assessment for Siperian Data Warehouse, ODS, and
source systems using TOAD and IBM Information Analyzer tools.
Collaborated with Cash Management Division and Compliance Office for AML requirements and to
provide Positive Pay service for check fraud reduction.
Engaged in data cleansing, layout creation, and specifications for end-user reports, delivering a
complete Metadata Management solution.
Created use cases, data flow models, and performed cost/benefit analysis.
Identified customer marketing opportunities using data mining models, advanced T-SQL stored
procedures, and Python in data collection programming.
Worked on performance tuning of mappings, sessions, Target and Source.
Applied advanced Excel functions such as VLOOKUPs, Pivot Tables, graphs, and analytical and
statistical tool packs.
Demonstrated a thorough understanding of various AML modules, including Watch List Filtering,
Suspicious Activity Monitoring, CTR, CDD, and EDD.
Involved in Data extractions and mapping using customized PL/SQL code on Unix/Linux platforms.
Designed and developed strategies for implementing SOAP and REST web services.
Used ETL and reporting tools such as SQL Server Integration Services (SSIS)
and SQL Server Reporting Services (SSRS), along with UML structure and behavior diagrams and
reporting using advanced Word/Excel skills.
Responsibilities:
Documented all data mapping and transformation processes in the Functional Design documents
based on the business requirements.
Involved in Data mining, transformation and loading from the source systems to the target system.
Involved in defining the source to target data mappings, business rules, and business and data
definitions.
Provided quality data review for completeness, inconsistencies, and erroneous and missing data
according to the data review plan.
Prepared High Level Logical Data Models using Erwin, and later translated the model into physical
model using the Forward Engineering technique.
Made data chart presentations and coded variables from original data, conducted statistical
analysis as and when required and provided summaries of analysis.
Used MDM to manage the master data life cycle, cardinality, and complexity, and to
collect and analyze metadata about the master data.
Loaded operational data from Oracle, flat files, XML files, Excel spreadsheets into Oracle
target data mart and used Informatica Power Exchange for mainframe sources, such
as COBOL files.
Extensively used Informatica Power Center Designer to create a variety of mappings including
expression, sequence generator, source qualifier, router, joiner, update strategy, aggregator, stored
procedure, and lookup transformations with embedded business logic.
Involved in extensive DATA validation using SQL queries and back-end testing.
Contributed to the initial data mining work and development of tools and technology.
Initiated detailed discussions/functional walkthroughs with stakeholders.
Produced system documents, functional requirements, and ETL mapping and modeling deliverables.
Identified the root cause of data inaccuracies and improved data quality using DataFlux.
Wrote complex SQL and T-SQL test scripts for back-end testing of the data warehouse application.
Assisted developers and testing teams to ensure that the requirements are communicated
accurately and that the system being developed is verifiable against the requirements.
Wrote SQL queries to validate source data versus data in the data warehouse including
identification of duplicate records.
Tested Ad-hoc reports, drill down and drill through reports using SSRS.
Tested different detail, summary reports and on demand reports.
Involved in pre- and post-session migration planning to optimize data load performance.
Facilitated periodic data loading by using reusable transformations, mapplets and worklets.
Used parameter files to define values for mapping parameters and variables to provide flexibility.
Implemented the Slowly Changing Dimensions (SCD) of type 1 and type 2 (flag, date) for the
purpose of incremental loading of target database.
Worked on performance tuning of the SQL queries, Informatica mappings, sessions, and workflows
to achieve faster data loads to the data mart.
Extensively used SQL DDL/DML commands with TOAD environment to create target tables and
perform different testing for accuracy & quality.