
Praveen kumar Sampath Kumar

+1-732-731-0310
wasim@gp-technologies.com

Experience Summary

• Having 15+ years of experience in designing, developing, and maintaining large business applications such as data migration, integration, and conversion.
• Having 5 years of experience in the Hadoop ecosystem, designing and developing applications.
• Domain experience includes tolling and financial services, with expertise in Risk Management, ERP, eCommerce, and clearing and settlement of securities in the F&O and capital markets.
• Experience in the design and development of Big Data solutions using Hadoop ecosystem technologies (Apache Spark (Scala and PySpark), HDFS, Hive, Sqoop, Kafka, Pig, Oozie, MapReduce).
• Working on AWS components such as S3, EMR, EC2, Kinesis Data Streams, Glue, Lambda, Redshift, Data Lake, Athena, RDS, and Step Functions.
• Worked extensively with message queuing and streaming tools such as Kafka, and NoSQL databases such as HBase and Cassandra.
• Hands-on experience with data ingestion and orchestration tools such as NiFi and Airflow.
• Worked extensively on Hadoop migration projects and POCs.
• Capable of processing large sets of structured, semi-structured, and complex data.
• Drove simplification and optimization initiatives to bring efficiency into applications.
• Held versatile roles across diverse applications and automation projects as Data Engineer and Developer.
• Strong database experience in Oracle PL/SQL, HiveQL (HQL), Oracle Forms and Reports, and Sybase.
• Strong scripting experience in Unix shell scripting and Windows batch programming.
• Worked in Agile methodologies. Certified Scrum Master (CSM) (https://bcert.me/sdndriubx).
• Good problem-solving and analytical skills, with a drive to innovate and perform better.
• Strong interpersonal, communication, and people skills to manage a team.

Skill Profile

Data Ecosystem: Hadoop, Sqoop, Hive, Apache Spark, YARN, HBase, Snowflake, AWS Cloud, Kafka, Cassandra, NiFi, Airflow
Distribution: Cloudera
Databases: Oracle (19c, 12c, 11g, 10g, 9i & 8i), Sybase
Languages: Scala, PySpark, Pro*C, HQL, PL/SQL, Shell scripting on UNIX, Python
Operating Systems: Linux, UNIX (HP-UX), Windows
Project Management: Microsoft Project, Azure DevOps
Applications & Tools: Azure DevOps, Jira, Git, GitHub, Eclipse, Oracle Forms/Reports Developer 10g & 6i, RCS, VSS, Toad, Oracle Enterprise Manager, SQL Extreme, iSQLW, WinSQL, SQL Developer, PL/SQL Developer, Autosys

Training & Certifications

• Certified Scrum Master (https://bcert.me/sdndriubx)
• Attended PMP training
• Training in Oracle Apps Financials: General Ledger (GL), Accounts Payable (AP), Accounts Receivable (AR), and Fixed Assets (FA)
• OCP: Introduction to Oracle 9i SQL
• Risk Management Certification from TCS, 2008
• NCFM Certifications: Surveillance in Stock Exchanges (Apr 2007), AMFI – Mutual Fund (Advisors) (Mar 2007), Capital Market (Dealers) (Nov 2006), Commodities Market (Aug 2006), Derivatives Market (Jul 2006)

Awards and Accolades

• Received the GEM Award at Hexaware.
• Received the Value Champion Award at Hexaware.
• Received client appreciation for Statement Exception clearing and Mailfee Dismissal automation.
• Received client appreciation for the successful migration of the system and for implementing E&Y audit requirements at NSE (FOCASS).
• Received client appreciation for the automation of batch monitoring in DRM/RRSS at Citi Bank.
• Successfully migrated all Pro*COBOL programs to the Oracle PL/SQL environment at Gulf International Bank.

Projects

Project: Transaction Processing Management System

Client Name: Conduent – Germantown, Maryland, USA   Feb 2017 to Present
Role: Spark SQL Developer.
The main objective of the Transaction Processing Management System is to handle electronic tolling with inter-agency capability.
Responsibilities
• Performed import and export of data into HDFS/AWS S3 and Hive using Sqoop, and managed data within the environment.
• Handled and processed data in Spark.
• Optimized Spark SQL queries, which helped reduce cost for the project.
• Exported the required Spark JARs to run on the cluster and AWS EMR.
• Performed data analysis, data quality checks, and data profiling to support the business team.
• Loaded and transformed large sets of structured and semi-structured data such as XML, Avro, and Parquet (a representative sketch follows this list).
• Handled complex data processing for JSON data.
• Performed code and peer reviews of assigned tasks, unit testing, and bug fixing.
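As an illustration of the kind of Spark-based ingestion and transformation described above, a minimal PySpark sketch follows; the S3 path, column names, and Hive table name are hypothetical placeholders, not the project's actual configuration.

# Minimal illustrative sketch only -- paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("tolling-ingest-sketch")
    .enableHiveSupport()  # assumes the cluster is configured for Hive access
    .getOrCreate()
)

# Read semi-structured JSON transaction data landed in S3 (hypothetical path).
txns = spark.read.json("s3://example-bucket/landing/toll_transactions/")

# Basic cleansing and typing before exposing the data to downstream consumers
# (column names are assumed for illustration).
cleaned = (
    txns
    .withColumn("txn_ts", F.to_timestamp("txn_ts"))
    .withColumn("txn_date", F.to_date("txn_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(10,2)"))
    .filter(F.col("plate_number").isNotNull())
)

# Write as partitioned Parquet and register it as a Hive table (hypothetical name).
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .format("parquet")
    .saveAsTable("tolling.toll_transactions_curated")
)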
Technologies: Hadoop, HDFS, Hive, Sqoop, Spark SQL (Scala and PySpark), AWS S3, EC2, EMR, Redshift, RDS, Data Lake, Snowflake, Kafka, Kinesis, NiFi, Airflow, HQL, HBase, Cassandra
Project: Dashboard, Inventory, Sales Management System and Strategic Projects Management System
Client Name: Ministry of Municipalities – Bahrain   Mar 2014 to Oct 2015
Role: Senior Systems Analyst.
Responsibilities
• Developed database objects such as procedures, functions, triggers, complex views, and materialized views.
• Performed SQL tuning.
• Created complex Oracle Forms and Reports.
• Performed data analysis, data quality checks, and data profiling to support the business team.
Technologies: Oracle PL/SQL, Unix Shell Scripting, Oracle Forms and Reports, Autosys, CA Scheduler
Project: Middle Office Operations
Client Name: Gulf International Bank – Bahrain   Jun 2011 to Mar 2014
Role: Systems Analyst.
Environment: Oracle Forms and Reports 6i, Oracle PL/SQL 11g, Sybase, Moody's Risk Analyst, Acumen, Flexcube, Murex
The main objective of Middle Office Operations is to handle the interface between front-office and back-office operations, risk management, and the in-house financial applications covering Accounts Payable (AP), Fixed Assets (FA), Cheque Printing, and Islamic Banking (IB) at Gulf International Bank.
Handled a migration project to move the Accounts Payable (AP), Fixed Assets (FA), and Cheque Printing modules of the financial applications from Pro*COBOL to the Oracle PL/SQL environment with a team of 3 members.
Project: eCommerce
Client Name: Citi Group – Bahrain   Mar 2010 to Apr 2011
Environment: Pro*C, Oracle PL/SQL 10g, Sybase, Unix Shell Scripting
Role: Systems Analyst.
The main objective of eCommerce is to handle the Foreign Exchange, Money Market, and Fixed Income trading systems of Citi Bank.
I was involved in collecting requirements from business users and developing the functionality needed to meet the product requirements of eDealer and FXLM.
Project: Credit Engine
Client Name: Citi Group (London / Chennai)   Feb 2009 to Jan 2010
Environment: Sybase, PL/SQL, Unix Shell Scripting, Crontab
Role: Systems Analyst.

The main objective of Credit Engine is to calculate various risk factors such as Settlement Risk, Pre-Settlement Risk, Lending Risk, and Mark-to-Market, and to generate various reports.
I was involved in the development and support of the Credit Engine application of Citi Bank. I interacted directly with the bank's risk analysts and supported their requests.
Project: Debt Rating Model / Risk Rating and Scoring System (DRM/RRSS)
Client Name: Citi Group – USA Feb 2008 to Feb 2009
Role: Senior Oracle / Sybase Developer
Environment: Oracle Forms 10g, Oracle Reports 10g, PL/SQL, Unix Shell Scripting, Ab Initio, Sybase, Autosys
The main objective of DRM/RRSS is to generate an Obligor Risk Rating (ORR) for companies, which helps them obtain credit for their business investments. DRM is used for large enterprises and RRSS for Small and Medium Enterprises (SMEs).
I was involved in the creation of the packages, procedures, triggers, Ab Initio graphs, Autosys JIL scripts, and Unix shell scripts used to produce the Obligor Risk Rating (ORR).

Project: Nationwide Clearing and Settlement System (NCSS) / Futures & Options Clearing and Settlement System (FOCASS)
Client Name: National Stock Exchange – Mumbai   Jan 2006 to Feb 2008
Environment: Pro*C, Oracle Forms 6i & 10g, Oracle 8i & 10g, Unix Shell Scripting, PL/SQL
Role: Developer / Module Lead.

The main objective of NCSS is to analyze, design, develop, and implement the Nationwide Clearing and Settlement System for securities traded on NSEIL. NCSS is a turnkey project implemented onsite at the National Stock Exchange of India Limited, Mumbai, India.
The Futures and Options Clearing & Settlement System (FOCASS) provides auto-contract generation, clearing, exercise and assignment, margin and obligation calculations, and settlement for trades done on the trading system.
I led a team of 5 developers and handled complex applications on a tight schedule.
Major Assignments:
Electronic Tolling System
Design and development of an Electronic Tolling System with inter-agency capability.
Moody's Risk Analyst and Acumen Support
Production support and maintenance of the Moody's Risk Analyst and Acumen applications at Gulf International Bank.
Pro*COBOL Conversion
Converted the Fixed Assets (FA) and Accounts Payable (AP) financial module programs from the Pro*COBOL environment to the Oracle environment using Forms 6i, PL/SQL, SQL scripts, and Windows batch programming.
Peer Comparison Report
Created an online Peer Comparison Report in Moody's Risk Analyst for comparing customer peer ratios and ratings, with an export option to Excel.
eCommerce
Developed the database objects of various applications such as eDealer, PULSE, and FXLM.
Credit Engine
L2 support of the Credit Engine system for production and UAT environments, from onsite and offshore (London / Chennai).
Automation of Batch Monitoring
Several batches run in DRM throughout the day, and their results were previously monitored manually. I proposed and implemented automation of this monitoring: a new batch sends an email with an Excel attachment containing the details of the batch runs.
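Purely as an illustration of such a monitoring mail, a minimal Python sketch follows; the batch rows, addresses, file name, and SMTP host are hypothetical placeholders (the original implementation used the Autosys/Oracle stack listed above), and a CSV report stands in for the Excel attachment.

# Illustrative sketch only -- batch rows, addresses, and SMTP host are hypothetical.
import csv
import io
import smtplib
from email.message import EmailMessage

# In the real system these rows would come from the DRM batch-status tables;
# here they are hard-coded purely for illustration.
batch_results = [
    {"batch_name": "DRM_RATING_LOAD", "status": "SUCCESS", "end_time": "06:10"},
    {"batch_name": "DRM_FEED_EXTRACT", "status": "FAILED", "end_time": "06:45"},
]

# Build a CSV report in memory (stands in for the Excel attachment).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["batch_name", "status", "end_time"])
writer.writeheader()
writer.writerows(batch_results)

msg = EmailMessage()
msg["Subject"] = "DRM batch monitoring report"
msg["From"] = "drm-monitor@example.com"    # hypothetical address
msg["To"] = "support-team@example.com"     # hypothetical address
msg.set_content("Attached is the latest DRM batch status report.")
msg.add_attachment(
    buf.getvalue().encode("utf-8"),
    maintype="text",
    subtype="csv",
    filename="drm_batch_status.csv",
)

# Hypothetical SMTP relay host.
with smtplib.SMTP("mailhost.example.com") as smtp:
    smtp.send_message(msg)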
Bulk Load and Rate DRM (BLRDRM)
The DRM application generates one rating at a time; the BLRDRM application is used to generate ratings in bulk. About 1 million ratings can be generated in 16 hours.
Securities Lending and Borrowing Market (SLB)
A new market system was implemented at NSE for clearing and settlement in the SLB market within a very short period.
Reorganization and Optimization of the Depositories Module
The entire processing flow of securities pay-in/pay-out was reorganized. The pay-in/pay-out cycle, which previously took 3 hours, now completes in about 45 minutes. Parallel and batch processing were introduced, processes were merged and optimized, and inputs to the processes were automated by the system instead of being collected from the user.
Migration of the System to Handle 100 Million Trades
The trading and clearing system could initially handle only 10 million trades. As trading activity increased sharply, the migration was completed in a very short time.
