Purva K: Professional Profile
PROFESSIONAL PROFILE:
Over 7 years of experience in development and application production support, with expertise in
UNIX, shell scripting, Perl scripting, SQL, and PL/SQL.
Extensive functional and technical knowledge of data warehousing, Business Intelligence (BI),
Decision Support Systems (DSS), and ETL.
Excellent knowledge of data analysis, data validation, data cleansing, data verification, and
identifying data mismatches.
Good knowledge of the Software Development Life Cycle (SDLC) and of development and testing
methodologies. Experience working in both Agile and Waterfall environments.
Significant experience in requirement analysis, design, code development, unit testing, system
testing, and application production support.
Extensive experience in data Extraction, Transformation and Loading (ETL) using shell scripting,
UNIX, Perl scripting, SQL, and PL/SQL procedures.
Extensive experience with DataStage parallel jobs using various stages such as Join, Lookup, Change
Capture, Filter, Funnel, Copy, Sort, File Set, Sequential File, XML Input, XML Output, WebSphere MQ
Connector, Data Set, Oracle Enterprise, Merge, Aggregator, and Remove Duplicates.
Significant experience with Information Analyzer, using Column Analysis, Primary Key Analysis,
Foreign Key Analysis, and Baseline Analysis reports.
Extensive work on job sequences to control job-flow execution using activities such as Job
Activity, Email Notification, Sequencer, Routine Activity, and Exec Command.
Strong experience in the data warehouse development life cycle.
Good hands-on experience with databases including Oracle and Sybase.
Excellent knowledge of ITIL processes; ITIL V1 certified.
Excellent experience in L1/L2 production support, dealing with user queries and requirements.
Vast experience in monitoring batch jobs and providing maintenance support for production systems
using the Control-M, TIVOLI, and GENEOS monitoring tools.
Expertise in resolving production issues using backend technologies such as shell scripting, Perl
scripting, and SQL queries.
Significant experience participating in crisis calls and interacting with the business, users, and
support teams.
Good experience on Release and Change management.
Good experience with Incident Management and Bug Management.
Experience in creating reports using COGNOS Reporting Tool.
Good domain knowledge of Financial Services, Equities and Fixed Income Trading and Post Trade
Services.
Good exposure to working in an offshore/onsite model, with the ability to understand and/or create
functional requirements with clients; good experience in requirement analysis and generating
artifacts from requirements documents.
An excellent team player, technically strong, able to work with business users, project managers,
team leads, architects, and peers, maintaining a healthy project environment.
Basic knowledge of Teradata and Informatica.
Worked in the Financial Services and Telecom domains.
TECHNICAL SKILLS
PROFESSIONAL EXPERIENCE
Requirement analysis, articulation, and refinement of requirements. Attend Business Requirements
Document (BRD) and HLD reviews to finalize the requirements.
Prepare low-level design documents for the finalised requirements.
Understanding the source systems, database layers and data flow.
Analyze and understand the Data Integration requirement.
Understanding the data feed files and the directory structures.
Design and development of complex parallel DataStage jobs, performance tuning of jobs, job
monitoring, job scheduling, etc.
Implementation of ETL logic such as aggregation, pivoting, joins, and lookups.
Writing shell scripts to pull and push data between different source and target systems.
Handling job executions via shell scripts.
Writing stored procedures, packages, and functions in Oracle.
Work with DataStage Director to run jobs and check job logs.
Perform unit testing for the developed code and support the testing team during system testing.
Perform code deployment in the Production environment.
Monitor production DataStage jobs and data feeds at regular intervals through shell scripts.
Participate in meetings with business users and source-system teams to define the data
elements of the data warehouse.
Involved in business analysis and technical design sessions with business and technical staff to
develop the data model, requirements document, and ETL specifications.
Translate business requirements into the data warehouse design and implement a common ETL
framework for auditing job status.
Work with Information Analyzer to understand the nature of the data and define metadata to build
the data model.
Use Information Analyzer to assess the nature and quality of source data by running Column
Analysis, Primary Key Analysis, Foreign Key Analysis, and Baseline reports.
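The shell-based job-execution handling described in the duties above can be sketched roughly as the following wrapper; a minimal illustration, not an actual project script, and the log path and job names are hypothetical placeholders:

```shell
#!/bin/sh
# Minimal sketch of a job-execution wrapper: runs a command,
# logs start/end with timestamps, and returns the job's exit status.
# LOG_FILE and the job names below are illustrative placeholders.

LOG_FILE="${LOG_FILE:-/tmp/job_wrapper.log}"

run_job() {
    job_name="$1"; shift
    echo "$(date '+%Y-%m-%d %H:%M:%S') START ${job_name}" >> "$LOG_FILE"
    "$@"
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END ${job_name} rc=${rc}" >> "$LOG_FILE"
    return $rc
}

# Example usage with dummy commands standing in for real ETL jobs.
run_job "daily_feed_load" true && echo "daily_feed_load OK"
run_job "weekly_report" false || echo "weekly_report FAILED"
```

A wrapper like this makes job outcomes visible in one log file, which is what monitoring tools such as Control-M typically hook into via the script's exit code.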
Environment: IBM Information Server 8.7, DataStage 8.7, Erwin 5.5, Unix, Oracle 10g, Teradata, flat
files, SQL, PL/SQL, PuTTY
Global Banking and Markets is the investment banking arm of HSBC. It provides investment banking and
financing solutions for corporate and institutional clients, including corporate banking, investment banking,
capital markets, trade services, payments and cash management, and leveraged acquisition finance. It
provides services in equities, credit and rates, foreign exchange, money markets and securities services,
in addition to asset management services.
Cash Equities Securities includes a number of front-office, middle-office, and back-office web-based
applications used by Equities Securities operators based in New York, Hong Kong, London, Japan,
Korea, and Taiwan. Fidessa (front office), GEMS - Global Execution Management System (middle
office), FISCAL (back office), FIX, and OASYS GLOBAL are some of the applications in the Equities
Securities trade flow.
GEMS, the middle-office application, primarily handles Trade Validation, Trade Confirmation, and
Trade Affirmation.
Environment: Unix, Perl scripting, Shell scripting, SQL, PL/SQL, SQL Developer, Oracle 10g,
Sybase, Control-M, TIVOLI, GSD, GENEOS, COGNOS, FTP, SFTP
The Equities and Fixed Income trade flow includes a number of front-office, middle-office, and
back-office applications. GSA (Global Settlement Assistant) provided a real-time view of open
securities transactions to exchange operators and users globally. It essentially acted as a
settlement assistant, providing information on whether a trade was open or settled in FISCAL (back
office). Open-trades reports and WIM exception reports gave a detailed view of trades not settled
in the back-office system, along with exceptions and errors encountered during settlement.
Environment: Unix, Shell scripting, SQL, SQL Developer, Oracle 10g, Control-M, TIVOLI, GSD,
Sybase, COGNOS, FTP, SFTP
Roles and Responsibilities:
Perform L1 support for the post-trade flow of Equities Securities.
Day-to-day operational support, including batch job monitoring and health checks at regular intervals.
Log incidents for issues reported by end users or for system-related issues.
Resolve user queries related to data feeds, static data, and trade confirmations.
Generate static data reports for a particular client, market, or listing on request from users.
Assist users in tracking orders entered in the front office (Fidessa) through processing to
settlement in the back office (GRBS, FISCAL).
Create SQL SELECT queries to generate ad-hoc reports requested by users.
Create shell scripts to automate monitoring of data feeds and daily health checks.
Communicate with other teams involved in the post-trade flow applications to maintain continuity of
the data flow and resolve any issues.
Log problem records and assign them to the appropriate teams for resolution.
Assist senior team members with production deployments.
Attend deployment calls with the development and testing teams to understand code changes and
ensure they work as expected in the UAT environment.
Attend crisis calls to understand issues during production outages.
Hold knowledge-sharing sessions within the team to build in-depth understanding of the applications.
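The feed-monitoring and health-check scripting mentioned in the duties above can be sketched as follows; a minimal illustration under assumed conventions, where the feed directory, file naming, and status labels are all hypothetical:

```shell
#!/bin/sh
# Sketch of a data-feed health check: verifies that a feed file
# exists and is non-empty, printing a status line per feed.
# FEED_DIR and the file naming convention are illustrative only.

FEED_DIR="${FEED_DIR:-/tmp/feeds}"

check_feed() {
    feed_file="$FEED_DIR/$1"
    if [ ! -f "$feed_file" ]; then
        echo "MISSING $1"
        return 1
    elif [ ! -s "$feed_file" ]; then
        echo "EMPTY $1"
        return 2
    else
        echo "OK $1"
        return 0
    fi
}

# Example usage: create a dummy feed file and check it.
mkdir -p "$FEED_DIR"
printf 'TRADE|0001|SETTLED\n' > "$FEED_DIR/trades.dat"
check_feed "trades.dat"
```

Run from cron or a scheduler, a check like this turns a manual health check into an automated alert trigger when a feed is late or empty.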
Environment: Unix, Shell scripting, SQL, SQL Developer, Oracle 10g, Control-M, TIVOLI, GSD,
Sybase, COGNOS, FTP, SFTP
Project: PMOSS
PMOSS (Performance Management Operation Support System) is an application supporting performance
tuning of the monitoring and alerting done for the AT&T network. It involves an AT&T-proprietary
data warehouse called Daytona and AT&T-specific reporting tools for network performance analysis.
It deals with the VoIP, U-verse, and IPTV networks.
Environment: Unix, Shell scripting, Perl scripting, SQL, stored procedures, FTP, SFTP
TECH MAHINDRA LTD - Pune, India April '06 - Sept '06
Role: Trainee
Description - Attended training to gain knowledge of software technologies: C, C++, Unix shell
scripting, SQL, PL/SQL, and RDBMS. Also attended trainings on corporate etiquette, interpersonal
skills, and communication skills.
EDUCATION: