
Nithya Narasimhan

Email: nithyakn2009@gmail.com
Mob: +91 8056090002

Roles

 Big Data Cloud Architect
 Hadoop Solution Architect
 Project Lead

Strengths

 Leadership Skills
 Business Development
 Global Project Lead
 Solution Architecture
 Technical Architecture
 End-to-End Delivery
 Strategic Innovation
 Test Strategy
 Customer Presentations/Workshops

Technology Skills

 GCP Cloud Architect
 Data Solution Architect
 Hadoop/Big Data
 Good knowledge of AWS/Azure
 Scala Programming
 Java Programming
 Unix Shell Scripting

White Papers

 MNP as an FAS Solution
 Smart Profile Server for Entity Profiles

Education

 B.Tech in Electronics & Communication

Professional Profile

 Worked with customers to understand their landscape and identify the right tools and technologies to help solve complex problems as a Big Data Solution Architect.
 Have a proven track record as a Cloud Architect in data analysis and data engineering, alongside business knowledge.
 Have worked in challenging roles where the delivery of high-quality projects and services has had a genuine, positive impact on customer relations.
 Have always had an innovation and thought leadership focus, and contributed to a number of white papers and solution briefs in the Big Data and telecommunication space.
 Played roles such as Engineering Architect, Solution Architect, Technology Architect and Technical Lead for Big Data deliveries for customers around the globe.
 Have led teams in multiple geographical locations to collaborate effectively and deliver projects.
 Have a passion for operational and technical improvement of products in the SDM and Big Data portfolio of HPE.
 Proactively undertook production deployment and management of key solutions delivered to leading telecom providers.
 Effectively combine technical expertise with commercial acumen and an expansive understanding of best practices and methodology, always aligning with a vision and influencing strategy.
 Of 22 years of IT industry experience, close to 12 years have been in the Big Data space.
 Have a deep understanding of Hadoop, Spark, Hive, Kafka, Cassandra and Oozie, at both an architectural and an operational level.
 Google Cloud Platform certified, with an excellent understanding of cloud platforms.
 Believe in leading by example and working with the team by providing innovative solutions.
Work Experience

 Currently working as a Data Engineering Architect at Hewlett Packard Enterprise Pvt. Ltd., Bangalore, since August 2006.

 Technology Consultant at Tata Consultancy Services from June 2002 till July 2006.

 Software Engineer at Syntel India Pvt. Ltd., Chennai from April 2000 till June 2002.

Projects Details

Project : Big Data Solution on Harmony Cloud platform


Environment : EKS cluster on AWS Cloud
Tools : Pulsar, Spark Delta Lake, TimescaleDB
Duration : August 2021 – Present
Role : Big Data Solution Architect, Cloud Solution Architect

Project Scope

Design and implement an end-to-end solution on the Harmony Cloud platform, which runs on an EKS cluster on AWS, to ingest data in the form of files and Redfish events from HPE servers around the globe. The data is refined using custom Spark jobs in Spark Delta Lake and stored in TimescaleDB for consumption by downstream analytics and visualization applications.
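
As an illustration of the refinement step described above, here is a minimal Scala sketch. All paths, table names, columns and credentials are hypothetical (the actual Harmony pipeline details are not shown); it assumes the delta-spark and PostgreSQL JDBC dependencies are on the classpath. It refines landed Redfish event files into a Delta table and publishes an hourly rollup to TimescaleDB over its PostgreSQL-compatible JDBC interface.

```scala
// Illustrative only: paths, table names and credentials are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object RefineServerTelemetry {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("harmony-refinement")
      .getOrCreate()

    // Raw Redfish event files landed by the ingestion pipeline (hypothetical path).
    val raw = spark.read.json("s3a://harmony-landing/redfish-events/")

    // Refine: keep well-formed events and derive a per-server hourly rollup.
    val refined = raw
      .filter(col("serialNumber").isNotNull)
      .withColumn("eventHour", date_trunc("hour", col("eventTimestamp")))
      .groupBy("serialNumber", "eventHour")
      .agg(count("*").as("eventCount"))

    // Persist the refined layer as a Delta table for downstream jobs.
    refined.write.format("delta").mode("append")
      .save("s3a://harmony-refined/server_health/")

    // Publish to TimescaleDB (PostgreSQL wire protocol) for analytics/visualization.
    refined.write
      .format("jdbc")
      .option("url", "jdbc:postgresql://tsdb.internal:5432/metrics")
      .option("dbtable", "server_health_hourly")
      .option("user", sys.env("TSDB_USER"))       // credentials from environment
      .option("password", sys.env("TSDB_PASSWORD"))
      .mode("append")
      .save()
  }
}
```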

Responsibilities
 Design and implementation of an end-to-end solution on the Harmony Cloud platform
 Configuration of pipelines for data ingestion and refinement for different analytical use cases
 Development of custom Spark jobs for moving data between different environments
 Development of triggers and procedures in TimescaleDB for storing the data
 Development of visualizations and APIs for data consumption

Project : Hadoop-based On-Premises Big Data Solution


Environment : CDP private cloud
Tools : NiFi, HDFS, Hive, Spark SQL, Ranger, QlikSense
Duration : November 2020 – July 2021
Role : Big Data Solution Architect

Project Scope

Design and implement an end-to-end big data solution on Cloudera Data Platform. Data is ingested in the form of files using NiFi and stored as Avro files. Data refinement is done in batch mode using Hive and Spark SQL, and the refined data is stored in Hive tables.
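
A minimal sketch of that batch refinement flow, assuming hypothetical database, table and column names and the availability of the spark-avro module: NiFi-landed Avro files are read and refined with Spark SQL into a partitioned Hive table.

```scala
// Illustrative sketch; database, table and column names are hypothetical.
import org.apache.spark.sql.SparkSession

object DailyRefinement {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cdp-batch-refinement")
      .enableHiveSupport()          // use the cluster's Hive metastore
      .getOrCreate()

    // Raw Avro files written to HDFS by the NiFi ingestion flow.
    val raw = spark.read.format("avro").load("/data/landing/events/dt=2021-06-01")
    raw.createOrReplaceTempView("raw_events")

    // Batch refinement expressed in Spark SQL, persisted to a partitioned Hive table.
    spark.sql("""
      INSERT OVERWRITE TABLE refined.daily_events PARTITION (dt = '2021-06-01')
      SELECT event_id, customer_id, event_type, CAST(event_ts AS timestamp)
      FROM raw_events
      WHERE event_id IS NOT NULL
    """)
  }
}
```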

Responsibilities
 Configuration of pipelines for data ingestion and refinement for different analytical use cases
 Development of Hive and Spark jobs for moving data between different environments
 Development of daily and weekly jobs for data refinements and aggregation
 Development of visualizations and reports for data consumption

Project : Vodafone India


Environment : HDFS, Impala, Kudu, Vertica, Shell Script, CDH Cluster, Linux
Duration : May 2019 – October 2020
Role : Big Data Solution Architect

Project Scope
Vodafone India uses HPE's Fault Archival and Statistics (FAS) platform and Service Manager (SM) for analysis of alarms and tickets generated across the Vodafone Idea footprint. The FAS platform stores the raw fault information in Kudu, performs analytics on it, and generates reports using Impala.
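
For illustration, a small Scala sketch of the kind of alarm aggregation behind such reports, using the kudu-spark connector; the Kudu master addresses and table/column names are hypothetical, and in the project itself the reports were generated with Impala rather than Spark.

```scala
// Illustrative sketch using the kudu-spark connector; master addresses and
// table/column names are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object AlarmReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("fas-alarm-report").getOrCreate()

    // Read the raw alarm table stored in Kudu.
    val alarms = spark.read
      .format("kudu")
      .option("kudu.master", "kudu-master-1:7051,kudu-master-2:7051")
      .option("kudu.table", "impala::fas.raw_alarms")
      .load()

    // Daily alarm counts per network element and severity, as used in reporting.
    alarms
      .withColumn("day", to_date(col("raised_at")))
      .groupBy("day", "network_element", "severity")
      .agg(count("*").as("alarm_count"))
      .orderBy(desc("alarm_count"))
      .show(20)
  }
}
```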

Responsibilities
 Setup, installation, configuration, testing and commissioning of the Data Center (DC) and Disaster Recovery (DR) CDH clusters.
 Configuration of CDH Services, Cloudera Management Services & Oracle 12c (as Metastore)
on both the clusters.
 Requirement Analysis of FAS implementation in Vodafone
 Design and Development of reporting module using Kudu and Impala based on alarm data
generated in Hadoop.
 System testing and UAT with Customer

Project : Maxis Communications, Malaysia


Environment : HDFS, Hive, Vertica, Shell Script, CDH Cluster, Linux
Duration : July 2018 – April 2019
Role : Big Data Solution Architect

Project Scope
The Maxis Big Data platform was initiated by the MAXIS Information Services Department (ISD) to offload the existing SAS Enterprise Data Warehouse (EDW) and Customer Value Management (CVM) systems. Additionally, it creates a new Business Support System (BSS) Operational Data Store aligned with the MAXIS architecture, using the HPE Telco Data Model.

Responsibilities
 Conducted multiple workshops with the customer to identify the scope of work
 Requirement analysis to identify the different streams
 Design of end-to-end mappings from data sources to build business attributes (a sketch follows this list)
 Review of Vertica and Hive loading and transformation scripts
 Development of system test plan, acceptance plan and test cases
 UAT support
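
The sketch referenced above: a hedged example of an end-to-end mapping from a source usage table to an ODS business attribute, expressed in HiveQL via Spark. All database, table and column names are hypothetical and are not taken from the actual HPE Telco Data Model.

```scala
// Illustrative end-to-end mapping: derive a business attribute ("monthly data
// usage per subscriber") from a source usage table. All names are hypothetical.
import org.apache.spark.sql.SparkSession

object MonthlyUsageAttribute {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("attribute-mapping")
      .enableHiveSupport()
      .getOrCreate()

    // Map raw mediation records to an ODS business attribute via HiveQL.
    spark.sql("""
      INSERT OVERWRITE TABLE ods.subscriber_monthly_usage PARTITION (month = '2018-10')
      SELECT msisdn,
             SUM(data_volume_bytes) / 1024 / 1024 AS data_usage_mb,
             COUNT(DISTINCT session_id)           AS session_count
      FROM src.mediation_usage
      WHERE record_month = '2018-10'
      GROUP BY msisdn
    """)
  }
}
```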

Project : Docomo, Japan


Environment : Spark Streaming, Scala, Hive, Oozie, Kafka, Vertica, Shell Scripts, HDP Cluster
Duration : November 2017 – June 2018
Role : Big Data Solution Architect

Project Scope
The project implemented a set of use cases as a POC in location analytics: predicting a customer's (subscriber's) location within a set of districts in Tokyo, Japan. The solution was delivered in such a way that it can easily be scaled to serve other districts in other provinces of Japan.
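
As a hedged illustration of the speed layer in such a Lambda architecture, here is a minimal Structured Streaming sketch; the project itself used Spark Streaming, and the broker, topic and field names below are hypothetical. Location events are consumed from Kafka and subscribers are counted per district in five-minute windows.

```scala
// Minimal speed-layer sketch: consume location events from Kafka and keep a
// running count of subscribers per district. Topic, broker and field names
// are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object DistrictPresence {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("location-speed-layer").getOrCreate()

    val schema = new StructType()
      .add("subscriberId", StringType)
      .add("district", StringType)
      .add("eventTime", TimestampType)

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "location-events")
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    // 5-minute windowed presence counts per district, tolerating late events.
    val presence = events
      .withWatermark("eventTime", "10 minutes")
      .groupBy(window(col("eventTime"), "5 minutes"), col("district"))
      .agg(approx_count_distinct("subscriberId").as("subscribers"))

    presence.writeStream
      .outputMode("update")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```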

Responsibilities
 Conducted multiple sessions with the customer and HP Japan to define the scope of the Proof of Concept
 Investigated Spark integration with Oozie and developed prototypes to identify best-fit technologies for the solution
 Design of the solution based on a Lambda architecture using Kafka, Spark Streaming and Vertica
 Development of system and acceptance test plans and test cases for the POC
 Customer presentation of the POC

Project : Smart Profile Server


Environment : AngularJS, Node.js, Spark, Map Reduce, Hive, Oozie, HDP Cluster

Duration : February 2017 – October 2017
Role : Solution Architect

Project Scope
The project involved development of near real-time and batch profiles of entities in the telecom domain by computing multiple business attributes configured by users through a highly flexible and configurable GUI. Templates of multiple attributes are pre-defined through the GUI and made available for users to instantiate. Once a profile is configured by the user (via a drag-and-drop feature in the GUI), the user can schedule it for execution at various intervals. The results of profile execution are available as graphs or HDFS files in the required formats, for consumption by downstream systems.
Near real-time attributes are computed using Spark Streaming and Kafka, while batch attributes are computed using either MapReduce or Hive. Oozie is used as the scheduler for the attributes. The data is available in the Hadoop Data Lake for the computation.
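
To make the configuration model concrete, a small, purely illustrative Scala sketch of how user-configured profiles might be represented; the product's actual data model is not shown here, and all names are hypothetical.

```scala
// A sketch of how a user-configured profile might be modelled internally;
// all names are illustrative, not the product's data model.
import java.time.Duration

// An attribute template pre-defined through the GUI.
sealed trait AttributeTemplate { def name: String }
case class BatchAttribute(name: String, hiveQuery: String) extends AttributeTemplate
case class StreamingAttribute(name: String, kafkaTopic: String) extends AttributeTemplate

// A profile instantiated by the user from templates, with a schedule.
case class Profile(
  entity: String,                     // e.g. "subscriber"
  attributes: Seq[AttributeTemplate],
  interval: Duration                  // execution interval chosen in the GUI
)

object Example extends App {
  val profile = Profile(
    entity = "subscriber",
    attributes = Seq(
      BatchAttribute("monthly_data_usage",
        "SELECT msisdn, SUM(bytes) FROM usage GROUP BY msisdn"),
      StreamingAttribute("current_cell", "location-events")
    ),
    interval = Duration.ofHours(24)
  )
  println(s"Scheduling '${profile.entity}' profile every ${profile.interval}")
}
```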

Responsibilities
 Presented the idea as a product proposal
 Investigated best fit tools for the development of Smart Profile Server
 Proof of Concept for a couple of attributes using PADL building blocks
 Designed the GUI for development of Smart Profile Server solution
 Designed the end to end flow of the Solution
 Code and Unit Test case reviews
 Development of System Test Plan and Test Cases
 Identified and baselined Acceptance Test Criteria
 Development of User Guide

Project : TASPS-PADL
Environment : HDFS, MapReduce, Hive, Spark, Oozie, Linux
Duration : February 2016 – January 2017
Role : Solution Architect

Project Scope
TASPS-PADL is a product feature of the HPE Telecom Analytics (HPE TA) solution portfolio. The objective is to enable TASPS to run computational and analytical jobs on the massive volume of structured and unstructured data stored from various sources in the Hadoop Data Lake, using tools available in the Hadoop ecosystem such as MapReduce, Hive QL, Spark and Oozie.
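
A minimal sketch of the building-block idea, under the assumption that each block is a reusable DataFrame transformation chained by a workflow generator; the names below are illustrative, not the product's API.

```scala
// A sketch of the "building block" idea: generic, composable functions over
// DataFrames that a workflow generator can chain. Names are illustrative.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Each building block is a reusable DataFrame transformation.
trait BuildingBlock { def apply(in: DataFrame): DataFrame }

case class Filter(condition: String) extends BuildingBlock {
  def apply(in: DataFrame): DataFrame = in.where(condition)
}

case class Aggregate(key: String, metric: String) extends BuildingBlock {
  def apply(in: DataFrame): DataFrame =
    in.groupBy(key).agg(sum(metric).as(s"total_$metric"))
}

object Pipeline {
  // A job is an ordered chain of building blocks; in the product this chain
  // would be driven by application metadata and executed under Oozie.
  def run(input: DataFrame, blocks: Seq[BuildingBlock]): DataFrame =
    blocks.foldLeft(input)((df, block) => block(df))
}
```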

Responsibilities
 Investigated best fit method to productize solution
 Participation in application architecture & design discussions.

 Designed generic and customized functions as building blocks using MapReduce, Hive and Spark.
 Designed application metadata consisting of rules and configuration for building blocks, and dynamic workflow generation using Oozie to execute jobs comprising multiple building blocks.
 Orchestration of Oozie workflows and coordinators.
 Code review and review of test cases for unit testing, system testing and performance testing.
 Documentation of PADL building blocks

Project : TBDA
Environment : HDFS, Hive, Oozie, Vertica, HDP Cluster, Linux
Duration : May 2015 – January 2016
Role : Team Lead

Project Scope
The TBDA project involved three applications: CDRSearch, Advertising and Capillarity. TASPS-PADL is used to run computational jobs on CDR data using REST APIs.
CDRSearch: provides a GUI to specify search conditions and, using the TASPS-PADL APIs, launches Hive/MapReduce jobs on CDRs stored in Hive tables.
Advertising: provides a GUI to understand the profile and presence of subscribers who checked in at a particular location/place (point of interest) during a selected period of time.
Capillarity: provides MicroStrategy dashboards for generating reports that enable the customer to make strategic decisions. Reports are generated using data residing in Vertica tables. TASPS-PADL is used to launch Hive/MapReduce jobs on data in the Hadoop data lake, and the output is loaded into Vertica tables for reporting purposes.
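
For illustration, a sketch of the kind of Hive query that CDRSearch would launch through the TASPS-PADL APIs; table and column names are hypothetical, and the search conditions would in practice come from the GUI.

```scala
// Illustrative CDR search of the kind launched by CDRSearch; table and
// column names are hypothetical, and conditions are hard-coded here.
import org.apache.spark.sql.SparkSession

object CdrSearch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cdr-search")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("""
      SELECT calling_number, called_number, call_start, duration_sec
      FROM cdr.voice_cdrs
      WHERE dt BETWEEN '2015-08-01' AND '2015-08-07'
        AND calling_number = '60123456789'
      ORDER BY call_start
    """).show(100, truncate = false)
  }
}
```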

Responsibilities
 RA workshop with the customer to understand the use cases
 Development and sign-off of the RA document
 Performance testing of MapReduce/Hive jobs to identify the best-fit solution to meet SLAs
 Design and development of MapReduce and Hive jobs as per customer requirements
 Review of system test plan and acceptance test plan
 Provided application training to the customer and operations teams.

Project : Service Assurance Solution


Environment : Java, Vertica, R, Linux
Duration : September 2014 – April 2015
Role : Solution Architect

Project Scope
 Developed a proof of concept for a prominent CSP in the US for analyzing Carrier IQ logs generated by various equipment, and correlating those with reported network equipment issues to predict faults and take timely action.

Responsibilities
 Periodic loading of data into Vertica using loaders in SPS
 Scripts for creating analytical tables and roll-ups for reports
 R based analysis of data to identify patterns for predictive analysis
 Developed a Root cause analysis system for call centers using predictive models

Project : Field Enablement of SDM Products


Environment : J2EE, Oracle 10g, WebLogic 9.2, Web Services, LDAP, JBOSS, Vertica
Duration : June 2012 – August 2014
Role : Pre-Sales

Project Scope
 Field Enablement is a unique set of activities taken up to support the sales team. Use cases are built around interrelated products and showcased as proofs of concept or prototypes to prospective customers. Some of the major products used were Virtual Identity Profile Broker, Vertica, Personal Campaign Manager and Personalization Portal.

Responsibilities
 Responsible for POC to showcase use cases involving multiple related Products to
prospective customers
 Responsible for developing POC to showcase customizations required for targeted
customers
 Responsible for development of training materials and lab setup to showcase product
features
 Responsible for delivering training to the Field on these products

Project : Virtual Identity Profile Broker


Environment : J2EE, Oracle 10g, WebLogic 9.2, 10.3, Web Services, LDAP
Duration : October 2009 – May 2012
Role : Lead Developer

Project Scope

 Virtual Identity Profile Broker is one of the flagship products in the SDM (Subscriber Data Management) portfolio of HP. In a service provider environment, a subscriber's data is scattered across different applications; this problem is further compounded if the service provider supports triple-play or quadruple-play services. The SDM broker dynamically brokers data from different systems in the service provider's network and generates a unified profile of the subscriber for consumption by the querying systems.

Responsibilities
 Responsible for the development of two key features: SAML-based authentication and the LDAP interface (a sketch of an LDAP lookup follows this list)
 Responsible for the feature specifications, design, implementation, unit testing and product documentation for the same
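
The sketch referenced above: a minimal JNDI-based LDAP lookup of the kind the broker's LDAP interface serves; the host, base DN, attribute names and filter are all hypothetical.

```scala
// A minimal JNDI sketch of an LDAP lookup against a unified subscriber
// profile; host, base DN and schema are hypothetical.
import java.util.Hashtable
import javax.naming.Context
import javax.naming.directory.{InitialDirContext, SearchControls}

object LdapProfileLookup {
  def main(args: Array[String]): Unit = {
    val env = new Hashtable[String, String]()
    env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory")
    env.put(Context.PROVIDER_URL, "ldap://sdm-broker.example.com:389")

    val ctx = new InitialDirContext(env)
    val controls = new SearchControls()
    controls.setSearchScope(SearchControls.SUBTREE_SCOPE)

    // Query the unified subscriber profile by MSISDN (hypothetical schema).
    val results = ctx.search(
      "ou=subscribers,dc=example,dc=com",
      "(msisdn=60123456789)",
      controls)

    while (results.hasMore) println(results.next().getAttributes)
    ctx.close()
  }
}
```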

Project : Mobile Porting Gateway for Celcom Malaysia


Environment : J2EE, Oracle 10g, WebLogic 10.2
Duration : August 2008 – September 2009
Role : Project Lead

Project Scope
 The Mobile Porting Gateway (MPG) is a Mobile Number Portability solution. It receives porting messages from other operators through a regulatory body and processes these messages within a commonly agreed-upon SLA. The processing typically involves updating porting details in the MPG system's internal database as well as in downstream OSS and BSS systems, and responding back to the regulatory body.
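
As a hedged illustration of that processing flow, a small Scala sketch of handling an inbound porting message against an SLA window; the message fields, SLA value and response format are all hypothetical, not the MPG product's actual interfaces.

```scala
// Illustrative handling of an inbound porting message within an SLA window;
// fields, statuses and the response format are hypothetical.
import java.time.{Duration, Instant}

case class PortingMessage(portId: String, msisdn: String,
                          donor: String, recipient: String,
                          receivedAt: Instant)

object PortingProcessor {
  val Sla: Duration = Duration.ofMinutes(30) // commonly agreed processing SLA

  def process(msg: PortingMessage): String = {
    // 1. Persist porting details to the MPG internal database (not shown).
    // 2. Notify downstream OSS/BSS systems (not shown).
    // 3. Respond to the regulatory body, flagging any SLA breach.
    val elapsed = Duration.between(msg.receivedAt, Instant.now())
    if (elapsed.compareTo(Sla) <= 0) s"ACK ${msg.portId}"
    else s"ACK ${msg.portId} (SLA breached after ${elapsed.toMinutes} min)"
  }
}
```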

Responsibilities
 Overall lead for the project
 Responsible for conducting the RA workshop in Malaysia
 Documentation of System requirements and High Level Design
 Development and Unit Testing of individual modules and release to IT team
 Offshore support during Integration Testing and System Testing

Project : Mobile Porting Gateway Solution for Maxis Malaysia


Environment : J2EE, Oracle 10g, WebLogic 9.2
Duration : August 2007 – July 2008
Role : Module Lead

Project Scope
 The Mobile Porting Gateway (MPG) is a Mobile Number Portability solution. It receives porting messages from other operators through a regulatory body and processes these messages within a commonly agreed-upon SLA. The processing typically involves updating porting details in the MPG system's internal database as well as in downstream OSS and BSS systems, and responding back to the regulatory body.

Responsibilities
 Lead for the LSMS module and the web-based operator UI
 Part of the team that conducted the RA workshop in Malaysia
 Analysis and documentation of requirements and design of the LSMS and operator UI.
 Set up of Configuration Management System
 Development and Unit Testing of the LSMS and operator UI
 Offshore support during Integration Testing and System Testing

Project : Video-Conferencing Portal


Environment : JSP, Servlets, RMI, Oracle, Tomcat
Duration : April 2007 – July 2007
Role : Lead Developer

Project Scope
 The Videoconferencing Portal is the front end of the videoconferencing service developed by the SNVT (video new venture) team. The portal provides online visualization and operation of conferences: it is used to create, administer and monitor video conferences pertaining to a particular user.

Responsibilities
 Responsible for interacting with the client, gathering requirements, and designing and developing the web portal and its integration with the videoconferencing server
 Development of the videoconferencing portal for creation of pre-defined conferences and real-time conference monitoring through the portal
 Interfacing through RMI with the videoconferencing server to store and manage conference information
 Integration of the videoconferencing GUI with XDMS for generation of policy files used for videoconferencing
 Development and maintenance of the web GUI for videoconferencing, including its design and CUT
 Creation and maintenance of conference, contacts and group contents, and the admin module for the videoconferencing GUI

Project : WWCAS – Global Authorization Network (MTP)
Environment : Linux, Oracle 9i, Java 1.4, EJB, Servlets, JSP, Struts Framework
Duration : October 2005 – August 2006
Role : Developer

Project Scope
 American Express intends to replace the manual Merchant Testing and Certification process with an automated system. The system will enable third parties to test their applications and will provide feedback on the results.
 A separate instance of the GAN server will be configured solely to manage this merchant self-testing function. The MTP project involves the development of this automated system.
 Merchants will be able to connect to MTP and test their connection with GAN against the desired message specification. It will be made available both to new merchants and to existing merchants migrating to the new specification.
 The merchant will have access to a self-test system that is able to mimic a production environment and provide diagnostic facilities to determine their level of compliance with the message specification.
 The MTP module will store all data received, data sent and any errors encountered during validation.

Responsibilities
 Requirement analysis
 Understanding the business rules of the credit authorization system
 Preparation of Test specifications
 Coding of new use cases and changes to the impacted programs (using Java, EJB, JSP, HTML, JavaScript)
 Review of requirements, Programs and Test Plans
 Unit Testing
 Coordination between onsite and offshore.

Project : JSR and J2ME Component & Application Testing for Mobile Phone Software
Environment : Windows 2000/XP, Java, J2ME, C, MS Visual .NET, Eclipse, J2ME plug-in, Rational Test RealTime, J2SDK, Sun Wireless Toolkit for J2ME, Mobile Phone Simulator
Duration : March 2005 – September 2005
Role : Developer

Project Scope
 This project covers component testing of JSR 135, JSR 172, JSR 120, the Java Application Manager and other Java application components, including functionality, feature and interface testing of each individual component.
 JSR component testing includes testing of individual JSR APIs, and development of a test harness by isolating the component to be tested and stubbing out other interacting components.

Responsibilities
 Component Test requirement Analysis
 Design of Test Environment
 Development of the test harness, which involves isolating the component to be tested and stubbing out interacting components
 Test Specification development
 Deriving Test Cases from Test Specification
 Execution of Test Cases in the Test harness
 Recording actual results and expected results.

Project : National Telephone & Address System (NTAS)


Environment : Solaris 7/HP-UX 11, C++, Java, ESQL/C, Informix 7.1
Duration : December 2003 – February 2005
Role : Team Member

Project Scope
 The National Telephone Number and Address System (NTAS) is a program to provide a common Telephone Number (TN) and address system for the Verizon footprint. Before NTAS, the TN and address inventory was maintained in the Assignment, Activation and Inventory System (AAIS) in Verizon West and the Livewire system in Verizon East. NTAS maintains the inventory of telephone numbers and addresses for the entire Verizon footprint. About 60 external systems interface with NTAS for TN assignment, address validation, TN and address inquiries, reservation, service order processing, DSL loop qualification (loop qual) and other TN- and address-related transactions. NTAS has numerous TN, address, loop qual, service order and ad-hoc reports, which can be scheduled or run in real time. NTAS is a mission-critical application and uses high-availability servers, a high-speed database engine and in-house developed middleware to meet Verizon's business needs.

Responsibilities
 Responsible for design, coding and implementation of functionalities in the Reports module using C++, Informix and the Java Struts architecture.

Project : Local Number Portability
Environment : HP-UX, Solaris, C++, Java, JSP, ESQL/C
Duration : December 2002 – November 2003
Role : Team Member

Project Scope
 Local Number Portability (LNP), also referred to as Service Provider Portability, is a technology and process whereby a Verizon customer can change local service providers, purchase local service from a CLEC, and keep their Verizon number by having it activated in the CLEC central office.
 The LNP Gateway system automates all processing required for an LNP order prior to its provisioning. The system is based on a 3-tier client-server architecture (a thin web-based user interface, a C++ backend and a TCP/IP-based custom middleware for IPC). The main feature of the system is the order processing engine, which automatically interfaces with the order entry system NOCV (National Order Collection Vehicle) to obtain order information, and interfaces mainly with SGW (Supplier Gate Way), CBSS (Customer Billing Service System) and the SOA (Service Order Administration) system to perform the tasks required to port a telephone number between Verizon and a different LEC (Local Exchange Carrier), or within Verizon. In addition, the web interface can be used to manually perform various tasks in these external systems (mainly to correct errors from the automated process and to perform lookups). LNP also has a number of supporting applications such as security, auditing, reporting and web query.

Responsibilities
 Responsible for design and coding of the web interface for the entire MultisystemQuery system of the LNP gateway.
 Responsible for the development of the Switch, NOCV and AAIS interfaces for the MultisystemQuery system.

Project : Assignment, Activation & Inventory Services


Environment : Solaris, C++, Java, Informix, ESQL/C
Duration : June 2002 – November 2002
Role : Team Member

Project Scope
 Functionally, AAIS consists of two core engines sitting on top of a Verizon network inventory database: a service fulfilment/assignment and activation engine, and an inventory management engine. AAIS is principally a back-end system within the Verizon infrastructure, i.e. most end users interact with AAIS through other front-end systems; e.g. customer contact representatives use NOCV, which interfaces with AAIS. Therefore, a significant portion of AAIS is interfacing logic coupling the Verizon legacy embedded systems infrastructure to the core engines through application programming interfaces (APIs). For clients, AAIS provides client applications, typically graphical user interfaces (GUIs) customized by job function; e.g. the manual assignment users in the FAC use the AAIS manual assignment GUI. AAIS also has a number of supporting applications such as security, auditing, reporting and system management. It is a huge application, with nearly 20,000 users across the nation and an inventory of 22 million telephone lines.
 AAIS is designed as a client-server application following the 3-tier thin-client architectural model. The AAIS infrastructure consists of a data server cluster, a gateway infrastructure and client graphical user interface (GUI) utilities. The clients communicate with the data servers via the gateways using the AAIS middleware, which is based on an asynchronous messaging model suitable for high-transaction-rate systems.

Responsibilities
 Responsible for fixing and testing incident reports in the TN module of AAIS

Project : GMAC
Environment : Java, JSP, Servlets, EJB, Oracle 8, WebLogic, Solaris
Duration : July 2001 – May 2002
Role : Team Member

Project Scope
 This project involved the development of offer sites for General Motors, marketing various insurance schemes that GM offers to its employees and customers. I was involved in the development of the Enrollment and Reports modules. The Enrollment module takes care of all procedures involved in enrolling a user on the site after capturing details such as name, address, telephone number and credit card information. The Reports module involves generation of reports, for admin purposes, giving details of persons registering on the site. These modules were developed using the MVC1 architecture.

Responsibilities
 Single Point of Contact
 Requirements Gathering, Analysis, Design, Coding and Unit Testing

Project : Syntelinc.com

Environment : Java, Servlets, JSP, JavaBeans, SQL, JavaScript, WebLogic, Solaris
Duration : April 2000 – June 2001
Role : Team Member

Project Scope
 This is the corporate site of Syntel Inc. It consists of 6 different modules, which are updated on a daily basis with the help of a content management tool. Basic components of the web site:
 Client Browser – sends requests to the web site
 Presentation Layer – contains JSP, HTML, ASP
 Application Layer – contains JavaBeans and Servlets
 Database Layer – contains database components

Responsibilities
 Maintenance, enhancements, addition of new functionality and production support
for the site.
 Requirement studies, Analysis, Design, Coding, Unit Testing, System Testing and
production support.

Personal Profile

Date of Birth : 23 November 1977


Passport No : J9424211 (B1 Visa)
Address : Block C-302, Casa Grande Irene, M.G. Road, Manapakkam,
Chennai 600125
