
Principal Architect | Big Data Architect | Solutions Architect

SUMMARY:
 He is a resourceful, high-energy technical solutions architect and enterprise data architect offering 13+ years of expertise in architecture definition of large distributed systems, technical consulting, project management and technology implementation in Cloud, Big Data, Hadoop, Enterprise Information Management, Data Management, Product Management and Application Integration.
 His experience in technology, software, consulting and IT has mainly focused on enterprise data platforms, applications, Cloud, Big Data solutions, Big Data analytics and Hadoop implementations for retail, banking and finance, customer support, high-tech manufacturing, healthcare and fitness, social and online gaming. His technical expertise includes Big Data integration, modern data architecture, cloud solutions, solution architecture, data architecture and project management. He is passionate about Big Data, modern data architecture and predictive analytics that enable businesses and customers to gain better insights and make data-driven strategic decisions.
 Proficient in Data Architecture, DW, Big Data, Hadoop, Data Integration, Master Data Management, Data Migration, Operational Data Store and BI reporting projects, with a deep focus on design, development and deployment of BI and data solutions using custom, open-source and off-the-shelf BI tools.
 Specialized in delivery of data and BI projects for various business groups and proven track
record of executing BI and Data Projects end to end.
 Provide technical thought leadership on Big Data strategy, adoption, architecture and design,
as well as data engineering and modeling.
 Provide expert level guidance on Big Data best practices and standards
 Work with product owners, business SME and data ingestion and reporting architects to
identify requirements and consolidate enterprise data model consistent with business
processes
 Prioritize and scale architecture efforts in close coordination with business teams, Data Lake
operational team, and other stakeholders.
 Lead key business critical projects in the capacity of a Data Architect and Advisor
 Review business requirements and technical design documents to develop effective data
and database solutions
 Support and Influence projects/initiatives and drive decisions related to data acquisition,
processing and utilization through Big Data Platform
 Define Data Architecture standards, best practices and participate in governance activities
during Big Data projects
 Create, maintain, and define Data Architecture artifacts based on Enterprise Architecture
requirements
 Develop and maintain technical roadmap for Enterprise Big Data Platform for different
platform capabilities
 Research, evaluate and recommend tools for Enterprise Big Data Platform
 Ingesting a wide variety of structured, semi-structured and unstructured data into Big Data ecosystems via batch processing, real-time streaming and SQL.
 Expert level understanding of open source technologies around Hadoop ecosystem.
 Managing the setup of enterprise Tableau reporting for analytical reporting.
 Experience with Big Data and Big Data on Cloud, Master Data Management and Data
Governance
 Strong familiarity with data management, data governance, and best practices
 Highly competent with Data Lake concept and Big Data capabilities
 Highly competent with data modeling with relational database design and SQL
 Strong aptitude to learn business processes/products and the role of data within business
domain
 Strong understanding of NoSQL data sources, their implementation and management
 Experience with Hadoop distributions (open source and commercial)
 Strong Experience in Java, Linux/Unix
 Experience with Scala, Python, R, and Spark
 Experience with implementing Modern Data Warehouse / Data Lake
 Hands-on experience with Hadoop distributions such as Cloudera, Hortonworks, EMR, HDInsight and IBM BigInsights, and with the Hadoop architecture and technology stack (Hive, HBase, MapReduce, Sqoop, HDFS, Oozie, ZooKeeper, Informatica Big Data Edition, Kafka, Spark, Storm, Kinesis and Lambda).
 Hands-on experience with AWS Cloud technologies including EC2, EMR, S3, Redshift, Aurora, Data Pipeline, QuickSight, SQS, DynamoDB, RDS and DMS.
 Hands-on experience with scripting languages such as Python, Java and shell.
 Hands-on experience with AWS Redshift and with NoSQL databases such as MongoDB, Cassandra, HBase and DynamoDB.
 Experience with automated CM and deployment tools such as Chef, Puppet, or Ansible
 Excellent Leadership skills.
 Worked in Agile environment and sprints.
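The mixed-format ingestion pattern described in the bullets above can be illustrated with a minimal Python sketch. The feed contents, field names and normalization rule here are hypothetical, not drawn from any client engagement:

```python
import csv
import io
import json

def ingest_csv(text):
    """Ingest structured data: each CSV row becomes a record dict."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def ingest_json_lines(text):
    """Ingest semi-structured data: one JSON object per line."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def normalize(record, required=("user_id", "event")):
    """Conform records from any source to a common schema,
    dropping extra fields and filling missing ones with None."""
    return {key: record.get(key) for key in required}

# Illustrative feeds standing in for files landing in a data lake.
csv_feed = "user_id,event\n1,login\n2,purchase\n"
json_feed = '{"user_id": "3", "event": "logout", "extra": true}'

records = [normalize(r) for r in ingest_csv(csv_feed) + ingest_json_lines(json_feed)]
```

In a production pipeline this conform-to-common-schema step would typically run inside Spark or an ETL tool rather than plain Python; the shape of the logic is the same.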

TECHNICAL SKILLS:
Data Warehousing (Relational): MySQL, Oracle, SQL Server, DB2 UDB, Netezza, Hive
Data Warehousing (NoSQL/MPP): Teradata, Teradata Aster, Redshift, Vertica, Greenplum
Analytical/Columnar: HBase, SAP HANA
NoSQL: Cassandra, MongoDB, Neo4j, Elasticsearch, DynamoDB
Distros: Apache, Cloudera Distribution, Hortonworks Distribution, IBM Big Data, Informatica Big Data Edition
Big Data Frameworks: Hadoop HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, ZooKeeper, JAQL, Solr, Lucene
Stacks:  Hadoop, Spark
Data Analysis:  Hive, Pig, R, Python Data Transformation
ETL/ELT: Informatica (PowerCenter, Big Data Edition, Data Integration, Vibe), Talend, Pentaho Data Integration, DataStage, Ab Initio, SSIS, SSRS, SAP BW, IBM Big Data Edition, Amazon Data Pipeline
Data Modelling: Erwin, Visio, MDM
Data Collection: APIs, web services, SQL, ODBC calls, Amazon S3
Machine Learning: Microsoft ML, RapidMiner, Spark MLlib, IBM Data Science
Cloud/OS: Amazon Web Services, Microsoft Azure, Google Cloud, IBM Cloud
Virtualization: VMWare, Virtual Box
BI: Tableau, Business Objects, Cognos 8, OBIEE, MicroStrategy, Spotfire, Hyperion Essbase 9
Scheduling: AutoSys, Control-M, shell scripting, DAC
Languages: Java, Python, Shell Scripting, XML, SQL/PL - SQL
Load/Unload Utilities: Teradata (BTEQ, FastLoad, MultiLoad, TPump, FastExport), MySQL load utilities, Oracle load utilities, SQL Server utilities
Version Control: Git, PVCS, CVS, Subversion
OLTP: Siebel CRM, Salesforce, PeopleSoft, mainframe systems, SAP, Oracle (EBS/Finance/Supply Chain/Inventory) apps, Microsoft Dynamics, Contact Center applications
MDM: Informatica MDM, Informatica IDQ, Informatica ILM
Others: Oracle SQL Developer, TOAD, SQL Workbench, Aqua Data Studio, Toad for MySQL
EXPERIENCE SUMMARY:
Confidential
Principal Architect | Big Data Architect | Solutions Architect
Responsibilities:
 Principal Solutions Architect responsible for Modern Data Architecture, Hadoop, Big Data, data and BI requirements, and for defining the strategy, technical architecture, implementation plan, management and delivery of Big Data applications and solutions.
 Provide technical thought leadership on Big Data strategy, adoption, architecture and design,
as well as data engineering and modeling.
 Expertise in identifying the product roadmap and in product management activities.
 Provide expert level guidance on Big Data best practices and standards
 Work with product owners, business SME and data ingestion and reporting architects to
identify requirements and consolidate enterprise data model consistent with business
processes
 Prioritize and scale architecture efforts in close coordination with business teams, Data Lake
operational team, and other stakeholders
 Ensure that all measures for data quality, data governance, security and compliance of data
are adhered to
 Define and implement standards and best practices for data ingestion, analytics, and
integration
 Lead and evaluate emerging technology, industry and market trends to assist in project
development and/or operational support activities
 Lead key business critical projects in the capacity of a Data Architect and Advisor
 Review business requirements and technical design documents to develop effective data
and database solutions
 Support and Influence projects/initiatives and drive decisions related to data acquisition,
processing and utilization through Big Data Platform
 Define Data Architecture standards, best practices and participate in governance activities
during Big Data projects
 Create, maintain, and define Data Architecture artifacts based on Enterprise Architecture
requirements
 Develop and maintain technical roadmap for Enterprise Big Data Platform for different
platform capabilities
 Research, evaluate and recommend tools for Enterprise Big Data Platform
 Ingesting a wide variety of structured, semi-structured and unstructured data into Big Data ecosystems via batch processing, real-time streaming and SQL.
 Architecting cloud solutions on AWS, Microsoft Azure and Google Cloud.
 Strong understanding and implementation of advanced analytics concepts such as data mining, text analytics, stream analytics, machine learning and neural networks.
 Expert level understanding of open source technologies around Hadoop ecosystem
 Hands-on experience with scalable DW, Big Data, Hadoop and BI platform solutions using AWS Redshift, MySQL, Java, Scala, Python, Hadoop, Spark, Kafka, Oozie, HDFS, YARN, Tableau, HBase, Hive, the Hadoop ecosystem, Amazon S3, EC2, Microsoft Azure HDInsight and Hortonworks.
 Implementation of Big data solutions on the AWS Cloud platform and Azure cloud platform.
 Managing teams across different geographic regions to build and support business
applications.
 Bring a maturity model to building teams across different technical and business horizons.
 Presenting CDO strategy and roadmap to the EIM Practice, senior management and clients
for various Big Data, Data Integrations and Analytics projects.
 Involved in projects implementing Data Lake Architecture, Big Data Analytics and Modern
Data warehouse applications.
 Data Architecture and strategy implementations, Cloud implementations and strategy.
 Provide information architecture leadership within the consultant team as well as to the client
or third-party resources.
 Evaluate, select and integrate any big data tools and framework required to provide
requested capabilities, as well as comply with regulatory policies and architecture standards
as necessary.
 Develop and Implement Hadoop, Spark, Big Data Analytics and Integrations, Microsoft Azure
cloud data solutions, AWS Cloud Big Data solutions.
 Enterprise Tableau Reporting Architecture and Implementations.
 Provide architecture recommendations, strategy, data architecture and implement enterprise
data solutions.
 Evaluation of best-of-breed Big Data, Hadoop and cloud-based products for various types of data integration and processing implementations.
 Successful delivery of the projects right from conceptual phase to the final product.
 Implementation of Hadoop, Big Data, Real time and Application integration at enterprise
level for various clients.

Confidential
Big Data Architect
Responsibilities:
 Responsible for assessing application, Big Data, Hadoop, data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing both the long-term strategy and technical architecture and the short-term scope for a multi-phased Big Data application and data warehouse effort.
 Architecting, managing and delivering the technical projects /products for various business
groups.
 Implementations on AWS Cloud solutions for Bigdata stack.
 Building and setting up the Hadoop eco systems and stack from the ground up.
 Hands on implementation of Big Data Analytics and applications in cloud environment
deployments.
 Proof-of-concept projects implementing Spark and Hadoop applications using Java, Python and Scala.
 Implementing data engineering projects using data pipelines, Java, Python, Unix, SQL, Scala, NoSQL and RDBMS.
 Ingesting a wide variety of structured, semi-structured and unstructured data into Big Data ecosystems via batch processing, real-time streaming and SQL.
 Hands-on experience with scalable DW, Big Data, Hadoop and BI platform solutions using AWS Redshift, Oracle, MySQL, Talend DI, Java, Hadoop, Spark, Tableau, HBase, Hive, the Hadoop ecosystem, Amazon S3 and EC2.
 Implementation of Big data solutions on the AWS Cloud platform.
 Involved in ingesting different types of data streams into the Data Lake, such as clickstream and web log data, for near-real-time micro-batch processing.
 Implementation of Data lake architecture to support the different products and business
analytics needs.
 Managing the data architecture, file system storage and maintenance of Hadoop clusters.
 Hands on with Hadoop administration, security and management of the clusters.
 Responsible for delivering end to end data products, business analytics applications and
product engagement apps.
 Managing end to end technical delivery of data products and projects.
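The clickstream and web-log micro-batch ingestion described above can be sketched in plain Python; the log format and batch size are invented for illustration, and a real deployment would typically use Spark Streaming or Kafka consumers instead:

```python
from itertools import islice

def parse_log_line(line):
    """Parse a simplified web-log line: 'timestamp user_id url' (illustrative format)."""
    ts, user_id, url = line.split()
    return {"ts": ts, "user_id": user_id, "url": url}

def micro_batches(lines, batch_size):
    """Group a (possibly unbounded) stream of log lines into fixed-size micro-batches."""
    it = iter(lines)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield [parse_log_line(line) for line in batch]

# A tiny stand-in for a clickstream feed.
stream = [
    "2016-01-01T00:00:00 u1 /home",
    "2016-01-01T00:00:01 u2 /cart",
    "2016-01-01T00:00:02 u1 /checkout",
]
batches = list(micro_batches(stream, batch_size=2))
```

Each yielded batch can then be written to the lake as one file or one Spark partition, which is the essence of micro-batch processing.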

Confidential
Data Architect | Lead Data Engineer
Responsibilities:
 Responsible for assessing application, DW, Big Data, Hadoop and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing both the long-term strategy and technical architecture and the short-term scope for a multi-phased data warehouse effort.
 Leading a team of developers both onsite and offshore for data integration and data
reporting requirements.
 Architecting, managing and delivering the technical projects /products for various business
groups.
 Implementations on AWS Cloud solutions for Big data stack.
 Hands-on experience with scalable DW, Big Data, Hadoop and BI platform solutions using AWS Redshift, Oracle, MySQL, Talend DI, Python, Java, Hadoop, Kafka, Spark, Tableau, HBase, Presto, Hive, Cassandra, the Hadoop ecosystem, Amazon S3, EC2 and EMR.
 Implementing Data Lake architecture and a data hub for a wide variety of data streams, including Adobe Omniture data, web log data, transactional data, CRM, email marketing campaign data, external vendor data and micro-batch processing.
 Working on POCs implementing Spark and Hadoop applications using Java, Python and Scala.
 Implementing data engineering projects using Java, Python, SQL, Scala, NoSQL and RDBMS.
 Implemented various solutions including Big Data POCs, data integration, data-consuming platforms, B2B applications, consumer-facing applications and reporting applications.
 Define key business drivers for the data warehouse initiative
 Deliver a project scope that directly supports the key business drivers
 Implementation of Tableau reporting at enterprise level.
 Ingesting a wide variety of structured, semi-structured and unstructured data into Big Data ecosystems via batch processing, real-time streaming and SQL.
 Implementation of Data lake architecture to support the different products and business
analytics needs.
 Managing the data architecture, file system storage and maintenance of Hadoop clusters.
 Hands on experience with Hadoop administration, security and management of the clusters.
 Responsible for delivering end to end data products, business analytics applications and
product engagement apps.
 Managing end to end delivery of data products and projects.

Confidential
Technical Architect | Technical Lead
Responsibilities:
 Responsible for assessing Big Data, data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing both the long-term strategy and technical architecture and the short-term scope for a multi-phased data warehouse effort.
 Leading a team of developers both onsite and offshore for data integration and data
reporting requirements.
 Architecting, managing and delivering the technical projects /products for various business
groups.
 Implementation of big data pipelines, Hive jobs and Pig jobs on Hadoop clusters.
 Managed and delivered various technical projects for different business groups such as call center, manufacturing, sales and CRM.
 Design and development of complete Data warehouse and system data integration, platform
development and BI reporting solutions.
 Solution architecture for various Business intelligence systems.
 Managing the system support and stability of the BI applications.
 Drive new technological solutions to support data and cater to different business needs.
 Delivering complex data integration and reporting projects to support various business needs.
 Implemented various technologies like data integration, data consuming platforms, B2B
applications, consumer facing applications and reporting applications.
 Implemented Big Data initiatives using Hive, Pig and Hadoop on the Big Data technology stack.
 Define key business drivers for the data warehouse initiative
 Deliver a project scope that directly supports the key business drivers
 Supporting the different business user groups for various BI reporting requirements in Agile
and Traditional Data warehousing methodologies.
 Define, Design and Build the overall data warehouse architecture using Informatica, Oracle
and Teradata for ETL and Business Objects/Tableau for reporting.
 Define technical requirements, technical and data architectures for the data warehouse
 Recommend/select data warehouse technologies (e.g., ETL, DBMS, Data Quality, BI)
 Design and develop the ETL process, including data quality and testing
 Developing the ETL process integrating with the source systems, staging areas,
Presentation layer (BI layer) for BI projects and reporting requirements.
 Design and direct the information access and delivery effort for the data warehouse
 Define and develop the implementation of security requirements for the data warehouse
 Deeply involved in the development cycle: designing ETL processes using Oracle and setting up Informatica workflows, mappings and sessions per the interface requirements for processing data into the EDW area.
 Data Architecture and modeling using various tools like ERWIN, SQL Data Modeler etc.
 Technical reviews with the Architects, project managers, collaboration teams and project
stakeholders.
 Data migration projects from CRM, OLTP, ODS applications to the EDW area for supporting
the business needs.
 Involved in developing BI Reporting needs using SAP Business Objects and Tableau by
designing BO universes, User reports, Crystal reports on top of the staging tables.
 Fine-tuned the performance of ETL interfaces, Oracle PL/SQL procedures and Teradata procedures used in building EDW jobs/interfaces.
 Batch scheduling of Informatica and Teradata jobs using AutoSys and shell scripts.
 Providing POCs for business user needs and requirements on various new tools and technologies.
 Managing production support related activities with the support team.
 Define metadata standards for the data warehouse.
 Direct the data warehouse meta data capture and access effort.
 Define production release requirements and sustainment architecture.
 Post Production Warranty support for the systems that have been implemented.
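A toy version of the staging-to-presentation ETL flow with a simple data-quality gate, as described in the bullets above; SQLite stands in here for Oracle/Teradata, and the order tables and rejection rule are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging area: raw rows landed as-is from a source system (illustrative schema).
cur.execute("CREATE TABLE stg_orders (order_id TEXT, amount TEXT)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [("1", "10.50"), ("2", "bad"), ("3", "7.25")])

# Presentation (BI) layer: typed, quality-checked rows only.
cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")

# ETL step with a simple data-quality rule: reject non-numeric amounts.
rejected = []
for order_id, amount in cur.execute("SELECT order_id, amount FROM stg_orders").fetchall():
    try:
        row = (int(order_id), float(amount))
    except ValueError:
        rejected.append(order_id)
        continue
    cur.execute("INSERT INTO dw_orders VALUES (?, ?)", row)

loaded = cur.execute("SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone()
```

The same land-in-staging, apply-quality-rules, load-conforming-rows shape is what the Informatica/Teradata interfaces above implement at scale.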

Confidential
Project Lead | Technical Architect
Responsibilities:
 Responsible for assessing data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing both the long-term strategy and technical architecture and the short-term scope for a multi-phased data warehouse effort.
 Architecting, managing and delivering the technical projects /products for various business
groups.
 Hands-on experience with scalable data and reporting platforms, including IBM Big Data Edition, MicroStrategy, DB2 and Unix.
 Managing and leading a technical team of around 10 onsite and offshore members.
 Leading team of developers for technical delivery of projects.
 Design, architect and develop various data integration and reporting projects ranging from medium to complex in nature.
 Providing solution architecture for various Data warehouse and Business intelligence
systems.
 Maintain system support and stability of the BI applications.
 Drive new technological solutions to support data and catering to different business needs.
 Delivering complex data integration and reporting projects to support various business needs
 Serving as a Senior DW ETL Architect/Lead in defining, developing, and enhancing custom
DW/BI system
 Primary responsibilities include but not limited to coordinating, planning, and conducting
interviews with various business users in identifying and capturing business requirements.
 Defining and developing project scope/plan, creating and developing Enterprise wide Data
warehouse and multi-dimensional Datamart data models.
 Designed, Developed and implemented complex data models for large enterprise data like
Merchant Credit Card system, Card member data reporting and analytics solutions.
 Mainly involved in architecting the project right from the initial design phase, existing
reporting system migration to the new proposed systems, building the ETL integration part,
Staging layer and BI layer for reporting solutions.
 Designed and developed the EDW model, comprising 13 subject areas and data marts supporting various needs of the business, using DB2, DataStage, Control-M and MicroStrategy.
 Designing data profiling strategies for improving the source data quality that is being loaded
to the EDW systems.
 Supported the production support teams during the pre- and post-warranty periods of BI reporting projects, resolving production failures and issues on time.
 Proposed and implemented performance tuning improvements to ETL processes and reporting solutions to attain optimal performance and meet the business's SLA requirements.
 Training business users on how to navigate the environment and create and access reports.
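The source-data profiling strategy mentioned above can be sketched as follows; the row shape and the two metrics (null count and distinct count per column) are illustrative only:

```python
def profile(rows):
    """Profile source rows before an EDW load: per-column null counts
    and distinct-value counts (a simplified illustration)."""
    columns = rows[0].keys() if rows else []
    report = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

# Hypothetical source extract standing in for a CRM feed.
source_rows = [
    {"customer_id": 1, "segment": "retail"},
    {"customer_id": 2, "segment": None},
    {"customer_id": 3, "segment": "retail"},
]
report = profile(source_rows)
```

A profile like this, run against the staging area, is what flags quality problems (unexpected nulls, low cardinality) before rows reach the EDW.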

Confidential
Sr. Software Engineer | Lead
Responsibilities:
 Lead a team of people for various data projects.
 Implemented enterprise level data integration and data warehouse projects.
 Design, architecture and development of various data integration and reporting projects ranging from medium to complex in nature.
 Solution architecture for various Business intelligence systems.
 Manage system support and stability of the BI applications.
 Lead and involved technical development team in construction of ETL, BI solutions and Data
warehousing implementations.
 Interacting with the project stakeholders, technical teams, source systems involved in the
project life cycle
 Designed data marts employing Kimball’s Dimensional Modeling methodologies
 Designed, developed and implemented the REDW and data marts for enterprise-wide financial credit systems using IBM Information Server DataStage, Informatica, Siebel OBIEE Suite, Oracle and DAC.
 Gathered the requirements, completed the proof of concept, designed, developed and tested
Physical Layer, Business Layer and Presentation Layer of OBIEE.
 Implemented the Guest CRM applications' integration to the EDW to enable reporting solutions using DataStage, Control-M, DB2, etc.
 Heavily involved in Informatica mappings, workflows and jobs for complete ETL interfaces and BI reporting products.
 Designed, created and tested reports, dashboards and created ad-hoc report according to
the client needs
 Designed, Developed and Tested Data security and Dashboard security in OBIEE
 Worked with Data Governance board on developing roles, tasks, processes, standards, etc.
 Involved in the development of large, enterprise-wide data warehouse implementations for Target Financial Services and Target Guest systems.
 Implemented OBIEE for Development, QA and Production environments.
 Track the project efficiently with constant updates on progress and hurdles, focusing on the overall deliverables.
 Provided production support during the pre- and post-production warranty periods of implemented projects.
