
Gopi Pittala

224-587-9796
gopipittala@yahoo.com

SUMMARY:
 22 years of experience, including extensive Cloud, Ab Initio, DataStage, Informatica, Cognos, Talend, Teradata,
Redshift, Snowflake, and Azure SQL data warehousing/data modeling experience in the areas of data
administration, physical database design, system architecture, database performance tuning, and ETL design and
development. Also skilled in the implementation of end-user data access tools and client/server reporting
applications.
 2 years of experience with Hadoop technologies such as HDFS, Hive, Pig, Impala, YARN, Spark, NoSQL, Python, and
Sqoop. Successfully implemented two applications on Hadoop.
 10+ years of experience using one or more scalable, massively parallel processing (MPP) relational databases such as
Teradata, Redshift, Snowflake, and Azure SQL Data Warehouse for advanced analytics.
 4 years of hands-on experience with AWS S3, EC2, IAM, and Redshift services.
 Experience with SAP Supply Chain and Sales and Distribution data.
 13 years of experience creating logical and physical data models and converting logical data models into
Teradata/Redshift/Snowflake physical database designs using Hackolade/Erwin/ER Studio/SAP PowerDesigner,
along with system implementation, performance tuning, and support.
 Strong knowledge of leading concepts in Data Quality, Master Data Management and Data Governance.
 Hands on experience with IBM MVS mainframe environment using JCL, TSO, ISPF, IBM utilities and DB2.
 Proven record of success in design, development and implementation of software applications using object-
oriented technology.
 Excellent communication and interpersonal skills; adept at working with offsite teams.
TECHNICAL SKILLS:
 Servers: IIS, WebLogic, WebSphere, ATG Dynamo
 Databases: Teradata, Oracle, DB2, SQL Server, MySQL, PostgreSQL, Redshift and Snowflake
 Teradata Tools & Utilities: Queryman/SQL Assistant, WinDDI, Teradata Query Manager, BTEQ, FastLoad,
MultiLoad.
 Reporting Tools: Crystal Reports, Tableau, Power BI and Cognos
 ETL tools: Informatica, DataStage, Ab Initio and Talend
 Bigdata: Hadoop, HDFS, Sqoop, Impala, Python, Hive, Pig
 Cloud: AWS S3, Athena, RDS, PostgreSQL, DynamoDB, MongoDB, Redshift, Snowflake
 ERP: SAP HANA/S4
 Data Model: Erwin, E/R Studio, Power Designer and Hackolade.

PROFESSIONAL EXPERIENCE:
Cloud9, Chicago, IL Jul 2019 to Present
Data Architect
 Designed and developed ETL, replication schemas, and query optimization techniques that supported highly
varied (structured, semi-structured, and unstructured) and high-velocity (near real-time) data processing and
delivery.
 Developed standards and methodologies for benchmarking, performance evaluation, testing, data security, and
data privacy.
 Owned and managed all changes to the data models. Created data models, solution designs and data
architecture documentation for complex information systems.
 Wrote Company's MetaData standards and practices, including naming standards, modeling guidelines and Data
Warehouse Strategy.
 Gathered and analyzed business requirements and designed database architecture for reporting or analytics as
required as well as prepared accurate and complete documentation for data design and data-flow work
 Expanded the data warehouse to meet new business initiatives: logical data model design, physical data model
design, performance enhancement, and data mart construction.

McDonald’s, Chicago, IL May 2017 to Jun 2019
Data Architect/Sr. Data Modeler
 Involved in meetings to gather information and requirements from the business analysts.
 Created conceptual, logical, and physical data models and assisted the DBA in developing the physical data model
(PDM) with Erwin.
 Provided naming standards and conventions and ensured that data dictionaries were maintained across multiple
database environments.
 Worked on AWS Data Pipeline to configure data loads from S3 into Redshift, AWS RDS, Snowflake, and PostgreSQL.
 Used JSON schema to define table and column mapping from S3 data to Redshift
 Performed a proof of concept (POC) on the Snowflake database on AWS.
 Experience with MPP columnar databases such as Redshift, Snowflake.
 Optimized and tuned the Snowflake environment for Tableau visual analytics.
 Designed tables for a NoSQL database (MongoDB).
 Involved in migrating data from Teradata to Redshift database.
 Implemented data quality and data governance for UberEATS and SAP Supply chain master data.
 Defined MDM strategy and architecture, including customer, product, and supplier data.
 Created tables to load large sets of structured, semi-structured and unstructured data.
 Developed Talend jobs to populate the claims data to data warehouse - star schema.
 Created data models of the following subject areas: Supply Chain, POS, UberEATS, Customer, and Labor.
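One of the bullets above mentions mapping S3 JSON data into Redshift columns with a JSON schema. A minimal sketch of that pattern follows; all table, bucket, and IAM role names here are hypothetical placeholders, not values from any actual project.

```python
import json

# A jsonpaths document tells Redshift COPY which JSON field feeds each
# target column, in column order. Field names here are illustrative.
jsonpaths = {
    "jsonpaths": [
        "$.order_id",
        "$.store_id",
        "$.order_total",
    ]
}

def build_copy_statement(table, s3_uri, jsonpaths_uri, iam_role):
    """Assemble a Redshift COPY command that loads JSON data from S3,
    mapping fields to columns via the jsonpaths file."""
    return (
        f"COPY {table} FROM '{s3_uri}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        f"FORMAT AS JSON '{jsonpaths_uri}';"
    )

stmt = build_copy_statement(
    "sales.orders",
    "s3://example-bucket/orders/",
    "s3://example-bucket/jsonpaths/orders.json",
    "arn:aws:iam::123456789012:role/RedshiftLoad",
)
print(stmt)
```

The jsonpaths file would be serialized with `json.dumps(jsonpaths)` and uploaded to S3 alongside the data before the COPY runs.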
Sears Holdings Corporation, Chicago, IL Aug 2012 to Apr 2017
Sr. Data Architect/ETL Architect
 Created logical and physical data model for Party using Teradata rLDM.
 Implemented the Teradata temporal data type for the time dimension. Supported ETL and BI teams in creating SLV
and SLT.
 Integrated data from multiple database sources into Hadoop.
 Created logical & physical data models with SAP PowerDesigner.
 Performed ETL and ELT using Pig, Hive, and MapReduce as per requirements.
 Implemented data quality and data governance for the customer data warehouse.
 Imported data from the Teradata database using FastExport scripts and loaded it into HDFS.
 Designed extracts from the current data warehouse to load into data lakes for reporting.
 Processed data into HDFS by developing solutions, analyzed the data using MapReduce, Pig, and Hive, and
produced summary results from Hadoop for downstream systems.
 Used Sqoop to move data between the Hadoop Distributed File System (HDFS) and an RDBMS.
 Comprehensive understanding of MDM domains - Customer, Product and Vendor Domains as well as Business
Rules and workflows within an MDM application.
 Designed and created master data for Customer and Product.
 Experienced in Agile development processes and methodologies.
 Worked on Microsoft Flow, Power BI, and SSIS.
 Created Tableau worksheets involving schema import and implementation of business logic through customization.
 Designed and converted Teradata schema to Redshift. Extracted and loaded the existing warehouse data to
Redshift.
 Created ad hoc reports for users in Tableau by connecting various data sources.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application
level.
 Created LDM and PDM for member analytics data.
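The Sqoop bullet in this section describes moving data between HDFS and an RDBMS. A minimal sketch of assembling such a Sqoop invocation follows; the connection string, table, and HDFS path are hypothetical, and building the argv as a list keeps it safe to hand to `subprocess.run` without shell quoting.

```python
# Illustrative only: Sqoop's `export` subcommand pushes HDFS files into
# an existing RDBMS table. All names below are placeholders.
def sqoop_export_args(jdbc_url, table, export_dir, num_mappers=4):
    """Build the argv for a Sqoop export from HDFS into an RDBMS table."""
    return [
        "sqoop", "export",
        "--connect", jdbc_url,      # JDBC URL of the target database
        "--table", table,           # target table to receive the rows
        "--export-dir", export_dir, # HDFS directory holding the data
        "--num-mappers", str(num_mappers),  # parallel export tasks
    ]

args = sqoop_export_args(
    "jdbc:teradata://tdprod/DATABASE=edw",
    "customer_summary",
    "/data/summary/customer",
)
print(" ".join(args))
```

The reverse direction (RDBMS into HDFS) swaps `export` for `import` and `--export-dir` for `--target-dir`.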
Kohl’s, Milwaukee, WI Jun 2012 to Jul 2012
Sr. Teradata Data Architect
 Involved in meetings to gather requirements from the Business Analysts.
 Created logical & physical data models with ER Studio.
 Created logical and physical data model for Purchase Order Agreements using Teradata rLDM.
 Implemented Teradata temporal data type for time dimension.
 Created data pipeline from Salesforce to Teradata for Customer data management.
 Supported ETL and BI teams in creating SLV and SLT.
GE Aviation, Cincinnati, OH Feb 2012 to May 2012
Sr. Data Architect
 Involved in meetings to gather requirements from the Business Analysts.
 Analyzed the existing data and designed logical and physical data models.
 Created logical and physical data models using Teradata mLDM.
 Implemented Teradata temporal data type for time dimension.
 Provided naming standards and conventions and ensured that data dictionaries were maintained across multiple
database environments.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application
level.
 Designed performance tuning to make the best use of secondary, join, and hash indexes.
 Created data models of the following subject areas: Party, Organization, Geographic Location, Item, Sales
Order/Return, Contract, and Invoice.
Teradata Corp / GE Healthcare, Milwaukee, WI Nov 2011 to Jan 2012
Sr. Data Architect
 Involved in meetings to gather requirements from the Business Analysts.
 Analyzed the existing data and designed logical and physical data models.
 Created logical and physical data models using Teradata mLDM.
 Provided naming standards and conventions and ensured that data dictionaries were maintained across multiple
database environments.
 Implemented Teradata temporal data type for time dimension.
 Mapped logical and technical requirements to the ODS model.
 Created data models of the following subject areas: Party, Organization, Geographic Location, Item, Sales
Order/Return, Contract, and Adjustment.
Navistar International, Lisle, IL Jan 2011 to Oct 2011
Sr. Data Architect
 Involved in meetings to gather requirements from the Business Analysts.
 Created logical and physical data models using Teradata mLDM.
 Created conceptual, logical & physical data models and assists the DBA in developing the physical data model
(PDM).
 Analyzed the existing data and designed logical and physical data models.
 Provided naming standards and conventions and ensured that data dictionaries were maintained across multiple
database environments.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application
level.
 Designed performance tuning to make the best use of secondary, join, and hash indexes.
 Created and maintained metadata related to the model.
 Created data models of the following subject areas: Party (Dealer, Customer, Supplier, and Organization), General
Ledger, Sub Ledger, Journal Entry, Chart of Accounts Balance, Geographic Location, Invoice, Item, Cost, Contract,
and Adjustment.
RJR, Winston-Salem, NC Nov 2007 to Dec 2010
Sr. Data Architect/Sr. Data Modeler
 Involved in meetings to gather information and requirements from the business.
 Created conceptual, logical, and physical data models and assisted the DBA in developing the physical data model
(PDM).
 Analyzed the existing data and designed logical and physical data models.
 Provided naming standards and conventions and ensured that data dictionaries were maintained across multiple
database environments.
 Created data pipeline from Salesforce to Teradata.
 Created and maintained metadata related to the model.
 Designed data structures and created ERDs, working with standards across the full system development life cycle.
 Created data models of the following subject areas: Customer, Job & Call, PRP Payments and Contracts. Designed
new database structures for Customer, Job & Call and Contracts.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application
level.
 Designed performance tuning to make the best use of secondary, join, and hash indexes.
 Created logical and physical data models using Teradata mLDM.
Gap/IBM, San Bruno, CA Sep 2006 – Oct 2007
Teradata DBA/ Architect
 Involved in meetings to gather information and requirements from the clients.
 Monitored and maintained a production Teradata database environment, including runtime optimization, capacity
management and planning, security, configuration, scheduling, and execution of maintenance utilities.
 Technical expert in the areas of relational database logical design, physical design, and performance tuning of the
RDBMS.
 Maintained and tuned Teradata production and development systems. Supported application development
timelines by implementing designs, as well as incremental changes to database definitions, in a timely manner in
production and non-production Teradata systems.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application
level.
 Performed tuning and optimization of database configuration and application SQL. Defined, implemented, and
administered database backup and recovery strategies. Created and maintained users for production and
development Teradata systems.
 Designed performance tuning to make the best use of secondary, join, and hash indexes.
 Involved in data encryption (Protegrity) installation.
 Completed the physical database design and administered the production objects in a change control
environment.
 Expanded the data warehouse to meet new business initiatives: logical data model design, physical data model
design, performance enhancement, and data mart construction.
 Involved in the database upgrade from V2R5.1 to V2R6.2.
IRI/Acxiom Jul 2005 to Aug 2006
Sr. ETL/Datastage Consultant
 Involved in meetings to gather information and requirements from the clients.
 Design and Develop ETL processes using DataStage, PL/SQL, Oracle, Perl and Unix shell programming for FDW
data warehouse.
 Developed complex ETL jobs using DataStage Parallel Extender's parallel processing capabilities.
 Helped team members tune and improve performance when handling high-volume ETL jobs.
 Resolve technical and functional issues with the design through test phase of development.
 Implemented data staging, cleansing and transformation mechanisms, normalized databases, Star/ Snowflake
schemas, and Business Intelligence delivery mechanisms as required, including loading of data marts.
 Developed Unix shell scripts to automate the Data Load processes to the target Data warehouse.
 Developed migration strategy and implemented DataStage 7.1 to 7.5 version upgrade.
 Involved in maintenance and ongoing production support for the current system, including a 24x7 on-call
rotation.
 Assist the business analyst in mapping the data from source to target and prepare prototype models to support
the development efforts.
 Design and develop reusable components for data sourcing, cleansing, and integration.
 Define programming standards, procedures and best practices in support of ETL processes.
 Develop and implement a standard approach for ETL error processing, audits, and controls and development of
communication and resolution procedures.
 Successful project and team leadership including but not limited to: daily task assignments, issue resolution,
status reporting and resource allocation along with Onshore – Offshore Experience.
Sprint/IBM, KS Mar 2004 to Jun 2005
Sr. Ab Initio/Teradata Consultant
 Involved in meetings to gather information and requirements from the clients.
 Involved in designing the ETL process to extract, transform, and load data from an OLTP Oracle database system to
the Teradata data warehouse.
 Technical expert in the areas of relational database logical design, physical design, and performance tuning of the
RDBMS.
 Performed performance tuning of new and existing SQL and assisted users with tuning queries at the application level.
 Maintained and tuned Teradata production and development systems. Supported application development
timelines by implementing designs, as well as incremental changes to database definitions, in a timely manner in
production and non-production Teradata systems.
 Created mainframe datasets, submitted JCL, and checked dataset files using File-AID on a 3278 terminal.
 Involved in unit testing, systems testing, integrated testing and user acceptance testing.
 Developed Ab Initio graphs based on Detailed Design documents and Reviewed High Level Design document for
data movement.
 Involved in creating Flat files using dataset components like Input file, Output file, Intermediate file in Ab Initio
graphs.
 Extensively used transform components: Aggregate, Join (match sorted), Denormalize Sorted, Reformat, Rollup,
and Scan.
 Implemented component-level, pipeline, and data parallelism in Ab Initio for the data warehouse ETL process.
 Extensively used partition components (Broadcast, Partition by Key, Partition by Range, and Partition by
Round-robin) and departition components (Concatenate, Gather, and Merge) in Ab Initio.
 Loaded Amdocs/Little Billers files into Teradata using FastLoad, FastExport, MultiLoad, and TPump.
 Developed Mapplets using corresponding Source, Targets and Transformations.
 Designed and documented the validation rules, error handling and test strategy of the mapping.
 Involved in Query Analyzing, performance tuning and testing.
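The last bullet above mentions loading flat files into Teradata with FastLoad. A minimal sketch of generating such a load script follows; the host, credentials, table, columns, and file name are all hypothetical placeholders for a pipe-delimited billing extract.

```python
# Illustrative only: renders a bare-bones Teradata FastLoad script that
# bulk-loads a pipe-delimited file into an empty staging table.
def fastload_script(table, data_file):
    """Return the text of a FastLoad script for the given table/file."""
    return "\n".join([
        ".LOGON tdhost/loaduser,password;",            # placeholder credentials
        f"BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;",
        ".SET RECORD VARTEXT '|';",                    # pipe-delimited input
        "DEFINE acct_no (VARCHAR(18)), bill_amt (VARCHAR(12))",
        f"FILE={data_file};",
        f"INSERT INTO {table} VALUES (:acct_no, :bill_amt);",
        "END LOADING;",
        ".LOGOFF;",
    ])

script = fastload_script("stg_billing", "billing.txt")
print(script)
```

The generated script would be fed to the `fastload` utility; MultiLoad and TPump scripts follow a similar shape with their own control statements.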
NCR Corporation, Dayton, OH May 2002 to Feb 2004
Sr. Java/Teradata Developer
 Designed and developed Teradata load scripts on a Teradata V2R5/5.1 platform in a UNIX environment.
Performed DBA functions and data validation.
 Created the table views, stored procedures and macros for CPP.
 Created tables, views, macros, stored procedures for ERR, Event Plus page.
 Monitored and tuned the Teradata database and ETL scripts, written in Teradata SQL, for performance.
Developed report downloads in Excel format.
 Imported sources and targets to create mappings based on business logic and developed transformations using
PowerCenter Designer. Used Informatica Workflow Manager and Workflow Monitor to create sessions and batches.
 Worked on all the transformations like Lookup, Aggregator, Expression, Filter, Router, Update Strategy, Stored
Procedure and Sequence Generator.
 Involved in designing the ETL process to extract, transform, and load data from an OLTP Oracle database system to
the Teradata data warehouse.
 Extensively involved in the design of data extraction, transformation, and migration.
 Designed and implemented an automated system for ongoing ETL job performance and Report creation, logging
start and end time of each step in each ETL Job and each Report.
 Involved in Query Analyzing, performance tuning and testing.
Deep Nines Inc, San Diego, CA Sep 2001 to Apr 2002
Java Developer
Sleuth9 is a network security product that automatically controls DDoS attacks, reduces security breaches, and
protects the network around the clock.
 Designed the database schema for storing defects.
 Developed a defect-reporting system to enter defects into the database using JFC Swing.
 Developed tracking of information such as IP address, hostname, and login during choking and tracking.
 Developed the SMI Filter monitoring system for online tracking.
 Developed IP Filter monitoring to display online tracking and choking information while Sleuth9 was running.
 Developed a website for Deep Nines.
 Developed a mail notification system, using sockets, that sent alerts when DDoS attacks occurred.
NCR Corporation, San Diego, CA Mar 2000 to Aug 2001
Java/Teradata Developer
 Developed the GUI for two different versions. Had sole responsibility for developing the 2.0 GUI and integrating it
with the Teradata database. In the GUI, used many Swing features, such as JTable for reporting, JProgressBar for
scoring and analysis, JSlider for selecting clusters, and JPanel for variable registration. Developed server-side
programs to retrieve data from the database, and SQL programs to connect to the database and retrieve data
based on the variable selection criteria. Used FastExport to transfer data from a table to a local text file. Created
batch files and ran them at run time. Developed several servlets that interact with an Oracle database.
 Built data providers on different Teradata tables, such as queries on Universes, free-hand SQL, and stored
procedures.
 Developed JSPs for user interaction.
 Performed client-side validations using JavaScript.
Motorola Computer Group, Phoenix, AZ Sep 1999 to Feb 2000
Java Developer
The Customer Problem Reporting (CPR) system is a hardware and software solution aimed at replacing the
Technical Action Request (TAR) system used by the Motorola Computer Group (MCG) for reporting and tracking
vendor, customer, and internally found problems against MCG hardware and software products. The project is
divided into four modules. A general user can search and view CPR reports. An engineer can create, edit, request
closure of, and search and view CPR reports. A manager can assign an owner and create, edit, request closure of,
and view reports. An administrator can do all of the above.
 Developed search forms using the Apache Element Construction Set and HTML.
 Modified web configuration files.
 Developed SQL programs to retrieve data from the database based on search criteria.
 Developed server-side programs to retrieve data from the database.

Worked as a Programmer/analyst in India from Aug 1996 to Aug 1999.

EDUCATION AND TECHNICAL CERTIFICATION:
 Sun Certified Java Programmer.
 SnowPro Core Certified.
 Master of Computer Applications from Osmania University.
 Bachelor of Commerce from Nagarjuna University.
