RajasekharSF Resume
PROFESSIONAL EXPERIENCE
Around 5 years of professional experience in IT spanning analysis, design, development, documentation, implementation, job monitoring and scheduling, unit testing, deployment, and post-deployment maintenance phases of the Software Development Life Cycle.
Around 3 years of experience with Snowflake Cloud Data Warehouse, Teradata, AWS S3, DMS, RDS, data analysis, and implementation of data warehouse solutions, both on-premises and in the cloud.
Hands-on experience creating and maintaining warehouses, databases, and schemas.
Hands-on experience creating resource monitors, views, shares, and reader accounts, and implementing network policies, including IP whitelisting.
Hands-on experience with data loading, transformations, metadata configuration, Snowpipe configuration, AWS-Snowflake integration, creating stages and tasks for scheduling jobs, using virtual warehouses, and cloning data.
Good exposure to Data Sharing, Zero-Copy Cloning, and Time Travel in Snowflake.
Hands-on experience loading and unloading data to/from Snowflake Cloud Data Warehouse using the COPY command.
Used COPY, INSERT, PUT, and GET commands to load data into Snowflake tables from internal and external stages.
Handled large, complex datasets, such as CSV files from various sources.
Used Python frameworks to load data from upstream systems such as SQL Server, MySQL, and AWS S3 into the Snowflake data warehouse.
Involved in loading CSV-formatted data into Snowflake.
Good exposure to creating views and roles for reporting teams to share Snowflake data.
Created task flows to automate data loading and unloading between Snowflake Cloud Data Warehouse and AWS S3.
Involved in planning, design, monitoring and scheduling, testing, and release/delivery of projects using Agile methodology.
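As a concrete illustration of the stage-based COPY loading described above, here is a minimal sketch of a Python helper that assembles the PUT and COPY INTO statements for loading a CSV file through an internal stage. The table, stage, and file names are hypothetical; a real job would execute these statements over a Snowflake connection.

```python
# Sketch: build the PUT/COPY INTO statements used to load a CSV file
# into a Snowflake table via an internal stage. Table, stage, and file
# names below are hypothetical examples.

def build_load_statements(local_file: str, stage: str, table: str) -> list[str]:
    """Return the SQL statements for a stage-based CSV load."""
    return [
        # Upload the local file to the named internal stage.
        f"PUT file://{local_file} @{stage} AUTO_COMPRESS=TRUE",
        # Copy the staged file into the target table, skipping the header row.
        f"COPY INTO {table} FROM @{stage} "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)",
    ]

put_sql, copy_sql = build_load_statements("/tmp/orders.csv", "load_stage", "ORDERS")
```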
TECHNICAL SKILLS
EDUCATIONAL QUALIFICATION
Project 1:
Company : Capgemini
Client : T-Mobile
Title : Data Migration
Duration : March 2023 to present
Role : Snowflake Developer
Project Description:
Led a high-impact data migration initiative at T-Mobile, orchestrating the seamless transition of large volumes of data from Teradata to AWS S3 and subsequently into the Snowflake Data Warehouse. Leveraged expert proficiency in T-Mobile's own ETL frameworks, such as TDI, TSM, and TDAP, to optimize data flow. Orchestrated the migration, ensuring data-loading issues were addressed efficiently. Demonstrated advanced AWS skills, specifically in using S3 for efficient storage and retrieval.
Collaborated effectively with cross-functional teams and communicated complex technical concepts to non-technical stakeholders. Contributed to establishing Snowflake as the central data warehousing solution, forming the foundation for the entire reporting system.
Roles & Responsibilities:
Designed, developed, and implemented TSM and TDI jobs for the Teradata-to-Snowflake migration project.
Migrated TDAP workloads from Teradata to Snowflake using ADF pipelines.
Conducted HRA data validation to ensure data accuracy, consistency, and compliance.
Actively involved in BTEQ conversions and identified substantial changes required in SQL scripts.
Converted stored procedures for the Data Labs team, strengthening SQL knowledge and gaining hands-on experience in translating code.
Proactively identified and resolved issues for the Marketing Data Validation team, demonstrating technical skills and collaboration abilities.
Prepared data load configurations.
Built Python scripts to perform new data loads.
Handled both streaming and batch-processing jobs.
Implemented a framework in Snowflake to process daily loads using Snowpipe and Tasks.
Created data shares and reader accounts for business users as per requirements.
Used Git Bash to sync the Dev environment with the Production/Test environments.
Cloned testing environments from production to perform system testing.
Performed data loads from SQL Server and AWS S3 into Snowflake stage tables.
Wrote SQL queries for data handling and related operations.
Created various views and roles for reporting teams to share Snowflake data.
Designed various file formats to load data into Snowflake through internal and external stages.
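The daily-load framework mentioned above combines Snowpipe for ingestion with Tasks for scheduling. As a sketch only, a Python helper might emit the CREATE TASK DDL for a scheduled daily load; the warehouse, task, and stored-procedure names here are hypothetical, and a real framework would run this DDL over a Snowflake connection.

```python
# Sketch: emit the CREATE TASK DDL used to schedule a daily Snowflake
# load. Warehouse, task, and stored-procedure names are hypothetical.

def build_daily_task_ddl(task: str, warehouse: str, proc: str,
                         cron: str = "0 2 * * *") -> str:
    """Return CREATE TASK DDL that calls a load procedure on a cron schedule."""
    return (
        f"CREATE OR REPLACE TASK {task} "
        f"WAREHOUSE = {warehouse} "
        f"SCHEDULE = 'USING CRON {cron} UTC' "
        f"AS CALL {proc}()"
    )

# Schedule a (hypothetical) load procedure to run daily at 02:00 UTC.
ddl = build_daily_task_ddl("daily_load_task", "load_wh", "load_daily_files")
```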
Environment: Snowflake, SQL, AWS services, Teradata, ADF, Control-M, PuTTY, Bitbucket, Git, Jira & ServiceNow.
Project 2:
Company: LTIMindtree
Client : HP
Duration: Jan 2022 to Mar 2023
Role : Snowflake Developer
Project Description:
Led the creation of a solid foundation for the Snowflake database, making daily data updates easy to handle. Set up a system to share data securely and efficiently based on specific business needs. Ensured thorough testing by cloning the working environment. Managed the smooth transfer of data from sources such as SQL Server and AWS S3. Wrote code to organize and handle data, and configured settings for data loading. Created views and roles for reporting teams and designed file formats for loading data into Snowflake.
Roles & Responsibilities:
Played a key role in the design and development of components within the Snowflake Database, contributing to the overall
architecture.
Successfully implemented a robust framework in Snowflake, leveraging Snowpipe and Tasks to process daily data loads efficiently.
Created tailored data shares and reader accounts, aligning with business/user requirements for seamless data access.
Executed the cloning of testing environments from production, facilitating comprehensive system testing processes.
Executed data loads from SQL Server and AWS S3 to Snowflake stage tables, ensuring smooth and accurate data transfers.
Prepared SQL queries to handle diverse data operations and related tasks, ensuring data integrity and optimal performance.
Prepared data load configurations to streamline and optimize the data loading process.
Successfully managed both streaming and batch-processing jobs, adapting to varied data processing needs.
Developed various views and roles specifically tailored for reporting teams, facilitating seamless access and sharing of
Snowflake data.
Designed diverse file formats to facilitate the efficient loading of data into Snowflake, utilizing both internal and external
stages.
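The environment cloning mentioned above relies on Snowflake zero-copy cloning, which creates a full logical copy of a database without duplicating storage. A minimal sketch (database names are hypothetical) of a helper that builds the clone DDL for a test environment:

```python
# Sketch: build the zero-copy-clone DDL used to create a test
# environment from production. Database names are hypothetical; the
# clone shares storage with the source until either side changes data.

def build_clone_ddl(source_db: str, clone_db: str) -> str:
    """Return CREATE DATABASE ... CLONE DDL for a zero-copy clone."""
    return f"CREATE OR REPLACE DATABASE {clone_db} CLONE {source_db}"

clone_ddl = build_clone_ddl("PROD_DB", "TEST_DB")
```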
Environment: MySQL, Snowflake, SQL, AWS services, Azure, PowerApps, Power BI, Bitbucket, Git, Jira & ServiceNow.
Project 3:
Company: Tata Consultancy Services
Client : PepsiCo
Duration: Nov 2018 to Dec 2021
Role : DB and BI Developer
Project Description:
Designed and developed essential components for the Snowflake database, organizing multiple tables and seamlessly integrating key AWS and Snowflake services. Ensured optimal performance by implementing T-SQL queries, creating views, and monitoring the environment with the ScienceLogic tool.
PERSONAL DETAILS
Yours sincerely,