Graceson DA
619-880-0303 | gracesonthomas.work@gmail.com
SUMMARY
Skilled Data Analyst with 6+ years' experience collecting, organizing, interpreting and
analysing data to drive successful business solutions and provide analytical insights to
external vendors and stakeholders within the company.
Proficient in statistics, visualization and analytics, with an excellent understanding of
business operations and analytical tools used to surface patterns and trends across
data.
Developed dashboards and performed data pre-processing, manipulation and cleaning;
applied statistical modelling using RStudio packages.
Experienced in agile methodologies, building and testing data-tracking requirements, and
conducting and improving survey analysis.
Highly skilled in visualization tools like Heap, Tableau, ggplot2 and Google Analytics to
create dashboards.
Energetic presenter and confident communicator with the ability to circulate information
in a way that is clear, efficient, and beneficial for end users.
Expert in merging data from various data sources into a meaningful dataset.
Experienced in the customer analytics, insurance, global security services and
transportation sectors.
Creative in finding solutions to problems and determining modifications for optimal use
of organizational data.
Organized and timely in providing reports on specific data findings and their impact
on organizational growth and success.
Extracted data from different database sources such as SQL Server, MS Access and
Snowflake, and regularly used JIRA for project management.
Experienced in creating SQL objects such as tables, stored procedures, triggers,
table-valued parameters, functions, views, indexes and relational database models.
Strong understanding of customer lifetime value, retention, funnel analysis, ROI and
survey analysis.
Expertise in analysing large datasets, with knowledge of business requirements gathering,
data mining and visualization.
Worked on data driven projects, utilizing knowledge of business operations to monitor,
measure and appraise transportation logistics.
Responsible for developing and maintaining data-tracking requirements within mobile
data to monitor customer behaviour.
Experience designing visualizations using Tableau software and publishing and presenting
dashboards and storylines across platforms.
Experience with Data Analytics, Data Reporting, Ad-hoc Reporting, Graphs, Scales,
PivotTables, Pivot charts and visualization.
Actively involved in data extraction, data cleaning, code updates/modifications and
output generation.
Experience in preparing KPIs, Heap, Google Analytics and Tableau dashboards.
Developed data models using logistic regression and sentiment analysis.
Built out data and reporting infrastructure using SQL, Tableau to provide real-time
insights into customer behaviour, funnels and business KPIs.
Well versed in implementing operational assessments and gathering functional
requirements for businesses of all sizes.
Quantitative expert with exceptional speed in analysis of quarterly and annual reports and
providing comprehensive syntheses.
Education
Doctorate in Business Administration April 2017 - March 2021
San Diego University for Integrative Studies
Master's in Computer Information Systems April 2014 - March 2017
California University of Management and Science
Technical skills
Databases: Spark SQL, Oracle SQL, PL/SQL, MySQL, Microsoft SQL Server,
Teradata, PostgreSQL
Programming & SDLC Methodologies: Python, PySpark, Agile (Scrum), Waterfall
Business Intelligence & Reporting Tools: Power BI, Tableau, Alteryx, Informatica,
Mixpanel, New Relic, Google Analytics
Cloud Platforms: Azure (Data Factory, Databricks, Data Lake)
Statistical Techniques: Machine Learning – Supervised, Unsupervised, Data
Wrangling, Exploratory Data Analysis, Data Pre-processing, Statistical Significance
Tests, Hypothesis Testing, Predictive Modelling, Multivariate Regression, A/B
Testing, Web Analytics, Data Visualization
Other Tools: GitLab, VBA, SAS JMP, SAS EG, SSIS, SAS Enterprise Miner, JIRA,
Advanced MS Excel (V/H-Lookup, pivot tables, macros), MS Access, MS Office
Work Experience:
Facebook Austin, TX Aug 2018 - Present
Data Analyst
Responsibilities:
Created an aggregated report daily for the client to make investment decisions and help
analyse market trends.
Executed SQL queries and developed Tableau dashboards with real-time reporting for
stakeholders. Actively engaged with business partners to identify business problems and
brainstormed potential solutions aligned with business requirements.
Built an internal visualization platform for clients to view historical data, compare
issuers, and analyse different bonds and markets.
The model collects and merges daily data from market providers and applies different
cleaning techniques to eliminate bad data points.
The model merges the daily data with the historical data and applies various quantitative
algorithms to determine the best fit for the day.
The model captures the changes for each market and creates a daily email alert to help
the client make better investment decisions.
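As an illustrative sketch only, the clean-then-merge step described above might look like the following in pandas; the column names ("issuer", "date", "price") and cleaning rules are hypothetical, not taken from the actual model:

```python
import pandas as pd

def clean_and_merge(daily: pd.DataFrame, historical: pd.DataFrame) -> pd.DataFrame:
    """Eliminate bad daily data points, then merge into the historical data.

    Assumes hypothetical columns: 'issuer', 'date', 'price'.
    """
    # Eliminate bad data points: missing or non-positive prices.
    daily = daily.dropna(subset=["price"])
    daily = daily[daily["price"] > 0]
    # Merge daily data with historical data, keeping one row per issuer/date;
    # the daily feed wins when both contain the same key.
    merged = pd.concat([historical, daily], ignore_index=True)
    merged = merged.drop_duplicates(subset=["issuer", "date"], keep="last")
    return merged.sort_values(["issuer", "date"]).reset_index(drop=True)
```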
Automated workflows that were previously initiated manually, using Python scripts and
Unix shell scripting.
Created, activated and programmed in Anaconda environments.
Worked on predictive analytics use cases using Python.
Cleaned and processed third-party spending data into manageable deliverables in specific
formats using Excel macros and Python libraries such as NumPy, SQLAlchemy
and matplotlib.
Used pandas to structure data in time-series and tabular formats for manipulation and
retrieval.
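A minimal sketch of that kind of time-series structuring with pandas; the records and field names below are made up for illustration:

```python
import pandas as pd

# Hypothetical daily quotes in plain tabular form.
records = [
    {"date": "2021-03-01", "price": 101.2},
    {"date": "2021-03-02", "price": 101.8},
    {"date": "2021-03-03", "price": 100.9},
]

df = pd.DataFrame(records)
df["date"] = pd.to_datetime(df["date"])
# Indexing by date turns the tabular data into a time series.
series = df.set_index("date")["price"]

# Time-series operations such as resampling then become one-liners.
weekly_mean = series.resample("W").mean()
```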
Helped with the migration from the old server to the Jira database (matching fields),
using Python scripts to transfer and verify the information.
Analysed and formatted data using machine-learning algorithms with Python's scikit-learn.
Experience in Python, Jupyter and the scientific computing stack (NumPy, SciPy, pandas
and matplotlib).
Troubleshot, fixed and deployed many Python bug fixes for the two main applications
that were a primary source of data for both customers and the internal customer
service team.
Wrote Python scripts to parse JSON documents and load the data into a database.
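The JSON-to-database loading pattern can be sketched as follows, assuming a hypothetical payload and table schema (an in-memory SQLite database stands in for the real one):

```python
import json
import sqlite3

# Hypothetical JSON payload; the field names are illustrative only.
payload = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'
records = json.loads(payload)

conn = sqlite3.connect(":memory:")  # a real database connection would go here
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
# Named-style placeholders let each parsed dict map directly onto a row.
conn.executemany("INSERT INTO items (id, name) VALUES (:id, :name)", records)
conn.commit()

rows = conn.execute("SELECT id, name FROM items ORDER BY id").fetchall()
```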
Generated various graphical capacity-planning reports using Python packages such as
NumPy and matplotlib.
Analysed various generated logs and predicted/forecast the next occurrence of events
using various Python libraries.
Created Autosys batch processes to fully automate the model to pick the latest,
best-fitting bond for that market.
Created a framework using Plotly, Dash and Flask for visualizing trends and
understanding patterns in each market using historical data.
Used Python APIs to extract daily data from multiple vendors.
Used Spark and Spark SQL for data integration and manipulation. Worked on a POC for
creating a Docker image on Azure to run the model.
Environment: Python, PySpark, Spark SQL, Plotly, Dash, Flask, Postman, Microsoft Azure,
Autosys, Docker.