Darian Pena
Remote – VA ◦ (646) 543-7355 ◦ darianpena001@gmail.com
PROFESSIONAL SUMMARY
Technical Business Analysis and Data Management professional specializing in cloud and data-intensive solutions,
architecture and design, scripting and process automation, and data visualization. Azure and AWS certified. Proficient in
Tableau, Power BI, SQL, Python, and Spark. Experienced across multiple industries as a liaison between technology
teams and business stakeholders, delivering technical solutions and translating complex requirements. Skilled with
project management tools such as Jira and Azure DevOps. Thorough understanding of big data technologies, including
Databricks, Azure Data Lake, AWS S3, and relational database systems (Oracle, SQL Server, Postgres, IBM DB2).
WORK EXPERIENCE
Technical Business Analyst
Nestle (Enterra Solutions Consultant) ◦ Princeton, NJ ◦ May 2022 – Present
● Led daily scrums with a team of data architects, engineers, and full-stack developers, managing action items and
removing blockers. Contributed to the launch of an automated recommendation service using Azure Data Lake,
Snowflake, Python, Power BI, and React, enhancing data-driven decision-making at Nestle.
● Created Python scripts to process and transform raw data, mapped complex data models, and optimized data
pipelines, enabling advanced decision science capabilities.
● Developed pipelines and data flows using Spark (Scala & PySpark) in Databricks to automate processes on curated
Parquet files in Azure Data Lake Storage Gen2 (ADLS Gen2), achieving a 70% reduction in data load times and
enhancing real-time data availability for Power BI dashboards tracking operational KPIs.
● Created and optimized Bash and Python ETL scripts on Azure/AWS VMs, ensuring high data integrity and
compliance for sensitive financial data in a Postgres environment.
Technologies: Power BI, Databricks, Bitbucket, Jira, Azure DevOps, Data Factory, Azure Synapse Analytics, Azure Data
Lake Storage Gen2, Amazon EC2
IT Business Analyst
Bank of America (USEReady Consultant) ◦ New York, NY ◦ August 2020 – May 2022
● Led the CCAR Data & Analytics initiative for Bank of America’s CCAR team, integrating disparate data sources into
cohesive Hadoop environments to streamline regulatory reporting for Federal Reserve CCAR Audits.
● Used Spark (Scala and Spark SQL) in Databricks to generate new tables with SQL DDL and automate structured
schemas in Hive/Impala, supporting the reporting needs of the CCAR Business Analyst team.
● Created Azure Data Factory (ADF) pipelines for cloud migrations from SQL Server into Azure SQL Data
Warehouse, building and maintaining tables and views.
● Created and maintained tables and stored procedures using DDL scripts and T-SQL in MS SQL Server to develop
data warehouses for BI and analytics teams leveraging Tableau.
● Implemented dashboarding solutions in Tableau to visualize key indicators of system performance and data integrity
across multiple systems storing trades and other transaction information.
Technologies: Tableau, Tableau Prep, Tableau CRM, Salesforce, Power BI, HDInsight, Jenkins, Autosys, Scala, Python,
Hadoop, Spark, Hive, Impala, Snowflake, Databricks, Git, Bitbucket, Jira
IT Business Analyst
Morgan Stanley ◦ New York, NY ◦ July 2017 – August 2020
● Spearheaded cross-functional projects collaborating with Engineering, Audit, and Data Governance teams, identifying
key business needs, and defining product solutions for efficient data migration and user onboarding to big data
analytics tools (Hadoop, Hive, Spark), significantly accelerating data science initiatives.
● Developed Tableau dashboards for executive-level review, presenting sophisticated transaction and geographic trend
analysis on Wealth Management account activity, instrumental in pinpointing operational and regulatory risks.
● Conducted complex SQL queries and generated Tableau reports to scrutinize metrics across Legal & Compliance
deliverables, focusing on global financial crimes and account activity, enhancing the department's strategic
decision-making processes.
● Utilized Tableau in conjunction with Python libraries (pandas, ibm_db, etc.) to perform data consistency tests across
multiple environments, leading to the identification and resolution of critical bugs in ETL processes.
● Leveraged PySpark, Scala, and Spark SQL within Azure Databricks to design ETL processes for trade surveillance
algorithms tracking market manipulation. Produced and manipulated DataFrames in Spark to structure data effectively
for machine learning models, contributing to the successful deployment of Morgan Stanley's updated insider trading
algorithm with enhanced compliance and risk management capabilities.
Technologies: Scala, Python, Hadoop, Spark, Hive, Impala, DB2, Snowflake, Git, Jira, Tableau, Teradata, Jenkins