JD FS Snowflake de
Job Description:
As a member of our Data Ops team, you will monitor, maintain, troubleshoot, and
support a critical data pipeline within defined SLAs to ensure smooth data flow and
timely resolution of issues. You'll provide timely updates on operational issues,
suggest efficiency improvements, and ensure data pipeline reliability. Performance
tuning and troubleshooting will be key to identifying and resolving bottlenecks for
optimal data processing. You'll uphold data governance, implement security controls,
and create monitoring tools for proactive issue detection. Managing infrastructure
and CI/CD pipelines, collaborating within cross-functional teams, actively
participating in scrum ceremonies, and sharing knowledge to foster a cohesive work
environment are also vital aspects of your responsibilities.
Key skills:
• 3+ years of experience with Snowflake, Airflow, dbt, AWS, Terraform, and CI/CD tools.
• Experience designing, building, and operating robust data systems with
reliable monitoring and logging practices.
• Ability to work effectively across team boundaries to establish overarching data
architecture and provide guidance to individual teams.
• Expert skills in managing relational databases and SQL query performance
tuning.
• High level of scripting proficiency in Python.
• Experience architecting and implementing data governance processes and
tooling (such as data catalogs, lineage tools, role-based access control, and
PII handling).
• Experience monitoring and managing data pipelines.
• Experience with cost optimization across cloud infrastructure, warehouse, BI,
and other tools.
• Strong written and oral communication skills.
Candidates should be ready to work shift timings (7 AM to 3 PM IST), and the
position offers the flexibility of working from home. We expect the selected
candidate to be able to join within 15-30 days.