Base Salary
$170k - $276k/yr
Responsibilities
- Define and drive the technical strategy for Docker's data platform architecture.
- Lead design and implementation of scalable data infrastructure using Snowflake, AWS, and other tools.
- Architect end-to-end data pipelines for real-time and batch analytics.
- Establish technical standards for data quality and operational excellence.
- Design and build robust data systems that process large volumes of data.
- Implement complex data transformations and modeling using dbt.
- Develop and maintain data orchestration workflows with Apache Airflow.
- Collaborate with cross-functional teams to understand analytics requirements.
- Own operational excellence for critical data systems including monitoring and incident response.
- Mentor junior engineers on technical skills and best practices.
Requirements
- 8+ years of software engineering experience with 3+ years in data engineering.
- Expert-level experience with Snowflake and advanced SQL.
- Deep proficiency in dbt for data modeling and transformation.
- Strong expertise with Apache Airflow for workflow orchestration.
- Extensive AWS experience including data services and infrastructure management.
- Proficiency in Python, SQL, and other programming languages used in data engineering.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
Benefits
- Freedom & flexibility to fit work around life.
- Designated quarterly Whaleness Days and an end-of-year Whaleness break.
- Home office setup for a comfortable work environment.
- 16 weeks of paid parental leave.
- Technology stipend of $100 net/month.
- PTO plan that encourages taking time off for personal enjoyment.
- Training stipend for conferences and courses.
- Equity in the company as it grows.
- Docker swag; medical benefits vary by country.
- Remote-first culture with offices in Seattle and Paris.
Tech Stack
Categories
Data Engineering
