Santa Clara, CA, USA
Staff+
H-1B Sponsor
Base Salary
$211k - $285k/yr
Responsibilities
- Design and maintain scalable data pipelines for ingestion, transformation, and delivery across multiple data sources.
- Collaborate with Analytics Engineers and Product teams to curate datasets and establish data contracts.
- Develop and manage modern data architectures using tools like Databricks and Delta Lake.
- Optimize Snowflake usage and performance for data quality and cost efficiency.
- Support and scale orchestration platforms and monitoring tools.
- Build tools, alerting systems, and documentation for reliable operation across the data stack.
Requirements
- 8+ years of experience in data engineering or platform/infrastructure roles.
- Expertise in Python or Scala, and strong proficiency in SQL.
- Deep experience with data warehouse technologies like Snowflake.
- Experience with data lake and lakehouse architectures such as Databricks and Delta Lake.
- Proven ability to design and implement scalable ETL pipelines.
- Familiarity with infrastructure-as-code and job orchestration tools.
- Excellent collaboration and communication skills.
Benefits
- Flexible work environment.
- Unlimited vacation.
- 100% paid employee health benefit options.
- Commuter benefits.
- 401(k) with employer-funded match.
- Corporate wellness program.
- Sabbatical leave for employees with 5+ years of service.
- Competitive paid parental leave and fertility/family planning reimbursement.
- Cell phone reimbursement.
- Catered lunch every day along with beverages and snacks.
- Employee Resource Groups and ZocClubs.
Tech Stack
Apache Airflow, AWS, Databricks, Datadog, dbt, Presto, Python, Scala, Snowflake, SQL
Categories
AI & ML, Data Engineering
