Austin, TX, USA · Mid Level / Senior
Responsibilities
- Design and maintain backend data services and pipelines for telemetry and sensor data.
- Build robust batch and streaming data workflows integrating on-robot data sources and cloud infrastructure.
- Develop internal APIs and platform tooling for efficient data access.
- Establish data quality, lineage, and governance practices.
- Monitor and optimize storage systems and database performance.
- Collaborate with data scientists and engineers to deliver production-ready data infrastructure.
- Support secure deployment patterns across cloud and hybrid environments.
- Stay current on data engineering practices and emerging technologies.
Requirements
- Strong proficiency in Python; experience with Go is preferred.
- Experience with real-time and batch pipeline frameworks such as Kafka or Spark.
- Strong command of relational databases such as PostgreSQL.
- Proficiency with cloud platforms and infrastructure tooling like Terraform.
- Experience with Kubernetes and Docker for deploying services.
- Familiarity with encryption and secure data access patterns.
- Experience building observability dashboards for data pipelines.
- Experience building REST APIs or gRPC services.
- Bachelor’s or Master’s degree in Computer Science or a related field.
- Minimum of 3 years of experience in data engineering or backend engineering.
