about 23 hours ago
San Francisco, CA, USA
Entry Level / Mid Level
Base Salary
$145k - $168k/yr
Responsibilities
- Contribute to the development and optimization of large-scale data workflows using technologies such as Apache Spark.
- Debug and resolve issues in distributed environments, including data inconsistencies and job failures.
- Maintain and enhance the services that support the distributed storage and compute platform.
- Assist in deploying and maintaining production systems, including CI/CD workflows.
- Improve the platform's elasticity and fault tolerance.
- Provide technical input into the team's roadmaps.
- Write clean, maintainable, and well-tested code.
Requirements
- BS in Computer Science or equivalent experience.
- 1-3 years of professional software engineering experience (internships included).
- Must be work-authorized in the United States without the need for employer sponsorship.
- Candidates must reside within a 60-mile radius of San Francisco, CA.
- Familiarity with data processing frameworks such as Apache Spark, Hadoop, or similar.
- Familiarity with containerization tools (e.g., Docker and Kubernetes).
- Experience with workflow orchestration tools (e.g., Airflow) is a plus.
- Proficient in Java and/or Python programming languages.
- Linux system administration/automation experience.
- Strong problem-solving and debugging skills.
- Organized and detail-oriented.
Benefits
- Competitive salary with a performance bonus and equity.
- Comprehensive benefits package.
Tech Stack
Apache Airflow, Apache Hadoop, Apache Spark, Docker, Java, Kubernetes, Linux, Python
Categories
AI & ML, Data Engineering