New York, NY, USA +3 more
Staff+
H1B Sponsor
Responsibilities
- Design, build, and operate stream compute infrastructure centered around Apache Flink.
- Collaborate with product and platform teams to understand requirements and improve stream processing infrastructure.
- Implement operational best practices to enhance resilience and reliability at scale.
- Drive automation and standardization through self-service workflows and self-healing systems.
- Lead initiatives to improve Flink availability and state durability.
- Evaluate and productionize Flink ecosystem capabilities to enhance developer experience.
- Engage with the open source community to adopt new features and contribute back.
Requirements
- 10+ years of experience in building and operating large-scale production systems.
- Experience as a technical lead for teams working on distributed systems.
- Hands-on experience with big data technologies such as Flink, Spark, and Kafka.
- Experience developing and debugging distributed systems with open source tools.
- Strong software engineering skills and a passion for big data and distributed systems.
- Ability to write high-quality code in languages such as Go, Java, or Scala.
- Comfortable operating with high autonomy and ownership.
- Strong written and verbal communication skills.
Tech Stack
Apache Flink, Apache Hadoop, Apache Kafka, Apache Spark, Apache Storm, AWS, Go, Java, Scala
Categories
AI & ML, Data Engineering, Data Science