1 day ago
Cambridge, United Kingdom
Senior / Staff+
H1B Sponsor
Responsibilities
- Design and build APIs and backend services using Spring Boot.
- Write clean, maintainable, and efficient code and tests.
- Develop and maintain data pipelines and ETL workflows using Apache Spark and Airflow.
- Optimize data storage, retrieval, and processing systems.
- Develop complex queries and analytics solutions using Druid, Trino, and StarRocks.
- Monitor, troubleshoot, and improve data systems.
- Collaborate with data scientists and software engineers to deliver integrated solutions.
- Provide technical guidance and mentorship to junior engineers.
Requirements
- 8+ years of experience in software and/or data engineering.
- Expertise in big data technologies such as Apache Spark and Apache Airflow.
- Strong understanding of SOLID principles and distributed systems architecture.
- Proven experience in distributed data processing and real-time data pipelines.
- Strong programming skills in Java, Python, or Scala.
- Advanced SQL skills with expertise in query optimization.
- Experience building highly scalable, low-latency APIs.
- Exceptional problem-solving abilities and communication skills.
- Experience with cloud platforms like AWS, GCP, or Azure (preferred).
- Experience in AdTech and advertising data platforms (preferred).
- Knowledge of CI/CD pipelines and DevOps practices (preferred).
- Experience building agentic AI systems (preferred).
- Bachelor's degree in Computer Science, Engineering, or a related field.
Benefits
- Comprehensive benefits including healthcare, life, and retirement options.
- Global access to mental health and financial wellness support.
- Support for taking time off in accordance with local leave policies.
- Option to work remotely on Fridays.