Responsibilities
- Design and build APIs and backend services using Spring Boot.
- Write clean, maintainable, and efficient code with adherence to best practices.
- Develop and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow.
- Optimize data storage, retrieval, and processing systems for reliability and performance.
- Monitor, troubleshoot, and improve data systems to maximize efficiency.
- Design and maintain large-scale, low-latency API systems using Spring Boot and Kubernetes.
- Provide technical guidance and mentorship to junior engineers.
Requirements
- 8+ years of experience in software and/or data engineering.
- Expertise in big data technologies such as Apache Spark and Apache Airflow.
- Strong understanding of SOLID principles and distributed systems architecture.
- Proven experience in distributed data processing and real-time data pipelines.
- Advanced SQL skills with expertise in query optimization for large datasets.
- Exceptional problem-solving abilities and capacity to work independently or collaboratively.
- Excellent verbal and written communication skills.
- Experience with cloud platforms like AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes (preferred).
- Experience in AdTech and audience management (preferred).
- Strong programming skills in Python, Java, and/or Scala (preferred).
- Knowledge of CI/CD pipelines and DevOps practices (preferred).
- Expertise in data modeling and schema design (preferred).
- Bachelor's degree in Computer Science, Engineering, or a related field.
Benefits
- Comprehensive benefits including healthcare, life, accident, and disability insurance.
- Global access to mental health and financial wellness support.
- Flexible work arrangements with a hybrid work approach.
- Support for taking time off in accordance with local leave policies.