Responsibilities
- Design and build APIs and backend services using Scala and Java.
- Write clean, maintainable, and efficient code with adherence to best practices.
- Develop and maintain data pipelines and ETL workflows using Apache Spark and Airflow.
- Optimize data storage, retrieval, and processing systems for reliability and performance.
- Monitor, troubleshoot, and improve data systems to minimize downtime.
- Collaborate with analytics and software engineering teams to deliver integrated solutions.
- Provide technical guidance and mentorship to junior engineers.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in software and/or data engineering.
- Expertise in big data technologies such as Apache Spark and Airflow.
- Advanced SQL skills with expertise in query optimization for large datasets.
- Strong programming skills in Python, Java, and/or Scala.
- Exceptional problem-solving abilities and the ability to work both independently and collaboratively.
- Excellent verbal and written communication skills.
- Experience with cloud platforms like AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes (preferred).
- Knowledge of CI/CD pipelines and DevOps practices (preferred).
- Experience with building Agentic AI systems (preferred).
- Experience in AdTech and advertising data platforms (preferred).
Benefits
- Comprehensive benefits including healthcare, life, accident, and disability insurance.
- Global access to mental health and financial wellness support.
- Flexible work arrangements with a hybrid work approach.
- Support for taking time off in accordance with local leave policies.