Software Engineer, Data Backend (Ad Cloud)
Appier
7 months ago
Taipei, Taiwan
Entry Level / Mid Level
Responsibilities
- Design, develop, and maintain RESTful APIs using Python.
- Build and manage robust data warehouses utilizing ClickHouse, Trino/Presto, and Pinot.
- Design and develop data pipelines using Apache Airflow and Apache Spark.
- Work closely with cross-functional teams to develop automation tools that streamline daily operations.
- Implement state-of-the-art monitoring and alerting systems to ensure optimal system performance and stability.
- Respond to queries from application teams promptly and effectively, ensuring high client satisfaction.
- Work on cloud platforms such as AWS and GCP, leveraging their capabilities to optimize data operations.
- Utilize Kubernetes (k8s) for container orchestration to facilitate efficient deployment and scaling of applications.
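As a flavor of the first responsibility above, a RESTful endpoint in Python can be sketched with only the standard library; this is a minimal illustration, not Appier's actual service, and the `/health` route is a hypothetical example (a production API would likely use a web framework):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Serves GET /health with a small JSON status payload."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        # Suppress default per-request logging to stderr.
        pass

def make_server(host="127.0.0.1", port=0):
    # Port 0 asks the OS for any free port; useful for local testing.
    return HTTPServer((host, port), HealthHandler)
```

Running `make_server().serve_forever()` starts the server; a GET to `/health` returns `{"status": "ok"}` with a 200 status.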
Requirements
- BS/MS degree in Computer Science.
- 2+ years of experience in building and operating large-scale distributed systems or applications.
- Experience developing on Kubernetes and Linux/Unix.
- Experience in managing data lake or data warehouse.
- Expertise in developing data structures and algorithms on top of Big Data platforms.
- Ability to operate effectively and independently in a dynamic, fluid environment.
- Ability to work in a fast-moving team environment and juggle many tasks and projects.
- Eagerness to make an outsized impact as a self-motivated learner and builder.
Tech Stack
Apache Airflow, Apache Flink, Apache Hadoop, Apache Hive, Apache Spark, AWS, ClickHouse, Google Cloud Platform, Java, Kubernetes, Linux, Presto, Python, Scala
Categories
AI & ML, Backend, Data Engineering, DevOps