Base Salary
$220k - $350k/yr
Responsibilities
- Own the technical execution of regional and global data localisation strategies within the Databricks environment.
- Design scalable patterns that satisfy strict regulatory requirements while maintaining a unified global data model.
- Reconcile local compliance constraints with global consistency in definitions, metrics, and knowledge artifacts.
- Partner with Data Science, AI Engineering, Product, and Risk teams to ensure the platform supports experimentation and high-quality decision-making.
- Lead global data modeling efforts, creating unified schemas that accommodate regional variance.
- Define and implement 'Agent Interfaces': APIs and tools that enable AI agents to interact with Knowledge and Skills layers.
- Implement Governance-as-Code, ensuring automated data lineage and quality controls are embedded directly into the data platform.
Requirements
- 10+ years of experience designing and building large-scale data systems with strong architectural fundamentals.
- Hands-on experience with the Databricks ecosystem, including Unity Catalog and Delta Lake.
- Experience implementing semantic layers or metric stores to enable consistent reporting and analytics.
- Solid understanding of how AI/agent-based systems consume data, with experience building APIs or data services to support them.
- Experience designing data solutions that meet regional data residency and compliance requirements.
- Demonstrated senior-level individual contributor experience, including leading design reviews and mentoring engineers.
Benefits
- Competitive salary plus valuable equity within Airwallex.
- Collaborative open office space with a fully stocked kitchen.
- Regular team-building events.
- Freedom to be creative in a rapidly growing global fintech environment.
Tech Stack
Apache Kafka, Apache Spark, Databricks
