Overview
Help build Auckland Transport's (AT) future big data platform.
Develop real-time data pipelines, design software that scales and performs, and champion data engineering within large teams.
Be a catalyst for data solutions in a digital environment, backed by very large data sets, an Agile and collaborative culture, and a focus on cloud-based data architectures and services.
Responsibilities
Independently design and build/configure big data pipelines according to the design
Identify and solve big data problems independently
Work closely with data developers to report build issues
Develop producer/consumer APIs; create topics; code ksqlDB, Flink, and Kafka connectors
What you will bring
5+ years' experience with Apache Kafka is essential; Confluent Kafka experience is preferred
Proven experience using SQL, relational databases and API development
Extensive experience handling high-velocity, high-volume data events
Extensive knowledge of Java, Scala, Python, shell scripting, Terraform
Knowledge of API management (APIM)
At AT, we embrace inclusion and are committed to building a diverse workplace where people come first and everyone feels safe, respected, valued, and supported, inclusive of our differences.
Our values are Manaakitanga, Tiakitanga, Whanaungatanga, and Auahatanga.
Benefits
Investment in your professional development and training
25% discount on public transport services across Auckland
Opportunity for greater leadership and accountability
Modern CBD (Wynyard Quarter) waterfront-based office
Paid parental and partner leave
Study assistance
Health and wellbeing benefits
Professional memberships
AT Connect groups such as Mana ka Māori, Women@AT, and many more
Come join us in shaping the future of transport in Auckland — apply today!
Applications close 21 September 2025.
For further information or a confidential chat, please contact Rennie Sharma at
Senior Specialist • Auckland, New Zealand