EOI - Principal / Senior Data Engineer

Simple Machines NZ – Job Ad – Principal / Senior Data Engineer
Position: Senior to Principal Data Engineer
Location: Christchurch, New Zealand
Simple Machines. Data Engineered to Life™


Simple Machines is a leading independent boutique technology firm with a global presence, including teams in London, Sydney, San Francisco, and New Zealand. We specialise in creating technology solutions at the intersection of data, AI, machine learning, data engineering, and software engineering. Our mission is to help enterprises, technology companies, and governments better connect with and understand their organisations, their people, their customers, and citizens. We are a team of creative engineers and technologists dedicated to unleashing the potential of data in new and impactful ways. We design and build bespoke data platforms and unique software products, create and deploy intelligent systems, and bring engineering expertise to life by transforming data into actionable insights and tangible outcomes.

We engineer data to life™.

Requirements

The Role:


The Senior to Principal Data Engineer at Simple Machines is a dynamic, hands-on role focused on building real-time data pipelines and implementing data mesh architectures to enhance client data interactions. This position blends deep technical expertise in modern data engineering methods with a client-facing consulting approach, enabling clients to effectively manage and utilise their data. Within a team of top-tier engineers, the role involves developing greenfield data solutions that deliver tangible business outcomes across various environments.


Technical Responsibilities


Developing Data Solutions: Implement and enhance data-driven solutions that integrate with clients' systems using state-of-the-art tools such as Databricks, Snowflake, Google Cloud, and AWS. Embrace modern data architecture philosophies including data products, data contracts, and data mesh to ensure a decentralised and consumer-oriented approach to data management.
Data Pipeline Development: Develop and optimise high-performance batch and real-time data pipelines employing advanced streaming technologies such as Kafka and Flink. Utilise workflow orchestration tools such as Dataflow and Airflow. (An illustrative pipeline sketch follows this list.)
Database and Storage Optimisation: Optimise and manage a broad array of database technologies, from traditional relational databases (e.g., PostgreSQL, MySQL) to modern NoSQL solutions (e.g., MongoDB, Cassandra). Focus on strategies that enhance data accessibility, integrity, and performance.
Big Data Processing and Analytics: Utilise big data frameworks such as Apache Spark and Apache Flink to address challenges associated with large-scale data processing and analysis. These technologies are crucial for managing vast datasets and performing complex data transformations and aggregations.
Cloud Data Management: Implement and oversee cloud-specific data services including AWS Redshift, S3, Google BigQuery, and Google Cloud Storage. Leverage cloud architectures to improve data sharing and interoperability across different business units.
Security and Compliance: Ensure all data practices comply with security policies and regulations, embedding security by design in the data infrastructure. Incorporate tools and methodologies recommended for data security and compliance, ensuring robust protection and governance of data assets.
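
As an illustration only (not part of the ad), the sketch below shows, in Python, the kind of real-time pipeline the Data Pipeline Development responsibility describes: Spark Structured Streaming consuming a Kafka topic, parsing JSON events, and writing to a console sink. The broker address, topic name, and event schema are placeholder assumptions, and a real deployment would also need the spark-sql-kafka connector on the classpath and a proper lake or warehouse sink.

    # Minimal sketch: Kafka -> Spark Structured Streaming -> console sink.
    # Broker, topic, and schema below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    # Assumed shape of the incoming JSON events.
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("customer_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    # Read the raw Kafka stream; the value column arrives as bytes.
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
        .option("subscribe", "orders")                         # placeholder topic
        .load()
    )

    # Parse the payload and keep only well-formed, positive-amount events.
    orders = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(from_json(col("json"), schema).alias("o"))
        .select("o.*")
        .where(col("amount") > 0)
    )

    # Console sink for the sketch; a real pipeline would target a lake or warehouse.
    query = (
        orders.writeStream.outputMode("append")
        .format("console")
        .start()
    )
    query.awaitTermination()
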
Consulting Responsibilities


Client Advisory: Provide expert advice to clients on optimal data practices that align with their business requirements and project goals.
Training and Empowerment: Educate client teams on the latest technologies and data management strategies, enabling them to efficiently utilise and maintain the solutions we have developed.
Professional Development: Keep up with the latest industry trends and technological advancements, continually upgrading skills and achieving certifications in the technologies Simple Machines implements across its client base.


Ideal Skills and Experience


Core Data Engineering Tools and Technologies: Proficient in SQL and Spark, and familiar with platforms such as Databricks and Snowflake. Well-versed in various storage technologies including AWS S3, Google Cloud BigQuery, Cassandra, MongoDB, Neo4J, and HDFS. Adept in pipeline orchestration tools like AWS Glue, Apache Airflow, and DBT, as well as streaming technologies like Kafka, AWS Kinesis, Google Cloud Pub/Sub, and Azure Event Hubs. (A brief orchestration sketch follows this list.)
Data Storage Expertise: Knowledgeable in data warehousing technologies like BigQuery, Snowflake, and Databricks, and proficient in managing various data storage formats including Parquet, Delta, ORC, Avro, and JSON to optimise data storage and retrieval.
Building and Managing Large-scale Data Systems: Experienced in developing and overseeing large-scale data pipelines and data-intensive applications within production environments.
Data Modelling Expertise: Proficient in data modelling, understanding the implications and trade-offs of various methodologies and approaches.
Infrastructure Configuration for Data Systems: Competent in setting up data system infrastructures, favouring infrastructure-as-code practices using tools such as Terraform and Pulumi.
Programming Languages: Proficient in Python and SQL, with additional experience in programming languages like Java, Scala, GoLang, and Rust considered advantageous.
CI/CD Implementation: Knowledgeable about continuous integration and continuous deployment practices using tools like GitHub Actions and ArgoCD, enhancing software development and quality assurance.
Testing Tools and Frameworks: Experienced with data quality and testing frameworks such as DBT, Great Expectations, and Soda, ensuring the reliability of complex data systems.
Commercial Application of Data Engineering Expertise: Demonstrated experience in applying data engineering skills across various industries and organisations in a commercial context.
Agile Delivery and Project Management: Skilled in Agile, Scrum, and Kanban project delivery methods, ensuring efficient and effective solution development.
Consulting and Advisory Skills: Experienced in a consultancy or professional services setting, offering expert advice and crafting customised solutions that address client needs. Effective in engaging stakeholders and translating business requirements into practical data engineering strategies.
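
Again purely as an illustration (not part of the ad), the Python sketch below reflects the orchestration and data-quality practices listed above: a small Airflow DAG in which an extract task is followed by a hand-rolled row-count gate. The DAG id, task logic, and threshold are hypothetical assumptions; a real engagement might instead use Great Expectations, Soda, or DBT tests for the quality step.

    # Minimal sketch: a daily Airflow DAG with an extract step and a quality gate.
    # DAG id, task bodies, and the schedule are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders(**_):
        # Placeholder extraction task (e.g. landing a day of orders in staging).
        print("extracted orders partition")


    def check_row_count(**_):
        # Placeholder quality gate; fail the run if the partition came back empty.
        row_count = 1_000  # would normally be queried from the warehouse
        if row_count == 0:
            raise ValueError("orders partition is empty - failing the run")


    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        quality_gate = PythonOperator(task_id="check_row_count", python_callable=check_row_count)

        extract >> quality_gate
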


Professional Experience and Qualifications


Professional Experience: At least 8 years of data engineering or equivalent experience in a commercial, enterprise, or start-up environment. Consulting experience within a technology consultancy or professional services firm is highly beneficial.
Educational Background: Degree or equivalent experience in computer science or a related field.
Right to Work: Must have full New Zealand working rights and reside in Christchurch.

Published on 2025/09/10. Modified on 2025/09/10.
