Who we are

AMCS, a sustainability software specialist, is headquartered in Ireland, with offices across Europe, the USA, and Australasia. With over 1,300 highly skilled employees across 22 countries, we specialize in delivering technology solutions that facilitate a carbon-neutral future.


What we do

Our innovative SaaS solutions increase efficiency and boost sustainability in resource-intensive industries. Over 5,000 customers across 23 countries already benefit from our Performance Sustainability software, and we deliver practical solutions for improved profitability and environmental resilience across the globe.


The role

We are seeking a highly skilled and motivated Data Engineering Manager to join our dynamic engineering team. Working in an Agile environment, you will manage teams of Data Engineers and Data Scientists working on our next-generation Unified Data Platform.

This is an opportunity to contribute to the design and build-out of our Data platform using the latest technology advancements and methodologies. As well as managing the teams and the delivery of the platform, you will collaborate with Data Architects and Product Managers to deliver enterprise-ready solutions.

A strong work ethic and a can-do attitude are expected, alongside a culture of friendly collaboration and teamwork. You will encourage continuous improvement and high-performing teams, care about KPIs, and monitor individual and team performance. You will have a keen interest in doing things the right way and in taking on new challenges.

The ideal candidate will have a strong Data Engineering background combined with a deep understanding of how Data Platforms really work. As a Data Engineering Manager, you will not only manage and guide our engineers but also participate in architectural and key decision-making forums on the choice of technologies, design approach, and product roadmap. You will maintain a strong focus on the reliability and performance of our systems, centred on a positive customer experience.


Key Responsibilities

  • Apply a strong understanding of cloud platforms and their related data offerings to contribute to the design, and lead the implementation, of scalable, resilient data architectures across Azure Fabric and GCP

  • Oversee end-to-end Data engineering projects, managing timelines, deliverables, and stakeholder communications

  • Collaborate with product & operations teams to align data strategies with evolving business goals

  • Ensure high-quality code with high levels of code coverage, following best engineering practices

  • Establish and enforce data quality, integrity, and security standards across all pipelines and storage layers

  • Mentor, coach, and grow a high-performing team of data engineers, fostering best practices in code review, testing, and documentation

  • Evaluate evolving data technologies (e.g., Fabric, BigQuery) and integrate new tools to optimize performance and cost

  • Define and track key metrics for pipeline health, data freshness, and system performance—driving continuous improvement

  • Partner with Architecture, DevOps and Platform Engineering teams to automate deployments, enforce governance, and ensure compliance with regulatory requirements

  • Ensure data security, privacy, and compliance with all applicable policies and standards


Requirements

  • Bachelor’s or Master’s degree in Data Engineering, Computer Science, or a related field, or equivalent practical experience.

  • 8+ years of experience in a Data Engineering or related role, with at least 2 of those years spent managing a team or teams.

  • Strong understanding of distributed systems, data modelling, and cloud-native architectures.

  • Good knowledge and understanding of cloud platforms (Azure, GCP) with proven expertise in the Fabric offering and/or BigQuery.

  • Proficiency with tooling for ETL (Data Factory, Dataflow), analysis, and BI/reporting (Power BI, Looker Studio, etc.).

  • Experience implementing data marts and data warehouses (SQL, BigQuery, Snowflake).

  • Knowledge of data and event streaming tooling (Kafka, Google Cloud Pub/Sub, etc.).

  • Experience applying development practices and DataOps to build streamlined, automated CI/CD pipelines that improve the quality and reliability of deliverables.

  • Familiarity with monitoring and logging tools such as Prometheus, Grafana, and Datadog.

  • A keen interest in AI technologies and tooling, and in how teams can leverage them to accelerate delivery and value for the business and our customers.

  • Excellent leadership, communication, and stakeholder management skills.


Apply for position now