Amach is an industry-leading, technology-driven company headquartered in Dublin, with remote teams across the UK and Europe.
Our blended teams of local and nearshore talent are optimised to deliver high-quality, collaborative solutions.
Established in 2013, we specialise in cloud migration and development, and in digital transformation including agile software development, DevOps, automation, data and machine learning.

We are looking for a highly experienced Data Architect to design and implement cutting-edge cloud data solutions for our customer.
The ideal candidate will have strong expertise in AWS Data Tools, SQLMesh, Terraform, Snowflake and Tableau, alongside a proven ability to design scalable data infrastructures that support advanced analytics and reporting.
You will provide technical leadership, guide the engineering team and collaborate with stakeholders to ensure data strategies align with long-term business objectives.
Strong skills in data security, Agile methodologies, and translating complex technical concepts into business language are essential for success in this role.
Please note the successful candidate is expected to work from our customer's office in Warrington from time to time.
Required skills:
- Experience in designing and implementing leading-edge on-premise and cloud data solutions
- Experience working closely with the client and the delivery team to develop strategies and roadmaps that deliver client needs and requirements
- Designing architecture solutions that are in line with long-term business objectives
- Experience around data security
- Excellent knowledge of AWS Data Tools, SQLMesh, Terraform, Snowflake and Tableau
- Designing a data infrastructure that supports complex data analytics, reporting and visualisation services
- Providing technical leadership and direction to the engineering teams
- Building effective relationships with senior technical staff so that there is a common understanding of goals and challenges
- Meeting with clients or executive team members to engage in architectural and requirement analysis discussions
- Creating documentation and diagrams that show key data entities, and creating an inventory of the data needed to implement solutions
- Helping to maintain the integrity and security of data assets
- Relevant 3rd level qualification with a strong technical focus
- Excellent knowledge and proven experience of working with IT Software Development Lifecycle methodologies, with particular focus on Agile as the de facto methodology
- Experience working with Agile teams
- Experience working with third-party suppliers in the delivery of business or IT change initiatives, including experience of working with remote and co-located teams and vendors
- Experience leading projects based on legacy technologies in an organisation
- A strong understanding of best practices, tools and techniques for delivery management, with the ability to continuously improve these processes in an agile delivery organisation
- Ability to translate technical language into business terms, and sometimes vice versa

Key responsibilities & duties include:
- Assembling large, complex sets of data that meet functional and non-functional business requirements
- Identifying, designing and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimising data delivery and automating manual processes
- Translating business requirements into technical specifications, including data streams, integrations, transformations, databases and data warehouses
- Defining the data architecture framework, standards and principles, including modelling, metadata, security, reference data and master data
- Defining reference architecture, which is a pattern others can follow to create and improve data systems
- Defining data flows, i.e. which parts of the organisation generate data, which require data to function, how data flows are managed and how data changes in transition
- Collaborating and coordinating with multiple departments, stakeholders, partners and external vendors

Desirable Skills:
- Experience in large enterprise data warehouses
- Ability to build and optimise data sets, 'big data' pipelines and architectures
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
- Excellent analytic skills associated with working on unstructured datasets
- Ability to build processes that support data transformation, workload management, data structures, dependency and metadata
- Knowledge of ODBC and Java
- Experience with data warehousing, cubes and emerging EPP/MPP data designs
- Experience with Snowflake and AWS data systems preferable
- AWS Cloud Practitioner, Big Data Specialist, Tableau Professional or other similar certifications desired
- Ability to act as an influencer, helping the existing team grow into modern modelling and reporting methodologies
- Data security