Robert Half is supporting a Global Consulting Firm in recruiting a Data Warehouse Developer / Data Engineer for a critical role within a major data & analytics programme in the insurance sector.
This project involves designing and implementing high-performing, scalable data warehouse solutions to enhance business intelligence and analytics capabilities across the organisation.
The ideal candidate will have strong experience with the Kimball methodology, dimensional modelling, and SQL, along with hands-on experience with Azure Data Factory, Fabric, and Power BI.
Assignment Details:
Location: Remote with 1-2 days on-site in Leeds
Duration: Initial 6-month contract
Day Rate: £425 per day via an FCSA Accredited umbrella company
Start Date: Immediate, with 1-2 weeks for onboarding and setup
Experience Required:
Extensive experience in data warehousing and data engineering, particularly using the Kimball methodology and dimensional modelling to create efficient data structures for reporting and analytics.
Advanced SQL skills and expertise in ETL processes, with a focus on high performance, scalability, and data reliability.
Hands-on experience with Azure Data Factory, Fabric, and Power BI for end-to-end data integration, transformation, and visualisation, ideally within insurance or financial environments.
Proven ability to ensure data accuracy, integrity, and consistency across complex data environments.
Strong collaboration skills with a track record of effective engagement with business analysts, data engineers, and other key stakeholders.
Key Responsibilities:
Data Warehouse Design & Development: Develop and implement data warehouse solutions using the Kimball methodology, focusing on dimensional data models to meet reporting and analytics needs.
ETL Processes: Design, optimise, and manage ETL workflows, leveraging Azure Data Factory for high-performance data integration and transformation.
Data Visualisation with Power BI: Create intuitive and insightful data visualisations in Power BI, enabling stakeholders to derive actionable insights from the data warehouse.
Data Quality & Performance: Ensure data accuracy and consistency within the data warehouse to enable reliable business intelligence efforts.
Documentation: Create and maintain comprehensive documentation for warehouse architecture, ETL processes, and best practices to ensure continuity and ease of understanding.
Stakeholder Collaboration: Engage closely with business analysts, data engineers, and other stakeholders to gather data requirements and deliver tailored, effective solutions.
Important Note: Candidates will undergo comprehensive financial and criminal background checks, which may take up to two weeks to complete.