Principal Data Engineer

Details of the offer

Why Join Capco?

Capco is a global technology and business consultancy focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry.

You will work with some of the largest banks in the world on engaging projects that will transform the financial services industry.

We are/have:

- Experts across the Capital Markets, Insurance, Payments, Retail Banking and Wealth & Asset Management domains.
- Deep knowledge of various financial services offerings, including Finance, Risk and Compliance, Financial Crime and Core Banking.
- Committed to growing our business and hiring the best talent to help us get there.
- Focused on maintaining our nimble, agile and entrepreneurial culture.

Why Join Capco as a Data Engineer?

- You'll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as on established Tier 1 bank applications used by millions of customers.
- You'll be involved in digital and data transformation processes through a continuous delivery model.
- You will work on automating and optimizing data engineering processes, developing robust and fault-tolerant data solutions, and enhancing security standards across both cloud and on-premise deployments.
- You'll be able to work across different data, cloud and messaging technology stacks.
- You'll have the opportunity to learn and work with specialized data and cloud technologies to widen your skill set.

As a Principal Data Engineer at Capco you will:

- Demonstrate practical experience of engineering best practices, with an obsession for continuous improvement.
- Have expertise in a set of the team's domains, including the breadth of services, how they interact and the data flows between systems.
- Work individually or with teams, drawing on experience to recommend tooling and solutions that align with organizational strategies.
- Influence organization-wide testing strategy.
- Architect services and systems using well-accepted design patterns that allow for iterative, autonomous development and future scaling.
- Guide teams in anticipating future use cases and help them make design decisions that minimize the cost of future changes.
- Actively contribute to security designs based on the organization's security strategy, foster a security-first mindset across teams and lead by example.
- Have advanced knowledge of key security technologies, protocols and techniques (e.g. TLS, OAuth, encryption, networking).
- Be comfortable managing engineers, tracking the team's efficiency and quality of work, and regularly adjusting processes and timelines to ensure high-quality work is delivered.
- Have personally made valuable contributions to products, solutions and teams, and be able to articulate that value to customers.
- Have played a role in the delivery of critical business applications and, ideally, customer-facing applications.
- Be able to communicate complex ideas to non-experts with eloquence and confidence.
- Have an awareness and understanding of new technologies being used in finance and other industries, and love to experiment.
- Have a passion for being part of the engineering team that is forming the future of finance.

Skills & Expertise

Essentials

Event Streaming

- Able to build near real-time data streaming pipelines using technologies such as Kafka, Kafka Connect, Spark Streaming and Google Pub/Sub (see the sketch after this list).
- Knowledge and experience of Change Data Capture and associated technologies.
- Experience with one or more of the following: Apache Flink, Apache Beam, Apache Storm, Spark Streaming and Kafka Streams (KStreams).
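
For illustration only, a minimal sketch of the kind of near real-time pipeline this covers: a PySpark Structured Streaming job reading JSON events from a Kafka topic. It assumes the Spark Kafka connector package is available on the cluster, and the broker address, topic name and event schema below are hypothetical, not taken from this role description.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("payments-stream-sketch").getOrCreate()

# Hypothetical event schema for the JSON payloads on the topic.
schema = StructType([
    StructField("payment_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("currency", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic as an unbounded stream.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "payments.events")            # hypothetical topic
    .load()
)

# Kafka delivers bytes; parse the JSON value into typed columns and filter.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .filter(col("amount") > 0)
)

# Write the parsed stream onward (console sink here, purely for the sketch).
query = (
    events.writeStream
    .outputMode("append")
    .format("console")
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```

An equivalent pipeline could be written with Flink, Beam or Kafka Streams; the choice usually comes down to latency and state-management needs versus operational familiarity.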

Databases

- Hands-on experience with schema design using semi-structured and structured data.
- Experienced in data modeling and data warehouse design.
- Strong experience with SQL, RDBMS databases and NoSQL databases, with a good understanding of the differences and trade-offs between them.
- Hands-on experience building both ETL and ELT based solutions (see the sketch after this list), and experience with low-code/no-code ETL platforms.
- Previous experience in cloud migration projects, with exposure to data lake formation and data warehousing on the cloud, and to shifting data from on-premise to CSP databases such as BigQuery, Redshift and Snowflake.
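
To make the ETL/ELT distinction concrete, here is a small illustrative sketch using sqlite3 as a stand-in for a cloud warehouse; the table and column names are hypothetical. In the ETL flavour rows are transformed in application code before loading, while in the ELT flavour raw rows are loaded to a staging table and transformed with SQL inside the warehouse.

```python
import csv
import sqlite3
from io import StringIO

# sqlite3 stands in for a warehouse such as BigQuery/Redshift/Snowflake; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id TEXT, amount_gbp REAL, status TEXT)")
conn.execute("CREATE TABLE fct_orders_etl (order_id TEXT, amount_gbp REAL)")
conn.execute("CREATE TABLE fct_orders_elt (order_id TEXT, amount_gbp REAL)")

raw_csv = "order_id,amount_gbp,status\nA1,120.50,complete\nA2,-5.00,failed\n"

# ETL flavour: filter and type-cast in application code, then load only the clean rows.
etl_rows = [
    (row["order_id"], float(row["amount_gbp"]))
    for row in csv.DictReader(StringIO(raw_csv))
    if row["status"] == "complete" and float(row["amount_gbp"]) > 0
]
conn.executemany("INSERT INTO fct_orders_etl VALUES (?, ?)", etl_rows)

# ELT flavour: load everything raw into staging, then transform with SQL in the warehouse.
stg_rows = [
    (row["order_id"], float(row["amount_gbp"]), row["status"])
    for row in csv.DictReader(StringIO(raw_csv))
]
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", stg_rows)
conn.execute(
    """
    INSERT INTO fct_orders_elt (order_id, amount_gbp)
    SELECT order_id, amount_gbp
    FROM stg_orders
    WHERE status = 'complete' AND amount_gbp > 0
    """
)

# Both routes end with the same curated rows; they differ in where the transform runs.
print(conn.execute("SELECT * FROM fct_orders_etl").fetchall())
print(conn.execute("SELECT * FROM fct_orders_elt").fetchall())
conn.close()
```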

Big Data

- Experience with traditional big data technologies such as Hadoop, Hive, Spark, Pig, Sqoop, Flume, Cloudera, Airflow and Oozie (see the orchestration sketch after this list).
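
As an illustration of the orchestration side of this stack, a minimal Airflow DAG that sequences an HDFS ingest, a Spark batch transform and a Hive partition refresh. The DAG id, file paths, table names and cluster options are hypothetical, and the commands assume the relevant CLIs are available on the worker.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical job names, paths and cluster settings, purely for illustration.
with DAG(
    dag_id="daily_trades_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Land the day's raw files into HDFS (stand-in for a Flume/Sqoop-style ingest step).
    ingest = BashOperator(
        task_id="ingest_raw_files",
        bash_command="hdfs dfs -put -f /data/landing/trades_{{ ds }}.csv /raw/trades/",
    )

    # Transform the day's partition with a Spark batch job.
    transform = BashOperator(
        task_id="spark_transform",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "/opt/jobs/clean_trades.py --run-date {{ ds }}"
        ),
    )

    # Register the new partition in the Hive table that downstream users query.
    publish = BashOperator(
        task_id="refresh_hive_partition",
        bash_command=(
            "hive -e \"ALTER TABLE curated.trades ADD IF NOT EXISTS "
            "PARTITION (run_date='{{ ds }}')\""
        ),
    )

    ingest >> transform >> publish
```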

Development Languages

- Solid development experience using Python, Scala and Java.

Desirable

Cloud Environments

- Strong cloud experience on GCP, with exposure to AWS and Azure.

DevOps

- Experience using version control tools such as Git.
- Experience designing, building and maintaining CI/CD pipelines on Jenkins and CircleCI.
- Exposure to and/or experience with Ansible.
- Experience building observability using tools such as Prometheus, Grafana, Elastic and Splunk (see the metrics sketch after this list).
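
As one illustrative way to build the observability mentioned above, a small sketch that instruments a simulated batch loop with Prometheus counters and a histogram and exposes them for scraping. The metric names, port and fake workload are hypothetical, and the prometheus_client package is assumed to be installed; Grafana dashboards and alerts would sit on top of the scraped metrics.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metric names, purely for illustration.
ROWS_PROCESSED = Counter("pipeline_rows_processed_total", "Rows successfully processed")
ROWS_FAILED = Counter("pipeline_rows_failed_total", "Rows that failed validation")
BATCH_SECONDS = Histogram("pipeline_batch_duration_seconds", "Wall-clock time per batch")

def process_batch(rows):
    """Pretend to process one batch while recording metrics."""
    with BATCH_SECONDS.time():
        for row in rows:
            if row.get("amount", 0) > 0:
                ROWS_PROCESSED.inc()
            else:
                ROWS_FAILED.inc()
        time.sleep(0.1)  # simulate work

if __name__ == "__main__":
    # Expose /metrics on port 8000 for Prometheus to scrape.
    start_http_server(8000)
    while True:
        batch = [{"amount": random.uniform(-1, 10)} for _ in range(100)]
        process_batch(batch)
        time.sleep(5)
```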

Infrastructure

- Experience using IaC tools such as Terraform to deploy data pipeline infrastructure to cloud environments.
- Exposure to DevOps or DataOps, with experience productionizing data pipelines that can handle high availability and disaster scenarios.
- Experience building the supporting orchestration, monitoring and alerting features that enable robust and reliable data pipelines.
- A good understanding of high availability and disaster recovery.

We offer:

- A work culture focused on innovation and building lasting value for our clients and employees.
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise.
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients.
- A diverse, inclusive, meritocratic culture.
- Enhanced and competitive family-friendly benefits, including maternity/adoption/shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause and bereavement.

Joining Capco means joining an organization that is committed to an inclusive working environment where you're encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It's important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table, so we'd love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.
